February 20, 2018

Learning to Learn

“I didn’t know robots had advanced so far,” a reader remarked after last week’s post about how computers are displacing knowledge workers. What changed to make that happen? The machines learned how to learn.

This is from Artificial Intelligence Goes Bilingual—Without A Dictionary, Science Magazine, Nov. 28, 2017.

“Imagine that you give one person lots of Chinese books and lots of Arabic books—none of them overlapping—and the person has to learn to translate Chinese to Arabic. That seems impossible, right?” says . . . Mikel Artetxe, a computer scientist at the University of the Basque Country (UPV) in San Sebastián, Spain. “But we show that a computer can do that.”

Most machine learning—in which neural networks and other computer algorithms learn from experience—is “supervised.” A computer makes a guess, receives the right answer, and adjusts its process accordingly. That works well when teaching a computer to translate between, say, English and French, because many documents exist in both languages. It doesn’t work so well for rare languages, or for popular ones without many parallel texts.
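For the code-curious, here’s what that guess-and-adjust loop can look like in practice. This is my own toy sketch in Python, not anything from the Science article: a one-parameter “model” that learns to double numbers purely from labeled examples.

    # Toy supervised-learning loop (an illustration, not the researchers' code):
    # the program guesses, is shown the right answer, and adjusts its rule.

    def train(examples, learning_rate=0.01, epochs=200):
        weight = 0.0  # the single adjustable "rule" the computer is learning
        for _ in range(epochs):
            for x, correct in examples:
                guess = weight * x                    # the computer makes a guess
                error = guess - correct               # ...receives the right answer
                weight -= learning_rate * error * x   # ...and adjusts its process
        return weight

    # Labeled examples: each input paired with the right answer (here, its double).
    data = [(1, 2), (2, 4), (3, 6), (4, 8)]
    print(round(train(data), 2))  # prints 2.0 (the inferred "rule")

Feed it enough labeled pairs and the rule emerges on its own; take the right answers away and you’re in the unsupervised territory the Science article describes.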

[This learning technique is called] unsupervised machine learning. [A computer using this technique] constructs bilingual dictionaries without the aid of a human teacher telling them when their guesses are right.

Hmm. . . . I could have used that last year, when my wife and I spent three months visiting our daughter in South Korea. The Korean language is ridiculously complex; I never got much past “good morning.”

Go matches were a standard offering on the TVs at the gym where I worked out. (Imagine two guys in black suits staring intently at a game board — not exactly a riveting workout visual.) Go is also ridiculously complex, and mysterious, too: the masters seem to make moves more intuitively than analytically. But the days of human Go supremacy are over. Google wizard and overall overachiever Sebastian Thrun[1] explains why in this conversation with TED Curator Chris Anderson:

Artificial intelligence and machine learning is about 60 years old and has not had a great day in its past until recently. And the reason is that today, we have reached a scale of computing and datasets that was necessary to make machines smart. The new thing now is that computers can find their own rules. So instead of an expert deciphering, step by step, a rule for every contingency, what you do now is you give the computer examples and have it infer its own rules.

A really good example is AlphaGo. Normally, in game playing, you would really write down all the rules, but in AlphaGo’s case, the system looked over a million games and was able to infer its own rules and then beat the world’s residing Go champion. That is exciting, because it relieves the software engineer of the need of being super smart, and pushes the burden towards the data.

20 years ago the computers were as big as a cockroach brain. Now they are powerful enough to really emulate specialized human thinking. And then the computers take advantage of the fact that they can look at much more data than people can. AlphaGo looked at more than a million games. No human expert can ever study a million games. So as a result, the computer can find rules that even people can’t find.

Thrun made those comments in April 2017. AlphaGo’s championship reign was short-lived: it was unseated a mere six months later by a new cyber challenger that taught itself without reviewing all that data. This is from “AlphaGo Zero Shows Machines Can Become Superhuman Without Any Help,” MIT Technology Review, October 18, 2017.

AlphaGo wasn’t the best Go player on the planet for very long. A new version of the masterful AI program has emerged, and it’s a monster. In a head-to-head matchup, AlphaGo Zero defeated the original program by 100 games to none.

Whereas the original AlphaGo learned by ingesting data from hundreds of thousands of games played by human experts, AlphaGo Zero started with nothing but a blank board and the rules of the game. It learned simply by playing millions of games against itself, using what it learned in each game to improve.

The new program represents a step forward in the quest to build machines that are truly intelligent. That’s because machines will need to figure out solutions to difficult problems even when there isn’t a large amount of training data to learn from.

“The most striking thing is we don’t need any human data anymore,” says Demis Hassabis, CEO and cofounder of DeepMind [the creators of AlphaGo Zero].

“By not using human data or human expertise, we’ve actually removed the constraints of human knowledge,” says David Silver, the lead researcher at DeepMind and a professor at University College London. “It’s able to create knowledge for itself from first principles.”

Did you catch that? “We’ve removed the constraints of human knowledge.” Wow. No wonder computers are elbowing all those knowledge workers out of the way.

What’s left for humans to do? We’ll hear from Sebastian Thrun and others on that topic next time.


[1] Sebastian Thrun’s TED bio describes him as “an educator, entrepreneur and troublemaker. After a long life as a professor at Stanford University, Thrun resigned from tenure to join Google. At Google, he founded Google X, home to self-driving cars and many other moonshot technologies. Thrun also founded Udacity, an online university with worldwide reach, and Kitty Hawk, a ‘flying car’ company. He has authored 11 books, 400 papers, holds 3 doctorates and has won numerous awards.”

 

Kevin Rhodes writes about individual growth and cultural change, drawing on insights from science, technology, disruptive innovation, entrepreneurship, neuroscience, psychology, and personal experience, including his own unique journey to wellness — dealing with primary progressive MS through an aggressive regime of exercise, diet, and mental conditioning.

Check out Kevin’s latest LinkedIn Pulse article: Leadership and Life Lessons From an Elite Athlete and a Dying Man.

Capitalism on the Fritz[1]

In November 2008, as the global financial crash was gathering pace, the 82-year-old British monarch Queen Elizabeth visited the London School of Economics. She was there to open a new building, but she was more interested in the assembled academics. She asked them an innocent but pointed question. Given its extraordinary scale, how was it possible that no one saw it coming?

The Queen’s question went to the heart of two huge failures. Western capitalism came close to collapsing in 2007-2008 and has still not recovered. And the vast majority of economists had not understood what was happening.

That’s from the Introduction to Rethinking Capitalism (2016), edited by Michael Jacobs and Mariana Mazzucato.[2] The editors and authors review a catalogue of chronic economic “dysfunction” that they trace to policy-makers’ continued allegiance to neoliberal economic orthodoxy even as it has been breaking down over the past four decades.

Before we get to their dysfunction list, let’s give the other side equal time. First, consider an open letter from Warren Buffett published in Time last week. It begins this way:

“I have good news. First, most American children are going to live far better than their parents did. Second, large gains in the living standards of Americans will continue for many generations to come.”

Mr. Buffett acknowledges that “The market system . . . has also left many people hopelessly behind,” but assures us that “These devastating side effects can be ameliorated,” observing that “a rich family takes care of all its children, not just those with talents valued by the marketplace.” With this compassionate caveat, he is definitely bullish on America’s economy:

In the years of growth that certainly lie ahead, I have no doubt that America can both deliver riches to many and a decent life to all. We must not settle for less.

So, apparently, is our Congress. The new tax law is a virtual pledge of allegiance to the neoliberal economic model. Barring a significant pullback of the law (which seems unlikely), we now have eight years to watch how its assumptions play out.

And now, back to Rethinking Capitalism’s dysfunction list (which I’ve seen restated over and over in my research):

  • Productivity and wages no longer move in tandem — the latter lag behind the former.
  • This has been going on now for several decades,[3] during which living standards (adjusted for inflation) for the majority of households have been flat.
  • This is a problem because consumer spending accounts for over 70% of U.S. GDP. What hurts consumers hurts the whole economy.
  • What economic growth there has been is mostly the result of spending fueled by consumer and corporate debt. This is especially true of the post-Great Recession “recovery.”
  • Meanwhile, companies have been increasing production through increased automation — most recently through intelligent machines — which means getting more done with fewer employees.
  • That means the portion of marginal output attributable to human (wage-earner) effort is less, which causes consumer incomes to fall.
  • The job marketplace has responded with new dynamics, featuring a worldwide rise of “non-standard” work (temporary, part-time, and self-employed).[4]
  • Overall, there has been an increase in the number of lower-paid workers and a rise in intractable unemployment — especially among young people.
  • Adjusting to these new realities has left traditional wage-earners with feelings of meaninglessness and disempowerment, fueling populist backlash political movements.
  • In the meantime, economic inequality (both wealth and income) has grown to levels not seen since pre-revolution France, the days of the Robber Barons, and the Roaring 20s.
  • Economic inequality means that the shrinking share of compensation paid out in wages, salaries, bonuses, and benefits has been dramatically skewed toward the top of the earnings scale, with much less (both proportionately and absolutely) going to those at the middle and bottom.[5]
  • Increased wealth doesn’t mean increased consumer spending by the top 20% sufficient to offset lost demand (spending) by the lower 80% of income earners, other than as reflected by consumer debt.
  • Instead, increased wealth at the top end is turned into “rentable” assets — e.g., real estate, intellectual property, and privatized holdings in what used to be the “commons” — which drives up both their value (cost) and the rent derived from them. This creates a “rentier” culture in which lower income earners are increasingly stressed to meet rental rates, and ultimately are driven out of certain markets.
  • Inequality has also created a new working class system, in which a large share of workers are in precarious/uncertain/unsustainable employment and earning circumstances.
  • Inequality has also resulted in limitations on economic opportunity and social mobility — e.g., there is a new kind of “glass floor/glass ceiling” below which the top 20% are unlikely to fall and the bottom 80% are unlikely to rise.
  • In the meantime, the social safety nets that developed during the post-WWII boom (as Buffett’s “rich family” took care of “all its children”) have been largely torn down since the advent of “workfare” in the ’80s and ’90s, leaving those at the bottom and middle more exposed than ever.

The editors of Rethinking Capitalism believe that “These failings are not temporary, they are structural.” That conclusion has led some to believe that people like Warren Buffett are seriously misguided in their continued faith in Western capitalism as a reliable societal institution.

More on that next time.


[1] I wondered where the expression “on the fritz” came from, and tried to find out. Surprisingly, no one seems to know.

[2] Michael Jacobs is an environmental economist and political theorist; at the time the book was published, he was a visiting professor at University College London. Mariana Mazzucato is an economics professor at the University of Sussex.

[3] “In the US, real median household income was barely higher in 2014 than it had been in 1990, though GDP had increased by 78 percent over the same period. Though beginning earlier in the US, this divergence of average incomes from overall economic growth has now become a feature of most advanced economies.” Rethinking Capitalism.

[4] These have accounted for “half the jobs created since the 1990s and 60 per cent since the 2008 crisis.” Rethinking Capitalism.

[5] “Meanwhile, those at the very top of the income distribution have done exceedingly well… In the US, the incomes of the richest 1 per cent rose by 142 per cent between 1980 and 2013 (from an average of $461,910, adjusted for inflation, to $1,119,315) and their share of national income doubled, from 10 to 20 per cent. In the first three years of the recovery after the 2008 crash, an extraordinary 91 per cent of the gains in income went to the richest one-hundredth of the population.” Rethinking Capitalism.

 

Kevin Rhodes left a successful long-term law practice to scratch a creative itch and lived to tell about it… barely. Since then, he has been on a mission to bring professional excellence and personal wellbeing to the people who learn, teach, and practice the law. He has also blogged extensively and written several books about his unique journey to wellness, including how he deals with primary progressive MS through an aggressive regime of exercise, diet, and mental conditioning.

The Stupidity Paradox

Every day I ride a bus that has a row of seats up front that are folded up, with a sign next to them:

NOTICE
Seats Not in Service
The bus manufacturer has determined
that these seats not be used.

I’ve seen that sign for over a year. Never really thought about it. But recently I wondered: you don’t suppose both those seats and the sign were installed in the factory? It could happen — cheaper than a recall maybe. If so, it would be right in line with this week’s topic: a kind of on-the-job behavior that professors and business consultants Mats Alvesson and André Spicer[1] call The Stupidity Paradox.

Their book by that name began when they were sharing a drink after a conference and found themselves wondering, “Why was it that organisations which employed so many smart people could foster so much stupidity?” They concluded that the cause is “functional stupidity” — a workplace mindset implicitly endorsed because it works.

“We realized something: smart organisations and the smart people who work in them often do stupid things because they work — at least in the short term. By avoiding careful thinking, people are able to simply get on with their job. Asking too many questions is likely to upset others — and to distract yourself. Not thinking frees you up to fit in and get along. Sometimes it makes sense to be stupid.”

In fact, stupidity works so well it can turn into firm culture:

Far from being “knowledge intensive,” many of our most well-known chief organisations have become engines of stupidity. We have frequently seen otherwise smart people stop thinking and start doing stupid things. They stop asking questions. They give no reasons for their decisions. They pay no heed to what their actions cause. Instead of complex thought we get flimsy jargon, aggressive assertions or expert tunnel vision. Reflection, careful analysis and independent reflection decay. Idiotic ideas and practices are accepted as quite sane. People may harbour doubts, but their suspicions are cut short. What’s more, they are rewarded for it. The upshot is a lack of thought has entered the modus operandi of most organisations of today.

I.e., it pays to be stupid on the job: you get things done, satisfy expectations, don’t stand out from the crowd, aren’t labelled a troublemaker. We learned all of that in middle school; we learn it again on the job.

We learn from management:

A central, but often unacknowledged, aspect of making a corporate culture work is what we call stupidity management. Here managers actively encourage employees not to think too much. If they do happen to think, it is best not to voice what emerges. Employees are encouraged to stick within clearcut parameters. Managers use subtle and not so subtle means to prod them not to ask too many tough questions, not to reflect too deeply on their assumptions, and not to consider the broader purpose of their work. Employees are nudged to just get on with the task. They are to think on the bright side, stay upbeat and push doubts and negative thoughts aside.

And then we school ourselves:

Self-stupidification starts to happen when we censor our own internal conversations. As we go through our working day, we constantly try to give some sense to our often chaotic experiences. We do this by engaging in what some scholars call “internal reflexivity.” This is the constant stream of discussion that we have with ourselves. When self-stupidification takes over, we stop asking ourselves questions. Negative or contradictory lines of thinking are avoided. As a result, we start to feel aligned with the thoughtlessness we find around us. It is hard to be someone who thinks in an organization that shuns it.

Back to the seats on my bus… A “manufacturer” is a fiction, like “corporation” is a fiction: both act through humans. Which means that somewhere there’s an employee at a bus manufacturer whose job is to build those seats. Someone else installs them. Someone else puts up the sign. And lots of other people design, requisition, select, negotiate, buy, ship, pack and unpack, file, approve, invoice, pay bills, keep ledgers, maintain software, write memos, confer with legal, hold meetings, and make decisions. All so that the “manufacturer” — i.e., the sum total of all those people doing their jobs — can tell me not to sit there.

Functional stupidity is as common as traffic on your commute. We’ll look more into it next time.


[1] Mats Alvesson is Professor of Business Administration at the University of Lund, Sweden, University of Queensland, and Cass Business School, City University of London. André Spicer is Professor of Organisational Behaviour at Cass Business School, City University of London.

 

Kevin Rhodes is on a mission to bring professional excellence and personal wellbeing to the people who learn, teach, and practice the law. His past blog posts for the CBA have been collected in two volumes — click the book covers for more information.

Could Be Worse

Meaningless work is not inevitable, but we’re often prevented from taking remedial action because our thinking has become corrupted with feelings of powerlessness. As Studs Terkel said in his book Working:

You know, “power corrupts, and absolute power corrupts absolutely.”
It’s the same with powerlessness.
Absolute powerlessness corrupts absolutely.

If we believe there’s something patriotic, virtuous, even sacred about the way we have always viewed working for a living, then any despair we feel about our jobs must be a personal problem, a character flaw. We ought to put up, shut up, and get cracking. The shame associated with that kind of judgment is absolutely disempowering. As long as we hold onto it, we’ll stay stuck in workplace despair and meaning malaise — a state of mind poet Richard Cecil captures in “Internal Exile,” collected in Twenty First Century Blues (2004):

Although most people I know were condemned
Years ago by Judge Necessity
To life in condos near a freeway exit
Convenient to their twice-a-day commutes
Through traffic jams to jobs that they dislike
They didn’t bury their heads in their hands
And cry “oh, no!” when sentence was pronounced:
Forty years accounting in Duluth!
Or Tenure at Southwest Missouri State!
Instead, they mumbled, not bad. It could be worse,
When the bailiff, Fate, led them away
To Personnel to fill out payroll forms
And have their smiling ID photos snapped.

And that’s what they still mumble every morning
Just before their snooze alarms go off
When Fluffy nuzzles them out of their dreams
Of making out with movie stars on beaches.
They rise at five a.m. and feed their cats
And drive to work and work and drive back home
And feed their cats and eat and fall asleep
While watching Evening News’s fresh disasters —
Blown-up bodies littering a desert
Fought over for the last three thousand years,
And smashed-to-pieces million-dollar houses
built on islands swept by hurricanes.

It’s soothing to watch news about the places
Where people literally will die to live
When you live someplace with no attractions —
Mountains, coastline, history—like here,
Where none aspire to live, though many do.
“A great place to work, with no distractions”
Is how my interviewer first described it
Nineteen years ago, when he hired me.
And, though he moved the day that he retired
To his dream house in the uplands with a vista,
He wasn’t lying—working’s better here
And easier than trying to have fun.

Is that the way it is where you’re stuck, too?

Good question. How would you answer it?

True, one of the factors behind job wretchedness is internal exile: we’re estranged from what we really want out of our work, or we’ve given up on ever having it, and so we settle for could be worse. But there’s more to it than that. There are external factors at work, too — global winds of change propelling people who want to work with passion in directions they never thought they’d be going.

There are krakens out there in the deep. One of them is something two business writers call the “Stupidity Paradox”: a prevalent workplace model that — like the bureaucracies we looked at last week — encourages obeisance to rules (we might say “best practices”) at the cost of independent thinking.

We’ll look at the Stupidity Paradox next time.

 

Kevin Rhodes left a successful long-term law practice to scratch a creative itch and lived to tell about it… barely. Since then, he has been on a mission to bring professional excellence and personal wellbeing to the people who learn, teach, and practice the law. He has also blogged extensively and written several books about his unique journey to wellness, including how he deals with primary progressive MS through an aggressive regime of exercise, diet, and mental conditioning.

Professional Paradigms New and Old (Part 7): Traumatic Transformation, and What Do You Do When Your Paradigm is Done Shifting?

Professional paradigm shifts require transformation not just for the profession’s culture, but for the individuals in it.

In their book Wired to Create: Unraveling the Mysteries of the Creative Mind, authors Scott Barry Kaufman and Carolyn Gregoire identify several ways individual paradigm-shifting transformation gets started. One is inspiration, which they say comes in three stages:

The first stage is that unsolicited moment when we feel inspired, “by a role model, teacher, experience, or subject matter.”

“Next comes transcendent awakening — a moment of clarity and an awareness of new possibilities.

“Which leads to the third hallmark feature of inspiration: a striving to transmit, express, or actualize a new idea, insight, or vision.” (Emphasis in original.)

Individual paradigm shifts are also prompted by traumatic life events, resulting in what psychologists call “posttraumatic growth.” Again from Wired to Create:

After a traumatic event, such as a serious illness or loss of a loved one, individuals intensely process the event—they’re constantly thinking about what happened, and usually with strong emotional reactions.

[T]his kind of repetitive thinking is a critical step toward thriving in the wake of a challenge… we’re working hard to make sense of it and to find a place for it in our lives that still allows us to have a strong sense of meaning and purpose.

I have personal experience with both inspiration and trauma. As I wrote a couple weeks ago, “I have a personal, real-time, vested interest in change because I’ve been on a steep personal transformation learning curve for nearly a decade — for all sorts of reasons I’ve written about in my books, my personal blog, and sometimes in this column.” Learning, writing, and conducting workshops about the psychological and neurological dynamics of transformation has been my way of being proactive about something I’ve come to call “traumatic transformation.”

In fact, I just finished a new book that completes my decade-long intensive on personal transformation. As always, I’ve learned a lot writing it, but the most startling discovery is that paradigm shifts don’t go on forever: a time actually comes when the new fully replaces the old. Now that I’ve finished it, I can see that writing the book was in part a way for me to bring closure to my years of personal paradigm shifting.

That being the case, I’ve decided that it’s time for me to set aside my transformation journey and let its lessons play out for a while. Which is why, after today’s post, I’m going to take an indefinite vacation from writing this column. At this point, I have no fresh thoughts to add to what I’ve been writing about for the past several years. Instead of repeating myself, I want to take a break and see if anything new comes up. If so, I’ll come back and share it.

In the meantime, my endless thanks to the Colorado Bar Association and CBA-CLE and to my fabulous editor Susan Hoyt for letting me trot out my research and theories and personal revelations in this forum. And equally many thanks to those of you who’ve read and thought about and sometimes even taken some of these ideas to heart and put them into practice.

On the wall above the desk where I write, I have a dry-mounted copy of the very last Sunday Calvin and Hobbes comic strip, which I cut out of the newspaper the morning it ran. (Speaking of paradigm shifts, remember newspapers?) There’s a fresh snow, and our two heroes hop on their sled and go bouncing down a hill as Calvin exults, “It’s a magical world, Hobbes ol’ buddy… Let’s go exploring!”

I suspect Calvin and Hobbes are still out there, exploring. I plan to join them.

You?

Apocalypse: Life On The Other Side Of Over was just published yesterday. It’s a free download from the publisher, like my other books. Or click on this link or the book cover for details.

And if we don’t run into each other out there exploring, feel free to email me.

 

Professional Paradigms New and Old (Part 6): Law Beyond Blame

(At the end of last week’s post, I promised a follow-up this week. We’ll get to that next week. In the meantime, the following was just too pertinent to pass up.)

In several posts over the past couple years, we’ve looked at how technology acts as a disruptive innovator, shifting paradigms in the legal profession. I recently came across another disruptor: the biology of the brain. Its implications reach much further than, let’s say, Rocket Lawyer.

David Eagleman is his own weather system. Here’s his website — talk about creds. His short bio is “a neuroscientist at Baylor College of Medicine, where he directs the Laboratory for Perception and Action, and the Initiative on Neuroscience and the Law.” The latter’s website posts news about “neulaw,” and includes CLE offerings. Among other things, neulaw tackles a bastion of legal theory: the notion of culpability.

Eagleman’s book Incognito: The Secret Lives of the Brain contains a long chapter entitled “Why Blameworthiness Is The Wrong Question.” It begins with the story of Charles Whitman, who climbed a tower at the University of Texas in August 1966 and started shooting, leaving 13 people dead and 38 wounded before being killed himself. He left a suicide note that included the following:

“I do not understand myself these days. I am supposed to be an average reasonable and intelligent young man. However, lately (I cannot recall when it started) I have been a victim of many unusual and irrational thoughts… If my life insurance policy is valid please pay off my debts… donate the rest to a mental health foundation. Maybe research can prevent further tragedies of this type.”

Whitman’s brain was examined and a tumor was found in the sector that regulates fear and aggression. Psychologists have known since the late 1800s that impairment in this area results in violence and social disturbance. Against this backdrop, Eagleman opens his discussion of blameworthiness with some good questions:

Does this discovery of Whitman’s brain tumor modify your feelings about his senseless murdering? If Whitman had survived that day, would it adjust the sentencing you would consider appropriate for him? Does the tumor change the degree to which you consider it “his fault”?

On the other hand, wouldn’t it be dangerous to conclude that people with a tumor are somehow free of guilt, or that they should be let off the hook for their crimes?

The man on the tower with the mass in his brain gets us right into the heart of the question of blameworthiness. To put it in the legal argot: was he culpable?

The law has accommodated impaired states of mind for a long time, but Eagleman’s analysis takes the issue much further, all the way to the core issue of free will, as currently understood not by moral and ethical theorists but by brain science. Incognito is an extended examination of just how much brain activity occurs beneath the level of conscious detection, in both “normal” and impaired persons. Consider these excerpts:

[T]he legal system rests on the assumption that we do have free will — and we are judged based on this perceived freedom.

As far as the legal system sees it, humans . . . use conscious deliberation when deciding how to act. We make our own decisions.

Historically, clinicians and lawyers have agreed on an intuitive distinction between neurological disorders (“brain problems”) and psychiatric disorders (“mind problems”).

The more we discover about the circuitry of the brain, the more the answers . . . move toward the details of the biology. The shift from blame to science reflects our modern understanding that our perceptions and behaviors are controlled by inaccessible [neurological] subroutines that can be easily perturbed.

[A] slight change in the balance of brain chemicals can cause large changes in behavior. The behavior of the patient cannot be separated from his biology.

Think about that for a moment — as a lawyer, and as a human being. The idea that our biology controls our behavior — not our state of mind or conscious decision-making — is repugnant not only to the law, but to our everyday perceptions of free will and responsibility. Tamper with free will, and a whole lot of paradigms — not just legal notions of culpability — come crashing down.

Eagleman’s discussion of these issues in Incognito is detailed and thoughtful, and far too extensive to convey in this short blog post. If you’re intrigued, I recommend it highly.

Kevin Rhodes has been a lawyer for over 30 years. Drawing on insights gathered from science, technology, disruptive innovation, entrepreneurship, neuroscience, and psychology, and also from his personal experiences as a practicing lawyer and a “life athlete,” he’s on a mission to bring wellbeing to the people who learn, teach, and practice the law.

Professional Paradigms New and Old (Part 5): Why Change if We Don’t Have To?

Why change if we don’t have to?

Good question. I Googled it. Most of the hits were about the hazards of not changing your car’s oil, plus a few along the same lines about furnace filters or the water filter on the fridge. There was one about changing your underwear, and a few about lifestyle changes related to health issues. All of those are maintenance issues — mechanical, hygiene, health — which we would generally consider have to’s.

What about changing to keep up with the competitive pressures of the marketplace? There’s a lot of keeping up with the Joneses thinking out there, but in my observation, making yourself afraid of what the competition might do rarely results in anything other than drama. No have to in that.

Recently, at a CLE workshop in South Carolina, a participant asked, “Aren’t there some things we don’t need to change?” The question brought me up short and reminded me why we were investing a whole day talking about change: we were there to enhance professionalism, help us do our work better, keep us ethical, and maybe even help us to be happy practicing law — or find the courage to get out. That’s why we needed to talk about things like law school-inflicted brain damage, lawyer substance abuse, depression, anxiety, and suicide, and the value of personal happiness in supporting ethical behavior. Some things are broken and need to be fixed, and some things we do to keep our edge — both are necessary maintenance, part of our professional have to’s.

But there was a second part to my answer. Beyond those maintenance issues, I agree: let’s not change if we don’t want to. I’m not sure it’s even possible. I do know that grudging change never seems to work.

I say that even though I think and write a lot about change — particularly the psychological and neurological dynamics of personal transformation. (You may have noticed.) If I were still in law practice, I would no doubt be incorporating the not-so-futuristic practice developments into my firm, and otherwise actively engaging with the huge paradigm shift happening in our profession.

But that’s not everybody’s choice, and I get that. They’re content to let those developments play out by the process of cultural evolution. If a day comes that threatens obsolescence beyond mere fear-mongering, it will become a shared maintenance issue, and we’ll take care of it together… but probably not before.

All that went into my answer to the question in South Carolina. Which made me ask myself once again what’s behind my own commitment to change. Bottom line is, I have a personal, real-time, vested interest in change because I’ve been on a steep personal transformation learning curve for nearly a decade — for all sorts of reasons I’ve written about in my books, my personal blog, and sometimes in this column. Thinking and writing about it is my way of being proactive about my own best interests.

More next time on why that’s relevant to this blog.


Check out this collection of last year’s Future of Law blog posts. It’s a FREE download. Also included is the Culture of Law series from the second half of 2015. Click this link or the cover for downloading details.

Professional Paradigms New and Old (Part 4): Failure As A Virtue

As we saw last week, one way to engage with a paradigm shift is to “walk in stupid every day.” That won’t be easy for professionals: our job is to be smart; our brains are culturally wired with that expectation. Being “stupid” turns that cultural expectation on its ear and makes our brain circuits fritz.

So does another powerful paradigm-busting tool: learning to embrace failure. Professional cultural paradigms include conventional wisdom about how to succeed; flying in the face of them is a setup for failure.

In their book Wired to Create (which we looked at last time), Scott Barry Kaufman and Carolyn Gregoire cite the work of psychologist Robert J. Sternberg, who identified several key attributes of people who are “willing to generate and promote ideas that are novel and even strange and out of fashion” — i.e., who would embrace a paradigm shift. According to Dr. Sternberg, that kind of person:

  • Tries to do what others think is impossible;
  • Is a nonconformist;
  • Is unorthodox;
  • Questions societal norms, truisms, and assumptions.

Life is risky for nonconformists. According to Kaufman and Gregoire:

Sternberg found that artists [who participated in his study] said that a creative person is one who takes risks and is willing to follow through on the consequences of those risks. Businesspeople, meanwhile, responded that a creative person in the business world is one who steers clear of the pitfalls of conventional ways of thinking.

The inherent risks of unconventional thinking require a willingness to fail — so says organizational psychologist Adam Grant in his TED talk on “The Surprising Habits of Original Thinkers”:

The greatest originals are the ones who fail the most, because they’re the ones who try the most. You need a lot of bad ideas in order to get a few good ones.

No wonder W+K — the uber-creative ad agency we looked at last time — has a Fail Harder Wall.

Then what about our professional obligation to be smart, and steer clear of risk and failure? David P. Barash, evolutionary biologist and professor of psychology and biology at the University of Washington, tackles that conundrum in an article entitled “Paradigms Lost” that begins this way:

Science is not a “body of knowledge” – it’s a dynamic, ongoing reconfiguration of knowledge and must be free to change.

The capacity for self-correction is the source of science’s immense strength, but the public is unnerved by the fact that scientific wisdom isn’t immutable. Scientific knowledge changes with great speed and frequency – as it should – yet public opinion drags with reluctance to be modified once established. And the rapid ebb and flow of scientific “wisdom” has left many people feeling jerked around, confused, and increasingly resistant to science itself.

Unlike science, the legal profession’s conventional cultural paradigm does not embrace change “with great speed and frequency.” On the other hand, the new paradigm/technology-driven legal practice developments do precisely that — which, according to the existing paradigm, makes them a high-risk, fast road to failure.

Those who choose to innovate in the face of this risk need creativity and courage. Once again, this is from Wired to Create:

The history of creative thought and social progress is littered with similar stories of banned books, culture wars, persecuted artists, and paradigm-shifting innovations that change the way we look at the world.

In choosing to do things differently, [creative people] accept the possibility of failure — but it is precisely this risk that opens up the possibility of true innovation.

But can a professional paradigm truly embrace failure? More next time.


Check out this collection of last year’s Future of Law blog posts. It’s a FREE download. Also included is the Culture of Law series from the second half of 2015. Click this link or the cover for downloading details.

Professional Paradigms New and Old (Part 3): “Walk in Stupid Every Day”

We looked last year at physicist Thomas Kuhn’s model for how paradigms shift, and also explored another scientist’s exhortation “The best way to predict the future is to create it.”

Good, quotable advice, but how do you create what you can’t see? Richard and Daniel Susskind say often in their book The Future of the Professions that, as they travel the world delivering their message, many professionals agree that there’s a massive paradigm shift currently happening in the professions, just not their own.

Why this paradigm shift blindness?

Reason 1: Too Much Expertise

Authors Scott Barry Kaufman and Carolyn Gregoire describe this phenomenon in their marvelous book Wired to Create: Unraveling the Mysteries of the Creative Mind:

While experience is an important aspect of excellence in any creative discipline, one risk of being a seasoned pro is that we become so entrenched in our own point of view that we have trouble seeing other solutions. Experts may have trouble being flexible and adapting to change because they are so highly accustomed to seeing things in a particular way.

Reason 2: Cultural Blindness

In each of the past two years (here and here), we’ve also looked at research from the emerging field of cultural neuroscience that suggests our brains’ observation and cognitive faculties are so linked to our cultural context that we simply can’t see paradigm shifts when they happen. Our cultural bias determines what we see and don’t see, and can literally blind us to new developments happening in our midst.

Reason 3: Not Being a Newcomer

Again from Wired to Create: “the newcomers to a field are sometimes the ones who come up with the ideas that truly innovate and shift paradigms.” In the law, the newcomers are responsible for the wave of new practice models and technologies. As I said last year, “By the time the new paradigm’s opponents eventually die, and a new generation grows up that is familiar with it, the paradigm we can’t see now will be the only one the new generation has ever known.”

A Cure for Paradigm Shift Blindness: Get Stupid

Dan Wieden is eminently quotable. He ought to be: he’s one of the namesakes of legendary ad agency Wieden+Kennedy, and personally created Nike’s “Just Do It” slogan.

W+K has offices all over the world and bills over a billion dollars annually. Their website is a creative trip all its own — you might enjoy cruising it, if you have a moment. The firm was profiled in a 2006 business bestseller, Mavericks at Work: Why the Most Original Minds in Business Win, where Wieden was famously quoted as saying this about his approach to keeping W+K at the top of its game:

Whatever day it is, something in the world changed overnight,
and you better figure out what it is and what it means.
You have to forget what you just did and what you just learned.
You have to walk in stupid every day.

Lawyers aren’t the only professionals who will have trouble following that advice. People pay us to be smart; their benefit and our livelihood depend on it. True, but there’s a whole lot of shaking goin’ on around us. We might want to get stupid enough to see it.

Next time, we’ll look at another paradigm shifting skill that won’t come easy: embracing failure.

 

Mavericks at Work may be the best business book I’ve ever read. If you like that kind of thing, you owe it to yourself.

And Wired to Create is the best I’ve ever read on its topic. Author Scott Barry Kaufman is the scientific director of the Imagination Institute in the Positive Psychology Center, University of Pennsylvania, and Carolyn Gregoire is a senior writer at the Huffington Post, covering psychology, mental health, and neuroscience. And that’s just the first sentence of each of their author bios. Talk about creds.

 


Check out this collection of last year’s Future of Law blog posts. It’s a FREE download. Also included is the Culture of Law series from the second half of 2015. Click this link or the cover for downloading details.

Professional Paradigms New and Old (Part 2): You Had Me At The Creds

I met a friend for a beer last Thursday, and told him about my blog post that day about the future (actually the end) of the professions.

“I’ve got a story for you about that,” he said. “I thought now that I’m retired, I should get my affairs in order.”

I practiced estate planning, so my ears perked up. He told me about all the useful information, forms, and software he’d found online, and about the estate planning seminars he’d attended and the presenting lawyers’ “don’t try this at home” pitches. And his incredulous response to their fee quotes “for things I could do myself.”

He’s newly retired from an illustrious teaching career — an Ivy League grad, six published books, awards and accolades everywhere. He has a huge and healthy respect for the professions and professionalism. And he had more to say.

“In education, it’s gotten to the point where it’s, why even bother to go to school? It’s all available online. You can learn what you want, your own way.”

Then he paused. “But I still wouldn’t go to a surgeon who didn’t have the credentials.”

Ah, the credentials. Is that why people still go to law school, med school, get a CPA, a teaching certificate?

Yes, in part, but the world of professional credentials is changing. I talked about this in a post last March called Strange Bedfellows: Commercial Law and Legal Ethics. Here’s an excerpt:

Peer-to-peer is what’s driving the new sharing economy. Consider this from a recent article in Time Magazine:

The key to [the sharing economy] was the discovery that while we totally distrust strangers, we totally trust people — significantly more than we trust corporations or governments. Many sharing-company founders have one thing in common: they worked at eBay and, in bits and pieces, recreated that company’s trust and safety division. Rather than rely on insurance and background checks, its innovation was getting both the provider and the user to rate each other, usually with one to five stars. That eliminates the few bad actors who made everyone too nervous to deal with strangers.

In that post, I made these two predictions (among others):

  • The peer-to-peer dynamic will prevail in significant economic sectors — including the professional service sector of which the legal profession is a part.
  • The resulting consumer satisfaction data will have a curious side effect as a new kind of legal ethics watchdog.

As for the latter, I said this:

Peer-to-peer is the ultimate in self-policing, which makes its extension to legal ethics unlikely but logical. Rule 8.3 — the duty to report unethical behavior among our peers — has long been a part of the Model Rules of Professional Conduct, but has been more honored in the breach than the observance. The new, democratized marketplace will take this matter into its own hands.

In other words, the professional paradigm will shift — in fact, is already shifting — to include peer-to-peer review as an alternative form of professional credentialing.

True, the typical consumer still wants law school and bar admittance credentials for the legal equivalent of surgery, but for the rest, we’re seeing a major shift in consumer attitudes toward my friend’s way of thinking — to the point where the consumer is more likely to buy from someone (lawyer or not, which is its own topic) who gets 20 five-star ratings for estate planning offered at a reasonable price (which my buddy gave as 10 percent of what the seminar lawyers were charging). They’ve got the creds the consumer wants… just a different kind.

Like it or not, it’s happening out there in the New Economy marketplace, and we’ll see more of it in our house. We’re not all the way to lawyers posting client ratings on a five-star scale yet, but one day… I’ll bet it happens. I also bet that day will come way sooner than most lawyers would care to predict.

 

For Bill Gates’ take on the value of college credentials, check out his post yesterday on LinkedIn Pulse.

And for a toe dip into the New Economy, take a look here and here.

Check out this collection of last year’s Future of Law blog posts. It’s a FREE download. Also included is the Culture of Law series from the second half of 2015. Click this link or the cover for downloading details.

Professional Paradigms New and Old (Part 1): The Future Is Here, And We’re Not In It

For the first six months of 2015, this blog ran a series on the Future of Law. About halfway through, I discovered the work of law futurist Richard Susskind, and quoted his books several times after that.

Richard and his son Daniel recently teamed up to publish The Future of the Professions: How Technology Will Transform the Work of Human Experts.

The book takes commitment to get through — it is exhaustively (sometimes exhaustingly) researched, and written with the painstaking (sometimes painful in its meticulousness) logic of philosophy (or a legal brief). But if you want to make your own contribution to the future of the profession, it’s an absolute must-read.

Among other things, you’ll find lots of news about new practice models and technologies — not just in law, but in the other professions as well — which gives a sense of the vastness of the paradigm shift currently well underway in all the professions.

Here’s how the book summarizes its message:

[T]he professions are our current solution to a pervasive problem, namely, that none of us has sufficient specialist knowledge to allow us to cope with all the challenges that life throws at us. We have limited understanding, and so we turn to doctors, lawyers, teachers, architects, and other professionals because they have ‘practical expertise’ that we need to bring to bear in our daily lives. In a print-based society, we have interposed the professions, as gatekeepers, between individuals and organizations, and the knowledge and experience to which they need access.

In the first two parts of the book we describe the changes taking place within the professions, and we develop various theories (largely technological and economic) that lead us to conclude that, in the future—in the fully fledged, technology-based Internet society—increasingly capable machines, autonomously or with non-specialist users, will take on many of the tasks that currently are the exclusive realm of the professions.

While we do not anticipate an overnight, big-bang revolution, equally we do not expect a leisurely evolutionary progression into the post-professional society. Instead, we predict what we call an ‘incremental transformation’ in the way in which we organize and share expertise in society, a displacement of the traditional professions in a staggered series of steps and bounds. Although the change will come in increments, its eventual impact will be radical and pervasive.

In other words, the professions as we have known them are facing the full implications of a massive paradigm shift from analog to digital in how we create, curate, and communicate wisdom, expertise, and specialized knowledge. The old paradigm relied on manuscripts and human brains; the new is proliferated in digitized forms most of us can barely conceive of.

The result? Let’s put it this way: the Susskinds could have called their book not the Future of the Professions, but the End of the Professions.

As I’ve said before, this paradigm shift is way bigger than our individual opinions of it. This series will offer some thoughts on how we reckon with it.

 

For last year’s version of the Future of Law, check out this collection of those blog posts. It’s a FREE download. Also included is the Culture of Law series from the second half of 2015. Click this link or the cover for details.

The Anti-Motivation Strategy (Part 8): Last Lessons From a Couple Personal Ethos Heroes


 

Last week’s post introduced the concept of personal ethos — your core, essential self, the inner drive that defines you, that will be expressed simply because you are alive on this planet, here and now, doing what you do, for no other reason than that’s what you do. You don’t need motivation to do that. Besides, it’s what you do best, and love doing to boot.

Let’s end this series with a couple sports stories. Bear with me if that’s not your thing, but it’s a nice wrap up.

I once heard an interview in which Michael Jordan’s father said, “God decided to make a perfect basketball player, so he made Michael.” He wasn’t the only one who used that kind of language to describe his son. At the end of the 1986 season, Jordan came back from a broken foot (too early, risking his career, the experts said) and played only 18 games, then burned the Boston Celtics for 63 points in a playoff game, causing Larry Bird to famously remark,

“That was God disguised as Michael Jordan.”

It was a folksy thing to say. But what if he was right… I mean, really right? What if the sentence could be completed by someone observing your life and saying, “That was God disguised as [your name]”? Where would you find something that strong?

By looking at what you already do. What you’ve always done. What you’re going to do anyway, because that’s what you do, and you love doing it and you’re good at it.

When God decided to make a perfect _______, he made [your name].

I know, it sounds corny, but try it on. Go ahead — it won’t kill you. According to the concept of personal ethos, those bold statements are not a reach for a stress-fueled motivational challenge; they’re facts that come from the essence of who you are, at the level of your deepest, core self.

Tap that, and you can quit making trips to the dry, stressful motivation well. You won’t need it. You will do what you will do, irrepressibly and indomitably. You won’t be able to help it; you won’t want to. That’s what it means to operate from your personal ethos.


Sure, you’ll face the challenge of staying focused on individual and collective mission and goals, but you face that challenge already anyway. Only now you’ll face it with more honesty, authenticity, and laser focus. Which means you can expect more explosive results. You’ll become this:

[Image: Michael Jordan]

Now, isn’t that a whole lot better than the carrot and stick you’ve been waving around?

If you’re interested in more about personal ethos, I wrote two books about it. Both are available as FREE downloads. For more, click the book covers.

 


One reader said this: “Running For My Life is a unique and thought-provoking read. On the surface it is a story about a man with primary progressive MS reshaping his life through a strict diet and extreme exercise regimen. However, if you take the time to explore the pages, you will find that it is really a story about Kevin and about yourself. This book invites you to take a look inwards at your own limitations, and then holds your hand as you figure out how to push past them together.”

 

Ethos is a stand-alone version of Book Three of Running For My Life. It is a Personal Ethos Credo — the things I believe about it, and how I practice it.