December 10, 2018

There’s No Such Thing as a Free Lunch — True or False?

Last time, we were introduced to the idea of a universal basic income (UBI). We can assume that the pros and cons have been thoroughly researched and reasonably analyzed, and that each side holds its position with utmost conviction.

We can also assume that none of that reasonableness and conviction will convert anyone from one side to the other, or win over the uncommitted. Reason doesn’t move us: we use it to justify what we already decided, based on what we believe. See “Why Facts Don’t Change Our Minds,” The New Yorker (February 2017) and “This Article Won’t Change Your Mind,” The Atlantic (March 2017).

History doesn’t guide us either — see “Why We Refuse to Learn From History” from Big Think and Why Don’t We Learn From History? from military historian Sir Basil Henry Liddell Hart. The latter contains conventional wisdom such as this:

The most instructive, indeed the only method of learning to bear with dignity the vicissitude of fortune, is to recall the catastrophes of others.

History is the best help, being a record of how things usually go wrong.

There are two roads to the reformation for mankind — one through misfortunes of their own, the other through the misfortunes of others; the former is the most unmistakable, the latter the less painful.

I would add that the only hope for humanity, now, is that my particular field of study, warfare, will become purely a subject of antiquarian interest. For with the advent of atomic weapons we have come either to the last page of war, at any rate on the major international scale we have known in the past, or to the last page of history.

That seems like good advice, but it mostly goes unheeded. It seems we’d rather make our own mistakes.

If reasoned analysis and historical perspective don’t inform our responses to radically new ideas like UBI, then what does? Many things, but cultural belief is high on the list. Policy is rooted in culture, culture is rooted in shared beliefs, and beliefs are rooted in history. Cultural beliefs shape individual bias, and the whole belief system becomes sacred in the culture’s mythology. Try to subvert cultural beliefs, and the response is outrage and entrenchment.

All of which means that each of us probably had a quick true or false answer to the question in this week’s blog post title, and was ready to defend it with something that sounded reasonable. Our answer likely signals our kneejerk response to the idea of UBI. The “free lunch” — or, more accurately, “free money” — issue appears to be the UBI Great Divide: get to that point, and you’re either pro or con, and there’s no neutral option. (See this for more about where the “no free lunch” phrase came from.[1])

The Great Divide is what tanked President Nixon’s UBI legislation. The plan, which would have paid a family of four $1,600/year (equivalent to $10,428 today), was set to launch in the midst of an outpouring of political self-congratulation and media endorsement, only to be scuttled by a memo from a White House staffer that described the failure of a British UBI experiment 150 years earlier. UBI, the memo implied, was a free lunch after all; its fate was thus sealed.

As it turns out, whether the experiment actually failed was lost in a 19th Century fog of cultural belief: opponents of the experiment pounced on a bogus report about its impact to justify passing the Poor Law Amendment Act of 1834 — which is what they wanted to do anyway. The new Poor Law was that era’s version of workfare, generated by the worst kind of scarcity mentality applied to the worst kind of scarcity. Besides creating the backdrop to Charles Dickens’ writing, the new Poor Law’s philosophical roots still support today’s welfare system:

The new Poor Law introduced perhaps the most heinous form of “public assistance” that the world has ever witnessed. Believing the workhouses to be the only effective remedy against sloth and depravity, the Royal Commission forced the poor into senseless slave labor, from breaking stones to walking on treadmills. . . .

For the whole history lesson, see “The Bizarre Tale Of President Nixon’s Basic Income Plan.”

And so we’re back to asking whether UBI is a free lunch or not. If it is, then it’s an affront to a culture that values self-sufficiency. If it isn’t, then it requires a vastly different cultural value system to support it. The former believes that doing something — “making a living” at a job — is how you earn your daily bread. The latter believes you’re entitled to sustenance if you are something: i.e., a citizen or member of the nation, state, city, or other institution or community providing the UBI. The former is about activity, the latter is about identity. This Wired article captures the distinction:

The idea [of UBI] is not exactly new—Thomas Paine proposed a form of basic income back in 1797—but in this country, aside from Social Security and Medicare, most government payouts are based on individual need rather than simply citizenship.

UBI is about “simply citizenship.” It requires a cultural belief that everybody in the group shares its prosperity. Cultural identity alone ensures basic sustenance — it’s a right, and that right makes Poor Laws and workfare obsolete.

The notion of cultural identity invites comparison between UBI and the “casino money” some Native American tribes pay their members. How’s that working? We’ll look at that next time.


[1] Yes, Milton Friedman did in fact say it, although he wasn’t the only one. And in a surprising twist, he has been criticized for advocating his own version of UBI.

 

Kevin Rhodes studies and writes about economics in an effort to understand the world his kids are growing up in, which is also the world he’s growing old in. You might enjoy his latest LinkedIn Pulse article “The Fame Monster: Rockstars And Rockstar Entrepreneurs.”

Old Dog, Old Trick, New Showtime

Blockchain consultant and futurist Michael Spencer called it a conspiracy by the 0.01 percenters to enslave the rest of us for good.[1] A growing number of those 0.01 percenters have already supported it, but they’re not alone: this poll conducted shortly after the 2016 election showed that half of Americans supported it as well. A parade of think tanks (here’s one) and other professional skeptics (more than I can cite with hyperlinks in a single sentence) have given it a thorough vetting and mostly concluded: yeah, well, maybe it’s worth a try.

What is “it”? This idea: give the poor what they lack — money. Ensure everyone a livable income while getting rid of the expensive and draconian welfare system. And just to be fair, go ahead and give everyone else money, too, even the billionaires.

The idea mostly goes by the name “universal basic income” (UBI). It’s rooted in the futuristic fear that technology will eventually put humans out of work. That’s not a new fear: UBI is “far from a new idea,” says Martin Ford, another Silicon Valley entrepreneur and a popular TED talker, in his New York Times bestseller Rise of the Robots: Technology and the Threat of a Jobless Future.

In the context of the contemporary American political landscape . . . a guaranteed income is likely to be disparaged as “socialism” and a massive expansion of the welfare state. The idea’s historical origins, however, suggest something quite different. While a basic income has been embraced by economists and intellectuals on both sides of the political spectrum, the idea has been advocated especially forcefully by conservatives and libertarians.

Friedrich Hayek, who has become an iconic figure among today’s conservatives, was a strong proponent of the idea. In his three-volume work, Law, Legislation and Liberty, published between 1973 and 1979, Hayek suggested that a guaranteed income would be a legitimate government policy designed to provide against adversity, and that the need for this type of safety net is the direct result of the transition to a more open and mobile society where many individuals can no longer rely on traditional support systems:

There is, however, yet another class of common risks with regard to which the need for government action has until recently not been generally admitted. . . . The problem here is chiefly the fate of those who for various reasons cannot make their living in the market . . . that is, all people suffering from adverse conditions which may affect anyone and against which most individuals cannot alone make adequate protection but in which a society that has reached a certain level of wealth can afford to provide for all.

The possibility of massive technological unemployment was already on the radar back in the 60s, when a self-convened “Ad Hoc Committee on the Triple Revolution” studied the topic. The Committee included Swedish economist and sociologist Gunnar Myrdal, who would later share the Nobel Prize in Economics with Friedrich Hayek.[2] Rise of the Robots describes the Committee’s findings:

“Cybernation” (or automation) would soon result in an economy where “potentially unlimited output can be achieved by systems of machines which will require little cooperation from human beings.” The result would be massive unemployment, soaring inequality, and, ultimately, falling demand for goods and services as consumers increasingly lacked the purchasing power necessary to continue driving economic growth.

The Ad Hoc Committee went on to propose a radical solution: the eventual implementation of a guaranteed minimum income made possible by the “economy of abundance” such widespread automation would create, and which would “take the place of the patchwork of welfare measures” that were then in place to address poverty.

The Triple Revolution report was released to the media and sent to President Johnson, the secretary of labor, and congressional leaders in March 1964. An accompanying cover letter warned ominously that if something akin to the report’s proposed solutions was not implemented, “the nation will be thrown into unprecedented economic and social disorder.” A front-page story with extensive quotations from the report appeared in the next day’s New York Times, and numerous other newspapers and magazines ran stories and editorials (most of which were critical), in some cases even printing the entire text of the report.

The Triple Revolution marked what was perhaps the crest of a wave of worry about the impact of automation that had arisen following World War II. The specter of mass joblessness as machines displaced workers had incited fear many times in the past — going all the way back to Britain’s Luddite uprising in 1812 — but in the 1950s and 60s, the concern was especially acute and was articulated by some of the United States’ most prominent and intellectually capable individuals.

Four months after the Johnson administration received the Triple Revolution report, the president signed a bill creating the National Commission on Technology, Automation, and Economic Progress. In his remarks at the bill’s signing ceremony, Johnson said that “automation can be the ally of our prosperity if we will just look ahead, if we will understand what is to come, and if we will set our course wisely after proper planning for the future.” The newly formed Commission then . . . quickly faded into obscurity.

A few years later, Richard Nixon introduced UBI legislation that he called “The most significant piece of social legislation in our nation’s history.” That legislation also faded into obscurity — more on that another time.

Thus, UBI is an old idea responding to an old fear: how do we make a living if we can’t work for it? A half century after LBJ and Nixon, that fear is all too real, and lots of people think it might be time for the historical UBI solution to make its appearance.

But not everyone is jumping on the UBI bandwagon. The very thought that jobs might not be the source of our sustenance is what rallies UBI’s most strident opponents.

More on UBI next time.


[1] Spencer followed with a similarly scathing assessment in this article.

[2] Myrdal’s study of race relations was influential in Brown v. Board of Education. He was also an architect of the Swedish social democratic welfare state. Hayek and Myrdal were jointly awarded the Nobel Prize in Economics in 1974.

The Success Delusion

How did the social safety net turn into a poverty trap? It was a victim of the success of the job as an economic force.

Psychologists call it “the success delusion.” You do something and get a result you like, so you keep doing it, expecting more of the same. It keeps working until one day it doesn’t. Do you try something new? No, you double down — it worked before, surely it will work again. You keep doubling down until you’ve made a mess.

You’re a victim of your own success. If you could listen, hindsight would tell you that there was more to it than what you were doing, that a lot of what happened was you being in the right place at the right time. You might believe that or not, but what matters now is that the times have changed and you didn’t.

That’s what happened to social welfare. Forty years of post-WWII economic success positioned the steady job as the cornerstone of economic prosperity and upward mobility. Then, in the 80s and 90s, about the time the job was starting to lose its economic vitality, policy-makers doubled down on it: work had raised the welfare of the whole world since the days of the telegraph and railroad, and surely it was still the best route out of poverty. So now we had workfare instead of welfare, and, as we saw last time, social welfare became “a system of suspicion and shame.”

Standin’ in line marking time
Waiting for the welfare dime
‘Cause they can’t buy a job
The man in the silk suit hurries by
As he catches the poor old lady’s eyes
Just for fun he says, “Get a job.”

“That’s The Way It Is”
Bruce Hornsby and the Range

Rutger Bregman sums it up this way:

We’re saddled with a welfare state from a bygone era when the breadwinners were still mostly men and people spent their whole lives working at the same company. The pension system and employment protection rules are still keyed to those fortunate to have a steady job, public assistance is rooted in the misconception that we can rely on the economy to generate enough jobs, and welfare benefits are often not a trampoline, but a trap.

Utopia for Realists (2017).

Guy Standing explains it this way:

The period from the nineteenth century to the 1970’s saw what Karl Polanyi, in his famous 1944 book, dubbed “The Great Transformation.”

The essence of labourism was that labour rights — more correctly, entitlements — should be provided to those (mostly men) who performed labour and to their spouses and children.

Those in full-time jobs obtained rising real wages, a growing array of “contributory” non-wage benefits, and entitlements to social security for themselves and their family. As workers previously had little security, this was a progressive step.

Labourism promoted the view that the more labour people did, the more privileged they should be, and the less they did the less privileged they should be. The ultimate fetishism was Lenin’s dictate, enshrined in the Soviet constitution, that anybody who did not labour should not eat.

The labourist model frayed in the 1980’s, as labour markets became more flexible and increasing numbers of people moved from job to job and in and out of employment.

To defend labour-based welfare, social democratic governments turned to means testing, targeting benefits on those deemed the deserving poor.

The shift to means testing was fatal. As previous generations of social democrats had understood, benefits designed only for the poor are invariably poor benefits and stand to lose support among the rest of society.

Ironically, it was mainly social democratic parties that shifted policy towards workfare, requiring the unemployed to apply for non-existent or unsuitable jobs, or to do menial, dead-end jobs or phony training courses in return for increasingly meagre benefits.

Today, we are living in a Second Gilded Age — with one significant difference. In the first, which ended in the Great Crash of 1929, inequality grew sharply but wages on average rose as well. The Second Gilded Age has also involved growing inequality, but this time real wages on average have stagnated or fallen. Meanwhile, those relying on state benefits have fallen further behind, many pushed into homelessness, penury and dependence on inadequate private charity.

Since the 1980s, the share of income going to labour has shrunk, globally and in most countries of economic significance. . . . The labour share fell in the USA from 53 per cent in 1970 to 43.5 per cent in 2013. Most dramatically, it slid by over twenty percentage points in China and also dropped steeply in the rising industrial giant of South Korea.

Besides falling wages, there has been an increase in wage differentials and a less-documented decline in the share of people receiving non-wage benefits, such as occupational pensions, paid holidays, sick leave or medical coverage. Thus worker compensation, in terms of “social income,” has fallen by more than revealed by wages alone.

As a consequence of these developments, “in-work poverty” has rocketed. In some OECD countries [the Organisation for Economic Cooperation and Development has 34 industrialized member countries], including Britain, the USA, Spain and Poland, a majority of those in poverty live in households where at least one person has a job.

The mantra that “work is the best route out of poverty” is simply false.

The Corruption of Capitalism (2016).

Not only are jobs doing a poor job at social welfare — for employed and unemployed alike — but they are themselves an endangered species. More to come…

 


Archeconomics

I made up the term “archeconomics.” I’m using “arch” in the sense of “first principles” — e.g., as in “archetype.” An “arch” is the larger version of the smaller expressions of itself — e.g., not just a villain but an arch-villain, not just an angel but an archangel. Life goes big when an arch-something is at work: experience expands beyond circumstance, meaning magnifies, significance is exaggerated.

Archeconomics is therefore the larger story behind economics.

I ended last week’s post by referring to the larger story behind the rentier economy. As usually happens when I’m on a research trail, several commentaries have appeared in my various feeds lately that look beyond the usual opinionated mash of current events and instead address over-arching ideas and issues. All of them deal in one way or another with the current status and possible future of the liberal worldview — an arch-topic if there ever was one.

The term “liberal” in this context doesn’t refer to political liberal vs. conservative, but rather to historical liberalism, which among other things gave us post-WWII neo-liberal economics. Mega-bestselling author Yuval Noah Harari describes this kind of liberalism in his latest book 21 Lessons for the 21st Century:

In Western political discourse the term “liberal” is sometimes used today in a much narrower sense, to denote those who support specific causes such as gay marriage, gun control, and abortion rights. Yet most so-called conservatives also embrace the broad liberal worldview.

The liberal story cherishes human liberty as its number one value. It argues that all authority ultimately stems from the free will of individual humans, as expressed in their feelings, desires, and choices. In politics, liberalism believes that the voter knows best. It therefore upholds democratic elections. In economics, liberalism maintains that the customer is always right. It therefore hails free-market principles. In personal matters, liberalism encourages people to listen to themselves, be true to themselves, and allow their hearts — as long as they do not infringe on the liberties of others. This personal freedom is enshrined in human rights.

If you read Harari’s books Sapiens and Homo Deus, you have a sense of what you’ll find in 21 Lessons, but I found it worth reading on its own terms. Two recent special magazine editions also take on the fate of liberalism: “Is Democracy Dying?” from The Atlantic and “A Manifesto for Renewing Liberalism” from The Economist. The titles speak for themselves, and both are offered by publications with nearly two centuries of liberal editorial perspectives.

Another take on historical liberalism, this one from a conservative political point of view, is “How Trumpism Will Outlast Trump,” from Time Magazine. Here’s the article’s précis:

These intellectuals are committed to a new economic nationalism . . . They’re looking past Trump . . . to assert a fundamental truth: whatever you think of him, Donald Trump has shown a major failing in the way America’s political parties have been serving their constituents. The future of Trump’s revolution may depend on whether this young group can help fix the economy.

Finally, here’s a trio of offerings that invoke environmental economics — the impact of the global ecology on global economics being another archeconomics topic. The first is a scientific study published last week that predicted significant environmental degradation within a surprisingly short time. Second is an article about the study that wants to know “Why We Keep Ignoring Even the Most Dire Climate Change Warnings.” Third is last week’s announcement that the winner of this year’s Nobel Prize in Economics is an environmental economist.

Some or all of those titles should satisfy if you’re in the mood for some arch-reading.

Next time, we’ll return to plain old economics, with a look at how the low-income social strata are faring in all the dust-up over rentiers and economic inequality, robotics and machine learning, and the sagging paycheck going to human labor.

 


The Rentier Economy — Primer Part 1

As we saw last week, the original Monopoly game — then known as The Landlord’s Game — offered a choice of two different games, one played under “Prosperity” rules and the other under “Monopoly” rules. The post-WWII economic surge was a real-life Prosperity game: it generated a rising tide of economic benefit that floated all boats across all social classes. The surge peaked in the 1970’s, and since then the Monopoly rules have increasingly asserted themselves, resulting in, among other things, stagnant employee compensation (except for the top 10%) and rising returns to capital owners — the lion’s share paid in the form of rents. The latter reflects the rise of a “rentier economy.”

First, we need to define “rent”:

Economists use the term ‘rent’ in a special way. For them, rent refers . . . to the excess payment made to any factor of production (land, labor, or capital) due to scarcity.

The scarcity factor that gives rise to rents can be natural, as with the case of land.

But rents can also arise from artificial scarcity — in particular, government policies that confer special advantages on favored market participants.

The Captured Economy: How the Powerful Enrich Themselves, Slow Down Growth, and Increase Inequality, Brink Lindsey and Steven Teles (2017).

And “rentier”:

A rentier is someone who gains income from possession of assets, rather than from labour. A rentier corporation is a firm that gains much of its revenue from rental income rather than from production of goods and services, notably from financial assets or intellectual property. A rentier state has institutions and policies that favour the interests of rentiers. A rentier economy is one that receives a large share of income in the form of rent.

The Corruption of Capitalism: Why Rentiers Thrive and Work Does Not Pay, Guy Standing (2016)
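Since it’s easy to lose the mechanics in the definitions, here’s a minimal sketch of how artificial scarcity generates rent, using a toy linear demand curve. The numbers are made up by me for illustration; nothing here comes from Lindsey and Teles or Standing.

```python
# Toy model: "rent" as the excess payment created by artificial scarcity.
# All quantities and prices are hypothetical.

def willingness_to_pay(quantity, intercept=100.0, slope=0.5):
    """Inverse demand curve: the price buyers will pay at a given quantity."""
    return intercept - slope * quantity

# Competitive market: firms enter freely until price falls to unit cost.
competitive_quantity = 120
unit_cost = willingness_to_pay(competitive_quantity)  # $40: price = cost, zero rent

# Now a licensing rule caps supply below the competitive level.
capped_quantity = 80
capped_price = willingness_to_pay(capped_quantity)    # $60: scarcity raises the price

rent_per_unit = capped_price - unit_cost              # $20 above cost
total_rent = rent_per_unit * capped_quantity          # $1,600 flows to license holders
print(f"rent per unit: ${rent_per_unit:.0f}; total rent: ${total_rent:,.0f}")
```

The license holders’ extra $20 a unit is income from owning the license, not from producing anything more: Standing’s rentier in miniature.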

Economists didn’t see the rentier economy coming. They especially didn’t foresee how government policy would create it. The following is from The Corruption of Capitalism:

John Maynard Keynes, the most influential economist of the mid-twentieth century, famously dismissed the rentier as the ‘functionless investor’ who gained income solely from ownership of capital, exploiting its ‘scarcity value.’ He concluded in his epochal General Theory that, as capitalism spread, it would mean the “euthanasia of the rentier,” and, consequently, the euthanasia of the cumulative oppressive power of the capitalist to exploit the scarcity value of capital:

“Whilst there may be intrinsic reasons for the scarcity of land, there are no intrinsic reasons for the scarcity of capital. . . . I see, therefore, the rentier aspect of capitalism as a transitional phase which will disappear when it has done its work.”

Keynes was mistaken because he did not foresee how the neoliberal framework built since the 1980’s would allow individuals and firms to generate ‘contrived scarcity’ of assets from which to gain rental income. Nor did he foresee how the modern ‘competitiveness’ agenda would give asset owners power to extract rental subsidies from the state.

Eighty years later, the rentier is anything but dead; rentiers have become the main beneficiaries of capitalism’s emerging income distribution system.

The old income distribution system that tied income to jobs has disintegrated.

And this is from The Captured Economy:

The last few decades have been a perplexing time in American economic life. Following a temporary spike during the Internet boom of the 1990’s, rates of economic growth have been exceptionally sluggish. At the same time, incomes at the very top have exploded while those further down have stagnated.

As a technical matter, rent is a morally neutral concept. . . . Nevertheless, the term ‘rent’ is most commonly used in a moralized sense to refer specifically to bad rents. In particular, the expression ‘rent-seeking’ refers to business activity that seeks to increase profits without creating anything of value through distortions to market processes, such as constraints on the entry of new firms.

Those advantages can also take the form of subsidies or rules that impose extra burdens on both existing and potential competitors. The rents enjoyed through government favoritism not only misallocate resources in the short term but they also discourage dynamism and growth over the long term. Their existence encourages an ongoing negative-sum scramble for more favors instead of innovation and the diffusion of good ideas.

Economists have had an explanation for the latter trend, which is that returns to skill have increased dramatically, largely because of globalization and information technology. There is clearly something to this explanation, but why should the more efficient operation of markets be accompanied by a decline in economic growth?

Our answer is that increasing returns to skill and other market-based drivers of rising inequality are only part of the story. Yes, in some ways the US economy has certainly grown more open to the free play of market forces during the course of the past few decades. But in other ways, economic returns are now determined much more by success in the political arena and less by the forces of market competition. By suppressing and distorting markets, the proliferation of regulatory rents has also led to less wealth for everyone.

To be continued.

 


The Landlord’s Game

“Buy land – they aren’t making it anymore.”
Mark Twain

You know how Monopoly games never end? A group of academicians wanted to know why. Here’s an article about them, and here’s their write-up. Their conclusion? Statistically, a game of Monopoly played casually (without strategy) could in fact go on forever.
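If you want to see how a game can fail to end, here’s a toy cash-flow simulation of casual play — my own crude sketch, not the academics’ Markov-chain analysis. In it, rents merely shuffle money between two players while passing Go keeps injecting new cash, so each bankroll drifts upward and bankruptcy becomes a long shot.

```python
import random

def casual_game(max_rounds=50_000, seed=None):
    """Toy no-strategy Monopoly between two players.

    Random rents pass back and forth, both players collect $200 for
    passing Go, and nobody builds houses or hotels. Returns the round
    on which someone went bankrupt, or None if the game outlasted
    max_rounds.
    """
    rng = random.Random(seed)
    cash = [1_500, 1_500]                 # standard starting money
    for rounds in range(1, max_rounds + 1):
        payer = rng.randrange(2)          # someone lands on the other's property
        rent = rng.randint(2, 400)        # unimproved rents stay modest
        cash[payer] -= rent
        cash[1 - payer] += rent
        for p in (0, 1):
            if rng.random() < 0.2:        # roughly one lap every five turns
                cash[p] += 200            # the Go salary refills both stacks
        if min(cash) < 0:
            return rounds                 # a bankruptcy actually ended it
    return None

results = [casual_game(seed=i) for i in range(200)]
unfinished = sum(r is None for r in results)
print(f"{unfinished}/200 toy games still unfinished after 50,000 rounds")
```

In this toy version, most games never reach bankruptcy; ending one takes a deliberate strategy.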

I once played a game that actually ended. I had a strategy: buy everything you land on, build houses and hotels as fast as possible, and always mortgage everything to the hilt to finance acquisition and expansion. I got down to my last five dollars before I bankrupted everybody else. It only took a couple hours. Okay, so the other players were my kids. Some example I am. Whatever economic lessons we might have gained from the experience, they certainly weren’t what the game’s creator had in mind.

While Andrew Carnegie and friends were getting rich building American infrastructure, industry, and institutions, American society was experiencing a clash between the new rich and those still living in poverty. In 1879, economist Henry George proposed a resolution in his book Progress and Poverty: An Inquiry into the Cause of Industrial Depressions and of Increase of Want with Increase of Wealth: The Remedy.

Travelling around America in the 1870s, George had witnessed persistent destitution amid growing wealth, and he believed it was largely the inequity of land ownership that bound these two forces — poverty and progress — together. So instead of following Twain by encouraging his fellow citizens to buy land, he called on the state to tax it. On what grounds? Because much of land’s value comes not from what is built on the plot but from nature’s gift of water or minerals that might lie beneath its surface, or from the communally created value of its surroundings: nearby roads and railways; a thriving economy, a safe neighborhood; good local schools and hospitals. And he argued that the tax receipts should be invested on behalf of all.

From “Monopoly Was Invented To Demonstrate The Evils Of Capitalism,” by new economist Kate Raworth.[1]

George’s book eventually reached the hands of Elizabeth Magie, the daughter of newspaperman James Magie and a social change rabble-rouser in her own right. Influenced by her father’s politics and Henry George’s vision, she created The Landlord’s Game in 1904 and gave it two sets of rules, intending for it to be an economic learning experience. Again quoting from Ms. Raworth’s article:

Under the ‘Prosperity’ set of rules, every player gained each time someone acquired a new property (designed to reflect George’s policy of taxing the value of land), and the game was won (by all!) when the player who had started out with the least money had doubled it. Under the ‘Monopolist’ set of rules, in contrast, players got ahead by acquiring properties and collecting rent from all those who were unfortunate enough to land there — and whoever managed to bankrupt the rest emerged as the sole winner (sound a little familiar?).

The purpose of the dual sets of rules, said Magie, was for players to experience a ‘practical demonstration of the present system of land grabbing with all its usual outcomes and consequences’ and hence to understand how different approaches to property ownership can lead to vastly different social outcomes.

The game was soon a hit among Left-wing intellectuals, on college campuses including the Wharton School, Harvard and Columbia, and also among Quaker communities, some of which modified the rules and redrew the board with street names from Atlantic City. Among the players of this Quaker adaptation was an unemployed man called Charles Darrow, who later sold such a modified version to the games company Parker Brothers as his own.

Once the game’s true origins came to light, Parker Brothers bought up Magie’s patent, but then re-launched the board game simply as Monopoly, and provided the eager public with just one set of rules: those that celebrate the triumph of one over all. Worse, they marketed it along with the claim that the game’s inventor was Darrow, who they said had dreamed it up in the 1930s, sold it to Parker Brothers, and become a millionaire. It was a rags-to-riches fabrication that ironically exemplified Monopoly’s implicit values: chase wealth and crush your opponents if you want to come out on top.

“Chase wealth and crush your opponents” — that was my winning Monopoly strategy. It requires a shift away from the labor economy — selling things workers make or services they provide — to the rentier economy — owning assets you can charge other people to access and use. The scarcer the assets, the more you can charge. Scarcity can be natural, as is the case with land, or it can be artificial, the result of the kind of “regressive regulation” we looked at last time, that limits access to capital markets, protects intellectual property, bars entry to the professions, and concentrates high-end land development through zoning and land use restrictions.

Artificial scarcity can also be the result of cultural belief systems — such as those that underlie the kind of stuff that shows up in your LinkedIn and Facebook feeds: “7 Ways to Get Rich in Rental Real Estate” or “How to Create a Passive Income From Book Sales and Webinars.” In fact, it seems our brains are so habitually immersed in Monopoly thinking that proposals such as Henry George’s land ownership tax — or its current equivalents such as superstar economist Thomas Piketty’s wealth tax, Harvard law and ethics professor Lawrence Lessig’s notions of a creative commons, or the widely-studied and broadly-endorsed initiation of a “universal basic income” — are generally tossed off as hopelessly idealistic and out of touch.

More to come.


[1] Kate Raworth holds positions at both Oxford and Cambridge. We previously looked at her book Doughnut Economics: Seven Ways to Think Like a 21st-Century Economist (2017).

 


The Pledge

19th Century Steel Baron Andrew Carnegie was (a) more than okay with the right to make as much money as you want, but (b) not okay with spending it any old way you like. He had some very specific notions about the latter:[1]

By the late 1880s, Carnegie’s place as one of the wealthiest men in the United States was cemented… With the time afforded him as the controlling shareholder, Carnegie put forth theories on capitalism, the human condition, and the American Republic. In 1889, Carnegie wrote an article simply titled “Wealth” — it would soon become known as “Gospel of Wealth.” . . . In it he offered an unapologetic defense of the system that enabled great wealth such as his.

[Carnegie believed that] the price for… material progress — “cheap comforts and luxuries” — was great wealth inequality. . . . Any thinking person, Carnegie surmised, would conclude “that upon the sacredness of property civilization itself depends — the right of the laborer to his hundred dollars in the savings bank, and equally the legal right of the millionaire to his millions.” But his defense of capitalism was a setup for a most startling conclusion.

In the article Carnegie argued that the greatest of men, capitalists, should be unencumbered to accumulate wealth. But once great wealth was achieved, these men should, during their lifetimes, give it away. As the possession of wealth was proof to society of great achievement, aptitude, industriousness, and ability, it made little sense that it should be bequeathed to descendants. Inherited wealth would undermine the argument that those with wealth earned it, deserved it.

Next, he held that if men waited until death to give the money away, less competent men unused to large sums would squander it thoughtlessly, however well-intentioned. While Carnegie viewed wealth as a symbol of intellectual mastery, the actual possession of it should be considered only a trust fund, with “the man of great wealth becoming mere trustee for his poorer brethren, bringing to their service his superior wisdom, experience, and ability to administer, doing for them better than they would or could for themselves. The man who dies thus rich, dies disgraced.”

Carnegie was hailed by newspapers, socialists, workingmen, and, more discreetly, even his fellow capitalists . . . for such enlightened views.

Carnegie’s legacy of endowments endures to this day. (I have clear memories of our small town Carnegie library.) Carnegie’s fellow Robber Barons created similarly enduring legacies, such as those reflected in the following names: Johns Hopkins, Leland Stanford, Ezra Cornell, Cornelius Vanderbilt, and James Duke.

Carnegie’s philosophy also endures today, albeit expressed in terms more in tune with the ethos of our times. Consider, for example, the Giving Pledge, formed “in an effort to help address society’s most pressing problems by inviting the world’s wealthiest individuals and families to commit more than half of their wealth to philanthropy or charitable causes either during their lifetime or in their will.”

As of May 2018, 183 individuals or couples from 22 countries had taken the pledge, representing total net worth closing in on a trillion dollars. Some of the Pledgers are household names; most aren’t. I randomly clicked several of their photos on the Giving Pledge home page, which takes you to their statements about why they took the pledge. Noticeably absent is Carnegie’s belief that capitalists are “the greatest of men,” that “the possession of wealth [is] proof to society of great achievement, aptitude, industriousness, and ability,” or that wealth is a “symbol of intellectual mastery.” Nor is there an expressed fear that “less competent men unused to large sums would squander it thoughtlessly, however well-intentioned.” Instead, there’s a certain humility to many of the statements: they often mention lessons learned from forebears or other role models, and often express gratitude for having been “blessed” or gotten lucky, such as this one:

Allow me to start by saying that I am not sure I am a worthy member of this group of extraordinary individuals. I consider that I have been lucky in life.

Other themes in the statements are (a) a recognition that attaining great wealth is not solely a matter of rugged individualism, but that cultural and historical context deserve a lot of credit, and (b) a belief that giving back is a way to honor this reality. I.e., wealth made possible by historical and cultural circumstance ought to benefit all members of that culture, including the most needy. As it turns out, this isn’t just a kind-hearted philosophy of life; it’s a statement of the economic terms upon which much wealth has in fact been created in the past and continues to be created today.

State-sponsored policies that favor timely and innovative ideas and technologies represent a significant type of societal support for wealth creation. We’ll look at that next time.


[1] Americana: A 400-Year History of American Capitalism, Bhu Srinivasan (2017).

 

The Great Gatsby Lawyer

How okay are we, really, with the right of everyone (a) to make as much money as they want, and (b) to spend it any way they like? If we would limit (a) or (b) or both, then how and why?

Consider for a moment what your (a) and (b) responses have been to the upward mobility stories we’ve looked at so far: Richard Reeves, Matthew Stewart, Steven Brill, Travie McCoy, David Boies, Eric, and me. Now consider this story from an article in Above the Law:

[P]ersonal injury attorney Thomas J. Henry threw a lavish bash to celebrate his son, Thomas Henry Jr.’s, 18th birthday. And the price tag for the Gatsby-mixed-with-burlesque-themed fête? A cool $4 million.

To rack up such a hefty bill, the event had lots of performers which included showgirls, aerial performers, art installations, and contortionists (oh my!). Plus, there were musical performances and celebrity guests.

And don’t think the over-the-top party was the only gift the birthday boy received:

The star of the party, who sat on a throne-like chair when he wasn’t dancing, was given a fully loaded blue Ferrari, an IWC Portugieser Tourbillion watch and a custom-made painting from Alec Monopoly.

Henry’s work as a trial attorney is obviously pretty lucrative. The big payouts he’s been able to secure for his clients have made him a member of the Multi-Million Dollar Advocates Forum.[1]

Henry is known for throwing giant parties. Just last year, he spent $6 million for his daughter’s quinceañera. I guess we know which one is really daddy’s favorite.

The writer telegraphs her attitude about the story with the article’s tone and with the understated lead line, “this seems extreme.” Apparently she would cast a vote for limitations on (b). When I’ve shared the story with friends, the response is usually stronger than “this seems extreme.”

I wonder why. Maybe it’s because this looks like a case of conspicuous consumption, which never goes down well. Economist/sociologist Thorstein Veblen coined the term in his 1899 book, The Theory of the Leisure Class, to describe how the newly prosperous middle class were buying things to communicate their move up the social ladder. The neighbors were rarely impressed — that is, until they made their own purchases, and then the game turned into keeping up with the Joneses.

The conspicuous consumption shoe might fit here: Mr. Henry’s website tells a bit of his upward mobility story — German immigrant, raised on a farm in Kansas, etc. Or maybe there’s something going on here that transcends his personal story. In that regard, the term “affluenza” comes to mind.

The term “affluenza” was popularized in the late 1990s by Jessie O’Neill, the granddaughter of a past president of General Motors, when she wrote the book “The Golden Ghetto: The Psychology of Affluence.” It’s since been used to describe a condition in which children — generally from richer families — have a sense of entitlement, are irresponsible, make excuses for poor behavior, and sometimes dabble in drugs and alcohol.

From an article by Fox News. See also these descriptions from CNN and New York Magazine.

Definitions of the term come loaded with their own biases, judgments, and assumptions. This is from Merriam-Webster:

Affluenza: the unhealthy and unwelcome psychological and social effects of affluence regarded especially as a widespread societal problem: such as

feelings of guilt, lack of motivation, and social isolation experienced by wealthy people

extreme materialism and consumerism associated with the pursuit of wealth and success and resulting in a life of chronic dissatisfaction, debt, overwork, stress, and impaired relationships

And this is from the popular PBS series that came out shortly after The Golden Ghetto:

Af-flu-en-za n. 1. The bloated, sluggish and unfulfilled feeling that results from efforts to keep up with the Joneses. 2. An epidemic of stress, overwork, waste and indebtedness caused by dogged pursuit of the American Dream. 3. An unsustainable addiction to economic growth.

Affluenza made quite a splash in the estate planning world where I practiced, spawning a slew of books, CLE presentations, and new approaches to legal counseling and document design. Affluenza went mainstream in 2014 with the highly-publicized trial of Ethan Couch, the “Affluenza Teen,” when a judge reduced his sentence on four counts of intoxication manslaughter and two counts of intoxication assault after an expert witness testified that his wealthy upbringing had left him so psychologically impaired that he didn’t know right from wrong.

For a great number of my clients, that their kids might catch affluenza was their worst nightmare.[2] Their fear suggests this consensus about Thomas Henry’s partying habits:

(a) it’s okay to make all the money you want,

(b) but it’s not okay if you use your money to make your kids a danger to themselves and to others.

I wonder — would it temper our rush to categorize and judge Mr. Henry if we knew his philanthropic history and philosophy? This is from his website:

Mr. Henry’s overall philosophy is that helping others when you have the good fortune of being successful is not an elective decision but a mandatory decision. People who achieve success have a duty to help others.

That statement closely mirrors the beliefs of Robber Baron Andrew Carnegie. We’ll look at that next time, along with the perceptions of other 0.01 percenters about the social responsibilities of wealth.


[1] The Forum’s website says that “fewer than 1% of U.S. lawyers are members,” which appropriately signals Thomas Henry’s position in the economic strata.

[2] I used to tell my clients that if I had a dime for every time a client said, “I don’t want my money to ruin my kids,” I would have been a rich man. That was hyperbole, of course: a dime each time wouldn’t have made me rich. On the other hand, a million dollars each time might have made me a billionaire. A billion is a BIG number.

 

The Matthew Effect

“For to everyone who has will more be given, and he will have abundance;
but from him who has not, even what he has will be taken away.”

The Gospel of Matthew 25:29, Revised Standard Version

Economists call it the Matthew Effect or the Matthew Principle. Columbia sociologist Robert K. Merton used the former when he coined the term[1] by reference to its Biblical origins.[2] The more pedestrian version asserts that the rich get richer while the poor get poorer.

According to the Matthew Effect, social capital is better caught than taught, better inherited than achieved. That notion is borne out by current economic and demographic data[3] showing that the only children with a statistically relevant shot at experiencing a better standard of living than their parents are the ones born with a silver spoon in their mouths — or, as David Graeber says in Bullshit Jobs, the ones “from professional backgrounds” where they are taught essential social capital mindsets and skills “from an early age.”[4]

Statistics are susceptible to ideological manipulation, but bell curves distill trends into something like observable laws of societal thermodynamics. The Matthew Effect bell curve says it’s harder to get to the top by following the Horatio Alger path: you’re starting too many standard deviations out, and your odds are too low. On the other hand, if you start at the top (you were born into it), odds are you’ll stay there.
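The dynamic is easy to watch in miniature. Here’s a minimal Pólya-urn sketch — my illustration, not Merton’s formal model — in which each new dollar lands on a player with probability proportional to the dollars that player already holds:

```python
import random

def polya_urn(players=5, rounds=10_000, seed=42):
    """Pólya urn: each new dollar goes to a player with probability
    proportional to that player's current holdings, so a lucky early
    lead compounds into a lasting one."""
    random.seed(seed)
    wealth = [1] * players            # everyone starts with one dollar
    for _ in range(rounds):
        winner = random.choices(range(players), weights=wealth)[0]
        wealth[winner] += 1           # "to everyone who has will more be given"
    return sorted(wealth, reverse=True)

print(polya_urn())  # typically a heavily skewed split of the 10,005 dollars
```

Change the seed and the winner changes, but the shape doesn’t: whoever gets ahead early tends to stay ahead.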

That might depend, however, on how long your forebears have been members of the club. Globetrotting wealth guru Jay Hughes has spoken and written widely of the concept of “shirt sleeves to shirt sleeves in three generations.” According to the aphorism, if the first generation of a family follows the Horatio Alger path to wealth, there’s a 70% chance the money will be gone by the end of the third generation, which means the social capital will be gone as well. That first generation might defy the odds through hard work and luck, but odds are they won’t create an enduring legacy for their heirs.

My own law career was an exercise in another folk expression of the Matthew Effect: “you can take the boy out of the country but you can’t take the country out of the boy.” My career finally hit its stride when I created a small firm serving “millionaire next door” clients — farmers, ranchers, and Main Street America business owners who became financially successful while remaining in the social milieu where they (and I) began. Nearly all of those families created their wealth during the post-WWII neoliberal economic surge, and are now entering the third generation. I wonder how many are experiencing the shirt sleeves aphorism.

Curiously, my transition out of law practice was also dominated by social capital considerations — in particular, a social capital misfiring. I had a big idea and some relevant skills (i.e., some relevant human capital — at least other people thought so), but lacked the social capital and failed to make the personal transformation essential to my new creative business venture.[5]

In fact, it seems the Matthew Effect might be a larger theme in my life, not just my legal career. In that regard, I was surprised to find yet another one of my job stories in Bullshit Jobs. This one was about a townie who took a job as a farm laborer. His job included “picking rocks,” which involves tackling a rocky field with a heavy pry bar, sledge hammer, pick axe, spade, and brute strength, in an effort to remove the large rocks and make it tillable. I’d had that job, too. I was a teenager at the time, and it never occurred to me that it might be “completely pointless, unnecessary, or pernicious” (Graeber’s definition), which is how the guy in the book felt about it. In fact, when I told my parents over dinner about my first day of picking rocks, my dad was obviously so proud I thought he was going to run out and grill me a steak. Apparently I’d passed some kind of rite of passage.

Picking rocks is just part of what you do if you work the land, and there’s nothing meaningless about it. I enjoyed it, actually — it was great training for the upcoming football season. I can scarcely imagine what my law career and life might have been like if I’d felt the same way about my first years of legal work as I did about picking rocks.

The Matthew Effect has far-reaching social, economic, legal, and ethical implications for the legal profession, where social capital is an important client- and career-development asset. Next time we’ll look at another lawyer who, like David Boies, rose from humble origins to superstar status, and whose story brings a whole new set of upward mobility issues to the table.


[1] Merton was originally trying to describe how it is that more well-known people get credit for things their subordinates do — for example, professors taking credit for the work of their research assistants — the professors enriching their credentials at the expense of their minions’ hard and anonymous work. Merton might just as well have been talking about law partners taking credit for the work of paralegals, law clerks, and associates.

[2] As for why “Matthew” when the other Synoptic Gospels (Mark and Luke) have the same verse, I suspect that’s in part because Matthew is the first book in the New Testament canon, but it may also substantiate a derivative application of Merton’s law made by U of Chicago super-statistician Stephen Stigler, known as the Law of Eponymy, which holds that “No scientific discovery is named after its original discoverer.” I.e., later arrivals collect the accolades the “original discoverer” never did. In that regard, Mark’s gospel is believed to have been written first, with Matthew’s and Luke’s coming later and deriving from it. That would make Mark the true original discoverer. That this economic phenomenon is not called the “Mark Effect” is therefore another example of Stigler’s law.

[3] See, e.g., the “Fading American Dream” graph and the “Geography of Upward Mobility in America” map in this NPR article.

[4] The phenomenon has been widely reported. See this study from Stanford and our trio of new meritocrats from a few weeks back. The first was Richard V. Reeves, with his book Dream Hoarders and his Brookings Institution monograph Saving Horatio Alger (we looked at those last time). The second was philosopher Matthew Stewart, author of numerous books and a recent article for The Atlantic called “The 9.9 Percent Is the New American Aristocracy.” The third was Steven Brill, founder of The American Lawyer and Court TV, author of the book Tailspin: The People and Forces Behind America’s Fifty-Year Fall—and Those Fighting to Reverse It and also the writer of a Time Magazine feature called “How Baby Boomers Broke America.”

[5] I’ve told that story elsewhere, and won’t repeat it here, but if you’re interested in more on this issue, a look at that particular social capital disaster might be illustrative. See my book Life Beyond Reason: A Memoir of Mania.

 

Rebel Without A Cause

Continuing with David Graeber’s analysis of Eric’s job experience from last time:

What drove Eric crazy was the fact that there was simply no way he could construe his job as serving any sort of purpose.

To get a sense of what was really happening here, let us imagine a second history major — we can refer to him as anti-Eric — a young man of a professional background but placed in exactly the same situation. How might anti-Eric have behaved differently?

Well, likely as not, he would have played along with the charade. Instead of using phony business trips to practice forms of self-annihilation, anti-Eric would have used them to accumulate social capital, connections that would eventually allow him to move on to better things. He would have treated the job as a stepping-stone, and this very project of professional advancement would have given him a sense of purpose.

But such attitudes and dispositions don’t come naturally. Children from professional backgrounds are taught to think like that from an early age. Eric, who had not been trained to act and think this way, couldn’t bring himself to do it.

Like Eric, I couldn’t bring myself to do it either — although it was not so much that I couldn’t, it was more a case of not knowing how. I was bright enough, had a knack for the all-important “likeability factor” with clients and colleagues, and worked with lots of clients and other professionals who were members of the Red Velvet Rope Club. But like Eric, I remained on the outside looking in, and I spent a lot of time feeling envious of others who fit in so easily.

Those dynamics dogged the early years of my law career. In time, a general sense of inadequacy became depression, which I compensated for by nursing a rebel-without-a-cause attitude.

My experience didn’t have to be that way. Consider, for example, the story of super-lawyer David Boies. Like Eric and me, Boies was also born to working class parents and grew up in a farming community, but that’s where the resemblance ends. Chrystia Freeland introduces him this way in her book Plutocrats: The Rise of the New Global Super-Rich and the Fall of Everyone Else (2012):

As the world economy grows, and as the super-elite, in particular, get richer, the superstars who work for the super-rich can charge super fees.

Consider the 2009 legal showdown between Hank Greenberg and AIG, the insurance giant he had built. It was a high-stakes battle, as AIG accused Greenberg, through his privately-held company, Starr International, of misappropriating $4.3 billion worth of assets. For his defense, Greenberg hired David Boies. With his trademark slightly ratty Lands’ End suits (ordered a dozen at a time by his office online), his Midwestern background, his proud affection for Middle American pastimes like craps, and his severe dyslexia (he didn’t learn how to read until he was in the third grade), Boies comes across as neither a superstar nor a member of the super-elite. He is both.

Boies and his eponymous firm earned a reputed $100 million for the nine-month job of defending Greenberg. That was one of the richest fees earned in a single litigation. Yet, for Greenberg, it was a terrific deal. When you have $4.3 billion at risk, $100 million — only 2.3 percent of the total — just isn’t that much money. Further sweetening the transaction was the judge’s eventual ruling that AIG, then nearly 80 percent owned by the U.S. government, was liable for up to $150 million of Greenberg’s legal fees, but he didn’t know that when he retained Boies.

What did Boies have that Eric and I didn’t? Well, um, would you like the short list or the long? Boies is no doubt one of those exceptionally gifted and ambitious people who works hard enough to get lucky. I suspect his plutocrat switch was first activated when his family moved to California while he was in high school, and from there was exponentially supercharged by a series of textbook upwardly mobile experiences: a liberal arts education at Northwestern, a law degree from Yale, an LL.M. from NYU, joining the Cravath firm and eventually becoming a partner before leaving to found his own firm.

That’s impressive enough, but there’s more to his story: somehow along the way he was transformed into the kind of person who belongs — in his case, not just to the 9.9% club, but to the 0.1%. Yes, his human capital was substantial, but it was his personal transformation that enabled him to capitalize (I use that term advisedly) on the opportunities granted only by social capital.

And now, if the 9.9 percenters we heard from a couple weeks back are correct, the pathway he followed is even more statistically rare (if that’s even possible) than when he traveled it — in part because of an economic principle that’s at least as old as the Bible.

We’ll talk about that next time.

Eric and Kevin’s Most Excellent Career Adventures


David Graeber’s book Bullshit Jobs is loaded with real-life stories of jobs that meet his definition: “a form of employment that is so completely pointless, unnecessary, or pernicious that even the employee cannot justify its existence even though the employee feels obliged to pretend that this is not the case.” One of those stories rang a bell: it turns out that “Eric” and I had the same job. The details are different, but our experiences involved the same issues of social capital and upward mobility.

Eric grew up in a working class neighborhood, left to attend a major British university, graduated with a history major, landed in a Big 4 accounting firm training program, and took a corporate position that looked like an express elevator to the executive suite. But then the job turned out to be… well, nothing. No one would tell him what to do. He showed up day after day in his new business clothes and tried to look busy while trying in vain to solve the mystery of why he had nothing to do. He tried to quit a couple times, only to be rewarded with raises, and the money was hard to pass up. Frustration gave way to boredom, boredom to depression, and depression to deception. Soon he and his mates at the pub back home hatched a plan to use his generous expense account to travel, gamble, and drink.

In time, Eric learned that his position was the result of a political standoff: one of the higher-ups had the clout to fund a pet project that the responsible mid-level managers disagreed with, so they colluded to make sure it would never happen. Since Eric had been hired to coordinate internal communication on the project, keeping him in the dark was essential. Eventually he managed to quit, kick his gambling and drinking habits, and take a shot at the artistic career he had envisioned in college.

My story isn’t quite so… um, colorful… but the themes are similar. I also came from a strong “work with your hands” ethic and was in the first generation of my family to go to college, where I joined the children of lawyers, neurosurgeons, professors, diplomats, and other upper-echelon white-collar professionals from all 50 states and several foreign countries. At the first meeting of my freshman advisory group, my new classmates talked about books, authors, and academic disciplines I’d never heard of. When I tackled my first class assignment, I had to look up 15 words in the first two pages. And on it went. Altogether, my college career was mostly an exercise in cluelessness. But I was smart and ambitious, and did better than I deserved.

Fast forward nine years, and that’s me again, this time signing on with a boutique corporate law firm as a newly minted MBA/JD. I got there by building a lot of personal human capital, but my steel thermos and metal lunch bucket upbringing was still so ingrained that a few weeks after getting hired I asked a senior associate why nobody ever took morning and afternoon coffee breaks. He looked puzzled, and finally said, “Well… we don’t really take breaks.” Or vacations, evenings, weekends, or holidays, as it turned out.

A couple years later I hired on with a Big 4 accounting firm as a corporate finance consultant. My first assignment was my Eric-equivalent job: I was assigned to a team of accountants tasked with creating a new chart of accounts for a multinational corporation and its subsidiaries. Never mind that the job had nothing to do with corporate finance. Plus there were two other little problems: I didn’t know what a chart of accounts was, and at our first client meeting a key corporate manager announced that he thought the project was ridiculous and intended to oppose it. Undaunted, the other members of the consulting team got to work. Everybody seemed to know what to do, but nobody would tell me, and in the meantime our opponent in management gained a following.

As a result, I spent months on the road, away from home week after week, trying to look busy. I piled up the frequent flyer miles and enjoyed the 5-star accommodations and meals, but fell into a deep depression. When I told the managing partner about it, he observed, “Maybe this job isn’t a good fit for you.” He suggested I leave in two months, which happened to be when our consulting contract was due for renewal. Looking back, I suspect my actual role on the team was “warm body.”

Graeber says that, at first blush, Eric’s story sounds like yet one more bright, idealistic liberal arts grad getting a real-world comeuppance:

Eric was a young man from a working-class background… fresh out of college and full of expectations, suddenly confronted with a jolting introduction to the “real world.”

One could perhaps conclude that Eric’s problem was not just that he hadn’t been sufficiently prepared for the pointlessness of the modern workplace. He had passed through the old educational system . . . This led to false expectations and an initial shock of disillusionment that he could not overcome.

Sounds like my story, too, but then Graeber takes his analysis in a different direction: “To a large degree,” he says, “this is really a story about social class.” Which brings us back to the issues of upward mobility and social capital we’ve been looking at. We’ll talk more about those next time.

In the meantime, I can’t resist a Dogbert episode:

[Dilbert strip featuring Dogbert]

Mobility and Meritocracy

Occupy got the math wrong. They weren’t the 99%; they were the 90%. And that extra 9%, whom they mistakenly counted as allies, makes things much worse for them.

Of the top 10%, the stratospheric 0.1% wears a visible-from-space economic inequality target. Not so the 9.9%: they’re folks like you and me — the Horatio Alger heroes of our times, people like David Boies (we met him last time). Ironically, though, a closer look reveals that they’ve done such a perfect job of upward mobility that they’ve pulled up the ladder behind themselves. They didn’t mean to; that’s just the way it worked out. Which means the 90% are quite likely to remain the Heathcliffs of the world, blocked by the red velvet rope, barred by the glass ceiling.

Says who? Says the 9.9%, and they ought to know. They’ve become the New American Meritocracy — or Aristocracy, depending on who’s analyzing the data. And now that they’ve got the Central Park view, the rest of us have to deal with the implacable security guard. (I saw one of those once, at the entrance to a 5th Avenue luxury condo high-rise. He looked like Clubber Lang from Rocky III, and he had positioned himself, feet defiantly apart and arms crossed, in the main entrance’s revolving door, so that you had to move him to move it. Don’t even think about it.)

I learned about this new social class recently from three different sources. The first was Richard V. Reeves, author of the book Dream Hoarders and the Brookings Institution monograph Saving Horatio Alger (we looked at those last time). The second was philosopher Matthew Stewart, author of numerous books and a recent article for The Atlantic called The 9.9 Percent Is the New American Aristocracy. The third was Steven Brill, founder of The American Lawyer and Court TV, author of the book Tailspin: The People and Forces Behind America’s Fifty-Year Fall—and Those Fighting to Reverse It, and writer of a Time Magazine feature called How Baby Boomers Broke America.

Reeves, Stewart, and Brill are all members of the 9.9%. All three got there by rising up from below. And all three cite the same economic and related social data to support their conclusion that their class has barred the way for the rest of us. Pause for a moment and wonder, as I did: why would they write books and articles that implicate themselves this way? It’s easy to suspect a bad case of Thriver (not Survivor) Guilt, but after reading their work, I think it’s because their success turned into an ideology buster: not only are the Horatio Alger condos sold out, but Clubber Lang is barring the way to any newcomers, and that kind of thing just doesn’t happen in America.

Until it does.

I’ll let Matthew Stewart speak for the others, quoting from his article in The Atlantic:

I’ve joined a new aristocracy now, even if we still call ourselves meritocratic winners. To be sure, there is a lot to admire about my new group, which I’ll call—for reasons you’ll soon see—the 9.9 percent. We’ve dropped the old dress codes, put our faith in facts, and are (somewhat) more varied in skin tone and ethnicity. People like me, who have waning memories of life in an earlier ruling caste, are the exception, not the rule.

By any sociological or financial measure, it’s good to be us. It’s even better to be our kids. In our health, family life, friendship networks, and level of education, not to mention money, we are crushing the competition below.

The meritocratic class has mastered the old trick of consolidating wealth and passing privilege along at the expense of other people’s children. We are not innocent bystanders to the growing concentration of wealth in our time. We are the principal accomplices in a process that is slowly strangling the economy, destabilizing American politics, and eroding democracy. Our delusions of merit now prevent us from recognizing the nature of the problem that our emergence as a class represents. We tend to think that the victims of our success are just the people excluded from the club. But history shows quite clearly that, in the kind of game we’re playing, everybody loses badly in the end.

So what kind of characters are we, the 9.9 percent? We are mostly not like those flamboyant political manipulators from the 0.1 percent. We’re a well-behaved, flannel-suited crowd of lawyers, doctors, dentists, mid-level investment bankers, M.B.A.s with opaque job titles, and assorted other professionals—the kind of people you might invite to dinner. In fact, we’re so self-effacing, we deny our own existence. We keep insisting that we’re “middle class.”

One of the hazards of life in the 9.9 percent is that our necks get stuck in the upward position. We gaze upon the 0.1 percent with a mixture of awe, envy, and eagerness to obey. As a consequence, we are missing the other big story of our time. We have left the 90 percent in the dust—and we’ve been quietly tossing down roadblocks behind us to make sure that they never catch up.

If you want more on this topic, I recommend starting with their articles and then their books. They’re all well-researched and well-written, honest and personally disclosing, and remarkable in their economic and social analysis. Plus, for me personally, they illuminate a dimension of my own career and economic journey that was always a bit of a mystery to me. We’ll talk about that next time.