Fund Your Utopia Without Me.™

02 October 2011

Nostalgianomics

 

Liberal economists pine for days no liberal should want to revisit.

 

“The America I grew up in was a relatively equal middle-class society. Over the past generation, however, the country has returned to Gilded Age levels of inequality.” So sighs Paul Krugman, the Nobel Prize–winning Princeton economist and New York Times columnist, in his recent book The Conscience of a Liberal.

The sentiment is nothing new. Political progressives such as Krugman have been decrying increases in income inequality for many years now. But Krugman has added a novel twist, one that has important implications for public policy and economic discourse in the age of Obama. In seeking explanations for the widening spread of incomes during the last four decades, researchers have focused overwhelmingly on broad structural changes in the economy, such as technological progress and demographic shifts. Krugman argues that these explanations are insufficient. “Since the 1970s,” he writes, “norms and institutions in the United States have changed in ways that either encouraged or permitted sharply higher inequality. Where, however, did the change in norms and institutions come from? The answer appears to be politics.”

To understand Krugman’s argument, we can’t start in the 1970s. We have to back up to the 1930s and ’40s—when, he contends, the “norms and institutions” that shaped a more egalitarian society were created. “The middle-class America of my youth,” Krugman writes, “is best thought of not as the normal state of our society, but as an interregnum between Gilded Ages. America before 1930 was a society in which a small number of very rich people controlled a large share of the nation’s wealth.” But then came the twin convulsions of the Great Depression and World War II, and the country that arose out of those trials was a very different place. “Middle-class America didn’t emerge by accident. It was created by what has been called the Great Compression of incomes that took place during World War II, and sustained for a generation by social norms that favored equality, strong labor unions and progressive taxation.”

The Great Compression is a term coined by the economists Claudia Goldin of Harvard and Robert Margo of Boston University to describe the dramatic narrowing of the nation’s wage structure during the 1940s. The real wages of manufacturing workers jumped 67 percent between 1929 and 1947, while the top 1 percent of earners saw a 17 percent drop in real income. These egalitarian trends can be attributed to the exceptional circumstances of the period: precipitous declines at the top end of the income spectrum due to economic cataclysm; wartime wage controls that tended to compress wage rates; rapid growth in the demand for low-skilled labor, combined with the labor shortages of the war years; and rapid growth in the relative supply of skilled workers due to a near doubling of high school graduation rates.

Yet the return to peacetime and prosperity did not result in a shift back toward the status quo ante. The more egalitarian income structure persisted for decades. For an explanation, Krugman leans heavily on a 2007 paper by the Massachusetts Institute of Technology economists Frank Levy and Peter Temin, who argue that postwar American history has been a tale of two widely divergent systems of political economy. First came the “Treaty of Detroit,” characterized by heavy unionization of industry, steeply progressive taxation, and a high minimum wage. Under that system, median wages kept pace with the economy’s overall productivity growth, and incomes at the lower end of the scale grew faster than those at the top. Beginning around 1980, though, the Treaty of Detroit gave way to the free market “Washington Consensus.” Tax rates on high earners fell sharply, the real value of the minimum wage declined, and private-sector unionism collapsed. As a result, most workers’ incomes failed to share in overall productivity gains while the highest earners had a field day.

This revisionist account of the fall and rise of income inequality is being echoed daily in today’s public policy debates. Under the conventional view, rising inequality is a side effect of economic progress—namely, continuing technological breakthroughs, especially in communications and information technology. Consequently, when economists have supported measures to remedy inequality, they have typically shied away from structural changes in market institutions. Rather, they have endorsed more income redistribution to reduce post-tax income differences, along with remedial education, job retraining, and other programs designed to raise the skill levels of lower-paid workers.

By contrast, Krugman sees the rise of inequality as a consequence of economic regress—in particular, the abandonment of well-designed economic institutions and healthy social norms that promoted widely shared prosperity. Such an assessment leads to the conclusion that we ought to revive the institutions and norms of Paul Krugman’s boyhood, in broad spirit if not in every detail.

There is good evidence that changes in economic policies and social norms have indeed contributed to a widening of the income distribution since the 1970s. But Krugman and other practitioners of nostalgianomics are presenting a highly selective account of what the relevant policies and norms were and how they changed.

The Treaty of Detroit was built on extensive cartelization of markets, limiting competition to favor producers over consumers. The restrictions on competition were buttressed by racial prejudice, sexual discrimination, and postwar conformism, which combined to limit the choices available to workers and potential workers alike. Those illiberal social norms were finally swept aside in the cultural tumults of the 1960s and ’70s. And then, in the 1970s and ’80s, restraints on competition were substantially reduced as well, to the applause of economists across the ideological spectrum. At least until now.

Stifled Competition

The economic system that emerged from the New Deal and World War II was markedly different from the one that exists today. The contrast between past and present is sharpest when we focus on one critical dimension: the degree to which public policy either encourages or thwarts competition.

The transportation, energy, and communications sectors were subject to pervasive price and entry regulation in the postwar era. Railroad rates and service had been under federal control since the Interstate Commerce Act of 1887, but the Motor Carrier Act of 1935 extended the Interstate Commerce Commission’s regulatory authority to cover trucking and bus lines as well. In 1938 airline routes and fares fell under the control of the Civil Aeronautics Authority, later known as the Civil Aeronautics Board. After the discovery of the East Texas oil field in 1930, the Texas Railroad Commission acquired the effective authority to regulate the nation’s oil production. Starting in 1938, the Federal Power Commission regulated rates for the interstate transmission of natural gas. The Federal Communications Commission, created in 1934, allocated licenses to broadcasters and regulated phone rates.

Beginning with the Agricultural Adjustment Act of 1933, prices and production levels on a wide variety of farm products were regulated by a byzantine complex of controls and subsidies. High import tariffs shielded manufacturers from international competition. And in the retail sector, aggressive discounting was countered by state-level “fair trade laws,” which allowed manufacturers to impose minimum resale prices on nonconsenting distributors.

Comprehensive regulation of the financial sector restricted competition in capital markets too. The McFadden Act of 1927 added a federal ban on interstate branch banking to widespread state-level restrictions on intrastate branching. The Glass-Steagall Act of 1933 erected a wall between commercial and investment banking, effectively brokering a market-sharing agreement protecting commercial and investment banks from each other. Regulation Q, instituted in 1933, prohibited interest payments on demand deposits and set interest rate ceilings for time deposits. Provisions of the Securities Act of 1933 limited competition in underwriting by outlawing pre-offering solicitations and undisclosed discounts. These and other restrictions artificially stunted the depth and development of capital markets, muting the intensity of competition throughout the larger “real” economy. New entrants are much more dependent on a well-developed financial system than are established firms, since incumbents can self-finance through retained earnings or use existing assets as collateral. A hobbled financial sector acts as a barrier to entry and thereby reduces established firms’ vulnerability to competition from entrepreneurial upstarts.

The highly progressive tax structure of the early postwar decades further dampened competition. The top marginal income tax rate shot up from 25 percent to 63 percent under Herbert Hoover in 1932, climbed as high as 94 percent during World War II, and stayed at 91 percent during most of the 1950s and early ’60s. Research by the economists William Gentry of Williams College and Glenn Hubbard of Columbia University has found that such rates act as a “success tax,” discouraging employees from striking out as entrepreneurs.

Finally, competition in labor markets was subject to important restraints during the early postwar decades. The triumph of collective bargaining meant the active suppression of wage competition in a variety of industries. In the interest of boosting wages, unions sometimes worked to restrict competition in their industries’ product markets as well. Garment unions connived with trade associations to set prices and allocate production among clothing makers. Coal miner unions attempted to regulate production by dictating how many days a week mines could be open.

MIT economists Levy and Temin don’t mention it, but highly restrictive immigration policies were another significant brake on labor market competition. With the establishment of country-specific immigration quotas under the Immigration Act of 1924, the foreign-born share of the U.S. population plummeted from 13 percent in 1920 to 5 percent by 1970. As a result, competition at the less-skilled end of the U.S. labor market was substantially reduced.

Solidarity and Chauvinism

The anti-competitive effects of the Treaty of Detroit were reinforced by the prevailing social norms of the early postwar decades. Here Krugman and company focus on executive pay. Krugman quotes wistfully from John Kenneth Galbraith’s characterization of the corporate elite in his 1967 book The New Industrial State: “Management does not go out ruthlessly to reward itself—a sound management is expected to exercise restraint.” According to Krugman, “For a generation after World War II, fear of outrage kept executive salaries in check. Now the outrage is gone. That is, the explosion in executive pay represents a social change…like the sexual revolution of the 1960’s—a relaxation of old strictures, a new permissiveness, but in this case the permissiveness is financial rather than sexual.”

Krugman is on to something. But changing attitudes about lavish compensation packages are just one small part of a much bigger cultural transformation. During the early postwar decades, the combination of in-group solidarity and out-group hostility was much more pronounced than what we’re comfortable with today.

Consider, first of all, the dramatic shift in attitudes about race. Open and unapologetic discrimination by white Anglo-Saxon Protestants against other ethnic groups was widespread and socially acceptable in the America of Paul Krugman’s boyhood. How does racial progress affect income inequality? Not the way we might expect. The most relevant impact might have been that more enlightened attitudes about race encouraged a reversal in the nation’s restrictive immigration policies. The effect was to increase the number of less-skilled workers and thereby intensify competition among them for employment.

Under the system that existed between 1924 and 1965, immigration quotas were set for each country based on the percentage of people with that national origin already living in the U.S. (with immigration from East and South Asia banned outright until 1952). The explicit purpose of the national-origin quotas was to freeze the ethnic composition of the United States—that is, to preserve white Protestant supremacy and protect the country from “undesirable” races. “Unquestionably, there are fine human beings in all parts of the world,” Sen. Robert Byrd (D-W.V.) said in defense of the quota system in 1965, “but people do differ widely in their social habits, their levels of ambition, their mechanical aptitudes, their inherited ability and intelligence, their moral traditions, and their capacity for maintaining stable governments.”

But the times had passed the former Klansman by. With the triumph of the civil rights movement, official discrimination based on national origin was no longer sustainable. Just two months after signing the Voting Rights Act, President Lyndon Johnson signed the Immigration and Nationality Act of 1965, ending the “un-American” system of national-origin quotas and its “twin barriers of prejudice and privilege.” The act inaugurated a new era of mass immigration: Foreign-born residents of the United States have surged from 5 percent of the population in 1970 to 12.5 percent as of 2006.

This wave of immigration exerted a mild downward pressure on the wages of native-born low-skilled workers, with most estimates showing a small effect. Immigration’s more dramatic impact on measurements of inequality has come by increasing the number of less-skilled workers, thereby increasing apparent inequality by depressing average wages at the low end of the income distribution. According to the American University economist Robert Lerman, excluding recent immigrants from the analysis would eliminate roughly 30 percent of the increase in adult male annual earnings inequality between 1979 and 1996.

Although the large influx of unskilled immigrants has made American inequality statistics look worse, it has actually reduced inequality for the people involved. After all, immigrants experience large wage gains as a result of relocating to the United States, thereby reducing the cumulative wage gap between them and top earners in this country. When Lerman recalculated trends in inequality to include, at the beginning of the period, recent immigrants and their native-country wages, he found that inequality had actually decreased rather than increased. Immigration has increased inequality at home but decreased it on a global scale.
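The compositional point here is subtle enough to deserve a toy illustration. The sketch below uses entirely invented wage figures (none come from Lerman’s data) and a standard Gini coefficient to show both effects at once: adding low-wage entrants to the U.S. pool raises measured inequality there, yet once the entrants’ migration gains are counted against their old home-country wages, inequality among everyone involved falls.

```python
def gini(values):
    """Gini coefficient of a list of incomes (mean-difference formula)."""
    values = sorted(values)
    n = len(values)
    total = sum(values)
    # Equivalent to sum of |x_i - x_j| over all pairs, via sorted weights.
    weighted = sum((2 * i - n - 1) * v for i, v in enumerate(values, start=1))
    return weighted / (n * total)

# Hypothetical annual wages in $1,000s -- illustration only.
natives = [20, 35, 50, 80, 200]
immigrants_home = [4, 5]     # the same two people, at native-country wages
immigrants_us = [15, 18]     # after migrating: large personal wage gains

g_base = gini(natives)                      # measured U.S. inequality before
g_us = gini(natives + immigrants_us)        # after adding low-end earners
g_global = gini(natives + immigrants_home)  # counting home-country wages

print(g_us > g_base)    # True: measured U.S. inequality rises
print(g_us < g_global)  # True: inequality for the whole group falls
```

The same pool of people thus pushes the domestic statistic up while pulling the group-wide statistic down, which is the distinction the Lerman recalculation turns on.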

Just as racism helped to keep foreign-born workers out of the U.S. labor market, another form of in-group solidarity, sexism, kept women out of the paid work force. As of 1950, the labor force participation rate for women 16 and older stood at only 34 percent. By 1970 it had climbed to 43 percent, and as of 2005 it had jumped to 59 percent. Meanwhile, the range of jobs open to women expanded enormously.

Paradoxically, these gains for gender equality widened rather than narrowed income inequality overall. Because of the prevalence of “assortative mating”—the tendency of people to choose spouses with similar educational and socioeconomic backgrounds—the rise in dual-income couples has exacerbated household income inequality: Now richer men are married to richer wives. Between 1979 and 1996, the proportion of working-age men with working wives rose by approximately 25 percent among those in the top fifth of the male earnings distribution, and their wives’ total earnings rose by over 100 percent. According to a 1999 estimate by Gary Burtless of the Brookings Institution, this unanticipated consequence of feminism explains about 13 percent of the total rise in income inequality since 1979.
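The assortative-mating mechanism can also be sketched with a toy comparison (all earnings figures below are hypothetical, not drawn from Burtless’s estimate): hold every individual’s earnings fixed and change only who marries whom, and the spread of household incomes widens sharply when like marries like.

```python
from statistics import pstdev

# Hypothetical individual earnings in $1,000s -- illustration only.
men = [30, 40, 90, 120]
women = [20, 30, 70, 100]

# Assortative mating: high earners marry high earners.
assortative = [m + w for m, w in zip(sorted(men), sorted(women))]

# Counterfactual: high earners paired with low earners.
mixed = [m + w for m, w in zip(sorted(men), sorted(women, reverse=True))]

print(pstdev(assortative))  # wide spread of household incomes
print(pstdev(mixed))        # much narrower spread, same individual wages
```

Individual inequality is identical in both scenarios; only the pairing changes, which is why rising female labor force participation plus assortative mating can widen household inequality without anyone’s personal wage changing.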

Racism and sexism are ancient forms of group identity. Another form, more in line with what Krugman has in mind, was a distinctive expression of U.S. economic and social development in the middle decades of the 20th century. The journalist William Whyte described this “social ethic” in his 1956 book The Organization Man, outlining a sensibility that defined itself in studied contrast to old-style “rugged individualism.” When contemporary critics scorned the era for its conformism, they weren’t just talking about the ranch houses and gray flannel suits. The era’s mores placed an extraordinary emphasis on fitting into the group.

“In the Social Ethic I am describing,” wrote Whyte, “man’s obligation is…not so much to the community in a broad sense but to the actual, physical one about him, and the idea that in isolation from it—or active rebellion against it—he might eventually discharge the greater service is little considered.” One corporate trainee told Whyte that he “would sacrifice brilliance for human understanding every time.” A personnel director declared that “any progressive employer would look askance at the individualist and would be reluctant to instill such thinking in the minds of trainees.” Whyte summed up the prevailing attitude: “All the great ideas, [trainees] explain, have already been discovered and not only in physics and chemistry but in practical fields like engineering. The basic creative work is done, so the man you need—for every kind of job—is a practical, team-player fellow who will do a good shirt-sleeves job.”

It seems entirely reasonable to conclude that this social ethic helped to limit competition among business enterprises for top talent. When secure membership in a stable organization is more important than maximizing your individual potential, the most talented employees are less vulnerable to the temptation of a better offer elsewhere. Even if they are tempted, a strong sense of organizational loyalty makes them more likely to resist and stay put.

Krugman blames the conservative movement for income inequality, arguing that right-wingers exploited white backlash in the wake of the civil rights movement to hijack first the Republican Party and then the country as a whole. Once in power, they duped the public with “weapons of mass distraction” (i.e., social issues and foreign policy) while “cut[ting] taxes on the rich,” “try[ing] to shrink government benefits and undermine the welfare state,” and “empower[ing] businesses to confront and, to a large extent, crush the union movement.”

Obviously, conservatism has contributed in important ways to the political shifts of recent decades. But the real story of those changes is more complicated, and more interesting, than Krugman lets on. Influences across the political spectrum have helped shape the more competitive, more individualistic, and less equal society we now live in.

Indeed, the relevant changes in social norms were led by movements associated with the left. The women’s movement led the assault on sex discrimination. The civil rights campaigns of the 1950s and ’60s inspired more enlightened attitudes about race and ethnicity, with results such as the Immigration and Nationality Act of 1965, a law spearheaded by a young Sen. Edward Kennedy (D-Mass.). And then there was the counterculture of the 1960s, whose influence spread throughout American society in the Me Decade that followed. It upended the social ethic of group-minded solidarity and conformity with a stampede of unbridled individualism and self-assertion. With the general relaxation of inhibitions, talented and ambitious people felt less restrained from seeking top dollar in the marketplace. Yippies and yuppies were two sides of the same coin.

Contrary to Krugman’s narrative, liberals joined conservatives in pushing for dramatic changes in economic policy. In addition to his role in liberalizing immigration, Kennedy was a leader in pushing through both the Airline Deregulation Act of 1978 and the Motor Carrier Act of 1980, which deregulated the trucking industry—and he was warmly supported in both efforts by the left-wing activist Ralph Nader. President Jimmy Carter signed these two pieces of legislation, as well as the Natural Gas Policy Act of 1978, which began the elimination of price controls on natural gas, and the Staggers Rail Act of 1980, which deregulated the railroad industry.

The three most recent rounds of multilateral trade talks were all concluded by Democratic presidents: the Kennedy Round in 1967 by Lyndon Johnson, the Tokyo Round in 1979 by Jimmy Carter, and the Uruguay Round in 1994 by Bill Clinton. And though it was Ronald Reagan who slashed the top income tax rate from 70 percent to 50 percent in 1981, it was two Democrats, Sen. Bill Bradley of New Jersey and Rep. Richard Gephardt of Missouri, who sponsored the Tax Reform Act of 1986, which pushed the top rate all the way down to 28 percent.

What about the unions? According to the Berkeley economist David Card, the shrinking of the unionized labor force accounted for 15 percent to 20 percent of the rise in overall male wage inequality between the early 1970s and the early 1990s. Krugman is right that labor’s decline stems in part from policy changes, but his ideological blinkers lead him to identify the wrong ones.

The only significant change to the pro-union Wagner Act of 1935 came through the Taft-Hartley Act, which outlawed closed shops (contracts requiring employers to hire only union members) and authorized state right-to-work laws (which ban contracts requiring employees to join unions). But that piece of legislation was enacted in 1947—three years before the original Treaty of Detroit between General Motors and the United Auto Workers. It would be a stretch to argue that the Golden Age ended before it even began.

Scrounging for a policy explanation, economists Levy and Temin point to the failure of a 1978 labor law reform bill to survive a Senate filibuster. But maintaining the status quo is not a policy change. They also describe President Reagan’s 1981 decision to fire striking air traffic controllers as a signal to employers that the government no longer supported labor unions.

While it is true that Reagan’s handling of that strike, along with his appointments to the National Labor Relations Board, made the policy environment for unions less favorable, the effect of those moves on unionization was marginal.

The major reason for the fall in unionized employment, according to a 2007 paper by Georgia State University economist Barry Hirsch, “is that union strength developed through the 1950s was gradually eroded by increasingly competitive and dynamic markets.” He elaborates: “When much of an industry is unionized, firms may prosper with higher union costs as long as their competitors face similar costs. When union companies face low-cost competitors, labor cost increases cannot be passed through to consumers. Factors that increase the competitiveness of product markets—increased international trade, product market deregulation, and the entry of low-cost competitors—make it more difficult for union companies to prosper.”

So the decline of private-sector unionism was abetted by policy changes, but the changes were not in labor policy specifically. They were the general, bipartisan reduction of trade barriers and price and entry controls. Unionized firms found themselves at a critical disadvantage. They shrank accordingly, and union rolls shrank with them.

Postmodern Progress

The move toward a more individualistic culture is not unique to the United States. As the political scientist Ronald Inglehart has documented in dozens of countries around the world, the shift toward what he calls “postmodern” attitudes and values is a predictable cultural response to rising affluence and expanding choices. “In a major part of the world,” he writes in his 1997 book Modernization and Postmodernization, “the disciplined, self-denying, and achievement-oriented norms of industrial society are giving way to an increasingly broad latitude for individual choice of lifestyles and individual self-expression.”

The increasing focus on individual fulfillment means, inevitably, less deference to tradition and organizations. “A major component of the Postmodern shift,” Inglehart argues, “is a shift away from both religious and bureaucratic authority, bringing declining emphasis on all kinds of authority. For deference to authority has high costs: the individual’s personal goals must be subordinated to those of a broader entity.”

Paul Krugman may long for the return of self-denying corporate workers who declined to seek better opportunities out of organizational loyalty, and thus kept wages artificially suppressed, but these are creatures of a bygone ethos—an ethos that also included uncritical acceptance of racist and sexist traditions and often brutish intolerance of deviations from mainstream lifestyles and sensibilities.

The rise in income inequality does raise issues of legitimate public concern. And reasonable people disagree hotly about what ought to be done to ensure that our prosperity is widely shared. But the caricature of postwar history put forward by Krugman and other purveyors of nostalgianomics won’t lead us anywhere. Reactionary fantasies never do.



Related Reading:

Soft Apartheid:  The Income Inequality Factor Liberals Can't Talk About

Unsurprisingly, Higher Minimum Wages Do Nothing To Alleviate Income Inequality

Obama's Minimum Wage Delusion

Obama's Minimum Wage Hike: A Case of Zombie Economics

No, Sergeant Schultz, A Higher Minimum Wage Will Not Address Income Inequality

Obama's Progressive Mythology

Will The Real Paul Krugman Please Stand Up?

Nostalgianomics


