How We Got Here: Stein, Cowie, and Arrighi on the Post-Industrial Economy

In 1974 an autoworker from Michigan named Dewey Burton remarked disconsolately to a reporter, “I wanted to be somebody… It wasn’t about the money so much as that I just wanted to have some kind of recognition, you know, to be more tomorrow than I was yesterday, and that’s what I was working for.” The economic repercussions of the 1973 oil shock hit America’s working class especially hard and were perhaps the first major sign that the period Eric Hobsbawm calls the Golden Age of capitalism (1950-70) was coming to a dramatic end. Indeed, aside from the Great Depression, the 1970s is the only decade in which Americans ended up poorer than they began. As if history were taking its cue from Schumpeter (or Marx, depending on your taste), the seventies saw the gradual decline of the American manufacturing sector and the ascendancy of the service sector.

For the well-off, the latter meant multiplying opportunities in finance, as the financial sector would soon loom large over the economy. For the less fortunate it meant closing plants, mass layoffs, stagnant or falling wages, and a shift to the less lucrative precincts of the service sector—waiting tables and cleaning bathrooms. This dramatic decline in fortune for the majority of Americans was compounded by the steady dismantling of America’s already very modest welfare state, which gives credence to the despondent joke of a good friend: “Starbucks is the new social safety net.” Unfortunately, gallows humor can do little to help us grasp the gravity of the general despair of the 1970s. “It takes so much to just make it that there’s no time for dreams and no energy for making them come true—and I’m not sure anymore that it’s ever going to get better,” Dewey explained. Reflecting on how hard he had worked merely to survive, Dewey summarized his feelings: “I realized I was killing myself, and there wasn’t going to be any reward for my suicide.”[1]

So what happened? Economic historian Judith Stein (Running Steel, Running America) provides a formidable explanation for the demise of American manufacturing in her latest book, The Pivotal Decade: How the US Traded Factories for Finance in the Seventies. But before going into her thesis, it is worth outlining alternative explanations for the dismal economic performance since the seventies.

As is well known, the period following World War II saw unprecedented economic growth. During this “Golden Age,” the economy grew at about 4 percent a year, disposable income grew by 15 percent in real terms, the percentage of the population living below the poverty line dropped from 40 to 10 percent, and perhaps most significantly, “The income of the lowest fifth increased 116 percent, while the top fifth grew 85 percent; the middle also gained more than the top.”[2] Since about the mid-seventies, wages have been stagnant, household income for the majority of Americans has barely budged despite the rise of dual-income households, inequality has risen dramatically, and economic growth has slowed by about half, even taking into account what Joseph Stiglitz called the “Roaring Nineties.”

A number of factors contributed to America’s economic growth in the Golden Age, but foremost among them is luck. World War II had just decimated America’s main industrial competitors in Europe and Asia. This created more than a mere competitive advantage. In many instances, American products were the only ones in town. Moreover, America’s late entrance into World War II coupled with its relatively minor role in defeating Nazi Germany assured economic preeminence for some time after the end of the war. [Gerhard Weinberg puts American casualties in the war at 300,000, while recent research puts Soviet casualties at 25 to 27 million. About nine million Germans died.] Robust government policies also played a fundamental role. In effect, a whole host of New Deal programs that laid the foundation for unprecedented growth acted as gigantic subsidies for the middle class. “New Deal programs in communications, banking, housing, and airlines stabilized investment,” Stein notes. “To make the telephone accessible to all, the government set tariffs so that richer and urban consumers subsidized poorer and rural customers. [Real America™, if you will] New Dealers removed the risk in the mortgage market so that banks and other institutions could lend to many who otherwise never could have entered the housing market. They did not ignore renters, funding extensive public housing construction. The government promoted airlines by offering mail contracts to sustain the new business. The state funded research in defense and space and was itself the market for the products of that research. The New Deal kit contained many tools—government spending to reduce unemployment but also regulations to promote industries valuable to the nation.”[3]

This being said, it is important not to romanticize the 1950s and 1960s. As Thomas Sugrue explains in his classic The Origins of the Urban Crisis, contrary to widespread belief, the “urban crisis” actually has its roots in the 1950s and 1960s when, among many other factors, inadequate and discriminatory housing policy combined with misguided private sector and neighborhood policies created disastrous conditions for African Americans in urban areas. Stein duly notes the mixed outcomes of the fifties and sixties. Despite major gains, Americans were a long way from being comfortably affluent. “The median family income for 1968 was $8,632, when it had been $3,931 in 1947. But $8,632 was about a thousand dollars less than what the Bureau of Labor Statistics defined as ‘modest but adequate’ income for an urban family of four,” she states. Likewise, despite reductions in overall poverty, “In 1970, government figures indicated that 30 percent of the nation’s working-class families were living in what was actually poverty, with incomes of less than $7,000. Another 30 percent were above a poverty budget but below the intermediate level. Thus, 60 percent were either poor or hovering between poverty and the very modest level of the intermediate budget.”[4] Nonetheless, progress was very real.

An explanation for what went wrong depends on whom you ask. A favored explanation on the right puts the blame on rising labor costs driven by overzealous unions and on overregulation of the economy. While even Marxists agree there is something to this argument, it cannot explain why growth rates remain dramatically lower than those of 1950-70 despite the deregulation of nearly every sector of the economy, stagnant wages at home, and access to dirt-cheap labor abroad.

Perhaps unsurprisingly, liberals have had an even more difficult time coming up with explanations for the crises of the seventies. Aside from the very real and very detrimental impact of the oil shocks of 1973 and 1978-79, Keynesian liberals like Paul Krugman admit that it is “still somewhat mysterious” why growth slowed so dramatically in the seventies. Although some liberals point to misguided monetary and price and wage control policy (Stein argues that both played important roles), there seems to be no liberal consensus on the 1970s.

Marxist scholars, on the other hand, provide some compelling arguments. David Harvey, for example, argues that falling profit margins pushed industry to tap what Marx called the “reserve army of labor” in the third world. What caused profit margins to fall? Marx would have comforted modern conservatives by agreeing that excessive labor costs typically create profit-margin crises. Harvey takes several factors into account, but they boil down to these: labor-saving technology, increased organizational efficiency, increased mobility of capital, and capital’s successful efforts to depress wages through various means all left consumers—who are also workers—in the world’s biggest market (the US) with less income with which to consume capital’s products. Since labor and its wages are necessary for the consumption of capital’s products, the diminishing fortunes of the former meant lower aggregate demand for the latter, which, in the long term, means falling profits. This problem, Harvey notes in his latest book, The Enigma of Capital, was quickly resolved by extending credit to millions of middle- and working-class people in order to fuel growth through debt-based consumption.

Pivotal Decade provides an argument that complements those of the Marxists. At the heart of Stein’s story is the conflict between America’s Cold War foreign policy priorities and the wellbeing of its own citizens. Put briefly, the Soviet victory over Nazism and the sheer destitution of Western Europe and East Asia provoked fear in Washington (and London) of fertile ground for communist revolution. In an effort to stifle hospitable conditions for such revolution, the United States provided massive amounts of economic aid, negotiated a dramatic devaluation of (Western) European currencies to make their exports more competitive, and opened its own market to European, Japanese, and Korean exports despite those countries’ restrictions on foreign (mainly American) imports. (The US also used covert and overt force, helping rig elections and supporting fascists in France, Italy, and Greece, among other places.) European and East Asian exports received a further boost from their own governments in the form of generous subsidies, import quotas, and other forms of protection that allowed them to develop their infant industries and increase their international competitiveness. Incidentally, this whole affair would have given Alexander Hamilton a nasty migraine.

In effect, the United States government helped dismantle its own economy in order to gain the upper hand in the Cold War. Stein argues that several factors contributed to this act of slow-motion self-immolation. Some of this self-harm was unforeseeable, while other policies seemed downright foolhardy. An example of the more excusable kind of undermined self-interest is America’s quasi-inadvertent role in developing Japan’s steel industry. The US had eyed Japanese production during the Korean War both for its proximity to the conflict and for its potential to boost Japanese employment, so as to avoid the possibility of a radicalized mass of unemployed people. It is no wonder that the president of Toyota called the Korean War “Toyota’s salvation”: “the U.S. military’s order of a thousand trucks a month made up for the steep decline of the company’s sales.”[5] Likewise, US military spending in Japan coincided with the peak years of Japanese growth in 1966-70, which also happened to be the height of the Vietnam War.

Other policies were less excusable. Democratic and Republican administrations alike refused to seriously confront Cold War allies that restricted American imports while enjoying open access to American markets. The results were nothing short of staggering. Between 1967 and 1970, Stein points out, Japanese exports to the US increased by 96 percent.[6] The flood of subsidized imports onto the American market made it nearly impossible for American producers to compete, which created an incentive for American corporations to expand their business and increase market share by divesting at home and investing abroad. Since trade barriers kept American goods out of European markets, American capital simply relocated to Europe, where capital controls in turn kept that capital from fleeing in the face of any unruly European labor activity. The United States had no equivalent policy. The Ford administration’s William Seidman conceded as much, but his only solution to the problem was to cut corporate taxes in order to increase profits at home.

This sort of capital flight was nothing new, of course. It was in fact just a grand-scale example of the destructive consequences of capital’s mobility that the US experienced internally in the late 1940s and early 1950s, as manufacturing moved to the South and Midwest while deindustrializing much of the Northeast. As Jefferson Cowie argues in Beyond the Ruins: The Meanings of Deindustrialization, the insightful volume he co-edited with Joseph Heathcott, “we must jettison the assumption that fixed capital investment in resource extraction, heavy manufacturing, and value-added production defines the stable standard against which all subsequent changes are to be judged. Rather, we should see this political-economic order and the culture it engendered as temporary and impermanent developments in space and time.”[7] Though this provides little comfort to those whose lives are destroyed by capital’s mobility, it is a cold truth about capitalism that must be acknowledged.

Economic inequality is a feature, not a bug

Although Stein does not go into much depth on finance, the work of Giovanni Arrighi complements her thesis on the centrality of the oil shocks to the demise of manufacturing and the rise of finance. Stein aptly quotes historian Steven Schneider’s conclusion about the 1973 oil shock: “The oil-exporting countries had secured the greatest nonviolent transfer of wealth in human history.”[8] Ironically, this new wealth obtained by oil-exporting countries needed somewhere to sit and accumulate interest. Naturally, the capital went to Wall Street. Arrighi states that this surplus of capital helped spur “innovations” within the western financial sector and promoted (initially) low-interest loans to poor countries. In the short term, Stein notes, these loans funded projects throughout the developing world such as shipyards, petrochemical plants, and steel mills that looked to the US market as an ideal destination for their final product. Strangely enough, Stein points out that western (mostly American-based) banks “lobbied for Third World access to American markets because that was the only way to get repayment of their loans. The new lending policies thus created new conflicts between American finance and American manufacturing.”[9] The encore oil shock of 1978-79 funneled even more petrodollars to Wall Street, which fueled even more speculation.

Another conclusion shared by Stein and Arrighi concerns the effect the abandonment of fixed exchange rates had on the growth of the financial sector. In 1971, the US ran its first merchandise trade deficit since the late 19th century.[10] The fixed exchange rate system established after World War II gave America’s competitors an advantage, since their undervalued currencies made their exports cheaper than American products. Ever the subtle mind, Nixon solved this problem by dismantling the Bretton Woods system and establishing floating exchange rates. In the short term, the results were mixed. Stein notes that although the value of the dollar fell, the Fed’s increase of the money supply and the establishment of flexible exchange rates did little to assuage the fears of financial markets about America’s trade deficit and overall balance of payments. Sure enough, capital fled to Europe.[11] More importantly, however, flexible exchange rates fueled speculative finance:

The breakdown of the regime of fixed exchange rates added a new momentum to the financial expansion by increasing the risks and uncertainty of the commercial-industrial activities of corporate capital. Under the regime of fixed exchange rates, corporate capital was already engaged in currency trade and speculation. “But for the most part the acknowledged responsibility of the central banks for holding the rates fixed relieved corporate financial managers of the need to worry about day-to-day changes” (Strange 1986: 11). Under the regime of flexible exchange rates, in contrast, corporate capital itself had to deal with day-to-day shifts in exchange rates. The coming and going in corporate bank accounts of money in different currencies forced corporations to engage in forward currency trading in order to protect themselves against shortfalls in their accounts due to changes in the exchange rates of the currencies in which their expected receipts and anticipated payments were quoted. Moreover, fluctuations in exchange rates became a major factor in determining variations in corporate cash flow positions, sales, profits, and assets in different countries and currencies. In order to hedge against these variations, corporations had little choice but to resort to the further geopolitical diversification of their operations.[12]

This distinction between speculative capitalism and the production of actual goods plays an important role in Arrighi’s thesis about the late phase of systemic cycles of accumulation throughout capitalist history, reaching back to the 15th century. To be sure, one might ask how one can locate pre-industrial roots of capitalism. Arrighi draws on French historian Fernand Braudel’s distinctive conception of market and capitalist economies. In Braudel’s view, there are three distinct levels of economic life: the first is based on the basic exchange of materials for subsistence, which long predates the industrial revolution; the second is the more familiar system of commodity exchanges among producers and firms (i.e., “the market”); and the third is the level at which capitalists engage in high finance and other more abstract forms of exchange of capital (rather than goods) to increase profit margins through monopolization. Arrighi accordingly borrows Braudel’s conclusion that capitalists of the third level see a competitive market as a barrier to be overcome. Since capital’s goal is to monopolize markets in order to control profit maximization, it is intrinsically opposed to any barriers to its monopolization of the market, including the legal standards for fair competition common throughout liberal capitalist economies.

They taught Bernie Madoff everything he knows

Arrighi identifies four systemic cycles of accumulation, or periods of economic and political hegemony: the Genoese (fifteenth to early seventeenth century), Dutch (late sixteenth through eighteenth century), British (latter half of the eighteenth century through the early twentieth century), and American (late nineteenth century to present). Each has distinct features, but all share certain patterns. Each cycle contains an epoch of material expansion, in which accumulation occurs through investment in and exchanges of commodities (actual stuff), followed by an epoch of financial expansion, in which accumulation occurs through financial transactions and exchanges of money (insurance, stocks, and derivatives). Borrowing from Braudel, Arrighi observes that “we identify the beginning of financial expansions with the moment when the leading business agencies of the preceding trade expansion switch their energies and resources from commodity to the money trades. And like Braudel, we take the recurrence of this kind of financial expansion as the main expression of a certain unity of capitalist history from the late Middle Ages to our own days.”[13] For our purposes, the most important pattern Arrighi identifies across all cycles of accumulation is the recurring transition from productive investment (material expansion) to speculation (financial expansion). These transitions, according to Arrighi, are marked by a “sudden intensification of inter-capitalist competition.”[14] Although Stein’s micro-historical approach is drastically different from Arrighi’s broad, six-hundred-year political-economic history, her narrative provides additional empirical evidence for Arrighi’s thesis by detailing the erosion of America’s trade position during the 1950s and 1960s and the American economy’s reorientation toward finance in the 1970s.

All in all, these processes led not only to the undermining of American manufacturing but to the destruction of the welfare state as well.

An honest look at the trajectory of American history affirms Cowie’s argument that the New Deal era was anomalous. Unlike in Western Europe, where labor movements of the late sixties and early seventies led to profound transformations of economic life for the working and middle classes, the 1970s ended with a shrug and a whimper for American labor. In fact, to borrow Paul Krugman’s memorable phrase, despite the seemingly linear progress of the working class during the 1950s and 1960s, Americans were in fact “living through the end of an ‘interregnum between Gilded Ages.’”[15]

So where does that leave Dewey Burton? The answer lies not in the false dichotomy of Dewey being either slapped around and strangled by Adam Smith’s invisible hand or rescued by the “moral sentiments” of man that were supposed to save us from ourselves. Rather, the fate of the Deweys of the world depends on whether we can acknowledge and transcend the brutal internal contradictions of capital accumulation. Needless to say, this leaves most of us in a very precarious situation.

This post is part of a series on the idea of the post-industrial society and the political economy of the late twentieth century United States. Previous posts include The Rural Roots of America’s Cities of Knowledge, Looking for the City of Knowledge, and FIRE and ICE: The Realities of Twenty-First Century Urban Development.

[1] Jefferson Cowie, Stayin’ Alive: The 1970s and the Last Days of the Working Class (New York: The New Press, 2009), 11.

[2] Judith Stein, The Pivotal Decade: How the US Traded Factories for Finance in the Seventies (New Haven: Yale University Press, 2010), 1-2.

[3] Stein, Pivotal Decade, 4-5.

[4] Stein, Pivotal Decade, 14.

[5] Stein, Pivotal Decade, 6.

[6] Stein, Pivotal Decade, 35.

[7] Jefferson Cowie and Joseph Heathcott, eds., Beyond the Ruins: The Meanings of Deindustrialization (Ithaca, NY: Cornell University Press, 2003), 5.

[8] Stein, Pivotal Decade, 81.

[9] Stein, Pivotal Decade, 95.

[10] Stein, Pivotal Decade, 41.

[11] Stein, Pivotal Decade, 46-49.

[12] Giovanni Arrighi, The Long Twentieth Century (London: Verso, 1994), 310.

[13] Arrighi, The Long Twentieth Century, 86.

[14] Arrighi, The Long Twentieth Century, 87.

[15] Cowie, Stayin’ Alive, 71.