A NEW LEVEL OF PROSPERITY

UNDER THE EXTREME pressures of World War II, the men and women of the belligerent countries performed near miracles of effort and endurance. In this they were helped by the heavy industry that now dominated capitalist economies, making possible a level of production that broke all records. Necessity again proved to be the mother of invention with all the contenders innovating in synthetics, medicine, communication, aviation, and, of course, weaponry. When hostilities ended, the destructive power that had been unleashed sobered everyone, vanquished and victorious alike. It had been a dreadful thirty-one years, but most had survived.

For a second time in a quarter of a century, Europe had been devastated. For a second time the United States flexed its remarkable industrial muscles. For the first time a country implacably hostile to capitalism appeared on the world scene. Spurred by two world wars with a depression in between, governments learned to play a larger role in economic matters. Both Keynesian economics and socialist prescriptions provided rationales for a permanent government presence in the economy, if only to prime the pump during persistent downturns.

In World War II the belligerents set rules for producers and laborers, freezing prices and wages. They even took over some private companies and commandeered whatever resources were deemed necessary for the war effort. By 1945 there were many bureaucrats experienced in telling investors, entrepreneurs, managers, and laborers what to do. Quite naturally they looked upon their recommendations as constructive. Many advised continuing government oversight of the economy. Prominent socialists in Great Britain, Italy, and France called for the abandonment of laissez-faire policies. It was an open question whether capitalism would move back into a modern version of the political orbit that it had escaped in the eighteenth century.

Three paths opened up for postwar leaders as they confronted the task of reestablishing their physical plants, transportation systems, financial institutions, and the trade arrangements that had structured their global economy. We might call these paths the indicative, imperative, and informative. In the first, the road forward is indicated; in the second, it is ordered; and in the third, the coded language of markets informs participants about their choices. The government, too, responds to information rather than acting out of ideological imperatives. The most powerful commercial players after the war were corporations, many of them international, but the catch basin of capitalism contained hundreds of smaller outfits and even more people detailing projects on the backs of envelopes.

France, Sweden, and Great Britain chose the indicative option. In a four-year plan, the French government set the direction for economic planning, using subsidies and loans as guide dogs. The British Labour government came into office in 1945 under the banner of eradicating the five giant evils of want, squalor, disease, ignorance, and unemployment. Quickly the government nationalized the railroads, utilities, the Bank of England, coal mines, and steel factories. A national health plan gave “cradle to grave” coverage, and the government invested heavily in public housing. Sweden was the most generous of all industrialized nations, providing universal pensions, health and disability insurance, child and family allowances, poor relief, and subsidized low-income housing. These governments established priorities and pointed out the direction for private enterprise.

The Soviets had a command economy in which almost all enterprises were owned by the state. Central planners set production goals with little attention paid to market signals. Because Europeans had come to prize the private property rights they had wrested from monarchs long ago, many Russians resisted the appropriation of their property, so political repression had accompanied the Soviets’ economic restructuring. After the war, Soviet planners announced new economic goals that made control even tighter. The Soviet government was determined never again to be exposed to a horrendous invasion like Hitler’s, so it created a buffer zone comprising the countries of Poland, Hungary, Yugoslavia, Romania, Czechoslovakia, Albania, and Bulgaria. Buffering, as it turned out, involved imposing upon these nations command economies, one-party rule, and subordination to the Soviet Union. Only Yugoslavia avoided the embrace of the Soviet leader Joseph Stalin.

Russian industry had performed magnificently during the war. Its economy recovered prewar production levels within five years. Without the imperatives of war, government planners became less surefooted. Little was actually known about the poor outcome of the Soviets’ successive five-year plans until much later. This left Marxists in Italy, Germany, Britain, France, and even the United States free to agitate for communism in their countries.

American leaders chose the informative option. Not having experienced war at home and without a very strong labor tradition, Americans had little interest in radical programs that allowed the government to direct economic initiatives. Those members of Roosevelt’s New Deal who had favored more political control had been replaced during the war by businessmen, the so-called dollar-a-year men, who won back public confidence by meeting war production goals. Lost during the Depression, this renewed confidence bolstered their arguments against allowing the government’s wartime intervention to become a prelude to more central planning.

The Depression had exposed the two great weaknesses of capitalism: its wayward oscillations between good times and bad and the vastly unequal distribution of the wealth it produced. While American leaders rejected both central planning and central guidance, they recognized the need to moderate these tendencies. There would be no returning to the business mentality of the early twentieth century even if Americans remained loyal to free enterprise. The term itself became freighted with ideological meaning as the differences between the Soviet Union and the United States passed from principles to form the matrix for foreign policies.

American diplomats involved in reviving the prostrate countries of Europe favored letting market forces do the job. America’s great wealth and wealth-making capacity gave its preference preponderant influence, and that preference was to let loose the efficient operation of markets rather than follow political dictates. But the Depression had left leaders aware of the need to restrain some nationalist impulses for international free trade to operate optimally. The failure of Congress to go along with the League of Nations after the First World War remained a vivid memory. Roosevelt and his advisers began planning for peace while the war raged on. They had learned from experience and secured agreements before their allies could start fiddling with their currencies and protecting domestic industrial and agricultural producers.1

Always alert to the need of bringing the public along with him in formulating policies, Roosevelt gave priority to a conference on food and agriculture, knowing that Americans would feel keenly the need to feed a starving people after the war. Held in 1943, the conference provoked the acerbic British economist John Maynard Keynes to ruminate that Roosevelt “with his great political insight has decided that the best strategy for post-war reconstruction is [to] start with vitamins and then by a circuitous route work round to the international balance of payments!”

When victory seemed certain in 1944, Keynes got his meeting on monetary matters when 730 delegates with their staffs, representing forty-four countries, arrived on special trains at the newly refurbished Mount Washington Hotel at Bretton Woods, in New Hampshire’s White Mountains, and hammered out an impressive agreement to end the nationalistic practices that had scuttled recovery from the Great Depression.2

Two catastrophic wars had crushed the spirit of vengeance. The pervasive influence of the United States carried the day. Hopes for an international trade organization faded, but at least countries were willing to buy into the General Agreement on Tariffs and Trade. GATT rounds of negotiation have continued ever since, now under the World Trade Organization. At Bretton Woods, the participants established the World Bank and the International Monetary Fund, the first to make the long-term investments in development that private entrepreneurs balked at and the second to manage loans and monitor currencies. The magnetic center of world trade moved permanently from London to New York. It actually had passed after World War I, just as London had taken over from Amsterdam in the eighteenth century and Amsterdam from Genoa in the seventeenth century. By 1958 the monetary system established at Bretton Woods worked so well that all major European currencies could be converted into dollars.3

Europeans did not experience the immediate prosperity that Americans enjoyed. War had plunged some people back into a primitive past. The winter of 1946–1947, the second since peace returned, was unusually severe, so severe that it ruined the potato crop. In Germany, even when farmers had potatoes to sell, they wouldn’t do so because the value of the currency was too unpredictable. Germany, once at the pinnacle of capitalist development, witnessed scenes of city people going into the countryside with a lamp, chair, or picture frame to sell, returning home with sacks of precious potatoes. The next year brought a record-setting drought. The harvest of 1947 was the worst of the twentieth century. Millions in Germany and elsewhere were homeless, left to wander through the rubble that everywhere marked the destructive power the war had unleashed. Refugee became a common status for men, women, and their children, who had been dislodged or expelled or rescued from prison at war’s end. Several million Jews survived Hitler’s fiendish plans to eliminate them. Now free to move about, refugees took to the road or clustered in new displaced persons camps.

The extraordinary dimensions of need actually prompted an ambitious program for recovery. It started very much as a trial and error operation with a low budget. What the many people battered by war needed at first were the basics—food, clothing, and shelter. Then they faced the challenge of repairing the widespread destruction, and finally they required infusions of money to resuscitate their peacetime economies. Uncle Sam had the money and the will and began dispensing funds through the United Nations or directly from Washington. Canada too mounted a major relief effort. People do learn from experience. American leaders finally recognized the absolute necessity for the United States to accept the responsibilities of world leadership that it had eschewed after the First World War.

General Marshall’s Plan

Despite American and Canadian aid, getting the war-battered nations back on their feet was slow enough to raise doubts about capitalism and even stir sympathy for the Communist alternative to a market economy. In 1947, worried about this, Secretary of State General George Marshall announced a new program. Approved by Congress, the Marshall Plan appropriated dollars in loans and grants for food, seed, and fertilizer to feed the people, followed by money for capital goods, raw materials, and fuel to stimulate productivity. Though invited to participate, the Soviet Union declined for itself as well as its Eastern satellites. Sixteen Western European countries met in Paris to discuss the American offer. They ended their meetings by forming the Committee of European Economic Cooperation. Only Francisco Franco’s Spain failed to receive an invitation to join the group, though five years later right-wing dictatorships were accepted as allies in the fight against communism. Then the United States extended aid to Spain and received permission to establish air force bases there.

Overall, the United States invested eighteen billion dollars in the European Recovery Program between 1948 and 1952, at a time when the typical American clerical job garnered twenty-four hundred dollars a year. The quickness with which the Marshall Plan beneficiaries rebounded made the plan seem like a panacea for economic backwardness. In 1948 the principles embodied in the Marshall Plan were applied outside Europe in President Harry Truman’s Point Four program for India. The uneven success of this costly effort clearly suggested that economic development required more than money, but this was not a popular conclusion. Many experts spoke, and continue to speak, of market success as a consequence of autonomous laws of nature when history teaches that capitalism functions like other social systems through indeterminate, personal interactions.

Europe in the postwar period is interesting to the history of capitalism because its different trajectory reminds us that there are many ways that enterprise can thrive. After the war it took devastated Western Europe about five years to recover its full industrial power. In 1950 its gross domestic product equaled that of the United States in 1905! That was the last time for such a disparity. The next two decades registered the greatest sustained economic expansion ever recorded until that time. In the four years between 1948 and 1952, Western European economies grew an amazing 10 percent each year.

The Allies’ occupation of Germany, vexed by the conflict between the Soviet Union and the United States, resolved itself by splitting the nation into the Russian-occupied East and the western area, which the United States, Great Britain, and France supervised. The Western powers quickly realized that a recovered economy in West Germany, formally recognized as a separate country in 1949, was essential to their well-being. Just replacing the despised Nazi currency with deutsche marks had an impact. West Germans responded with alacrity to a sound currency. Hoarding stopped; shops became well stocked. It was a dramatic entrance for a currency that has maintained its strength for sixty years. So quickly did it happen that people called it an economic miracle, a term soon applied to the rapid recovery of all of Western Europe.4

Catching up put Western European economies into high gear as they moved beyond restoring their industrial plants to incorporating the technological developments of the past two decades. Capital from the United States greased the wheels of the new locomotive of recovery and supplied a model of economic advance. Western European countries already had the skilled labor force, savvy investors, sophisticated banking systems, and world-class educational institutions needed to revive their leading sectors of steel, automobile making, pharmaceuticals, and electric products. Perhaps the most elusive benefits of the Marshall Plan came from the confidence it conveyed and the easing of national rivalries. In the years between 1948 and 1964 the productivity from capital doubled, pretty much closing the gap between Western Europe and the United States.5 In this environment, even Ireland, Spain, Portugal, and Greece prospered.

Continental Western European countries adopted a corporatist economic form. Governments guided growth with fiscal and monetary policies, central banks virtually monopolized venture capital, and unions secured worker representation on corporate boards. Development with stability became the collective goal. This was especially true in Germany, where the Nazi regime had soured just about everyone on a powerful state, including socialists and the grand industrialists. Instead they sought mechanisms to contain the inevitable jostling for advantage among market participants. This system legitimated interest groups and created new institutions to determine the direction of the economy.6

There were obvious trade-offs in the corporatist and free market economies. Few vulnerable members of society fell through the European safety nets, as they did in the United States. While large corporations sponsored excellent research, especially in pharmaceuticals, innovation took a backseat to security in Europe. Groups making decisions for banks, management, labor, and government proved more risk-averse than individual entrepreneurs. Private persons in the United States found it easier to get backing for new ideas, and they were left to succeed or fail on their own. The economy, as a whole, benefited from the entire lot of efforts to build the proverbial better mousetrap.7 But turbulence remained a prominent feature of the American economy.

The immediate postwar agreements led to sustained international cooperation among the world’s industrial leaders, animated by the sense of mutual concerns that had been missing in the interwar period. Most people realized that economic growth was not a zero-sum game. Nations got richer if their neighbors were rich, as Adam Smith had pointed out long ago. While protective tariffs didn’t disappear, they were moderated considerably from their mid-nineteenth-century highs. Still, all countries backed away from tackling the contentious issue of taking away domestic support from their farmers, a powerful political group everywhere.

The Committee of European Economic Cooperation metamorphosed into the Organization for Economic Cooperation and Development, which extended membership to the United States and Canada and later to Japan and Australia. With its European Payments Union working effectively, world trade grew at an average annual rate of 8 percent. World manufacturing output grew threefold between 1950 and 1973.8 Not only had productivity taken a huge jump, but governments took advantage of increased revenue to provide extensive public services.

New Initiatives in International Cooperation

It is said that it’s an ill wind that doesn’t blow some good. The eruption of two devastating world wars within twenty years of each other would certainly test that proposition. The shortness of the interval of peace explains one good wind. The adult years of men like Jean Monnet and Robert Schuman covered both wars. By the end of the second catastrophic conflict, these leaders were determined to do things differently this time around. Monnet had learned about British, American, and European commerce while representing his family’s brandy firm before becoming a diplomat serving in the League of Nations. Schuman, who had gone from being a German to a French citizen when Alsace-Lorraine was returned to France after World War I, made a career in French politics.

The two men proposed a dramatic plan: link the steel and iron industries of Western Europe under a single authority. This was definitely an idea whose time had come. In 1951, France, Germany, Italy, the Netherlands, Belgium, and Luxembourg formed the European Coal and Steel Community. With one market for coal and steel products, the members hoped to assure a steady supply. They encouraged profit making in order to pay for the constant pace of modernization. What a difference a second bloodbath made! How unlike the vengeful spirit of Georges Clemenceau at the Versailles treaty negotiations was that of Monnet and Schuman and the others who helped them succeed.

While the actual results were more inspirational than practical, the ECSC succeeded in bringing Germany back into the European fold.9 This accomplishment kept the powerful concept of transnational union alive. Six years later the Treaty of Rome created the Common Market, formally known as the European Economic Community. The Maastricht Treaty of 1992 went one step further with the establishment of the European Union and a European citizenship for the people of the initial dozen member states. By the time it was negotiated and ratified, Maastricht’s original economic and monetary union had expanded to include policies for justice, foreign relations, and security. Capitalism triumphed over nationalism.

There’s a crucial point about capitalism to be made here. The economic integration of Europe, while no panacea for all market woes, has been fundamental to the peace and prosperity of its participants. Yet nothing in the behavior patterns promoted by free enterprise points to such a cooperative effort. The replacement of competition with cooperation and a nationalist spirit with an international one came from individuals like Monnet and Schuman, not from any economic laws. These men and others imagined a different world from the one whose horrors they had witnessed. And here is where the critical importance of the Marshall Plan came in. The United States used its gifts to leverage the war-wracked countries to move toward free market institutions. At the same time, the shower of money mitigated the sacrifices demanded by such breathtaking acts of conciliation.10 The shape and direction of capitalism are always set by its participants and never by any inexorable laws. Experts’ generalizations contain the unstated premise of ceteris paribus—this will happen if all else remains the same—but all else rarely stays the same with human beings, especially when successive generations imbibe different lessons.

Unlike American efforts to level the playing field through antitrust litigation, European countries tended to foster a front-runner in their industrial sectors, thinking more in terms of national growth than internal competition. The role of government in the economy was far larger than it had been before the war, but its investment never exceeded one-third of a nation’s total. There was in fact a nice division of responsibility: The government offered help to its citizens who needed it and relied on the private sector to produce goods and services.11 In Europe, many business leaders believed that the social democratic welfare state mitigated public unhappiness during economic downturns and tempered labor agitation for higher wages. With access to the technology generated in the United States and without its military expenditures, it might be said that Western Europe had a good deal.

European countries did exceedingly well in steel production, automobile manufacturing, pharmaceuticals, and electronics. Germany also played a big part in the development of automaking in the postwar era. Karl Benz and Nikolaus Otto had pioneered commercial cars. It took the slowdown in the 1920s for American automakers to get a foothold there. General Motors took over Opel, and Ford established a successful subsidiary. The Depression reduced Germany’s 150 auto companies to a dozen, including Opel and Ford, but the ones that remained were strong.

Automobile Makers and the War

With the coming to power of the Nazis in 1933, car manufacturing had acquired a political cast. Hitler wanted to imitate Ford with a mass-produced car.12 At this point the Austrian automotive wizard Ferdinand Porsche entered the picture. Daimler Motors brought him to Germany but, after its merger with Benz, Porsche failed to please with his ideas for a Mercedes-Benz. He fared better with Hitler, who chose his design for the Strength through Joy automobile. Hitler planned a new factory, a kind of German River Rouge. Its work force was composed of German military prisoners, concentration camp inmates, captured Poles, and Russian POWs; the town that grew up around the plant resembled concentration camps with their accompanying abuses.13

The people’s car never got beyond the prototype. The plant turned out a kind of German jeep during the war until the British army took possession of it in 1945. Renaming the car Volkswagen, the army ordered ten thousand of them. Then it offered the factory to British automakers, who laughed at the VW’s ridiculous shape. Ford wasn’t interested either, nor were French automakers. The plant reverted to the German government.

Meanwhile Ferdinand Porsche was detained for twenty months as a war criminal. The French government arrested another leading auto manufacturer, Louis Renault, for collaborating with the Nazi Vichy government. He died in prison. The complicity of these automobile makers was too egregious to be ignored by Germany or its conquerors. Porsche’s son Ferry was apolitical but, like Ferdinand, a superb designer. Eager to get the money to secure his father’s release, Ferry made a sports car. The Porsche 356 became the first car to carry the Porsche name, soon to be associated with a succession of upscale models.

The government invited the Porsche firm to work on the VW design and gave it a royalty on all future sales of what now was called the Beetle in recognition of the VW’s unique profile. In the ensuing years, Porsche produced close to one hundred thousand 356s while twenty million VW Beetles rolled off the production line and onto the streets of every country. In the 1990s yet another Porsche, Ferdinand’s grandson Ferdinand Piech, brought Volkswagen out of the financial doldrums, making it one of the world’s top four automobile companies. Germany’s postwar rebound owes much to these successes, for one out of every seven jobs in the country depended on automaking with VW, Daimler-Benz, and BMW dominating the market.

The rate of growth in Western Europe after 1950 could not have been sustained without an influx of immigrants, even though European agriculture continued to shed workers as European farmers mechanized. Political instability and economic hardship produced freshets of refugees who were lured to Western European countries by their abundant jobs. Labor shortage became so acute in the 1960s that Germany, France, Switzerland, and Belgium invited in “guest workers” from Portugal, Spain, Italy, Greece, Yugoslavia, Turkey, and North Africa.14 England received immigrants from the Caribbean Commonwealth countries while in a reverse migration some English and Scots moved to New Zealand and Australia. Jewish survivors of Hitler’s concentration camps found new homes in Western Europe, the United States, and the new state of Israel, created from former Palestinian lands in 1948. Emigration to the United States continued strong after World War II, but more people came from the countries of Asia and Central America than from Europe.

With economic growth so strong, immigrants in Western Europe found employment, but not a comfortable place in their chosen society. Not considering themselves “lands of immigrants,” as the United States did, European countries resisted incorporating the newcomers into their fabric. The guest workers tended to be residentially segregated as Latinos and African Americans were in the United States. When growth slowed, as it did in the late 1970s, calls came for sending the “guests” back to their homes.15 Their presence strengthened xenophobic political parties. Still, long-term labor shortages loomed as the baby boom of 1946–1960 passed into retirement and the decline of European birthrates accelerated. By the 1960s countries throughout Europe had passed below the 2.1 replacement rate. Prosperity and a widening ambit of possible careers for women changed the mores of millennia. The individual decision making at the center of capitalism has infected whole societies.16

The American Economy in High Gear

The first two years after the war in the United States saw the swiftest peacetime conversion on record. Government control boards disappeared as fast as the military demobilized its soldiers, sailors, nurses, and merchant mariners. The more than 12 million men and women in uniform dropped to 1.5 million. (In 1939, at the start of European hostilities, the U.S. Army numbered 120,000 officers and soldiers!) Just as quickly as the military shed personnel, the labyrinth of prohibitions, priorities, quotas, limitations, set asides, price controls, subsidies, rationing, and interest rate pegging that had characterized war production disappeared.17 While taking a backseat in economic decisions, Congress greased the wheels of the transition. The top income tax rates remained at 87 percent until 1981, but corporate tax rates came down. Withholding income taxes from wages and salaries had begun during the war and continued. By 1959 the Internal Revenue Service had the world’s largest collection of personal data.

As an expression of gratitude to its veterans, the government dispensed favors that had a salubrious effect on the economic climate. Almost a million veterans took advantage of the GI Bill, which paid the costs of a college or technical education along with a stipend to live on. At the peak year of 1947, nearly half of America’s college students were vets, the majority of them the first in their families to go to college. Quite incidentally this investment in education yielded a talent dividend for years as skilled labor became more and more important in the work force. (So important was it that economists added “human capital” to their discussions of the labor-land-capital component of production.)

Another nine hundred thousand unemployed veterans, almost half of those in the work force without jobs, drew upon the fifty-two weeks of unemployment benefits that Congress voted them. Several programs enabled veterans to get cheap mortgages. This promoted a construction boom. A developer named William Levitt built seventeen thousand houses within a stone’s throw of a large U.S. Steel Company plant on Long Island, New York. Levittown was the first of a number of instant communities. Developers across the land began building tracts of houses on level land within commuting distance of America’s cities. They mass-produced houses from similar blueprints with many items like cabinets trucked in. True to the prejudices of the day, blacks were usually excluded.

Investing as though good times were going to last forever, American firms expanded. They financed conversions and improvements with earnings, wartime savings, and new issues of company stocks and bonds. When unemployment rose above 5 percent, President Dwight D. Eisenhower pushed Congress to pass the Federal Highway Acts of 1954, 1956, and 1958. In Keynesian fashion, government funds poured into building an interstate highway system, with ribbons of four-lane roads tying the country together and generating hundreds of thousands of jobs. As a young lieutenant colonel Eisenhower had participated a generation earlier in the caravan of army vehicles sent across the country to see how easily troops could be moved from the East to the West Coast. “Not very easy” was the answer. The trip took sixty-two days and sometimes required oxen to pull the trucks out of the mud. The new interstate highway system followed the same route as the army convoy of 1919, the old Lincoln Highway.18

Organized labor became a force in the American economy after passage of the Wagner Act, formally known as the National Labor Relations Act of 1935. This Magna Carta for labor gave statutory protection to organizing workers. Public opinion, as well as court decisions, had begun to turn in labor’s favor, first in the twenties for the right to assemble and then during the Depression for the right to organize. Congress restricted the use of injunctions to stop labor meetings; in successive decisions in 1938 and 1939 the Supreme Court interpreted the First Amendment as making streets and parks a “public forum” that protected peaceful picketing.

A bitter rivalry marred labor’s coming into its own when eight unions in the AFL withdrew to protest its indifference to organizing unskilled workers in mass production industries. Their exploratory committee turned into the Congress of Industrial Organizations in 1938. The CIO was much more welcoming to immigrants as well as to African Americans. Under the banner of “Negro and White: Unite and Fight,” the CIO added half a million black workers during World War II. Racism among American unions was just as strong as it was among white-collar workers, but the CIO, led by the fiery mine workers’ leader John L. Lewis, was pushing hard against those destructive attitudes. The CIO also successfully recruited immigrants and their second-generation progeny. Here it acted as a democratic force, showing these outsiders how to claim power at the work site and take up their place in a culturally diverse citizenry.19

With the increase in war production, many companies settled with their laborers in order to win military contracts, swelling the ranks of union members. When after the war those companies tried to scale back wages, unions fought successfully to hold on to or increase their gains. During the decade and a half of citizen solidarity, forged through the shared pain of the Depression and war, labor succeeded in convincing most Americans that wages should not be set by the impersonal workings of some “law” of supply and demand. Rather they put forward the twin goals of achieving a living wage and fully incorporating blue-collar workers into the prosperity beckoning when peace finally came. Disputes continued, but big labor, big industry, and big government found a balance they could work with.

It was a magical combination for the American economy. Where war made most of Europe and part of Asia destitute, the American economy had actually grown 50 percent between 1939 and 1945! Canada and Argentina had grown even faster. People who’d lived in apartments all their lives bought houses; returning service personnel got married and began having those children destined to compose the baby boom of 1946 to 1960. Small businesses flourished, many of them serving families in the proliferating suburbs with bicycle shops, cleaners, and the like. The American peacetime economy, stalled since 1930, moved into high gear, where it stayed for a quarter of a century.

If one of the signs of a healthy economy is its capacity to recover from disruption, the speedy return to peacetime production indicated a strong constitution. The resiliency of the American economy astounds. Between 1945 and 1947, it found work for the nine million veterans who did not take up the GI Bill and it absorbed the twenty-two million employees formerly holding military-related jobs. Pent-up demand and all those savings in war bonds helped, but not as much as the restoration of confidence in a free market.20 Before the war the United States economy had been half the size of the combined economies of Europe, Japan, and the Soviet Union. Seven years later it surpassed all of them together.

A Chilly New Peace

Americans interpreted the Soviet Union’s suppression of its neighbors as part of a plan for world domination. In 1947 President Harry Truman announced his intention to contain communism by sending military and economic support to Greece and Turkey, both fighting Communist insurgencies. The Truman Doctrine advanced the notion that Soviet pressure through surrogate rebel groups, if successful, would produce a domino effect, with one country lost to communism bringing down its neighbors. The increasing mutual hostility between Russia and America inspired a full-scale propaganda battle, backed up by aid like that which Truman sent to Turkey and Greece. Winston Churchill coined a memorable metaphor when he announced in a speech in the United States that an “iron curtain” had dropped down between Eastern and Western Europe. A cold war between the former Allies took over from the hot one that they had fought together. Capitalism became the signature economic system for the West, its wealth-producing capacity carrying new moral overtones.

Russian paranoia and the near hysteria about socialism in the United States worked effectively to build mutual distrust and animosity. Every event became grist for propaganda; every foreign country’s allegiance became a trophy to be won by one side or the other. A realist might add that the two systems of belief and governance were too divergent to make any other outcome likely. Free enterprise, free elections, and the personal freedoms of movement, speech, religion, and political participation came to epitomize the West’s cherished values; the Soviets extolled their full employment, public ownership of the nation’s goods, and equality of treatment for their people. Anticommunism united the nations of Western Europe and the New World. It also limited the range of acceptable political thought in the United States, which threatened to stifle the robust public debates that an economy based on innovation and initiative needed.

The East-West rivalry became more intense than those preceding the First and Second World Wars, but there was a crucial difference that prevented the Cold War from heating up. As World War II ended, the atomic age began. In 1949, after a crash program, Soviet scientists produced an atomic bomb like the ones that the United States had dropped on Japan. Closely following the U.S. program, the Russians then developed the more powerful hydrogen bomb and an intercontinental ballistic missile system for delivering the bombs. Now both countries, appropriately called superpowers, were capable of obliterating each other. Mutual annihilation became a clear and present danger. In the years that followed, seven other countries acquired the secrets of the Manhattan Project through a handoff from their allies. Perhaps four more are moving in that direction today.

Countries that had just come out of World War II now faced the aggressive policies of the Soviet Union. No one had a very good idea of how well the Soviet economy was doing, but everyone did know that the Red Army was a formidable fighting force. During an August night in 1961, when the East German government erected a wall in East Berlin to stop the flow of defectors to the West, Churchill’s iron curtain no longer seemed like a metaphor. Western Europe remained dependent upon the military strength of the United States, which began ringing the Soviet bloc with army bases and missile sites.

Despite several very scary episodes in the 1950s and 1960s, the United States and the USSR managed to curb their extremists and avoid mutual destruction. In this they were helped by the United Nations, formed in San Francisco by fifty participating countries in 1945. Given powers denied the defunct League of Nations, the UN Security Council and General Assembly kept alive the forms of deliberation, if not always their spirit. The UN remained more under the control of the United States than of the Soviet Union, but Russia’s veto power in the Security Council acted as a balancing, if annoying, mechanism.

New Institutions for International Trade

Some farsighted people after the war saw the chance to achieve a relatively free world market, a goal that had eluded the best of intentions earlier. One of the seventeenth-century developments that had given England an economic boost had been the dismantling of local obstacles to trade within the kingdom. In France at the same time you couldn’t drive a cart twenty-five miles without having to pay someone to cross a bridge or pass through a shortcut. The privilege of collecting such fees was highly prized and protected. In England goods and people moved within a single unified market, instead of the local and regional ones that dominated elsewhere. This had been a widely recognized stimulant to development, but national rivalries had nixed any effort to apply it to international trade. Instead countries erected tariffs or filled trade treaties with picayune demands for special treatment for a favored product or interest group.

World War II provided a new opening for international cooperation. The United States supplied its European allies with armaments before its entrance into the war through lend-lease agreements. In these, the American government demanded that after the war, recipients pitch in and help create a multilateral trade world that would speed recovery and promote growth, much as England’s internal market had done three centuries earlier. Despite its history of protecting so-called infant industries, the United States became the strongest advocate of free trade. Particularly offensive to American producers were the favored terms of trade within the British Commonwealth. Like the British powerhouse of the nineteenth century, the United States promoted free trade as a virtue rather than as the advantageous policy of the strong. The United States sometimes acted like a wealthy but grouchy uncle, as in the final settling of the lend-lease agreements.21 Such behavior is not surprising. Rising above obnoxious national postures was the novelty.

The institutions established after the war created a propitious environment for economic development among the countries that participated. The dollar anchored international trade. By 1958 all Western European currencies could be converted easily, helped along by the European Payments Union. Beginning with a grant from the United States, the union promoted multilateral commerce by easing the means of payment. The dollars each country received from the Marshall Plan not only bought necessary goods but enabled the recipients to buy from one another. Accounts for every country were settled at the end of each month, with only large debits or credits settled in gold or dollars.
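The monthly clearing just described is, in essence, multilateral netting. The short Python sketch below is only an illustration of that idea under stated assumptions; the country names, amounts, threshold, and function names are invented for the example and do not reproduce the EPU’s actual quota and settlement rules. Bilateral payments are collapsed into one net position per member, and only positions beyond the assumed threshold are settled in gold or dollars.

    from collections import defaultdict

    def net_positions(bilateral_payments):
        """Collapse a month of bilateral payments into one net position per country.

        bilateral_payments: list of (payer, payee, amount) tuples.
        Returns a dict mapping country -> net balance (positive = net creditor).
        """
        positions = defaultdict(float)
        for payer, payee, amount in bilateral_payments:
            positions[payer] -= amount
            positions[payee] += amount
        return dict(positions)

    def month_end_settlement(positions, hard_threshold=50.0):
        """Return the positions large enough to call for settlement in gold or dollars.

        Balances within the threshold are simply carried as credit with the union;
        the threshold here is an invented figure, not the EPU's actual quota rule.
        """
        return {country: balance for country, balance in positions.items()
                if abs(balance) > hard_threshold}

    # An invented month of trade among three members (millions of dollars).
    payments = [("France", "Germany", 120.0),
                ("Germany", "Italy", 90.0),
                ("Italy", "France", 40.0)]
    positions = net_positions(payments)
    print(positions)                         # {'France': -80.0, 'Germany': 30.0, 'Italy': 50.0}
    print(month_end_settlement(positions))   # only France's deficit exceeds the threshold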

Already geared toward consumption, the American economy boomed when millions of families bought big-ticket items like cars, refrigerators, washing machines, and dryers. A change in lending and borrowing made this great spending spree possible. Earlier, department stores and upscale groceries had created charge accounts. One of the conspicuous features of department store interiors was the pneumatic tubing that carried charge slips from every department up to the credit office, where they were sorted for monthly bills. So credit was not new to Americans, but it had never before been crafted into one of the pillars of prosperity. After the war, banks, retailers, manufacturers, lenders, collection agencies, and state and federal officials took the haphazard local lending industry of America and turned it into a coherent national system.

Americans now had enough purchasing power to pull the incredible flow of goods coming out of the postwar factories right into their houses and garages. From the borrowers’ point of view, buying cars, houses, and major appliances on an installment plan made lots of sense in a period of inflation. With steady and well-paying jobs as abundant as goods, the default rate was minimal. Buying on credit no longer seemed like an indulgence, but rather like prudent spending. By the 1960s national credit cards had begun to take over from individual charge accounts. That same decade saw the beginning of the malling of America as developers began creating entirely new shopping areas, often enclosed within walls with air conditioning against inclement weather. Usually anchored by a major department store, the malls that mushroomed across the country signaled the early obsolescence of downtown retailing. Even in this era of conspicuous consumption, though, creditors continued to discriminate against blacks and women.22 Assumptions based on the separation of a man’s world of work and a woman’s world at home dissolved slowly. Still, female employment began hitting new highs in the 1950s, despite the return to the home of women who had been wartime workers.

Technology’s Social Impact

Into the American living room in the 1950s came the biggest novelty of all, television. The inventor of television, Philo Farnsworth, proves the randomness of mechanical genius. Growing up in Beaver County, Utah, Farnsworth tinkered with electricity from the time he was twelve. The first person to transmit a television picture, in 1927, Farnsworth appropriately chose the dollar sign to send as an image with sixty horizontal lines. Although he lost a patent battle to the Radio Corporation of America, Farnsworth went on to invent 165 other devices, including vacuum tubes, electrical scanners, and the cathode ray. Farnsworth beautifully exemplifies one of the strengths of capitalism’s dependence upon innovation: It can’t ignore outsiders.

World War II gave a tremendous boost to the electronics industry with its developments in radar, sonar, radio navigation systems, and proximity fuses.23 Large orders for these radio-related products left American firms like RCA with expensive laboratories that at last could be devoted to long-delayed television projects. Far from representing a luxury for the very rich, television struck people of modest means as a lifetime entertainment investment, and besides, it could be paid for in installments. RCA, which represented a merger of U.S. and German companies, took the lead in commercializing television. It introduced color TV in the 1950s. By 1960 forty-five million homes had TV sets. Movie attendance took a dive, and radios found their best audiences in cars.

Widespread ownership of TV sets promoted one of the most intrusive novelties of the 1950s, the television commercial. A natural extension of newspaper, magazine, and radio advertising, the TV commercial seemed particularly impertinent. Television stations timed them for maximum viewing, interrupting plays, football games, and the news. The picture that is worth a thousand words became the thirty-second sequence of pictures that informed, persuaded, and irritated. Roundly criticized, TV commercials succeeded in selling everything from deodorants to life insurance. Soon political candidates saw the promise of television commercials for generating support. More effective than door-to-door canvassing, commercials soon took the lion’s share of campaign budgets. Fund raising acquired a new importance in American politics. Once again, commerce showed its power to shape institutions in unexpected ways.

Another newcomer to postwar American consumers was air travel. The U.S. government had promoted aeronautic research after the Wright brothers’ successful flight in 1903 but ceased to do so after World War I. The original airlines like American and United emerged from aircraft companies. Charles Lindbergh drew world attention in 1927, when he flew from New York to Paris in a single-engine monoplane. Lindbergh then became a pilot for Pan American, which, like the other pioneering, commercial airlines, relied on income from carrying the mail, especially to the countries of Latin America. During the 1930s fear and expense curbed commercial flying. One marketing effort to confront these obstacles boomeranged. An airline discovered that wives worried enough about their husbands’ safety to keep them from flying. To address this problem, the company extended free tickets to women who accompanied their husbands on business trips. Following up with questionnaires to the participating spouses, the advertisers discovered—from the angry replies they received—that not all the husbands had taken their wives!

World War II again involved the government in plane design and production. Like much else, air travel took off after the war. Jets took over from propeller planes in the 1960s, replacing such planes as the four-engine Constellation and the DC-3, which had carried cargo or twenty-one passengers for six decades. Jets could carry more passengers and get them where they were going faster. Greeting the new planes at Dulles Airport was a magnificent building designed by Eero Saarinen that looked as though it might take flight itself. At first jets were such a novelty that people went out to their local airport to see them land and take off.24 The Federal Aviation Administration took over safety issues and air traffic control in 1958.

Novelties didn’t end with television and flying. In the 1967 hit movie The Graduate, a family friend assails the hero at his graduation party with “I just want to say one word to you: plastics.” And he was right; there was a great future in plastics. Developed originally as a substitute for ivory in billiard balls, celluloid had intrigued chemists in England, the United States, Switzerland, and France for almost a century. Plastics took off after World War II.25 Then nylon stockings replaced silk ones, Bakelite dinnerware filled kitchen cabinets, and vinyl found its way onto sofas and lounge chairs. Manufacturers used polyethylene, the number one selling plastic, for soda bottles, milk jugs, storage containers, and dry-cleaning bags. Soon plastic Silly Putty hit the toy stores; Velcro came along later to replace buttons, snaps, and shoelaces. Very much a triumph of chemistry, plastics carried synthetics to a new commercial high.

The Push in American Higher Education

The most profound scientific influence on American thinking came not from the United States but from the Soviet Union. In 1957 the Soviets launched a 184-pound satellite into outer space. A month later a heavier Russian spaceship went into orbit with the dog Laika on board. Both transmitted a beep, beep, beep that was heard around the world. Americans were stunned; they had been beaten to the punch. Within four months the United States joined the Soviets in space with Explorer 1, but Sputnik had already done its public relations work, dispelling the notion that the Soviets were backward. The American reaction to this spectacular milestone in technology is what makes Sputnik so important to the history of capitalism. Pundits and politicians agreed that the United States had to make a gargantuan effort to excel in science and engineering; they agreed as well that American universities, not government research facilities, held the key.

Within a decade, public and private universities embarked on expansion programs that had the effect of changing the nature of higher education here and elsewhere. Because sending Sputnik into space represented the acme of achievement, there was no question of watering down college offerings, even with hundreds of thousands of new students. Besides, the GI Bill had shown how students from modest or even poor backgrounds had thrived in college. Women too entered universities in larger numbers in the postwar decades and often moved into nontraditional fields. The push for the inclusion of minority students came a bit later, but the post-Sputnik expansion provided the template for that effort. Enlarging higher education put special pressure on graduate programs to prepare more scientists and scholars for faculties all across the country.

The president of the University of California, Clark Kerr, played a major role in shaping public opinion. In a famous Harvard lecture of 1964, Kerr laid out a vision of a college education as a general right, not as something reserved for the privileged few. When he was born in 1911, only 5 percent of America’s eighteen-year-olds went beyond high school. Now Kerr insisted that the country must make room for every able student. He also called on universities to turn themselves into multiversities, offering a broad range of knowledge, theoretical and practical, ancient and current.26 Sputnik acted as a catalyst, but it had also become increasingly obvious that capitalism’s growth was dependent upon engineers, physicists, business experts, and skilled mechanics.

Responding to this challenge, the California legislature passed the 1960 Master Plan for Higher Education, which developed a three-tiered avenue for students: The top eighth of California’s high school graduates could enter the University of California, the top third of graduates had a guaranteed place in one of the campuses in the state university system, and others could go to community colleges to prepare for later entrance into four-year institutions. Many states followed this model with multiple campuses radiating out from the original state university. In the East, where private education dominated, Massachusetts and New York started their first public university systems.

By expanding American universities rather than institutes of technology like those in California and Massachusetts, the U.S. government became a patron of the liberal arts as well as of the sciences. This is because in the United States the first two years of college are dedicated to what is called general education, unlike other national systems, which have students specialize in secondary school. So along with all the newly minted scientists who found good jobs in higher education, there were thousands in literature, philosophy, history, political science, and sociology who did so as well. With tenured positions within the academy, much of the country’s intelligentsia lost the acerbic tone of skeptical outsiders, common in Europe. The economist Joseph Schumpeter feared that capitalism would fail because of its cultural opponents. The American public has resoundingly supported capitalism and its demands on society in part because it has not been exposed to the withering commentary of critics.

State legislatures and private philanthropists got behind the monumental effort to build university systems by opening up their purses. For that, they expected gratitude from the students. Instead campuses throughout the country and Europe became hotbeds of hotheads. Under the law of unintended consequences, the larger intake of students shaped by a liberal education in a conformist society, as that of the United States was at the height of the Cold War, produced protests and demonstrations over free speech, civil rights, and the war that the United States was fighting in Vietnam. The regents of the University of California removed Kerr in 1967 because of their unhappiness with student activism. By that time he had presided over the expansion of the university to nine campuses. Kerr, who then became head of the Carnegie Commission on Higher Education, commented that he had entered and left office “fired with enthusiasm.”

The Contribution of German Scientists to American Technology

Sputnik did more than promote higher education. It turned the exploration of space into a Cold War competition for which Congress obligingly spent billions of dollars. The United States may have demobilized its armed forces quickly, but it retained a major research and development program for new weaponry, as did the Russians. Sputnik, like America’s Explorer, drew upon German wartime developments. These in turn built on the work of America’s Robert Goddard, Russia’s Konstantin Tsiolkovsky, and Germany’s Hermann Oberth. Goddard had succeeded in firing a rocket using liquid fuel in 1926, but this aroused little interest in the United States. Quite the contrary in Germany. A young Wernher von Braun became fascinated by the possibility of space travel through the writings of Jules Verne and H. G. Wells. He joined a rocket society when he was seventeen in 1929 and learned about the work of Goddard, Tsiolkovsky, and, of course, Oberth. Three years later, von Braun entered the army. With a doctorate at age twenty-two he headed up the so-called rocket team that developed ballistic missiles. Nazi propaganda minister Joseph Goebbels named the first model “Vengeance Weapon No. 2.” Von Braun’s V-2 could deliver a two-thousand-pound warhead five hundred miles at a speed of thirty-five hundred miles per hour. Fortunately, it did not become operational until late in 1944.

But this is where the story of rockets gets really interesting. Although the Germans had relied upon many American patented devices such as gyroscopic controls, they alone possessed the knowledge of how to make liquid-propelled rockets. This pushed the American and Soviet military into a race to locate and bring back home as many scientists as possible once they entered Germany. Von Braun had seen the end of the war coming and was determined to place his work in the hands of the Western powers. He had actually arranged for the surrender of some five hundred German scientists along with lab papers and testing apparatus. Simultaneously in the summer and fall of 1945, the occupying armies were hunting down former Nazis to bring them to trial for war crimes. And here was the rub. The sought-after scientists were Nazis; no one could have worked on such sensitive programs without joining the party or one of its affiliates. Worse, some of them could also be charged as war criminals since they used slave labor in the Baltic factory that produced rockets.

The American State Department considered most of the German scientists unsavory applicants for admittance into the United States. A fight with the War Department ensued. The two departments agreed to a compromise to bring a select group of German scientists to the United States for debriefing. This revealed how extensive and profound German science had been during the war, ranging from work on rocketry to studies of the effects of radiation on the human body. The American military wanted these scientists to continue working in the United States, safe from any prospective enemy. “Ardent” became the relevant adjective to disqualify someone from entrance to the United States. Had he been an ardent Nazi? Another compromise was worked out. Only the scientists whose work appeared vital to U.S. interests would be allowed to emigrate. More than a hundred German physicists and engineers passed this screening. They were labeled “paperclip scientists” because the military reviewers had put paper clips on their papers to signify their importance. Lasting into the 1970s, the Paperclip program brought a total of seventeen hundred German scientists to America, where they laid the foundation for the American space program at White Sands, New Mexico, and Huntsville, Alabama.

This educational push greatly influenced the economy because it represented a huge investment of money and provided the intellectual infrastructure for the new wave of innovations in computers, pharmaceuticals, and aeronautics. Americans got a wonderful system of higher education, but they also took on the burden of paying for an accelerating program of research and development for military hardware from hydrogen bombs and atomic submarines to a full-fledged space program. The goal of security seamlessly succeeded that of winning the war, but wartime attitudes lingered. Secrecy sometimes cloaked inefficiencies in procurement, and members of Congress proved overly accommodating, especially if an item was made in their state. Aware of this, the Defense Department in the 1980s managed to parcel out the parts of the B-2 stealth bomber to every state in the Union.

During the war the army and navy, working on different tracks, developed the machine with the greatest future, the computer. Engineers and mathematicians had been struggling to design a device that could quickly do the complex calculations of modern mathematics. After Pearl Harbor, such an invention became even more imperative to compute firing and bombing tables. In June 1943 work began, under an army contract, on the first electronic, digital computer in the labs of the University of Pennsylvania. The navy followed with a computer designed at the Massachusetts Institute of Technology. This one had a greater memory. The navy was about to abandon the project because of its prohibitive expense when the Soviet Union set off its atomic bomb in 1949. Now there was no turning back. The army’s pioneering ENIAC was a behemoth weighing in at thirty tons! Could anyone then have imagined that in another sixty years, people would be able to buy inexpensive handheld computers with vastly more power, speed, and versatility?

Spending on military research only increased during the long Cold War of 1947–1991. Institutionalizing this new security concern, Congress in 1947 merged the old War and Navy departments, along with a newly created Department of the Air Force, into what became the Department of Defense. The computer introduced a revolutionary concept with widespread applicability, digital transmission. Here information or voices are converted into streams of binary digits (the bits we hear about). Analog transmission sends information as a continuous modulated waveform. During the Second World War the U.S. government funded the research that converted analog voice signals to a digital bit stream and in the 1970s installed the first fiber-optic cable system for digital data transmission.27
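
The contrast between digital and analog transmission can be made concrete with a minimal sketch. The short Python fragment below is only an illustration, not a description of the wartime or Bell System equipment mentioned here; the function name, the sampling rate, and the test tone are assumptions chosen for the example. It samples a continuous waveform at regular intervals and quantizes each sample into a fixed number of bits, which is the essence of turning an analog voice signal into a digital bit stream.

import math

def digitize(signal, sample_rate_hz, duration_s, bits=8):
    # Sample the continuous signal at regular intervals and quantize each
    # sample into one of 2**bits levels (the idea behind pulse-code modulation).
    levels = 2 ** bits
    stream = []
    for n in range(int(sample_rate_hz * duration_s)):
        t = n / sample_rate_hz
        x = signal(t)                                     # continuous value in [-1.0, 1.0]
        q = min(levels - 1, int((x + 1.0) / 2.0 * levels))
        stream.append(format(q, "0{}b".format(bits)))     # each sample becomes a run of bits
    return "".join(stream)

# A 440 Hz tone sampled 8,000 times per second for one millisecond.
tone = lambda t: math.sin(2 * math.pi * 440 * t)
print(digitize(tone, sample_rate_hz=8000, duration_s=0.001))

The point of the sketch is only that a smooth wave ends up as a string of ones and zeros that can be stored, copied, and retransmitted without degradation, which is what gave digital transmission its advantage over the continuous analog wave.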

Contracting with universities, the government invested heavily in the research and development that corporations had already found to be the key to economic success. It took the lead in research in electronics, communications, aerospace design, and materials testing done by physicists, chemists, and ceramicists. The government did the heavy lifting, and in time companies like International Business Machines found commercial uses for much of this research. The American Telephone and Telegraph Company ran its own Bell Laboratories, and pharmaceutical companies also maintained first-rate research facilities of their own.28

Three days before he left office, President Dwight Eisenhower warned about the dangers of something he dubbed a military-industrial complex. Calling attention to the permanent war footing of the country and the vastly more complicated weaponry involved, he asked Americans to be alert to “the equal and opposite danger that public policy could itself become the captive of a scientific, technological elite.” After noting that the United States annually spent more on military security than the net income of all U.S. corporations, Eisenhower urged “the proper meshing of the huge industrial and military machinery of defense with our peaceful methods and goals, so that security and liberty might prosper together.”29 Only the catchphrase “military-industrial complex” caught on; the warning went pretty much unnoticed. Those corporations that benefited from the government’s largess formed themselves into a powerful lobby to ward off any cuts in the Defense Department budget. Eventually American military spending surpassed the total military expenditures of all other nations.

Unhappily this glory period in the history of capitalism brought more environmental damage than all the horrendous destruction of the First and Second World Wars. A new term, “ecology,” the study of the relations of living organisms to their surroundings, began to penetrate public consciousness. What were we doing to our ecology? From an ecological perspective, economic progress had entailed a concerted assault on all facets of the environment. World population had grown from 1.6 billion in 1900 to more than 6 billion in 2007. These new people—burning fossil fuel, disposing of their waste, diverting rivers, dredging, earth moving, and plowing the earth’s surface—polluted, contaminated, and despoiled their habitat. Reversing this appalling process would not be easy.

Constructing magnificent dams in fact became the cherished achievement of nation builders. Their words capture the spirit of economic advance with no hint of the consequences. Winston Churchill, visiting Lake Victoria, saw the waters of Owen Falls rush into the Nile River below and rhapsodized not about the spectacular beauty but about the failure to tap into its force: “So much power running to waste…such a lever to control the natural forces of Africa ungripped, cannot but vex and stimulate the imagination. And what fun to make the immemorial Nile begin its journey by diving into a turbine.” Colonel Gamal Abdel Nasser, who came to power in Egypt in 1952, agreed. With unabated zeal he planned and funded the Aswan Dam, remarking: “In antiquity we built pyramids for the dead. Now we build pyramids for the living.” More briefly, Prime Minister Jawaharlal Nehru called his country’s new dams “temples of modern India.” These dams delivered cheap electricity and regular irrigation water, but at the high cost of interrupting the deposit of river silt on depleted soil, destroying fisheries, and causing salination.30 Environmentalists, a new phenomenon in themselves, delivered a wake-up call about this complex pattern of environmental degradation.

German scientists in the United States continued their research on radiobiology, ophthalmology, and the new field of medicine for those in outer space. All were subject to inquiry about the German use of prisoners in medical experiments without any very satisfactory resolution of the issue. The war created imperatives for combating infections, treating malaria, and healing the wounds of soldiers. This acted as a hothouse for medical research. The British scientist Alexander Fleming had first isolated penicillin when he grew a mold that could dissolve disease-causing bacteria. British scientists came to the United States to continue this research during the war. The Pfizer Company took a heroic risk on a new way to produce penicillin that worked. By D-day, when Allied troops landed in France in June 1944, penicillin was available to treat the wounded. In three years the cost of a dose dropped from twenty dollars to fifty-five cents.

Fleming had published his penicillin results in 1929. Along a different line of research, biochemists discovered another class of drugs, the sulfanilamides, or sulfas, as people quickly shortened the name, which proved particularly effective in combating pneumonia, meningitis, and other bacterial diseases. Sulfas were available in the treatment of the war wounded on both sides; produced in powder form, they could be carried by soldiers into battle as part of their own rations. When the Japanese cut off the Allies’ access to the quinine used to treat malaria, research produced a synthetic drug. Called atabrine, it saved the lives of thousands of Americans fighting in the South Pacific. The American Charles Drew discovered that blood plasma could replace whole blood, which deteriorated rapidly. Soon the Red Cross began operating blood banks. By war’s end it had collected and sent to the battlefronts more than thirteen million units of blood. The pharmaceutical company Squibb pioneered a hypodermic syringe of morphine that medics could easily use on the battlefield.

The deployment of these new drugs and treatments to the civilian population laid the foundation for the postwar expansion of the pharmaceuticals industry. By 1952 Pfizer’s Lloyd Conover had taken a naturally produced drug and modified it to produce the antibiotic tetracycline. From this research a succession of antibiotics directed to specific infections became available throughout the United States and Western Europe. In Europe, governments provided universal health care while the United States stuck to a private system, extending help with the mounting costs of the myriad ways of staying well only to the old and the very poor.

The money the government spent advancing the science and technology behind computers, medical care, and aeronautics had a tremendous impact on the postwar economy, contributing significantly to the innovations that reached the market in the 1970s and 1980s. Not only did government pay for research hard to justify with future profits, but its contracts enhanced the size of each industry’s leaders, leaving them with enough revenues to continue costly research and development programs. The extended period of government support for research demonstrated the critical importance of maintaining a learning base not only for defense but for economic growth as well. Between 1941 and 1960 the government’s funding of R&D increased thirteenfold, until it represented 64 percent of the national total.

Another new element in the postwar economy was the rapidity of technological change. Corporate leaders had constantly to be on the lookout for the next significant racehorse. Yet they always risked betting on the wrong animal or moving their prospect too early or too late. On top of these problems, any new product inevitably destroyed its predecessor, which usually had been bringing in steady revenue and was familiar to both the fabricating and sales staffs.

The Entrance of Computers

One of the great success stories of the postwar era was IBM, which benefited from government research and an inspiring chief executive officer. Starting in 1911 with a line of scales, coffee grinders, time clocks, and adding machines, IBM found its Alfred Sloan when Thomas Watson joined the firm three years later. Like the great entrepreneurs of the nineteenth century, Watson infused his ideas and values into every facet of his firm, creating a loyal group of employees. A generous employer, he paid well and provided good benefits. He even built a country club accessible to all for a dollar a year and saw that regular dinners served there would spare wives some cooking. His obsession with teaching his staff how to please customers prompted him to create a company song and publish a monthly magazine filled with pep talks and product information. His employees responded fervently, imitating their boss in dress and behavior, often hanging Watson’s photograph on their office walls.31

The career of International Business Machines captures all the drama of staying out in front. IBM benefited more than other firms from government spending; various federal contracts paid half its research costs. Major contracts with the Defense Department gave IBM engineers access to the most advanced technology in magnetic core memories and circuit boards. IBM’s key product before the war had been the punch card, a little rectangle that conveyed data through punched holes. IBM steadily refined these cards, going from mechanical to electric to electronic processing. Watson’s idea was to confine all the elaborate collecting of information to one punch, which then could be stored, correlated, or printed. Hating to fire anyone, Watson continued making punch card machines when sales began to fall during the Depression. Happily for him, federal programs like the National Recovery Administration and the Social Security Administration required the manipulation of enormous amounts of data. When calls came for more IBM processing, there the machines were, stored in warehouses! The Nazis used IBM cards to code and manipulate the German census, with grim results.

Because of the complexity of the giant corporations that now dominated the economy, handling data became very important. All the statistics garnered to separate fixed from variable costs, returns on investments, and inventory ratios could be fed into an IBM machine to be sorted out. “Number crunching” entered the lexicon of management. Postwar insurance companies and banks relied on IBM punch card machines or, more exactly, relied upon IBM, for the company offered leasing contracts that included maintenance services. These were highly desirable because the apparatus became increasingly complex.

IBM relied heavily upon its sales personnel and spent considerable sums on their training. The company’s emphasis upon customer relations kept it in first place. Costly as it was to maintain sales staffs of thousands of men and women knowledgeable about IBM’s products, the firm prospered. Computers were too new and complicated for most customers to understand, but they did have confidence that IBM knew what it was doing. Watson kept IBM focused first on punch card machines and then on computers, eschewing the opportunity to become a conglomerate like Remington Rand or RCA. He even turned down the chance to buy the patent for the Xerox machine. Where people once wrote words, they now “processed” them! And all the while the amount of data that could be stored was doubling and tripling while the price of computers dropped. It was a rare technological trajectory, which subsequently became more familiar. In the first three decades after the war, the use of computers spread from the government to businesses to private persons.

IBM teetered on the edge of the computer revolution, uncertain about abandoning its punch card machines. Not until 1950 did it move from electrical to electronic data processing and then mainly because of the influence of Tom Watson, Jr., who became president in 1952. The critical shift was going from cards to magnetic tape. It proved the right move; IBM became the world’s largest producer of computers. In the late 1950s IBM added as many as twenty thousand new employees in some years. The data needing to be filed and manipulated in these marvelous new machines grew exponentially too. IBM also operated a separate branch, World Trade, where computers were made and sold abroad. Watson Senior was not an engineer, but he had an engineer’s cast of mind. Like Ford, he also thought like a salesman, educating his sales force and encouraging close relations with the firm’s customers, whose opinions were carefully weighed in management decisions. Despite all this, Watson remained old-fashioned. When his son decided that it was high time to make an organizational chart, he discovered that forty-seven units had been reporting directly to his father!32

Laws and litigation in the United States promoted and protected competition. After the war, European countries began to follow this American lead. Backed by the authority of the Sherman Antitrust Act, the Justice Department kept pretty close watch on America’s giant corporations. Its lawyers viewed IBM’s leasing policy as a restraint of trade because its comprehensiveness closed out companies that sold peripherals like monitors, printers, synchronous motors, gearboxes, keyboards, and scanners. They also challenged the Radio Corporation of America and American Telephone and Telegraph for their unwillingness to license patents.33

A 1956 consent decree stopped monopoly practices, opening up opportunities in data processing, consumer electronics, and telecommunications. Now both American and foreign companies could obtain precious licenses and develop their own lines in the growing field of electronics. A second antitrust suit in 1969, lodged against IBM’s bundling of services, enlarged competition in the field of plug compatibles.34 Soon the telecommunications industry would feel the sting of antitrust investigations.

Meanwhile IBM designed a computer system that could be disassembled for delivery and reassembled quickly. Over time customers had begun to complain about the fact that IBM’s seven different computers could be used only with their specific peripherals. At this juncture, developing something more adaptable would be risky, but so too would be not doing so. In 1961 IBM committed itself and five billion dollars, the equal of three years’ revenue, to designing a swifter, smaller all-purpose computer to replace all its special-use ones. Its architecture was totally different. Its reception rewarded the risk taken. The impact on the computer industry of the System/360 was revolutionary, but alas, in the postwar environment of constant technological advance, even revolutionary systems hold the day for only a decade or two. Since any innovation kills its predecessors, stakes were high.35

The New Social Force of Labor

In the seventeenth and eighteenth centuries and in much of the nineteenth century, labor in the United States had been scarce. Farmer-owners had represented a large proportion of the working class. With abundant land, the American colonies had reversed the European ratio of abundant population to scarce land. This had a long-lasting effect on attitudes toward workers. European travelers were always astounded by the independence of servants in America. They also remarked on the intelligence and knowledge of men and women in rural areas. The flood of foreign labor into American factories at the end of the nineteenth century changed the character of the working class and created a large social gap between laborers and managers. After World War I, millions of black men and women moved into northern and southern cities to become part of the new proletariat.

It had been a stiff challenge organizing a work force riven by racial, ethnic, and religious differences. Unions, when successful, replaced the paternalism operating in most factories with explicit processes for hiring, firing, promoting, and evaluating workers. Elected stewards became the most important persons on the shop floor.36 For black workers the benefits were enormous, for within segregated plants—and most of them were—bargaining required that they cultivate their own leaders. Union halls offered sociability, entertainment, and education. Organizing had also provoked demonstrations, protests, walkouts, and the famous sit-down strikes in the automobile industry in 1936 and 1937.

Labor representatives won more than higher wages and complaint procedures when they sat down with management at the bargaining table. They gained respect that many managers had long been loath to extend to them. Beneficent employers from Wedgwood to Watson treated their employees as fellow human beings, but it took unions to secure recognition that manual laborers had legitimate interests of their own in the workplace, even if someone else owned it. Labor and management settled most conflicts peacefully, but strikes continued. President Truman ordered the U.S. Army to take over the railroads to end a 1950 strike. He tried to do the same thing with steelworkers in 1952, but these flare-ups did not halt the spread of union shops across America. Still, prosperity offered the best road to higher wages. The percentage of people living below the poverty level went from one-third in 1950 to 10 percent in 1973.37

What Americans didn’t get was a social safety net like those that were being put in place, or perfected, in Europe. Walter Reuther, who had taken a thirty-two-month world tour before the war, working around the globe, became head of the United Auto Workers after the war and threw himself into lobbying Congress for full pensions, health care, and workers’ wage protection during bad times. His efforts coincided with Americans’ growing hostility to the Soviet Union, making his ideas sound like socialism—or worse, communism. They were rejected, so Reuther, who had started out campaigning for all American workers, changed course and won these benefits for UAW members at the bargaining table. Those workers without unions had to get by catch-as-catch-can.38

In retrospect, business leaders like General Motors’ Alfred Sloan made the wrong decision when they opposed public financing of pensions. They saddled their companies with costs that kept growing when the burden might have been spread through public funding, as was the New Deal’s Aid to Families with Dependent Children. Aside from the expansion of higher education, the engine of social reform had an uphill push in the 1960s. Progressive income tax rates and rising wages shrank the gap between rich and poor for twenty years while the economy moved ahead at full tilt. President Lyndon Johnson declared war on poverty, but the real war in Vietnam undercut many of his domestic goals.

The quest for acknowledgment of labor has been made more difficult by the language of economic analysis that depersonalizes workers. Labor is bundled with land and capital as the principal components of enterprise. In a subtle way, this has a dehumanizing effect, for it obscures the enormous difference between the human and material elements in production. We might consider the capitalist perspective that dominates public discourse as another perk for business. A recent New York Times headline announced, LABOR COSTS SOAR IN CHINA.39 Why not say, WORKERS’ WAGES HAVE RISEN IN CHINA? Even liberal institutions like universities act like hard-nosed employers when it comes to their own labor relations. In economic analysis, gains to labor can still be labeled “expropriation of profits by trade unions” and linked analytically to “extortion by organized crime.”40 From an ideological perspective, organized labor started with a deficit, relying, as it must, on collective action in a nation that celebrates the individual, even though it was the giant corporations that did most of the employing.

When companies changed owners, contracts won were lost. Union activity created strong incentives for management to mechanize as many tasks as possible. Far more significant for labor, business interests began a long campaign to push back on the Wagner Act’s support for unions. They succeeded with the Taft-Hartley Act of 1947 in limiting some union activity. From its heyday in the twenty years after World War II, union membership has steadily declined. It peaked in 1970 at 27 percent of all workers; in 1980 one in five workers belonged to a union; in 2007 one in eight, most of them working in the public sector. Legislative efforts to gain a Wagner Act-like protection for state and local government workers foundered in the 1970s.

The ease with which business interests passed the Taft-Hartley Act just a dozen years after the Wagner Act signaled the loss of momentum for organized labor. To avoid unionization, carpet and furniture-making firms began moving their plants to the South. With plants now in the South, manufacturers in the North formed a coalition with southern members of Congress to check labor’s demands for supportive legislation.41 Employers who wanted to keep out all unions kept up a steady eroding pressure on the power of organized labor. Representing a small minority, the business interests in the United States nonetheless obscured the interests of the great majority of wage earners in public discussions. What management lacked in the number of its voters, it compensated for with superb organization.

There were other forces working against labor in the United States. The unions’ reliance on mandatory dues and closed shops offended the sense of fairness of many in the public. Scandals over union bosses and their misuse of funds eroded respect. And then there was the fact that jobs were moving out of the industrial sector to workplaces harder to organize, like restaurants and hospitals. A renewed flow of immigrants, both legal and illegal, gave employers access to a compliant labor force, particularly after a change in the law in 1965 that eliminated the preference for European immigrants.42 Labor even lost out rhetorically as less and less was said about the “working class” and more about the “middle class,” a term that obfuscated the profound differences between well-paid professionals and those who worked but still lived in poverty.

Corporations employ not only laborers but salaried clerical and managerial employees. Their relations have not been as confrontational as those with wage earners, but they were often just as harsh. Top executives could be fired with brutal swiftness. Probably unique was the dismissal of a National Cash Register executive by CEO John Henry Patterson, who dragged the man’s desk outside, doused it with kerosene, and set it on fire. Still others have returned from lunch to discover workers scraping their names off their office doors.43 The stress of middle management—those who mediate between upper management and the work force—has spawned its own extensive literature.

Clerical workers rarely got paid what their skill and responsibilities would merit in other work, but the largely female work force accepted this disparity. When Sandra Day O’Connor, the first female member of the Supreme Court, left Stanford Law School, the only job offered her was as secretary in a law firm. She had graduated second in her class. (The future chief justice William Rehnquist graduated first.) Once women moved into the professions in large numbers in the 1960s, clerical salaries went up. Soon there was a full-blown movement to secure “equal pay for equal work,” a term that originated in the labor movement in the 1930s but came to refer exclusively to pay discrimination against women. In 1963 President John Kennedy signed into law the Equal Pay Act, and the venerable gap between male and female salaries began to close. Since then it has narrowed from fifty-nine cents for every dollar earned by men to seventy-seven cents.

Disparities among all employed Americans shrank all through the postwar era until 1973. The rising tide, extolled in business literature, buoyed by strong unions, did lift all boats. Business gained conspicuous public support in the stock market. The American Telephone & Telegraph Company announced proudly that it had 1 million shareholders. By 1952 there were 6.5 million stockholders, 76 percent of them earning less than ten thousand dollars, the salary of salesmen and entry-level university instructors.

In the liberal postwar environment, government interference with business decisions came from a new origin, the Bill of Rights. In 1955 the Interstate Commerce Commission banned segregation on interstate trains and buses. This proved to be the launching pad for a national movement to disband the segregation of the races in southern public places. Peaceful protesters for a full range of causes won when courts defined malls as public space where the expression of opinion could not be squelched. State and municipal fair housing legislation prohibited landlords from discriminating against prospective tenants, though the implementing machinery was rarely sufficient to police these common practices. In the same spirit, more recently, pharmacists have been denied the power to refuse to fill prescriptions, like birth control pills, that might violate their conscience. Private enterprise, because of its intimate interface with the public, could no longer make arbitrary decisions affecting that public.

“Follow the money” was the advice “Deep Throat” gave the investigative reporters in the Watergate scandal, but that’s not always easy. What happened to the money generated by the golden period of postwar prosperity in the United States, Western Europe, Japan, and parts of Latin America? Certainly we can see that workers in Germany, France, Great Britain, and Scandinavia took a large hunk of it out in leisure, with a predictable drop in productivity. Western European countries increased their investment in underdeveloped countries of the Third World and beefed up support for the World Bank. They paid more for public services. Their guest workers sent home remittances in the billions. At their peak in 2006, remittances from immigrants in the United States to Mexico reached twenty-four billion dollars; remittances represented 29 percent of the Nicaraguan gross domestic product. Similar figures can be found for Turks in Germany, for migrants from Curaçao in the Netherlands, and for those from British Commonwealth islands in the Caribbean.

The End of the Postwar Boom

While most people old enough to remember 1963 can say where they were when John Kennedy was assassinated, few recall their activities in 1973 with any clarity. Only in retrospect does that year emerge as the marker of more peaks and troughs than a roller coaster. The value of the dollar plunged, and the price of oil quadrupled. Union membership in the United States topped out, and the European birthrate began its long slide. Unemployment in the 1970s reached heights not seen since the Great Depression. Even increases in foreign trade, often described as an export boom, came to a rather abrupt stop in 1973 after having brought sustained prosperity to Western Europe and the United States. On average the rate of growth in the capitalist world halved in the next fourteen years.44

American military expenditures for the Vietnam War had greatly increased the number of dollars in circulation. Rather than raise taxes, President Lyndon Johnson preferred to have the Federal Reserve print money. This move exacerbated the ongoing weakening of the world’s major currency. The resulting glut made it difficult for the U.S. Treasury to continue to convert dollars into gold as it had promised to do in the Bretton Woods agreement. Johnson’s successor, Richard Nixon, pulled the dollar off the gold standard in 1971. Now all currencies were free to float. In fact, agitated by worldwide inflation, they splashed around furiously for two years.45

Eroding even faster was American oil production. The United States had supplied almost 90 percent of the oil that the Allies used during World War II. At that time, the Middle Eastern countries, including all of the Arabian Peninsula, produced less than 5 percent. The voracious appetite for petroleum products during the boom period of the 1950s and 1960s changed all that. The Persian Gulf became the center of the oil world. Oil fields in Texas, Oklahoma, and California pumped around the clock, but it wasn’t enough. The United States had lost all its spare capacity at a time when world oil consumption was growing 7.5 percent a year. American production hit its high in 1955, and after that the United States turned increasingly to Mexico, Canada, and Venezuela for its oil. By 1955 two-thirds of the oil going to Europe was passing through the Suez Canal, which had regained the strategic importance lost when Britain left India a decade earlier. By 1973 the days of plentiful, and therefore cheap, oil were a thing of the past. Middle Eastern oil reserves were vast, but the actual production capacity of Arab states met 99 percent of demand, leaving a margin of 1 percent! Policy makers started talking about an oil crisis.

While the economic climate was losing some of its sunshine, far away a perfect storm was brewing. The hostility of the Arab countries to the presence of Israel in their neck of the globe led to the shock that made 1973 a year for capitalist countries to remember. It started on an October afternoon, when 250 Egyptian jets took off for the eastern bank of the Suez Canal to bomb Israeli positions in the Sinai Peninsula. The day was the holiest of the Jewish calendar. The Yom Kippur War might have remained a regional conflict had not other Muslim countries decided to use the “oil weapon.” They raised the price of oil 70 percent and cut production 5 percent for several months running. The price of gas at pumps in Europe and the United States rose twelvefold. In the next two decades the growth rate of the gross national product of the advanced capitalist countries fell from an average of 4.6 percent to 2.6 percent. Inflation found a new partner in unemployment.46

These decisions taken by the Arab members of the Organization of Petroleum Exporting Countries were an announcement of sovereignty; previously they had pretty much taken orders from Western producing companies like Exxon and Shell.47 Panic, shock, and disbelief coursed through the world, intensifying in those prosperous countries that depended most heavily upon petroleum products. The swiftness and unpredictability of the war and the subsequent embargo added more turbulence to the rising prices. The embargo also caused episodic, local shortages. A whole way of life, a whole way of thinking about the future, cracked, if it did not actually shatter. A bit of good wind in this storm of ill winds blew the way of local farmers and craftsmen who recovered old customers lost to larger cities earlier. Higher gas prices raised significantly the transportation component of costs. Flower growers in the upper Connecticut Valley, for instance, got back the trade that had gone to “the flower state” of New Jersey, an example of the old adage that one man’s disappointment is another’s opportunity.

Cassandra was a Trojan princess to whom the god Apollo had given the power of prophecy linked to the fate of never being believed. A U.S. Foreign Service officer named James Akins became a modern-day Cassandra. Undertaking a secret oil study for President Nixon’s State Department, he laid out in great detail the consequences of a rapidly expanding use of oil in the face of America’s declining control over its production. His recommendations sound familiar because they have been posted so many times since: development of synthetic fuels, greater conservation efforts, a hefty gas tax, and research on alternative ways to run industry’s machines.48 Akins’s proposals were summarily dismissed as exaggerated, possibly mendacious, and certainly un-American.

E. F. Schumacher, a German economist working in London, fared a bit better with Small Is Beautiful, a lovely book that appeared in 1973. Schumacher presented the oil crisis as a challenge to the West to mend its profligate ways. His critique of incessant consumption unfolded alongside his poetry, wit, and Buddhist wisdom. Admiral Hyman Rickover, developer of the atomic submarine, had sounded this alarm even earlier, in 1957. Schumacher’s and Rickover’s prescience reveals yet again how self-interest can sharpen the mind. Schumacher worked for Britain’s National Coal Board, and Rickover was a prominent advocate of atomic energy.

For many observers the multiple setbacks of 1973 were just a bump in the road. In one sense they were right, but only if we ignore the shift in perceptions and attitudes. Artists and intellectuals were the first to get bored with the miracle years of prosperity. They began competing with one another to determine whether we had entered a postindustrial age or a postcapitalist or a postmodern one. Whatever it was, it was definitely “post.” In truth many had become sated with the market’s unending succession of new conveniences, new recreations, and new distractions. Several books published in the United States in the early sixties raised major issues that grew more insistent after the 1973 crisis.

Rachel Carson’s Silent Spring caught the nation’s attention in 1962 with its description of the environmental poisoning caused by pesticides and herbicides. A zoologist who had worked for the U.S. Bureau of Fisheries, Carson detonated a firestorm. Waiting in the wings were hundreds of experts who had been studying just how destructive the twentieth century had been to the planet that we inhabit. Environmentalists mounted one of the most successful political movements in history. In 1962 Michael Harrington in his The Other America: Poverty in the United States reminded the public that not everyone was prospering. Three years later Ralph Nader’s Unsafe at Any Speed took on America’s automakers; its subtitle delivers the message: The Designed-in Dangers of the American Automobile. Their words seemed even more prophetic with the multiple blows of an oil crisis, rising unemployment, and an inflation rate spiraling upward.

A younger generation took up the causes of the degrading environment, product safety, and the persisting plight of the poor and made them their own. Drawing upon the Enlightenment tradition of faith in reason with a commitment to social progress, the environmental movement took off in the 1970s. Dedicated to reestablishing a balance between human beings and nature, it also helped ease the tension generated in Western societies by the angry confrontations of the 1960s over civil rights, women’s status, sexual mores, and the war in Vietnam. For decades Westerners had been busy polluting their air, their soil, their waterways, and the habitats of themselves and the animal world without caring much about it.

At the same time, other experts were predicting the inexorable spread of industry. In 1960 W. W. Rostow, who later served as special assistant for national security affairs to President Johnson, published the book that popularized the idea of inexorable modernization. What the general reader took away from The Stages of Economic Growth: A Non-Communist Manifesto was the ease with which a few good Western programs were going to transform the rest of the world in their image. Flush from the success of the Marshall Plan in Western Europe, Westerners had every reason and many incentives to believe this to be true. Traditional societies had built-in resistances to profound changes, much like Europe before the seventeenth century, but outside forces could bring them along with the introduction of technology and, above all, capital. A critical contribution of this theory was the idea that countries were not destined to be backward, poor, and dominated by atavistic cultures; all could aspire to capturing “the magic of modernity.” Today, having been immersed so long in the hopes and disappointments of modernization, we can easily miss how pathbreaking an idea this was.49

Readers could assume after reading Rostow that the quality of labor was a negligible factor in industrialization, which turned out not to be the case.50 Capital was much easier to introduce into underdeveloped countries than were production skills and entrepreneurial energy. America’s Alliance for Progress with Latin American countries had been a failure, as were other efforts to bring the Third World into the First, like the U.S. Point Four program in India. Evidently, it took more than the West’s insistence and World Bank loans to deflect the course of traditional societies. The categorization of traditional and modern began to appear too crude, too truncated to do justice to the variety of social settings in which the world’s people lived and drew meaning.51 As it turned out, the West was merely impatient and had to wait longer to see the sprouts from its investment seeds.

Probably the most striking feature of capitalism has been its inextricable connection with change—relentless disturbances of once-stable material and cultural forms. More than promote change, it offered proof that the common longings of human beings for improvement could be achieved. It opened up to a significant proportion of men and women in the West the possibility of organizing their energy, attention, and talents to follow through on market projects like forging a new trade link or meeting an old need with a commercial product. And one could do it on one’s own. One didn’t have to be tall, good-looking, young, rich, well connected, or even very smart to form a plan, though all those qualities were helpful. Capitalism sustained popular support by commanding this imaginative field. Newcomers found it hard to attract venture capital, and there were more failures than successes. Even when individual ventures were successful, few foresaw the unintended consequences of all this manipulation of nature and society. Turbulence was written into the system, but capitalism had already become self-sustaining before anyone could clearly see this.

It took a bit of time to realize that the shocks of 1973 marked the end of the “golden age” of capitalist prosperity in its homelands. Rising prices usually accompanied periods of growth, but, this time around, they came in with stagnating production. This introduced a new condition and term, “stagflation,” which in turn promoted an interest in monetary theory. In another disturbing trend, the gap between low and high incomes began its long stretch of widening in 1969, though concern with this phenomenon rarely moved beyond rhetoric. With mounting studies documenting the neglect of the environment and the safety of workers and consumers in the most advanced societies, we could say that the greatest chapter in the history of capitalism ended with more of a whimper than a bang.

Meanwhile, back in the laboratories of Intel in Palo Alto, California, and Sony in Shinagawa, Tokyo, engineers were mapping out uses for something called a transistor. The transistor—short for “transfer resistor”—is a device that amplifies or switches the flow of electricity. It had been around for a couple of decades but now was being upgraded. Attached to an electronic circuit board, the transistor could do wondrous things because of its smallness and adaptability. Ingenious people had found a new way to exploit the electromagnetism of our planet. This technological newcomer “creatively destroyed” the vacuum tube that had started off wireless technology. The relentless revolution continued without the benefit of a forward-looking name for the dawning era, though the United States acquired a new place-name, Silicon Valley, where things called start-ups and initial public offerings were creating a new crop of millionaires.
