WHATEVER THE TWISTS AND TURNS in global politics, whatever the ebb of imperial power and the flow of national pride, one trend in the decades following World War II progressed in a straight and rapidly ascending line—the consumption of oil. If it can be said, in the abstract, that the sun energized the planet, it was oil that now powered its human population, both in its familiar forms as fuel and in the proliferation of new petrochemical products. Oil emerged triumphant, the undisputed King, a monarch garbed in a dazzling array of plastics. He was generous to his loyal subjects, sharing his wealth to, and even beyond, the point of waste. His reign was a time of confidence, of growth, of expansion, of astonishing economic performance. His largesse transformed his kingdom, ushering in a new drive-in civilization. It was the Age of Hydrocarbon Man.
Total world energy consumption more than tripled between 1949 and 1972. Yet that growth paled beside the rise in oil demand, which in the same years increased more than five and a half times over. Everywhere, growth in the demand for oil was strong. Between 1948 and 1972, consumption tripled in the United States, from 5.8 to 16.4 million barrels per day—unprecedented except when measured against what was happening elsewhere. In the same years, demand for oil in Western Europe increased fifteen times over, from 970,000 to 14.1 million barrels per day. In Japan, the change was nothing less than spectacular; consumption increased 137 times over, from 32,000 to 4.4 million barrels per day.
What drove this worldwide surge in oil use? First and foremost was the rapid and intense economic growth and the rising incomes that went with it. By the end of the 1960s, the populations of all the industrial nations were enjoying a standard of living that would have seemed far beyond their reach just twenty years before. People had money to spend, and they spent it buying houses, as well as the electrical appliances to go inside those houses and the central heating systems to warm them and the air conditioning to cool them. Families bought one car and then a second. The number of motor vehicles in the United States increased from 45 million in 1949 to 119 million in 1972. Outside the United States, the increase was even more monumental, from 18.9 million vehicles to 161 million. To produce the cars and appliances and package goods, to satisfy directly and indirectly the needs and wants of consumers, factories had to turn out ever-increasing supplies, and those factories were increasingly fueled by oil. The new petrochemical industry transformed oil and natural gas into plastics and a host of chemicals, and in every kind of application, plastics began to replace traditional materials. In a memorable scene in the 1967 motion picture The Graduate, an older man confided the true secret of success to a young man who was undecided about his future: “Plastics.” But, by then, the secret was already everywhere evident.
During the 1950s and 1960s, the price of oil fell until it became very cheap, which also contributed mightily to the swelling of consumption. Many governments encouraged its use to power economic growth and industrial modernization, as well as to meet social and environmental objectives. There was one final reason that the market for oil grew so rapidly. Each oil-exporting country wanted higher volumes of its oil sold in order to gain higher revenues. Using various mixtures of incentives and threats, many of these countries put continuing pressure on their concessionaires to produce more, and that, in turn, gave the companies powerful impetus to push oil aggressively into whatever new markets they could find.
The numbers—oil production, reserves, consumption—all pointed to one thing: Bigger and bigger scale. In every aspect, the oil industry became elephantine. All that growth in production and consumption could not have been accomplished without infrastructure. Multitudes of new refineries were built—larger and larger in size, as they were designed to serve rapidly growing markets and to go for economies of scale. New technologies enabled some refiners to increase the yield of high-value products—gasoline, diesel and jet fuel, and heating oil—from less than 50 percent to 90 percent of a barrel of crude. The result was a sweeping conversion to jet planes, diesel locomotives and trucks, and oil heat in homes. Tanker fleets multiplied, and tankers of conventional size gave way to the huge, seagoing machines called supertankers. Gasoline stations, more and more elaborate, popped up at intersections and along highways throughout the industrial world. Bigger is better—that was the dominant theme in the oil industry. “Bigger is better” also enthralled the consumers of oil. Powered by huge engines and bedecked with chrome and extravagant tail fins, American automobiles grew longer and wider. They got all of eight miles to a gallon of gas.1
Old King Coal Deposed
In the buoyant decades following World War II, a new war was being fought, though not of the sort that was reported in communiques on the front page, but rather one buried in the pages of the day-to-day trade press. An astute student of oil affairs, Paul Frankel, called it “a war of movement.” It was also a war that reflected a great historical transformation for modern industrial societies. It had enormous economic and political consequences and profound impact on international relations and on the organization and patterns of daily life. It was the battle between coal and oil for the hearts and minds, and pocketbooks, of the consumer.
Coal had powered the Industrial Revolution of the eighteenth and nineteenth centuries. Cheap and available, it was truly King. Coal, wrote the nineteenth-century economist W. S. Jevons, “stands not beside but entirely above all other commodities. It is the material energy of the country, the universal aid, the factor in everything we do. With coal almost any feat is possible or easy; without it we are thrown back into the laborious poverty of early times.” King Coal held on to his throne through the first half of the twentieth century. Yet he could not resist, he could not stand unmoved, in the face of the great tidal wave of petroleum that surged out of Venezuela and the Middle East and flowed around the planet after World War II. Oil was abundant. It was environmentally more attractive and easier and more convenient to handle. And oil became cheaper than coal, which proved the most desirable and decisive characteristic of all. Its use provided a competitive advantage for energy-intensive industries. It also gave a competitive advantage to countries that shifted to it.
The wave broke first across the United States. Despite the motorcar, even the United States had remained primarily a coal economy until mid-twentieth century. By then, however, the coal industry’s own cost structure made it a sitting duck. With repeated price cuts, oil was becoming cheaper than coal in terms of energy delivered per dollar. There was yet another compelling reason to switch to oil: labor strife in America’s coal fields. Strikes by coal miners, led by John L. Lewis, the combative president of the United Mine Workers, were virtually an annual ritual. Lewis’s bushy eyebrows became a familiar totem among the nation’s editorial cartoonists, while his bellicose pronouncements shook the confidence of coal’s traditional consumers. Interrupt the production of coal, he boasted, and you can stop “every part of our economy.” To any manufacturer worried about the continuity of his production line, to a utility manager anxious about his ability to meet electricity requirements in the dead of winter, Lewis’s fiery rhetoric and the militancy of his United Mine Workers constituted a powerful invitation to find a substitute for coal. That meant oil, to which there was no such obvious threat, and in particular, fuel oil, a very high proportion of which was imported from Venezuela. “We ought,” a Venezuelan oil man once mused, “to take up a public subscription throughout Venezuela to erect a statue of John L. Lewis in the central square in Caracas—to honor him as one of the greatest benefactors and heroes to the Venezuelan oil industry.”2
The Conversion of Europe
The decline of King Coal followed a somewhat different course in Europe, spurred chiefly by the cheap and readily available oil of the Middle East. The first postwar energy crisis, in 1947, was Europe’s severe shortfall of coal. Its legacy for Britain was a specter of shortage. Fearing that coal supplies would be inadequate, the government began to encourage power plants to switch from coal to oil as a stopgap. But oil was hardly a stopgap. It was a relentless competitor. The 1956 Suez crisis did create a large question mark for Britain and other European countries about the security of Middle Eastern oil supplies. In the immediate aftermath of Suez, Britain decided to push ahead with its first major nuclear energy program to reduce dependence on imported oil. Plans were discussed among the industrial countries to maintain inventories beyond commercial needs—that is, emergency stocks—as a form of insurance against future disruptions. But the security concerns dissipated with surprising quickness, and Europe’s move away from coal continued unabated.
Part of the reason for oil’s victory over coal was environmental, especially in Britain. London had long suffered from “Killer Fogs” as the result of pollution from coal burning, particularly the open fires in houses. So thick were those fogs that confused motorists literally could not find their way home to their own streets and instead would drive their cars onto lawns blocks away from their own houses. Whenever the fogs descended, London’s hospitals would fill with people suffering from acute respiratory ailments. In response, “smokeless zones” were established where the burning of coal for home heating was banned, and in 1956 Parliament passed the Clean Air Act, which favored oil. Still, the biggest force promoting the switch was cost; oil prices were going down, and coal prices were not. From 1958 onward, oil was a cheaper industrial fuel than coal. Homeowners switched to oil (as well as to electricity and, later, to natural gas). The coal industry responded with a vigorous advertising campaign based upon the theme of the “Living Fire.” Despite the rhetoric, when it came to heating homes, coal was a dying ember.
Trying to balance the economic advantages of lower-priced oil against the costs and dislocations and job loss of an embattled coal industry, the British government struggled with policies that would give domestic coal some protection against cheap imported oil. But by the middle 1960s, the government had pretty much concluded that Britain’s international trade position required rapid growth in oil use. Otherwise, British manufacturers would be disadvantaged competing against foreign firms that used cheap petroleum. A government official summed up the transformation: “Oil has become the lifeblood of the economy, as of all other industrialized countries, and it affects every part of it.”
The pattern was indeed being repeated right across Western Europe. By 1960, the French government had officially committed itself to the rationalization and contraction of the domestic coal industry, and to a wholesale switch to oil. The use of oil, it emphasized, provided a way to promote the modernization of its industrial establishment. John Maynard Keynes had once said that “the German empire was built more truly on coal and iron, than on blood and iron.” But Germany, too, converted, as oil became cheaper than coal. The full extent of the conversion was dramatic. In 1955, coal provided 75 percent of total energy use in Western Europe, and petroleum just 23 percent. By 1972, coal’s share had shrunk to 22 percent, while oil’s had risen to 60 percent—almost a complete flip-flop.3
Japan: No Longer Poor
Japan was somewhat slower in initiating the shift to oil. Coal was its traditional basic energy source. Before and during World War II, oil had primarily fueled the military, with a small consuming sector in civilian transportation and a continuing use of kerosene for lighting. Refineries and the rest of the oil infrastructure were devastated in the finale of World War II. Not until 1949 did the American Occupation even permit the reestablishment of oil refining in Japan, and then only under the tutelage of Western companies—Jersey, Socony-Vacuum, Shell, and Gulf. With the end of the Occupation, the regaining of political independence, and the Korean War, Japan embarked on its remarkable process of economic growth.
So successful was the first phase, based on the rapid development of heavy industry and chemicals, that by 1956, the government was already able to make an epochal declaration: “We are no longer living in the days of postwar reconstruction.” Japan would not be poor forever, and coal, it was expected, would fuel the continuing growth. At the beginning of the 1950s, coal provided more than half of Japan’s total energy, and oil only 7 percent—less than firewood! But oil prices kept falling. By the beginning of the 1960s, there could be no question that the government and Japanese industry would bet on oil. As elsewhere, it would free the economy from the threat of labor unrest among coal miners, and once again, oil was much cheaper than coal.
As oil itself became more important to the Japanese economy, the government, as a matter of high policy, sought to reduce the foreign influence in its oil industry. The Ministry of International Trade and Industry, MITI, restructured the Japanese petroleum industry so that independent Japanese refiners would gain a substantial market share in competition with those companies linked directly to the international majors. The independents were seen as more reliable, more committed exclusively to Japanese economic objectives, more assuredly tied in to the Japanese economic and political system. A new oil law in 1962 gave MITI the authority to grant permits to import oil and to allocate sales. It used this power to bolster the independent refiners and to promote competition that would help keep the cost of oil as low as possible. Price wars resulted, as refiners battled strenuously for markets. And as though making up for lost time, Japan completed the conversion to an oil economy at a phenomenally rapid rate. In the second half of the 1960s, while the Japanese economy itself was growing at an altogether extraordinary 11 percent a year, oil demand was growing at an even more extraordinary rate, an annual 18 percent. By the end of the 1960s, oil was providing 70 percent of the total energy consumed in Japan, compared to 7 percent at the beginning of the 1950s!
Much of the increase in the demand for oil reflected the dynamism of Japanese industry. But another force was also at work—the Japanese automotive revolution. In 1955, the Japanese industry produced only 69,000 cars; just thirteen years later, in 1968, that same industry produced 4.1 million cars, of which 85 percent were bought and used within Japan, and only 15 percent were exported. That meant a tremendous rise in domestic gasoline consumption. The great auto export boom, which would help establish Japan as a formidable global economic power, had yet to begin.
The two wunderkinder of the postwar world were Japan and Germany, each of which not merely recovered from defeat but set envied and astonishing standards for economic performance. Looking back on their achievements, the economic historian Alfred Chandler succinctly summed up the recipe for their success: “The German and Japanese miracles were based on improved institutional arrangements and cheap oil.” Not all their allies and competitors had, by any means, the same access to “improved institutional arrangements,” but all had the benefits of abundant petroleum. As a result, in the boom years of the 1950s and the 1960s, economic growth throughout the industrial world was powered by cheap oil. In a mere two decades, a massive change in the underpinnings of industrial society had taken place. On a global basis, coal had provided two-thirds of world energy in 1949. By 1971, oil, along with natural gas, was providing two-thirds of world energy. What the economist Jevons had said in the nineteenth century about coal was now, a century later, true not for coal, but for oil. It stood above all other commodities; it was the universal aid, the factor in almost everything we did.4
The Struggle for Europe
Because of rapid economic growth and industrial expansion, combined with the switch from coal to oil and the advent of the popularly priced automobile, Europe was the most competitive market in the world in the 1950s and 1960s. With protectionist quotas now limiting the amount of oil that could be imported into the United States, all the American companies that had gone overseas to discover oil had to find other markets, and that meant Europe more than anywhere else. At the same time, the producing nations kept up the pressure on the companies to increase volume. “Each year our people made an annual pilgrimage to Kuwait City,” said Gulf executive William King. “It was universally the same. It would be a difficult meeting, with lots of threats and blandishments on both sides. The Kuwaitis would tell us how much they wanted us to increase our liftings there, and we would tell them that it was too much, there weren’t markets for it.” The Kuwaitis would point out that the Iranians had already won an increase. “Finally, both sides would agree on a number—a five or six percent increase.”
Where could all that additional oil be sold? There were some opportunities in the developing world. Gulf constructed a fertilizer plant in South Korea to help it obtain the right to build a refinery and distribution system in that country. It loaned money to such Japanese companies as Idemitsu and Nippon Mining so they could build refineries, with the collateral being a long-term crude contract. But Europe was by far the most significant market. Entry and expansion took not only economic ability, but also political skill, as there was much more direct and indirect government regulation and control than in the United States. For instance, companies could not simply go out and buy a plot of land and put up a gasoline station; governments exerted tight controls in allocating locations, with the result that there was tremendous jockeying for spots. “Competition was horrendously intense in Europe because the amounts of money involved were so great,” King said. “The people from the different companies would speak politely to each other and act friendly, and then we’d all go out and try to steal each other’s markets.”
Shell was the leading European marketer, which meant that it was on the defensive, and had to learn to be much more competitive. In West Germany, for instance, Deutsche Shell announced proudly that its 220 young salesmen were trained in “American-style aggressive selling.” Jersey had to be even more aggressive because it was trying to build its relative position. In Britain, a single gasoline station would often have pumps representing several companies, sometimes selling as many as six different brands. That was heresy to Jersey. It wanted stations that sold its Esso gasoline and only Esso, and it pretty much attained that objective. In order to win the affection of farmers across the Continent who were mechanizing, it sponsored a World Plowing Match in Europe. Calling on the great American tradition, its stations in Europe began to offer road maps and local tour information without charge, to win patronage both from Europeans and from the increasing number of American tourists who had grown up expecting the maps as a constitutional right, gratis.
Among the Goliaths striding across Europe, there were also a number of agile Davids that had developed production and were scrambling for markets, and in so doing, further stimulating the thirst for oil. Among them the most notable was the Continental Oil Company, later Conoco. Continental had begun life in 1929 as the result of a merger between a Rocky Mountain marketing company, originally part of the Standard Oil empire, and an Oklahoma-based crude producer and refiner. The new enterprise was a tightly defined American regional company. Then, in 1947, the board brought in a new president, Leonard McCollum, who had been Standard Oil of New Jersey’s worldwide production coordinator. McCollum wanted to focus on building up the company’s North American production. But he soon found that Continental was at a competitive disadvantage. Lower-cost foreign oil was pouring into the United States in the late 1940s, winning the incremental demand, while Continental’s domestic production was being restricted by prorationing in Texas, Oklahoma, and elsewhere. Continental, McCollum decided, would have to go overseas. The company spent a good deal of money drilling dry holes in Egypt and elsewhere in Africa over the next decade. Yet, despite the headaches and disappointments, McCollum was convinced that it was better, when it came to crude oil, to be a “have” than to be a “have-not” company. “If you set out to be a ‘have,’” he said, “you must have the audacity to acquire as much acreage as possible—to take a big bite. Though a small piece may look like a sure thing, you better take as much as you can so you don’t miss.”
In the mid-1950s, Continental took a considerable bite in Libya, in a partnership with Marathon and Amerada that was called the Oasis Group. At the end of the 1950s, Oasis began to strike it very big in Libya. But, just at that moment, the rules were being drastically changed in Washington, completely undercutting McCollum’s original strategic rationale. The new import quotas pretty much precluded Continental at the time from bringing its cheaper Libyan oil into the U.S. market, as it had planned. That meant the oil had to go elsewhere, and “elsewhere,” of course, meant Western Europe, the most competitive oil market in the world.
At first, Continental sold its surging Libyan output to the established majors and independent refiners in Europe. “We were brand new, and we had to go out and beat the bushes,” recalled one Continental executive. But the company had little flexibility, and it also had to offer considerable price concessions to its buyers. Thus, it faced the classic dilemma—dependence on others. At the turn of the twentieth century, William Mellon had turned Gulf into an integrated company, with its own refining and distribution, so that he would not have to say “by your leave” to Standard Oil or anyone else. Now, sixty years later, McCollum would do the same.
So, in three years, beginning in 1960, the company established its own downstream refining and distribution system in Western Europe and Britain, acquiring where it could, starting from scratch where it could not. Its higher-quality Libyan oil, which was particularly suited to making gasoline, pushed Continental to develop its own networks of gasoline stations. In addition, Continental negotiated long-term contracts with strategically placed independent refiners. It built a very efficient refinery in Britain, where it sold low-cost gasoline under the “Jet” name. By 1964, sixteen years after McCollum had initiated the foreign oil search, Continental was producing more overseas than in the United States. It had become a significant integrated international oil company, which had never been in McCollum’s original plan.
The multiplication of such companies, each organized as more-or-less autonomous chains, increased the competitive pressures in the marketplace and gave further push to the falling oil price. Their success would also stir up nationalist sentiment in the countries that supplied their oil. In short, the companies were most vulnerable at the extreme ends of the production chain, the wellhead and the pump.5
Courting the Consumer
The consumer, particularly the gasoline consumer in his motorized status symbol, was riding high, wide, and handsome in the 1950s and 1960s in America. The deprivation and rationing of the war years were but a distant memory. Huge investments in refinery construction and upgrading, combined with the growing volume of available oil, provided a perfect recipe for hard competition among the gasoline suppliers, driving down the price.
That suited American motorists just fine, especially when they were the beneficiaries of the frequent “price wars.” With gasoline stations sprouting on every street corner, their operators would rush hand-lettered signs out to the very edge of the property, proclaiming that their price was half a cent less than that of the station across the street. The first shot in the price wars was often fired by independent stations, unaffiliated with the larger companies, that picked up cheap surplus gasoline from the secondary market. The majors did not particularly like price wars—they could find themselves vulnerable to charges of predatory pricing—and often adopted a “we were forced” attitude. But despite the protests, the majors would sometimes initiate price wars when aggressively trying to break into new markets.
Competition took other forms as well. Never had motorists been better served. Tires and oil were checked, windows were washed, drinking glasses and sweepstakes entry forms were handed out—and all for free—in order to win and hold the affection of motorists. Credit cards were introduced in the early 1950s to tie the customer to a particular company. Television provided a whole new medium to advertise national brands and lure consumer loyalty. Texaco reached beyond its Metropolitan Opera radio fans to a far broader audience on television, with the Texaco Star Theater and Milton Berle, urging the millions and millions of loyal viewers to “Trust Your Car to the Man Who Wears the Star.” Texaco also proudly assured its patrons that, for their benefit, it had gone so far as to “register” all its restrooms throughout the forty-eight states.
Then there was the great hullabaloo over gasoline additives. Their whole purpose was to carve out brand identification for a product, gasoline, that was, after all, a commodity that was more-or-less the same, whatever its brand name. In a period of a year and a half, in the mid-1950s, thirteen of the top fourteen marketers began to sell new “premium” gasolines, racing hard to outdo one another in extravagant claims. In the years that saw the first tests of a hydrogen bomb, Richfield proclaimed that its gasoline “uses hydrogen for peace,” a bold but rather unexceptional statement, since all hydrocarbons, including gasoline, are composed of molecules that contain hydrogen. Shell claimed that its TCP (tricresyl phosphate), meant to counteract spark-plug fouling, was the “greatest gasoline development in thirty-one years.” Sinclair’s Power-X contained additives that were supposed to inhibit engine rust. Cities Service, not to be left behind, had concluded that, if one additive was grand, five would be fantastic, and introduced “5-D Premium.” As the list went on, the one common claim by all the companies, whatever the additive, was that it was the result of “years of research.”
Shell, pepped up with TCP, increased its sales by 30 percent in one year. This outrage could not be allowed to go unchallenged. Socony-Vacuum rushed out a “confidential” memorandum to its Mobilgas distributors warning that TCP was of little value and might actually damage automobile engines. “No other gasoline like it!” Socony proclaimed of its own “double powered” Mobil gas. Standard Oil of New Jersey went even further, declaring that TCP was a marketing hoax, a cure for a nonexistent problem, as spark-plug fouling did not really ever occur anymore. Instead, Jersey raised octane levels and introduced its own new “Total Power.” With the proliferation of “real” gasoline from so many companies, consumers now had a choice between so-called “regular” or “standard” gasolines and a variety of “high-octane” or “high-test” premiums. In due course, Mobil provided yet another variant, “high energy gasoline,” explaining that in its exclusive refining process, “light, low-energy atoms are replaced by huskier, high-energy atoms.” Naturally enough, a driver of a “high performance” automobile felt compelled to purchase a “high-performance” fuel—for several cents more a gallon—if only for the pleasure, real or imagined, of leaving some other hapless motorist eating his dust back at the stoplight.
Additives were but one way to win the hearts of consumers. In Britain in 1964, as part of its effort to develop a “new look” in gasoline marketing, Jersey came up with the first version of the Esso tiger and the slogan “Put a tiger in your tank.” The tiger made its way throughout Esso’s European marketing system, helping to provide a comprehensive brand identification. Its first appearance in the United States, however, was not all that successful. “This is not a very nice looking tiger,” was the sour judgment of one Jersey executive. Half a decade or so after the tiger’s original appearance, he was redrawn by a young artist who had once worked for Walt Disney Productions. This new, improved tiger, also the result of “years of research,” was friendly looking, cheerful, good-humored, helpful—and one hell of a salesman. A tiger in your tank, it seemed, could be even more effective in selling gasoline than any of the new additives. Irritated by the popularity of the Esso tiger and its advance into increasing numbers of motorists’ tanks in the United States, managers at Shell Oil began to refer to their star additive, TCP, more colloquially as “tom cat piss.”6
The New Way of Life: “Six Sidewalks to the Moon”
The inexorable flow of oil transformed everything in its path. Nowhere was that transformation more dramatic than in the American landscape. The abundance of oil begat the proliferation of the automobile, which begat a completely new way of life. This was indeed the era of Hydrocarbon Man. The bands of public transportation, primarily rail, which had bound Americans to the relatively high-density central city, snapped in the face of the automotive onslaught, and a great wave of suburbanization spread across the land.
While the move to the suburbs had begun in the 1920s, its development had been halted for a decade and a half, first by the Depression and then by World War II. It began anew immediately after the war. Indeed, the starting point may well have been 1946, when a family of builders named Levitt acquired what eventually added up to four thousand acres of potato farms in the town of Hempstead on Long Island, twenty-five miles east of New York City. Soon bulldozers were leveling the land, and building materials were being dropped off by trucks at exactly sixty-foot intervals. Saplings—apple, cherry, and evergreen trees—were planted on each lot. This first Levittown, its houses priced between $7,990 and $9,500, would eventually encompass 17,400 houses and become home to 82,000 people. And Levittown would in due course become the prototype of the postwar suburb, the embodiment of one version of the American dream and an affirmation of American values in an uncertain world. As William Levitt explained, “No man who owns his own house and lot can be a Communist. He has too much to do.”
Suburbanization quickly gathered amazing speed. The number of single-family new housing starts rose from 114,000 in 1944 to 1.7 million in 1950. With the developer’s magic, every kind of terrain—broccoli and spinach and dairy farms, apple orchards, avocado and orange groves, plum and fig groves, old estates, racetracks and garbage dumps, hillside scrub, and just plain desert—gave way to subdivisions. Between 1945 and 1954, 9 million people moved to the suburbs. Millions more followed thereafter. Altogether, between 1950 and 1976, the number of Americans living in central cities grew by 10 million, the number in suburbs by 85 million. By 1976, more Americans lived in suburbs than in either central cities or rural areas. In time it became intellectually fashionable to criticize suburbs for everything from their architecture to their values; but to the millions and millions who made their homes there, the suburbs provided better housing in which to nurture the baby boom as well as privacy, autonomy, space, yards for children to play in, better schools, and greater safety—and a haven for optimism and hope in post-Depression and postwar America.
Suburbanization made the car a virtual necessity, and formerly rural landscapes were reshaped by the pervasive automobile. The skyline of this new America was low, and new institutions emerged to serve the needs of suburban homeowners. Shopping centers, with acres of free parking, became meccas for the consumer and the strategic focus for retailers. There were only eight shopping centers in all of the United States as late as 1946. The first major planned retail shopping center was built in Raleigh, North Carolina, in 1949. By the early 1980s, there were twenty thousand major shopping centers, and they rang up almost two-thirds of all retail sales. The first all-enclosed, climate-controlled mall made its appearance near Minneapolis in 1956.
The word “mo-tel,” it is thought, was coined as early as 1926 in San Luis Obispo, California, and was applied to the clusters of cabins that grew up, often near gasoline stations, along the nation’s highways. But the reputation of this particular creation of the gasoline era was not, at first, anything to brag about. As late as 1940, J. Edgar Hoover, the director of the FBI, put the nation on alert that motels were “camps of crime” and “dens of vice and corruption.” Their main purpose, said the nation’s top G-man, was either as a rendezvous for illicit sex or a hideout for criminals. Warning the nation about the imminent dangers of the “hot pillow trade,” Hoover revealed that the cabins in some motels were rented out for sex as often as sixteen times a night. But respectability emerged out of necessity, as the American family took to the road in the postwar years. It was in 1952 that two entrepreneurs opened a “Holiday Inn” in Memphis. Thereafter, motels popped up, like mushrooms, everywhere. To parents at the end of their tether because of tired, cranky, quarrelsome children in the back seat, the green Holiday Inn sign, coming into view far down the road at dusk after a long day’s driving, was a desperately craved and infinitely welcome beacon of respite, relief, and even salvation. And throughout America, the whole family could find plenty of room at the motor inn, respectably equipped with televisions, individually wrapped bars of soap, “magic fingers” vibrator beds—and, out in the corridor, ice and soda pop machines.
People had to be fed, too, whether they were simply out for a drive in their own suburb or making a long-distance trip, so the nature of eateries changed as well. The nation’s first drive-in restaurant, Royce Hailey’s Pig Stand, had opened in Dallas in 1921. But it was not until 1948 that two brothers named McDonald fired the carhops at their restaurant in San Bernardino, California, sharply reduced the offerings on their menu, and introduced assembly-line-like food production. The new era of fast food, however, really began in earnest in 1954, when a milkshake machine salesman named Ray Kroc teamed up with the two McDonald brothers. The following year, they opened the first of their new outlets, called McDonald’s, in a suburb of Chicago, and yes, the rest was history.
America had become a drive-in society. In Orange County, California, it was possible to attend religious services sitting in your car at the “world’s largest drive-in church.” In Texas, you could sign up for your courses at a community college at the drive-in registration window. Movies flickered on the huge screens of drive-in theaters, dubbed “passion pits” by their teenage patrons. The annual model change in automobile showrooms in early autumn was a time of national celebration, when the entire populace seemed to pause reverentially to “ooh” and “aah” at Detroit’s latest innovations—wrap-around bumpers, more chrome, or longer tail fins—the tail fins being proffered on the grounds that they were needed to contain the complex system of lights that were now to be found at the back of a car. Sometimes, the additional argument was made that “these elaborate fins had a stabilizing effect on the vehicle’s motion.” As one scholar noted, that “might well have been so if the car was airborne.” But it was not. Still, it burned rubber on the ground, whizzing along at ever greater speeds, carrying its passengers to and from work and even serving as a mobile office—a boon to traveling salesmen. Ninety percent of American families took their vacations by car, and in 1964, some lucky motorist shoved into his glove compartment the five billionth road map provided free by an American gasoline station. Obtaining a learner’s permit and then a driver’s license became the major rites of passage for teenagers, their own “wheels” the most important symbol of their maturity and independence. The automobile was also absolutely central to dating, going steady, the acquisition of carnal knowledge, and the rituals of courtship. One survey in the late 1960s found that almost 40 percent of all marriages in America were proposed in a car.
The arteries and veins of this new way of life were the roads and highways. And here, as in so many other ways, public policy supported the requisite development. With a hike in the California gasoline tax in 1947, the building of the Los Angeles freeways began in earnest—including the all-critical downtown “interchange,” which tied individual freeways into a single grand system. In the same year, across the country in New Jersey, Governor Alfred E. Driscoll revealed in his inaugural address a vision for the great society in his state—a turnpike stretching from one end of the Garden State to the other, which would bring an end to the congestion and permanent traffic jam that was threatening New Jersey in the postwar years, and would save cross-state motorists an hour and ten minutes. Nothing, Driscoll believed, was more important for New Jersey’s future than a turnpike.
Construction started in 1949, exciting enormous enthusiasm in the state, where it was hailed as the “miracle turnpike” and “tomorrow’s highway built today.” There were no environmental impact studies in those days, no antidevelopment litigation, only the sense that in America you could get important things done quickly, and the whole job, from the first planning to the last toll booth, was accomplished in less than two years. The opening was celebrated with a breakfast to remember, every detail of which was personally overseen by none other than the grand maestro of America’s highway menu, the restaurateur Howard Johnson himself.
The New Jersey Turnpike soon became the busiest toll road in the United States, and probably the world. Its landmarks were the rest areas—the Walt Whitman near exit 3, the Thomas Edison, the Dolley Madison, the Vince Lombardi, and the rest—and the orange-tiled roofs of Howard Johnson’s restaurants. At the turnpike’s opening, Governor Driscoll announced: “The Turnpike has permitted New Jersey to emerge from behind the billboards, the hot dog stands, and the junkyards. Motorists can now see the beauty of the real New Jersey.” Few motorists would agree with that description. Other turnpikes were far more beautiful: the Merritt Parkway in Connecticut, the Taconic Parkway in New York. “It’s difficult to obscure major features of the landscape altogether,” one critic wrote, “but the [New Jersey] Turnpike manages it.” It was built, however, for speed and convenience, not beauty, a functional monument to Hydrocarbon Man’s urgent need to get expeditiously from one place to another. And it was only one short pathway in an ever-lengthening maze.
In 1919, Major Dwight D. Eisenhower had led his motorized military expedition across the United States with barely a road system to follow. It got him thinking about the motorways of the future. Thirty-seven years later, in 1956, President Dwight D. Eisenhower signed the Interstate Highway Bill, which provided for a 41,000-mile superhighway system (later raised to 42,500 miles) that would crisscross the nation. The Federal government would pay 90 percent of the cost, with most of the money coming from a specially designated, nondivertible highway trust fund accumulated out of gasoline taxes. The program was actively advocated and promoted by a broad coalition of interests that became known as the “highway lobby”—automobile makers, state governments, truckers, car dealers, oil companies, rubber companies, trade unions, real estate developers. There was even the American Parking Association; after all, no matter how great the distance covered, drivers would eventually have to come to the end of their trips—and park their cars.
Eisenhower himself advocated the interstate highway program on several grounds: safety, congestion, the many billions of dollars wasted because of inefficient road transport, and, evoking the darkest fears of the Cold War, the requirements of civil defense. “In case of atomic attack on our cities,” he said, “the road net must permit quick evacuation of target areas.” The resulting program was massive, and Eisenhower took great pride in the scale of the construction, using wondrous and mesmerizing comparisons. “The total pavement of the system would make a parking lot big enough to hold two-thirds of all the automobiles in the United States,” he said. “The amount of concrete poured to form these roadways would build eighty Hoover Dams or six sidewalks to the moon. To build them, bulldozers and shovels would move enough dirt and rock to bury all of Connecticut two feet deep. More than any single action by the government since the end of the war, this would change the face of America.” His words were, if anything, an understatement. Meanwhile, public transport and the railroads would be the losers as Americans and American goods began to move in ever-larger streams along an endless ribbon of roads. If, in those expansive years, bigger was better, so was longer and wider.
Even in their living rooms, oil became part of the lives of Americans. Upwards of 60 million of them were entertained each week by a situation comedy called The Beverly Hillbillies, which became an instantaneous hit when it took to the airwaves in 1962 and was the number-one-rated show for a couple of years. Many millions more watched it elsewhere in the world. It was the story of the Clampetts, an Ozark hillbilly family that struck it rich when an oil well came in in their front yard and that forthwith left the hills for a mansion in Beverly Hills. The joke was their naiveté and innocence of “big city ways.” Viewers not only adored the show and the lovable oil multimillionaires but found that they couldn’t get the theme song out of their heads:
Come and listen to a story ’bout a man named Jed,
A poor mountaineer, barely kept his family fed,
Then one day he was shootin’ at some food,
When up through the ground come a-bubblin’ crude,
—oil that is, black gold, Texas tea.
The Beverly Hillbillies was a celebration of sorts. For, in truth, oil was not only “black gold” for the lucky Clampetts, but also “black gold” for consumers, enriching the industrial world by what it made possible. And yet there was a haunting question: How reliable was the flow of petroleum on which Hydrocarbon Man had come to depend? What were the risks?7
Crisis Again: “A Recurring Bad Dream”
Though Egypt’s Gamal Abdel Nasser had no oil with which to assert his will, he had military force. He was intent on shoring up his prestige, which had declined in the Arab world in the 1960s. He wanted to avenge Israel’s battlefield successes in 1956, and he reiterated his calls for the “liquidation” of Israel. His political victory in 1956, despite military defeat, had made him overconfident of his luck. He was also being dragged along by Syria, which was sponsoring terrorist attacks on Israel, and he could not allow himself to be seen as insufficiently militant. In May 1967, Nasser ordered the United Nations observers, who had been on duty since the conclusion of the 1956 Suez Crisis, out of Egypt. He instituted a blockade against Israeli shipping in the Gulf of Aqaba, cutting off its southern port of Eilat and threatening to interrupt its ability to import petroleum, and he sent Egyptian troops marching back into the Sinai. King Hussein of Jordan put his armed forces under Egyptian command in case of conflict. Egypt began airlifting soldiers and military materiel into Jordan; and other Arab states were already sending, or planning to send, their own troops to Egypt. On June 4, Iraq adhered to the new Jordanian-Egyptian military agreement. To the Israelis, watching the mobilization of Arab military might all around them, the noose seemed to be growing very tight.
The next morning, June 5, at about eight o’clock, they responded by preempting and going on the offensive. The Third Arab-Israeli War, the Six-Day War, had begun. Gambling everything, Israel succeeded in the very first hours in catching on the ground the entire air forces of Egypt and the other belligerent states and quickly obliterated them. With mastery of the air thus assured, Israeli forces threw back the Arab armies. Indeed, insofar as Egypt and Jordan were concerned, the outcome of the Six-Day War was decided within three days. The Egyptian forces in the Sinai collapsed. By June 8, the Israeli Army had completely traversed the Sinai, destroying in the process 80 percent of all Egyptian materiel, according to Nasser himself, and had reached the eastern bank of the Suez Canal. Over the next few days, cease-fires were hastily put into place. But they left Israel in command of the Sinai, all of Jerusalem and the West Bank, and the Golan Heights.8
Among the Arabs, there had been talk for more than a decade about wielding the “oil weapon.” Now was their chance. On June 6, the day after fighting began, Arab oil ministers formally called for an oil embargo against countries friendly to Israel. Saudi Arabia, Kuwait, Iraq, Libya, and Algeria thereupon banned shipments to the United States, Britain, and, to a lesser degree, West Germany. “In compliance with the decision of the Council of Ministers taken in the session held last night,” Ahmed Zaki Yamani informed the Aramco companies on June 7, “you are requested hereby not to ship oil to the United States of America or the United Kingdom. You should see that this is strictly implemented and your company will be gravely responsible if any drop of our oil reaches the land of the said two states.”
Why would the oil-exporting countries deliberately cut off their own major source of revenues? For some, the decision was influenced by disturbances within their own borders—strikes by oil field workers, riots, sabotage—and by their fear of the ability of even a politically crippled Nasser to fire up the masses and street mobs over transistor radios. The worst disturbances were in Libya, where foreign oil company offices and personnel were assaulted by mobs; and a huge evacuation program, with planes leaving Wheelus every half hour, was quickly instituted for the Western oil workers and their families. Production was also interrupted by strikes and sabotage in Saudi Arabia and Kuwait.
By June 8, the flow of Arab oil had been reduced by 60 percent. Saudi Arabia and Libya were completely shut down. The huge Iranian refinery at Abadan was closed because Iraqi ship pilots were refusing to work in the Shatt-al-Arab waterway. The overall initial loss of Middle Eastern oil was six million barrels per day. Moreover, logistics were in total chaos not only because of the interruptions but also because, as in 1956, the Suez Canal and the pipelines from Iraq and Saudi Arabia to the Mediterranean were closed. “The crisis is more serious than at the time of the Suez blockage in 1956–57,” said a United States Assistant Secretary of the Interior on June 27. “At that time no major producer except northern Iraq was closed and the problem was exclusively one of transportation. Now … three-quarters of [Western Europe’s oil] comes from the Arab regions of the Middle East and North Africa, one-half of which is now out of production. Europe is, therefore, facing an immediate petroleum shortage of critical proportions.”9
The situation grew more threatening in late June and early July when, coincidentally, civil war broke out in Nigeria. That country’s Eastern Region, in which newly developed significant oil production was concentrated, wanted a bigger share of government oil revenues. The Nigerian government said no. Beneath the struggle over oil revenues were deep-seated ethnic and religious conflicts. The Eastern Region, calling itself Biafra, seceded, and the Nigerian government instituted a blockade against oil exports. The resulting conflict removed another 500,000 barrels per day from the world market at a critical moment.
Partly because of the overwhelming concentration on Vietnam, American policymaking on the Six-Day War took on so ad hoc a quality that it became known to participants as “the floating crap game.” In an effort to coordinate policy better, President Johnson established a special Ex-Com, chaired by McGeorge Bundy and modeled on the Ex-Com that John Kennedy had used during the Cuban Missile Crisis—and known thereafter as the “Unknown Ex-Com.” Bundy’s committee devoted much of its time to considering the implications of the closure of the Suez Canal. Meanwhile, the oil companies were compelled to take hasty and drastic action. The Interior Department in Washington, reverting once again to authorization dating from the Korean War, activated a Foreign Petroleum Supply Committee, composed of about two dozen American oil companies. If necessary, the antitrust laws could have been suspended so that the companies could jointly manage logistics and institute another oil lift to Europe. This was the same committee that had been called into action during the Iranian nationalization crisis of 1951–53 and again during the Suez Crisis in 1956–57. Said a lawyer who became, as he had in the previous crises, an adviser to the committee, “It’s like a recurring bad dream.”
The working assumption was that the oil committee of the Organization for Economic Cooperation and Development, representing the industrial countries, would, in the event of a crisis, declare an emergency and implement a “Suez system,” as in 1956, and coordinate the overall allocation among the Western countries. Yet when the United States requested such a step, many OECD countries, confident that they would be able to make their own supply arrangements, resisted. American officials were shocked. Without an OECD resolution to the effect that an emergency was at hand, the Justice Department would not grant the antitrust waiver necessary for American companies to cooperate with each other. Only when the United States warned that, without an OECD statement, American companies would not share information (and, by implication, oil) with foreign companies did the OECD, with abstentions by France, Germany, and Turkey, approve a motion that the “threat of an emergency” existed, thus allowing both American and international coordination measures to be put into effect.
The major problem once again turned out to be tankers and logistics. The normal flow of oil had to be massively reorganized. Petroleum from non-Arab sources was diverted to the embargoed countries (or, in the case of the United States, transported from the Gulf Coast to the East Coast), while the Arab oil originally destined for the United States, Britain, and Germany was sent elsewhere. The closure of the Suez Canal and the Mediterranean pipelines meant, as in 1956, much longer journeys around the Cape of Good Hope and thus resulted in a mad scramble for tankers. BP found the job of reorganizing transportation so complex that it gave up using computers—it could not write the programs quickly enough—and went back to pencil and paper. Yet the requirements of the much-longer voyages could be more easily met than was expected owing to the development of “supertankers,” an innovation spurred by the 1956 Suez crisis. By 1967, a mere eleven years after that crisis, supertankers five times larger than the tankers of 1956 were available. And six Japanese-built supertankers, each 300,000 deadweight-tons, seven times larger than the standard 1956 tanker, were poised to go into service, shuttling between the Persian Gulf and Europe.
Despite the high anxiety and uncertainty, the problems proved less severe than might have been expected. The domestic scene in the Arab countries calmed down, and once the Arab exporters got their production back into operation, the maximum loss was about 1.5 million barrels per day—the amount of Arab oil that normally went to the three embargoed countries, the United States, Britain, and Germany. The missing 1.5 million barrels could be made up in the very short term by drawing on the high stock levels and, over time, by additional production elsewhere. Seven years earlier, in 1960, the U.S. National Security Council had described American shut-in production as “Europe’s principal safety factor in the event of denial of Middle East oil.” That hypothesis was borne out in 1967. The national security arguments by some in the American government—and by the Texas independents—in favor of prorationing were vindicated; America had a large reserve capacity of shut-in oil that could be quickly called into production (though the reserve may not have been as large as was publicly claimed). With dispensation from the Texas Railroad Commission and the corresponding agencies in other states, American output surged by almost a million barrels per day. Venezuela’s output increased by over 400,000 barrels per day, and Iran’s by about 200,000. Indonesia also stepped up its production.10
By July 1967, a mere month after the Six-Day War, it was clear that the “Arab oil weapon” and the “selective embargo” were a failure; supplies were being redistributed to where they were needed. The Foreign Petroleum Supply Committee stuck to an informational and advisory role; the formal emergency machinery for joint operations and antitrust exemptions never needed to be implemented. The international companies themselves, working individually, had managed to handle the situation.
The biggest losers turned out to be the countries that instituted the embargoes. They were giving up substantial revenues to no obvious effect. Moreover, they were being called upon to pick up the bill and provide large, continuing subsidies to Egypt and the other “front-line” Arab states. Zaki Yamani began to question publicly the value of the embargo under such circumstances. Not everyone agreed. Iraq called for a complete three-month embargo on all shipments of oil to all customers, to teach the West a lesson. But Iraq found no takers among its Arab brethren. At an Arab summit meeting in Khartoum in late August 1967, Nasser, who had left 150 senior officers under arrest in Cairo to forestall a coup, admitted that his country was totally broke and desperately needed money. The assembled leaders concluded that pumping oil, and earning oil revenues, was the right thing to do; doing so represented an assertion of “positive” Arab strategy. By the beginning of September, the embargo on exports to the United States, Britain, and Germany had been lifted.
At this point, the risk of any shortage had disappeared. Even in August, while still observing the selective embargo, Arab oil producers had boosted their overall output to make up for lost volume and to hold on to overall market share. As a result, total Arab production was actually 8 percent higher in August than it had been in May, before the Six-Day War! The increases in Arab output alone were double what had been lost because of the Nigerian civil war.
Though this latest disruption had been dealt with rather easily, it could have been far more severe and difficult had overall production in the various exporting countries continued to be interrupted, whether by decision or by political unrest, or had it taken place under different market conditions. The U.S. Department of the Interior, in its report on the management of the crisis, drew two lessons: the importance of diversifying sources of supply and of maintaining “a large, flexible tanker fleet.” In the aftermath of the crisis, the Shah, always eager for higher production, came up with an ingenious notion that he thought would appeal to policymakers in Washington and win their backing in his continuing struggle with the oil companies. Iran, he said, should obtain a special American import quota for oil that would be stockpiled as a strategic reserve in old salt mines. This would give the United States greater security and supply flexibility, and would also give him a new market outlet. But it would take another oil crisis before the sensible idea of a stockpile was acted upon.
It was clear by the autumn of 1967 that available supplies would, at least in the short term, actually exceed demand as a result of the worldwide surge in production following the Six-Day War. In October, a lead story in the Wall Street Journal was headlined, “Shortage Fears Raised by Mideast War Yield to Threat of New Glut.” Oil and Gas Journal was already warning about a new crisis—“over-supply.” Executives in the industry no longer worried about availability of supply but instead recalled how the response to the 1956 Suez crisis had intensified the glut in the late 1950s, resulting in the imposition of U.S. import quotas, cuts in the posted price—and the birth of OPEC. Once again the pendulum appeared to be swinging in the all-too-familiar progression from shortage to glut.11
The Cassandra at the Coal Board
The outcome of the Six-Day War seemed to confirm how secure the supply of oil was. And Hydrocarbon Man continued to take his petroleum for granted. It defined and motivated his life, but because it was so pervasive, and so readily available, he hardly thought about it. After all, the oil was there, it was endlessly abundant, and it was cheap. It flowed like water. The surplus had lasted for almost twenty years, and the general view was that it would continue indefinitely, a permanent condition. That was certainly the way things looked to most people within the oil industry. “The ‘over-hang’ of surplus crude avails is very large,” said a study by Standard Oil of California (Chevron) in late 1968. “Pressures will exist to continue to produce in many areas in excess of market requirements.” If consumers gave the matter any consideration at all, they too would have expected cheap oil to continue as virtually a birthright, rather than the product of certain circumstances that could change; and their main concern would have been nothing more than whether to drive a couple of extra blocks to save two cents a gallon during a price war.
There were mavericks, skeptics, who questioned and said unfashionable things, but they were few. One was a German-born economist, E. F. Schumacher, who had studied at Oxford as a German Rhodes Scholar, and then at Columbia University, and then emigrated permanently to England in the late 1930s. A sometime writer for the Economist and the Times of London, he became in 1950 the economic adviser to the National Coal Board, which controlled the industry that Britain had nationalized after the war. It was a position he would hold in virtual anonymity for two decades. But Fritz Schumacher had a fertile, broad-ranging mind. He became fascinated with Buddhism, and he investigated what he called “intermediate technologies” for developing countries as an alternative to the high-cost, showcase industrial projects that were copied from the West.
As economic adviser to the Coal Board, Schumacher also had a specific agenda to defend. He was charged with providing the intellectual fodder for the coal industry in its great struggle against oil for market share. He was one of the strongest minds on what turned out to be—inevitably, it seemed—the losing side of that battle, and it was with bitterness and regret that he observed coal so unceremoniously deposed as “the universal aid.” Later, he would be much celebrated by environmentalists, yet he was defending coal, a dirtier fuel, against oil. But his focus was on the problem of depletion, not on the effects of combustion, which would much concern his followers two decades later.
“There is no substitute for energy,” Schumacher said in 1964, echoing Jevons, the nineteenth-century economist and celebrator of coal. “The whole edifice of modern life is built upon it. Although energy can be bought and sold like any other commodity, it is not ‘just another commodity,’ but the precondition of all commodities, a basic factor equally with air, water and earth.” Schumacher argued vigorously for the use of coal to supply the world’s energy needs. Oil, he believed, was a finite resource that should not be used wantonly. He also thought that it would not always be cheap, as reserves dwindled and exporters sought to capture a larger and larger share of the rents. More specifically, he warned about dependence on Middle Eastern oil. “The richest and cheapest reserves are located in some of the world’s most unstable countries,” he wrote. “Faced with such uncertainty, it is tempting to abandon the quest for a long-term view and simply to hope for the best.”
In an age of optimism, Schumacher’s own long-term view was glum. He expressed the risks in economic terms. Given fast consumption growth and low prices, he warned, “the world’s oil supply will not be ensured for the next twenty years, certainly not at current prices.” Once he was even tempted to put his warning in more metaphysical terms. Citing an eminent Oxford economics professor, he declared that “the twilight of the fuel gods will be upon us in the not very distant future.”
But his was a voice in the wilderness. There still was a huge oil surplus, and Schumacher continued to issue his jeremiads to an indifferent and uninterested public. In 1970, discouraged and figuring that he had done all he could in his battle against oil, he retired from the Coal Board. He had scored little with his arguments; indeed, his years at the Coal Board coincided almost exactly with the two decades in which petroleum had mercilessly dethroned Old King Coal and assumed suzerainty over industrial society. “The chickens are about to come home to roost,” Schumacher took to saying with resignation. At that point, he seemed an irritable, crankish killjoy, who had refused to enjoy the party. But he would soon publish a book that would challenge the precepts of the Hydrocarbon Age and the very foundations of the enthrallment of “Bigger is Better.” And before very long, events would make him look less like a killjoy and more like a prophet.12