During the latter part of the cold war, an eminent historian described American-Soviet competition in the Middle East as “new wine in old bottles.” What he meant by this was that the cold war struggle for influence in the region might be seen as an extension of the Eastern Question of the nineteenth century. Once again, great powers outside the Middle East intervened in the region to gain strategic advantage over their rivals. Only the cast of players and their immediate goals changed. Instead of the main actors being Great Britain, imperial Russia, and France, the main actors in the cold war drama were the United States and the Soviet Union. Instead of great powers defining their interests in terms of protecting their route to India or seeking warm water ports, the great powers defined them in terms of a struggle between rival ideological systems locked in a titanic contest for the future of the world. Each viewed their competition in the Middle East as just one more front in that contest.
Historians debate the exact date of the end of the cold war (1989? 1991?), and the road to the much heralded “new world order” has proved to be quite a bit bumpier than many had expected. Nevertheless, the world in which we currently live is a world defined by the defeat of the Soviet Union in the cold war by the United States and its allies. It is also a world in which the United States has held an undisputedly dominant position in international affairs. For these reasons, this chapter is written from the standpoint of the sole remaining superpower, the United States.
Before World War II, the Middle East held little interest for the U.S. government. This is not to say that private citizens and nongovernmental groups ignored the region. Ever since the first governor of the Massachusetts colony, John Winthrop, called on colonizers to make their new home a “city on the hill,” this image has resonated with Americans. Accordingly, over the course of American history, many Americans have felt a special affinity for that original city on the hill, located in the “Holy Land.” American missionaries and travelers went to the region to save souls and survey sites from the Bible. They also founded schools and hospitals. In 1866, American missionaries established the Syrian Protestant College, now known as the American University in Beirut. Its motto was, and continues to be, “That they might have life and have it in abundance.” In other words, American missionaries assumed the burden of bringing civilization and progress to the site of Christianity's birth — a site that, after the rise of Islam and centuries of “Turkish” rule, they believed had fallen on hard times.
“A little piece of home”: American University in Beirut, 1920. (From the collection of the author.)
The American government did undertake the occasional and desultory diplomatic and even military foray into the region before the cold war. Thomas Jefferson sent a naval squadron to “the shores of Tripoli” (in present-day Libya) after a local potentate declared war on America. The trouble began when the frugal president balked at paying the protection money his predecessors had paid to prevent the potentate's pirate ships from attacking American merchant vessels. Jefferson's successor, James Madison, followed suit, only this time sending a squadron to Algiers. Rather than warships, Abraham Lincoln sent a brace of pistols as a gift to ‘Abd al-Qadir al-Jazairi, the former Algerian resistance leader who, while in exile in Damascus, had intervened to protect Christians during the 1860 sectarian riots there. Lincoln also signed a treaty of commerce and navigation with the Ottoman Empire at a time when much of the world was unsure that there would be a United States for much longer. When a Moroccan bandit, Ahmad al-Raisuli, kidnapped an American businessman, Ion Perdicaris, Theodore Roosevelt won public acclaim by storming, “Perdicaris alive or Raisuli dead.” While Roosevelt was strutting around with his “big stick,” the Moroccan government quietly paid Raisuli the ransom he demanded. And during and immediately after World War I, U.S. presidents and
From Basra, Iraq to Mecca, California
Before there were Hershey Bars (invented 1900), Americans satisfied their sweet tooth with dates imported from eastern Arabia and Basra. As a matter of fact, over the course of the nineteenth century, America became the world's most lucrative market for the sticky fruit. One particular variety of date, called fardh, was an immediate hit with merchants and consumers alike. Merchants liked it because it could withstand the rigors of the one-hundred-day voyage from the Persian Gulf to America. Consumers liked it because it ripened in August, earlier than dates from cooler climes. The arrival of dates in New York thus coincided with the onset of the winter holiday season. And just as the Macy's Thanksgiving Day Parade now signals the beginning of our holiday season, the arrival of dates came to signal the same then.
Over the course of the century, sail gave way to steam, and with the opening of the Suez Canal, the one-hundred-day voyage gave way to a voyage that lasted sixty. This meant that later-ripening “golden dates” from Basra were now available to complement Thanksgiving Day meals. Then, something was added to make America's date-craving even sweeter: a competition. In 1899, shipping companies began competing with each other in an annual “date race” to see whose ship would arrive in New York first. American newspapers followed the progress of the ships steaming from Basra. At stake was not only bragging rights and a prize, but higher prices in a date-deprived market. In combination, the excitement of the date race, increased urbanization, an expanding consumer culture, and the domestication and commercialization of the fall/winter holidays resulted in a sevenfold increase in American date imports from 1885 to 1925.
There was a disturbing feature to the global trade in dates, however. Date farming is labor-intensive. Date palms need irrigation and pollination, for example. The first was done with primitive technology. The second was done by hand. Adding into the mix harvesting and packing, the date industry required a large workforce. It is ironic that the expansion of date exports, fueled by a modern world economy, encouraged the expansion of one of the oldest systems of labor: slavery. The labor of enslaved Africans was integral to satisfying America's sweet tooth.
Global trade sowed the seeds (so to speak) of the date trade's expansion and it sowed the seeds of its demise. In 1902, representatives from the United States Department of Agriculture began sending seedlings and offshoots of date palms home from the Persian Gulf. The USDA determined that the Salton Basin of California (where the Mecca of the title is located) held the most promise for their planting, and a little over a decade later a visitor estimated that the basin contained about 200,000 date palms. By the 1920s, American date production had taken off, throwing the Persian Gulf date economy into a tailspin.
And there was more to come. There was a second commodity that linked the nineteenth-century Persian Gulf to the world economy: pearls. Pearl diving, done in large measure by slaves, was a centuries-old tradition in coastal communities. In 1896, a Japanese noodle-shop owner named Kokichi Mikimoto perfected a method of culturing pearls. Between 1908 and 1911, his “cultured pearls” hit international markets. The Persian Gulf pearl industry collapsed alongside the date industry. Neither recovered.
(From the work of Matthew S. Hopper)
Congress weighed in on the Armenian massacres and Zionism (it deplored the former and supported the latter). Overall, however, when it came to foreign policy, the interest of the U.S. government lay outside the region. The Middle East — that is, the Ottoman Empire — was, after all, part of the concert of Europe throughout much of the nineteenth century. The United States thus let Europeans deal with Middle Eastern problems.
Even when the U.S. government stepped in to protect American oil interests in the Gulf from the “rapacity” of British and French oilmen during the interwar period, it was with the idea that others — the French and particularly the British — had the primary imperial responsibility for the area. Only in the wake of World War II did American policy makers work to replace the old imperialist powers in the region. It was not until after 1956, in the wake of the Suez War, that the United States accomplished this, finally replacing France and Britain as the primary Western power in the region.
Surprisingly, American policy with regard to the Middle East remained fairly stable throughout the second half of the twentieth century. This can be seen by comparing a policy statement made at the beginning of the cold war with one made toward its end. In July 1954, the National Security Council sent to President
Dwight D. Eisenhower a report entitled “United States Objectives and Policies with Respect to the Near East.” Under the section titled “Objectives,” the report lists the following:
a. Availability to the United States and its allies of the resources, the strategic position, and the passage rights of the area and the denial of such resources and strategic positions to the Soviet bloc.
b. Stable, viable, friendly governments in the area, capable of withstanding communist-inspired subversion from within and willing to resist communist aggression.
c. Settlement of major issues between the Arab states and Israel as a foundation for establishing peace and order in the area.
d. Reversal of the anti-American trends of Arab opinion.
e. Prevention of the extension of Soviet influence in the area.
f. Wider recognition in the free world of the legitimate aspiration of the countries in the area to be recognized as, and have the status of, sovereign states; and wider recognition by such countries of their responsibility toward the area and toward the free world generally.
In April 1981, Peter Constable, deputy assistant secretary of state for Near East and South Asian affairs in the administration of Ronald Reagan, testified before Congress “to provide an integrated picture of our policies toward the Middle East and Persian Gulf region.” He listed the fundamental American objectives in the region as promoting the security of friends, assuring the security and availability of resources, and protecting vital transportation and communications routes. Constable then identified three “threats and challenges.” First and foremost was Soviet expansion, both direct and indirect. Constable's testimony took place two years after the Soviet invasion of Afghanistan. The second threat to American interests was regional disputes and conflicts that jeopardized regional stability and provided fertile opportunities for external (Soviet) exploitation. Although a number of conflicts — the Lebanese Civil War, Iran vs. Iraq, Ethiopia vs. Somalia, and so on — posed a danger in American eyes, Constable focused much of his remarks on the Arab-Israeli conflict. Constable stated that “deep divisions and unresolved issues ... will continue to affect United States interests, relationships, and objectives until they can be composed on broadly accepted terms.” Finally, Constable pointed to the destabilizing effects of political change, social development, and economic growth. In the age of Third World assertiveness and turbulence in Iran, it appeared that “change,” “development,” and “growth” did not bring stability, as policy makers had predicted they would at the onset of the cold war. They brought false hope, instability, and risk to America.
Between 1954 and 1981, and continuing through the end of the cold war, policy planners issued other pronouncements delineating American goals in the region. Although there were a few changes in the margins, most repeated pretty much the same policy objectives as the National Security Council and Peter Constable. Overall, then, we can identify six such objectives that guided American policy toward the region for over forty years.
First and foremost among American goals in the region was the containment of the Soviet Union. That is to say, the primary objective of the United States in the Middle East, as in all other areas of the cold war world, was to prevent the expansion of Soviet influence into the region. The United States had every reason to worry. The Soviet Union was located in the geographic heartland of the Eurasian continent and there was no reason to believe that its geopolitical ambitions were different from those of its predecessor, imperial Russia. As a matter of fact, the first cold war confrontation between the United States and the Soviet Union took place in the Middle East. In 1946, the Soviet Union refused to remove its troops from northern Iran, which it had occupied during World War II. It eventually withdrew them, but only under pressure. The heartland of the Middle East became a battleground between the two superpowers as Soviet strategy shifted in the late fifties. Under Nikita Khrushchev, who led the Soviet Union in one capacity or another from 1953 to 1964, Soviet strategists sought to spread Soviet influence by leapfrogging over surrounding states into the wider world. By doing so, Soviet strategists believed they could break containment, take advantage of anti-imperialist sentiments and the Third Worldist clamor for social and economic justice, and outflank the United States without directly confronting its nuclear-armed nemesis. Thus, from 1955 onward, the Soviet Union sought out allies in the heartland of the Middle East, including the three revolutionary republics: Egypt, Syria, and Iraq.
The second goal of the United States in the Middle East was to assure Western access to oil. There are two reasons for this: economic and strategic. Access to oil for domestic consumption has been, of course, a major concern for American policy planners for many years. At the beginning of the cold war, however, the United States did not depend on the Middle East for its oil. As a matter of fact, in the 1950s the international oil market was so glutted that President Eisenhower imposed import quotas to protect oil companies from falling prices. It was only in 1969 that the United States began importing crude oil from the region for domestic consumption. By the time of the oil crisis of 1973, the United States was importing more than a third of its oil from the Middle East. Subsequently, imports from the Middle East have been deliberately reduced: As of 2008, the United States imported a little over 23 percent of its oil from the Middle East (i.e., the Persian Gulf and North Africa). Canada remains America's largest source for imported oil.
But if oil for domestic consumption was not an immediate concern for the United States at the onset of the cold war, oil as a strategic commodity was. After World War II, the United States sustained European and Japanese economic recovery with cheap Middle Eastern oil. The United States viewed economic recovery in those regions as essential to prevent social revolutions — communist revolutions. American policy makers have viewed oil as a strategic commodity ever since. Europe still gets about a third of its oil from the Middle East; Japan gets about 90 percent.
The third goal of American policy in the Middle East was to ensure the peaceful resolution of conflicts and the maintenance of a regional balance of power in the region. The U.S. government feared that regional conflicts — most of all the Arab-Israeli conflict — would polarize the region. This would encourage some states to turn to the Soviet Union (always the second-best option for Middle Eastern states during the cold war) and might destabilize the governments of America's friends. The best way to ensure stability in the region was to establish some sort of regional balance of power. During the Truman administration, the United States and its allies agreed to coordinate arms sales to Israel and surrounding Arab states to make sure neither side would have a clear advantage. After that policy broke down, most American policy makers sought to assure peace by keeping Israel at least as strong as the sum total of its potential adversaries. American policy makers also sought to establish a balance of power in the Gulf. As a result, the United States “tilted” toward Iraq during the Iran-Iraq War and, three years after the war ended, led a coalition against Iraq in the Gulf War.
To ensure regional stability, the United States promoted stable, pro-Western states in the region. In addition, policy makers believed that if the states of the region were strong, and if they fulfilled the aspirations of their populations, they and their populations would resist Soviet blandishments. At first, American policy makers defined popular aspirations in terms of anti-imperialism, nationalism, and economic development. Thus, in the immediate aftermath of World War II, the United States encouraged decolonization (wherever the Soviets couldn't take advantage) in the region. It did not hurt that decolonization upset the system of “imperial preferences,” whereby colonial powers had privileged access to colonial markets, and opened up those markets to American business. In addition, state department officials, policy planners, and Central Intelligence Agency spooks often supported the “modernizing” military officers who took power in military coups d'état. As a result, the annals of contemporary Middle Eastern history are filled with stories — some probably fabricated — of ambassadors giving winks and nods to colonels and CIA agents distributing suitcases of money to local politicians and military officers.
The United States also supported the economic development of states in the region, both as a contributor of foreign assistance and as an advocate in international economic institutions such as the World Bank. As was the trend during the 1950s, development experts often encouraged the construction of colossal projects which they believed would provide the magic bullet for economic development. From 1953 to 1955, for example, the Eisenhower administration sent Eric Johnston, the former head of the Motion Picture Association, to the Middle East to negotiate a comprehensive plan for dividing the waters of the Jordan River among Israel, Lebanon, Jordan, and Syria. The plan, which included a blueprint for agricultural development, was modeled on the Tennessee Valley Authority, the showpiece of Depression-era public planning in America. Johnston's efforts failed, as did all American peace-through-economic-development schemes proposed during the 1950s and 1960s. (A Jordanian government official actually told the American ambassador there, “We've been impoverished for a thousand years. Rather than making peace with Israel, we'll be impoverished for another thousand.”) The United States (and Great Britain) also backed Egypt's request for World Bank financing to build the Aswan High Dam. Like the Johnston Plan, the Aswan High Dam was a megaproject that, by regulating the flow of the Nile and harnessing its waters, was expected to provide the foundation for Egyptian development. When the United States, angered by Nasser's recognition of “Red” China, withdrew its support from the project, Nasser sought to make up the shortfall by nationalizing the Suez Canal. The nationalization set off the chain of events that led to the Suez War of 1956.
Although the United States replaced Britain and France as the dominant outside power in the Middle East in the wake of the Suez War, it soon found its ambitions in the region threatened by the very anti-imperialism and nationalism it had sought to channel. The United States had supported Nasser and the Free Officers in Egypt in 1952, but by 1958 Secretary of State John Foster Dulles was referring to Nasser as “nothing but a tinhorn Hitler.” In the wake of the 1958 coup d'état in Iraq, the United States placed itself in opposition to Nasser and the pan-Arab nationalism Nasser personified. About a decade and a half later, when the United States perceived “excessive” economic nationalism to be a direct threat to its national security, it did the same with state-guided economic development. By the close of the cold war, the United States, once again in conjunction with the international financial institutions it dominated, was preaching the message that economic growth and political stability could only be achieved in the Middle East if states would liberalize their economies and give vent to private initiative. “Globalization” had replaced “modernization” as the mantra of economists.
The fifth goal of American policy during the cold war was the preservation of the independence and territorial integrity of the state of Israel. The American-Israeli alliance did not begin immediately. The decision made by President Truman to recognize Israel in 1948 was by no means a sure thing. Policy planners feared that the partition of Palestine would lead to a bloodbath that would divert American troops and attention away from Europe. They also feared that U.S. recognition of Israel would jeopardize American relations with the Arab world and thus jeopardize European and Japanese economic recovery. When President Truman announced at a closed-door meeting with policy makers that he planned to endorse partition, Secretary of State George Marshall stated, “Mr. President, if you proceed with that position, in the next election I will vote against you.” Eight years later, Eisenhower was so outraged by Israel's participation in the Suez conspiracy that he threatened economic retaliation if Israel did not withdraw from Egyptian territory. It was not until John F. Kennedy that an American president used the word “ally” when referring to Israel.
Nevertheless, the United States has consistently reaffirmed its commitment to Israeli sovereignty and security. Numerous factors contributed to the American-Israeli alliance, from ideological to strategic to domestic. In terms of ideology, the Israelis have presented their case well in the United States, portraying Israel as the sole democracy and repository of American values in the region. In terms of strategy, U.S. policy makers oftentimes viewed Israel as a proxy in the fight against Soviet influence in the region. In terms of domestic politics, presidents and congressmen have attempted to garner Jewish — and, more recently, Christian evangelical — votes by portraying themselves as supporters of Israel. None of this means, however, that the American-Israeli relationship has been trouble free, or that the United States has agreed with Israel across the board on such issues as borders, Israeli settlement policies in the occupied territories, approaches to ending the Arab-Israeli conflict, or the status of Jerusalem.
The final objective of American policy during the cold war was the protection of sea lanes, lines of communications, and the like, connecting the United States and Europe with Asia. The Middle East is, after all, the middle East. Its geographic position alone makes it a prize worth fighting for by any power with global pretensions.
In the most abstract sense, then, American objectives in the Middle East — containing the Soviet Union, maintaining access to oil, achieving a peaceful resolution of conflicts and a balance of power among states of the region, promoting stable and friendly states, safeguarding Israel, and capitalizing on the strategic location of the region — remained consistent over the course of the forty-year cold war. Why, then, does American policy appear to have been otherwise?
There are several reasons why U.S. policy appears to have been inconsistent. First of all, although American administrations faithfully advocated the same six policy objectives for forty years, the approaches the American government used to achieve them varied over time. For example, during the course of the cold war there were two main strategies of containment: peripheral containment and strong-point containment. The idea behind peripheral containment was to ring the Soviet Union with an unbroken string of pro-American states linked together through a system of alliances. This seemed the appropriate response to Soviet expansion across its borders during the early cold war period.
While the most famous and most successful of these alliances was the North Atlantic Treaty Organization (NATO), there were others. In 1955, for example, the British organized the “Baghdad Pact,” made up of Britain, Turkey, Iraq, Pakistan, and Iran. The pact was a failure. Because Egypt and Iraq were locked in a rivalry for leadership of the Arab world throughout much of the cold war, Egypt opposed it. The Egyptians signed an arms deal with the Soviet-bloc state of Czechoslovakia in 1955, thus rendering the alliance irrelevant. After military officers deposed the Iraqi monarchy in 1958, Iraq withdrew from the alliance anyway. All that was left was an empty shell called the Central Treaty Organization (CENTO), made up of the remaining states. In all, the Baghdad Pact and CENTO proved as effective in preventing the spread of Soviet influence in the Middle East as SEATO (Southeast Asia Treaty Organization) did in Southeast Asia.
With the failure of peripheral containment in regions outside Europe, American policy makers adopted the strategy of strong-point containment. Strong-point containment called for the judicious strengthening of a few “fortress” allies in various regions. It was hoped that this would prevent the Soviets from projecting their power abroad through proxy states bound to the Soviet Union by treaty. The United States chose its fortress states on the basis of the strength of their economy or military or government apparatus. Thus, during the 1970s the United States came to depend on Israel in the western Middle East to prevent the Soviets from using their Syrian ally to spread their influence. In the eastern Middle East, the United States depended on the Iranian government (and, to a lesser extent, Saudi Arabia) to prevent the Soviets from using Iraq in the same way. While successful in the short term, strong-point containment in the Middle East ultimately contributed to disastrous consequences for the United States in the region: the Iranian Revolution of 1978-1979 and the Israeli invasion of Lebanon in 1982.
The containment of the Soviet Union was one policy goal that might be achieved in multiple ways. The preservation of the independence and territorial integrity of the state of Israel was another. During the cold war, some policy makers believed that this goal could be achieved by regarding Israel as a “strategic asset,” a phrase coined during the Reagan administration. Another approach was expressed in the title of an article written in 1977 by that embodiment of the pipe-smoking American foreign policy establishment, George Ball. The article was entitled “How to Save Israel in Spite of Herself.” According to Ball, Israel's long-term security depends on a settlement of the Arab-Israeli dispute and good relations with its neighbors. Israeli intransigence not only prolongs the atmosphere of hostility, but undermines the governments of moderate neighbors, such as Jordan, which have nothing to show for their moderation. Therefore, if the United States truly has Israel's best interests at heart, it should adopt a more “evenhanded approach” and drag Israel, kicking and screaming if need be, to the bargaining table to negotiate a fair peace. Needless to say, successive Israeli governments and their supporters in the United States have had problems with Ball's approach.
A second reason why U.S. cold war policy in the region seems inconsistent is that policy planners often attempted to achieve one objective at the expense of others. In 1969, on his way home from a trip to Asia, President Richard Nixon stopped on the island of Guam and held a press conference at which he alluded to what would become known as the Nixon Doctrine. The United States was, at that time, embroiled in Vietnam and was looking for ways to avoid similar entanglements in the future. According to the Nixon Doctrine, the United States would give support to regional surrogates engaged in the fight against international communism without itself deploying forces. The idea was to put teeth in the words of Nixon's predecessor, Lyndon Baines Johnson, who announced (falsely as it turned out), “We are not about to send American boys nine or ten thousand miles away from home to do what Asian boys ought to be doing for themselves.” Soon thereafter, OPEC decided to raise oil prices. The U.S. government hardly let out a whimper. After all, price increases would allow America's regional surrogates (particularly Iran) to use their newly acquired wealth to buy the American weapons that, in turn, would enable them to block Soviet and Iraqi ambitions in the Gulf. In this case, containment trumped oil.
U.S. policy also seems to have been inconsistent because of what might be termed “the law of unexpected consequences.” When formulating and implementing Middle East policy, the United States does not operate in a vacuum. For every move the United States made in the Middle East, the Soviets and local actors could be expected to make a countermove — very often an unexpected countermove — thereby forcing the United States to reevaluate its tactical or strategic approach.
Moves that the United States made not only affected individual states, they frequently had effects — often unexpected — on the regional balance of power. Although Jimmy Carter was widely applauded for his role in mediating the Camp David Accords between Israel and Egypt, the accords had consequences none of the negotiators could have anticipated. After Egypt signed a peace treaty with Israel, it was expelled from the Arab League. This left Iraq as the dominant force in the inter-Arab balance of power. Many political scientists argue that Iraq invaded Iran in 1980 to consolidate its hegemonic position in the Gulf. Many also trace the Israeli invasion of Lebanon in 1982 to Camp David. The clauses in the Camp David “Framework for Peace in the Middle East” dealing with a solution to the Palestinian question were stillborn. According to some scholars, the Israeli government thus decided to impose its own solution on the Palestinians. All that stood between Israel and that solution was the Palestine Liberation Organization (PLO), safely ensconced in Lebanon. The Israeli government therefore thought it could kill two birds with one stone: destroy the PLO once and for all and impose a settlement for the West Bank and Gaza Strip unilaterally. It is doubtful that the Israeli government would have committed itself to this adventure had it not believed that Egypt would abide by the peace treaty it signed and not threaten Israel from the south. Add to the mix the assassination of Anwar al-Sadat, which came about as a direct or indirect result of Camp David (depending on whom you ask), and the handshake on the White House lawn loses much of its luster.
Finally, U.S. policy during the cold war appears to have been inconsistent because even a superpower does not have a boundless capacity to impose its will on the world, and failures prompted the reassessment of policies. As successive American administrations learned from attempts to move Israelis and Arabs to the bargaining table, to impose unpopular economic policies in Egypt, or to build a viable state in Lebanon, the American ability to direct events or reconstruct states in its own image is, at best, limited.
Consistent or not, was American policy in the region successful during the cold war? Before the events of September 11, 2001, former National Security Council member William Quandt wrote a number of articles arguing that it was. Quandt compares the costs of U.S. policy in the region with the benefits the United States derived from that policy. According to his tally, U.S. policy in the Middle East was far more successful than United States policy in many other parts of the world. During the forty-year cold war, approximately five hundred Americans lost their lives in service to their country in the Middle East. Almost half that number were American marines killed in a single incident in Beirut in 1983. Compare that figure with the number of Americans killed in ten years (1965-1975) in Southeast Asia — over fifty thousand. And America was far more successful in achieving its objectives in the Middle East than in Southeast Asia. Of its six policy objectives, the United States clearly accomplished five (containment, oil, stable states, Israel, sea lanes and communications) and split on one (the United States was not able to end regional conflicts, particularly the Arab-Israeli dispute, but for the most part was able to maintain a regional balance of power). All this, for a mere expenditure of an estimated $150 to $200 billion over forty years. (This figure apparently includes the $1 million bribe allegedly paid by the U.S. government to Gamal ‘Abd al-Nasser soon after he took power.) In terms of current value, it is less than half the amount spent by the United States to wage the futile Vietnam War. Holding the expenditures in blood and treasure against the results the United States gained from those expenditures, it might be said that Americans got “more bang for the buck” (to borrow a phrase from the Eisenhower administration) from their involvement in the Middle East during the cold war than from probably any other region in the world.
Quandt does qualify his triumphalism a bit. He does not ignore the fact that American policy in the region had its share of disasters and near-disasters during the cold war. In the first category we might include the inability of the United States to foresee or deal effectively with the Iranian Revolution. In the latter category, we might include the narrowly averted nuclear confrontation with the Soviet
Union that occurred at the tail end of the 1973 Arab-Israeli War. Quandt is also conscious of the fact that his cost/benefit analysis weighs success in American terms and takes no account of the effects of American policy on the region itself. The United States has achieved its goals by supporting truly appalling regimes, for example, and U.S. policy has inflicted its own share of horrors on the population of the region as well. American weapons have been used against civilian populations in Lebanon in 1982 and in the Palestinian territories to this very day. The United States cynically abandoned Palestinians and Lebanese to their fate in 1983, the Kurds to theirs in 1975 and 1988, and the Shi‘is of southern Iraq to theirs in 1991. The United States pressured regimes in the region to adopt economic policies that have more often than not brought economic hardship rather than economic growth. These effects might be more easily brushed away as unfortunate side effects of an otherwise successful U.S. policy were it not for 9/11. It is to the aftermath of that event that we must now turn.
IRAQ AND AFTER
According to an old cliche, American foreign policy has historically swung between two poles: messianic idealism, on the one hand, and hard-headed realism, on the other. Idealists believe that America is more than just a country — it is also that shining “city on a hill” mentioned earlier. Thus, idealists hold that the United States has a special mission in the world, and that mission is to promote “American values” such as freedom, justice, or liberty internationally. Perhaps the most famous idealist in American history was Woodrow Wilson, who called Americans to arms in World War I to “make the world safe for democracy.” And as we have seen, he also pressed Britain and France to adopt the “noble principles” embedded in his Fourteen Points as the price the two countries had to pay for American entry into the war — much to their chagrin. Realists, on the other hand, believe that the United States is a state like any other and that states are not driven by ideals, but rather by self-interest. They also believe that the international system can only attain stability when competing states achieve a balance of power among themselves, and that it is the duty of wise policy makers to pursue such a balance. Perhaps the most famous recent practitioner of realism was Henry Kissinger, whose doctoral dissertation, later published as A World Restored, lauded the role played by the conservative Austrian prince Metternich in establishing the post-Napoleonic European balance of power. As secretary of state, Kissinger supported the overthrow of a democratically elected leftist government in Chile and escalated the bombing of Vietnam and its neighbors while simultaneously pursuing detente (cooperation) with the Soviet Union and opening relations with “Red” China — all in the interest of maintaining America's strategic position within a durable world order.
Like all cliches, the idealism/realism divide is an oversimplification. Woodrow Wilson well understood the benefits to the United States of a world in which protected colonial markets were open to all. Similarly, those who would reduce
America's 2003 invasion of Iraq to the idealist impulse to spread democracy tend to forget American policy makers' very unidealist concerns about oil supplies and America's strategic position in the Middle East. And many of those policy makers who have in the recent past called for promoting democracy and free markets worldwide have not been shy about linking the spread of those values to American hegemony in international affairs. Nevertheless, like all cliches, this one too contains a germ of truth.
Under the guidance of Henry Kissinger, American foreign policy during the first half of the 1970s was firmly in the hands of realists. But not all policy makers and pundits approved of Kissingerian Realpolitik. Some argued that Kissinger and like-minded realists underestimated both Soviet strength and intentions and that, as a result, detente endangered American security. Their hand was strengthened by a number of government officials who resented Kissinger's success as a bureaucratic infighter — success that diminished their own authority and their capacity to mold foreign policy. Others were appalled at the realist assumption that the United States and the Soviet Union were equivalent players on an international chessboard and that the United States might disavow its moral authority in the cold war. Among them were conservative Democrats alienated by the rise of the anti-Vietnam-War faction in their party, by their party's support for any number of social experiments at home, or by both.
The anti-realists of the 1970s were thus an eclectic group. There were old-fashioned Republican cold warriors who aligned themselves with disgruntled public officials working on the foreign policy fringes. There were Democrats affiliated with Senator Henry “Scoop” Jackson of Washington (also known as the “Senator from Boeing” because of his ties to the Seattle-based defense contractor), who pushed for a stronger defense and took up causes — freedom for Soviet dissidents and the right of Jews to emigrate from the Soviet Union — that highlighted the totalitarian nature of America's adversary. There were former Marxist-Jewish intellectuals in New York who felt the sting of the Left's abandonment of Israel as well as its infatuation with hot-button domestic programs like affirmative action. And there were intellectuals inspired by University of Chicago-based philosopher Leo Strauss, whose philosophy challenged moral relativism and championed a special role for intellectual elites in making public policy. The more ideologically motivated of these anti-realists came to be known as neoconservatives.
While neoconservatism is, at best, an imprecise category, most neoconservatives agree that American interests are linked to the spread of American values; that America's friends are those nations that adhere to those values and its enemies are those that oppose them; that it is legitimate to use force in the pursuit of policy goals; and that the United States cannot trust international institutions, international law, or international agreements to protect American interests. And at the end of the cold war, most neoconservatives came to believe that the United States was and had to remain the sole dominant power in the world. This meant that the United States was free to do what it wanted, where it wanted, when it wanted, regardless of whatever roadblocks other members of the international community might put in its way. Some even began to talk of a “benevolent American empire.”
Beginning in the second half of the 1970s, neoconservatives and their allies undertook a number of activities to keep their realist adversaries off balance. They joined think tanks, wrote op-ed pieces, and edited magazines. They participated in special commissions that accused the CIA of underestimating Soviet strength and intentions. They took out full-page ads in major newspapers warning of the Soviet threat and the danger of American lethargy. And they found a hero in Ronald Reagan, who increased defense spending, supported anti-communist movements from Central America to Eastern Europe, and referred to the Soviet Union as an “evil empire.”
The collapse of the Soviet Union confirmed for neoconservatives the effectiveness of Reagan's muscular defense policy. It also left a void at the center of American strategic planning for the first time in half a century. Reagan's immediate successor, George H. W. Bush, adopted a posture of cautious realism. Thus, although the United States drove Iraq out of Kuwait in 1991, it did so only after winning international sanction for its efforts and with the help of a multinational force. And once Iraqi troops were defeated, the United States made no attempt to “democratize” Iraq and left Iraqi president Saddam Hussein in power. Bill Clinton wavered between realism (no humanitarian intervention in Rwanda) and idealism (humanitarian intervention in Kosovo), but for the most part focused American foreign policy on the opportunities afforded by globalization. Even George W. Bush, who filled his administration with neoconservatives and their allies, began his presidency as a realist. Then came 9/11.
After the al-Qaeda attacks on the United States, the neoconservatives and their enablers in the Bush administration came to the fore. Although the administration won international support for its campaign against al-Qaeda and the Taliban government in Afghanistan that gave al-Qaeda sanctuary, big changes were in the offing. Almost immediately after the attacks, the Bush administration announced a “global war on terror,” ignoring those who argued that fighting terrorism was a law enforcement problem, as well as those who warned against a nation declaring war on a tactic and against the open-endedness of such an undertaking. Within a year of the attacks, the National Security Council issued a new set of foreign policy guidelines that reflected the neoconservative agenda. Henceforth, the National Security Council proclaimed, American policy would rest on three pillars: a right to take preemptive and unilateral action when necessary (a policy which came to be known as the “Bush Doctrine”), unchallengeable American dominance of international affairs, and the active promotion of pro-American democracies throughout the world. The guidelines also underscored the danger posed by weapons of mass destruction falling into the hands of terrorists or “rogue states” such as Iraq, Iran, and North Korea — states dubbed by George W. Bush as “the axis of evil.”
No one could doubt that Iraq was a rogue state of special interest to the administration. Iraq had been in neoconservative sights since the abrupt end of the first Gulf War. For neoconservatives, the fact that Saddam Hussein not only remained in power but thumbed his nose at the sanctions imposed by the international community after the war made a mockery of America's claim to dominance of global affairs. As early as the fall of 2001, military planners were busy making preparations, and soon thereafter troops and equipment were redeployed from the Afghanistan front for the coming invasion of Iraq. At first, the administration tried to link Saddam Hussein to al-Qaeda and international terrorism. When met with skepticism, it focused on Iraq's pursuit of weapons of mass destruction. None were ever found. Finally, the administration unveiled its ultimate justification for making war on Iraq: By liberating Iraq and imposing democracy there, the United States would create a model for the democratic transformation of the entire region and dry up the authoritarian swamp that breeds terrorism.
As everyone knows, things did not go as planned. Some placed the blame on “tactical” miscalculations made by American war planners and occupation authorities. These miscalculations range from deploying an inadequate force to secure the country to disbanding the Iraqi army and loosing on Iraqi civilians hordes of armed and unemployed young men. Others have pointed to the mistaken assumptions and fundamental errors in judgment made by neoconservative advocates of the war. Rather than being greeted as liberators by all but a few regime loyalists (as Iraqi exiles in the United States had predicted), American forces found themselves fighting a stubborn insurgency. Rather than providing a model for democratization throughout the region and drying up the terrorist swamp, Iraq descended into sectarian violence, and the invasion created an anti-American backlash in the region of unprecedented proportions. Rather than demonstrating American dominance on the world stage, the American campaign in Iraq stretched American capabilities to the breaking point and enhanced the regional power of another member of the axis of evil — Iran. Iran, after all, had advocated the removal of Saddam Hussein since his attack on that country in 1980. With Hussein gone, there has been no power in the Persian Gulf to counterbalance Iran. Whether it confronts a weakened Iraq or an Iraq dominated by its Shi‘i (and pro-Iranian) majority, Iran seems to face a win-win situation. Finally, there are the costs in lives and treasure: As alluded to in the Introduction, as of this writing more than four thousand Americans have died during the invasion and occupation, and documented Iraqi civilian deaths range between several thousand above or below 100,000 — all this at a cost to the United States of an estimated $3 trillion.
The criticisms voiced against neoconservatives have ranged far beyond Iraq. According to some, American unilateralism has dissipated the goodwill the United States had gained after the events of 11 September. It will be, they claim, a long time before the French newspaper Le Monde again runs a headline like the one that dominated the front page of its 12 September 2001 edition: “We are all Americans.” Others argued that the neoconservative quest for American global superiority was bound to provoke a reaction among other powers, such as Europe, China, and Russia, as it has. Furthermore, while neoconservatives believe that their Manichean division of the world into good and evil provides policy makers with a clear road map for action, many policy makers believe it actually limits
America's options in the world. During the conflict between Hizbullah and Israel in the summer of 2006, for example, the United States refused to talk to the only powers that might have had any influence on Hizbullah, Iran and Syria, because their “terrorist connections” put them beyond the pale. Compare that reaction with the shuttle diplomacy of Henry Kissinger, who not only parleyed with Syrians, Israelis, and Egyptians after the 1973 Arab-Israeli War, but managed to keep the Russians in the loop as well. Finally, criticism of neoconservatives has come from the American Right: Since when are social engineering and nation-building conservative values?
America's neoconservative moment appears to be over, at least for the foreseeable future, much as America's passion for formal empire in the 1890s soon dissipated with no one really understanding why it occurred in the first place. In the spring of 2003—around the time of the invasion of Iraq — approximately 75 percent of Americans thought the use of force against that country was “the right decision.” Approximately five years later, only 38 percent thought so. That shift should provide sufficient warning to any politician inclined to support future “preventive wars” or advocate spreading democracy at the point of a gun. And during George W. Bush's second term, some of the most prominent neoconservatives and their enablers in his administration were either cast off or left, leaving those who remained with an opportunity to attempt realigning the administration's course. As a result, for example, the second-term Bush administration found both the United Nations and the International Atomic Energy Agency — two institutions treated with contempt in the run-up to the Iraq invasion — useful when it came to exploring the possibility of taming Iranian (and North Korean) nuclear ambitions. And it was under Bush's watch that the United States and Iraq signed a “Status of Forces” agreement that committed the United States to withdraw all its combat forces from that beleaguered country by 2011.
Nevertheless, it would be unwise to overstate the shift in approach to international affairs from Bush's first term to his second, as some analysts have done. There was never a repudiation of the Bush Doctrine or the Global War on Terrorism, two of his signature policies, and multilateralism in foreign affairs remained at best hit or miss. It would also be unwise to overstate the shift in policy signaled by the election of Barack Obama — a fact that did not go unnoticed among many of his most diehard supporters. While troops were being drawn down from Iraq, for example, Obama acceded to his generals and redoubled America's commitment to the anti-Taliban campaign in Afghanistan. He deployed a “surge” of thirty thousand additional troops, much as Bush had surged combat forces in Iraq three years earlier. Politics, after all, remains the art of the possible, and whatever the rhetoric, the options available to any president for dealing with a stalemated Israeli-Palestinian conflict or an Iran hell-bent on nuclear enrichment are limited.
Yet differences between the Bush and Obama approaches to foreign policy should not be trivialized either. The Bush administration cast al-Qaeda as a threat to the very existence of the United States, placing it on a par with communism and Nazism. Hence, the designation “Islamo-fascism” when referring to our enemy and the Global War on Terror — with all that the word war connotes — when referring to America's response. The Obama administration, on the other hand, pointedly downgraded each. Islamo-fascism became “violent extremism,” and the stature of the Global War on Terror was reduced in both rhetoric and practice to “overseas contingency operations.” Rather than relying on a big stick policy, Obama deployed cultural and diplomatic soft power along with coercive hard power. In his Cairo speech addressed to the Muslim world, for example, he offered “a new beginning between the United States and Muslims around the world...based on mutual interest and mutual respect.” This is soft power. And whereas George W. Bush once identified Christ as his favorite political philosopher/thinker, Obama identified the American theologian Reinhold Niebuhr as one of his. The choice is telling. Niebuhr was a leading voice among mainstream Protestants who, at the dawn of the cold war, used his influence to promote the staunchly realist policy of containment. As he counseled the so-called wise men who pioneered that policy, in an imperfect world, American idealism has to be tempered with the knowledge of its limitations.