William Henry Gates III started tinkering with computers at age thirteen. At that time in the late 1960s, computers were big, bulky machines that filled entire rooms. As a teenager in 1969, the enterprising Gates and some of his friends set up a business to make computerized traffic counters to gauge the speed of automobiles and other vehicles, for which they earned $20,000.
His brilliant mathematical mind and entrepreneurial inclinations led Gates to enroll at Harvard, where those same qualities soon led him to drop out. He spent more time at the university's computer center than in class, and in 1974 he became interested in microcomputers as an alternative to large conventional computers. A year later, Gates went to Albuquerque, New Mexico, where he and his childhood friend Paul Allen formed a computer software company called Microsoft (an amalgam of microcomputer and software). He envisioned the microcomputer on every office desktop and in homes throughout America. Gates actively pursued the lucrative financial rewards that microcomputers would bring, in contrast to other microcomputer pioneers who did not seek commercial gain and instead shared information and software with one another freely and unconditionally.
Bill Gates succeeded beyond all expectations. In 1980 Microsoft, now headquartered in Bellevue, Washington, collaborated with International Business Machines (IBM) to create a software package for IBM's new line of personal computers. Additional technological breakthroughs came rapidly. Microsoft joined the financial boom of the 1980s and in 1986 became a publicly traded company on the NASDAQ stock exchange. Within a decade, Gates became the richest man in America, and like the industrial titans of a century earlier, he has donated generously to philanthropic activities worldwide.
Despite the enormous benefits of computer technology, the digital revolution has had unforeseen consequences. The terrorists who attacked the World Trade Center and the Pentagon on September 11, 2001, communicated through e-mail and cell phones and trained on computerized flight simulators. They belonged to al-Qaeda, an international terror network that spread its ideology and raised and transferred money over the Internet.
Kristen Breitweiser was a young housewife and mother living in suburban New Jersey on September 11, 2001 (9/11). Her husband, Ron, a senior vice president at an investment management firm, worked in Tower Two of the World Trade Center. When one of the planes commandeered by terrorists crashed into the tower, Ron died in its fiery collapse, leaving Breitweiser a widow with a two-year-old daughter. The loss transformed her from a grieving victim and stay-at-home mother into a political activist.
She started attending meetings of the Victim Compensation Fund established by the federal government following 9/11. She met Mindy Kleinberg, Lorie Van Auken, and Patty Casazza, widows like herself living in New Jersey. The “Jersey Girls,” as they became known, addressed concerns over victims' compensation but soon confronted larger political issues. Seeking more than financial compensation for their losses, they demanded to know how the 9/11 attacks could have happened and what the federal government might have done to prevent them. Breitweiser and her colleagues favored an investigation by an independent commission to gather information about what had occurred and to make recommendations to prevent other attacks. However, they found themselves in opposition to President George W. Bush, who initially resisted the creation of such a commission.
Undeterred, the four women mounted a vigorous campaign to pressure the White House and Congress to form a national commission. Breitweiser testified before the Joint Congressional Intelligence Committee to garner support. The women's perseverance paid off: In November 2002, Congress established a bipartisan commission that President Bush signed into law.
However, the commission's final report in 2004 disappointed Breitweiser. She called the report “hollow” and criticized President Bush for not fully and openly cooperating with the investigation. Although Breitweiser had voted for Bush in the 2000 election and considered herself a conservative, her rapid political education since 9/11 turned her against his candidacy in 2004. She also spoke out against the Iraq War, which the administration had launched in 2003, citing the 9/11 attacks as justification. Breitweiser continued her political activism as a blog writer for The Huffington Post. In this way, a housewife from New Jersey shared her views with millions of people through technology that Bill Gates's generation had developed.
A girl sits on her father's shoulders at an Occupy Miami protest in Miami, Florida, 2011. Joe Raedle/Getty Images
The lives of Bill Gates and Kristen Breitweiser were deeply affected by the twin forces of digital technology and terror that dominated life in the United States and throughout the world at the start of the twenty-first century.
Computers, the Internet, and cell phone technology reformulated commerce and social relations, fostering the globalization that emerged after the Cold War (see chapter 28). Google, the Web, Facebook, and Twitter became household words and broke down domestic and global barriers that twentieth-century technology had not yet demolished. Computer technology revolutionized political communication and organization, mobilized ordinary citizens into action, and expanded opportunities for disgruntled and oppressed citizens of foreign countries to overthrow despotic rulers. Driven by new computer models for trading in financial securities, the stock market grew highly volatile, and downturns in the economy became greater in intensity and scope. At the same time, the threat of terrorism continued to preoccupy the United States and its allies around the world, and their governments used the latest technologies to monitor suspected terrorists. Ordinary people paid for this increased surveillance each time they boarded an airplane and were subjected to intimate security searches.
The decade of the 1990s was a period of great economic growth and technological advancement in the United States. Computers stood at the center of the technological revolution of the late twentieth and early twenty-first centuries, allowing both small and large businesses to reach new markets and transform the workplace. Digital technology also altered the personal habits of individuals in the way they worked, purchased goods and services, communicated, and spent their leisure time. As the Internet and World Wide Web connected Americans to the rest of the world, corporate leaders embraced globalization as the key to economic prosperity. They put together business mergers so that their companies could operate more powerfully in the international market. Government officials generally supported their efforts by reducing regulations on business and financial practices, thus encouraging greater risk taking and easing the way for freer trade overseas. Globalization not only thrust American business enterprises outward but also brought a new population of immigrants to the United States.
The Computer Revolution
The first working computers were developed for military purposes during World War II and the Cold War and were enormous in size and cost. Engineers began to resolve the size issue with the creation of transistors. Invented at Bell Laboratories in 1947, these small semiconductor devices came into widespread use in computers during the 1960s. As companies manufactured smaller and smaller silicon chips, computers became faster, cheaper, and more reliable. The design of integrated circuits in the 1970s led to the production of microcomputers in which a silicon chip the size of a nail head did the work once performed by huge computers. Bill Gates was not the only one to recognize the potential market of microcomputers for home and business use. Steve Jobs, like Gates a college dropout, founded Apple Computer Company in 1976. By 1980 the company had become a publicly traded corporation, turning its founder into a multimillionaire.
Microchips and digital technology found a market beyond home and office computers. Beginning in the 1980s, computers replaced the mechanical devices that ran household appliances such as washing machines, dishwashers, and refrigerators. Over the next twenty years, computers operated everything from standard devices such as televisions and telephones to new electronic gadgets such as VCRs, CD players, fax machines, cell phones, and iPods. Computers controlled traffic lights on the streets and air traffic in the skies. They changed the leisure patterns of youth: Many young people preferred to play video games at home rather than engage in outside activities. Consumers purchased goods online, and companies such as Amazon sold merchandise through cyberspace without any actual retail stores. Computers became the stars of movies such as The Matrix (1999), A.I. Artificial Intelligence (2001), and Iron Man (2008). In 2010 The Social Network became a hit in portraying the life of Mark Zuckerberg, the primary developer of the social media Web site Facebook, which in 2012 had 900 million users worldwide.
The Internet—an open, global series of interconnected computer networks that transmit data, information, electronic mail, and other services—made social networking possible. The Internet grew out of military research in the 1970s, when the Department of Defense constructed a system of computer servers connected to one another throughout the United States. The main objective of this network was to preserve military communications in the event of a Soviet nuclear attack. At the end of the Cold War, the Internet was repurposed for nonmilitary use, and it now links government, academic, business, and organizational systems. In 1991 the World Wide Web came into existence as a way to access the Internet and share documents and images. Search engines like Google and Yahoo were developed to allow computer users to “surf the Net” and gain access to Web pages. Consumers could shop online as they once had in stores, and researchers could find information previously available only in libraries. Politicians learned how to use the Internet to raise campaign funds and spread their messages to voters more widely and more quickly than they had been able to do in person or on television. Terrorist groups, such as al-Qaeda, also went online. In 2010 around 75 percent of people in the United States used the Internet, as did nearly 2 billion people worldwide, about a quarter of the globe’s population.
Digital communication revolutionized globalization. Bill Gates’s computer software programs, along with the Internet and World Wide Web, dramatically reduced the time it took for trading partners around the world to converse and make business decisions. Consumers in the United States called customer service operators stationed in India and other remote sites. E-mail largely replaced postal mail, allowing Americans to instantly contact relatives, friends, or professional and business associates around the country or the world.
The incredible growth of the computer industry led to increased business consolidation, making it possible for large firms to communicate instantly within the United States and throughout the world and to keep control over their far-flung operations. In addition, the federal government aided the merger process by relaxing financial regulation. Media companies took the greatest advantage of this situation. In 1990 the giant Warner Communications merged with Time Inc. to create an entertainment empire that included a film studio (Warner Brothers), a television cable network (Home Box Office), a music company (Atlantic Records), a baseball team (the Atlanta Braves), and several magazines (Time, Sports Illustrated, and People). Before Time Warner combined with the Internet service provider America Online (AOL) in 2001, its annual sales had topped $21 billion. Several other media conglomerates formed during this period as well. The Australian-born Rupert Murdoch, who already owned considerable holdings in his home country and in Great Britain, moved his operations to the United States. Murdoch soon launched the Fox Broadcasting Company to go along with a satellite dish company; a movie studio; a variety of newspapers, including the New York Post and the Wall Street Journal; and thirty television stations. Media mergers mirrored the trend in the rest of the economy. The estimated number of business mergers rose dramatically from 1,529 in 1991 to 4,500 in 1998. The market value of these transactions in 1998 was approximately $2 trillion, compared with $600 billion for 1989, the previous peak year for consolidation.
Corporate consolidation brought corporate malfeasance, as some chief executives of major companies abused their power by expanding their companies too quickly and making risky financial deals, which put workers and stockholders in jeopardy. Such practices led to a number of scandals, including one involving Enron. Enron was the product of a merger in 1985 between Houston Natural Gas and InterNorth, a gas company headquartered in Omaha, Nebraska. Operating out of Houston, Texas, Enron benefited from the deregulation of the gas and electric industry in the 1990s, which brought exorbitant profits and encouraged corporate greed. As Enron thrived, energy prices shot up and its stock soared, bringing the company more than $50 billion in 2001. In October of that year, information began to trickle out about insider trading, faulty business deals, and questionable accounting practices. As these revelations mounted, Enron’s stock and its credit rating plunged, jeopardizing the solvency of the company. In December 2001, the once mighty Enron filed for bankruptcy and fired four thousand employees; its top two executives were subsequently convicted on charges of criminal fraud. This scandal affected companies beyond Enron, leading to the conviction of executives from Enron’s accounting firm, Arthur Andersen, and another Andersen client, WorldCom.
The Changing American Population
At the same time as the technological revolution helped transform the U.S. economy and society, an influx of immigrants began to greatly alter the composition of the American population. Since passage of the Immigration Act of 1965 (see chapter 26), the country had experienced a wave of immigration comparable to that at the turn of the twentieth century. As the population of the United States grew from 202 million to 300 million between 1970 and 2006, immigrants accounted for some 28 million of the increase. They came to live in the United States for much the same reasons as those who had journeyed before: to seek economic opportunity and to find political and religious freedom.
Most newcomers in the 1980s and 1990s arrived from Latin America and from South and East Asia; relatively few Europeans (approximately 2 million) moved to the United States, though their numbers increased after the collapse of the Soviet empire in the early 1990s. Poverty and political unrest pushed migrants out of Mexico, Central America, and the Caribbean. The prizewinning film El Norte (1983) dramatized the plight of undocumented Guatemalan Indians who traveled through Mexico to settle in California. Most others, however, took advantage of a provision in the 1965 act that permitted them to join family members already settled in the United States. At the beginning of the twenty-first century, Latinos (35 million) had surpassed African Americans (34 million) as the nation’s largest minority group. Meanwhile, with the arrival of Caribbean and African immigrants, black America was also becoming more diverse in this period.
In addition to the 16 million immigrants who came from south of the U.S. border, another 9 million headed eastward from Asian nations, including China, South Korea, and the Philippines, together with refugees from the Vietnam War and Cambodia. By 2007 an estimated 1.6 million Indians from South Asia had immigrated to the United States, most arriving after the 1960s. Indian Americans became the third-largest Asian American group behind Chinese and Filipinos. Another 1 to 2 million people came from predominantly Islamic nations such as Pakistan, Lebanon, Iraq, and Iran (Figure 29.1).
Like their predecessors, new immigrants formed ethnic and religious enclaves. California displayed this fresh face of immigration most vividly. Latinos and Asians had long settled there, and by 2001, 27 percent of the state’s population was foreign-born. Latinos, Asian Americans, and African Americans together constituted a majority of Californians, leaving whites in the minority. In addition to California and the Southwest, immigrants flocked to northeastern and midwestern cities—New York City, Jersey City, Chicago, and Detroit—as they had in the past. However, they now also fanned out through the Southeast, adding to the growing populations of Atlanta, Raleigh-Durham, Charlotte, Columbia, and Memphis and providing these cities with an unprecedented ethnic mixture. Like immigrants before them, they created their own businesses, spoke their own languages, and retained their own religious and cultural practices.
They also encountered hostility from many native-born Americans. Some workers felt threatened by immigrants who took jobs, both commercial and agricultural, at lower wages. Middle-class taxpayers complained that the flood of impoverished immigrants placed the burden on them to fund the social services—schools, welfare, public health— that the newcomers required. Some of the children and grandchildren of immigrants who had assimilated into American culture resented foreigners who pushed for bilingual education and public signs and instructions in their native languages. Immigration critics also complained about the influx of illegal foreign residents among the immigrant population. Besides breaking the law, these critics argued, undocumented immigrants further depressed wages and taxed public resources.
FIGURE 29.1 Immigrant Growth by Home Region, 1971-2000 In the late twentieth century, immigration to the United States increased significantly, especially from North America (which in this figure includes Mexico, Central America, and the Caribbean). Residents of East and South Asia formed the second-largest group of immigrants, while Africans arrived in small but growing numbers.
Source: Data from 2000 Statistical Yearbook of the Immigration and Naturalization Service.
California led the way in efforts to roll back the effects of immigration. In 1986 Californians approved Proposition 63, which declared English to be the state’s official language; thirty states passed similar laws. In 1994 California voters approved Proposition 187, which prohibited illegal residents from attending public schools and using any social services except emergency health facilities. This measure never went into effect because federal courts ruled it unconstitutional. Such severe measures also drew opposition from some conservative Republicans, including President Ronald Reagan, a former governor of California, and from agribusiness and other corporate interests that relied on cheap immigrant labor to keep their operating costs low.
REVIEW & RELATE
• How have computers changed life in the United States?
• How has globalization affected business consolidation and immigration?