WHEN ALL THE BABY BOOMERS were still children, in the early 1960s, the final legal end of white supremacy came into sight. And as a result, certain white Southerners started displaying Confederate symbols, and Southern states retrofitted state flags to include them. It was a historical rhyme of what had happened a century earlier, when losing the war led Southerners to glorify Dixie and the Lost Cause. After the Civil War, historians started calling the decades before 1861 the antebellum era—for many white Southerners, a word connoting fantasies of a perfect Old South. Antebellum is Latin for “before the war”—any war. After the wars of the 1960s—after Vietnam, “the Negro revolt,” the countercultural explosion—plenty of Americans mythologized the 1940s and ’50s and early ’60s as their own late lamented antebellum era.
The Republican Party saw an opportunity to play to that self-pitying, self-glorifying, ass-kicking nostalgia and adopted its so-called Southern Strategy to turn white Southerners, who had always been Democrats, into Republicans. It began working immediately. The historian Paul Gaston, a Southerner, was astonishingly astute about the social and cultural shift occurring as he wrote The New South Creed in the late 1960s. Many of the old “regional distinctions receded” after the Civil War, as expected. But nobody had counted on that becoming a two-way process starting in the ’60s—that white people in the North and West would start feeling like Southerners, anxious sore losers more conscious than ever of their race. It was, Gaston wrote, “the nationalization of the race problem”—the Black Power movement, black riots, and skyrocketing rates of urban crime, all making white Americans all over feel besieged, their comfortable ways of life threatened, their whiteness no longer such an all-access VIP pass. It was, he detected as soon as it began, “the infiltration into the total American experience of the elements of pathos, frustration, and imperfection that had long characterized the South.”
Wallowing in nostalgia for a lost Golden Age ruined by meddling liberal outsiders from Washington and New York, previously a white Southern habit, became more and more of a white American habit. And so the Republicans’ Southern Strategy could be nationalized as well. When he first ran for president, Ronald Reagan popularized the term welfare queen—a powerful caricature, based on a single criminal case, that exaggerated the pervasiveness of welfare fraud and spread the fiction that black people were the main recipients of government benefits. In Vietnam, the United States had also just lost a war, which gave non-Southerners a strong dose of Lost Cause bitterness for the first time.
The next time he ran, in 1980, Reagan’s fiscal big idea, cutting tax rates to expand the economy and thereby increase tax revenues, was famously mocked by his main GOP opponent as “voodoo economics”—crazy wishfulness, magical thinking. A few months later, after Reagan invited him to be his vice-presidential candidate, George H. W. Bush disavowed his voodoo line. And as President, Reagan didn’t stick strictly to the voodoo path. He did dramatically cut some tax rates, but he also increased others and closed lots of loopholes to keep deficits from growing even larger. The government did not shrink.
But while Reagan had sensibly tacked back toward reality, his true believers on the right maintained total belief in the voodoo. For them, the ultraindividualist liberation of the 1960s and ’70s had generated a kind of fundamentalist religious faith in markets, and thus an absolute knee-jerk opposition to any attempts by government to make markets work better or more fairly, and to taxes in general. If the new hypercapitalism was working well for you, even if you had no fervent ideological faith in markets, what had previously come across as simple selfishness could now be cloaked in righteousness. “Greed is good,” the fictional Gordon Gekko declared in 1987, but now real people insisted that their moneymaking lust and skill were not merely useful in the aggregate but made them virtuous individually. Three years before Wall Street came out, Reagan had been reelected in one of the biggest landslides in history.
Oh, Ronald Reagan, lovable, shrewd, twinkly, out-of-it, blithe, brilliant Ronald Reagan. The transmutation of presidential politics and governing into entertainment had started a generation earlier, in the 1960s, with John Kennedy. JFK was like a movie star and like a fictional character. He was young and dashing, witty and sexy. He’d been a war hero, and Hollywood made a movie about those heroics, PT 109, while he was president, a production on which he gave notes. His glamorous patrician wife was even younger, only thirty-one when she became First Lady, and one of his girlfriends was a real movie star, Marilyn Monroe, the ultimate sexual fantasy figure. But that projection of youth and extreme vitality was a fiction, a lie, a fantasy presented to the public: he was secretly very ill with osteoporosis and Addison’s disease, among other ailments, and took painkillers, antianxiety drugs, sleeping pills, and stimulants.
As JFK was about to be elected, Norman Mailer wrote that “America’s politics would now be also America’s favorite movie, America’s first soap opera, America’s best-seller.” Up until Kennedy, the wall between government and show business had been thick. His father had owned a movie studio, and not only did Jack pal around with Hollywood people, he agreed to go on stage at Madison Square Garden to receive the supersalacious “Happy Birthday, Mr. President” serenade from his mistress Marilyn. “In America,” the great radio host Jean Shepherd said then about Kennedy, “everything becomes show business.”
His murder was like the tragic, implausible denouement of a novel or movie—and his widow immediately recast his thirty-four-month presidency as fiction, Camelot, after the recent Broadway musical about magical young King Arthur. For a few years, Kennedy had used television and the rest of the fantasy-industrial complex as a telegenic Cary Grant manqué; for a few posthumous days, it used him as the star of the most compelling television event ever.
After JFK, president-as-performer became more explicit. In the 1980s, as America kept turning the dial up toward full Fantasyland, we were ready for Reagan as we wouldn’t have been earlier. “For Ronald Reagan,” said Pat Buchanan, who served as his White House communications director, “the world of legend and myth is a real world. He visits it regularly, and he’s a happy man there.” Buchanan meant this as praise.
If Reagan’s story were fiction, it would seem absurdly pat and overdetermined, the irony too heavy-handed. Out of college during the Depression, he went straight to work for the new fantasy-industrial complex. In a Des Moines radio studio, he regularly pretended he was at Wrigley Field in Chicago, performing fake play-by-play broadcasts of Cubs games based on real-time wire-service descriptions. He visited Hollywood and got his first movie role—playing a radio announcer. During World War II, he was an officer in the Army Air Forces—serving in its First Motion Picture Unit, stationed in Burbank and Culver City, where he starred in This Is the Army, a movie in which he played a corporal who stages a piece of musical theater called This Is the Army.
After a so-so career playing fictional characters in movies, he became a superstar playing a politician in real life and on the TV news, first as governor of California. His winning presidential campaign in 1980 had policy specifics that jibed with his misty vision of a simpler, happier, more patriotic old-fashioned America—which in turn jibed with the simultaneous shift in Hollywood and architecture and elsewhere in the culture toward old-timey forms and subjects. As a vacationing president, he wore a cowboy hat and rode a horse at his ranch in southern California. He and his team concocted a brilliant fantasy narrative in which he was the convincing leading character. More than any previous presidential handlers, they staged and crafted his presidential performances specifically to make for entertaining television.
Reagan as fantasist had its cute side, in particular the multiple instances of movies blending into real life, such as his comment about the large fraction of the Pentagon’s budget that went for “wardrobe.” Then there was the story of American grit he repeatedly told about a World War II bomber crew, their B-17 going down, the terrified gunner unable to bail out, his superior officer reassuring him, “Never mind, son, we’ll ride it down together.” The exchange did take place during World War II, but only in a movie called Wing and a Prayer, in which the star says to the other actor, “Take it easy, we’ll take this ride together.” Several times as president, once in a conversation with the Israeli prime minister, he told of being deployed to Europe at the end of the war to film concentration camps; it never happened.
In warning Congress not to enact a tax increase, he quoted Clint Eastwood’s recent Dirty Harry line, “Go ahead—make my day.” A few months later the United States and Lebanon negotiated the release of airline passengers hijacked and held hostage by Hezbollah. “Boy,” President Reagan said, “I saw Rambo last night”—Rambo: First Blood Part II—and “now I know what to do next time this happens.” Star Wars, the recent and ultimate Hollywood blockbuster, was both unprecedented and old-fashioned—like Reagan!—and he repeatedly used it. In the screen-crawl text of the first movie, the villains are described as “the evil Galactic Empire.” Just before Return of the Jedi came out, President Reagan delivered a speech in which he referred to the Soviet Union as the “evil empire.”*1 After he announced the development of technology to shoot down Soviet nuclear missiles, that not-quite-real technology became known as Star Wars, and Reagan said of it that “the Force is with us.”
Nancy Reagan was also a former actor, and as First Lady she played “Nancy Reagan” both on the sitcom Diff’rent Strokes and on the prime-time soap opera Dynasty. It had always seemed as if she, ten years his junior, was the brains of the operation, coolly, anxiously, thoroughly reality-based. But then it turned out she employed an astrologer to schedule Reagan’s important trips and meetings. “She feels there’s nothing wrong in talking to her,” Mrs. Reagan’s spokesperson said of the astrologer. According to the astrologer, “Air Force One didn’t take off without permission. [Nancy] set the time for summit meetings with Mikhail Gorbachev, presidential debates with Carter and Mondale…the timing of all the President’s trips abroad, of his press conferences, his State of the Union addresses.” “Good God,” George H. W. Bush said when he learned of this operational voodoo near the end of his vice-presidency, “I had no idea.”
Not so cute was a president of the United States expecting apocalyptic biblical prophecies to be fulfilled soon. Reagan was never much of a churchgoer, but he’d been enthusiastically connecting the Christian end-of-days dots at least since the late 1960s. “Apparently never in history,” he’d said in 1968, “have so many of the prophecies come true in such a relatively short time.” This was a consistent line before he ran for president—“We may be the generation that sees Armageddon,” he said repeatedly. When Muammar Gaddafi took over Libya in 1969, Reagan saw it as “a sign that the day of Armageddon isn’t far off. For the first time ever, everything is falling into place for the battle of Armageddon and the second coming of Christ. It can’t be long now. Ezekiel says that fire and brimstone will be rained upon the enemies of God’s people. That must mean that they’ll be destroyed by nuclear weapons. They exist now, and they never did in the past.”
He kept up the end-time chatter in the White House. A decade earlier such talk from a president would’ve been a shocking national embarrassment, but no longer.
“You know,” President Reagan said in a conversation about the Middle East with the head of AIPAC, the main American pro-Israel lobbying group, “I turn back to your ancient prophets in the Old Testament and the signs of foretelling Armageddon, and I find myself wondering if—if we’re the generation that is going to see that come about. I don’t know if you’ve noted any of these prophecies lately, but believe me, they certainly describe the times we’re going through.”
At the final presidential debate in 1984, if the rest of us hadn’t noticed before, there he went again. A moderator asked if it was true he “believe[d], deep down, that we are heading for some kind of biblical Armageddon.” Yessir. Based on “some philosophical discussions with people who are interested in the same things,” he did indeed take seriously “the prophecies down through the years, the biblical prophecies of what would portend the coming of Armageddon, and so forth, and the fact that a number of theologians for the last decade or more have believed that this was true, that the prophecies are coming together that portend that.”
For Americans inclined to believe that prophecies definitely were coming together that portended the coming of Armageddon and so forth, the cheerful president of the United States had just confirmed it, as he would again and again. In the election two weeks later he won forty-nine states.
Beyond the mainstreaming of problematically batty beliefs, presidential politics was merging even more with the fantasy-industrial complex. Just as Bill Clinton wrapped up the Democratic nomination in 1992, he came onstage on a late-night talk show wearing Ray-Ban Wayfarers to play “Heartbreak Hotel” on sax. It was a memorable moment in the evolution of presidential campaigns into auditions for entertainer-in-chief, and on MTV two years later, he laid down the next milestone. Answering questions from an audience of young people, the president of the United States told a seventeen-year-old girl that he wore “usually briefs” rather than boxer shorts.
In early 1998, as soon as we learned that Clinton had been fellated by an intern around the Oval Office, his popularity spiked, according to the polls. Which was baffling only to those who still thought of politics as an autonomous realm, existing apart from entertainment. American politics happened on television; it was a TV series, a reality show just before TV became glutted with reality shows. A titillating new story line that goosed the ratings of an existing series was an established scripted-TV gimmick. The audience had started getting bored with The Clinton Administration, but the Monica Lewinsky subplot, including its cover-up, got people interested again. Fox News and MSNBC were both new start-ups, and because politics had become a low-production-value subgenre of show business, it was engaging only when it was entertaining. When serious journalists started asking Clinton about having extramarital sex with an intern, the public was not so much alarmed as amazed and thrilled.
Just before the Clintons arrived in Washington, the right had managed to do away with the federal Fairness Doctrine, which had been enacted to keep radio and TV shows from being ideologically one-sided. Until then, big-time conservative opinion media had consisted of two magazines, William F. Buckley’s biweekly National Review and the monthly American Spectator, both with small circulations. But absent a Fairness Doctrine, Rush Limbaugh’s national right-wing radio show, launched in 1988, was free to thrive, and others promptly appeared, followed at the end of Clinton’s first term by Fox News.
Should the old federal broadcast rules have been abolished? Maybe, maybe not; cable TV was already making them iffy, and the Internet was about to render them moot. In any case, when the Washington gatekeepers decided to get rid of that regulatory gate, it was a pivotal moment, practically and symbolically. For most of the twentieth century, national news media had felt obliged to pursue and present some rough approximation of the truth rather than to promote a truth, let alone fictions.
With the elimination of the Fairness Doctrine, a new American laissez-faire had been officially declared. If lots more incorrect and preposterous assertions circulated in our most massive mass media, that was a price of freedom. If splenetic commentators could now, as never before, keep believers perpetually riled up and feeling the excitement of being in a mob, so be it.
Rush Limbaugh, raised by a family of politically well-connected lawyers in southern Missouri, entered show business early. From his teens through his twenties, he was the radio deejay Rusty Sharpe, then the radio deejay Jeff Christie before he moved to talk radio using his real name. His virtuosic three hours of daily talk started bringing a sociopolitical alternate reality to a huge national audience. Instead of relying only on a magazine or a newsletter every so often to confirm your gnarly view of the world, now you had nationally broadcast talk radio drilling it in for hours every day.
As Limbaugh’s radio show took off, in 1992 the television producer Roger Ailes created a syndicated TV show around him. The following year, just after Ailes became president of CNBC, I was reporting a Time cover story about talk radio, including Limbaugh. I had not yet had contact with either man when Ailes phoned me out of the blue to yell at me about the article that didn’t yet exist. “How would you like it if I sent a CNBC camera crew to follow your kids home from school?” he said. My daughters were six and four. “Wow,” I replied, “I’m sure Jack Welch”—the CEO of GE, which then owned NBC—“would be interested to hear that his new news executive is planning to stalk a journalist’s children.” I thought I could hear a gasp. “Are you threatening me?” Ailes shouted down the line. Two years later, when NBC News hired someone else to create and launch its cable channel, Ailes went off and created Fox News for Rupert Murdoch, and ran it for the next two decades, until the year before he died in 2017.
Fox News brought the Limbaughvian talk-radio version of the world to national TV, but it mingled straighter news with the news-ish commentary. It permitted viewers an unending and immersive propaganda experience of a kind that had never existed before. The new channel’s trademarked slogan was a kind of postmodern right-wing inside joke: since the rest of the national news media posing as objective were unfair and imbalanced in favor of liberals, Fox News would be “Fair and Balanced.” As the new right-wing multimedia complex was establishing itself, on radio and now on TV, in the White House were a pair of glamorous Hollywood-connected liberal yuppies out of Yale—perfect villainous foils, as political infotainment entered its WWF era.
For Americans, this was another new condition. Modern electronic mass media had been a defining piece of the twentieth-century experience that served an important democratic function—presenting Americans with a shared set of facts. Now those news organs, on TV and radio, were enabling a reversion to the narrower, factional, partisan discourse that had been normal in America’s earlier centuries. The new and newly unregulated technologies allowed us, in a sense, to travel backward in time.
AND THE INTERNET. In the 1980s, before the Web, Usenet was a kind of cross between email and social media. In 1994 the first mass Usenet spam was sent, visible to everyone on the network: “Global Alert for All: Jesus is Coming Soon.” Over the next year or two, the masses learned of the World Wide Web. The exponential rise of Fantasyland and all its dominions now had its perfect infrastructure.
It’s hard to overstate the flabbergasting speed and magnitude of the change. In the early 1990s, when the Internet was still an obscure geek thing, fewer than 2 percent of Americans used it; by 2002, less than a decade later, most Americans were online. After the 1960s and ’70s happened as they happened, it may be that America’s long-standing dynamic balance—between thinking and magical thinking, reason and wishfulness, reality and fiction, sanity and lunacy—was broken for good. But once the Internet came along, we were definitely on a superhighway to a certain destination with no likely looking exits.
Before the Web, cockamamie ideas and outright falsehoods could not spread nearly as fast or widely, so it was much easier for reason and reasonableness to prevail. Before the Web, institutionalizing any one alternate reality required the long, hard work of hundreds of full-time militants—the way America’s fundamentalist Christians spent decades setting up their own colleges and associations and magazines and radio stations. In the digital age, every tribe and fiefdom and principality and region of Fantasyland—every screwball with a computer and a telecom connection—suddenly had an unprecedented way to instruct and rile up and mobilize believers, and to recruit more.
Yes, we all know all about the extraordinary virtues and benefits of digital communication. You and I now have astounding access to information and ideas and cultural artifacts and people. In every pocket there is now a library, a phonograph, a radio, a movie theater, and a television, as well as a post office, a printing press, a telegraph, a still and video camera, a recording studio, a navigation system, and a radio and TV station. It is advanced technology indistinguishable from magic.
I’m not sure there ever would have been any effective or acceptable way to permit all the Internet’s good parts and minimize the bad ones. By government fiat? By some spontaneous mass reactivation of our recessive American gene for restraint and moderation, like the Danes’ jantelov? We’re a populist democracy of individualists, so too much democracy and individualism were always going to be the directions in which we finally erred.
In any case, the way Internet searching was designed to operate in the 1990s—that is, the way information and belief now flow, rise and fall—is democratic in the extreme. On the Internet, the prominence granted to any factual assertion or belief or theory depends entirely on the preferences of billions of individual searchers. Each click on a link, trillions a year, is effectively a vote pushing that version of the truth toward the top of the pile of results, because every link to a page increases that page’s prominence.
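The link-as-vote dynamic described above is, in essence, the PageRank idea that powered early Web search. Here is a minimal illustrative sketch—a simplified damping-factor model with invented toy page names, not the actual Google algorithm—showing how a page that attracts many links rises to the top of the pile regardless of whether its content is true:

```python
# Simplified link-based ranking (a toy PageRank), illustrating how
# every hyperlink acts as a "vote" for the page it points to.
# The four-page web below is entirely hypothetical.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        # Every page keeps a small baseline share of attention...
        new_rank = {p: (1 - damping) / len(pages) for p in pages}
        for page, outlinks in links.items():
            if not outlinks:  # dangling page: spread its rank evenly
                for p in pages:
                    new_rank[p] += damping * rank[page] / len(pages)
            else:  # ...and passes the rest along its outbound links
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank

web = {
    "sober-science.org":     ["chemtrails-truth.net"],
    "chemtrails-truth.net":  ["wake-up-now.net", "hidden-proof.org"],
    "wake-up-now.net":       ["chemtrails-truth.net"],
    "hidden-proof.org":      ["chemtrails-truth.net"],
}
ranks = pagerank(web)
# The page that three other pages link to dominates the "results,"
# no matter what it actually says.
top = max(ranks, key=ranks.get)
```

The point of the sketch is the feedback loop: popularity, not accuracy, is what the voting mechanism measures, which is exactly why exciting falsehoods can become self-validating.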
Exciting falsehoods tend to do well in the perpetual referenda and become self-validating. A search for almost any “alternative” theory or belief generates many more links to true believers’ pages and sites than to legitimate or skeptical ones, and those tend to dominate the first few pages of results. For instance, beginning in the 1990s, conspiracists decided contrails, the skinny clouds of water vapor that form around jet-engine exhaust, are exotic chemicals, part of a secret government scheme to test weapons or poison citizens or mitigate climate change—and renamed them chemtrails. When I googled “chemtrails proof,” the first page had nine links, the first seven of those linking to validations of the nonexistent conspiracy. When I searched for “government extraterrestrial cover up,” in the first three pages of results, only one link didn’t lead to an article endorsing a conspiracy theory. After a Cornell psychologist’s widely reported experiment purported to show that people can telepathically know the future, a team of psychologists at the University of Pennsylvania and three other universities tried replicating it seven times—and found no evidence supporting precognition. When I googled the two papers, “Feeling the Future” and “Correcting the Past,” the dubious and more exciting first one had seven times as many search results.
Before the Web, it really wasn’t easy to find or stumble across false or crazy information that was convincingly passing itself off as true. Post-Web, however, as the Syracuse University professor Michael Barkun wrote in 2003 in The Culture of Conspiracy, “such subject-specific areas as crank science, conspiracist politics, and occultism are not isolated from one another,” but “rather, they are interconnected. Someone seeking information on UFOs, for example, can quickly find material on antigravity, free energy, Atlantis studies, alternative cancer cures, and conspiracy.
“The consequence of such mingling is that an individual who enters the communications system pursuing one interest soon becomes aware of stigmatized material on a broad range of subjects. As a result, those who come across one form of stigmatized knowledge will learn of others, in connections that imply that stigmatized knowledge is a unified domain, an alternative worldview, rather than a collection of unrelated ideas.”
THE APPARENTLY UNRELATED ideas are related by their exciting-secrets-revealed extremism, over the air and online, in paranormal and New Age and Christian and right-wing and left-wing political permutations. They form tactical alliances, interbreed, and hybridize. One thing leads to another. Ways of thinking correlate and cluster. Believing in one type of fantasy tends to lead to believing in others. The major general who commanded the army’s paranormal R&D unit starting in the late 1970s—personally attempting to levitate, to dematerialize, to pass through walls, and to mentally disperse clouds—later became a 9/11 truther who’s certain that hijacked planes didn’t bring down the towers or hit the Pentagon. And it’s not only a matter of the patently ridiculous coexisting with the patently ridiculous. Seventy percent of the “spiritual” third of U.S. college students, for instance, also believe the untrue claim that “genetically modified food is dangerous to our health,” whereas among the “secular” third of college students, the majority know that GMO foods are safe to eat.*2
Academic research shows that religious belief leads people to think that almost nothing happens accidentally or randomly: as the authors of some recent cognitive science studies at Yale put it, “individuals’ explicit religious and paranormal beliefs” are the main drivers of their exceptional “perception of purpose in life events,” their tendency “to view the world in terms of agency, purpose, and design.” Americans have believed for centuries that the country was inspired and guided by an invisible, omniscient, omnipotent planner and that He and His fellow beings from beyond are perpetually observing and manipulating us. That native religiosity has led since the 1960s both to our special interest in extraterrestrials and to a Third Worldly tendency to believe in conspiracies.
Those Yale researchers also found that believers in fate, religious and otherwise, include a large subset of “highly paranoid people” who “obsess over other people’s hidden motives.” In a paper called “Conspiracy Theories and the Paranoid Style(s) of Mass Opinion,” based on years of survey research, two political scientists at the University of Chicago have confirmed this special American connection. “The likelihood of supporting conspiracy theories is strongly predicted,” they concluded, by two key pieces of our national character that derive from our particular Christian culture: “a propensity to attribute the source of unexplained or extraordinary events to unseen, intentional forces” and a weakness for “melodramatic narratives as explanations for prominent events, particularly those that interpret history relative to universal struggles between good and evil.” In fact, they found the single strongest driver of conspiracy belief to be belief in end-time prophecies. Belief in things such as ghosts and psychic healing also “significantly predicted belief in five specific conspiracy theories,” according to the Chicago research. In other words, supernatural belief is the great American gateway to conspiracy belief.
Whether an individual’s conspiracism exists alongside religious faith, psychologically they’re similar: a conspiracy theory can be revised and refined and further confirmed, but it probably can’t ever be disproved to a true believer’s satisfaction. The final conspiratorial nightmare crackdown is always right around the corner but never quite comes—as with the perpetually fast-approaching end-time. Like Christians certain both that evolution is a phony theory and that God created people a few thousand years ago, conspiracists are simultaneously credulous (about impossible plots) and incredulous (about the confusing, dull gray truth). Conspiracists often deride arguments against their theories as disinformation cooked up by the conspirators—the way some Christians consider evolutionary explanations to be the work of the devil.
Researchers and experimenters have repeatedly demonstrated this pinball effect, in which fantastical beliefs lead to other, disparate fantastical beliefs. Once people decide a particular theory is true, they’re apt to be open to another and another and another. In their 2013 paper on conspiracy believers, a team of German social psychologists summarized the research. “In fact,” they found,
this tendency even extends to beliefs in mutually contradictory conspiracy theories, and to beliefs in fully fictitious conspiracy theories. Thus, those who believe that Princess Diana faked her own death are also more likely to believe that she was murdered; those who believe…that John F. Kennedy fell victim to an organized conspiracy…are more likely to believe that there was a conspiracy behind the success of the Red Bull energy drink—a conspiracy theory that was purposely developed for a social psychology study.
A MAIN ARGUMENT of this book concerns how so many parts of American life have morphed into forms of entertainment. From 1980 to the end of the century, that tendency reached a tipping point in politics and the political discourse. First a Hollywood celebrity became a beloved president by epitomizing and encouraging the blur between fiction and reality. Then talk radio and TV news turned into forms of politicized show business. And finally the Internet came along, making false beliefs both more real-seeming and more contagious, creating a kind of fantasy cascade in which millions of bamboozled Americans surfed and swam. Why did Senator Daniel Patrick Moynihan begin remarking frequently during the 1980s and ’90s that people were entitled to their own opinions but not to their own facts? Because until then, it hadn’t seemed like a serious problem in America.
With the Internet, our marketplace of ideas became exponentially bigger and freer than ever, it’s true. Thomas Jefferson said he’d “rather be exposed to the inconveniences attending too much liberty than those attending too small a degree of it”—and it would all be okay because in the new United States “reason is left free to combat” every sort of “error of opinion.” However, I’m inclined to think if he and our other democratic forefathers returned, they would see the present state of affairs as too much of a good thing. Reason remains free to combat unreason, but the Internet entitles and equips all the proponents of unreason and error to a previously unimaginable degree. Particularly for a people with our history and propensities, the downside—this proliferation and reinforcement of nutty ideas and false beliefs, this assembling of communities of the utterly deluded, this construction of parallel universes that look and feel perfectly real, the viral appeal of the untrue—seems at least as profound as the upside.
*1 Reagan delivered his “evil empire” speech in Orlando to the National Association of Evangelicals, an hour after he had been at Disney World. “I just watched a program—I don’t know just what to call it—a show, a pageant…at one point in the movie Mark Twain, speaking of America, says, ‘We soared into the twentieth century on the wings of invention and the winds of change.’ ” He’d seen Disney’s The American Adventure, featuring an animatronic Mark Twain saying things Mark Twain never said.
*2 2013 Trinity College American Religious Identification Survey.