Part Two

The Science of Psychiatric Drugs

3

The Roots of an Epidemic

“Americans have come to believe that science is capable of almost everything.”

—DR. LOUIS M. ORR, AMA PRESIDENT (1958)1

It may seem odd to begin an investigation of a modern-day epidemic with a visit back to one of the great moments in medical history, but if we are going to understand how our society came to believe that Thorazine kicked off a psychopharmacological revolution, we need to go back to the laboratory of German scientist Paul Ehrlich. He was the originator of the notion that “magic bullets” could be found to fight infectious diseases, and when he succeeded, society thought that the future would bring miracle cures of every kind.

Born in East Prussia in 1854, Ehrlich spent his early years as a scientist researching the use of aniline dyes as biological stains. He and others discovered that the dyes, which were used in the textile industry to color cloth, had a selective affinity for staining the cells of different organs and tissues. Methyl blue would stain one type of cell, while methyl red stained a different type. In an effort to explain this specificity, Ehrlich hypothesized that cells had molecules that protruded into the surrounding environment, and that a chemical dye fit into these structures, which he called receptors, in the same way that a key fits into a lock. Every type of cell had a different lock, and that was why methyl blue stained one type of cell and methyl red another—they were keys specific to those different locks.

Ehrlich began doing this research in the 1870s, while he was a doctoral student at the University of Leipzig, and this was the same period that Robert Koch and Louis Pasteur were proving that microbes caused infectious diseases. Their findings led to a thrilling thought: If the invading organism could be killed, the disease could be cured. The problem, most scientists at the time concluded, was that any drug that was toxic to the microbe would surely poison the host. “Inner disinfection is impossible,” declared scientists at an 1882 Congress of Internal Medicine in Germany. But Ehrlich’s studies with aniline dyes led him to a different conclusion. A dye could stain a single tissue in the body and leave all others uncolored. What if he could find a toxic chemical that would interact with the invading microbe but not with the patient’s tissues? If so, it would kill the germ without causing any harm to the patient.

Ehrlich wrote:

If we picture an organism as infected by a certain species of bacterium, it will be easy to effect a cure if substances have been discovered which have a specific affinity for these bacteria and act on these alone. [If] they possess no affinity for the normal constituents of the body, such substances would then be magic bullets.2

In 1899, Ehrlich was appointed director of the Royal Institute of Experimental Therapy in Frankfurt, and there he began his search for a magic bullet. He focused on finding a drug that would selectively kill trypanosomes, which were one-celled parasites that caused sleeping sickness and a number of other illnesses, and he soon settled on an arsenic compound, atoxyl, as the best magic-bullet candidate. This would be the chemical he would have to manipulate so it fit into the parasite’s “lock” while not opening the lock on any human cells. He systematically created hundreds of atoxyl derivatives, testing them again and again against trypanosomes, but time and time again he met with failure. Finally, in 1909, after Ehrlich had tested more than nine hundred compounds, one of his assistants decided to see if compound number 606 would kill another recently discovered microbe, Spirochaeta pallida, which caused syphilis. Within days, Ehrlich had his triumph. The drug, which came to be known as salvarsan, eradicated the syphilis microbe from infected rabbits without harming the rabbits at all. “This was the magic bullet!” wrote Paul de Kruif in a 1926 bestseller. “And what a safe bullet!” The drug, he added, produced “healing that could only be called biblical.”3

Ehrlich’s success inspired other scientists to search for magic bullets against other disease-causing microbes, and although it took twenty-five years, in 1935 the Bayer chemical company provided medicine with its second miracle drug. Bayer discovered that sulfanilamide, a derivative of an old coal-tar compound, was fairly effective in eradicating staphylococcal and streptococcal infections. The magic-bullet revolution was now truly under way, and next came penicillin. Although Alexander Fleming had discovered this bacteria-killing mold in 1928, he and others had found it difficult to culture, and even when they’d succeeded in growing it, they hadn’t been able to extract and purify sufficient quantities of the active ingredient (penicillin) to turn it into a useful drug. But in 1941, with World War II raging, both England and the United States saw a desperate need to surmount this hurdle, for wound infections had always been the big killers during war. The United States asked scientists from Merck, Squibb, and Pfizer to work jointly on this project, and by D-Day in 1944, British and American manufacturers were able to produce enough penicillin for all of the wounded in the Normandy invasion.

“The age of healing miracles had come at last,” wrote Louis Sutherland, in his book Magic Bullets, and indeed, with the war over, medicine continued its great leap forward.4 Pharmaceutical companies discovered other broad-acting antibiotics—streptomycin, Chloromycetin, and Aureomycin, to name a few—and suddenly physicians had pills that could cure pneumonia, scarlet fever, diphtheria, tuberculosis, and a long list of other infectious diseases. These illnesses had been the scourge of mankind for centuries, and political leaders and physicians alike spoke of the great day at hand. In 1948, U.S. secretary of state George Marshall confidently predicted that infectious diseases might soon be wiped from the face of the earth. A few years later, President Dwight D. Eisenhower called for the “unconditional surrender” of all microbes.5

As the 1950s began, medicine could look back and count numerous other successes as well. Pharmaceutical firms had developed improved anesthetics, sedatives, antihistamines, and anticonvulsants, evidence of how scientists were getting better at synthesizing chemicals that acted on the central nervous system in helpful ways. In 1922, Eli Lilly had figured out how to extract the hormone insulin from the pancreas glands of slaughterhouse animals, and this provided doctors with an effective treatment for diabetes. Although replacement insulin didn’t rise to the level of a magic-bullet cure for the illness, it came close, for it provided a biological fix for what was missing in the body. In 1950, British scientist Sir Henry Dale, in a letter to the British Medical Journal, summed up this extraordinary moment in medicine’s long history: “We who have been able to watch the beginning of this great movement may be glad and proud to have lived through such a time, and confident that an even wider and more majestic advance will be seen by those living through the fifty years now opening.”6

The United States geared up for this wondrous future. Prior to the war, most basic research had been privately funded, with Andrew Carnegie and John D. Rockefeller the most prominent benefactors, but once the war ended, the U.S. government established the National Science Foundation to federally fund this endeavor. There were still many diseases to conquer, and as the nation’s leaders looked around for a medical field that had lagged behind, they quickly found one that seemed to stand above all the rest. Psychiatry, it seemed, was a discipline that could use a little help.

Imagining a New Psychiatry

As a medical specialty, psychiatry had its roots in the nineteenth-century asylum, its founding moment occurring in 1844, when thirteen physicians who ran small asylums met in Philadelphia to form the Association of Medical Superintendents of American Institutions for the Insane. At that time, the asylums provided a form of environmental care known as moral therapy, which had been introduced into the United States by Quakers, and for a period, it produced good results. At most asylums, more than 50 percent of newly admitted patients would be discharged within a year, and a significant percentage of those who left never came back. A nineteenth-century long-term study of outcomes at Worcester State Lunatic Asylum in Massachusetts found that 58 percent of the 984 patients discharged from the asylum remained well throughout the rest of their lives. However, the asylums mushroomed in size in the latter part of the 1800s, as communities dumped the senile elderly and patients with syphilis and other neurological disorders into the institutions, and since these patients had no chance of recovering, moral therapy came to be seen as a failed form of care.

At their 1892 meeting, the asylum superintendents vowed to leave moral therapy behind and instead utilize physical treatments. This was the dawn of a new era in psychiatry, and in very short order, they began announcing the benefits of numerous treatments of this kind. Various water therapies, including high-pressure showers and prolonged baths, were said to be helpful. An injection of extract of sheep thyroid was reported to produce a 50 percent cure rate at one asylum; other physicians announced that injections of metallic salts, horse serum, and even arsenic could restore lucidity to a mad mind. Henry Cotton, superintendent at Trenton State Hospital in New Jersey, reported in 1916 that he cured insanity by removing his patients’ teeth. Fever therapies were said to be beneficial, as were deep-sleep treatments, but while the initial reports of all these somatic therapies told of great success, none of them stood the test of time.

In the late 1930s and early 1940s, asylum psychiatrists embraced a trio of therapies that acted directly on the brain, which the popular media—at least initially—reported as “miracle” cures. First came insulin coma therapy. Patients were injected with a high dose of insulin, which caused them to lapse into hypoglycemic comas, and when they were brought back to life with an injection of glucose, the New York Times explained, the “short circuits of the brain vanish, and the normal circuits are once more restored and bring back with them sanity and reality.”7 Next came the convulsive therapies. Either a poison known as Metrazol or electroshock was used to induce a seizure in the patient, and when the patient awoke, he or she would be free of psychotic thoughts and happier in spirit—or so the asylum psychiatrists said. The final “breakthrough” treatment was frontal lobotomy, the surgical destruction of the frontal lobes apparently producing an instant cure. This “surgery of the soul,” the New York Times explained, “transforms wild animals into gentle creatures in the course of a few hours.”8

With such articles regularly appearing in major newspapers and in magazines like Harper’s, Reader’s Digest, and the Saturday Evening Post, the public had reason to believe that psychiatry was making great strides in treating mental illness, participating in medicine’s great leap forward. But then, in the wake of World War II, the public was forced to confront a very different reality, one that produced a great sense of horror and disbelief. There were 425,000 people locked up in the country’s mental hospitals at that time, and first Life magazine and then journalist Albert Deutsch, in his book The Shame of the States, took Americans on a photographic tour of the decrepit facilities. Naked men huddled in barren rooms, wallowing in their own feces. Barefoot women clad in coarse tunics sat strapped to wooden benches. Patients slept on threadbare cots in sleeping wards so crowded that they had to climb over the foot of their beds to get out. These images told of unimaginable neglect and great suffering, and at last, Deutsch drew the inevitable comparison:

As I passed through some of Byberry’s wards, I was reminded of the Nazi concentration camps at Belsen and Buchenwald. I entered buildings swarming with naked humans herded like cattle and treated with less concern, pervaded by a fetid odor so heavy, so nauseating, that the stench seemed to have almost a physical existence of its own. I saw hundreds of patients living under leaking roofs, surrounded by moldy, decaying walls, and sprawling on rotting floors for want of seats or benches.9

The nation clearly needed to remake its care of the hospitalized mentally ill, and even as it contemplated that need, it found reason to worry about the mental health of the general population. During the war, psychiatrists had been charged with screening draftees for psychiatric problems, and they had deemed 1.75 million American men mentally unfit for service. While many of the rejected draftees may have been feigning illness in order to avoid conscription, the numbers still told of a societal problem. Many veterans returning from Europe were also struggling emotionally, and in September 1945, General Lewis Hershey, director of the Selective Service System, told Congress that the nation badly needed to address this problem, which had remained hidden for so long. “Mental illness was the greatest cause of noneffectiveness and loss of manpower that we met” during the war, he said.10

With mental illness now a primary concern for the nation—and this awareness coming at the very time that antibiotics were taming bacterial killers—it was easy for everyone to see where a long-term solution might be found. The country could put its faith in the transformative powers of science. The existing “medical” treatments said to be so helpful—insulin coma, electroshock, and lobotomy—would have to be provided to more patients, and then long-term solutions could arise from the same process that had produced such astonishing progress in fighting infectious diseases. Research into the biological causes of mental illnesses would lead to better treatments, both for those who were seriously ill and for those who were only moderately distressed. “I can envisage a time arriving when we in the field of Psychiatry will entirely forsake our ancestry, forgetting that we had our beginnings in the poorhouse, the workhouse and the jail,” said Charles Burlingame, director of the Institute of Living in Hartford, Connecticut. “I can envisage a time when we will be doctors, think as doctors, and run our psychiatric institutions in much the same way and with much the same relationships as obtain in the best medical and surgical institutions.”11

In 1946, Congress passed the National Mental Health Act, which put the federal government’s economic might behind such reform. The government would sponsor research into the prevention, diagnosis, and treatment of mental disorders, and it would provide grants to states and cities to help them establish clinics and treatment centers. Three years later, Congress created the National Institute of Mental Health (NIMH) to oversee this reform.

“We must realize that mental problems are just as real as physical disease, and that anxiety and depression require active therapy as much as appendicitis or pneumonia,” wrote Dr. Howard Rusk, a professor at New York University who penned a weekly column for the New York Times. “They are all medical problems requiring medical care.”12

The stage had now been set for a transformation of psychiatry and its therapeutics. The public believed in the wonders of science, the nation saw a pressing need to improve its care of the mentally ill, and the NIMH had been created to make this happen. There was the expectation of great things to come and, thanks to the sales of antibiotics, a rapidly growing pharmaceutical industry ready to capitalize on that expectation. And with all those forces lined up, perhaps it is no surprise that wonder drugs for both severe and not-so-severe mental illnesses—for schizophrenia, depression, and anxiety—soon arrived.
