Communities of Inventors: Solutions in Search of Problems

RESEARCH AND DEVELOPMENT (R&D) did not become a focus of national interest until long after it had become a national institution. When the nation in the mid-twentieth century became concerned about its institutions of learning and its ability to advance the frontiers of scientific knowledge, people worried over schools and universities, and sometimes over government research, but the nation as a whole seemed barely aware of the decisive and growing role of the industrial research laboratory. Not until the Depression of the 1930’s had stirred awareness of the connection between scientific progress, employment, and prosperity did statistics on such research become available. By 1956 the total national annual expenditure for Research and Development in the natural sciences alone came to nearly $8.5 billion, more than twice the total national expenditure in that year for all institutions of higher education. By the 1960’s R&D had entered dictionaries as another Americanism. Despite the dramatic increase of national expenditures on higher education (from some $7 billion in 1960 to $13 billion in 1965 and $23 billion in 1970), the growing annual expenditures on R&D remained steadily ahead, totaling $27 billion by 1970. While federal funds for this purpose were an ever increasing proportion, the expenditures by private industrial firms, at least until the mid-twentieth century, accounted for fully half the totals.

THE INDUSTRIAL RESEARCH LABORATORY was not invented in the United States. Since ancient times, philosophers and visionaries had imagined ideal communities in which men of science collaborated. In the early seventeenth century, Francis Bacon’s New Atlantis described “Solomon’s House,” where sages pooled their knowledge and searched for its uses. The actual societies of learned men, like the French Academy, founded in 1635, and the Royal Society in London, founded in 1662, were primarily literary and honorific; they never produced that active scientific collaboration which Bacon had desired. The first modern industrial research on a large scale occurred in Germany, where, before the end of the nineteenth century, chemical and optical firms had shown how profitable it was to apply science to industry. Germany, too, had many of the negative advantages found in the United States. Since industry developed there later than in England, Germany lacked the petrifying traditions which had kept science separated from technology. And the new synthetic-dye industry in Germany depended heavily on laboratory discoveries, especially in synthesizing indigo and other products. The new German optical industry, led by Carl Zeiss, also successfully dominated the world market in its field, and so discouraged the growth of industrial research elsewhere. The Kaiser Wilhelm Institute, founded in 1911, flourished at first under the joint sponsorship of government and industry, but the Nazis proved its ruin.

In the United States the modern industrial research laboratory would eventually find the habitat where it could flourish without precedent. Here too it was to be a phenomenon of the twentieth century, quite distinct both in scale and in purpose from earlier American efforts.

We have already seen how individual scientific adventurers had set up laboratories for a variety of purposes. The pioneer oil men had employed Benjamin Silliman, Jr., to analyze their oil samples. Charles T. Jackson, who had introduced Samuel F. B. Morse to the principles of the electric telegraph, opened a chemical laboratory in Boston in 1836, where he experimented with guncotton and sorghum, and proposed the application of ether in surgery. At about the same time, a Philadelphia chemist, James C. Booth, set up a chemical laboratory for researches on sugar, molasses, and iron, and there he provided instruction to ambitious young chemists.

During the nineteenth century, ingenious individual inventors struggled and sacrificed and braved public ridicule. Eli Whitney, Oliver Evans, Elias Howe, Gail Borden, Samuel F. B. Morse, Alexander Graham Bell, George Eastman, John Wesley Hyatt, and scores of others spent their own lives, and plagued their families, to make some particular novelty into a marketable commodity. They were inventor-businessmen. Having imagined a sewing machine, they had to fashion it with their own hands, demonstrate it, show how it could be manufactured, and finally persuade someone to do the manufacturing, unless they decided also to do that themselves. Then, too, there were the businessmen-inventors, who had a nose for the new, and were willing to risk their energy and their capital. They were the host that included Frederic Tudor, Edwin Drake, Isaac Singer, Gustavus Swift, and John Henry Patterson; each had something of the inventor in him, but they were mainly organizers and promoters. What characterized all of them was single-minded persistence, what the American catalogue of schoolboy virtues called “stick-to-it-iveness.” They pursued a fixed idea despite sickness, poverty, or public ridicule. Like the prospector who knew what he was seeking, they were uncertain only of where to find it.

But the industrial research laboratory was organized to seek in a new spirit. New institutions and new men would arise, with a new attitude toward their purpose, toward time and cost and “need.” They spoke a new idiom of “feasibility” and “pilot plants.” The men who searched in these laboratories were a new breed. It would be hard to make them popular heroes, because they were working on frontiers that most Americans did not even know were there. They were no longer amateurs, nor rule-of-thumb men. They were no longer workers-in-attics, but scientist-statesmen with advanced training, using an esoteric language in the highest councils of the nation. No longer looking for some particular thing, they were going as much to seek as to find. The community emphasis, the vague hopes and booster optimisms of the earlier movers across physical America, they were now reliving in the mysterious wilderness of science.

“ELECTRICITY,” wrote Charles W. Eliot on his inscription for the Union Depot in Washington, is “carrier of light and power; devourer of time and space; bearer of human speech over land and sea; greatest servant of man—yet itself unknown.” To explore this unknown and discover its treasure was the self-appointed mission of the pioneer American industrial research laboratory founded by General Electric in 1900.

The prophet and the pioneer organizer of the new institution was Willis R. Whitney. As the son of a chair manufacturer in Jamestown in far-western New York, Whitney had been expected to learn a trade. But in high school he was inspired at the evening classes run by a local businessman who had made a hobby of science. There he looked through a microscope for the first time, and then he persuaded his parents to buy one for him. Whitney went on to the Massachusetts Institute of Technology (a land-grant institution), where he graduated in chemistry in 1890, and then on to two years more and a doctorate at the University of Leipzig in Germany, followed by six months at the Sorbonne. Returning to M.I.T., he taught chemistry, explored the boundaries between electricity and chemistry, and developed an electrochemical theory of corrosion. When the General Electric Company decided to organize a research laboratory, they turned to the thirty-two-year-old Whitney. His first response, as they all later recalled, was “I would rather teach than be President!”

Whitney’s reluctance to leave the university for General Electric was understandable. For in the United States until then, scientific research as well as teaching had, except for a few government-sponsored enterprises like the Smithsonian Institution, been centered in colleges and universities. The General Electric Company that had grown out of Edison’s company needed the talents of a Whitney, since Edison himself had become a businessman. The brilliant German immigrant Charles P. Steinmetz, as the company’s consulting engineer, had made advances in the theory of alternating currents, but he was a loner, and by temperament not qualified to organize anything. The original Edison lamp patent had expired in 1894, and other early patents had either expired or had lost their value because of later advances. General Electric still controlled most of the incandescent lamp sales in the United States, but gas lighting and arc lighting continued to be competitors. German firms had developed filaments of osmium, tantalum, and other new materials, far superior to the original carbon filament. General Electric needed new ideas, new materials, and new products. As Whitney recalled:

The time, the year 1900, was most propitious…. The panic and depression of 1893–96 had shaken things down. The prejudices of the pioneer days had largely passed away. We had learned that there was a field for all the various applications of electricity, whether of alternating current or of direct current, such as the arc lamp, incandescent lamp and the electric street railway, and we were busy welding these together. The opinion seemed to have been generally held that no radically new developments could arise. Copper was the best conductor of electricity; iron the best for magnetism; carbon the best for electrodes for arc lamps and lamp filaments and for brushes for commutators. As far as we could see it was likely that these materials would always remain the best for their respective purposes. Such at least was the opinion of most engineers. However there were a few who thought differently….

At first when Whitney went to the General Electric plant at Schenectady, he worked in a barn at the back of Steinmetz’s house. Then, as he organized the laboratory, he secured a new building and gathered a galaxy of scientists with whose help he gradually defined his enterprise.

William D. Coolidge—a poor farm boy from Massachusetts, who also had gone through M.I.T. on scholarships, then on to a Leipzig degree, and back to M.I.T.—was one of Whitney’s most brilliant early collaborators. In 1908, within three years after joining Whitney, Coolidge invented a process for making tungsten filaments which (combined with other improvements) produced lamps two and a half times as efficient as those used before. Coolidge went on to develop a cathode tube which opened a new era in X-rays and gave General Electric the lead in that young science. Irving Langmuir, who also came with a German doctor’s degree in physical chemistry, found that the tungsten filament functioned much more efficiently when set in a bulb filled with inert gas. Then, as Langmuir moved on from light bulbs to vacuum tubes for radio, he incidentally developed a new atomic theory which proved essential for the exploitation of atomic energy. No one had intended, or even imagined, these particular directions. Like the Spanish and Portuguese navigators four hundred years before who sought a short path to the Orient and found a New World in their way, these science explorers were touching the edges of unknown continents. But supporting such explorers as these became normal in the plans and budgets of American enterprise, and came to be taken for granted by the man in the street.

The proper job of American engineers, Whitney argued, was to produce obsolescence. A large enterprise like General Electric, he explained, could not stay alive merely by producing the best of old products.

Our research laboratory was a development of the idea that large industrial organizations have both an opportunity and a responsibility for their own life insurance. New discovery can provide it. Moreover the need for such insurance and the opportunity it presents rise faster than in direct proportion to the size of the organization. Manufacturing groups could thus develop their continuity beyond that of the originator because the accumulated values of knowledge and experience became generally recognized. No one yet knows the possible longevity of a properly engineered manufacturing system.

Out of the vagrant researches in Whitney’s laboratory came products as diverse as the “calrod” (a new insulated wire that made possible the electric cooking range), underwater stethoscopes for detecting enemy submarines, improvements in public-address systems, and high-frequency dielectric heating to induce artificial fevers for the treatment of syphilis, arthritis, boils, and bursitis.

WHITNEY, THEN, BECAME an American Heraclitus, preaching and teaching that only change is real. Into his world of science he carried the faith of William James and John Dewey that all old categories of knowledge, of need, and even the familiar definitions of invention and of novelty were the enemies of life and progress. As “director” of the laboratory he said he did not want to direct. “A director merely points, like some wooden arrow along the highway. Research directing is following the openings of acceptable new ideas. It is watching the growth of thought in the minds and hands of careful investigators. Even the lonely mental pioneer, being grubstaked, so to speak, advances so far into the generally unknown that a so-called director merely happily follows the new ways provided. All new paths both multiply and divide as they proceed.” A research laboratory was not where men fulfilled assignments, but where (in Langmuir’s phrase) men exercised “the art of profiting from unexpected occurrences.” Habits, procedures, “science” itself, were somehow an obstacle in the pursuit of the still unknown.

Whitney was fond of quoting the physiologist Claude Bernard’s words, “The goal of all life is death.” For Whitney this meant that “the interesting processes of life will still go on and forever change.” “The grand goal of good gadgets is gradual obsolescence, or, expressed differently, we may get better every day.” Recalling the dogmatic prophecies of his own time—the expectation that electric streetcars would one day reach into every alley of every town, that electric lighting was the final improvement in street lighting—he wondered whether one could ever predict the next form of transportation; or whether there might not be some way of putting “phosphors” into the cement of the roadways to collect and store cheap daylight and feed it back at night, thus making all earlier forms of street lighting obsolete.

INDUSTRIAL RESEARCH LABORATORIES grew and flourished on the borders of all the newly discovered worlds—of electricity and electronics, and of photography, of petroleum, of rubber and glass, and of synthetics. DuPont laboratories appeared in 1911, and in 1912 George Eastman founded the Kodak Research Laboratories, followed by laboratories in the United States Rubber Company in 1913, in the Standard Oil of New Jersey in 1919, and in the Bell Telephone Company in 1925. By mid-century the nation had two hundred large industrial research laboratories and two thousand others. From the Kodak laboratories, besides innovations in photography, there came apparently irrelevant, incidental discoveries, such as the finding that concentrated Vitamin A could be distilled from fish-liver oil. From the United States Rubber laboratories, in addition to advances in rubber technology, there came latex threads, latex textiles, latex insulated wire, new paper, and new adhesives.

These laboratories were as varied in sponsorship and support as the hundreds of American institutions of higher learning. Besides the laboratories of industrial firms, there were government laboratories pioneered by the Department of Agriculture and those in the Department of Commerce established by the Bureau of Standards. In 1930, although the federal government’s expenditures for scientific research already exceeded those of all the universities, the federal figure reached only $23 million. During World War II, federal research expenditures rocketed to $750 million. And after the war these figures kept rising, with continuing research on atomic energy, aircraft development, space exploration, and countless other projects. Meanwhile, technological research institutes were founded and supported by large industrial firms. The Mellon Institute, founded in 1913 at the University of Pittsburgh, offered an arrangement whereby an industrial firm that supported a researcher would acquire his discoveries as the property of the firm. The Battelle Memorial Institute (1929), founded from an iron-and-steel fortune “for the purpose of education and creative and research work and the making of discoveries and inventions for industry,” financed Xerox in its early stages, as we have seen, and established a reputation for supporting adventurous research.

The trade associations opened their own laboratories, led by the National Canners in 1913 and followed by the successful laboratories of the National Paint, Varnish, and Lacquer Association, the Institute of Paper Chemistry, the Textile Research Institute, the American Petroleum Institute, and others. Some of these aimed mainly to improve public relations, but by mid-century there was hardly a major labor union or a trade association that was not somewhere supporting authentic industrial research.

Second only to Willis Whitney as an apostle of the new research was the pioneer of the private consulting laboratory, Arthur D. Little. Born into an old Massachusetts family, Little studied chemistry at M.I.T., and incidentally acquired a lively interest in literature. In 1886 he and another young chemist announced “the Chemical Laboratory which we have established at No. 103 Milk Street, Boston…. Mr. Griffin and Mr. Little have had several years’ experience in the development of new chemical processes on the commercial scale and are prepared to undertake, either in their own laboratory, or upon the spot, investigations for the improvement of processes and the perfection of products.” After his partner was killed in a laboratory accident in 1893, Little carried on alone.

At the Paris Exhibition of 1889, when Little saw “artificial silk” made from nitrocellulose, he was enticed by the possibilities of artificial fibers. He secured the American license to use a new viscose process to solubilize cellulose. Few businessmen were willing to invest in so dubious an enterprise, but in 1900 Little finally found backers for his Cellulose Products Company. Although the firm was a financial failure, before the company was dissolved it had succeeded in opening up the new world of cellulose products. Eastman Kodak bought the firm’s patent for the first noninflammable motion picture film, and the Lustron Company bought the “artificial silk” patents which enabled them to pioneer in the American manufacture of acetate silk. After Little co-authored a textbook on paper manufacturing, he was drawn into the chemistry of cellulose. And in 1911, when the United Fruit Company engaged him to find a way to make paper from bagasse (the waste-fiber by-product of making sugar from Cuban cane), Little devised an experimental paper machine. This became one of the earliest “pilot plants” (a new American expression) in the United States. These and other Little experiments incidentally helped establish the practice of building factories expressly to test new techniques before risking large-scale production; and pilot plants in turn stimulated the search for novelty.

In 1916 Little moved his operations into a grand new Cambridge edifice, one of the first buildings in the United States specifically designed for industrial research, which soon became known as Little’s Research Palace. There he and his research team explored a wide miscellany: petrochemicals; odors and flavors, how to produce them and how to classify them; glass containers for perfume, toothpaste, and other purposes; stencil paper; and improved materials for battery boxes. During World War I the Little laboratories improved airplane glue and devised better gas-mask filters.

The unexplored territory where Little worked—like the American West in the nineteenth century—was, of course, a natural habitat of hoaxers. And Little enjoyed exposing them. One Little client had invested a half-million dollars in a scheme to generate electricity directly by the oxidation of carbon electrodes, but Little showed that the “inventor” was only a clever charlatan who had secretly attached the bushings of his generator to the regular power lines.

In his will, Little bequeathed ownership of his firm to M.I.T., and after his death in 1935 the work of his laboratories went on. In 1942 the company helped the government of Puerto Rico devise its “Operation Bootstrap” to promote industrialization: its wide-ranging recommendations included ways to improve the culture of sugar and the distillation of rum, changes in the island’s tax structure, and a program to promote tourism. During World War II the firm’s researchers developed the Kleinschmidt vapor compression method, still used for making drinking water from sea water. They developed new techniques for liquefying oxygen and other gases, and advanced the new science of “cryogenics” for the study of phenomena at low temperatures. They collaborated on support equipment for the first hydrogen bomb in 1952, they helped devise liquid-fuel loading systems for use in space missiles, and they improved the design of space suits. And they developed new uses of radioactive tracers in medicine and in manufacturing. In 1966, under the Model Cities Program, when A. D. Little, Inc., devised a plan for East Cleveland, the scheme touched everything from city government, transportation, schools, and shop-front architecture to child day care, garbage removal, and the colors of lampposts.

“Research,” Little had preached, “is the mother of industry.” In a world where, as Kipling said, “any horror is credible,” Little argued that unlimited progress was also credible. “The United States,” he declared, “is an aggregation of undeveloped empires, sparsely occupied by the most wasteful people in the world.”

AS THE PROPHETS of novelty were changing the meaning of the new, the possible, and the necessary, common sense was understandably losing its persuasive power. “You can’t make a silk purse out of a sow’s ear”—this bit of folk wisdom had irritated Arthur D. Little. So, in order to prove that in modern America anything was possible, Little set out in 1921 to do the proverbially impossible. From Wilson & Co., the Chicago meat packers, he secured ten pounds of gelatine “manufactured [according to an accompanying affidavit] wholly from sows’ ears.” From this he spun an artificial silk thread, he wove the thread into a fabric, and actually produced an elegant purse “of the sort which ladies of great estate carried in medieval days—their gold coin in one end and their silver coin in the other. It is one of which both Her Serene and Royal Highness the Queen of the Burgundians in her palace, and the lowly Sukie in her sty, might well have been proud.” Little offered the purse for public display, and then described the process in a pamphlet which was subtitled “A Contribution to Philosophy.” He might have called it “A Caution against Philosophy,” and against all modes of thought that fence men in from the future.

Little’s kind of exploring carried its price. When novelty ceased to be astonishing and unusual and became normal and expected, the boundary between the usual and the unusual, between the commonplace and the surprising, was blurred. “New impressions so crowd upon us,” wrote Little, “that the miracle of yesterday is the commonplace of today.” When the unexpected and the miraculous were expected, what comfort or security was there in expectation? When invention became the mother of industry, invention soon became the mother of necessity. Americans would have to look about them at the state of technology, and read the advertisements in their paper or watch the commercials on television to discover their “needs.”

And Americans were finding solutions for which they had not yet discovered the problems. Americans had turned on the tap of novelty. Would they, or could they, turn it off?
