PART EIGHT

Language, Knowledge, and the Arts

“Knowledge is no longer an immobile solid; it has been liquefied. It is actively moving in all the currents of society itself.”

JOHN DEWEY

“When we Americans are done with the English language it will look as if it had been run over by a musical comedy.”

FINLEY PETER DUNNE

“It is not easy to know what you like. Most people fool themselves their entire lives through about this.”

ROBERT HENRI

NEVER BEFORE HAD democracy been tried on such a large scale, nor allowed to shape the academic standards of a whole nation’s language, to redefine its higher learning and reconstruct its notions of art. Out of this remarkable opportunity came pervasive new ambiguities and uncertainties about the standards by which all culture was to be measured. Was your language “right” or “wrong”? Were you speaking eloquently or crudely? Were you acquiring knowledge or falsehood, were you being educated, propagandized, entertained, or actually deceived? What were the proper boundaries of your profession? What were the roles of teacher or of pupil? What was knowledge and what was ignorance? What knowledge was “useless” and what was “useful”? Did art have to be “beautiful”? And if not, what was art anyway?

Underlying the democratic transformations of the ways of judging and measuring was a faith in “the people,” in their inherent spontaneous wisdom, when unguided by authority or by tradition. Vox populi, vox Dei (the voice of the people is the voice of God) became the ruling maxim of more and more of American life. Even if this democratic faith was never quite universal, and only a few dared make it explicit, in America it had become a nation’s orthodoxy. And democracy had made society into a mirror where people saw the way things were, and made that the measure of the way things ought to be.

50

The Decline of Grammar: The Colloquial Conquers the Classroom

IN THE EARLY TWENTIETH CENTURY, American technology blurred the proverbial distinctions between speech and writing. After the telephone came into common use, it was of course no longer true that the reach of the spoken word was limited by the power of the speaker’s lungs. After the phonograph, future generations could not only read a man’s written words, they could actually hear the sound of his voice. The new technologies of repeatable experience made speech, for all practical purposes, as durable and as easily diffused as writing.

At the same time a new science of language, which aimed to dissolve pedantic and aristocratic ways of thinking, gave a new dominance to speech as against writing and made new problems for the naïve citizen. Americans, since the colonial period, had assuaged their insecurities of social status by the reassuring certainties of grammar and spelling. While the spoken language in America was more classless and more uniform than the spoken language of England, while the American vernacular was free and lively and inventive, bursting with anticipation and exaggeration, language teaching aimed to instill the rules of “correct” speech and writing. The American language championed by Noah Webster and others was conceived to be a “purified” English language with rules all its own. One virtue of democracy, according to Webster, was that it offered people a “standard”—in “the rules of the language itself”—more uniform than the language of aristocracies.

BY THE MID-TWENTIETH CENTURY, new democratic criteria had come into the classroom, changing the notion of what standards, if any, a democratic society could apply to its language. These were the product of a new science of linguistics. Until about the mid-nineteenth century, studies of the origin and development of language had been tangled with theology, philosophy, rhetoric, and logic. Then, when a modern descriptive science of language began, in Europe it focused on the relationship among the so-called Indo-European languages, and their derivation from hypothetical original forms. Efforts to make this study more “scientific” had tied it to specific new theories of psychology. By the early twentieth century, European scholars had begun to survey the way language really worked. The monumental New English Dictionary on Historical Principles (commonly called the Oxford English Dictionary, 12 vols., 1888–1928), together with the works of continental scholars, provided raw materials for a new era of generalizing.

A new American school of linguistics used distinctive American opportunities. The two founders, Edward Sapir and Leonard Bloomfield, both discovered their point of departure in the languages of the American Indian. Sapir, son of an immigrant German Jewish cantor, had come to America at the age of five and won the scholarship competition for “the brightest boy in New York City.” As a young man he became interested in the nature of language, went to the state of Washington to study the language of the Wishram Indians, and then in 1921 produced his basic work, Language: An Introduction to the Study of Speech. Bloomfield had studied the language of the Menominee and the Plains Cree Indians before he produced his influential Language in 1933.

Earlier scholars in Europe had focused on Indo-European languages which were recorded in abundant documents, and which were essentially similar to one another in structure. By taking off from the American Indian languages, Sapir and Bloomfield found a new linguistic perspective. For in American Indian communities the dominant and in some cases the only form of language was spoken; most of their languages had no written form, and when they had, often there were no grammars and no available texts. To study these languages, then, inevitably meant reconstructing patterns of meaning from the actual sounds of speech. Such studies of “primitive” spoken languages revealed that contrary to the snobbish clichés which insisted that peoples without an elaborate literature had no more than a few hundred words in their vocabulary, each of these languages was found to contain upwards of 20,000 words. The anthropologist A. L. Kroeber found 27,000 words, for example, in the spoken vocabulary of the Aztec Nahuatl. Spoken languages, then, proved to have an unsuspected copiousness and flexibility.

Primarily from study of these exotic, nonwritten languages grew the new science of structural linguistics. During World War II, structural linguists proved the validity of their approach by making it the basis of speedy new ways to teach non-European languages in training programs for interrogating officers and occupation forces.

Although only a small number of Americans had even heard of the new linguistic science, within two decades it would become the dominant influence on the teaching of the American language in schools. The effect, the linguist Mario Pei explained, was “an excessive reverence for the spoken tongue and a corresponding disregard of, not to say contempt for, written versions, even where these exist and are of value in tracing a language’s nature, history and affiliations; and excessive regard for the phonetic portion of the language and a corresponding neglect of its semantic role; and a tendency to accept as the standard form of a tongue that which is common to the bulk of uneducated native speakers, rather than that which is consecrated by tradition as ‘good usage.’” If the science of structural linguistics was new and remained esoteric, its dogmas were at one with the folk wisdom of democracy.

About the same time that scholars were elaborating their solemn new science a powerful voice spoke with the biting self-assurance of a prophet. H. L. Mencken, a Baltimore-born journalist, reached a wide audience from the pages of the American Mercury (1924–33), where he attacked the American “Boobocracy.” Setting a vigorous example with his own richly colloquial style, Mencken had become a pioneer scholar of the American language. By 1910 he was collecting examples of the differences between the British and the American language. With the coming of World War I, when Mencken dared to doubt whether defeating Germany and hanging the Kaiser would make the World Safe for Democracy, he took refuge in his studies of the language, which became a monumental work, The American Language (3 vols., 1919–48). There Mencken provided a storehouse of information that gave courage to champions of popular American speech, and he pleaded for the full-bodied colloquial language against the skimmed milk of the schoolmarms. The American language, Mencken’s first volume prophesied, was diverging so rapidly from the English language that the speakers of either one would soon be unintelligible to the speakers of the other. Twenty years afterward he observed that the language of England was becoming nothing more than a dialect of American. Mencken’s work, contradicting the dogma of the stupidity of the masses which he had preached in the American Mercury, celebrated the peculiar democratic grandeur of the language. An American Rabelais, he pilloried obscurantism and pomposity as energetically and as wittily as he championed learning. His American Language, widely bought but not widely enough read, was a vivid if cumbersome American saga: the hero was the whole motley churning American people using words as their weapons in perpetual battle with a novel reality.

Mencken himself became a one-man Institute for the Study of American. He enticed American scholars of “English” to abandon the clichés of English literature for the study of their own spoken language. He helped plan the Linguistic Atlas of the United States, was a founder of the Linguistic Society of America, and joined in establishing a lively new journal called American Speech. Following Mencken’s lead, others built monuments of scholarship such as A Dictionary of American English on Historical Principles (W. A. Craigie and James Root Hulbert, eds., 4 vols., 1938–44) and A Dictionary of Americanisms on Historical Principles (Mitford M. Mathews, 2 vols., 1951), proving beyond pedantic doubts the delightfully special character of the language of Americans.

THESE NEW VIEWS of language in general and of the American language in particular were developing at a time when the task of the American language teacher, especially in the high schools, was being transformed. In 1900, of Americans in the age group 14 to 17, only about one in ten was attending high school, and of those aged 18 to 21, only one in twenty-five was attending college. By 1920 the enrollments of high schools had quadrupled; by 1930 more than half the children aged 14 to 17 were in high school. The proportion of Americans aged 18 to 21 going to college had become one in ten, and was going up. The democratizing of the high school, an easily forgotten American achievement of the twentieth century, brought into the classroom more and more students whose parents had had only a scanty formal education. As a result, the linguist Raven I. McDavid, Jr., explains, “ever larger proportions of those in the classroom have come from homes where the traditional values of humanistic education are of little importance, where even the standard language is a foreign idiom. To cope with the new situation created by the new clientele, especially in high school and college, new theories of language analysis have been introduced; old theories of grammar and usage … have been continually reexamined in the light of new evidence and of insights provided by other disciplines.” As the schoolroom was expected to perform a remedial function, the temptations were increased for insecure, upward-mobile teachers to impose “Rules of Good English” on their insecure, upward-mobile students.

But there was also the Democratic Temptation—to flatter the people by assuring them that whatever they were already doing was right and best. Teachers sought to relieve their new students of feelings of inferiority by suggesting that perhaps the language they heard at home was not actually “incorrect” after all. This temptation became increasingly potent as the century wore on. Before the century was out, some who still called themselves “teachers” of language thought they could bolster the egos and reduce the aggressions of their underprivileged Negro students by validating “Black Language” and so relieving them of the need to learn Standard English.

The democratic argument was summarized by an official of the Modern Language Association writing in 1968:

It is no exaggeration to say that linguistic naiveté on the part of the teacher of English in the urban school contributes to rioting in the streets and a hostility between community and school which the United States cannot afford in the next decade. Teachers must begin by dispossessing themselves of linguistic myths: Southerners have lazy speech habits; Boston English is “purer” than Bronx English; “ain’t” is a mark of linguistic inferiority. The English teacher must admit that a student’s ability to spell or punctuate, to write or recognize a compound-complex sentence or a 200-word paragraph which has unity and coherence is less important than his learning to speak openly and honestly, to listen well, to read many kinds of books and magazines and newspapers, and to write what he believes and thinks and feels.

These streams—the new science of linguistics, the new data of American language habits, and the currents of democratic feeling—became a flood which overwhelmed the schoolmarms.

The structural linguists, appalled by the pomposities and pedantries of the pedagogues, were determined to be more than the detached classifiers of language habits. In their vocabulary, “grammar” became a synonym for obscurantism. Although one of their manifestoes, by Robert A. Hall, Jr., in 1950, was entitled Leave Your Language Alone!, they would not leave the grammarians alone. In fact, they launched a crusade against the grammarians which was probably more widespread and more effective than any comparable movement outside the totalitarian countries in changing popular attitudes to institutions.

The brilliant Charles Carpenter Fries, a professor at the University of Michigan, pointed the morals of linguistic science for schoolteachers of English. His two influential books aimed to discover the “structure” of the American language, not from the grammarians’ rules, but from how ordinary Americans actually wrote and spoke. American English Grammar (1940), nearly twenty years in preparation, was the product of Fries’s study of three thousand letters written by families of servicemen to a branch of the War Department during World War I. Having finally secured official permission to use these letters (provided he omitted all names of persons and places), he made them the raw source of his new terminology for “the grammatical structure of present-day American English with special reference to social differences or class dialects.” To identify the social class of each writer, Fries used the sworn biographical statements in the War Department files.

Financed by the National Council of Teachers of English, this study by Fries aimed to tell schoolteachers how to improve their instruction in English. His book insisted on a “scientific” point of view. “Instead of having to deal with a mass of diverse forms which can be easily separated into the two groups of mistakes and correct language,” Fries explained, “the language scholar finds himself confronted by a complex range of differing practices which must be sorted into an indefinite number of groups according to a set of somewhat indistinct criteria called ‘general usage.’” Fries therefore banished such terms as “mistake,” “correct,” and “error” from his vocabulary, and he urged all English teachers to do the same. After offering his own scheme to describe the structure of actual usage, he concluded that the “study of the real grammar of Present-day English” had actually never been found in the schools. He urged English teachers to change their methods in order to teach students the everyday practices of their community, aiming finally “to stimulate among our pupils observation of actual usage and to go as far as possible in giving them a practical equipment for this purpose.”

Fries’s next book, The Structure of English (1952), was based, significantly, not on written but on spoken usage. To be sure that the language he was studying would not be stilted or self-conscious, Fries secretly recorded fifty hours of the ordinary conversations that went over the two telephones in his house. The recordings, made before the law that required a warning beep on tapped phones, were taken without the knowledge of the speakers. In this book he discarded the whole old-fashioned vocabulary of “nouns,” “verbs,” etc., displacing them with a novel antiseptic system of his own: Class I, Class II, Class III, and Class IV, and “Function Word” for leftovers. His scheme was constructed simply by classifying the word structures that he had heard actually spoken on the telephone.

This system, according to Fries and his allies, at long last made it possible to modernize the language classroom, and so take schoolchildren out of what they called “the pre-scientific age.” They prepared new textbooks and teachers’ guides: “English Grammar” now became “Patterns of English.” The new textbooks abandoned the old idea of grammar as a set of commandments for the well-educated. As the teacher’s guide to one of the most successful of the new textbooks explained in 1956, Paul Roberts’ Patterns of English “does not dwell on sentence errors as such. It is, or tries to be, purely descriptive, and descriptive of good writing rather than poor writing.” Roberts began by reminding the student that the language he spoke was in some ways “the most important language in the world,” since it was spoken by some three hundred million people spread around the world; but he quickly added that “among all these speakers of English there are no two people who speak it exactly alike. We have all noticed how we can recognize our friends’ voices on the telephone, even before they tell us who is speaking.”

The new point of view was illustrated by Roberts’ approach to one troublesome problem of usage:

Probably some educated speakers of English live long and happy lives without ever letting the word [“whom”] pass their lips. Whom is strongest, of course, in Choice Written English, where it is used regularly according to handbook precepts. It is used more sparingly in Choice Spoken English; many radio announcers, for example, avoid it altogether; possibly they feel that the average radio audience would find it too hoity-toity. In General Written English it is avoided more often than not, and it is seldom heard in General Spoken English.

Those who avoid definite whom do so in two ways. First, they may use the nominative form, even though the pronoun is an object:

                     a fellow who we used to know

                     a girl who I used to go with

Or, and this is more common, they simply omit the relative:

                     a fellow we used to know

                     a girl I used to go with.

This new attitude toward language carried contempt for the grammarians’ knowledge, which was now labeled superstition. “The methods and philosophy of grammar teachers are of as little moment to linguists,” declared a leader of the new movement, “as the horoscope-casting methods of astrologers are to astronomers.”

While the new linguists took a ruthlessly descriptive view of the language, they knew, of course, that educated Americans had been indoctrinated at school with the canons of traditional linguistic respectability. When the new linguists were told that “bad grammar” might prevent a person from getting ahead in the world, they did not disagree. But they insisted that this was very different from making the rules for getting ahead into a linguistic ethic. “It is not correct,” they argued, “… to tell a boy who says ‘I didn’t see no dog’ that he has stated he did see a dog. His statement is clear and unequivocal. What we can tell him is that he has made a gross social faux pas, that he has said something that will definitely declass him…. If you say it ain’t me instead of it is not I, or I seen him instead of I saw him, you will not be invited to tea again, or will not make a favorable impression on your department head and get the promotion you want … [But] in itself, and apart from all considerations of social favor, one form of speech is just as good as another.” Speaking too “correctly,” they warned, might also have its penalties, for example in leading to overcharges “from such relatively uneducated but highly practical citizens as plumbers and garagemen.”

Still, the new linguists took the classroom by storm. The decline of the teaching of Latin had made the victory easier. Patterns of English and similar iconoclastic texts began to displace the traditional English Grammar. Parents who had learned “a noun is the name for a person, place, or thing,” and who could never forget the drudgery of diagraming sentences, found their children saying “a noun is a word that patterns like apple, beauty, or desk” and thinking more about the way people actually spoke.

THE UNCOMPROMISING DESCRIPTIVE point of view was reflected more slowly in the dictionaries. For while the dictionary makers themselves, in contrast to grammarians, had tended to be more realistic in their approach to language, the dictionary users had a sanctimonious reverence for dictionaries such as they had never felt for grammars. Throughout life, people were accustomed to “look it up in the dictionary” to see what a word “really meant” and to find out whether they were “right” or “wrong.” Besides this, compiling or even revising a large dictionary was enormously expensive, and required years of work. No prudent publisher was apt to invest several millions of dollars on a book that was faddish or that would not command continuing respect from the learned world.

In 1961, both the power and the novelty of the new point of view were dramatized by the appearance of an unabridged dictionary which was ruthlessly descriptive and speech-oriented. Perhaps never before had a product of the green-eyeshaded lexicographers been so explosive. On September 27, 1961, the G. & C. Merriam Company of Springfield, Massachusetts, published Webster’s Third New International Dictionary. The 2,720-page volume weighed thirteen and a half pounds and sold for $47.50. It was the first completely new Merriam-Webster unabridged dictionary in twenty-seven years. Of course there were many other dictionaries, but the Springfield publishers had managed to hold on to the kudos of the famous Noah Webster, who had died in 1843. When people said, “Look it up in Webster,” they usually meant “look it up in the latest unabridged Merriam-Webster.” This new edition, according to the publishers, was the product of a permanent staff of more than a hundred specialists and hundreds of consultants, and had cost more than $3.5 million. The main vocabulary section, based on a file of 10 million citations, offered 450,000 entries, including 100,000 new words or new meanings never before found in the unabridged Merriam-Webster.

In the language of public relations, this was definitely a Publishing Event. Publication date was preceded by numerous teaser news stories for background. The book was launched, the publishers boasted, with “the greatest concentration of advertising ever used by any publisher to promote any single book.” It was greeted by front-page stories and leading editorials in daily newspapers, followed by articles in general-circulation weeklies and literary reviews. With few exceptions these were a barrage attack. “WEBSTER’S LAYS AN EGG,” announced the Richmond News Leader. “NEW DICTIONARY, CHEAP, CORRUPT,” declared the Rt. Rev. Richard S. Emrich in the Detroit News, in a review that explained how the new Webster’s expressed “the bolshevik spirit.” The assistant managing editor of the New York Times circulated a memorandum to his staff informing them that the editors of the news, Sunday, and editorial departments “have decided without dissent to continue to follow Webster’s [earlier] Second Edition for spelling and usage.” And a Washington Post editorial exhorted readers: “KEEP YOUR OLD WEBSTER’S.”

The prim Atlantic Monthly reviewer labeled the new Webster’s “a fighting document. And what the enemy is out to destroy is every obstinate vestige of linguistic punctilio, every surviving influence that makes for the upholding of standards, every criterion for distinguishing between better usages and worse.” This, the most important event in American linguistic history in the mid-twentieth century (as many called it), was declared to be “a very great calamity.”

These expressions of outrage from respectable journalists and editors of literary reviews were a measure of the unnoticed revolution in the thinking of the scientific linguists who had already dominated the American classroom. Most of these reviewers, of course, had been raised on the old “right-or-wrong” school of grammar and themselves knew little or nothing about the new linguistic science. While the influential journalists, preachers, and popular reviewers deplored the product of the new linguists, they had hardly noticed that their own children were being raised on Patterns of English. The new linguistics had conquered the classroom in a secret victory.

But most linguistic scientists enthusiastically approved the new Webster’s. For few critics could deny that the new Webster’s was an accurate, up-to-date, and comprehensive description of American usage. What disturbed most of the lay critics was that this “authoritative” work deliberately failed to provide the guide to “right” and “wrong” usage of their language. Wasn’t that what they had always wanted from Webster’s? Even in a democracy, these educated Americans still believed, they ought to be able to use their dictionary, their sacred Webster’s, as an authority. And even if the community had become confused on all other subjects, in language at least there ought to remain some agreement on “right” and “wrong.”

The new Webster’s was so heavily colloquial in point of view that it had actually abolished the distinguishing label “colloquial.” And it made little use of the familiar stigmata—slang, cant, facetious and substandard—which the earlier Webster’s had applied to words not in the standard written usage of educated Americans. The new Webster’s not only defined 100,000 words which were not found in the earlier edition but was extremely permissive in admitting to its list, without marks of reproach or disparagement, such words as “ain’t” (which it said was “used orally in most parts of the United States by cultivated speakers”), “wise up,” “get hep,” “ants in one’s pants,” “hugeous,” and “passel” (for “parcel”). The illustrative quotations, culled from the ten million in their file, came not only from statesmen and famous authors but from movie stars, night-club entertainers, baseball players, and boxing promoters; not only from Winston Churchill, Dwight D. Eisenhower, Edith Sitwell, Jacques Maritain, and Albert Schweitzer but also from James Cagney, Ethel Merman, Burl Ives, Willie Mays, Mickey Spillane, Jimmy Durante, Billy Rose, and Ted Williams.

Within a few years other American dictionaries came on the market which took some account of the new linguistics but still offered readers a semblance of the “authority” they were looking for. In 1966 appeared The Random House Dictionary of the English Language, a 2,096-page volume, the product of eight years of work by 150 editors and 200 outside consultants. It was intended to compete with the new Webster’s but it sold for only $25, and so aimed at a wider market. This work, the publishers explained, had been shaped by extensive surveys of teachers, professors, librarians, and journalists, the very groups that had cried “Murder!” to the new Webster’s. Avoiding the New Linguists’ temptation to be “so antiseptically free of comment that it may defeat the user by providing him with no guidance at all,” this dictionary preserved tradition at least to the extent of informing its readers of “long-established strictures in usage.” For example, “ain’t” was stigmatized as “nonstandard” and the reader was warned that “it should be shunned by all who prefer to avoid being considered illiterate.” At the same time, “to have ants in one’s pants” was labeled “slang,” and then illustrated by “She had ants in her pants ever since she won that ticket to Bermuda.” Instead of drawing illustrative quotations from athletes, entertainers, and other not-necessarily-educated celebrities, and so offending the literati—or from highbrow authors and scholars, and so proving they were out of tune with the new linguistics—the quotations offered had been made up by the editors themselves. Here was a compromise, the editor explained, “a linguistically sound middle course” between the antiquated “authoritarian” point of view and the futuristic “descriptive” approach. But this dictionary, too, was permissive, plainly dominated by the living, spoken language. When “usage” was prescribed, it was only in order to make the book “fully descriptive.” The Random House Dictionary, partly because of its differences from the new Webster’s, was an enormous commercial success.

Soon another, smaller dictionary succeeded by taking still another step backward toward the traditional “right-or-wrong” approach to language. In 1969 the editor of the new American Heritage Dictionary of the English Language announced that the publishers, with a “deep sense of responsibility as custodians of the American tradition in language as well as history…. would faithfully record our language, the duty of any lexicographer, but it would not, like so many others in these permissive times, rest there. On the contrary, it would add the essential dimension of guidance, that sensible guidance toward grace and precision which intelligent people seek in a dictionary.” The editors were freer than their predecessors in applying the stigmas of “slang” or “vulgar,” but they too sought to avoid the stigma of pretending to be authoritative. In an ingenious device of democratic acquiescence they made their “Usage Notes” the product of a kind of opinion poll. In these Notes, a “Usage Panel” of about a hundred members, which happened to include many of those very people who had been so uncharitable in their judgments of the new Webster’s, reported their opinions of dubious words. After “ain’t” (labeled “nonstandard”) the entry noted that the word was “with few exceptions … strongly condemned by the Usage Panel…. ain’t I is unacceptable in writing other than that which is deliberately colloquial, according to 99 per cent of the Panel, and unacceptable in speech to 84 per cent.”

THE SHREWDEST HISTORICAL JUDGMENT on the development of the American language was probably the one reported in Business Week: the new Webster’s “was right” but had appeared some thirty years too soon. For the spoken language had long been establishing its sway. The new linguistic scientists were not bolsheviks or anarchists, nor were they conscious opponents of the good, the true, and the beautiful. Neither were they bad scientists. What the new Webster’s had revealed was a long-accelerating revolution, far wider and deeper even than the linguistic conservatives had imagined. In linguistic science, as in countless other surprising ways, Americans were using their new techniques simply to describe the way the world really was. A world so visibly speedy, so obviously kaleidoscopic, was newly interesting to describe, and newly difficult to prescribe for.

Everywhere one looked in the United States in the twentieth century, today’s hastening world seemed to make irrelevant the traditional standards of how the world ought to be. The triumph of science and democracy was not yet total—at least in the world of language. People remained nostalgic for the language ways of the “best” speakers. But the attempt of prudent dictionary makers to keep alive an authoritarianism, however mild and enlightened, was only a rear-guard action. The self-consciousness with which they aimed to “preserve” the language (while they bowed low to the new descriptive linguistics) hinted where the currents of history were running. When Americans looked for standards in so simple an everyday matter as their words, they no longer found an unambiguous authority.