In this reprint from the December 1983 issue of The Atlantic Monthly, Geoffrey Nunberg makes an impassioned plea for civility when grammarians assess a changing language.
Every one has noticed the way in which the Times chooses to spell the word "diocese;" it always spells it diocess, deriving it, I suppose, from Zeus and census. The Journal des Débats might just as well write "diocess" instead of "diocese," but imagine the Journal des Débats doing so! Imagine an educated Frenchman indulging himself in an orthographical antic of this sort, in face of the grave respect with which the Academy and its dictionary invest the French language! Some people will say these are little things; they are not; they are of bad example. They tend to spread the baneful notion that there is no such thing as a high, correct standard in intellectual matters; that every one may as well take his own way; they are at variance with the severe discipline necessary for all real culture; they confirm us in habits of wilfulness and eccentricity, which hurt our minds, and damage our credit with serious people. -- Matthew Arnold in "The Literary Influence of Academies," 1865
IS the English language -- or to put it less apocalyptically, English prose writing -- really in a bad way? How would one tell? The standard jeremiads of the Sunday supplements give only anecdotal evidence, and that of a curious sort; the examples of degradation that they present are drawn not from current plays or novels, which are grammatically and syntactically extra judicium, but from advertisements, scholarly papers, and -- most popular of all -- memos from college deans. It is hard to believe that any of these texts will survive even until the next century, much less that late-twentieth-century English will be judged by their example. Our picture of the English of previous centuries, after all, has been formed on the basis of a careful selection of the best that was said and thought back then; their hacks and bureaucrats are mercifully silent now.
But while it is understandable that speakers of a language with a literary tradition would tend to be pessimistic about its course, there is no more hard evidence for a general linguistic degeneration than there is reason to believe that Aaron and Rose are inferior to Ruth and Gehrig.
Most of my fellow linguists, in fact, would say that it is absurd even to talk about a language changing for the better or the worse. When you have the historical picture before you, and can see how Indo-European gradually slipped into Germanic, Germanic into Anglo-Saxon, and Anglo-Saxon into the English of Chaucer, then Shakespeare, and then Henry James, the process of linguistic change seems as ineluctable and impersonal as continental drift. From this Olympian point of view, not even the Norman invasion had much of an effect on the structure of the language, and all the tirades of all the grammarians since the Renaissance sound like the prattlings of landscape gardeners who hope by frantic efforts to keep Alaska from bumping into Asia.
The long run will surely prove the linguists right: English will survive whatever "abuses" its current critics complain of. And by that I mean not just that people will go on using English and its descendants in their daily commerce but that they will continue to make art with it as well. Yet it is hard to take comfort in the scholars' sanguine detachment. We all know what Keynes said about the long run, and in the meantime does it really matter not at all how we choose to speak and write? It may be that my children will use gift and impact as verbs without the slightest compunction (just as I use contact, wondering that anyone ever bothered to object to it). But I can't overcome the feeling that it is wrong for me to use them in that way and that people of my generation who say "We decided to gift them with a desk set" are in some sense guilty of a moral lapse, whether because they are ignorant or because they are weak. In the face of that conviction, it really doesn't matter to me whether to gift will eventually prevail, carried on the historical tide. Our glory, Silone said, lies in not having to submit to history.
Take Modern English Usage, by that good man H. W. Fowler, "a Christian in all but actual faith," as the Dictionary of National Biography called him. Despite a revision in 1965, it is out of date, yet it still has a coterie as devoted as the fans of Jane Austen or Max Beerbohm, who prize its diffident irony, its prose cadences, and, above all, the respect it shows for its readers' intelligence and principles. Here, for example, is Fowler on the insertion of quotation marks or an expression like "to use an expressive colloquialism" to mark off a slang word from which the writer wants to dissociate himself:
Surprise a person of the class that is supposed to keep servants cleaning his own boots, & either he will go on with the job while he talks to you, as if it were the most natural thing in the world, or else he will explain that the bootboy or scullery-maid is ill & give you to understand that he is, despite appearances, superior to boot-cleaning. If he takes the second course, you conclude that he is not superior to it; if the first, that perhaps he is. So it is with the various apologies to which recourse is had by writers who wish to safeguard their dignity & yet be vivacious, to combine comfort with elegance, to touch pitch & not be defiled. . . . Some writers use a slang phrase because it suits them, & box the ears of people in general because it is slang; a refinement on the institution of whipping-boys, by which they not only have the boy, but do the whipping.
This passage would not be out of place in the company of Addison and Steele. It is apt, amusing, and above all instructive. It obviously has done little to stem the mania for quotation marks (WE ARE "CLOSED," I saw in the window of a shoe-repair shop the other day), but it did at least persuade me to remove the quotes from around the word life-style in a review I was writing, and I am a better person for it.
If we are bent on finding a decline in standards, the place to look is not in the language itself but in the way it is talked about. In the profusion of new books and articles on the state of the language, and in most new usage books, the moral note, if it is sounded at all, is either wavering or shrill. What is largely missing is the idea that there is any pleasure or instruction to be derived from considering what makes good usage good. Rather, grammar comes increasingly to be regarded as a mandarin code that requires only ritual justification. And, for all the heated polemics over the importance of grammar, it appears that each party at least implicitly accepts this view.
Linguists, of course, have been arguing for a long time that the rules of traditional grammar have no scientific or logical justification, and that the only reason grammarians consider certain usages "correct" is that they happen to have been adopted by the privileged classes in the past. As the linguists Anthony Kroch and Cathy Small put it in a recent article, "prescriptivism [that is, traditional grammar] is simply the ideology by which the guardians of the standard language impose their linguistic norms on people who have perfectly serviceable norms of their own." We will see that this view is not entirely justified. Nonetheless, the linguists have won over a large part of the educational establishment, so that "correct English" has come to mean no more than "standard English," the English spoken by the educated middle class. A few radicals have gone on to argue that traditional grammar, as an instrument of racism and class oppression, has no place in the school curriculum. But more often educators counsel an enlightened hypocrisy: standard English should be taught because there are still benighted employers who take stock in such things. This position is put concisely by Jim Quinn, whose American Tongue and Cheek is a lively and informative popularization of the linguists' views.
The fact remains that there is a way of writing that is necessary to success, just as there are rules about which fork to use at an expensive restaurant. And preparing children for success means preparing them to manipulate those rules, just as they have to be taught to manipulate the salad fork and demitasse spoon.
To see the state to which things have fallen, one need only compare Fowler with a modern composition text and a modern prescriptive grammarian on a vexed point of grammar -- the problem of which pronoun to use with an antecedent like each or anyone. Here is Fowler:

Each & the rest are all singular; that is undisputed; in a perfect language there would exist pronouns & possessives that were of as doubtful gender as they & yet were, like them, singular; i.e., it would have words meaning him-or-her, himself-or-herself, his-or-her. But just as French lacks our power of distinguishing (without additional words) between his, her, and its, so we lack the French power of saying in one word his-or-her. There are three makeshifts: -- A, as anybody can see for himself or herself; B, as anybody can see for themselves; & C, as anybody can see for himself. No-one who can help it chooses A; it is correct, & is sometimes necessary, but it is so clumsy as to be ridiculous except when explicitness is urgent. . . . B is the popular solution; it sets the literary man's teeth on edge & he exerts himself to give the same meaning in some entirely different way if he is not prepared, as he usually is, to risk C; but it should be recorded that the OED . . . refrains from any word of condemnation. C is here recommended. It involves the convention that where the matter of sex is not conspicuous or important he and his shall be allowed to represent a person instead of a man, or say a man (homo) instead of a man (vir). Whether that . . . is an arrogant demand on the part of male England, everyone must decide for himself (or for himself or herself, or for themselves).
Fowler's article is a model of the traditional grammatical method. He begins by acknowledging the problem, and then addresses it with arguments from precedent and analogy, being careful to distinguish between the grammatical questions that lie within his brief and the political questions that lie outside it. As in all good homilies, it is the method, not the text, that matters; read Fowler on this and you will have an idea of how he might come at a wholly different problem.
Now contrast the approach to the problem taken by the Harbrace College Handbook, a standard text in college composition classes since its publication, in 1941. Its great virtue is that it is ideally organized to meet the needs of a teacher who may have to correct two or three hundred pages of student writing every week. Inside the back cover of the book is a table in which grammatical errors are classified into family, genus, and species, and are assigned code numbers. The point at issue is listed as: "6b(1) Agreement: Pronoun and Antecedent: Man, each, etc. as antecedent." Whenever a student makes an anyone . . . they sort of error, the instructor need only write "6b(1)" in the margin, and the student is referred to the corresponding section of the text, in which this point of usage is explained. There he can read (I quote from the seventh edition and omit some example sentences):
In formal English, use a singular pronoun to refer to such antecedents as man, woman, kind . . . anyone, someone, and nobody. In informal English, plural pronouns are occasionally used to refer to such words.
Caution: Avoid illogical sentences that may result from strict adherence to this rule.
ILLOGICAL Since every one of the patients seemed discouraged, I told a joke to cheer him up.
BETTER Since all the patients seemed discouraged, I told a joke to cheer them up.
These bare instructions give no reason at all for choosing the singular pronoun. In fact, there is no mention of an error in the use of the plural, which is labeled not "incorrect" or "illogical" but merely "formal," as if the difference between plural and singular were on a level with the difference between cop and policeman, or horse and steed.
The entry does touch on a point that is quite interesting to theoretical linguists, to the effect that English grammar does not generally allow the singular pronoun with an antecedent like everyone when that antecedent is not in the same clause. But the Handbook says only that the sentence is "illogical," giving no indication of what point of logic is violated. What is the student to make of that, especially since the Handbook has not explained the use of the singular as being "logical" in the first place? The student who finds "6b(1)" cropping up on his compositions may learn to rectify the error, but only in the way he learns to rectify his misspellings: by rote, learning nothing else in the process.
The linguists are at least forthright in their rejection of linguistic morality. Their opponents, the defenders of traditional values, are more deceptive. They talk a great deal about morality, but in millenarian tones, as if the rules of grammar were matters of revealed truth rather than the tentative conclusions of thoughtful argument. Here is John Simon on the same point of grammar:
And don't let fanatical feminists convince you that it must be "as he or she pleases," which is clumsy and usually serves no other purpose than that of placating the kind of extremist who does not deserve to be placated. The impersonal "he" covers both sexes.
For Simon, the whole matter is cut and dried, exactly as it is for the Harbrace College Handbook, except that his world is divided into the "thickheaded" and "those who know better." That last phrase is particularly telling, not just in its appeal to a reader's smug self-satisfaction but also because it shows that for Simon grammar really is a matter of knowing the rules, not of working them out. Indeed, he has written elsewhere: "There is, I believe, a morality of language: an obligation to preserve and nurture the niceties, the fine distinctions, that have been handed down to us." That is the credo of a czarist émigré, not an English grammarian. Johnson and Fowler did not regard themselves as mere keepers of the sacred flame.
Simon's shots at feminists are also instructive. For him, a commitment to correct grammar is naturally associated with a conservative ideology. Like William Safire and William Buckley, he seems to see good grammar as bathed in the same rosy glow that surrounds the other traditional institutions that liberal America has forsaken. This indicates a shift of some importance. For most of its history the English grammatical tradition has been associated with classical liberalism. Its earlier defenders, from Johnson to Auden and Orwell, would probably be distressed to learn that their standard had been taken up by the right. But, then, the ideal of grammar that the conservatives champion is much changed from what the earlier grammarians had in mind.
Simon is particularly shrill, but other writers on the state of the language are equally dogmatic. Edwin Newman and Richard Mitchell (the "Underground Grammarian") write books about the language that rarely, if ever, cite a dictionary or a standard grammar; evidently one just knows these things. William Safire is a different story. Affable and self-effacing ("I may not know much about grammar, but . . ."), he brings out of the woodwork readers who are less frequently snobs than enthusiasts, who exchange with him schoolmarm maxims and scraps of linguistic folklore. These are word-lovers who live to catch out the mighty in a misused whom; though their zeal is commendable, their authority is suspect.
There is nothing in modern writing about the language that is more pathetic than attempts to fix the blame for the "problem" (whatever the problem is understood to be) on this or that small group. If the English grammatical tradition has declined, this is the result of basic changes in our attitude toward the language, themselves the consequences of far-reaching social changes. It is not a case of the schools having "failed in their duty." As Richard Lanham argues in his provocative book Style: An Anti-Textbook, "You cannot teach as duty what society does not feel a duty." Neither are the linguists responsible. Their criticisms of the grammatical tradition are overstated, we will see, but they are much closer to the mark when they describe the contemporary scene, for the mastery of grammar has come to be considered largely a social accomplishment. And the traditionalists like Simon and Newman are even less to blame; they are simply moving into the cultural vacuum.
Before we can talk about how to put grammar back on its moral and intellectual feet, we must consider what grammatical criticism has been all about in the English-speaking world, and how we have come to the present sad state of affairs.
One reason for the canonization of clarity and logicality is that for us the notion of good usage is applicable only to the narrow class of writing that we call expository prose. Novelists and poets simply aren't held to the rules of grammar; what they do is "creative writing," a thing apart. But, again, such a sharp distinction is peculiar to modern English; in Italian and French--as well as in the English written before the nineteenth century--the language of poetry is not exempt from the requirements of correctness, unless the poem has been expressly written in dialect.
This disregard for the grammar of poetry and fiction is connected with another curious feature of English-language values: unlike speakers of most Continental languages, we do not hold that there is a single "correct" accent, and we permit each area to set its own pronunciation standards. The New Yorker who drops his "r"s (or the Englishman who pronounces his) may be looked down on, but he is guilty only of a social gaffe, like the man who wears a polyester leisure suit. It is inconceivable that a New York City teacher would tell his pupils that pronouncing "horse" to rhyme with "sauce" is "not English," in the way that a Tuscan teacher might tell his pupils that it is "not Italian" to pronounce "casa" as "hasa." Likewise, we can't imagine that in America the accents used by newscasters might become a matter for heated public discussion, as they have at times in Germany and Italy.
Our linguistic values, being so particular to English, are by no means absolute or immutable. They must change, as they have already changed, along with the social composition of the English-speaking world. It was because of sweeping social changes in the eighteenth century that our present system of values arose. The new values were created in part by the rise of the middle class, with a corresponding increase in literacy; in part by the importation of the German-speaking Hanover court; and in part by the new conception of English as the language that extended over the whole of Great Britain and then the colonies. An immediate effect of these changes was the emergence of a new intellectual class, independent of aristocratic patronage, which came to cultural authority. As the critic Leo Braudy has pointed out, the members of this group were largely outsiders: Scots like Hume, Smollett, and Adam Smith; Irishmen like Burke, Goldsmith, Steele, and Swift; Catholics like Alexander Pope; and middle-class provincials like Gay, Johnson, and Sterne. If they did not manage to, as they put it, "ascertain" the English language in a fixed form for all time, they did at least succeed in establishing the linguistic ground rules that would hold for the next two hundred years.
Unlike their Continental contemporaries, the English grammarians rejected the notion that national institutions should have any role in determining the models of correct usage. In 1712, Swift had seconded earlier suggestions by Dryden and others that an English academy be established, on the model of the Italian and French ones, and only the fall of the Tory government prevented his plan from being realized. By mid-century, however, the idea was generally opposed, as inconsistent with what Johnson called "the spirit of English liberty." Johnson's Dictionary, in fact, was widely hailed as a vindication of the superiority of free institutions over Continental absolutism. As Garrick wrote:
And Johnson, well armed like a hero of yore,
Has beat forty French, and will beat forty more.

From its inception, then, the modern doctrine of good usage was associated with progressive ideals.
For the grammarians of the Age of Reason, the advantage of literary models was that their superiority could be defended by appeals to logic and sensibility. For the first time, a distinction was made between those parts of grammar that could be rationalized--diction, syntax, and the like--and those parts, like pronunciation, that were left to be ruled by arbitrary fashion. Modern techniques of grammatical argument were introduced in this period: justification by logic, by literary precedent, by analogy, and by etymology. In fact, a good many of the specific dictates of prescriptive grammar were introduced then. It may be either consoling or disheartening to realize that grammarians have been railing for more than two hundred years against usages like It's me, the tallest of the two, and the man who I saw, with no sign of a resolution either way. (The grammarians have won some battles over the years -- most notably against the innocuous ain't, which educated speakers now use only in a jocular way. They have lost others, such as the fight to maintain a distinction between shall and will, which never really caught on outside of England.)
The basic linguistic values established in the eighteenth century were rarely challenged over the next hundred and fifty years. (In Jacksonian America, there was a brief reaction against the imposition of Old World grammatical values, but this was little more than a provincial rebellion, and it subsided, with the rest of such populism, by the Gilded Age.) It was not until the 1920s and 1930s that the traditional doctrines were rejected by a significant part of the cultural elite. In the forefront of the attack were the "structural linguists," as they then styled themselves. The battle culminated in the brouhaha over the publication in 1961 of Webster's Third New International, which refused to label usages like ain't and to contact as incorrect or even colloquial. Despite the fulminations of Dwight Macdonald and Jacques Barzun (and Nero Wolfe, who burned his copy page by page), the linguists succeeded in convincing most of the educational establishment of the rightness of their views. But they could not sway the body of educated public opinion; hence the cold war that endures to this day.
Defenders of the grammatical old order often speak of the linguists as a cabal of intriguers who have singlehandedly undermined traditional values. Jacques Barzun wrote that "modern linguists bear a grave responsibility. . . for the state of the language as we find it in the centers of culture," and Wilson Follett (writing in The Atlantic) referred to the editors of Webster's Third as "the patient and dedicated saboteurs in Springfield." Like most other conspiracy theories, this one is a little paranoid. The linguists could have had so wide an effect on the attitudes of educators only by addressing areas of general concern. In fact, the linguists based their attack on two sound points that appealed to the public's growing respect for science and increased awareness of cultural pluralism. First, every language is a complex system with an internal logic, the full understanding of which requires scientific investigation. And second, since nonstandard forms of English possess internal logic just as standard English does, they are not inherently inferior; rather, the doctrines of prescriptive grammar reflect covert class prejudice and racism.
It may be true that only those with technical expertise can begin to understand the workings of language -- and even to them, many of the basic issues remain as controversial as the causes of inflation. Still, it does not follow that the layman cannot decide for himself what is right and wrong. From the point of view of modern linguistics, Fowler knew very little about the mechanics of grammar, but he had exquisite intuitions about what sounded right and, more important, the capacity to reflect on these intuitions in a reasoned way. It is not important that he was unsuccessful in formulating general rules that would specify exactly when everyone must be followed by he, and when a plural verb should follow a collective noun. When grammar consists of nothing but such rules, in fact, it becomes frozen and useless, because there are always cases that the rules do not cover, or in which two rules contradict each other. (Should we say We have each taken his coat or . . . our coats? It is he whom I was going to see or . . . him whom . . . ? Only the ear knows.) What Fowler does teach is an approach to grammatical problems that can be cranked up anew for each situation. In the end, that is what all good writers rely on. Linguistics can help here; it provides a language for talking about language (a "meta-language," in the trade), which is much more precise than the mysterious and dimly remembered classifications of traditional grammar. (Terms like "predicate nominative" may have a limited applicability to Latin, but they were not very useful for talking about English even in the days when educated speakers were presumed to have some familiarity with the classical languages.) But we should no more ask linguistic scientists to tell us what sounds best than we should ask economists to tell us which distribution of property will be fairest; those matters are for us to decide.
Minority and working-class children must, as a purely practical first measure, learn the speech habits that happen to have been adopted by the middle class. What linguists have not understood, however, is that standard English and good English are different things. It is still possible to distinguish in principle between the crass hypocrisy that leads us to try to get ghetto children to talk like the children of stockbrokers and the higher and more arduous calling that leads us to try to get the children of stockbrokers to write like James Baldwin. But it is only so long as we bear that distinction in mind--so long as attacks on the "slovenliness" of ghetto English still raise our hackles--that we are entitled to try to resuscitate the enterprise of grammar with a clear conscience.
We can revive grammatical values only if we can make them consistent with our other social values, so that we can argue from moral principles that are as unexceptionable and familiar to us as the etiquette for addressing servants was to Fowler's original readers. There is no point in my trying to justify a usage to my students by an appeal to some musty, Arnoldian ideal of culture. They will listen politely and forget the whole matter before they log onto the computer to write their next term papers. They have nothing to be nostalgic about, and I can't make a good case for any sort of attention to grammar unless I am willing to accept the universe that they quite contentedly inhabit.
In the first place, we have to recognize that literature has lost the kind of public importance that it had in Johnson's or Arnold's day. Although more people than ever are functionally literate, few are literate in the high sense of the word, and those who are can't expect other people to be. (There is an old joke about the days when the British universities were thoroughly corrupt, and a dissolute young aristocrat could pass his exams by answering the question "Now then, Lord Arthur, who dragged whom around the walls of what?" Nowadays, we would take a correct answer as fair proof of successful completion of a major in humanities.) It is not that people don't read novels and such. They do, probably more than ever, but only for the pleasure of it. There is no canon--no books that everyone expects everybody to have read, or to be able to pretend to have read--that can serve as a common reference point in discussions of social values. The effects of all this on the way we think about and use the language are by now irreversible.
Take the way we talk about character. People who used to be vain, wrathful, self-reliant, sullen, or driven are now narcissistic, hostile, secure, passive-aggressive, or obsessive (more or less). To know what the old words meant, we went to Jane Austen and Thackeray; to understand the new ones, we take psychology courses. (The extent to which the new order has become established was brought home to me by a sentence in a recent article in Commentary on delinquency. The author ridiculed the jargon with which social workers describe delinquents, then concluded: "In short, he is what the layman would call a sociopath.") Psychoanalysis is unlikely to be repealed; people are not going to go back to reading novels in order to understand themselves and their lives.
The declining importance of literature is tied as well to the changing role of written language as a medium of public information. When the eighteenth-century grammarians insisted that writing, not speech, must be the model for good usage, they were on solidly democratic ground, for a spoken model could be familiar only to a small group of people connected by personal ties and could be broadcast only inefficiently, through plays and sermons. Written models could reach a much larger public. But now, thanks to radio and television, the spoken language has once again become the medium of the broadest public discussion, especially as regards the political and civic aspects of our lives. If we hew to the same democratic principles that led the eighteenth-century grammarians to insist upon the primacy of writing, we will accord more importance to the spoken language. Writing is not about to wither away, to be sure. The doomsayers who see in television the death of literacy sound very much like the nineteenth-century critics who thought that photography would be the end of painting. But television and radio are now the principal means for disseminating political information to a large public. Consequently, it is increasingly important for people to know how to listen critically and how to evaluate spoken arguments. Yet here again, the public-school curriculum and grammar books are largely unchanged. Instruction in speaking and listening, where it is given at all, still tends to take as its model the Lincoln-Douglas debates, rather than The MacNeil/Lehrer NewsHour.
Nowadays, it is common to throw all linguistic vices into the same hopper. The typical state-of-the-language essay begins by citing a misuse of disinterested and then jumps to an example of bureaucratic jargon or of faulty verb agreement, as if each error consigned the writer to the same circle of hell. But the eighteenth-century grammarians were careful to distinguish among several different types of linguistic vices (and, by implication, of linguistic virtues). In particular, they set off barbarisms, expressions that could not legitimately be used in serious writing; solecisms, which were offenses against their ideas of logic; and improprieties, or mistakes in diction.
Barbarisms seem at first to have been a diffuse class, which included the use in polite discourse of foreign expressions, of archaisms, of "low cant" and "provincial idioms," and of the newly coined jargon of philosophers and theologians, a category later expanded to include the language of the sciences, both real and self-styled. What all barbarisms ostensibly had in common was that they offended against the idea that good usage must be "reputable, national, and present," as the great eighteenth-century rhetorician George Campbell put it. But some words that fail to meet these criteria have always escaped the epithet. It is only in our discourse about certain topics that we have objected to the importation of words from communities outside the general literate public. Take French expressions, until recently a matter of great concern. Campbell attacked the use of politesse, hauteur, and belles lettres; a hundred and fifty years later, Fowler ridiculed the use of jeu de mots, flâneur, and dernier ressort. But has anyone ever objected to à la carte, pas de deux, or ménage à trois? There are certain cultural categories that we insist on defining for ourselves, but when it comes to the arts of cooking, dance, and love, we readily defer to the authority of the French.
The barbarisms that concern modern critics are quite different from those that bothered Campbell or Fowler. No one now troubles over the use of archaisms like peradventure and anon, for the language is scarcely threatened by an unwholesome excess of gentility, as it was in the nineteenth century. And the use of an unassimilated French word is at worst regarded as an annoying affectation, in part because French culture is not as important a model as it once was, and in part because the practice is no longer intimidating--not only do people not know French but they are not even ashamed of not knowing French.
One type of barbarism that does rile modern critics is borrowings from technical usage--one aspect of the tendency to refashion the language on the model of scientific discourse. For the most part, the borrowing is natural and inevitable--what else would you call a minicomputer or a quasar? And no one objects to the use of terms from economics like money supply and productivity. But we raise the roof when bureaucrats and administrators introduce verbs like prioritize, source, implement, and input as if their procedures were as technically intricate and as inaccessible to common understanding as the workings of computers or the money market. And now that we justify linguistic values in political terms, we find no offense so heinous as the use of jargon by politicians, who call invasions "incursions" and tax increases "revenue enhancement"--the sorts of usages that so bothered Orwell.
Why is it all right for a politician to use capital-intensive, but not revenue enhancement?
Critics would be wise to say nothing about true slang, the special language of linguistically disenfranchised groups like the young, the minorities, and the underworld
The objection to the word "lifestyle" is that it is at too many removes from reality; in its contemporary usage are implied a number of assumptions about life that are belied by experience. Chief among these is an assumption about the absolute plasticity of character--change your lifestyle, change your life--that is simply not true; and the popularity of the word "lifestyle" is testimony to how much people want to believe it.
Epstein has hit on the sensibility that underlies the current use of life-style, but his argument leaves an important question unanswered: what is wrong with having the word? After all, there really are life-styles. Some people choose to have split-level houses and Ford station wagons, while others have condos with wet bars and drive Porsches, and if that isn't a difference of style, what is it? Would anyone object if we talked about "the Southern California style of living"? The difference is that in giving a one-word name to a category, we make it into a kind of primary concept, which is presumed to have a basic place in our overall scheme of things. That is the intimation about life-style that bothers Epstein--and me.
CRITICS have shown themselves to be relatively flexible in their resistance to barbarisms. If the word life-style manages to survive for another generation, it is unlikely that anyone will still be bothered by it. (Who is there under fifty who minds the use of contact and process as verbs, usages that set Wilson Follett's teeth on edge a mere twenty years ago?) By contrast, the canon of solecisms has remained largely the same over the years. These are violations of the rules of verb agreement, pronoun case, and so on, rules that are supposed to ensure that prose will be constructed logically. The word logic can mean several things. A discourse is logical if its conclusions follow from its premises, but that has nothing to do with the grammatical construction of its sentences. If it is logical, in this sense, to say Everyone who leaves after six will miss his train, it is equally logical to finish the sentence . . . will miss their train. The notion of logic at stake in the discussion of solecisms has to do with syntax, not sense: a plural subject is logically followed by a plural verb; two negatives logically make a positive; a pronoun following a form of the verb to be is logically in the nominative case. But linguists have been at some pains to point out that this sense of logic is a curious one, if only because, unlike our sense of what constitutes a well-formed argument, it varies from one language to another. Two negatives do not make a positive in French or Italian, for example, nor does French use the nominative case after to be. Would we really want to argue that these languages are less logical than English? And if not, why do we insist that there is something illogical about the English sentences It's me, or I didn't see none? What is more, we seem to apply the rules of logic somewhat selectively.
Surely it is illogical to say More than one student has failed the exam (or Fewer than two have passed), but what grammarian would try to fly in the face of established usage here?
The linguists are right in all of this, and right as well in saying that such rules serve little communicative purpose; It's me is no less clear than It is I. But it is precisely because these rules are arbitrary and difficult to observe that they may once have had a certain usefulness as exercises, for they force the writer to pay more attention to his syntax than mere communication would require.
To get your whoms in the right place, you may have to look half a dozen clauses down the pike
A better case can be made for some of the rules determining solecisms that operate within a single clause. Adherence to subject-verb agreement continues to be a good exercise, though it is important to note both that this rule too fails to serve real logic and that there are dialects of English (as well as many other languages) that do not bother with it. Still more useful are the rules and strategies that inarguably help to make a discourse coherent and easy to follow: the injunctions against, for example, dangling modifiers and the use of pronouns with vague or confusing referents. It's not just that the observation of these rules makes a text more "readable," as E.D. Hirsch, Jr., puts it, though that is reason enough for insisting on them. The rules also help to remind us of the differences between private and public talk. A student writes, "He watches Claudius praying, wondering whether to kill him"; the offending clause is underlined. "But it's clear what I meant!" And so it is. In friendly conversation, we scatter dangling modifiers at will and stick in a this or an it whose referent is nowhere spoken but is easily recovered against the familiar background. It is only in the neutral context of public discourse that we have to be careful and explicit. The extreme case is in the public print, where writer and reader are at an indeterminate remove from each other. Because we first learn our language almost entirely in intimate contexts, we have to make an effort to acquire the awareness needed for communication across contexts in which little can be taken for granted.
This takes us to an important observation about the value of grammatical rules of all kinds. Critics and grammarians spend most of their time talking about the obligations of the writer or speaker to the reader or listener. But most people never so much as write a letter to the editor, much less appear on Meet the Press. Why should we make such a fuss over the rules, then? Why not simply give special instruction to journalism majors and let the others off the hook? The fact is that instruction in public speaking and writing, particularly regarding clarity, is most important as an indirect way of teaching people to listen and read. Beyond the first steps of teaching children to sound out their letters, there is not a lot we can do to help people read better (which does not as a rule mean faster). Yet between the acquisition of the elementary skills and the ability to make sense of ordinary public prose there is an enormous gap--something that is just becoming clear to psychologists and linguists looking at literacy. For someone accustomed only to face-to-face communication and personal letters, it is virtually impossible to recover the sense of a magazine article written by somebody miles and months away. (The linguist Jerrold Sadock has noted the complexity of the inferences associated even with simple instructions like "Wet hair, lather, rinse, repeat," which we would paraphrase in ordinary English as "First you wet your hair, then you lather it with this stuff, then you rinse it off, then you lather again--your hair is already wet now, after all--and then you rinse the stuff off again.") The best way to learn the skills of reading and listening that everybody needs in order to participate fully in society is to imitate the skills of writing and speaking that only a few will ever have to practice for their own sake.
THE "improprieties" of traditional grammar are the usages that arise out of the natural drift of the meanings of words in the standard vocabulary. The list of them has changed over the years, though less rapidly than the list of barbarisms. The eighteenth-century grammarians objected to the confusion of ceremonious and ceremonial, and to the use of demean to mean "debase" rather than simply "behave"; their twentieth-century counterparts have seized on the confusion of disinterested and uninterested, the use of presently to mean "now," and the use of hopefully as a sentence adverb. What is at issue in such cases is the extent of the debt we owe to precedent. Obviously we are not bound to use the language just as it was used a hundred years ago, but neither is it in our interest to change the language willy-nilly if we want to ensure the continuity of our discourse. Faced with a particular change, then, we need rules of thumb. I submit that the two questions we ought to ask are: Does it involve any real loss? and Is there anything we can do about it?
Invoking the first of these criteria, we will lament the progressive loss of the disinterested-uninterested distinction. Unbiased and impartial will not do the work that disinterested used to be reserved for. But there is no point making a fuss about this change, because it was foregone that disinterested would lose its older sense once interested lost the sense of "having a stake in," which we retain only in the fixed phrase interested party. Even if disinterested had survived intact, therefore, it would eventually have become one of those curious asymmetric negatives like untoward and disgrace, whose senses are not recoverable as the sum of their parts. Invoking the second criterion, we should be prepared to admit that the fight on behalf of disinterested is a "lost cause," as Trilling described it. This may be an occasion for regret, but indignation would be out of place. Isaac Asimov writes, "I'm very proud of knowing the distinction, and insist on it, correcting others freely." The fact that being familiar with a distinction can be a cause for self-congratulation is, however, reason to eliminate it from the canons of standard usage, which should not be repositories of grammatical arcana.
I am more puzzled by the resistance to the newer use of hopefully, because here I cannot discern any loss whatsoever. Most of the members of the Harper Dictionary of Contemporary Usage panel who condemn it give no reason at all: Shana Alexander calls it "slackjawed, common, sleazy"; Phyllis McGinley says it is an abomination, whose adherents should be lynched; Harold Taylor says it makes him physically ill; and so on. Now, I do not doubt the sincerity of these passions, but I wonder what arouses them. Some say the problem is that the adverb doesn't modify the verb in a sentence like Hopefully, it will be done by Monday. But why does hopefully get singled out when the same point could be made about various uses of mercifully, frankly, happily, primarily, and dozens of other words? In fact, Follett has suggested that what is wrong with hopefully--and sorrowfully, thankfully, and others--is that they "lack point of view; they fail to tell us who does the hoping, the sorrowing, or the being thankful." But that would seem to be a virtue, which hopefully shares with a number of other, unexceptionable expressions (apparently, obviously, with luck, and alas, for example). As Robert Crichton puts it, "No one cares if I hope the war is over. . . . " Follett suggests that the "natural way to express what is meant" is it is to be hoped, which achieves the same impersonality at a cost of excruciating stiltedness. Obviously, I have missed something--there has to be a reason for all the vehemence--but I am sure that the point is too nice for the mass of American speakers who have adopted the new usage with no communicative ill effects.
I have spent time on these two cases because they seem to bring out the worst in language critics. But I would not want to claim that there are no improprieties worth bothering about. Take the often-remarked use of literally to mean "figuratively" (as in We were so bored we were literally climbing the walls). Unlike the recent problems with hopefully and disinterested, this tangle has been around at least since Fowler's time. If literally were going to shift its meaning, then, it would have done so long ago; the fact that it has hung on in its etymological sense is an indication that people are willing to reform their usage when the rationale is explained to them. Unlike disinterested, which has become opaque as the meaning of one of its parts has shifted, literally is opaque only because the form of its parts is hidden. Once the connection with letter is made, the correct usage makes perfect sense. In short, this is an example of an impropriety that both should and can be corrected, since people have continued to find the distinction worth making, and have gone on to make it. Not that the misuse of literally will ever vanish; the error is as natural as the tendency to correct it on reflection. I think it is not always sufficiently appreciated that the battles over grammar, like other battles for souls, are won at the individual level. It should be a source of satisfaction that the grammar books of a hundred years hence will be decrying to good effect the tendency to misuse literally, or to confuse imply and infer.
Some recent critics have been sensitive to the misuse of the particles we use: things like I mean, really, and you know
I find that I am struck by the misuse of these expressions only when I am listening to public discourse: radio call-in shows and TV interviews, for example. What is otherwise a natural appeal to a shared background is distressing in such contexts precisely because we can no longer take so much for granted: we don't know who the speakers are, as we do in face-to-face conversation, and we can't ask them for clarification.
It is only in recent times that we have had reason to criticize the abuse of particles like you know, with the rise of the once unimaginable genre of public conversation. Of course, the ultimate point of the criticism is not the improvement of the quality of talk-show contributions. Rather, just as an attention to avoiding dangling modifiers in writing exercises helps teach us to read intelligently, an awareness of the abuse of you know may make us better listeners to public forums.
I WOULD be surprised to find that any reader agreed with all the judgments about usage that I have offered here. But uniformity of linguistic values is neither necessary nor desirable. It is unnecessary because what choice we make about hopefully, whom, or any other particular point of usage matters very little; specific cases are supposed to serve simply as parables to guide our thinking about usage in general. It is undesirable because it can be achieved only through rigid codification, which makes all further discussion pointless. We should not suppose that the schools can teach linguistic values by a simple return to "basics." If we mean by basics an increase in the attention paid to reading and writing, well and good. But the very word basics, with its implication that what is lacking is simple rote skills, on the order of the mastery of spelling or the multiplication tables, surely would have set Johnson or Fowler to shuddering. The aspects of usage (and mathematics) that really matter are not learned easily and are not learned early.
What we need now is not more invective, but a civil discussion of the problems
In saying that the discussion must be nonpartisan, I mean not that it should have nothing to do with ideology but that it should be extricated from the kind of left-right polarization in which it has lately been mired. One of the greatest of Johnson's accomplishments was that he managed to raise the question of the language above partisan politics. (Fifty years earlier, Swift's proposal for an academy had been denounced as a Tory plot.) After Johnson, it was understood that what was under discussion was the rules that both sides would accept in order to continue their debate. The only way for the consideration of grammar to become once again a matter for general discussion will be for everyone to realize that all sides have a stake in coming to terms, and to eliminate from the discourse such epithets as "permissive," "liberal," and "left-wing" on the one hand, and "repressive," "reactionary," and "right-wing" on the other.
There will always be points of usage on which grammar and political principle find themselves at odds, such as the everyone . . . he business. But debates in these cases are not over which usage is grammatically preferable; rather they are over the relative strength of the claims that grammatical and other principles have on us. Grammarians should no more decide these issues for all than physicists should decide whether we ought to have a space program. We must come to our own conclusions. For example, I stick with he as an impersonal pronoun, except where sex neutrality is important. I think there are good syntactic reasons for my choice, though I know syntacticians who would disagree with me. But grammar can only excuse my usage, not justify it, and all its arguments are irrelevant for people who have decided to go with the use of they on the grounds that he is sexist. As Fowler maintained, matters of conscience must take precedence.
This takes me to the third requirement that a new discussion of values must satisfy--the question of courage and tolerance. When I ask myself why I have decided to stick with the use of the singular he to refer to an antecedent like every American, I find that my motives are unclear. Intuition tells me that the singular makes grammatical sense. But I am troubled that English grammar requires the singular to be of one gender or another, and that precedent requires it to be masculine. And I have sometimes supposed that it is only out of fear of being thought ignorant that I don't move to using they and their in all cases. If this is indeed my motivation, then I am like the character of whom Randall Jarrell wrote: "She always said to paint the lily: she knew that this was a commonplace phrase and that the memory of mankind had transfigured it, and she was contemptuous of people who said to paint the lily--just as she was contemptuous, in a different way, of people who said to gild the lily--but she couldn't bear to have anyone think that she didn't know which one it really was." It isn't easy to flout a rule that many people set great store by, even when there are ideological reasons for doing so; it is harder still when one's only reason is that the rule does not seem to make much sense anymore. As Fowler said, one must "deliberately [reject] the trammels of convention" to consciously split an infinitive. The trouble with indiscriminate conformity to traditional strictures is that it can become a form of appeasement, in which we find ourselves "placating the kind of extremist who does not deserve to be placated," in Simon's words. It is by exploiting our insecurities and our well-intentioned reluctance to give offense that purists have been able to exercise their unwarranted tyranny over usage and to block any serious re-evaluation.
Of course, it is not only out of cowardice that we may want to adhere to rules that seem unjustified. Civility must be kept in mind when considering usage; that is what gave "polite prose" its name. Once there is a wide consensus that a certain usage is preferable, it behooves us to conform to it out of deference to public opinion, particularly if our private objections are only grammatical, with no basis in principle. Furthermore, there is a clear risk of irresponsibility in counseling others to disregard rules that they may be judged by. Finally, we don't want to underestimate the importance of nostalgia as a conservative force. I myself do not use disinterested to mean "uninterested," not because I think it is a sin to do so but because I am fond of the older sense of the word. We all have our "bower-birds' treasures," as Fowler called them, which are no more harmful than any other antiquated mannerisms it may please us to affect. But we would do best to reserve these for our personal usage manuals, recognizing a distinction between the private decisions that we come to about usage and the public course we counsel. Where we cannot be bold, we can at least be tolerant.
Reprinted courtesy: Atlantic Monthly
© COPYRIGHT 2005 MACNEIL/LEHRER PRODUCTIONS. All Rights Reserved.