English lexicographer Jonathon Green offers this well-reasoned article criticizing crowdsourcing for dictionaries under the thesis “dictionaries are not democratic.” As a confirmed word snob and English-language fetishist, I have to say that I generally agree.
To study and love the English language is to accept and even celebrate that it is and always has been the most mutable language in human history. The properties that make English so hard to learn for the foreign student are exactly those traits we logophiles find so attractive. I believe good writing, beyond clarity of intent, is about two things: sound and connotation. For the student of English, it’s enough to acquire the denotation of a word like grumpy, but the serious writer will spend more than a little effort considering the contextual appropriateness of prickly, testy, waspish, churlish, or irascible. This is both the joy and the mania of the medium.
So, one might assume that if embracing the ephemeral nature of English is part of loving the language, then the rapid mutations of digital-age neologisms would be eagerly accepted as well. For some, I’m sure this is the case, and it’s true that fleeting colloquialisms are always fun when they add color and spice to everyday parlance. But the professional communicator also wants his language moored to something a bit more firm, which is why I believe Mr. Green makes a good case for lexicographers beyond mere job security.
Take the word I used above, logophile. I knew such a word existed, but writing this at dawn and half a coffee short, I couldn’t think of it. One thing Web searching offers that a dictionary does not is a means to stumble about with half words and phrases in search of the target. While doing exactly this, I found wordophile, cited in Merriam-Webster’s online urban dictionary. Wordophile is sort of fun-sounding, so why not choose it over logophile? In the context of this piece, the correct choice for me is the word that has more solid roots in the language, whereas I might use wordophile in a more casual or flippant circumstance.
The question of professional vs. amateur lexicography feels anachronistically futile in a world where many college educators are happy to receive papers using whole English words at all rather than the truncated form of Tweetish. Still, Mr. Green and his ilk are guardians of something more profound than dusty and heavy stacks of paper. When language is stretched too thin and too rapidly, vagueness becomes the norm in all communication. We see this in corporate and political shorthand all the time — two worlds where speaking without saying anything is often a purposeful strategy.
In the next podcast I’ll be posting, writer Jeffrey Turrentine refers to one downside of the Web as its being a tool for achieving “epistemic closure.” There is nothing particularly new about the observation that the Web offers the user evidence for whatever bias — political, scientific, historic, social — he brings to the keyboard, but when we combine this phenomenon with too much democratization of language itself, I believe we ultimately serve anti-democratic interests who too easily manipulate the masses through the fog of an unbound lexicon.
This seems like the latest frontier in the long-running debate between “descriptivists” and “prescriptivists.” The former believe that dictionaries, thesauri, and grammar guides should simply reflect what is common usage. The latter believe that such references play an important role in maintaining and perpetuating conventional and traditional usage.
The debate has always seemed to me to be quite a bit of a self-justifying preening fest. All descriptivists also codify, and all prescriptivists also accept change. After all, no one holds today that “awful” should be used as a laudatory term just because it originally was. The issue is how readily you accept change; it is the low-standard vs. high-standard debate. I personally lean toward the difficult school. I think novel usage should have to survive the attempts of pedants, grammarians, word snobs, and know-it-alls to stamp it out before it gets accepted. Only those changes hardy enough to thrive and prevail in spite of the hard opposition of the old guard should be seen to have earned acceptance. Darwinian lexicography, if you will.
But it has to be said that the spirit of the times leans toward the descriptivist side, often with vexing results. I have two personal examples of oxymoronic usage that never cease to annoy me, which I will now self-indulgently inflict upon the readers of this blog. In the first place, I can’t abide it when people modify the word “unique,” as in “pretty unique,” “very unique,” or “really unique.” Whenever I hear it, I am reminded of the line from the film The Princess Bride: “I don’t think that word means what you think it means.” It just defies logic: something is either “unique” or it isn’t. It is categorical, not relative. The other one that is like fingernails on a chalkboard is “factoid,” which by all logic of the language should mean “like a fact, but not.” Ironically, this does describe most items offered as “factoids,” even though those using the word use it to mean something like “small fact” (which might fairly be termed a “factlet”).
(I also can’t stand it when people use “hopefully” to mean “I (or we) hope” rather than “in a hopeful manner,” but at least it isn’t a contradiction.)
I agree with Cormac that the dichotomy is exaggerated. The truth is, a word means whatever people think it means (and, thus, we are losing the word factoid as a falsehood). You can fight these battles word by word — either for the principle of the matter or in dedication to a particular word — but has anyone EVER won such a battle?? Just think of the people trying really hard and STILL screwing up the “Cormac and me” vs. “Cormac and I” selection, or not splitting infinitives to awkwardly avoid a rule in Latin. The hardest-fought battles in language are responsible for some of our most ridiculous writing and speaking.
I am both frightened and horrified by the proliferation of “would of” in place of “would have”, and disgusted by the general acceptance of “your” for “you’re”. Let’s remember to pick our battles when faced with “amazeballs”.
I suspect this is a tempest in a teapot because the group of people willing to get serious about lexicography is largely self-selecting. The result may well be something successful in the same way Wikipedia is, with the majority of contributors playing no more in-depth a role than those who submitted word usages to the OED. And they’re unlikely to get “amazeballs” wrong.
Thank you Cat and Cormac for contributing. You both cite examples that help focus my own thoughts. And because I’m the kind of nerd I am, I cannot help being amused that the word “factoid” itself may be in the cocoon stage, destined to emerge as its own antonym. This is, if I may coin a term, an “ironyball.”
To Cat’s point (I think), “factoid” in years to come may require an official new denotation, just like “awful,” because usage will have functionally changed its meaning. This word evolution is part of the deal with English, but I do suspect that technology is accelerating the process so that we are witnessing the transition of more words on a daily basis. Combine this with the fact that the lion’s share of the English lexicon goes unused, and I do wonder if communication will not become an amorphous goo like most of the aforementioned political and corporate dialogue out there.
On a personal note, I find it frustrating when a word like “factoid,” which has (or had) a very specific meaning, is now denied to the writer trying to express himself because every reader will take him to mean the opposite of what he has written. Because I write about ideas and offer theses, I want to be as precise as possible; and when a word like “factoid” is exactly what I want to say, it’s a frustrating exercise to seek an alternative. Because the word has not officially been redefined, I as a writer would not be wrong in applying its official denotation, but I have to accept that the word is in lexicographical limbo.
Cat’s reference to “your” and “you’re” points to a matter of grave concern — grave in context anyway. I think we can all agree that this particular abuse is a digital-age phenomenon, most likely traceable to the shorthand of texting. Unlike word evolution, this is a situation in which the next generation (if I can generalize) is literally losing the ability to tell the difference between a possessive and a contraction, and presumably the sensibility that it even matters. While I agree (as does Mr. White) that something like infinitive splitting is a matter of stylistic choice for the author, I also believe certain rules of grammar are more mathematical than they are aesthetic. And this “your/you’re” thing is one of those rigid cases.
My overarching question, given the context of this blog, is whether technology is transforming language at a very rapid rate and possibly even limiting rather than expanding it. In the same way that expanding telecommunications and social media actually seem to be segmenting people rather than cross-pollinating them, are we fostering a kind of cybernetic Tower of Babel in which people are sharing the same, incestuously coded thoughts with their peer groups?
Of course, we all know what an old grump I am.
You ask if technology is transforming language more rapidly. More rapidly than what? I would suggest that the transformation of language — at first completely, then widely uninhibited by the written word — slowed dramatically once (and wherever) literacy and primary education became widespread. Now the access to vastly more communication (in the vulgate rather than classical language) is mitigating that slowing — that standardization — which is quite young compared to the human acquisition of language.
Yes, we are losing our ability to differentiate between a possessive and a contraction, much like we lost the ability to distinguish between nouns in prepositional and dative cases somewhere between Latin, German, and modern-day English. This is how languages evolve, in what can sometimes be construed as a big fat shame. Similarly, systemic hyperbole and the use of metaphor have been forcing the invention of new superlatives far longer than there have been computers. Isn’t that amazeballs? And totally gnar…
P.S. The word you’re looking for when you think you want “factoid” is “lie”. 😉
All good, although I would argue that “lie” and “factoid” have different connotations. “Factoid” used to cut a particular line the way “plausible” once did when it meant something that sounds factual or likely but is in fact not so much.
I’m all for the invention of new superlatives and the like. This is what I meant when I referred to the fun of new, even fleeting, colloquialisms in casual parlance. I like “amazeballs” just fine and grew up in the land of the “gnarly.” But to your example of the “your/you’re” issue, does it matter beyond briefly annoying those who know the difference? I don’t exaggerate when I say that I have encountered a fair bit of writing from members of the next generation, and when all the grammar is abandoned, it makes for some pretty tough reading. Is a new “system” in the works, or is it just a free-for-all?