BY CHRIS OSTERNDORF
You know things have gotten out of hand when actual Nazis start becoming grammar Nazis.
Io9’s Lauren Davis recently penned a defense of typos, which was actually less in defense of anything than it was a response to an essay by author John Higgs on usvsth3m. Higgs was prompted to write the op-ed after seeing the Nazi party’s now infamous tweet about spelling and grammar, and in it, he takes the opportunity to extol not just the virtues of typos, but the very existence of imperfection itself. Mistakes in writing, Higgs argues, like mistakes in life, reveal essential humanity, and as Higgs puts it, “being human is good.”
But Davis found herself less sure of the benefits of mistakes, or at least, less sure of how linguistic mistakes should be handled, writing:
While there is something to be said for readers forgiving the occasional mistake (I say as a person who writes things hastily on the Internet), I’m not sure Higgs makes a convincing argument that writers should leave genuinely accidental errors be, especially when there’s an edit button handy. On the one hand, we shouldn’t miss the forest for the trees and ignore a passionate piece of writing because the author dropped a word or screwed up a pronoun. On the other, making mistakes isn’t the same thing as ignoring them.
Davis’s point about being “a person who writes things hastily on the Internet” speaks to larger concerns regarding the existence of the written word in a digital age. Too often people find themselves in one camp or the other: either you’re a traditionalist who believes that as print has died out, so has the purity of all language, or you’re a devoted student of new media, ready to defend the existence of blogging and online journalism at every turn. But the truth is that the merits of either discipline don’t automatically cancel out the other.
There’s no question that digital language is a force to be reckoned with. The Guardian’s Tom Chatfield wrote a whole book about it. However, as mentioned above, discussions about that language tend to be less exploratory and more along the lines of outright condemnation or praise.
Chatfield’s fellow Guardian writers Robert McCrum and Steven Poole squared off on the matter last year. McCrum used George Orwell’s famous “Politics and the English Language” to back him up, condemning the idea “that language is a natural growth and not an instrument we can police for better self-expression.”
He goes on:
To paraphrase Orwell, the English of the world wide web—loose, informal, and distressingly dyspeptic—is not really the kind people want to read in a book, a magazine, or even a newspaper. But there’s an assumption that, because it’s part of the all-conquering internet, we cannot do a thing about it. Twenty-first century civilisation has been transformed in a way without precedent since the invention of moveable type. English prose, so one argument runs, must adapt to the new lexicon with all its grammatical violations and banality. Language is normative; it has—some will say—no choice. The violence the internet does to the English language is simply the cost of doing business in the digital age.
Poole, on the other hand, responded by suggesting that just because the Internet fosters more writing, that doesn’t necessarily mean said writing is inherently worse. He muses, “Of course, there’s a lot of bad writing on the web, but there’s a lot of very good writing too. There’s just more writing at all levels of quality. McCrum offers no evidence that the bad is a greater proportion of the whole than it ever was. Arguably, thanks to internetworked electronic communications, people are writing more than ever before in history. This does not by itself seem adequate cause for dejection among the literati.”
But McCrum does in fact seem to be bothered by the fact that there are more writers now than at any time in history, and essentially claims that all those writers are watering the discipline down. He observes that “we became so exhilarated by the freedom of the new media that we weren’t willing to grapple with the responsibilities that came with liberation… the author of Nineteen Eighty-Four remains a talisman. Those who assert the ‘democratic’ and ‘free’ qualities of the worldwide web would probably cite his famous essay with approval in any discussion of English usage today.”
This kind of back-and-forth can become extremely exhausting, especially considering that research has yet to validate either school of thought definitively. A more interesting use of one’s time might be to examine how these viewpoints interact in real life.
After looking at a Wall Street Journal piece about the need for good grammar in office settings, Linton Weeks of NPR decided to do a story about how the perspectives of English scholars compare to those of individuals in the business world.
The results were revealing.
University of North Carolina professor Connie C. Eble generally found that “today’s students are actually much better writers than they were 30 years ago.” And Matthew Gordon, a linguist at the University of Missouri, notes that the struggle to grasp proper English is nothing new, telling Weeks, “People have always had trouble with homophones…and they have always used language creatively, coining new words or respelling established words. … What’s different today is that we can see these ‘mistakes’ more commonly because we’re encountering a broader swath of writing on the Internet.”
Yet the most interesting person Weeks quotes in his story is Kyle Wiens, CEO of the web company iFixit and founder of Dozuki software, who wrote a blog post in the Harvard Business Review about why he won’t hire people who use bad grammar. Wiens claims, “Good grammar is credibility, especially on the Internet. In blog posts, on Facebook statuses, in emails, and on company websites, your words are all you have. They are a projection of you in your physical absence. And, for better or worse, people judge you if you can’t tell the difference between their, there, and they’re.”
The fascinating part of this is that by denouncing the use of bad grammar online, Wiens is also validating the use of the Internet as a means of legitimate writing. There may be no explicit need to “police” the growth of language as McCrum suggests. But as long as people like Wiens exist, there will always be a certain kind of self-awareness inherent in the use of spelling, grammar, punctuation, and so on—because to disregard the correct usage of these elements can also be an act of self-awareness.
The point is that whether you are ignoring typos, as Higgs would encourage, or following strict grammar guidelines to the tune of Wiens, language remains a force to be reckoned with. If the deconstruction of language causes some individuals to be more careful about the writing they do online, that’s good news. But it’s also good news when writers who have long worked in print open themselves up to the digital world, and thereby, to the use of digital language.
This also factors into the debate over whether “real” writers should stay away from tools like Twitter. If you take into account how much hand-wringing is going on in both camps, it’s a debate that automatically seems that much sillier. Why shouldn’t real writers be on Twitter? It seems unlikely that just because Joyce Carol Oates occasionally sends out messages in 140 characters, she’s all of a sudden going to forget how to write a novel. You may not like what she has to say, but it’s not as if major authors were incapable of stirring up controversy before the Internet.
Moreover, with the rise of genres like “Twitter fiction,” it’s getting harder and harder to tell where “legitimate” literature begins and Internet writing ends. The result is merely that people continue to enjoy the many possibilities of language in all its forms.
Great writers know the rules so they can break them. In that spirit, it’s important for all writers to recognize that traditional language is not on its way out because of digital language. But traditional language is also more essential because of digital language. Internet scribes should know the rules if they want to break them, and print authors should break the rules because that’s what they’re there for.
Chris Osterndorf is a graduate of DePaul University’s Digital Cinema program. He is a contributor at HeaveMedia.com, where he regularly writes about TV and pop culture.