Over winter break, I was up late one night playing board games with a few friends. We moved through Pictionary, Hearts, and even Monopoly (apparently, we’re very good friends) before turning our efforts to Scrabble.
As per the rules, it came time to agree on a dictionary. One friend offered the latest edition of Merriam-Webster while another suggested the Oxford American Dictionary. But why, I wondered aloud, did we even have to use one at all?
Hear me out. These friends and I are all juniors at Columbia and have similar educational histories and online presences; the kinds of things we talk about are more or less equivalent. We all speak English natively, and we all know how to spell the words we use. For these reasons, I proposed we trust ourselves and play whatever words we want—be our own dictionaries—and use the book as no more than a reference for settling disputes.
Needless to say, this idea didn’t go over well. The fundamental objection was that, without a dictionary to guide gameplay, we could play words that weren’t “real”: slang words, Internet lingo, words from other languages, proper nouns and—heaven forbid—maybe even acronyms.
The argument confused me. These kinds of words are no more or less “real” than anything else we say in our 21-year-old varieties—more formally referred to as idiolects—of English. I asked: should we not be able to play “bougie” or “ghosted” or “turnt,” words that we type, say, spell, and use all the time, words that we all understand perfectly well? Is it because they sound informal, a young person invented them, or a dictionary hasn’t christened them “real” yet?
My friends’ reluctance to diverge from standard, “formal” Scrabble play revealed an odd paradox. As young people, as students, as part of the generation pioneering how we communicate online, we faithfully rely on these “non-real” words to get our meanings across and make sense of a world changing faster than we can manage. And linguists are paying closer attention than ever before: texts, Facebook posts, emails, and tweets might as well be four sides of a 21st-century Rosetta Stone. When I say “that situation was sus” or “I have fomo” or “he ghosted her,” the meanings of “sus,” “fomo,” and “ghosted” are clear—no less clear than the nouns or verbs that came before them and, for that reason, no less “real” than other English words.
But—if my friend’s immediate rejection of my offer to use Urban Dictionary as a complement to Merriam-Webster is any indication—we are a little ashamed of this “informal” lexicon. This variety of words, the one that infuses and enriches our 2018 day-to-day conversation, is new, young, and ever-changing—just like the people who use it. Yet, for some reason, we consider these just “casual” words that tarnish our otherwise immaculate English repertoires—words that, when other people hear us saying them, are reason for alarm, more “evidence” of linguistic degradation at the hands of our generation.
Our employment of these “non-real” words hardly signals that we’re running our prestigious English into the ground. These words are meaningful, adaptable, and perfectly grammatical—and, might I add, we more or less agree on how they’re spelled. Rather, this new usage is evidence that we mold our words to fit our needs, just like every other language user who has ever existed.
Consider the Columbia idiolect. We discuss “neoliberalism” and “postcolonialism” often and with ease; we’re happy to slap a psycho-socio-hetero-pseudo-intra-cultural prefix onto a concept in a seminar to make ourselves sound clearer (and smarter). But, at home and online, we also enjoy words like “snek” and “doggo,” and we engage with them even more. How can one person use such dissimilar kinds of words? Can I sound academic and be into memes at the same time?
Well, of course you can. This lexical dichotomy need not be reconciled. Affectionately calling an inanimate object a “boi” isn’t going to preemptively make your University Writing English any less articulate. Using different registers of vocabulary doesn’t mean we’re otherwise brilliant orators with occasional degenerate streaks, or vice versa. We simply switch between varieties as we see necessary.
To resist new words, and new uses of words, is hardly a recent phenomenon. There was a time before “eyeball” and “manager” and “fashionable,” though we don’t think twice when we hear them now. Whether to address a single person as “thou” or “you” took English speakers three hundred years to sort out. Difference—in grammar, vocabulary, and pronunciation—has historically been used to otherize nonstandard varieties of English, a precedent founded on no more than fear and racism.
Don’t get me wrong—we do need dictionaries. We need people to document our language as it changes. With a standard, we can be mutually understood, which is more important now than ever before as the number of English speakers globally continues to rise. But dictionaries are hardly the be-all and end-all of language. They are, as English professor Anne Curzan said in her 2014 TED Talk, “fantastic resources,” yes—but they are “not timeless” and just as “human” as the people who edit them. They are a reference, a limited fossilization of English as it is now—not an absolute authority.
Fill the gaps in our language, and don’t be ashamed to do so. It is our negative associations and biases that forge this guilt (“‘Bae’ isn’t a real word because only 14-year-old girls say it!”), not any credible linguistic scholarship. As author don Miguel Ruiz says in The Four Agreements, “be impeccable with your word”—whichever word(s) you want that to be. And they may even score you some points. In fact, I think “finsta” is worth about 9 in Scrabble.
Harmony Graziano is a Columbia College junior studying linguistics but is certain some sort of econ/poli sci double major will work out. She has never won a game of Scrabble in her life. Have some shit to say about how we say shit? Email her at firstname.lastname@example.org. Striking Chords runs alternate Tuesdays.
To respond to this column, or to submit an op-ed, contact email@example.com.