If there’s one thing this Presidential race and debate have reminded me of, it’s that everything is subjective. A few thoughts on the content of the first 2016 Presidential debate from a linguistically-inclined cognitive scientist:
America is a piggy bank
You look at what China is doing to our country in terms of making our product. They are devaluing their currency and there’s nobody in our government to fight them and we have a very good fight and we have a winning fight because they are using our country as a piggy bank to rebuild China and many other countries are doing the same thing. -Donald Trump
If the US is truly a piggy bank, then China may have to smash us to pieces to get their money out. We should watch out.
Trump and Clinton argue over Trump’s statement: You [Clinton] have regulations on top of regulations and new companies cannot form and old companies are going out of business and you want to increase the regulations and make them even worse.
Clinton: I kind of assumed there would be a lot of these charges and claims and so –
Trump: Facts.
What you call a thing matters. Both candidates agree on that.
There’s been some innovative language use from both Clinton and Trump.
Clinton defines her phrase “Trumped up trickle down”:
And the kind of plan that Donald has put forth would be trickle down economics. It would be the most extreme version, the biggest tax cuts for the top percents of the people in this country that we’ve ever had. I call it trumped up trickle down because that’s exactly what it would be.
Trump’s new word, bragadocious, needs no formal definition:
I have a great company and I have tremendous income. I say that not in a bragadocious way but it’s time that this country has somebody running the country who has an idea about money.
Oh! Hillary just wrote my conclusion for me: “Words matter, my friends, and if you are running to be President or you are President of the United States, words can have tremendous consequences.”
I heard myself mention to a friend one day, “I’m reading this great book about the making of the Oxford English Dictionary.” This comment was followed by a pause as I thought to myself, that feels like a weird thing to have just said, and as she (probably) thought to herself, this girl is getting geekier by the day.
The book truly is about how the OED came to be, but it reads more like a novel. Simon Winchester gives his readers an appreciation for the magnum opus that is the dictionary. In a world without the Internet or other good dictionaries to use as precedents, the people working on this project had to read extensively, documenting and defining every new word they came across. The OED goes beyond this, though, because it includes examples of the word in context – examples that really make its meaning clear. And the dictionary makers were careful to include examples from different time periods, in order to show the changes in usage that a single word has undergone during its life. All of this had to be coordinated among a changing team of numerous contributors distributed across many locations (did I mention yet that there was no Internet? This feat alone blows my mind).
In addition to imparting an appreciation for the complexity of the project, Simon Winchester shares much about two of the most influential men involved (the professor [James Murray] and the madman [William Minor]). Readers get a sense of these men’s lives – for example, that William Minor was a doctor during the Civil War, forced to brand a deserter’s face with a hot iron – and how their pasts shaped the men they were as they worked on the project. There was hardly an antagonist (though there were characters that posed trouble at times). Instead, I was rooting for everyone all along – for Murray, Minor, and for the dictionary itself.
This book rekindled my appreciation of stories, quirky genius characters, words, and massive, seemingly intractable projects. It simultaneously inspired me, and made my own work feel like a picture book in comparison.
In all aspects of life, we’re often forced to accept that things change, and language is no exception. The only reason the English language even exists today is that languages change.
Last week, The Oxford Dictionaries Online announced the newest additions to their database. Some of the new words include “buzzworthy” (likely to arouse the interest and attention of the public), “food baby” (a protruding stomach caused by eating a large quantity of food and supposedly resembling that of a woman in the early stages of pregnancy), and “selfie” (a photograph that one has taken of oneself, typically with a smartphone or webcam, and uploaded to a social media website).
In fact, language evolution is so natural that we don’t even realize how many things we say are products of recent changes. The American Heritage Dictionary surveys about 200 writers each year about what is acceptable in the English language. In the 1960s, 53% answered “no” to the question: “The construction sick at one’s stomach is defined by most dictionaries and usage manuals. Can ‘at’ be replaced by ‘to’?” Even more recently, in the 1990s, 80% of writers deemed this sentence to be an inappropriate use of the word “grow”: “One of our strategies is to grow our business by increasing the number of clients.” Srsly?!?
One major way that a language changes is by coming into contact with other languages, such as when people learn a second language, but native speakers are behind many of the changes causing the current drama. The digital age has brought about a vast number of new concepts, and therefore a vast number of new names to describe them, but technology also serves as a platform for the proliferation of new linguistic trends. This article, “We the Tweeple,” highlights Twitter in particular as a “fusion muse,” the inspiration for words like Twitterati, Twittersphere, and twirting. One linguist, Ben Zimmer, suggests that the distinctiveness and playfulness of the prefix “tw-” may be a main reason that Twitter is the venue for so many portmanteau words. Twitter is also an ideal forum for coining new words because communicative space is limited, and new words can catch on and spread thanks to the practice of hashtagging.
As with pretty much any other sign of change in society, there are always dissenters. In the debate over language change, those people are the prescriptive linguists, who try to dictate what proper language should be, and the people often referred to as “Grammar Nazis,” who are enraged by signs like this one. The use of the word “irregardless” is a common Grammar Nazi gripe, but in keeping with language change and a practice of descriptive linguistics, Merriam-Webster assures us that it is, in fact, a word.
However, plenty of people embrace the evolution that language continually undergoes. I especially love this Atlantic article by Derek Thompson, in which he cleverly incorporates all 44 words that were recently added to the Oxford Dictionaries Online. Courtrooms seem to be another place where language change is not only accepted but even embraced. The New York Times reports that because conventional dictionaries exclude slang by design, courtrooms have begun looking to other sources for these definitions – namely, Urban Dictionary, a site with an extensive crowdsourced slang database.
Undeniably (and maybe fortunately), many new words are fads. This Atlantic article looks back on some of the new words of the ‘90s. While some, like geek and LOL, stuck, many others, like cowabunga and infobahn (information highway) either never really caught on or have disappeared almost entirely.
My thoughts on the new additions to the dictionary: many may sound silly, but the fact of the matter is that they’ve disseminated at least to some extent and are being used by English speakers. Maybe twerk will disappear from our lexicons before the end of the year, or maybe in a few generations, children won’t be able to believe there ever was an English-speaking world without the word twerk. Either way, today it’s a word, so I guess it belongs in our dictionaries… for now.
There aren’t too many aspects of life that haven’t changed in the English-speaking world between the years of 1800 and 2000. Not surprisingly, the language in books published in 2000 systematically differs from the language in books published 200 years earlier. I doubt that many people wrote about emails or the telephone at the turn of the 19th century, just as it seems likely that few people publishing in 2000 wrote about horses and buggies or working as a cooper.
However, the changes that this article mentions were a little more surprising. To detect changes in word frequencies, the author (Greenfield) used Google’s Ngram Viewer, which counts word frequencies in a million books in less than one second. In total, her study looked at about 1,160,000 books published over the 200 year span in the US. When she looked at about 350,000 books published in the UK over the same time span, she found all the same trends in frequencies, which means:
“These replications indicate that the underlying concepts, not just word frequencies, have been changing in importance over historical time.”
Here are some of the words that have increased in usage over time:
And here are some that have decreased:
Greenfield summarizes her conclusions:
“This research shows that there has been a two-century–long historical shift toward individualistic psychological functioning adapted to an urban environment and away from psychological functioning adapted to a rural environment.”
To me, the existence of cultural shifts and of word-use shifts is unsurprising on its own, but the fact that the two seem to correlate is pretty interesting. It also suggests that computational methods might be a pretty reliable way to detect meaningful changes in language and behavioral patterns over time.
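The computation behind this kind of study is surprisingly simple at its core: for each time period, count how often a word appears and divide by the total number of words, so that corpora of different sizes can be compared. Here’s a minimal sketch of that idea in Python – the function name and the toy two-sentence “corpora” are my own invented stand-ins, not data from Greenfield’s study or the Ngram Viewer:

```python
from collections import Counter

def relative_frequencies(texts_by_year, targets):
    """For each year, compute each target word's share of all tokens."""
    out = {}
    for year, text in texts_by_year.items():
        tokens = text.lower().split()
        counts = Counter(tokens)
        total = len(tokens)
        out[year] = {word: counts[word] / total for word in targets}
    return out

# Tiny made-up corpora standing in for books from two eras.
corpus = {
    1800: "the horse pulled the buggy past the church and the farm",
    2000: "she chose a new phone and sent an email to choose her career",
}

freqs = relative_frequencies(corpus, ["horse", "choose"])
print(freqs[1800]["horse"] > freqs[2000]["horse"])  # "horse" declines: True
```

At the scale of a million books the counting is harder, but the statistic being tracked – a word’s share of the corpus per time slice – is essentially this.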
In addition to linguistic labels, there’s another type of label that interests me: the one on a bottle of wine. They’re often creative and innovative, and wine labels can have cognitive effects beyond simply informing consumers about the bottle’s contents. Evidence suggests that labels affect consumers’ memory of a wine, their purchasing decisions, and even their perceptions of the wine.
Much of the reason that creative wine labels have taken off is that they have powerful effects. One such effect is, perhaps not surprisingly, on people’s memory of the wine. Kristin Appenbrink describes a study in which researchers showed 11 participants 12 bottles of wine – 6 with graphic labels and 6 with traditional ones. The next day, they showed the participants the same 12 bottles mixed in with 12 new ones and asked them to pick out the 12 originals. Overall, participants remembered 94% of the bottles with graphic labels, but only 68% of those with traditional labels. A graphic label appears to be the key to being remembered, at least for a wine.
Labels also have a power that links more directly to profit: they influence consumers’ decisions to buy the wine. Because many consumers are easily overwhelmed by the enormous array of choices, a bottle’s label is often the only thing that can attract a customer’s attention. In an article titled “People buy the label, not the wine,” Ortrun Reidick argues just that. He gives one example of a wine label that features a flying pig. Because consumers expect the front label to emphasize some feature that’s relevant to the wine, a bottle featuring a flying pig catches their attention, as they wonder what the weird label has to do with wine. The back label sheds light on the connection: “We think you’ll stand more of a chance of seeing a flying pig than a better wine at this price…” Reidick paraphrases the text: “‘Buy me, I am cheap and good!’” In this case, the flying pig label is likely to pique the consumer’s interest enough to convince him to buy the wine.
The most surprising effect that wine labels have over their consumers is the ability to alter their perception of the wine’s quality. In one study, a group of researchers presented 41 diners in a restaurant with the exact same bottle of cheap Cabernet Sauvignon, but half of the bottles claimed to be from California, the “favorable” location, and the other half from North Dakota, the “unfavorable” one. Participants drinking the wine they believed to be from California rated not only the wine as tasting better, but also rated their food as higher quality, ate 11% more of their meals, and were more likely to make return reservations at the restaurant than diners who were given the wine supposedly from North Dakota. This study shows that the expectations people had of a wine’s quality, based on the information they got from its label, affected their perception not only of the wine itself, but also the food they consumed with it. Studies like this one suggest that labels may have a greater function beyond appealing visually.
I think a lot about how the word we use to label something affects our perceptions, conceptualizations, and actions regarding that object. Grammatical gender is one type of label that many languages employ, and in some cases, it may have a strong influence over speakers’ conceptualizations of the objects they talk about.
In one pretty classic study, Russian speakers were asked to personify the days of the week (all of which have associated genders), and participants consistently and unconsciously personified grammatically masculine days as males and feminine days as females. Although the evidence isn’t unanimous, a number of studies suggest that grammatical gender may have meaningful effects on speakers’ cognition in ways like this.
Another context that draws attention to the power of word labels is the concept of functional fixedness. This is the idea that once we have an established norm for what an object does, it becomes much more difficult to think of new uses for that object. To overcome functional fixedness and increase flexible thinking, Tony McCaffrey, a researcher at UMass Amherst, has developed a method called the “generic parts technique,” which requires a person to break an object down into its component parts and name each part in a way that doesn’t imply meaning. For example, “candle” would be broken down into the parts “wax” and “string.” While “wick” implies an object that should be lit, “string” is much more general, and people are therefore more likely to think of novel and creative uses for the object than when they use its functional label. Empirically, McCaffrey has shown that this method allows participants to solve more problems that require creative insight.
Another issue that’s widely debated is whether labeling psychological illnesses might have negative effects on patients. One side argues that labeling an illness results in better access to services for a patient, but the side of the argument that I’m more interested in claims that having a named diagnosis might propagate the illness for the patient. For example, if a psychologist diagnoses someone with depression, he will almost certainly go straight home to Google “depression,” and WebMD will enlighten him with a number of common depressive symptoms: fatigue, feelings of worthlessness, loss of interest, overeating or loss of appetite, etc. Armed with this knowledge, it seems likely that the diagnosed person might start noticing these “symptoms” that weren’t actually present until he started looking for them, or may have been present but milder. Next thing you know, the patient stops eating and starts harboring suicidal thoughts, because isn’t that what a depressed person does? Cue vicious cycle.
Lissa Rankin suggests in her book Mind Over Medicine that physical diagnoses might have a similar effect. She argues that when given a troubling diagnosis, the body mounts a stress response, and bodies under stress don’t have the healing capacities that healthy bodies do. Thus, regardless of the validity of the diagnosis, the patient is now in a mental state that will create physical hardship, and possibly illness, for his body.
I’m not saying that diagnoses are never valuable, or that people with diagnoses all of a sudden inflict more severe symptoms on themselves than they had in the first place. What I am saying is that maybe we should think twice before hastily slapping a diagnostic label on a person – it could be a violation of the Hippocratic oath to “first do no harm.”
Along these lines, my recent preoccupation with the introvert/extrovert dichotomy makes me wonder: could “self-diagnosing” as an introvert be harmful? While it seems like a good thing in many cases – it would allow you to better understand yourself and your behavior – might it also be the excuse you need to avoid group functions and hole up by yourself whenever you feel stressed? Could it be a self-fulfilling prophecy in that sense? Is that a bad thing?