I just wrote a fun piece for Virgin on the topic: Does business need a new language of love? It was an interesting topic to write on as someone who thinks about language (all day, every day…), but whose engagement with the world of business is roughly limited to occasional trips to the grocery store. The whole series on this topic is a very cool mix of perspectives.
This quarter I’m TAing for a class called Distributed Cognition, which explores a bunch of ways that cognition might not be something that happens exclusively in the brain. This week we looked at different flavors of embodiment, the idea that the body is crucial for cognition. For example, we talked about one study showing that people who were unknowingly leaning to the left made numerical estimates that were too small (consistent with the location of smaller numbers on our number line), while those leaning to the right made overestimations (Eerland, Guadalupe, & Zwaan, 2011). The overarching theme was that the state of our body can affect thoughts that we typically attribute only to our brain.
One study I was reminded of while talking to a student has gotten a good amount of popular press attention. It’s called Extraneous factors in judicial decisions (Danziger, Levav, & Avnaim-Pesso, 2011), but the message usually taken from it is that judges have no mercy when they’re hungry. The authors divided judges’ work days into three chunks, separated by their food breaks. They found that at the beginning of each segment, judges made favorable decisions about 65% of the time, and that their favorable decision rate declined steadily over the course of the segment, dropping to nearly 0% just before the next break. As someone whose brain and body shut down without a relatively consistent stream of food, I don’t find this too shocking, though the magnitude of the change in favorable decisions is dramatic. I think it’s a great example of “body spills into the brain.”
It’s also an example of what many researchers refer to as “ego depletion,” the idea that we have a limited pool of mental resources and that cognition suffers once they’re used up. We get mentally fatigued, and as a result we make poor decisions or perform poorly on whatever task is at hand. Ego depletion underlies claims that working fewer hours increases productivity. I read this sort of advice often, each time thinking to myself, yes! I should do that. I feel this way especially on days like today, a Saturday morning when ego depletion is fresh on my mind. I’m in recovery mode. Then, inspired to change my work habits, I’ll open my calendar to decide which work hours to shave off the week, and I just stare at it. My trusty, color-coded calendar feels non-negotiable, so I close it and decide that working fewer hours maybe isn’t that crucial. I convince myself of this by reading reminders that some researchers claim ego depletion is all in our heads. There’s probably some truth to this too – I often don’t start to feel drained until I acknowledge how busy I’ve been.
I do a lot of meta-cognition about work. By that I mean that I think about my own work patterns and other people’s, and I try to evaluate what’s good and bad about those patterns. My conclusion, for this morning at least, is that there’s probably not a one-size-fits-all solution. Some people might suffer from major ego depletion, while others might be more Energizer-bunny-like. Some weeks a person might get tons done while putting in many hours; other weeks they might be more efficient with a leaner schedule. My goal is to work deliberately and mindfully, taking each week, day, or project as it comes and adapting my work habits as necessary. I will probably never discover the secret recipe for 100% efficient work, but that’s ok – it’s kind of fun trying to figure it out anyway.
Originally posted on NeuWrite San Diego:
Humor is a difficult concept to articulate. We might not always know why things are funny, but we do tend to know what kinds of things are funny. Humor comes in many forms, and the general consensus is that things like videos of treadmill mishaps, cynical comics, and corny puns are funny.
Luckily, there’s a pretty large body of research that takes humor seriously. Technically, humor is “a positive emotion called mirth, which is typically elicited in social contexts by a cognitive appraisal process involving the perception of playful, nonserious incongruity, and which is expressed by the facial and vocal behavior of laughter.” 
Not surprisingly, Freud had a few thoughts on humor. He believed that it helps us relieve inner tension that arises from our constant desire for things like food and sex. Jokes allow us to express our anxieties in a lighter way, so the things we…
Two things that I do almost every day are programming and knitting. Programming allows me to implement experiments and analyze the results, and knitting allows me to unwind and recharge, restoring some of the mental energy that activities like programming require for me. Programming is analytic and something that Silicon Valley geniuses do a lot; knitting is artistic and something that your grandmother does a lot. Upon deeper reflection, there are some pretty cool links between the two, though. They both require focused attention to detail and following patterns, and the end goal is usually to create something that has a functional purpose.
This blog post got me thinking about knitting and programming as related pursuits. The post talks about the benefits that handwork has for diagrammatic thinking and fine motor skills, suggesting that knitting can help children acquire analytical skills. Another post suggests even more strongly that exposing young students to more handwork might help them in computational and engineering fields down the line.
Another blog post shows a different intersection between knitting and coding. Karen Shoop, an engineer from Queen Mary University of London, writes about the complex code that knitters use to convey a pattern (to me, this can sometimes be frustrating when trying to learn a new pattern, but programming languages can be equally enigmatic). There are also some programs (both in her lab and elsewhere) that allow users to input sequences of knit and purl stitches, and the program generates what that sequence would look like if implemented. (This is apparently crucial for graphic artists who want to put cable-knit sweaters on their graphic people.)
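The idea behind those programs is that a stitch sequence really is a little program you can interpret. Here’s a toy sketch of my own (not the actual software from Shoop’s lab, and the chart symbols are just the common knitting convention) that turns a flat sequence of knits and purls into a text “chart”:

```python
# Toy sketch: interpret a sequence of knit/purl stitches as a text "chart".
# Not the software described in the post -- just an illustration of the idea
# that a stitch sequence is a small program that can be executed.

SYMBOLS = {"k": ".", "p": "-"}  # common chart convention: knit = dot, purl = dash

def render_chart(stitches, row_width):
    """Split a flat stitch sequence into rows and map each stitch to a symbol."""
    rows = [stitches[i:i + row_width] for i in range(0, len(stitches), row_width)]
    return "\n".join("".join(SYMBOLS[s] for s in row) for row in rows)

# A 2x2 rib ("k2, p2" repeated) laid out over 8-stitch rows:
pattern = ["k", "k", "p", "p"] * 4
print(render_chart(pattern, row_width=8))
# prints:
# ..--..--
# ..--..--
```

Even this tiny version shows the parallel: the knitter’s shorthand is the source code, and the fabric (or here, the chart) is its output.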
After a little more searching, I found some more cool intersections between knitting and programming. One is a Japanese knitwear designer, Motohiro Tanji, who has also dabbled in fashion based on 3D geometric algorithms. There’s also a weekly meeting of computer hackers in Portland that appears to include knitters. Not many people will dispute that programming is an increasingly important skill, and in many cases kids are being exposed to it as early as possible… I wonder if knitting will accordingly make a comeback with younger people!
And finally, why didn’t I think of this!? A “Laptop Compubody Sock” for privacy, warmth, and concentration in public spaces.
The beginning of September marks the traditional start of a new school year, even if in reality many schools start earlier or later. A few pieces of back-to-school inspiration:
The first is a blog post, How to learn anything better by tweaking your mindset. The post describes a study in which two groups were taught the exact same information, but one group was told ahead of time that they’d later need to teach the information to someone, and the other group was told they’d be tested on the material. In actuality, no one had to teach the information to someone new, and participants in both groups received the same post-learning test. Those who had been planning to teach the new info, however, did significantly better on the test than those who were planning on being tested. The bottom line is that when we learn something with the intent of teaching it, we actually synthesize the information more and mentally organize it better than when we believe we’re learning for a test.
Anecdotally, I find this true. The classes I’ve TA’ed in the past year have been outside my realm of knowledge, but I knew I’d have to get up in front of a group of students just a few days after hearing the professor’s lecture and help the students synthesize the information presented and answer questions about it. I’d never have a written test on the material, as the students would, but I’d have an oral one when leading discussion. Technically, the stakes were low for me – I wasn’t going to get a bad grade or lose my job as a TA, but learning the information in order to be a competent teacher seemed crucial. As a result, I went into sponge mode right before every lecture, and I believe that I sopped up much more information and made stronger connections among the things being taught than if I had been a student expecting to be tested on it later.
On a related note, Khan Academy reminds us that You can learn anything. Even though we often have to fail before we can succeed, “thankfully, we’re built to learn.”
I have a favorite thought experiment that, for some reason, I think about a lot when I’m driving (to clarify, I’m not driving at the moment). It’s inspired by the claim that the Pirahã language, spoken by a group of people in Brazil, lacks number terms (the original paper is here). The claim is based on Pirahã speakers’ performance in two tasks. In the first, they were shown one battery and asked: how many? The researchers then added batteries one at a time, asking each time how many there were. The responses were as expected based on previous research: the speakers all used the same term for “one,” a different term for “two,” and combinations of the “two” term and a term signifying “many” for larger quantities.
In experiment 2, the batteries were presented in the reverse order, so the participants first saw 10 batteries, and they were taken away one at a time. This time, the participants used the “one” term when there were as many as 6 batteries left, and they all used it when there were 3. The researchers took this as evidence that the terms that researchers believed to indicate “one” and “two” are not precise, but instead seem to be relative quantifiers. The claim is controversial, but the possibility that a language might not have any definite terms for numbers is intriguing.
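The distinction between exact number words and relative quantifiers can be made concrete in a few lines of code. This is my own toy illustration, not anything from the paper, and the thresholds in the relative version are invented purely for the sake of the example:

```python
# Toy contrast between two readings of a "one, two, many" system.
# The thresholds below are hypothetical, chosen only for illustration.

def exact_label(n):
    """Exact reading: each term picks out a fixed quantity."""
    if n == 1:
        return "one"
    if n == 2:
        return "two"
    return "many"

def relative_label(n, reference):
    """Relative reading: the 'one' term just means 'relatively few'
    compared to some salient reference quantity (e.g. the starting pile)."""
    if n <= reference * 0.3:   # hypothetical cutoff
        return "one"           # i.e., "a small amount"
    if n <= reference * 0.6:   # hypothetical cutoff
        return "two"           # i.e., "a somewhat larger amount"
    return "many"

# Under the relative reading, 3 batteries left from an original 10
# can still get the "one" term, as the participants' responses suggested:
print(exact_label(3))                   # prints "many"
print(relative_label(3, reference=10))  # prints "one"
```

The same quantity gets a different label depending on which system you assume, which is exactly why the experiment 2 results were taken as evidence against the exact reading.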
Returning to my thought experiment, I often try to imagine living in a society with no ways to quantify things. If we had terms for “one,” “two,” and “many,” we could still see the difference between five apples and six, but the only way we could talk about that difference would be invoking our terms for “one” and “many.” In addition to having no words for definite quantities, we wouldn’t have numerals either. I recognize that a society without number terms would be vastly different from the modern-day American society that I know, but I like to imagine some consequences that would arise if our society suddenly lost all numbers:
We’d all have far less money. We’d have the currency that we could stash away, but no more invisible money in abstract sources like stocks and bonds. Debt would probably be a lot more manageable too.
It would be nearly impossible to be punctual. It seems natural to measure time of day by the sun, but that’s still subjective. The sun’s path shifts a tiny bit every day, and we’re probably not good enough at perceiving the sun’s exact angle to use it to tell time.
Life would be less competitive. In school, we wouldn’t be able to split hairs over percentage points. Many sports, like swimming or long jump, would be pointless without a precise measure of time or distance. We would have no way of knowing how many people liked our Facebook posts, how many grams of fat were in the cake we just ate, or how few hours we slept last night (thank God – time for that competitive habit to die anyway).
Losing our number system would throw our society back into a far more primitive state, and we’d lose progress in every domain of life. But at the same time, I wonder if we might see the number of people being diagnosed with ulcers and high blood pressure plummet… even without the technology to diagnose them.
P.S. An interesting post that uses the comic above as a jumping-off point: Is “one, two, many” a myth?
I recently stumbled upon a blog post at raptitude titled “The frightening thing you learn when you quit the 9 to 5.” I’m not sure why I was so drawn to it, since I’ve never actually worked a traditional 9 to 5 job. Maybe I was trying to mentally prepare for the day I quit a job I will most likely never have. Regardless, I was curious.
David Cain, the author, is 32 years old and recently left an unfulfilling 9-5 job to pursue writing. Although bizarre curiosity might have led me to click the link in the first place, I was soon captivated by the parallels between his situation and the one I’ve found myself in after beginning work on my PhD, and especially this summer, a time when much of the structure I was used to has temporarily died down.
Cain writes, “before I quit my job at 32, I had never really experienced a self-directed period of my life in which I was actually trying to accomplish something.” Oddly enough, this is probably true for most of us. We might have side projects that are self-directed and goal-oriented, but how rare is it for your everyday life to be this way? It sounds a little fantastical, the sort of thing we might wish for: no boss, doing work we love, when and how we want to do it. Cain’s reflections suggest that it’s not the walk in the park it might seem to be at first. It’s great in a lot of ways, but it’s far from intuitive. Although the post has nothing to do with academia, I recognize that thriving in this situation is what needs to be done to earn a PhD.
A few other quotes that really hit the nail on the head for me:
“If I chose not to work, it was my loss and only mine. When you’re self-employed, every day is Wednesday.”
“Each day is a blank page with no outline indicating where the crayons go. I have to decide what to draw, how ambitious or humble it’s going to be, and what it’s all going to add up to over time.”
Cain came face-to-face with the sudden need to be his own boss and define his own career path at age 32, after roughly 10 post-college years characterized by the having-a-boss experience. I wonder if it’s more jarring at that point in life than at 22, when you’re inexperienced and naive but haven’t had the 9-5 routine drilled into you yet. In some ways, college seems like an intermediate step between the school years when children are micromanaged and the self-directed state that Cain writes about. The traditional 9-5 path seems like a step in the opposite direction, though, so maybe the freedom is less dumbfounding for me than it would be if I had become accustomed to a more traditional work scenario.
The goal of Cain’s post is to urge all people, from those currently employed in a 9-5 job to children still in school, to think about their escape from the resignation to trudge through 5/7 of your life to earn a paycheck. “Much better than resignation is to make a long-term plan to find work that is valuable enough to you that your typical day is a fulfilling one, and valuable enough to others that people will pay you for doing it.” It’s a pretty romantic prospect, but a pretty cool one to aim for nonetheless.