At one point during South by Southwest, [talk show host Jimmy] Kimmel’s crew approached a poised young woman with brown hair. “What have you heard about Tonya and the Hardings?” the interviewer asked. “Have you heard they’re kind of hard-hitting?” Failing to pick up on this verbal wink, the woman launched into an elaborate response about the fictitious band. “Yeah, a lot of men have been talking about them, saying they’re really impressed,” she replied. “They’re usually not fans of female groups, but they’re really making a statement.” From some mental gossamer, she was able to spin an authoritative review of Tonya and the Hardings incorporating certain detailed facts: that they’re real; that they’re female (never mind that, say, Marilyn Manson and Alice Cooper aren’t); and that they’re a tough, boundary-breaking group.
One of the most worrisome elements of the book (and one that, in an alternative lifetime, I might have spent more time on) is the question of how our mental biases and shortcomings get in the way of making the right decisions for communities. This also plays a big role in my suspicion of experts — the risk that the podium-commanding expert doesn’t know what he’s talking about is a lot higher than we’d like to admit.
This article, from Pacific Standard, does a lovely job of illustrating some of that (and using a late-night humor piece to do it, to boot — read the article for more of these great goofs). Take a look at another excerpt:
In the more solemn confines of a research lab at Cornell University, the psychologists Stav Atir, Emily Rosenzweig, and I carry out ongoing research that amounts to a carefully controlled, less flamboyant version of Jimmy Kimmel’s bit. In our work, we ask survey respondents if they are familiar with certain technical concepts from physics, biology, politics, and geography. A fair number claim familiarity with genuine terms like centripetal force and photon. But interestingly, they also claim some familiarity with concepts that are entirely made up, such as the plates of parallax, ultra-lipid, and cholarine. In one study, roughly 90 percent claimed some knowledge of at least one of the nine fictitious concepts we asked them about. In fact, the more well versed respondents considered themselves in a general topic, the more familiarity they claimed with the meaningless terms associated with it in the survey. […]
In 1999, in the Journal of Personality and Social Psychology, my then graduate student Justin Kruger and I published a paper that documented how, in many areas of life, incompetent people do not recognize—scratch that, cannot recognize—just how incompetent they are, a phenomenon that has come to be known as the Dunning-Kruger effect. Logic itself almost demands this lack of self-insight: For poor performers to recognize their ineptitude would require them to possess the very expertise they lack. To know how skilled or unskilled you are at using the rules of grammar, for instance, you must have a good working knowledge of those rules, an impossibility among the incompetent. Poor performers—and we are all poor performers at some things—fail to see the flaws in their thinking or the answers they lack.
What’s curious is that, in many cases, incompetence does not leave people disoriented, perplexed, or cautious. Instead, the incompetent are often blessed with an inappropriate confidence, buoyed by something that feels to them like knowledge.
This isn’t just an armchair theory. A whole battery of studies conducted by myself and others have confirmed that people who don’t know much about a given set of cognitive, technical, or social skills tend to grossly overestimate their prowess and performance, whether it’s grammar, emotional intelligence, logical reasoning, firearm care and safety, debating, or financial knowledge.
Of course, there are ways to at least attempt to counteract these kinds of mistakes. We can:
- Broaden the number and the type of people who are actively participating, so that we up our chances of someone catching the mistakes. Of course, that means sharing ownership of the topic and creating an environment where everyone knows that questioning is desired.
- Lay out the underlying expectations explicitly. Making very clear what we’re trying to achieve and what we know and don’t know about it makes it harder to launch off down the wrong path — and a little easier to see when someone else is.
- Establish as a ground rule that it’s OK to make mistakes, to float trial balloons. The biggest reason people get caught in the Dunning-Kruger effect is probably the fear that they’ll be laughed at if they honestly say “I don’t know.”
Others, I’m sure. What would you add to that list?