— This is a fascinating piece for us to chew on this weekend by Steven Horwitz, the Charles A. Dana Professor and Chair of the Department of Economics at St. Lawrence University in Canton, NY. Chew on! – L
The Risks of Mental Shortcuts about Risks by Steven Horwitz
One of the great things that the Free-Range Kids movement has done is to remind parents that their perceptions of what is dangerous to their kids are often at odds with the statistical facts. The things that we often think are dangerous, like Halloween candy and stranger abduction, are really not meaningful dangers at all. Misperceiving risk is hardly limited to thinking about parenting issues, and misperceiving statistical frequency is a common human problem. For example: what’s the most dangerous part of air travel? Answer: the drive to the airport. You are far more likely to be killed on that drive than on the plane, yet our beliefs about which is riskier are often at odds with that fact.
Cognitive psychologists and behavioral economists have a number of explanations for why we are so bad at thinking like statisticians. These explanations are often termed cognitive “heuristics,” which is just a fancy word for a mental shortcut. We use heuristics all the time when our information is incomplete. For example, we infer that a person who is smiling at us is trustworthy. That inference doesn’t always hold, but it’s right much of the time. Using these sorts of heuristics, however, introduces biases into our thinking. The previous example might lead us to be wrong about smiling people being trustworthy in specific instances, as experiences with car salespeople with big toothy grins might indicate.
One of the most important of these biases for the issues facing parents is the “availability bias.” A common shortcut we use when our information is incomplete is to rely on the information that is currently available to us. This is particularly the case when information that is relevant to the question at hand has been in our face recently. So when a friend says “Does it seem to you like everyone’s getting divorced these days?” you immediately think about the two or three divorces that you just heard about and quickly nod your head. But the fact that those examples are easily available to you doesn’t mean that the divorce rate is, in fact, any higher than it used to be. (Statistically, by the way, the divorce rate has fallen slightly in the last 25 years.) Our brains are biased toward the available and we make statistical inferences from those examples, not the total set of relevant information.
For example, if you ask people to come up with 12 examples of their assertiveness and then rate themselves on how assertive they are, they will likely rate themselves lower than if they are asked to provide only 6 examples. Why? Because coming up with the first few examples is easy, but then it gets harder, and when we go to self-rate, we remember the struggle to find those examples, and that struggle biases our self-rating downward. With only 6, we struggle less, so that experience is less available and therefore does not bias our judgment.
If we turn to judging things like the risk of a stranger abduction or other childhood dangers, one of the main sources of available information is media coverage. If you don’t know the real statistics, you will grab for information that is available, and that’s often that story you saw on TV about the little girl who was kidnapped from her bedroom (or that plane that crashed last week, or that story about some rare disease). As Lenore points out early in Free-Range Kids, the media, both through the “news” and dramas like CSI, focus on the extreme and on the rare, and because our experience of the media dominates our own store of recent information, it feeds right into the availability heuristic/bias. Looking to make a judgment about our children’s safety, we are ripe for falling for that cognitive bias.
The availability bias can lead to what some behavioral economists have termed the “availability cascade.” Media stories that create exaggerated perceptions of risk start generating emotional reactions in the public, and those emotional reactions themselves become newsworthy. The media coverage intensifies and the issue becomes a major cultural concern. As various media outlets compete for attention, the risk involved gets exaggerated even more and people begin to dredge up other available examples of what seems to be the same phenomenon. Those who try to provide evidence of the real risk are shouted down, and often labeled “cold-hearted.” It’s but a short step from this point to calls for political action, and the result is often wasteful and misguided public policy based on a false perception of risk. One need only think of the overwrought media and policy-maker reactions to child abductions and related events, which fit this pattern perfectly. In the context of parenting, it is what I have called in other places “Helen Lovejoyism,” after The Simpsons character who interrupted every discussion of public issues with, “What about the children?!” regardless of its relevance.
Parenting issues can add one more element to the availability cascade. Parents react to the misperception of risk by engaging in costly “preventative” steps that, like bad public policy, end up harming the very people they are intended to help – their kids. The availability bias leads to over-protected kids, denied important skills they need to effectively navigate the adult world. Plus, these misperceived risks put inordinate demands on parents’ time for very little in the way of benefit, creating stressed-out parents and kids. The very reason a Free-Range Kids movement is needed is to push back against the availability bias and the way that availability cascades lead to bad policy and overprotective parenting.
So is the availability bias inevitable? It’s not. The availability heuristic will always be with us, as our brains want to take those normally very useful shortcuts. What we can do, however, is be aware that this heuristic can become a bias. We need to engage what the psychologist Daniel Kahneman (who also has a Nobel Prize in Economics) calls, in his marvelous book Thinking, Fast and Slow, our “System 2.” That’s the part of our brain that slows us down and double-checks the automatic reactions of “System 1,” which is the part that uses all of those heuristics.
We need to train ourselves to check our biases and think twice about our decision-making, especially when it comes to thinking statistically. So much of Lenore’s work, and the Free-Range Kids movement in general, is about getting parents to think about their kids and the risks they face using System 2. So be aware of the shortcuts your brain takes. Stop yourself when you are thinking about risks and other statistical questions and ask if you’re falling for the availability bias. Do a little searching, find information beyond what first comes to mind, and see if those risks or data are really what they first appear to be. The heuristics of System 1 are invaluable, but they are not infallible.
Good parents should be less worried about constantly double-checking their kids’ homework and more worried about using their System 2 to double-check the homework of their System 1 shortcuts for cognitive biases like the availability bias. You’ll stop a lot of bad public policy, and be a more relaxed parent and happier person, if you do.