
Cognitive biases

The human brain is the best tool we have for figuring stuff out. Really, it's the only tool we have. Computers can perform millions of calculations quickly and AI can make amazingly accurate predictions, but to truly understand something takes a brain.

Unfortunately, brains are imperfect for a couple of reasons. First, the default human brain is the product of millions of years of evolution shaped by environments completely unlike today's world. The human brain evolved to excel at tasks that may have little to do with functioning well in the modern world. Second, each individual brain is also shaped by the idiosyncratic and limited experiences of its cultural, social, and educational history. These two forces combine to produce a very powerful, but sometimes flawed, tool for figuring stuff out.

Luckily, psychologists have been studying human thinking for decades and have made great progress in identifying a number of specific flaws or biases our brains seem to possess. These biases are called "cognitive biases" and can be quite varied in their effects.

We here at StatsExamples like to think of statistics as a method for discovering truth - when we have data, statistics helps us understand the data clearly. The quantitative approach allows us to base our beliefs about the world on data and probability instead of just opinion or wishful thinking. By itself, statistics excels in finding signal within noise, but it is less effective at dealing with systematic biases. To the extent that we are using statistics because we want to figure out what is real and what isn't, a good understanding of the cognitive biases that shape our thinking and interpretations of data is important. The best way to defend ourselves from errors caused by our own faulty brains is to understand how they go wrong.

For this reason, we have compiled a list of what we think are the most important cognitive biases.


1. Confirmation Bias


The rest of the biases in this list are listed alphabetically, but we put this one first because we think of it as an uber-bias. To paraphrase Tolkien, "One bias to rule them all and in the darkness blind them."

Confirmation bias makes us favor facts and arguments that match (i.e., confirm) what we already think. Sometimes this can be willful; since most people don't want to admit to themselves or others that they've been wrong about something, they avoid paying attention when presented with information that does just that. But this isn't just willful denial: studies have shown a real and unconscious bias in how well we comprehend and recall information depending on whether it agrees with or contradicts what we already think.

We've all seen this in other people who have a great memory for the negative sides of our favorite celebrities or politicians while simultaneously seeming to forget or ignore the negative sides of theirs. They seem so biased, but this bias isn't always deliberate dishonesty; it's often that their memories really are better for some facts and worse for others, outside of their conscious control. And it's not just "them" ... cognitive biases also exist within "us" and we need to remember this.

We like to think of this as the uber-bias because it can combine with and magnify the effects of all the other biases listed here.

To protect ourselves against confirmation bias, the best defense is to be especially careful when reading or watching anything that we find ourselves easily agreeing with. This goes double for anything that is using emotionally charged language to make fun of or criticize people with opinions different from ours. It is the very things we think we know that we should most constantly question.


2. Anchoring Effect


The anchoring effect is a bias that arises from our tendency to compare options to one another and rank them. It's called the anchoring effect because we often choose one option as the "anchor" and compare the rest to that single value or item. We then lose the ability to objectively evaluate the items since they are seen relative to the anchor.

This bias is exploited by many businesses by placing an option at the cheap end (which we avoid because of concerns about quality) and another at the high end (avoiding this one means we "save" money) to guide us to the middle options, which we think of as a good trade-off of price and quality. We feel this way even if they are priced higher than we would pay for them if we saw those prices on their own because, relative to the anchors, they are "better" options. This bias is used by car dealerships when they sell cars. Once you agree to spend $25,000 on a car, which may be a reasonable price, the $300 for floor mats seems cheap. If you saw the same mats for sale in a store on their own, however, you would never pay that much. Next time you're buying a car, look carefully at the prices of the options and think about whether they are really worth that much.

As another example, when looking at health-related statistics we can be misled about the impact of some conditions or diseases if we see them presented side-by-side with other statistics. For example, bladder cancer kills around 17,000 people a year in the US. If we are reminded that this is more than five times the number of people killed in the 9/11 attacks in 2001, then we will tend to see this value as very high and be more likely to think of bladder cancer as a major health problem. On the other hand, if we are reminded that approximately ten times more people die from lung cancer each year than from bladder cancer, then we will tend to see this value as smaller and be less likely to think of bladder cancer as a major health problem.

To protect ourselves against the anchoring effect, also sometimes called the "relativity trap", we should be sure to mentally step back and think about the value. Is the amount we are looking at high or low in an absolute sense, or does it just seem that way relative to values that may be designed to mislead us?


3. Authority Bias


The authority bias arises from our innate willingness to respect and believe people in positions of authority. For most of our history, this bias would have usually been a good strategy since authority figures were often elders with more experience and access to education or knowledge denied to many. Doing what these people say without question would have often been the best action. This bias can lead to two major problems however - manipulation and false attribution of knowledge.

A number of experiments have shown that we defer to confident authority figures and will follow their instructions, even to the point of violating our own ethics. In Stanley Milgram's famous experiments, ordinary volunteers were convinced to administer what they believed were dangerous, potentially lethal electric shocks to another participant just because an authority figure told them to. Obedience to authority can also explain why so many otherwise ordinary people commit atrocities in times of war (e.g., the Nazis were terrible and many Germans were Nazis, yet most Germans were otherwise ordinary, decent people). Obviously this deference can also extend to perceptions of reality when an authority figure claims that the world is a certain way, possibly even overruling direct evidence to the contrary.

A related aspect of this bias arises from attributing knowledge to authorities in fields other than ones they specialize in. For example, famous singers obviously know a lot about performing and music so their opinions on these topics should therefore be considered seriously, but that doesn't mean they know much about nutrition or politics. When we transfer our valid trust in someone's knowledge in some fields to others we end up over-valuing the claims of people with potentially zero knowledge.

To protect ourselves against authority bias, we need to think logically about the claims of the authority and what basis they have for their authority. Are they asking us to do something we would not usually do? Are they making claims about things they genuinely know about, or are they talking about things outside their true expertise? Are they deliberately using their position of authority to prevent us from objectively evaluating the questions, data, or action being discussed?


4. Bandwagon Effect


This bias is similar to the authority bias, but instead of a single individual acting as the guide it’s the group. When a bunch of people believe something we have a natural drive to join in and agree - think of how people choose their favorite sports teams or follow certain religions. Just like the authority bias, this logical flaw would have been useful for most of history because groups following established norms usually made correct decisions when life was fairly consistent. These days however, when issues are more complex and the world is constantly changing, traditional norms no longer act as such accurate guides. Groups can also be intentionally misled using media so this bias can become a serious problem.

And this bias is powerful: tons of studies have shown that our behaviors are strongly influenced by what we think everyone else is doing. For example, if all the other passersby ignore a collapsed man on the street, we are more likely to do so too compared to when we are alone and come across someone in distress. Part of it is a diffusion of responsibility, but some of it is also a sense that if everyone else is doing something then there must be a good reason for it. One of the shocking aspects of Milgram's famous psychology studies was how adding complicit actors could drive the compliance rate of study subjects (who were essentially being asked to kill innocent people) almost all the way to 100% - the bandwagon effect can make murderers out of almost all of us.

Where this can interact with statistical studies is when we design experiments or interpret data. To get to the point where we are using these tools usually means becoming part of a professional or educational group; being able to “think outside of the box” and go against orthodoxy can be difficult even if our data suggests that we should. Furthermore, convincing others of novel ideas can be difficult if we go against the bandwagon beliefs or we come from outside their group, no matter how good our data or statistics.

To protect ourselves against bandwagon bias, we need to realize that while the wisdom of crowds can be useful for some things it can also go very wrong. When we find we have a strong opinion about something, we need to think carefully about whether it is based on data and logic or whether we feel that way just because everyone else does.


5. Conservatism Bias


This bias is one in which we systematically misremember and misestimate very low or high rates or frequencies. In each case we tend to think that the true value is closer to the middle than it really is - i.e., higher for rare events and lower for common events. This bias is also sometimes called the "regression bias" and it is like a coin with two sides.

First side of the coin: we tend to remember rare events as more common than they really are. For example, if we are dealing with dozens of customers every day and one of them makes a very unusual order, we may later overestimate how common that order is because it sticks out in memory while the routine orders blur together. Similarly, if you asked NFL football fans how common ties are, they would likely overestimate the actual value of approximately 1 in 500, which is less than 0.2% (from 1999-2018 the actual rate was 9/5120 ≈ 0.18%). This is exacerbated when unusual and memorable events occur, like the ties in consecutive weeks in 2018 (the occasional cluster is expected in a Poisson process) - which even led many people to complain about ties becoming too common in football.
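To see just how rare ties really are, here is a quick back-of-the-envelope Python sketch using the numbers above; the ~256 regular-season games per year and the Poisson treatment are our own simplifying assumptions.

import math

# Observed numbers quoted above: 9 ties in the 5120 games from 1999-2018.
tie_rate = 9 / 5120                          # about 0.18% of games end in a tie
games_per_season = 256                       # assumed regular-season games per year
expected_ties = games_per_season * tie_rate  # about 0.45 ties per season

print(f"Per-game tie rate:        {tie_rate:.4f}")
print(f"Expected ties per season: {expected_ties:.2f}")

# Treating the number of ties per season as roughly Poisson, most seasons
# should have none, and a season with two ties is uncommon but not shocking.
p_none = math.exp(-expected_ties)
p_two_plus = 1 - p_none - expected_ties * p_none
print(f"P(no ties in a season):          {p_none:.2f}")
print(f"P(two or more ties in a season): {p_two_plus:.2f}")

Under these assumptions roughly two seasons in three should have no ties at all, and a two-tie season like 2018 should turn up every decade or so - rare, but nowhere near as rare as it feels.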

Other side of the coin: we tend to remember common events as less common than they really are. For example, if our commute is usually easy and we are asked about it, then we will remember the traffic jams that occur and place a lower value on how often it was easy. Similarly, the coffee shop that "always messes up our order" gets it right far more often than we think because we are remembering the rare mistakes.

By now we can see that our memories for rare events are what is driving this bias. Each rare event is pushing our perception of the rate or probability towards the center disproportionately. This is because the far more numerous, but similar, outcomes are far less individually memorable. When we do our mental calculation, since we have a hard time imagining very small numbers, we round up the tiny value.

To protect ourselves against the Conservatism/regression bias we need to stop and deliberately think about the common outcomes when asked about the unusual ones. We should do the math and use one of the biggest benefits of crunching numbers, its bias-piercing power. When we are forced to identify the unusual events and then add up all the common outcomes we will often see that what we thought was merely uncommon is very rare, and what we thought was just usually true is almost always true.


6. Current Moment Bias


This bias is also called the present bias and describes how our minds often focus better on the current moment rather than the future. We tend to overestimate possible benefits or costs if they will come immediately versus ones that come later. For example, most people understand that a healthy diet is good for us and will lead to greater overall happiness via better health, but in the current moment it is often difficult to balance that future benefit of better health against the current happiness a piece of cake offers. Likewise, when we decide whether to spend money now or save for the future, this bias makes us less likely to save.

This comes into play when using statistics by biasing our perceptions of benefits. Suppose we gather solid evidence that a crime prevention program will reduce the number of thefts 5 years from now by 100, but it would need to be funded by reducing surveillance, which will lead to an extra 20 crimes now. The current moment bias will cause us to resist favoring the crime prevention program. We would get some crimes either way, but sticking with the surveillance program results in five times as much crime overall (100 thefts later versus 20 crimes now); we're biased toward that option anyway because the immediate effects are more apparent and emotionally persuasive. This is more than just strategic decision-making by people who may not be able to take credit for future benefits (e.g., mayors whose successors benefit from these decisions); it is a genuine, honest feeling that more immediate benefits are better than equivalent or even greater future benefits.

To protect ourselves against current moment bias, we need to keep in mind that the future always comes and the best decisions for the average over a longer period of time are the ones we should make. We can help motivate ourselves by thinking about how happy we will be with ourselves in the future or how wise and thoughtful we will seem to others when the time comes.


7. Dunning-Kruger Effect


The Dunning-Kruger effect is a bias in which people overestimate their knowledge about a subject more when they actually know less. In other words - the less you know, the more you think you know. The name comes from a 1999 study by two psychologists named Dunning and Kruger (surprise) which showed that the people who performed poorly on tests of grammar or logic estimated that they had done well when asked about their own abilities.

The most insidious aspect of this bias comes from what it results in at the group level. We tend to believe people with confidence in their opinions (i.e., the authority and bandwagon biases above) and the Dunning-Kruger effect causes the people with the least information to be the most confident. The net effect of this can be debates driven by the opinions and conclusions of the exact people who should be given the least weight.

To protect ourselves against the Dunning-Kruger effect, we should do two things. First, whenever we think about things outside of our own sphere of knowledge and expertise we need to be aware that our ideas may be just as inaccurate as when outsiders come into our field. We all know how dumb “they” can sound when they are talking and clearly don’t understand something basic without realizing it; we need to realize that “we” may be that person when we talk about things we have only casually studied. Second, when we are listening to the claims of others we should respect training, knowledge, and credentials instead of passion and confidence. We do need to be cautious of automatically accepting expert opinions (i.e., the authority bias), but the odds are that they do know more about the topic than people without appropriate professional credentials or a proven track record in the field.


8. Gambler's Fallacy


The gambler's fallacy is a cognitive bias where we feel that previous events influence future ones even when we know they are unrelated. For example, we know that flipping a coin is a 50-50 heads-or-tails result unrelated to what happened to the coin in the past, but if we've just flipped a coin four times and gotten heads four times we feel as if the next flip is more likely to be tails ... to even things out. This feeling biases our expectations. The name of the bias comes from the idea many gamblers have that if they've had a series of losing bets then they're due for a win.
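If that feeling is hard to shake, a quick simulation can make the point concrete. Here is a short Python sketch (our own illustration, not from any study) that generates runs of five fair flips, keeps only the runs that begin with four heads, and checks how often the fifth flip is heads:

import random

random.seed(1)
fifth_flips = []
for _ in range(1_000_000):
    flips = [random.random() < 0.5 for _ in range(5)]   # True means heads
    if all(flips[:4]):                                   # first four flips were all heads
        fifth_flips.append(flips[4])

print(f"{len(fifth_flips)} runs started with four heads")
print(f"the fifth flip was heads {sum(fifth_flips) / len(fifth_flips):.3f} of the time")

The answer hovers around 0.5 no matter how many runs we simulate - the four heads already flipped tell us nothing about the fifth.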

In addition to harming many gamblers, this bias can have real-world effects like people becoming unduly upset when their expectation is violated. If a couple has four sons and tries for a daughter, they may feel that they are due a daughter and be surprised when they have another son since it feels too unlikely, but of course having four sons doesn’t make the next child more likely to be different - the new fertilization has nothing to do with ones that took place a year or more earlier. In fact, this bias will cause any string of “bad luck” to make someone too optimistic about future events - and vice versa.

This fallacy only applies to independent events; if there is some way for the situation to be biased (e.g., maybe some genetic factors predispose couples to have more males or females) then past events may provide information about the probability of future ones. In that case it is the unknown cause of non-independence that increases the chances of us seeing again what we've already seen. This can make the inaccurate predictions arising from the gambler's fallacy even worse if we misunderstand that interaction.

To protect ourselves against the gambler’s fallacy, we need to keep reminding ourselves that the events are independent of one another. Streaks may occur, but each new event is a new beginning. We need to evaluate each choice or probability objectively on its own and put the past behind us.


9. Hindsight Bias


The hindsight bias is a cognitive bias where we overestimate how predictable events in the past were. After the fact, we often think we knew something was going to happen even though we had no idea. This bias may even alter our memories to the extent that we recall predicting things we did not.

For example, in retrospect the 2008 economic crisis seems like it was bound to happen. Housing prices in the US were increasing extremely quickly and loans were being given to anyone with a pulse. It looks so obvious now and lots of people claim they saw it coming. But we know this isn't true because so many knowledgeable people made disastrous investing decisions they would not have made if it had genuinely been that clear. In another example, doctors often feel that they knew their patients' outcomes all along even though their actual prognoses are typically cautious and based on overall averages (citation link).

A real danger from this bias is that it makes us think we know much more than we really do. We forget how uncertain we were and recall ourselves as much smarter than we really are. This interferes with our ability to learn from our mistakes and can even contribute to other biases like the Dunning-Kruger effect described above.

To protect ourselves against hindsight bias, we need to realize that we are magnifying our perception of what we knew and when we knew it. We should ask ourselves, “did I really know that at the time or just feel confident now?” When we use statistics and data to make predictions we are always faced with uncertainty and we need to stay aware that the best we ever do is make predictions based on probability - we shouldn’t forget our limited knowledge when looking back.


10. Ingroup Bias


The ingroup bias is one in which we favor individuals who are members of the same group we belong to. In addition to being nicer to them, we are also more likely to agree with the things they say and believe. This logical flaw would have been useful for most of history because favoring individuals we interact with often results in returned favors and better social cohesion (i.e., we would be less likely to be ostracized). And because, for most of history, outsiders often brought warfare and disease, there has been a powerful biological drive for us to favor the members of our group, and their beliefs, above those of others.

This bias becomes especially problematic if the popular opinion of our group goes awry, leading us down a path to having incorrect beliefs.

Although racial and ethnic groups are the ones that most quickly spring to mind, any group that we feel a kinship with is fair game. In fact, the term kinship speaks to the strength of the ingroup bias - membership in these groups can sometimes feel as strong as family. Studies have even shown that we experience hormone surges resembling those seen in loving relationships just from being with members of our groups, and these hormonal responses can sway our judgement.

Political parties, religions, and tribes are also groups that can create strong ingroup bias. Even something as trivial as being a fan of a sports team can do this - there are plenty of famous examples of controversial plays where the fans of the opposing teams see reality (e.g., a goal or foul) differently.

To protect ourselves against ingroup bias, we need to recognize when we feel this kinship and consciously prevent this emotional feeling from clouding our rational judgement. Members of every group imaginable can be wrong and we shouldn’t give the people in our groups extra credibility benefit just because of their membership. A related strategy is to think of ourselves as members of much broader groups. If we avoid thinking of ourselves as “American” or “Chinese” and instead think of ourselves as “Human” then we can shield ourselves from reflexively agreeing or disagreeing with “American” or “Chinese” people and focus on their data and logic instead.


11. Loss Aversion


Loss aversion is a bias in which we tend to emotionally overvalue losses and undervalue comparable gains. For example, people will generally prefer to avoid losing $50 rather than gain an extra $50. In other words, if we could measure happiness on a scale it would decline more when a person loses $50 than it would increase when they gain $50.

An example of this comes from a famous study of insurance company premiums in which the researchers studied rates of customer switching after premiums were increased or decreased. Obviously increases caused more customers to switch companies and reductions caused fewer to leave, but the effect wasn't equal - a given premium increase was twice as likely to cause a switch as the same amount of reduction was to prevent a switch.

This bias can be exploited when people make decisions against their economic interest just to preserve what they have. A person may be more willing to take a new employment contract that gives them a $100 raise while leaving their insurance costs the same versus one that gives them a $200 raise while increasing their insurance costs by $50. In the second case they come out $50 better than the first, but they "lose" $50 in the second option whereas in the first they aren't "losing" anything, so they prefer the first one.

As another example, companies can use free trials to exploit this bias. Many people who would not pay $10 for a certain service or item would be willing to pay $10 to avoid losing it. This technique isn't just about giving you the chance to see if you like the product, it is also a way to shift the mental calculation from paying to gain the product to paying to avoid losing the product. The power of this shift is part of why this is such a classic marketing technique.

This bias is related to another irrational behavior that can lead people on a road to ruin. It involves a related concept called "sunk costs", which describes the costs you've already paid for something and can never regain. If we are being logical, we should always make decisions about spending time or resources on something based on the future expectation - what we will pay versus what we may get. But if we have already paid a lot, then these past sunk costs mentally become part of the calculation. Imagine we are gambling on a game with a 1/20 chance of winning and a 15x payout. Each time we play we should balance the 1/20 chance of winning against the 15-to-1 reward. But once we've made a few losing bets, we tend to make our future decisions based on not wanting those previous bets to have been wasted for nothing. Our fear of losing those past bets magnifies the perceived benefit of winning to more than the 15-to-1 payout and encourages us to keep playing.
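To see why chasing losses can't work, we can write out the expected value of the hypothetical game above. Here is a minimal Python sketch; we are reading "15x payout" to mean a winning $1 bet returns $15 in total, which is our assumption about the example rather than a precise rule:

p_win = 1 / 20
payout = 15

ev_per_dollar = p_win * payout - 1        # expected return minus the $1 stake
print(f"Expected value of each $1 bet: ${ev_per_dollar:+.2f}")

# The expectation is the same on every play; past losses don't improve it, and
# one more bet is expected to leave us deeper in the hole, not closer to even.
for losses_so_far in [1, 5, 20]:
    expected_total = -losses_so_far + ev_per_dollar
    print(f"down ${losses_so_far} so far -> expected total after one more bet: ${expected_total:+.2f}")

Every bet has the same negative expectation, so the sunk losses change nothing except how much we have already lost.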

To protect ourselves against loss aversion we need to realize how often companies try to manipulate us with this and get angry. The sum of life is the total of our losses and gains; we should value them equally. When making decisions we should make sure that we aren't settling for a lower overall result because we are accidentally overvaluing the negative side of the equation. We shouldn't let companies win because they are exploiting our emotions.


12. Negativity Bias


The negativity bias is one in which we pay closer attention to bad news and find it more memorable and important than good news. For example, each news story we hear about a murder makes us more nervous and fearful even though we may also be consciously aware of the fact that violent crime rates are much lower than they used to be. As another example, immigration brings complex issues to a country, but many people tend to focus on the negative aspects instead of the positive.

This psychological bias is related to an idea in risk management called the “smoke detector principle” whereby it is better to err on the side of caution. Smoke detectors are designed to be overly sensitive to danger because 100 false alarms are better than a single failure to respond when fire is present. Similarly, during our species’ history, being overly cautious or fearful would have been advantageous because avoiding 100 positive interactions was worth avoiding a single deadly one. Our brains carry biases which are the product of years of living in an unforgiving environment where ignoring danger could be literally fatal.

This bias does lead to stress and pessimism however. Now that we live in a fairly safe world, overestimating danger leads us to fail to realize how incredibly safe we are. Also, by focusing so much on rare negative things (e.g., terrorism, rare diseases), we may devote more resources to these issues than they warrant based on an objective assessment of the improvement in our lives gained from doing so. Given that life can be improved both by reducing negatives and by increasing positives, when we mentally magnify the negatives we miss many opportunities to gain more benefit from the positives.

To protect ourselves against the negativity bias, we need to recognize when emotion and fear get involved in our thinking. Fear makes us stupid. Classic science fiction authors recognized this when "fear is the mind-killer" was a mantra in Dune and "fear is the path to the dark side" was a caution in Star Wars. Any time we are exposed to a piece of negative information or are weighing positive against negative, we can make better decisions by consciously resisting the emotional power of the negative. Resist emotions and do a genuine, objective cost-benefit analysis for best results.


13. Neglecting Probability


The neglecting probability bias arises from people's natural tendency to ignore the actual probabilities of events and merely think of things as likely or unlikely to happen. This results in a bias going both ways, with rare events considered more likely than they really are and common events considered less likely than they really are. In essence, the range between the extremes of likelihood gets compressed and all possible events are treated as roughly equally likely.

For example, the amount we pay for insurance should be based on the risk of experiencing the catastrophic result it protects us from - i.e., if the event is somewhat likely we should be willing to pay more than if the event is extremely rare. In a study asking people how much they would be willing to pay for insurance, the researchers found that subjects gave the same amounts whether the risks were 1/100,000, 1/1,000,000, or 1/10,000,000, or, in a second trial, whether the risks were 1/650, 1/6,300, or 1/68,000 (citation link). Even as the risks varied by 100-fold, the subjects did not alter how much they would be willing to pay - they were obviously neglecting the probability itself and just thinking about the magnitude of the risky event.
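To get a sense of how much willingness to pay should, in principle, vary with probability, here is a rough Python sketch of the actuarially fair premium at each of those risk levels. The $100,000 size of the insured loss is our own assumption for illustration, not a figure from the study:

loss = 100_000   # assumed size of the insured catastrophe
for denominator in [650, 6_300, 68_000, 100_000, 1_000_000, 10_000_000]:
    fair_premium = loss / denominator      # expected payout = probability x loss
    print(f"risk of 1 in {denominator:>10,}: fair premium is about ${fair_premium:,.2f}")

The fair premium swings by a factor of about 100 across each set of risks - exactly the variation that the subjects' answers failed to show.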

In another example of this phenomenon, researchers connected study subjects to electrodes and measured their willingness to pay to avoid being shocked. The amount people were willing to pay was largely independent of the risk - they were willing to pay $10 to avoid a 99% risk, but $7 to avoid a 1% risk (citation link). Again, the subjects were neglecting the probability itself and just thinking about how bad getting a shock would be.

This bias can obviously be exploited by insurance companies - people seem willing to pay to insure against major risks that are incredibly unlikely. Furthermore, since the amount customers are willing to pay to insure against common risks isn't much higher than whatever their baseline is, insurance companies are better off not even offering these policies since they would not be able to make much money.

This bias can lead to extremely skewed priorities and consequent actions. For example, vaccines protect against potentially common diseases that can be serious, but do have extremely rare side-effects. If the probabilities are neglected then it feels like we are balancing two equal risks, and people become much more likely to avoid vaccinations even though their overall effect is beneficial. Another example is lotteries. Many lotteries have adjusted their games over time to reduce the probability of winning while offering bigger payouts when someone does win. Because customers ignore the probability side of the equation and just focus on the reward side, the games attract greater and greater numbers of players even as the game itself gets less and less favorable to them.

To protect ourselves against neglecting probability, we need to ... not neglect probability. We should never use the mental shortcut that all possible events are equally (or even approximately) likely. We need to think out the true probabilities of risk and reward when we consider buying a warranty or insurance policy. In particular, we need to be clear-headed when doing something that has an extremely low risk of a negative outcome, but a much higher chance of a benefit. We shouldn't deny ourselves good outcomes because of exaggerated fears of bad ones.


14. Observer/Observational Selection Bias


This bias describes how we tend to notice certain things more after we become aware of them. If you've ever had the experience of learning a new, unusual word and then suddenly seeing it show up on a TV show or in a book, this is the bias at play. We notice the new thing so much because we've been primed to see it. As another example, sometimes after we make a purchase (e.g., a certain model of car or item of clothing) we suddenly see more of them around. It's not that they're more common, it's that we're noticing them more. Our perceptions of how frequent things are should be based on their genuine frequency, but this bias throws off our ability to make accurate estimates.

This bias is also called selection bias, but that's a poor choice of terms since selection bias also describes how samples can be skewed (intentionally or not) by non-random sampling and lead to bad inferences. That's a serious bias, but more of a statistical or experimental one instead of a cognitive one.

To protect ourselves against observer selection bias, we need to examine any feeling we get that something has suddenly become more common than we thought it was. The perceived increase in occurrence may be magnified by a previous experience we had. Always considering whether our personal experiences may be influencing our estimates of frequencies is a good habit to get into.


15. Post-Purchase Rationalization Bias


The post-purchase rationalization bias describes the way in which we tend to view decisions we made as superior to the alternatives, even when presented with evidence to the contrary. It gets its name from the observation that after we purchase something we often find it easier to justify why we made the purchase and appreciate its positives rather than accept arguments for why we shouldn't have bought it. This bias is also called the choice-supportive bias, and studies have shown it applies to a wide range of decisions, not just purchases.

For example, this bias can explain how some people hold onto their faith in certain political or religious leaders even as evidence comes out showing them to have serious flaws or they publicly argue for positions contrary to those of the voter or parishioner. Instead of admitting (or realizing) they made a mistake in supporting the leader, many followers will go to great lengths to justify the leader's actions and rationalize why their behaviors are not that bad.

This bias can also explain undying pride in a team or school. Some fans remain supporters of their team year after year as they lose because once someone has chosen a team (i.e., made the emotional purchase or choice) it becomes very hard to change allegiances. Similarly, many alumni of schools retain school pride their entire lives and feel confident that their school is superior to all others, but that is based on almost no information since they obviously didn't experience the alternatives. Once you've committed to spending four or more years (and thousands of dollars) at an institution, this bias prevents you from being objective about how that may have been a mistake and perhaps some other school would have been better.

What makes this a true cognitive bias, instead of just not wanting to consciously admit we were wrong, is the fact that studies have shown people unwittingly alter their memories after making choices. We unconsciously create additional justifications for our choices and forget evidence against them. It's not just wanting to avoid embarrassment that maintains our confidence in our choices, it is our perception of why we made them in the first place.

This bias contributes to people maintaining incorrect beliefs (e.g., flat Earth, anti-vax) even in the face of incontrovertible evidence. When the evidence conflicts with a major decision we have made, this bias adds to the natural resistance to change our minds.

To protect ourselves against post-purchase rationalization we should remember all the times we have been wrong in the past. We have all made bad purchases and decisions we have regretted. Some of these we have realized and admitted to ourselves, but we need to be aware that there are probably many more we are clinging to because of this bias. When faced with evidence contrary to something we believe in we should be humble. We should not assume that our past selves were automatically correct, especially since now we typically have more experience and information than they did.


16. Rhyme as Reason


The rhyme-as-reason bias describes the observation that people find statements that rhyme to be more accurate or correct than statements with the same meaning that don't rhyme. Obviously this is ridiculous, but the phenomenon seems to be real.

For example, a statement like "woes unite foes" is better received than one like "woes unite enemies" even though the meaning is exactly the same (citation link). The saying "an apple a day keeps the doctor away" is much more persuasive than something like "eating apples regularly reduces required medical visits."

The most famous example of this bias being deliberately used comes from the criminal trial of OJ Simpson. During the trial Simpson's lawyer had Simpson try on a glove thought to be worn by the murderer and when he had trouble putting on the glove the lawyer told the jury, "If it doesn’t fit, you must acquit." The point he was making was clear, but he deliberately used a rhyme to increase the persuasiveness of his argument.

There appear to be two main hypotheses about why this effect works. First, rhyming improves the beauty of statements and we all respond more positively to well-crafted, artistic statements. In the absence of data, beauty and aesthetics may be used as a stand-in for judging accuracy. Second, rhyming increases the comprehension and memorability of statements, and we are more likely to agree with things we can easily understand and recall than with ones that take longer to figure out.

We here at StatsExamples would like to propose a third possibility. We hasten to add that, as far as we know, this is our own idea and hasn't been studied. Many people are superstitious or believe in higher powers that influence the world, sometimes even to the extent that they don't believe in coincidence at all. For these people, when a statement rhymes it feels like some special karma or spiritual quality is involved instead of mere coincidence. If it isn't true, then why does it sound so good?

To protect ourselves against the rhyme as reason bias the main tool is awareness. We should not attribute extra meaning to statements that rhyme. If faced with a rhyme, we should rephrase the statement with different words that don't rhyme and see if it still makes as much sense. Also, we should keep in mind that since advertisers know about this phenomenon, we should be wary of slogans and statements that rhyme because that is a possible sign of deliberate manipulation.


17. Zero Risk Bias


The zero-risk bias is one in which we prefer solutions that completely eliminate some risks or costs, reducing the number of risks we face, even if the overall sum of risk could be reduced more some other way.

For example, imagine a situation in which threat A kills 12 people per year while threat B kills 3. There is a choice between a pair of actions: one action reduces the deaths due to each threat by 33% while the other eliminates threat B, but does nothing about threat A. Many people will prefer the second solution even though it is worse overall since the first solution saves 0.33(12+3)=5 lives, whereas the second saves only 3 lives.

Another similar behavior is that of a person paying off two credit cards - one with a $500 balance at 10% interest and a second with a $4,000 balance at 20%. The most economically rational decision would be to make the minimum payments on the first and devote all remaining funds to the second, reducing the highest-rate balance first. However, many people would instead focus on the first balance because paying it off leaves them with just one balance, even though doing so costs them more money overall as they delay paying down the higher interest rate balance.
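To put numbers on that, here is a small Python sketch comparing the two payoff strategies for the cards above. The $200 monthly budget and the $25 minimum payment on whichever card is not the current focus are our own assumptions for illustration:

def payoff(balances, annual_rates, focus_order, budget=200, minimum=25):
    """Return (months until both cards are paid off, total interest paid)."""
    balances = list(balances)
    total_interest = 0.0
    months = 0
    while any(b > 0.005 for b in balances):
        months += 1
        for i, rate in enumerate(annual_rates):              # accrue one month of interest
            interest = balances[i] * rate / 12
            balances[i] += interest
            total_interest += interest
        focus = next(i for i in focus_order if balances[i] > 0)
        remaining = budget
        for i in range(len(balances)):                       # minimum payment on the other card(s)
            if i != focus and balances[i] > 0:
                pay = min(minimum, balances[i], remaining)
                balances[i] -= pay
                remaining -= pay
        balances[focus] -= min(remaining, balances[focus])   # everything left goes to the focus card
    return months, round(total_interest, 2)

for label, order in [("Pay the $500 card first", [0, 1]),
                     ("Pay the 20% card first ", [1, 0])]:
    months, interest = payoff([500, 4000], [0.10, 0.20], order)
    print(f"{label}: {months} months, ${interest:,.2f} in total interest")

With these numbers the pay-the-high-rate-first plan ends up paying noticeably less total interest; that difference is the price of the simpler-feeling option.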

The zero-risk bias seems to arise from a preference for completely eliminating some risks or simplifying the risk profile. In the examples above, the second solution leaves us with just one threat (or balance) whereas the first leaves us with both (or prolongs that state). It's nice to worry about fewer things, but this has very little tangible benefit, and the downside is costing lives or money, which are far more valuable than the benefit of simplicity.

The zero-risk bias may also arise from mental carelessness with numbers. A 100% reduction may feel like more than a 33% reduction when we ignore the magnitudes of the categories we're working with. In the same way that some people prefer black and white solutions, or rely on stereotypes instead of dealing with the complexities of reality or people, the zero-risk bias feeds on a desire for an easy answer.

To protect ourselves against the zero-risk bias we should be sure to calculate overall benefits and compare them. It's okay to value simplicity, but we need to be aware of how many lives or dollars we are paying for that. We may still decide we want to pay off the smaller balance card first, but we need to be aware of the extra money we're paying (or lives we're condemning in the first example above) and think about whether it's really worth it.



This information is intended for the greater good; please use statistics responsibly.