“If the doors of perception were cleansed every thing would appear to man as it is, Infinite. For man has closed himself up, till he sees all things thro’ narrow chinks of his cavern.” ― William Blake
This is part-6 of the multi-part series on cognitive biases and how to get beyond them. Here are Part-1, Part-2, Part-3, Part-4, and Part-5.
Here are the biases discussed so far:
1. Confirmation Bias
2. Hindsight Bias
3. Negativity Bias
4. Impact Bias and The Inaccurate Simulator
5. The False Consensus Bias or “most people are like me bias”
6. Attention Bias And The Tunnel Visioning Effect
7. Optimism bias Or the Wishful Thinking bias
8. Distinction bias
9. Anchoring bias
10. The Endowment Effect
11. Functional Fixedness
12. Projection Bias
13. Information Bias
14. Consistency Bias
15. Social Comparison Bias
Let us get on with the cognitive biases:
“You will always define events in a manner which will validate your agreement with reality.” ― Steve Maraboli, Life, the Truth, and Being Free
16. Availability Heuristic
“It is easy for us to criticize the prejudices of our grandfathers, from which our fathers freed themselves. It is more difficult to search for prejudices among the beliefs and values we hold.” ― Peter Singer, Practical Ethics
The availability of certain examples and ideas in our minds can bias our thinking to attach more importance to them.
When we are making a decision or thinking about a concept, mental shortcuts and top-of-the-head examples can influence and skew our conclusions.
This bias operates on the principle that available and easily recalled information is more important.
It also implies that information that we cannot recall is not as consequential.
In a classic 1973 research paper titled Availability: A Heuristic for Judging Frequency and Probability, published in the journal Cognitive Psychology, Tversky and Kahneman first described this heuristic.
They say:
“A person could estimate the numerosity of a class, the likelihood of an event, or the frequency of co-occurrences by assessing the ease with which the relevant mental operation of retrieval, construction, or association can be carried out.”
The authors describe inconsistencies such as research subjects believing that certain coin-toss sequences are more probable than others. Respondents may feel and believe that the sequence HTTHTH is more probable than HHHHTH or HHHTTT.
Why is this so when all sequences of the same length are equally probable?
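This intuition is easy to check with a short simulation. The sketch below is a minimal illustration (not from the original study): it generates random fair-coin sequences and estimates how often each of the three sequences above appears. All three converge to the same frequency, (1/2)^6 ≈ 1.6%.

```python
import random

random.seed(42)  # fixed seed for reproducible runs

def sequence_frequency(target: str, trials: int = 200_000) -> float:
    """Estimate how often a specific coin-toss sequence occurs
    among `trials` random fair-coin sequences of the same length."""
    hits = 0
    for _ in range(trials):
        toss = "".join(random.choice("HT") for _ in range(len(target)))
        if toss == target:
            hits += 1
    return hits / trials

# Every specific 6-toss sequence has probability (1/2)**6 ≈ 0.0156,
# no matter how "patterned" or "random" it looks to us.
for seq in ("HTTHTH", "HHHHTH", "HHHTTT"):
    print(f"{seq}: {sequence_frequency(seq):.4f}")
```

The availability heuristic is what makes HTTHTH feel more "typical": mixed sequences resemble our mental picture of randomness and come to mind more easily, even though the math treats all three identically.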
The authors also note that subjects judge how likely a person is to be in a certain occupation based on stereotypical ideas about who is representative of that occupation.
So what do we do:
- We can compare the features of an event with its originating structures. This comparison can establish similarity and dissimilarity.
- We can estimate the probability of an event by the ease of availability or what the authors call “associative distance.”
As an example, the authors say that instances of larger, more frequently occurring classes are recalled faster and better than instances of less frequent ones.
What we think of as likely can be imagined with greater ease than what we think of as unlikely.
Also, when two events co-occur, we are more likely to form associations and connections between them.
Another example that the authors provide is divorce prediction.
Let us say that you are assessing the divorce rate in a community, or the possibility that a certain couple will end up divorced. You scan your memory for available and similar cases.
Depending on your retrieval of information, divorce will appear more or less possible in this particular case.
In fact, we may even construct mental scenarios and then ask whether divorce is likely. How plausible such scenarios are, and how easily and quickly they come to mind, will influence our final judgment.
The authors add:
“A person is said to employ the availability heuristic whenever he estimates frequency or probability by the ease with which instances or associations could be brought to mind. To assess availability, it is not necessary to perform the actual operations of retrieval or construction. It suffices to assess the ease with which these operations could be performed, much as the difficulty of a puzzle or mathematical problem can be assessed without considering specific solutions.”
Based on ideas from the study and my personal experience, I have a few ways to avoid this bias.
Action Tips:
1. Become aware that freshly available information is easier to retrieve.
2. Recognize that you may be attaching more importance to easily available information and choices.
3. When events or ideas co-occur in our thinking, we attribute stronger associations to them. For example, peanut may evoke butter faster than, say, chutney or sauce.
4. When making an important choice or decision, make sure that you dig deeper into other possible reasons, causes, and ideas.
5. When brainstorming, set aside the first several ideas until your mind runs dry. The ideas that come after will form novel associations rather than ones skimmed off the top of your head.
6. Realize that favorites and hardwired opinions bias us towards certain events and ideas. In that case, you rationalize why the easily retrievable and available option is the best one.
7. Look for, seek out, and allow conflicting information. Remember that co-occurrence or correlation does not mean there is a strong cause-and-effect relationship between the events. This is the classic “correlation is not causation” idea.
“The brain is wider than the sky.”- Emily Dickinson
“But I think that no matter how smart, people usually see what they’re already looking for, that’s all.” ― Veronica Roth, Allegiant
17. Pessimism Bias
“What you see and what you hear depends a great deal on where you are standing. It also depends on what sort of person you are.”― C.S. Lewis, The Magician’s Nephew
This is the opposite of the optimism bias which we discussed before. In previous posts, I discussed the negativity bias.
The negativity bias is the hardwiring of the human brain for negative events. Negative events also have a “stickiness” effect and stay longer in the mind.
Studies have found that in relationships, five positive interactions are needed to counter the effect of one negative one.
We have also discussed the optimism bias. The optimism bias is the overly optimistic and unrealistic view that people can hold, often in comparison to others.
Research has shown that the optimism bias has a reverse: the pessimism bias.
In this bias, people believe that they are less likely to be the recipients of positive results.
They may also believe that they are more likely to experience negative outcomes in comparison to other people.
Now you might be thinking that this is only natural.
There are optimistic and pessimistic people whose beliefs reflect their world views.
But the question is: under what conditions in a social comparison setting do these biases manifest? And, more importantly, can we reverse or reduce them?
“All that we see or seem is but a dream within a dream.” ― Edgar Allan Poe
In a research study titled Biases in social comparisons: Optimism or pessimism?, authors Menon, Kyung, and Agrawal ask when these biases manifest.
The study, from the journal Organizational Behavior and Human Decision Processes, was published in 2009.
The authors say:
“The focus of our paper is to determine conditions under which one might expect to observe a comparative optimism or pessimism bias with the goal of gaining a better understanding of the antecedents of such biases in social comparison.”
The study shows that comparative optimism in a social setting is often the result of people’s perception that they have control over the outcome.
Pessimism in a comparative setting manifests when we feel and believe that we have less control over our outcomes.
The study also shows that these biases can be reduced when subjects perceive themselves to be similar to a comparison target.
A comparison target is an average person; for example, an average undergraduate student in a study of undergraduates.
This reduces the bias, again, through a change in the perception of control that people experience in a particular domain.
I think that these results make a lot of sense. Optimism and pessimism biases may be the result of isolated or insular thinking and belief processes.
When we see that the average person is not different from us, we may shift our beliefs to reflect that understanding.
In my experience, extreme pessimism is deeply isolating and there is no feeling of connection and similarity with others.
The study also showed that these biases have implications for motivation in the workplace.
The authors also say:
“We found that in situations encouraging comparative optimism, people are motivated to work harder to attain positive outcomes when the dissimilarity between self and the comparison target is enhanced. In contrast, in situations encouraging comparative pessimism, people are motivated to work harder when the similarity between self and the comparison target is enhanced. We also demonstrated that perceptions of control over the outcome for self and the comparison target vary in ways to support the two biases.”
A few of my Action Tips:
1. Become aware of the pessimism bias operating in your life.
2. Pessimism may manifest in a comparative social setting if you perceive a lack of control over the outcome. Look for ways to increase your perception of control through active participation.
3. Establish meaningful connections and comparisons to peers and similar groups to reduce the biases of excess unrealistic optimism and relentless pessimism.
“We are so made, that we can only derive intense enjoyment from a contrast and only very little from a state of things.”- Sigmund Freud
“Perception is subjective.” ― Toba Beta
18. The Clustering Bias and The Texas Sharpshooter Fallacy
“The wise do not buy into other people’s perceptions of who they are and what they are capable of. Instead, they bypass a person’s public persona and see who they are in their highest expression. When you see actions taken with integrity, instead of words only, you will then know a soul’s worth.” ― Shannon L. Alder
Sometimes, we find patterns in a larger set of random data.
These patterns might seem to confirm a theory or an argument that we have been hoping to confirm.
When we attribute causes to random clusters and believe in a theory when in reality there is none, we are engaging in this bias.
We have all seen forwarded posts or emails that find eerie and unbelievable similarities between random events.
We believe that there are synchronicities or mysterious forces at play. But in reality, the similarities may be getting highlighted and the dissimilarities ignored.
A practical way this bias manifests is the Texas sharpshooter fallacy.
The fallacy takes its name from a Texan legend about a man who randomly shot his rifle into the side of a barn. Where the bullet holes clustered, he painted a bullseye around them.
Everyone who saw the bullseye was impressed at what a great shot he was.
They did not know that he had imposed artificial order on a random set of bullet holes, thus skewing the results in his favor.
This bias is common in epidemiology research. Disease researchers are wary of so-called “disease clusters.”
A community with a higher-than-average rate of a disease such as cancer looks for possible environmental or other causes. When they look, they often find one.
Now, some of these cases may indeed reflect true causation, but many suffer from the clustering bias.
A New Yorker article published in 1999, titled The Cancer-Cluster Myth, says:
“A community that is afflicted with an unusual number of cancers quite naturally looks for a cause in the environment—in the ground, the water, the air. And correlations are sometimes found: the cluster may arise after, say, contamination of the water supply by a possible carcinogen. The problem is that when scientists have tried to confirm such causes, they haven’t been able to.”
The article notes that Raymond Richard Neutra, a cancer-cluster expert and California’s environmental health investigator, points out that hundreds of published cancer-cluster studies have failed to identify an indisputable environmental cause.
Even worldwide, cluster investigations have identified only one previously unrecognized environmental carcinogen.
The case was in Karain, a Turkish village with a high incidence of mesothelioma, a rare cancer of the lining of the lungs. The cancer was traced to erionite, a mineral abundant in the area.
“Every man takes the limits of his own field of vision for the limits of the world.” ― Arthur Schopenhauer, Studies in Pessimism: The Essays
What are some of the hallmarks of the clustering bias and how should we get beyond it?
1. Your mind tries to find order and structure where none exists. Ask whether the order holds up when you look at the data in a different way.
2. You begin to attach more weight to random events believing that there is an underlying commonality or connection.
3. You disregard and avoid the obvious differences. What happens when you take the differences into account? Does it make your theory weaker?
4. Your mind seeks resolution of the random chaos into something meaningful and significant. Instead of seeking resolution, accept that some events are random. There may be no underlying patterns in the chaos.
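The cluster illusion is easy to reproduce. The sketch below is a hypothetical illustration (the grid size and case count are my own assumptions, not drawn from the article): it scatters 200 simulated "cases" uniformly at random across a 10×10 grid of communities. Even though every cell has the same expected count, a few cells reliably end up with several times the average, looking like hotspots worth painting a bullseye around.

```python
import random
from collections import Counter

random.seed(7)  # fixed seed for reproducible runs

GRID = 10    # a 10 x 10 grid of equally likely "communities"
CASES = 200  # total simulated cases, placed uniformly at random

points = [(random.randrange(GRID), random.randrange(GRID)) for _ in range(CASES)]
counts = Counter(points)

expected = CASES / (GRID * GRID)  # 2.0 cases per cell on average
# Cells with at least double the expected count look like "clusters",
# even though placement was purely random.
hotspots = {cell: n for cell, n in counts.items() if n >= 2 * expected}

print(f"expected per cell: {expected}")
print(f"apparent clusters (>= 2x expected): {hotspots}")
```

Rerunning with a different seed moves the hotspots to different cells, which is exactly the point: painting a bullseye around whichever cells happen to be busy this time is the sharpshooter fallacy.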
“The most perfidious way of harming a cause consists of defending it deliberately with faulty arguments.” ― Friedrich Nietzsche
Now over to you. Let me know in the comments below if these biases sound familiar and how you get beyond them.
If you liked this post, why don’t you sign up for free updates? Please use the form below or on the sidebar and I will see you soon with a sparkly new post from LaunchYourGenius delivered straight to your inbox!