Not enough time
We’re constrained by time and information, and yet we can’t let that paralyze us. Without the ability to act fast in the face of uncertainty, we surely would have perished as a species long ago. With every piece of new information, we need to do our best to assess our ability to affect the situation, apply it to decisions, simulate the future to predict what might happen next, and otherwise act on our new insight.
To act, we must be confident we can make an impact and feel what we do is important
In reality, most of this confidence can be classified as overconfidence, but without it we might not act at all.
Overconfidence effect
Excessive confidence in one’s own answers to questions. For certain types of questions, answers that people rate as “99% certain” turn out to be wrong 40% of the time.
Example – Over 90% of US drivers rate themselves as above average, and 68% of professors consider themselves in the top 25% for teaching ability.
Social desirability bias
The tendency to over-report socially desirable characteristics or behaviors in oneself and under-report socially undesirable characteristics or behaviors.
Example – When confronted with the question, “Do you use drugs/illicit substances?” the respondent may be influenced by the fact that controlled substances, including the more commonly used marijuana, are generally illegal. Respondents may feel pressured to deny any drug use or rationalize it, e.g. “I only smoke marijuana when my friends are around.”
Third-person effect
The belief that mass-communicated media messages have a greater effect on others than on ourselves.
Example – In WWII, the Japanese attempted to dissuade black U.S. soldiers from fighting at Iwo Jima using propaganda. Leaflets stressed that the Japanese had no quarrel with the black soldiers and that they should give up or desert. Although there was no indication that the leaflets had any effect on the soldiers themselves, the incident preceded a substantial reshuffle among the officers, and the unit was withdrawn the next day.
False consensus effect
The tendency for people to overestimate the degree to which others agree with them.
Example – If a man doubted whether he wanted to buy a new appliance, breaking down his notion that others share his doubt would be an important step in persuading him to purchase it. By convincing the customer that other people do in fact want to buy the appliance, the seller could make a sale he would not have made otherwise.
Hard–easy effect
The tendency for confidence in judgments to be poorly calibrated to task difficulty: people are overconfident on hard tasks and underconfident on easy ones.
Example – Subjects answered questions such as “Who was born first, Aristotle or Buddha?” or “Was the zipper invented before or after 1920?”, filled in the answers they believed to be correct, and rated how sure they were of them. The results showed subjects tend to be under-confident in their answers to questions the experimenters designated as easy, and overconfident in their answers to questions designated as hard.
Dunning–Kruger effect
The tendency for unskilled individuals to overestimate their own ability and for experts to underestimate theirs.
Example – With more difficult tasks, the best performers were less accurate in predicting their performance than the worst performers were. Judges at all levels of skill are thus subject to similar degrees of error.
Egocentric bias
Occurs when people claim more responsibility for the results of a joint action than an outside observer would credit them with.
Example – The egocentric bias has been shown to contribute to a citizen’s decision to vote in elections. First, people tend to view their personal choice between voting and abstaining as representative of those who support the same candidates and issues. Second, although each individual vote carries very little power in large-scale elections, those who vote overestimate the significance of their ballot.
Optimism bias
The tendency to be over-optimistic, overestimating favorable and pleasing outcomes (see also wishful thinking, valence effect, positive outcome bias).
Example – In health, the optimism bias tends to prevent individuals from taking preventive measures. For example, people who underestimate their comparative risk of heart disease know less about heart disease, and even after reading an article with more information, are still less concerned about their risk.
Barnum effect
The observation that individuals give high accuracy ratings to descriptions of their personality that supposedly are tailored specifically for them, but are in fact vague and general enough to apply to a wide range of people. It explains the appeal of astrology, fortune telling, graphology, and some types of personality tests.
Example – On average, students rated the accuracy of a personality sketch as 4.30 on a scale of 0 (very poor) to 5 (excellent). Only after the ratings were turned in was it revealed that each student had received an identical sketch assembled by Forer from a newsstand astrology book, containing statements vague and general enough to apply to most people.
Self-serving bias
The tendency to claim more responsibility for successes than for failures. It may also manifest as a tendency to evaluate ambiguous information in a way beneficial to one’s own interests (see also group-serving bias).
Example – Clinically depressed patients tend to show less of a self-serving bias than individuals in the general population.
Actor-observer bias
The tendency for explanations of other individuals’ behaviors to overemphasize the influence of their personality and underemphasize the influence of their situation (see also fundamental attribution error), and for explanations of one’s own behaviors to do the opposite (that is, to overemphasize the influence of the situation and underemphasize the influence of one’s own personality).
Example – A student who studies hard for an exam is likely to explain her own (the actor’s) intensive studying by referring to the upcoming difficult exam (a situational factor), whereas other people (the observers) are likely to explain her studying by referring to her dispositions, such as being hardworking or ambitious.
Illusion of control
The tendency to overestimate one’s degree of influence over external events.
Example – One simple form of this effect is found in casinos: when rolling dice in a craps game people tend to throw harder when they need high numbers and softer for low numbers.
Illusory superiority
Overestimating one’s desirable qualities and underestimating undesirable qualities relative to other people. (Also known as the “Lake Wobegon effect”, “better-than-average effect”, or “superiority bias”.)
Example – Individuals who evaluate themselves as honest are likely to exaggerate that characteristic toward their perceived ideal position on an honesty scale. And in one survey of high school students, only 2% reported being below average in leadership ability.
Fundamental attribution error
The tendency for people to over-emphasize personality-based explanations for behaviors observed in others while under-emphasizing the role and power of situational influences on the same behavior (see also actor-observer bias, group attribution error, positivity effect, and negativity effect).
Example – Alice, a driver, is cut off in traffic by Bob. Alice attributes Bob’s behavior to his fundamental personality, e.g., he thinks only of himself, he is selfish, he is a jerk, he is an unskilled driver; she does not think it is situational, e.g., he is going to miss his flight, his wife is giving birth at the hospital, his daughter is convulsing at school.
Defensive attribution hypothesis
Attributing more blame to a harm-doer as the outcome becomes more severe or as personal or situational similarity to the victim increases.
Example – Similarity of the witness to the person(s) involved in the misfortune – in terms of situation, age, gender, personality, etc. – changes the amount of blame one is ready to ascribe.
Trait ascription bias
The tendency for people to view themselves as relatively variable in terms of personality, behavior, and mood while viewing others as much more predictable.
Example – Closely related to the actor-observer bias: a student explaining poor performance to a supervisor points to situational factors, while the supervisor may superficially accept the explanation but privately attribute the performance to “enduring qualities”: lack of ability, laziness, ineptitude, etc.
Effort justification
The tendency to attribute a greater value to an outcome one had to put effort into achieving than the objective value of the outcome warrants.
Example – Rites of passage and hazing rituals build group solidarity and loyalty. Hazing rituals, prevalent in military units, sports teams, and fraternities and sororities, often include demanding and/or humiliating tasks that lead the new member to increase the subjective value of the group, contributing to their loyalty and to the solidarity of the entire group.
Risk compensation
The tendency to take greater risks when perceived safety increases.
Example – “Booth’s rule #2”, often attributed to skydiving pioneer Bill Booth, states: “the safer skydiving gear becomes, the more chances skydivers will take, in order to keep the fatality rate constant”.
Peltzman effect
The reduction of the predicted benefit from regulations intended to increase safety; part of risk compensation.
Example – Sam Peltzman on automobile safety regulation: “offsets (due to risk compensation) are virtually complete, so that regulation has not decreased highway deaths”.
Worse-than-average effect
A tendency to believe ourselves to be worse than others at difficult tasks.
Example – This effect seems to occur when chances of success are perceived to be extremely rare. Traits which people tend to underestimate include juggling ability, the ability to ride a unicycle, the odds of living past 100 or of finding a U.S. twenty dollar bill on the ground in the next two weeks.
Surrogation
Losing sight of the strategic construct that a measure is intended to represent, and subsequently acting as though the measure is the construct of interest.
Example – Managers tend to use measures as surrogates for strategy, acting as if measures were in fact the strategy when making optimization decisions.
End-of-history illusion
The age-independent belief that one will change less in the future than one has in the past.
Example – Participants consistently expected their preferences to remain relatively unchanged over the next 10 years, while participants one decade older reported much higher levels of preference change over the previous 10 years.
To stay focused, we favor the immediate, relatable thing in front of us
We value stuff more in the present than in the future, and relate more to stories of specific individuals than anonymous individuals or groups. I’m surprised there aren’t more biases found under this one, considering how much it impacts how we think about the world.
Hyperbolic discounting
A tendency to have a stronger preference for more immediate payoffs relative to later payoffs. Hyperbolic discounting leads to choices that are inconsistent over time: people make choices today that their future selves would prefer not to have made, despite using the same reasoning. One study showed that when making food choices for the coming week, 74% of participants chose fruit, whereas when the choice was for the current day, 70% chose chocolate.
Example – A substantial number of subjects reported that they would prefer $50 immediately rather than $100 in six months, but would not prefer $50 in three months rather than $100 in nine months, even though this is the same choice seen at three months’ greater distance.
An experiment with cult members showed that they would rather receive a small amount of money before their predicted judgment day than a large amount after, as they thought everyone would be dead by that point.
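The preference reversal in the $50-versus-$100 example can be reproduced with a toy model. This is a minimal sketch, assuming the standard hyperbolic discount function V = A / (1 + k·t); the rate k = 0.2 per month is an illustrative value chosen only to make the numbers come out cleanly:

```python
def hyperbolic_value(amount, delay_months, k=0.2):
    """Present value under hyperbolic discounting: V = A / (1 + k*t)."""
    return amount / (1 + k * delay_months)

# Today: $50 now is valued above $100 in six months...
assert hyperbolic_value(50, 0) > hyperbolic_value(100, 6)   # 50.0 vs ~45.5

# ...but shift both options three months out and the preference flips,
# even though it is the same choice seen from farther away.
assert hyperbolic_value(50, 3) < hyperbolic_value(100, 9)   # 31.25 vs ~35.7
```

An exponential discounter (V = A·dᵗ) can never reverse like this: delaying both options by the same amount multiplies both present values by the same factor, leaving the ranking unchanged.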
Appeal to novelty
The appeal to novelty is a fallacy in which one prematurely claims that an idea or proposal is correct or superior, exclusively because it is new and modern.
Example – “Upgrading all your software to the most recent versions will make your system more reliable.”
Identifiable victim effect
The tendency to respond more strongly to a single identified person at risk than to a large group of people at risk.
Example – Part of compassion fade, and essentially the same bias as the singularity effect. The effect is epitomized by the phrase (commonly attributed to Joseph Stalin): “A single death is a tragedy; a million deaths is a statistic.”
To get things done, we tend to complete things we’ve invested time and energy in
The behavioral economist’s version of Newton’s first law of motion: an object in motion stays in motion. This helps us finish things, even if we come across more and more reasons to give up.
Sunk cost fallacy
Decision makers take previous costs into account when making decisions about the future.
Example – The expression “Concorde fallacy” refers to the UK and French governments’ taking their past expenses on the costly supersonic jet as a rationale for continuing the project, as opposed to the rationale of “cutting their losses”.
We also tend to finish bad movies and bad meals, even though we don’t enjoy them.
Escalation of commitment
Escalation of commitment is a human behavior pattern in which an individual or group facing increasingly negative outcomes from a decision, action, or investment nevertheless continues the behavior instead of altering course. The actor maintains behaviors that are irrational, but align with previous decisions and actions.
Example – From a letter to Lyndon Johnson on the Vietnam War: “Once we suffer large casualties, we will have started a well-nigh irreversible process. Our involvement will be so great that we cannot – without national humiliation – stop short of achieving our complete objectives. Of the two possibilities, I think humiliation would be more likely than the achievement of our objectives – even after we have paid terrible costs.”
Generation effect
The finding that self-generated information is remembered best: people are better able to recall statements they generated themselves than similar statements generated by others.
Example – A participant required to generate same-category targets from distinctive semantic cues (e.g., PURR-C_T, SADDLE-H_RS_) is likely to notice similarities between the targets (e.g., they are all animals). This type of manipulation promotes whole-list relational processing, which may enhance generation performance on a free recall test.
Loss aversion
The disutility of giving up an object is greater than the utility associated with acquiring it (see also sunk cost effects and the endowment effect).
For a more in-depth understanding of loss aversion, check out this article by Kent Hendricks.
Example – Note that whether a transaction is framed as a loss or as a gain is very important to this calculation: would you rather get a $5 discount, or avoid a $5 surcharge? The same change in price framed differently has a significant effect on consumer behavior.
IKEA effect
The tendency to place a disproportionately high value on objects one partially assembled oneself, such as furniture from IKEA, regardless of the quality of the end product.
Example – Subjects were given the task of assembling IKEA furniture. Researchers then priced the items the subjects had assembled as well as pre-assembled IKEA furniture. The results showed that subjects were willing to pay 63% more for the former than for the latter.
Unit bias
The standard suggested amount of consumption (e.g., a food serving size) is perceived to be appropriate, and a person will consume it all even if it is too much for that particular person.
Example – In an M&M experiment, researchers offered a large mixing bowl of the candy at the concierge desk of an apartment building. Below the bowl hung a sign that read “Eat Your Fill”, with “please use the spoon to serve yourself” written underneath. Whether a big or a small spoon was provided, subjects took exactly one spoonful.
Zero-risk bias
The preference for reducing a small risk to zero over a greater reduction in a larger risk.
Example – In American federal policy, the Delaney clause outlawing cancer-causing additives from foods (regardless of actual risk) and the desire for perfect cleanup of Superfund sites have been alleged to be overly focused on complete elimination. Limited resources were increasingly being devoted to low-risk issues.
Disposition effect
The tendency to sell an asset that has accumulated in value and resist selling an asset that has declined in value.
Example – Participants were given $1,000 and asked to choose between Choice A (a 50% chance of gaining $1,000 and a 50% chance of gaining $0) and Choice B (a 100% chance of gaining $500). In the second situation, they were given $2,000 and had to choose between Choice A (a 50% chance of losing $1,000 and a 50% chance of losing $0) and Choice B (a 100% chance of losing $500). An overwhelming majority of participants chose B in the first scenario and A in the second.
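The two scenarios are equivalent in terms of final wealth, which a quick calculation makes explicit. This is a sketch using only the amounts from the example; each option is written as its distribution over final wealth:

```python
# Each option as a list of (probability, final wealth) outcomes.
# Scenario 1: start with $1,000 and choose between a gamble and a sure gain.
gain_gamble = [(0.5, 1000 + 1000), (0.5, 1000 + 0)]
gain_sure   = [(1.0, 1000 + 500)]

# Scenario 2: start with $2,000 and choose between a gamble and a sure loss.
loss_gamble = [(0.5, 2000 - 1000), (0.5, 2000 - 0)]
loss_sure   = [(1.0, 2000 - 500)]

# In final-wealth terms the scenarios are identical: the gamble is a 50/50
# shot at $1,000 or $2,000 either way, and the sure option is $1,500 either
# way. Yet most people take the sure thing in the gain frame and the gamble
# in the loss frame.
assert sorted(gain_gamble) == sorted(loss_gamble)
assert gain_sure == loss_sure
```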
Pseudocertainty effect
The tendency to make risk-averse choices if the expected outcome is positive, but risk-seeking choices to avoid negative outcomes.
Example – Which of the following options do you prefer: A, a sure win of $30, or B, an 80% chance to win $45? And then: C, a 25% chance to win $30, or D, a 20% chance to win $45? People picked A and D, even though the second pair is just the first pair with every probability scaled down by the same factor.
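A quick expected-value check (a sketch using the numbers above) makes the inconsistency concrete:

```python
def ev(prob, amount):
    """Expected value of a win-or-nothing gamble."""
    return prob * amount

ev_a, ev_b = ev(1.00, 30), ev(0.80, 45)   # 30 vs 36
ev_c, ev_d = ev(0.25, 30), ev(0.20, 45)   # 7.5 vs 9 (up to float rounding)

# C and D are A and B with every probability multiplied by 0.25, so the
# ranking within each pair is necessarily the same: B beats A exactly when
# D beats C. Picking A in the first pair and D in the second is inconsistent.
assert (ev_a < ev_b) == (ev_c < ev_d)
```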
Processing difficulty effect
Information that takes longer to read and is thought about more (processed with more difficulty) is more easily remembered.
Related to Serial recall effect and all the biases under that category.
Endowment effect
The tendency to demand much more to give up an object than one would be willing to pay to acquire it.
Example – Participants first given a Swiss chocolate bar were generally unwilling to trade it for a coffee mug, whereas participants first given the coffee mug were generally unwilling to trade it for the chocolate bar.
Backfire effect
The reaction to disconfirming evidence by strengthening one’s previous beliefs. Similar to the continued influence effect.
Example – The backfire effect has been noted to be a rare phenomenon rather than a common occurrence.
Zeigarnik effect
Uncompleted or interrupted tasks are remembered better than completed ones.
Example – The Zeigarnik effect has been used to explain the widespread criticism of the National Basketball Association in allowing free throws for a player “chucking it up whenever a guy comes near them.” There is a stoppage of play with each foul. When repeatedly done, it is felt to build up a cognitive bias against this move. The criticism necessitated a rule change penalizing this activity, known as the Harden Rule, named after its most prominent user, James Harden.
Dread aversion
Just as losses yield double the emotional impact of gains, dread yields double the emotional impact of savouring.
Example – Like loss aversion, but for emotional states rather than monetary values.
To avoid mistakes, we aim to preserve autonomy and group status, and avoid irreversible decisions
If we must choose, we tend to choose the option that is perceived as the least risky or that preserves the status quo. Better the devil you know than the devil you do not.
System justification
The tendency to defend and bolster the status quo. Existing social, economic, and political arrangements tend to be preferred, and alternatives disparaged, sometimes even at the expense of individual and collective self-interest. (See also status quo bias.)
Example – The slow relief response after Hurricane Katrina in 2005 was perceived by some to expose governmental shortcomings, call into question the legitimacy of agency leadership, and highlight racial inequality in America. These perceptions indirectly threatened the legitimacy of the U.S. government. As a result of this system threat, people tended to restore legitimacy to the system through stereotyping and victim blaming.
Reverse psychology
A technique involving the assertion of a belief or behavior opposite to the one desired, with the expectation that this approach will encourage the subject of the persuasion to do what is actually desired. Based on reactance.
Example – Some people value things or people more if those things or people are unavailable to them or who pretend to be unavailable: they want what they can’t have. However, being emotionally unavailable to one’s partner will damage the health of a long-term romantic relationship.
Reactance
The urge to do the opposite of what someone wants you to do, out of a need to resist a perceived attempt to constrain your freedom of choice (see also reverse psychology).
Example – Psychological reactance is an important indicator in adolescent smoking initiation. Peer intimacy, peer individuation, and intergenerational individuation are strong predictors of psychological reactance.
Decoy effect
Preferences for option A or B change in favor of option B when option C is presented, where C is completely dominated by B (inferior in all respects) and only partially dominated by A.
For a more in-depth understanding of the decoy effect, check out this article by Kent Hendricks.
Example – When offered the choice between a small bucket of popcorn for $3 and a large one for $7, most people picked the small bucket. But when a medium decoy bucket for $6.50 was added, most picked the large bucket.
Social comparison effect
The tendency, when making decisions, to favor potential candidates who don’t compete with one’s own particular strengths.
Example – Students, depending on their grade level, are very competitive about the grades they receive compared to their peers. Social comparisons not only influence students’ self-concepts but can also improve their performance.
Status quo bias
The tendency to like things to stay relatively the same (see also loss aversion, endowment effect, and system justification).
Example – Essentially the same as the default effect. In the early 1990s, as part of tort law reform programs, the US states of New Jersey and Pennsylvania offered citizens two options for their automotive insurance: an expensive option giving them the full right to sue and a less expensive option with restricted rights to sue. In New Jersey the cheaper option was the default, and most citizens selected it; in Pennsylvania, where the more expensive option was the default, only a minority chose the cheaper one.
Shared information bias
The tendency for group members to spend more time and energy discussing information that all members are already familiar with (i.e., shared information), and less time and energy discussing information that only some members are aware of (i.e., unshared information).
Example – Members are motivated to establish and maintain reputations, to secure tighter bonds, and to compete for success against other group members. As a result, individuals tend to be selective when disclosing information to other group members.
We favor simple-looking options and complete information over complex, ambiguous options
We’d rather do the quick, simple thing than the important complicated thing, even if the important complicated thing is ultimately a better use of time and energy.
Ambiguity effect
The tendency to avoid options for which the probability of a favorable outcome is unknown.
Example – When buying a house, many people choose a fixed rate mortgage, where the interest rate is set in stone, over a variable rate mortgage, where the interest rate fluctuates with the market. This is the case even though a variable rate mortgage has statistically been shown to save money.
Information bias
The tendency to seek information even when it cannot affect action.
Example – A female patient presents symptoms and a history that suggest a diagnosis of globoma with about 80% probability; if it isn’t globoma, it’s either popitis or flapemia. If an ET scan were the only test you could do, should you do it? Many subjects answered that they would conduct the ET scan even if it were costly and even if it were the only possible test – yet globoma remains more likely than popitis or flapemia whether the ET scan comes back positive or negative, so the result cannot change the treatment decision.
Belief bias
An effect where someone’s evaluation of the logical strength of an argument is biased by the believability of its conclusion.
Example – All teenage girls are ambitious. (major premise) Teenage girls study hard. (minor premise) Therefore, girls study hard because they are ambitious. (conclusion)
Rhyme-as-reason effect
A cognitive bias whereby a saying or aphorism is judged as more accurate or truthful when it rhymes.
Example – The statement “What sobriety conceals, alcohol reveals” was judged more accurate than the non-rhyming version “What sobriety conceals, alcohol unmasks” shown to different participants.
Law of Triviality
The tendency to give disproportionate weight to trivial issues. This bias explains why an organization may avoid specialized or complex subjects, such as the design of a nuclear reactor, and instead focus on something easy to grasp or rewarding to the average participant, such as the design of an adjacent bike shed.
Example – Also called the bike-shedding effect, after C. Northcote Parkinson’s illustration: a committee’s agenda contains the signing of a £10 million contract to build a reactor, a proposal to build a £350 bicycle shed for the clerical staff, and £21 a year to supply refreshments for the Joint Welfare Committee. “Every man there knows about coffee – what it is, how it should be made, where it should be bought – and whether indeed it should be bought at all. This item on the agenda will occupy the members for an hour and a quarter.”
Conjunction fallacy
The tendency to assume that specific conditions are more probable than a more general version of those same conditions. For example, subjects in one experiment perceived the probability of a woman being both a bank teller and a feminist as higher than the probability of her being a bank teller.
Example – “Linda is 31 years old, single, outspoken, and very bright. She majored in philosophy. As a student, she was deeply concerned with issues of discrimination and social justice, and also participated in anti-nuclear demonstrations.” Is Linda a bank teller or a bank teller and a feminist? A majority picked the second option.
Occam’s razor
Occam’s razor says that when presented with competing hypotheses that make the same predictions, one should select the one with the fewest assumptions.
Example – Biologists argue that the best way to explain altruism among animals is based on low-level (i.e., individual) selection as opposed to high-level group selection.
Less-is-better effect
The tendency to prefer a smaller set to a larger set when judged separately, but not when judged jointly.
Example – A person giving a $45 scarf as a gift was perceived to be more generous than one giving a $55 coat. A dinnerware set with 24 intact pieces was judged more favorably than one with 31 intact pieces (including the same 24) plus a few broken ones.
Read part 4
Ultimate List of Cognitive Biases with examples (Part 4 of 4) – Not enough memory