
[2020] Ultimate List of Cognitive Biases with examples (Part 2 of 4)


Section Two

Not enough meaning

The world is very confusing, and we end up only seeing a tiny sliver of it, but we need to make some sense of it in order to survive. Once the reduced stream of information comes in, we connect the dots, fill in the gaps with stuff we already think we know, and update our mental models of the world.

Chapter Six

We tend to find stories and patterns even in sparse data

Since we only get a tiny sliver of the world’s information, and also filter out almost everything else, we never have the luxury of having the full story. This is how our brain reconstructs the world to feel complete inside our heads.


Confabulation

In psychology, confabulation is a memory error defined as the production of fabricated, distorted, or misinterpreted memories about oneself or the world. People who confabulate present incorrect memories ranging from “subtle alterations to bizarre fabrications”, and are generally very confident about their recollections, despite contradictory evidence.

Example – Brian Williams, anchor of NBC Nightly News, told a story on his show about being in a helicopter that was “forced down after being hit by an RPG”. In reality he was on another helicopter half an hour behind them.

About a visit to Bosnia, Hillary Clinton said: “I remember landing under sniper fire. There was supposed to be some kind of a greeting ceremony at the airport, but instead we just ran with our heads down to get into the vehicles to get to our base.” In reality it was a safe and calm visit.

Clustering illusion 

The tendency to overestimate the importance of small runs, streaks, or clusters in large samples of random data (that is, seeing phantom patterns).

Example – Although Londoners developed specific theories about the pattern of impacts within London, a statistical analysis by R. D. Clarke originally published in 1946 showed that the impacts of V-2 rockets on London were a close fit to a random distribution.
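You can reproduce the spirit of Clarke’s analysis with a toy simulation. He divided south London into 576 equal squares and tallied 537 impacts; the impact points below are randomly generated, not his actual data. Even pure randomness produces squares with several hits, which is exactly the kind of “cluster” Londoners read meaning into.

```python
import random
from collections import Counter

# Toy version of R. D. Clarke's analysis: 576 equal squares, 537 impacts.
# The impacts here are simulated as uniformly random, to show that chance
# alone produces apparent clusters.
random.seed(1)
GRID = 24                                    # 24 x 24 = 576 squares
impacts = [(random.randrange(GRID), random.randrange(GRID))
           for _ in range(537)]

hits = Counter(impacts)                      # hits per square
distribution = Counter(hits.values())
distribution[0] = GRID * GRID - len(hits)    # squares never hit

for k in sorted(distribution):
    print(f"{distribution[k]:3d} squares with {k} hit(s)")
```

Comparing tallies like these against the Poisson distribution is what led Clarke to conclude that the impacts were a close fit to randomness.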

Insensitivity to sample size

The tendency to under-expect variation in small samples.

Example – Kidney cancer rates are lowest in counties that are mostly rural, sparsely populated, and located in traditionally Republican states in the Midwest, the South, and the West; yet they are also highest in counties fitting exactly the same description. While various environmental and economic reasons could be advanced for these facts, Wainer and Zwerling argue that this is an artifact of sample size.
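A small simulation makes the artifact visible (the populations and the rate below are illustrative, not real incidence data): give every county the same true cancer rate, and the small counties still produce both the highest and the lowest observed rates, simply because a single case moves a small county’s rate a lot.

```python
import math
import random

random.seed(0)
RATE = 0.0001   # the SAME hypothetical true rate in every county

def cases(population):
    # Poisson draw via Knuth's algorithm (the stdlib has no binomial sampler)
    L, k, p = math.exp(-population * RATE), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

small = [cases(1_000) / 1_000 for _ in range(100)]          # rural counties
large = [cases(1_000_000) / 1_000_000 for _ in range(100)]  # urban counties

print(f"small counties: rates from {min(small):.5f} to {max(small):.5f}")
print(f"large counties: rates from {min(large):.5f} to {max(large):.5f}")
```

This is Wainer and Zwerling’s point: the extreme rates concentrate in the sparsely populated counties regardless of any real cause.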

Neglect of probability

The tendency to completely disregard probability when making a decision under uncertainty.

Example – Subjects were presented a choice between two games of chance. In one, you have a one in 100 million chance of winning $10 million; in the other, you have a one in 10,000 chance of winning $10,000. It is more reasonable to choose the second game; but most people would choose the first. For this reason, jackpots in lotteries are growing.
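The “more reasonable” claim is just expected value. A quick check, using exact fractions to avoid float rounding:

```python
from fractions import Fraction

# Expected winnings per play of each game described above.
ev_first = Fraction(1, 100_000_000) * 10_000_000   # 1/10 of a dollar
ev_second = Fraction(1, 10_000) * 10_000           # exactly 1 dollar

print(f"game 1: ${float(ev_first):.2f} per play")
print(f"game 2: ${float(ev_second):.2f} per play")
```

The second game is worth ten times the first per play, yet the vivid $10 million prize dominates the decision.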

Anecdotal fallacy

Anecdotal evidence is evidence from anecdotes: evidence collected in a casual or informal manner and relying heavily or entirely on personal testimony.

Example – “There’s abundant proof that drinking water cures cancer. Just last week I read about a girl who was dying of cancer. After drinking water she was cured.”

Illusion of validity

Believing that one’s judgments are accurate, especially when available information is consistent or inter-correlated.

Example – In one study subjects reported higher confidence in a prediction of the final grade point average of a student after seeing a first-year record of consistent B’s than a first-year record of an even number of A’s and C’s.

Masked-man fallacy

The masked-man fallacy is the error of concluding that two things cannot be identical because what someone knows or believes about one differs from what they know or believe about the other. Being known or believed about is not a genuine property of the object itself, so the inference does not hold.

Example – Lois Lane believes that Superman can fly. Lois Lane does not believe that Clark Kent can fly. Therefore, Superman and Clark Kent are not the same person.

Recency illusion 

The illusion that a phenomenon one has noticed only recently is itself recent. (see also frequency illusion).

Example – “Literally” being used figuratively as an intensifier is often viewed as a recent change, but in fact usage dates back to the 1760s.

Gambler’s fallacy

The tendency to think that future probabilities are altered by past events, when in reality they are unchanged. The fallacy arises from an erroneous conceptualization of the law of large numbers. For example, “I’ve flipped heads with this coin five times consecutively, so the chance of tails coming out on the sixth flip is much greater than heads.”

Example – Perhaps the most famous example of the gambler’s fallacy occurred in a game of roulette at the Monte Carlo Casino on August 18, 1913, when the ball fell in black 26 times in a row, a sequence with roughly a 1 in 66.6 million chance. Gamblers lost millions of francs betting against black, reasoning incorrectly that the streak was causing an imbalance in the randomness of the wheel, and that it had to be followed by a long streak of red.
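The fallacy is easy to check empirically. A sketch: simulate a million fair coin flips, find every run of five consecutive heads, and look at the flip that follows. It comes up heads about half the time, streak or no streak.

```python
import random

random.seed(42)
flips = [random.random() < 0.5 for _ in range(1_000_000)]  # True = heads

# The flip immediately following every run of 5 consecutive heads.
after_streak = [flips[i + 5] for i in range(len(flips) - 5)
                if all(flips[i:i + 5])]

rate = sum(after_streak) / len(after_streak)
print(f"P(heads | preceded by 5 heads) = {rate:.3f} "
      f"({len(after_streak):,} streaks found)")
```

The coin has no memory; only our model of it does.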

This amazing illustration of the conjunction fallacy was made by @cartoonbias. Do check out their work on Instagram.

Hot-hand fallacy

The “hot-hand fallacy” is the belief that a person who has experienced success with a random event has a greater chance of further success in additional attempts.

Example – The “Hot Hand in Basketball” study questioned the theory that basketball players have “hot hands”, which the paper defined as the claim that players are more likely to make a successful shot if their previous shot was successful. It is one of several biases rooted in our inability to process randomness.

Illusory correlation 

Inaccurately perceiving a relationship between two unrelated events.

Example – A parallel effect occurs when people judge whether two events, such as pain and bad weather, are correlated. They rely heavily on the relatively small number of cases where the two events occur together. People pay relatively little attention to the other kinds of observation (of no pain or good weather).


Pareidolia

A vague and random stimulus (often an image or sound) is perceived as significant, e.g., seeing images of animals or faces in clouds, the man in the moon, and hearing non-existent hidden messages on records played in reverse.

Example – A satellite photograph of a mesa in the Cydonia region of Mars made people perceive a face on the planet. It’s often called the “Face on Mars” and cited as evidence of extraterrestrial habitation.


Anthropomorphism

The tendency to characterize animals, objects, and abstract concepts as possessing human-like traits, emotions, and intentions.

Example – From the beginnings of human behavioral modernity in the Upper Paleolithic, about 40,000 years ago, examples of zoomorphic (animal-shaped) works of art occur that may represent the earliest evidence we have of anthropomorphism.


Berkson’s paradox

The tendency to misinterpret statistical experiments involving conditional probabilities. This effect leads one to the opposite conclusion because of observation bias. For example, judging only by hospital statistics, not being at a hospital looks healthier, because the comparison conditions on who ends up there in the first place.

Example – Suppose Jane will only date a man if his niceness plus his handsomeness exceeds some threshold. Then nicer men do not have to be as handsome to qualify for Jane’s dating pool. So, among the men that Jane dates, Jane may observe that the nicer ones are less handsome on average (and vice versa), even if these traits are uncorrelated in the general population. Berkson’s negative correlation is an effect that arises within the dating pool: the rude men that Jane dates must have been even more handsome to qualify.
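Jane’s dating pool is simple to simulate (the trait scales and the threshold below are arbitrary): draw niceness and handsomeness independently, keep only the men above the threshold, and a negative correlation appears out of nowhere.

```python
import random

random.seed(7)

def corr(pairs):
    """Pearson correlation of a list of (x, y) pairs."""
    xs, ys = zip(*pairs)
    n = len(pairs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in pairs) / n
    vx = sum((x - mx) ** 2 for x in xs) / n
    vy = sum((y - my) ** 2 for y in ys) / n
    return cov / (vx * vy) ** 0.5

# Niceness and handsomeness drawn independently: uncorrelated by design.
men = [(random.random(), random.random()) for _ in range(100_000)]
# Jane dates only men whose niceness + handsomeness clears a threshold.
pool = [(n, h) for n, h in men if n + h > 1.2]

print(f"whole population: r = {corr(men):+.3f}")
print(f"Jane's pool:      r = {corr(pool):+.3f}")
```

Conditioning on the sum carves off a slice of the population in which one trait must trade off against the other: Berkson’s paradox in a single line of filtering.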

Chapter Seven

We fill in characteristics from stereotypes, generalities, and prior histories

When we have partial information about a specific thing that belongs to a group of things we are pretty familiar with, our brain has no problem filling in the gaps with best guesses or what other trusted sources provide. Conveniently, we then forget which parts were real and which were filled in.

Group attribution error

The biased belief that the characteristics of an individual group member are reflective of the group as a whole. Or the tendency to assume that group decision outcomes reflect the preferences of group members, even when information is available that clearly suggests otherwise.

Example – Children believing that “all girls are nice” illustrates how categorization and generalization are applied to members of a group (girls).

Ultimate attribution error

Similar to the fundamental attribution error, in this error a person is likely to make an internal attribution to an entire group instead of the individuals within the group.

Example – In a 2×2 between-group design, Hindu or Muslim participants were asked to make causal attributions for undesirable acts performed by Hindus or Muslims. Hindus attributed external causes to undesirable acts committed by fellow Hindus, but an internal cause for undesirable acts committed by Muslims. Conversely, Muslims attributed external causes to undesirable acts committed by fellow Muslims, but an internal cause for undesirable acts committed by Hindus.


Stereotyping

Expecting a member of a group to have certain characteristics without having actual information about that individual.

Example – Jews were stereotyped as being evil and yearning for world domination.


Essentialism

Essentialism is the view that every entity has a set of attributes that are necessary to its identity and function.

Example – Paul Bloom attempts to explain why people will pay more in an auction for the clothing of celebrities if the clothing is unwashed. He believes the answer to this and many other questions is that people cannot help but think of objects as containing a sort of “essence” that can be influenced.

Functional fixedness

Limits a person to using an object only in the way it is traditionally used.

Example – Duncker (1945) gave participants a candle, a box of thumbtacks, and a book of matches, and asked them to attach the candle to the wall so that it did not drip onto the table below. Duncker found that participants tried to attach the candle directly to the wall with the tacks, or to glue it to the wall by melting it. Very few of them thought of using the inside of the box as a candle-holder and tacking this to the wall.

Moral credential effect

Occurs when someone who does something good gives themselves permission to be less good in the future.

Example – “We drink Diet Coke – with Quarter Pounders and fries at McDonald’s. We go to the gym – and ride the elevator to the second floor. We install tankless water heaters – then take longer showers. We drive SUVs to see Al Gore’s speeches on global warming.”

Just-world hypothesis

The tendency for people to want to believe that the world is fundamentally just, causing them to rationalize an otherwise inexplicable injustice as deserved by the victim(s).

Example – In a study, one of two men was chosen at random to receive a reward for a task; this caused him to be more favorably evaluated by observers, even when the observers had been informed that the recipient of the reward was chosen at random.


Argument from fallacy

Argument from fallacy is the formal fallacy of analyzing an argument and inferring that, since it contains a fallacy, its conclusion must be false.

Example – Tom argues: “All cats are animals. Ginger is an animal. Therefore, Ginger is a cat.” The argument is indeed fallacious, but concluding from that fallacy that Ginger is not a cat is itself the argument from fallacy; Ginger may well be a cat.

Authority bias

The tendency to attribute greater accuracy to the opinion of an authority figure (unrelated to its content) and be more influenced by that opinion.

Example – The Milgram experiment on obedience to authority figures by giving electric shocks to “learners”.

Automation bias

The tendency to depend excessively on automated systems which can lead to erroneous automated information overriding correct decisions.

Example – Eastern Air Lines Flight 401 crashed with 101 fatalities while the entire cockpit crew was preoccupied with a burnt-out landing gear indicator light. They failed to notice that the autopilot had inadvertently been disconnected and, as a result, the aircraft gradually lost altitude and crashed.

Bandwagon effect

The tendency to do/believe things because many other people do/believe the same. Related to groupthink and herd behavior.

Example – In a study, independents, voters who do not decide based on any party’s endorsement and are ultimately neutral, were strongly influenced in favor of the person expected to win.

Investors were eager to invest in hyped internet companies in the ’90s. But as these companies didn’t have sustainable business models, the result was the dot-com bubble that burst in 2000.

Placebo effect

A placebo is an inert substance or treatment which is designed to have no therapeutic value. The placebo effect occurs when patients given such an inert treatment nonetheless report genuine improvement.

Example – Acupuncture is a form of alternative medicine and a key component of traditional Chinese medicine (TCM) in which thin needles are inserted into the body. Acupuncture is a pseudoscience because the theories and practices of TCM are not based on scientific knowledge; any benefit patients report beyond sham-needle controls is generally attributed to the placebo effect.

Look-elsewhere effect

An apparently statistically significant observation may have actually arisen by chance because of the size of the parameter space to be searched.

Example – “p-hacking” is a term for practices like running 20 studies to get 1 significant result. Another hack is to set no expectations for an effect in advance and measure 100 factors instead: some factors are bound to correlate with something purely by chance, because it is like running 100 experiments. This is also partly why the replication crisis hit psychology as a science, as there had been a hunt for positive results.
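You can watch the look-elsewhere effect happen with null data. A sketch, using a toy z-test on samples of 30 with known variance (all numbers here are arbitrary): compare many pairs of groups drawn from the same distribution and count how many differences look “significant”.

```python
import math
import random

random.seed(0)
N_TESTS, N = 1_000, 30
SE = math.sqrt(1 / N + 1 / N)   # standard error of a mean difference

false_positives = 0
for _ in range(N_TESTS):
    # Both groups come from the SAME distribution: there is no real effect.
    a = [random.gauss(0, 1) for _ in range(N)]
    b = [random.gauss(0, 1) for _ in range(N)]
    z = (sum(a) / N - sum(b) / N) / SE
    if abs(z) > 1.96:           # the usual p < 0.05 cutoff
        false_positives += 1

print(f"{false_positives} of {N_TESTS} null comparisons look 'significant'")
```

Roughly 1 in 20 come out “significant” by chance alone, which is why measuring 100 factors without a pre-registered hypothesis all but guarantees a publishable-looking correlation.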

Law of the instrument

An over-reliance on a familiar tool or method, ignoring or under-valuing alternative approaches. If all you have is a hammer, everything looks like a nail.

Example – Abraham Maslow said in 1966, “I suppose it is tempting, if the only tool you have is a hammer, to treat everything as if it were a nail.”


Illicit transference

Occurs when a term referring to every member of a class and a term referring to the class itself are treated as equivalent. The two variants of this fallacy are the fallacy of composition and the fallacy of division.

Example – Fallacy of composition: assumes what is true of the parts is true of the whole. This fallacy is also known as “arguing from the specific to the general.” Since Judy is so diligent in the workplace, this entire company must have an amazing work ethic.

Fallacy of division: assumes what is true of the whole is true of its parts (or some subset of parts). Because this company is so corrupt, so must every employee within it be corrupt.

Form function attribution bias

In human-robot interaction, the tendency of people to make systematic errors when interacting with a robot. People may base their expectations and perceptions of a robot on its appearance (form) and attribute functions which do not necessarily mirror the true functions of the robot.

Example – We notice a shape and act on it. Our minds have instincts for recognizing animal shapes, for example, so a stick in the forest may make us jump in fear.

Ben Franklin effect

A person who has performed a favor for someone is more likely to do another favor for that person than they would be if they had received a favor from that person.

Example – “… requesting he would do me the favor of lending it to me for a few days. He sent it immediately, and I return’d it in about a week with another note, expressing strongly my sense of the favor. When we next met in the House, he spoke to me (which he had never done before), and with great civility; and he ever after manifested a readiness to serve me on all occasions, so that we became great friends, and our friendship continued to his death.” – Benjamin Franklin

Availability cascade

A self-reinforcing process in which a collective belief gains more and more plausibility through its increasing repetition in public discourse (or “repeat something long enough and it will become true”).

Example – The media inclination to sensationalism results in a tendency to devote disproportionate coverage to sympathetic victims (e.g. missing woman syndrome), terrifying assailants (e.g. Media coverage of the Virginia Tech massacre), and incidents with multiple victims. Although half the victims of gun violence in the United States are black, generally young urban black males, media coverage and public awareness spike after suburban school shootings, as do calls for stricter gun control laws.

Chapter Eight

We imagine things and people we are familiar with or fond of as better

Similar to the above, but the filled-in bits generally also include built-in assumptions about the quality and value of the thing we’re looking at.

Out-group homogeneity bias

Individuals see members of their own group as being relatively more varied than members of other groups.

Example – An example of this phenomenon comes from a study where researchers asked 90 sorority members to judge the degree of within-group similarity for their own and 2 other groups. It was found that every participant judged their own sorority members to be more dissimilar than the members of the other groups.

Cross-race effect

The tendency for people of one race to have difficulty identifying members of a race other than their own.

Example – A study examined 271 real court cases. In photographic line-ups, 231 witnesses participated in cross-race versus same-race identification. Cross-race lineups produced only 45% correct identifications, versus 60% for same-race identifications.

In-group favoritism 

The tendency for people to give preferential treatment to others they perceive to be members of their own groups.

Example – Using a public-goods game, Van Vugt, De Cremer, and P. Janssen (2007) found that men contributed more to their group in the face of outside competition from another group; there was no distinct difference amongst women’s contributions.

Halo effect

The tendency for a person’s positive or negative traits to “spill over” from one personality area to another in others’ perceptions of them (see also physical attractiveness stereotype).

Example – In studies, attractive people are rated more positively on personality and other traits. Advertising often uses famous musicians and movie stars to promote products via the halo effect.


Cheerleader effect

The tendency for people to appear more attractive in a group than in isolation.

Example – Across five studies by Walker and Vul (2013), participants rated the attractiveness of male and female faces shown both in a group photo and in an individual photo, with the order of the photographs randomized. The people photographed received higher ratings in their group photos.


Positivity effect

Older adults favor positive over negative information in their memories.

Example – On online social networks like Twitter and Instagram, users prefer to share positive news, and are emotionally affected by positive news more than twice as strongly as by negative news.

Not invented here

Aversion to contact with or use of products, research, standards, or knowledge developed outside a group. Related to IKEA effect.

Example – In programming, it is also common to refer to the “NIH syndrome” as the tendency towards reinventing the wheel (reimplementing something that is already available) based on the belief that in-house developments are inherently better suited, more secure, more controlled, quicker to develop, and incur lower overall cost (including maintenance cost) than using existing implementations.

Reactive devaluation 

Devaluing proposals only because they purportedly originated with an adversary.

Example – Stillinger and co-authors asked pedestrians in the US whether they would support a drastic bilateral nuclear arms reduction program. If they were told the proposal came from President Ronald Reagan, 90 percent said it would be favorable or even-handed to the United States; if they were told the proposal came from a group of unspecified policy analysts, 80 percent thought it was favorable or even; but, if respondents were told it came from Mikhail Gorbachev only 44 percent thought it was favorable or neutral to the United States.

Well-traveled road effect

Underestimation of the duration taken to traverse oft-traveled routes and overestimation of the duration taken to traverse less familiar routes.

Example – This effect creates errors when estimating the most efficient route to an unfamiliar destination, when one candidate route includes a familiar route, whilst the other candidate route includes no familiar routes. The effect is most salient when subjects are driving, but is still detectable for pedestrians and users of public transport.

Singularity effect

The tendency to behave more compassionately to a single identifiable individual than to any group of nameless ones.

Example – Part of compassion fade, but it concerns an individual vs. a group rather than a small group vs. a large group. Joseph Stalin: “The death of one man is a tragedy, the death of millions is a statistic.”


Groupthink

Occurs within a group of people in which the desire for harmony or conformity in the group results in an irrational or dysfunctional decision-making outcome. Group members try to minimize conflict and reach a consensus decision without critical evaluation of alternative viewpoints by actively suppressing dissenting viewpoints, and by isolating themselves from outside influences.

Example – This builds on ingroup vs. outgroup dynamics, a basic concept in psychology, and it underlies many other group biases.

Compassion fade

The predisposition to behave more compassionately towards a small number of identifiable victims than to a large number of anonymous ones.

Example – Like the singularity effect, but comparing a small group with a large group rather than an individual with a group.

Anthropocentric thinking

The tendency to use human analogies as a basis for reasoning about other, less familiar, biological phenomena.

Example – George Orwell’s novel Animal Farm. Originally the animals planned for liberation from humans and animal equality, as evident from the “seven commandments” such as “Whatever goes upon two legs is an enemy”, “Whatever goes upon four legs, or has wings, is a friend”, and “All animals are equal”; the pigs would later abridge the commandments with statements such as “All animals are equal, but some animals are more equal than others” and “Four legs good, two legs better.”

Chapter Nine

We simplify probabilities and numbers to make them easier to think about

Our subconscious mind is terrible at math and generally gets all kinds of things wrong about the likelihood of something happening if any data is missing.

Mental accounting

Psychological accounting attempts to describe the process whereby people code, categorize and evaluate economic outcomes.

Example – Automotive dealers, for example, benefit from these principles when they bundle optional features into a single price but segregate each feature included in the bundle (e.g., heated seats, heated steering wheel, mirror defrosters).

Appeal to probability fallacy

An appeal to probability is the logical fallacy of taking something for granted because it would/might probably be the case. Inductive arguments lack deductive validity and must therefore be asserted or denied in the premises.

Example – “If I do not bring my umbrella (premise), then it will rain” (invalid conclusion). Murphy’s law is a (typically deliberate, tongue-in-cheek) invocation of the fallacy.


Normalcy bias

The refusal to plan for, or react to, a disaster which has never happened before.

Example – As for events in world history, the normalcy bias explains why, when the volcano Vesuvius erupted, the residents of Pompeii watched for hours without evacuating. It explains why thousands of people refused to leave New Orleans as Hurricane Katrina approached.

Murphy’s Law

Murphy’s law is an adage or epigram that is typically stated as: “Anything that can go wrong will go wrong”.

Example – Richard Dawkins gives an example: aircraft are in the sky all the time, but are only taken note of when they cause a problem. This is a form of confirmation bias, whereby the investigator seeks out evidence to confirm already formed ideas but does not look for evidence that contradicts them.

Zero sum bias

A bias whereby a situation is incorrectly perceived to be like a zero-sum game (i.e., one person gains at the expense of another).

Example – When politicians argue that international trade must mean that one party is winning and another is losing when transfer of goods and services at mutually-agreeable prices is in general mutually beneficial, or that a trade deficit represents losing money to another country.

Survivorship bias

Concentrating on the people or things that “survived” some process and inadvertently overlooking those that didn’t because of their lack of visibility.

Example – Diagoras of Melos was asked concerning paintings of those who had escaped shipwreck: “Look, you who think the gods have no care of human things, what do you say to so many persons preserved from death by their especial favor?”, to which Diagoras replied: “Why, I say that their pictures are not here who were cast away, who are by much the greater number.”


Subadditivity effect

The tendency to judge probability of the whole to be less than the probabilities of the parts.

Example – In one study, subjects estimated the probability of death from cancer at 18%, from heart attack at 22%, and from “other natural causes” at 33%, a combined 73%, while other subjects asked directly about death from any natural cause estimated it at only 58%, less than the sum of its own parts.

Denomination effect

The tendency to spend more money when it is denominated in small amounts (e.g., coins) rather than large amounts (e.g., bills).

Example – Each participant was given $5 as either five $1 bills or one $5 bill and told they could spend the money at the gas station store. Customers who were given five $1 bills were more likely to buy something compared to customers receiving a single $5 bill.


The magical number 7 ± 2

The notion that the number of objects an average human can hold in short-term memory is 7 ± 2.

Example – We do have a limited short-term memory, but it is hard to measure precisely in object counts.

Selection bias

The tendency to notice something more once something has made us more aware of it, such as when we buy a car and suddenly notice similar cars more often than we did before. They are not suddenly more common; we are just noticing them more.

Example – Selection bias covers a large family of biases in how samples are drawn: sampling bias; susceptibility bias (including clinical susceptibility bias, protopathic bias, and indication bias); self-selection bias; and attrition bias.

Attribute substitution 

Occurs when a judgment has to be made that is computationally complex, and instead a more easily calculated heuristic attribute is substituted. This substitution is thought of as taking place in the automatic intuitive judgment system, rather than the more self-aware reflective system.

Example – Americans were offered insurance against their own death in a terrorist attack while on a trip to Europe, while another group was offered insurance that would cover death of any kind on the trip. The former group was willing to pay more even though “death of any kind” includes “death in a terrorist attack”. Kahneman suggests that the attribute of fear is being substituted for a calculation of the total risks of travel.

Chapter Ten

We think we know what other people are thinking

In some cases this means that we assume that they know what we know, in other cases we assume they’re thinking about us as much as we are thinking about ourselves. It’s basically just a case of us modeling their own mind after our own (or in some cases after a much less complicated mind than our own).

Illusion of transparency

A tendency for people to overestimate the degree to which their personal mental state is known by others.

Example – When confronted with a potential emergency, people typically play it cool, adopt a look of nonchalance, and monitor the reactions of others to determine if a crisis is really at hand. No one wants to overreact, after all, if it might not be a true emergency. However, because each individual holds back, looks nonchalant, and monitors the reactions of others, sometimes everyone concludes (perhaps erroneously) that the situation is not an emergency and hence does not require intervention.


Curse of knowledge

When better-informed people find it extremely difficult to think about problems from the perspective of lesser-informed people.

Example – The teacher already has the knowledge that he or she is trying to impart, but the way that knowledge is conveyed may not be the best for those who do not already possess the knowledge.

Spotlight effect

The tendency to overestimate the amount that other people notice your appearance or behavior.

Example – A study indicated that certain situations in which perceivably embarrassing items are factors, such as an embarrassing t-shirt, increase the extent to which the spotlight effect is experienced by an individual.

Extrinsic incentive error

An exception to the fundamental attribution error: the tendency to attribute others’ behavior to extrinsic, situational incentives while attributing one’s own behavior to intrinsic, dispositional motivation.

Example – MBA students were asked to rank the expected job motivations of Citibank customer service representatives. They rated “amount of pay” highest, while the workers themselves ranked it very low. Of course, this does not show that pay was unimportant to the workers; they merely claimed it was.

Illusion of external agency

When people view self-generated preferences as instead being caused by insightful, effective and benevolent agents.

Example – This covers three related illusions: the illusion of influence, the illusion of insight, and the illusion of benevolence.

Illusion of asymmetric insight

People perceive their knowledge of their peers to surpass their peers’ knowledge of them.

Example – A study found that people seem to believe that they know themselves better than their peers know themselves and that their social group knows and understands other social groups better than other social groups know them.

Sexual over/underperception bias

The tendency to overestimate or underestimate another person’s sexual interest in oneself.

Example – This is a basic evolutionary psychology hypothesis. Men tend to overperceive sexual interest and may state things like: “Did you see that? She’s totally into me.” While women tend to underestimate sexual interest from men and say: “He’s just a good friend.”

Chapter Eleven

We project our current mindset and assumptions on to the past and future

This is magnified by the fact that we’re not very good at imagining how quickly or slowly things will happen or change over time.

Hostile attribution bias

The tendency to interpret others’ behaviors as having hostile intent, even when the behavior is ambiguous or benign.

Example – Substantial literature has documented a robust association between hostile attribution bias and aggression in youth. Hostile attribution bias is particularly linked to relational problems in adulthood, including marital conflict/violence and marital/relationship dissatisfaction.

Telescoping effect

The tendency to displace recent events backward in time and remote events forward in time, so that recent events appear more remote, and remote events, more recent.

Example – A real-world example of the telescoping effect is the case of Ferdi Elsas, an infamous kidnapper and murderer in the Netherlands. When he was let out of prison, most of the general population did not believe he had been in prison long enough. Due to forward telescoping, people thought Ferdi Elsas’ sentence started more recently than it actually did.

Rosy retrospection 

People sometimes judge the past disproportionately more positively than the present. The Romans occasionally referred to this phenomenon with the Latin phrase “memoria praeteritorum bonorum”, which translates into English roughly as “the past is always well remembered”.

Example – In one group of experiments, three groups going on different vacations were interviewed before, during, and after their vacations. Most followed the pattern of initially positive anticipation, followed by mild disappointment thereafter. Generally, most subjects reviewed the events more favorably some time after the events had occurred than they did while experiencing them.

Hindsight bias

Sometimes called the “I-knew-it-all-along” effect, the tendency to see past events as being predictable at the time those events happened.

For a more in-depth understanding of the hindsight bias, check out this article by Kent Hendricks.

Example – After an economic recession or a product failure, some analysts will claim they predicted the failure all along. When a political candidate or party loses an election, there will be a lot of “I knew they wouldn’t win, and this is why…” after the fact.

Outcome bias

The tendency to judge a decision by its eventual outcome instead of based on the quality of the decision at the time it was made.

Example – One such example involved a surgeon deciding whether or not to do a risky surgery on a patient. The surgery had a known probability of success. Subjects were presented with either a good or bad outcome (in this case living or dying), and asked to rate the quality of the surgeon’s pre-operation decision. Those presented with bad outcomes rated the decision worse than those who had good outcomes.

Moral luck

The tendency for people to ascribe greater or lesser moral standing based on the outcome of an event.

Example – Consider Nazi followers and supporters in Hitler’s Germany. They were and are worthy of moral blame either for committing morally reprehensible deeds or for allowing them to occur without making efforts to oppose them. But, if in 1929, those people were moved to some other country, away from the coming hostilities by their employers, it is quite possible that they would have led very different lives, and we could not assign the same amount of moral blame to them.


Declinism

The predisposition to view the past favorably (rosy retrospection) and the future negatively.

Example – Declinism has been found to be rather widespread in the United Kingdom. In a 2015 survey, 70% of Britons agreed with the statement that “things are worse than they used to be,” even though at the time Britons were in fact “richer, healthier and longer-living than ever before.”


Impact bias

The tendency to overestimate the length or the intensity of the impact of future feeling states.

Example – Participants were asked to forecast how they would feel immediately after learning whether they had been hired or rejected for a job. Ten minutes after learning the outcome, both groups felt much better than they had originally predicted, demonstrating the impact bias.

Pessimism bias

The tendency for some people, especially those suffering from depression, to overestimate the likelihood of negative things happening to them.

Example – Some voters in rich countries claim they will have to move out of the country if the opposition leader is elected, as things will get unbearable and the economy will tank. After a loss, however, they claim to have changed their mind and want to stay and “fight”.

Planning fallacy

The tendency to underestimate task-completion times.

Example – The Sydney Opera House was expected to be completed in 1963. A scaled-down version opened in 1973, a decade later. The original cost was estimated at $7 million, but its delayed completion led to a cost of $102 million.

Time-saving bias

Underestimations of the time that could be saved (or lost) when increasing (or decreasing) from a relatively low speed and overestimations of the time that could be saved (or lost) when increasing (or decreasing) from a relatively high speed.

Example – In general, people underestimate the time that could be saved when increasing from a relatively low speed (e.g., 25 mph or 40 km/h) and overestimate the time that could be saved when increasing from a relatively high speed (e.g., 55 mph or 90 km/h).
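The arithmetic behind this bias is easy to check: travel time over a fixed distance is distance divided by speed, so the same speed increase saves far more time from a low base than from a high one. A minimal sketch (the distances and speeds are illustrative, not from any study):

```python
# Actual time saved over a fixed distance when raising cruising speed,
# showing why a +10 mph gain matters much more at low speeds.

def minutes_for_trip(distance_miles: float, speed_mph: float) -> float:
    """Travel time in minutes for a given distance and speed."""
    return distance_miles / speed_mph * 60


def minutes_saved(distance_miles: float, old_mph: float, new_mph: float) -> float:
    """Minutes saved by raising cruising speed from old_mph to new_mph."""
    return minutes_for_trip(distance_miles, old_mph) - minutes_for_trip(distance_miles, new_mph)


# Over a 10-mile trip, the same +10 mph increase saves:
low = minutes_saved(10, 25, 35)    # starting from a low speed
high = minutes_saved(10, 55, 65)   # starting from a high speed
print(f"25 to 35 mph saves {low:.1f} min; 55 to 65 mph saves {high:.1f} min")
```

Because time is inversely proportional to speed, the savings curve flattens out quickly: going from 25 to 35 mph saves roughly four times as many minutes as going from 55 to 65 mph over the same distance, which is exactly the relationship people misjudge.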

Pro-innovation bias

The tendency toward excessive optimism about an invention or innovation’s usefulness throughout society, often failing to identify its limitations and weaknesses.

Example – Roger Smith, then chairman of General Motors, said in 1986: “By the turn of the century, we will live in a paperless society.”


Projection bias

The tendency to overestimate how much one’s future self will share one’s current preferences, thoughts, and values, leading to sub-optimal choices.

Example – The so-called “disability paradox” describes the discrepancy between the self-reported happiness levels of chronically ill people and healthy people’s predictions of those levels.

Restraint bias

The tendency to overestimate one’s ability to show restraint in the face of temptation.

Example – An individual’s temptation, and their inability to control it, can come from several different visceral impulses, including hunger, sexual arousal, and fatigue. These impulses provide information about the current state of the body and the behavior needed to keep it satisfied.

Self-consistency bias

Incorrectly remembering one’s past attitudes and behavior as resembling present attitudes and behavior.

Example – Undergraduate students were asked to rate their anxiety and emotions approaching a midterm exam as well as one week after the exam. Those who found out they had done well generally underestimated pre-test anxiety while those who found out they did poorly generally overestimated pre-test anxiety.

This consistency effect also pushes behavior to fit in with the group; it is like the default effect, but for self-perception.

Travis Syndrome

Overestimating the significance of the present. It is related to chronological snobbery, possibly with an appeal-to-novelty logical fallacy as part of the bias.

Example – Since we can only act in the present, our instincts tend to focus on taking the most profitable choice right now, leading us to overvalue the present over the past and future.


When time as perceived by the individual either lengthens, making events appear to slow down, or contracts, making them appear to speed up.

Example – Psychoactive drugs can alter the judgement of time. These include traditional psychedelics such as LSD, psilocybin, and mescaline as well as the dissociative class of psychedelics such as PCP, ketamine and dextromethorphan. At higher doses time may appear to slow down, speed up or seem out of sequence.

Next Part

Ultimate List of Cognitive Biases with examples (Part 3 of 4) – Not enough time