As discussed in my last post, motivated reasoning is a pattern of thinking used in decision making (and elsewhere, I suppose). It appears on its face to be evidence-based reasoning, but it departs from the evidence at important junctures to produce the conclusions we preferred from the outset. Those departures are possible, for example, when we weigh the strength of arguments or evidence, when we set priorities for investigations and evidence gathering, when we interpret the evidence we find, when we allocate meeting time for discussion of particular points, or when we choose less capable or more capable individuals to present material relevant to pending decisions.
Motivated reasoning is thus a source of bias in workplace decision making. But in combination with certain cognitive biases, its effects can be profound. In this post I explore the synergistic effects of motivated reasoning in combination with a cognitive bias known as the Pseudocertainty Effect. [Tversky 1981]
The Pseudocertainty Effect is a cognitive bias that affects our ability to make good decisions in situations that involve projecting outcomes from a sequence of decisions under uncertainty. Because of the Pseudocertainty Effect, we humans have a tendency to focus only on the final decision in the sequence, ignoring earlier-stage uncertainties.
Explaining the Pseudocertainty Effect is a little easier if I begin with the Certainty Effect. [Kahneman 1979.1] The Certainty Effect is a cognitive bias that causes us to value too highly the utility of those outcomes that are certain, as compared to the utility of outcomes that are merely probable. The use of the word "highly" is a bit tricky here, because the Certainty Effect applies for both welcome and unwelcome outcomes — for both positive and negative utility. That is, the Certainty Effect also causes us to overestimate the "damage" of an unwelcome outcome when that outcome is certain, as compared to the damages of unwelcome outcomes that are merely probable.
Let's return now to the Pseudocertainty Effect. Consider a two-stage decision process. In the first stage we must choose between two options, Opt11 and Opt12, that produce different outcomes, O11 and O12, with probabilities P11 and P12 respectively. Similarly, the second stage also has two options, Opt21 and Opt22, that produce different outcomes, O21 and O22, with probabilities P21 and P22 respectively. However, in this scenario, we reach the second stage only if we choose Option Opt11, and we succeed in achieving Outcome O11. Since the probability of Outcome O11 is only P11, reaching the second stage isn't a sure thing.
Now it gets a little tricky.
In the second stage, the respective probabilities are P21 and P22. If P21 is 100%, that is, if the probability of Option Opt21 producing Outcome O21 is 100%, then the Certainty Effect would tend to cause us to overvalue O21.
But now, because we're in a two-stage decision process, the actual probability of Outcome O21 isn't 100%. It's only P11*P21, assuming that the option outcome distributions are probabilistically independent of each other. But Kahneman and Tversky demonstrated experimentally that people tend to overvalue Outcome O21 in a manner analogous to how they would have treated it if it actually were certain, that is, if P11*P21 were 100%. Hence the term Pseudocertainty Effect. That is, people tend to disregard the fact that the first stage of this two-stage scenario imposes a probability distribution that affects the final outcome. Instead, people focus only on the final stage of the two-stage scenario.
The experimental results suggest that people tend to "assume away" the uncertainties of the first stage of a two-stage decision string, and choose options only on the basis of the final stage. Or, at least, they give too little weight to the uncertainties of the earlier stages. This is the essence of the Pseudocertainty Effect.
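The arithmetic behind this effect can be sketched in a few lines of Python, using the post's notation (P11, P21, and so on). The specific probabilities and payoff values below are hypothetical, chosen only to make the numbers concrete.

```python
# Sketch of the two-stage scenario described above. All figures are
# hypothetical illustrations, not data from the original experiments.

def compound_probability(p_stage1, p_stage2):
    """Probability of the final outcome, assuming the stages are
    probabilistically independent."""
    return p_stage1 * p_stage2

P11 = 0.25   # probability of reaching the second stage
P21 = 1.00   # Opt21 yields O21 "for certain" -- once you get there
P22 = 0.80   # Opt22 yields O22 with 80% probability

V21 = 30.0   # value of outcome O21 (hypothetical)
V22 = 45.0   # value of outcome O22 (hypothetical)

# How the Pseudocertainty Effect frames the choice: stage 1 is
# ignored, so Opt21 looks like a sure thing.
naive_ev_21 = P21 * V21   # 30.0
naive_ev_22 = P22 * V22   # 36.0

# The actual expected values account for the stage-1 uncertainty.
actual_ev_21 = compound_probability(P11, P21) * V21   # 7.5
actual_ev_22 = compound_probability(P11, P22) * V22   # 9.0

print(f"Framed as certain:  Opt21={naive_ev_21}, Opt22={naive_ev_22}")
print(f"Actual expectation: Opt21={actual_ev_21}, Opt22={actual_ev_22}")
```

The multiplication doesn't change which option has the higher expected value; what changes is the psychology. Framed in isolation, Opt21 looks certain, and the Certainty Effect makes it attractive, even though its true probability is only P11*P21, here 25%.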
Presumably, this phenomenon also applies to multi-stage scenarios, and to scenarios in which the probability distributions of the various stages aren't entirely independent.
Synergistic effects of Pseudocertainty Effect and Motivated Reasoning
The Pseudocertainty Effect has implications for workplace decision making in the context of motivated reasoning. Either phenomenon, acting alone, can be costly. But when both are acting they display a synergy that can be especially pernicious. For example, consider risk management.
In a typical risk management problem, we identify five attributes of a risk: the risk event, its probability, its impact, a response if it materializes, and a mitigation strategy. What makes risk analysis so interesting is that some risks cannot materialize unless other risks materialize first, forming a "risk string." For example, some neighborhoods in Houston, Texas, can flood only if (a) a hurricane passes over the area and (b) a dam fails as a result of rainfall so extreme that the dam cannot withstand the pressure of the accumulated water. These two events combine to form an example of a risk string of length 2. Longer strings are clearly possible.
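The probability that an entire risk string materializes is the product of each risk's probability conditional on its predecessor. A minimal sketch, using made-up probabilities for the Houston example (they are illustrations, not estimates of the real risks):

```python
# A minimal model of a "risk string": a chain of risks in which each
# risk can materialize only if the previous one has.

def string_probability(conditional_probs):
    """Probability that the whole string materializes, given each
    risk's probability conditional on its predecessor materializing."""
    p = 1.0
    for cp in conditional_probs:
        p *= cp
    return p

# The Houston example as a string of length 2 (hypothetical figures):
p_hurricane = 0.10   # P(hurricane passes over the area)
p_dam_fails = 0.05   # P(dam fails | that hurricane's rainfall)

p_flood = string_probability([p_hurricane, p_dam_fails])
print(f"P(neighborhood floods) = {p_flood:.3f}")   # 0.005
```

Even with a fairly likely first link, the string as a whole can be far less likely than any single link, which is exactly the arithmetic the Pseudocertainty Effect tempts us to ignore.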
Because analyzing risk strings inherently produces staged decision strings, risk strings provide a setting in which the Pseudocertainty Effect can take hold. But the effect will be even more significant if we can identify a preferred outcome. That is, if we're also at risk of engaging in motivated reasoning, then the Pseudocertainty Effect can cause some real trouble.
For example, consider a risk A' that can materialize only if risk A materializes and is successfully addressed by OptA, one of the options defined for risk A. The decisions regarding how much to invest in mitigating either A or A' (or both) satisfy the structure Kahneman and Tversky studied in their research into the Pseudocertainty Effect. Moreover, in risk analysis, decision makers have a clear preference for investing a minimum amount in risk mitigation. They are thus at risk of engaging in motivated reasoning to justify low-cost options for risk management.
Probably the most dangerous case occurs when OptA', one of the options for dealing with risk A', is very low cost but nearly certain to work. Risk managers will be tempted to rely on that option, despite the fact that the probability of its succeeding also depends on the success of Option OptA. Because of the Pseudocertainty Effect, risk managers will tend to ignore the probabilities associated with Risk A. That leads them to over-invest in preparing for OptA', which they regard as leading to a favored outcome because it is low cost.
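The OptA' trap above can be made concrete with a short sketch. All of the figures are hypothetical; the point is only the contrast between the two framings.

```python
# Sketch of the OptA' trap described above. All figures hypothetical.

p_A = 0.20           # P(risk A materializes)
p_OptA = 0.50        # P(OptA succeeds | A materializes)
p_OptAprime = 0.95   # P(OptA' succeeds) -- "nearly certain"

loss_if_Aprime = 100_000   # loss if A' materializes unmitigated

# Pseudocertainty framing: treat the path to A' as if it were certain,
# so OptA' appears to avert nearly the full loss.
framed_benefit = p_OptAprime * loss_if_Aprime   # 95,000

# Actual framing: A' can arise only if A materializes AND OptA succeeds.
p_reach_Aprime = p_A * p_OptA                   # 0.10
actual_benefit = p_reach_Aprime * p_OptAprime * loss_if_Aprime   # 9,500

print(f"Benefit as framed:       {framed_benefit:,.0f}")
print(f"Actual expected benefit: {actual_benefit:,.0f}")
```

In this illustration the framed benefit overstates the actual expected benefit by a factor of ten, which is how the Pseudocertainty Effect can make a "nearly certain" low-cost option look far more valuable than it is.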
An example from history
The image here shows the battleship USS Arizona, burning during the Japanese attack on the U.S. naval base at Pearl Harbor, Hawaii, December 7, 1941. Naval losses were extraordinary, but losses of aircraft were no less severe. Mark Parillo, a military historian, writes, "U.S. [aircraft] losses amounted to 188 aircraft destroyed and another 159 damaged of the 402 aircraft present when the raid began." [Parillo 2006] Among the factors contributing to the aircraft losses was the decision to position the aircraft for defense against sabotage, instead of defense against air attack. Aircraft at Wheeler Field, for example, "were taken out of the U-shaped earthen bunkers that had been built for their protection." [Correll 2007] Aircraft were also disarmed, and in some cases, rounds were removed from their belts to make storage more efficient. These actions led to delays in mounting an effective defense against air attack.

The strategic decision to defend against sabotage instead of air attack can be regarded as the first stage of a scenario like those Kahneman and Tversky studied. A second stage could be the deployment of personnel around the island. Personnel who would be needed for air defense on the day of the attack were either off duty or standing guard at facilities at some remove from the airfields. These decision strings would have produced a successful defense against sabotage, but as we now know, they produced an unsuccessful defense against air attack. The outcome suggests that the Pseudocertainty Effect might have played a role.
Last words
In situations that involve risk strings and motivated reasoning, trusting to intuition is likely to run afoul of the Pseudocertainty Effect. Careful mathematical analysis of all options under consideration offers a path with a minimum of exposure to the Pseudocertainty Effect.
This article in its entirety was written by a human being. No machine intelligence was involved in any way.
Point Lookout is a free weekly email newsletter.