When a decision we've made leads to serious trouble, it's possible that a cognitive bias known as Neglect of Probability played a role. This bias can affect decision making by causing us to choose among options based solely on the values of those options' respective "best outcomes." Using this criterion leads to trouble when the outcomes are uncertain, because it ignores the probabilities of actually achieving those outcomes.
Stated a bit more precisely, a rational decision would be based on considering both the value of an option's potential outcome and the probability of actually generating that outcome. Instead, when we're under the spell of Neglect of Probability, we tend to assess the goodness of an option by focusing solely (or excessively) on the value of its best outcome, while ignoring the probability of achieving that outcome.
An illustration might clarify this effect further.
The IT department at Dewey, Cheatham and Howe, LLP (a fictitious global law firm) is upgrading the operating systems of DCH's fleet of personal computers from PC-OS 11 to PC-OS 12 (a fictitious operating system). The task would be relatively straightforward were it not for the enormous number of commercial and custom applications running on those computers. DCH's experts expect that the change from PC-OS 11 to PC-OS 12 will render many of those applications unusable. Of the 18,000 applications in all, most are expected to operate correctly, but many will not, and the full list of questionable applications is unknown.
Testing all 18,000 applications is an impractically large effort. But by working with the vendors of the commercial applications, and by collaborating with IT departments in other law firms, DCH has reduced the number of applications whose status is unknown to a mere 1,500. That number is less daunting, but it's still impossibly large.
IT has therefore decided to let the users of each questionable application perform the testing and certification, in three phases. In Phase I, an application expert verifies that the app operates under PC-OS 12. If it does, the app enters Phase II: 10% of its users are authorized to use it for up to 30 days. If they report no problems, the app is cleared for Phase III, general use. If the Phase II users do report problems, usage of the app is suspended while IT works with the vendor or internal author to resolve the issue. When the issue is resolved, the app returns to Phase II for another 30 days, and the procedure repeats until the app is cleared.
In this way, IT can reduce the number of apps that need more thorough testing. And when an app functions properly, the cost of determining that it does so is very low. These cost-control features are very attractive to IT decision makers.
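For readers who like to see such procedures spelled out, here is a minimal sketch of the phased certification workflow described above. It is only an illustration: the App class, the certify helper, and the phase names are invented for this example and are not part of DCH's (fictitious) systems.

```python
# A minimal sketch of the three-phase certification procedure described
# above. Not DCH's actual system; names and helpers are assumptions.

from dataclasses import dataclass

PHASE_I = "Phase I: expert check"
PHASE_II = "Phase II: 10% of users, up to 30 days"
PHASE_III = "Phase III: general use"

@dataclass
class App:
    name: str
    phase: str = PHASE_I

def certify(app: App, expert_check_passes: bool, pilot_rounds) -> App:
    """Advance one app through the phases.

    pilot_rounds is an iterable of booleans, one per 30-day pilot,
    where True means the pilot users reported problems.
    """
    if not expert_check_passes:
        return app                 # remains in Phase I
    for problems_reported in pilot_rounds:
        app.phase = PHASE_II       # pilot group uses the app
        if not problems_reported:
            app.phase = PHASE_III  # cleared for general use
            break
        # Problems reported: usage suspended, issue resolved with the
        # vendor or internal author, then another Phase II pilot.
    return app

# Example: the first pilot surfaces problems, the second is clean.
print(certify(App("TimeTracker"), True, [True, False]).phase)
```

In this example call, the first 30-day pilot surfaces problems and the second is clean, so the app ends up cleared for general use.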
But there's a problem with this approach. In the 30 days during which Phase II users operate untested apps, those untested apps might corrupt existing data or documents, without the knowledge of the users of the apps. Based on prior experience with PC-OS 10, this corruption is almost certain to occur in many apps. Therefore, the probability of a successful outcome for IT's intended three-phase approach is very low. But the decision makers in IT who conceived of this plan are neglecting the probability of that good outcome. They're attracted by the low cost of the best outcome, and that has caused them to ignore the fact that some applications — IT knows not which ones — will almost certainly cause data corruption. In this example, IT's approach might have a very good outcome, but the probability of that outcome is small.
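To get a feel for just how small that probability can be, consider a purely hypothetical calculation. The numbers below are assumptions chosen for illustration, not figures from the DCH scenario.

```python
# Hypothetical numbers, assumed for illustration only. If each of the
# 1,500 questionable apps independently has just a 1% chance of silently
# corrupting data during its Phase II trial, the chance that the rollout
# escapes corruption entirely is vanishingly small.

p_corrupt_per_app = 0.01    # assumed probability that one app corrupts data
n_questionable_apps = 1500  # from the scenario above

p_all_clean = (1 - p_corrupt_per_app) ** n_questionable_apps
print(f"P(no app corrupts data) = {p_all_clean:.1e}")  # about 2.8e-07
```

Even with a per-app risk as low as 1%, the chance that all 1,500 questionable apps get through their pilots without corrupting anything is roughly 3 in 10 million.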
Neglect of Probability is most likely to play a role when a decision involves choosing among a set of options, some of which have outcomes very much more attractive than others. In such scenarios, Neglect of Probability tends to cause decision makers to choose options with too little regard for (or no regard at all for) the probabilities of success of the various options.
And even when decision makers do consider probabilities, the risk of a poor decision remains unacceptably high for some kinds of mission-critical decisions. In some cases, decision makers estimate probabilities in a biased way that favors the options whose outcomes they find most appealing. There are indeed many ways to mess things up.