Making plans for projects, campaigns, developments, complex events, or other collaborative activities is itself a collaborative activity. A common experience of plans, sadly, is their inadequacy, even when we invest significant resources in developing them. Indeed, it's often said that "no plan survives first contact" — with the enemy in a military context, or with reality in a project management context.
But the planning activity has a reputation perhaps worse than it deserves. Planning a complex undertaking is difficult to do well because of unpredictable changes in the context, or incomplete or inaccurate information about that context, or incomplete or inaccurate information about the undertaking itself. That much is understood and expected. But much of what's wrong with our plans is a direct result of the way we think about making plans. Our human limitations manifest themselves in plan deficiencies.
In these next posts, I "plan" to construct a short catalog of seven observations about how plan deficiencies arise from the way we think, or the way we go about developing plans. This first part addresses how we use our experience and preferences and our knowledge of past mistakes to develop plans.
- We make better plans for things we know, or favor, or can imagine
- When we make plans, we rely on experience, preference, and imagination. That's why our plans exhibit three kinds of biases. First, plans tend to anticipate better those events or conditions that have occurred in the past. As military conventional wisdom has it, we tend to plan to re-fight the last battle, or the last war.
- A second source of bias in our plans arises from our preferences or preconceptions. We tend to search for reasons that justify or coincide with our preferences or preconceived notions. We tend also to avoid searching for reasons why our preferences or preconceived notions about our plan might be mistaken. In this way, the data generated by our research tends to confirm our preferences and preconceptions. The pattern is so prevalent that psychologists have given it a name: confirmation bias. [Nickerson 1998]
- The third source of bias in our plans is a cognitive bias known as the Availability Heuristic. [Tversky 1973] We're using the Availability Heuristic when we judge the likelihood of a phenomenon by sensing the difficulty of imagining or understanding the string of events that contribute to its development. So if we have difficulty imagining a phenomenon, or if we have difficulty imagining the conditions that bring it about, we regard it as unlikely, and we tend not to address it effectively in our plans.
- We lack adequate information about failures
- When we make plans and choose approaches, we tend to focus on what has worked for us or for others in the past. We invest effort in understanding why a particular method is reliable, or why an approach is recommended. We try to be knowledgeable about "best practices." Usually, whatever we draw upon does need tailoring, but we use it as guidance nevertheless.
- We pay much less attention to failures. Finding information about "worst practices" or "less-than-best practices" or even "OK practices" is next to impossible. Failures are often buried quietly. We have difficulty consulting people in our own organizations who led past efforts that failed, because they have often been terminated, blocked from promotion, reassigned, or have left the organization. Other organizations rarely publish results of investigations into their own failures, even when they do publish stories of their successes. These are some of the reasons why our understanding of failures is much less thorough than our understanding of successes. In some cases, there are past failures of which current planners are completely unaware, even when the causes of those failures might be relevant to the planning task at hand.
- Our relative ignorance about failures might be a contributing factor when we repeat our own errors or the errors of others. We follow this pattern so often that psychologists have given it a name: survivorship bias. [Elton 1996] Survivorship bias is our tendency, when making plans or decisions, to pay too much attention to past events that we regard as successes, and too little attention to past events that we regard as failures.
- But failures have more to offer than mere patterns to avoid. If we truly understand a particular failure, sometimes we can identify those attributes of a failed approach that account for the failure and which could be adjusted. Occasionally these insights can lead to solutions for new problems that might be very valuable indeed.
In Part II, I'll examine the effects of factors external to the planning process, and therefore beyond the control of the planning team.
This article in its entirety was written by a human being. No machine intelligence was involved in any way.