In two previous posts, I examined four reasons why planning complex projects is so difficult. In the first post, I explored how we can be limited by our ability to imagine what we're planning, and by our access to knowledge of past failures. In the second, I considered external influences: one set is associated with a group of cognitive biases known as priming effects; another includes fads, dogma, regulations, and traditions.
Continuing this exploration of difficulties encountered in planning complex projects, I turn now to internal causes of trouble, namely, patterns in the way we think when we're constructing plans.
Magical number 7
In one of the most-cited papers in the entire psychology literature, George Miller reported experimental results regarding numerical limits to human cognition. [Miller 1956] There are limits to the number of "chunks" people can hold in short-term memory. And there are limits to our ability to distinguish among a defined set of stimuli so as to govern a choice among a defined set of associated responses. These limits, coincidentally, appear to lie in the range of 7±2 items.
The effects of these limits on planning activities are both subtle and profound. For example, a widely believed maxim in the presentation architecture community is known as the 6x6 rule. It states that presentation slides should contain no more than six bullet points of no more than six words each. Although the rule is motivated by Miller's observations, experimental confirmation of its value has been elusive.
However, the limits Miller described are real, and they can affect planning for complex projects whose parallel efforts number more than 7±2. For example, a project with 19 parallel streams of tasks tends to become unwieldy, because project managers might be unable to keep the attributes of so many work streams in mind at once.
The usual approach for dealing with this limit is to "chunk" the effort into pieces that are more manageable in number, and then, if necessary, to chunk each piece in turn. This approach is usually called analysis and synthesis. The sketch below illustrates the chunking step.
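To make the idea concrete, here's a minimal Python sketch of recursive chunking, applied to the 19-stream example above. The stream names and the chunk limit of seven are illustrative assumptions, not prescriptions.

```python
def chunk(items, limit=7):
    """Group items into at most `limit` chunks, chunking recursively
    so that no level of the hierarchy exceeds the limit."""
    if len(items) <= limit:
        return items  # few enough to hold in mind as-is
    size = -(-len(items) // limit)  # ceiling division
    groups = [items[i:i + size] for i in range(0, len(items), size)]
    return [chunk(g, limit) for g in groups]

# The 19 parallel work streams from the example above (names invented)
streams = [f"stream-{n}" for n in range(1, 20)]
print(chunk(streams))  # seven groups of at most three streams each
```

Each level of the resulting hierarchy stays within the 7±2 range, so a planner can reason about one layer of chunks at a time.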
Analysis and synthesis can be problematic, though, because the chunks sometimes interact with each other in unexpected ways, outside the descriptions we use for the synthesis. Indeed, some of what we attribute to "poor planning" is perhaps better regarded as an inevitable result of how humans think.
Ambiguity effect
The ambiguity effect is a cognitive bias that affects how we make decisions under uncertainty. [Ellsberg 1961] When choosing among options that have favorable outcomes, we tend to favor those options for which the outcome is more certain, even if less favorable. And we tend to avoid options for which the probability of a given favorable outcome is unknown, even if all possible outcomes of that option are favorable.
When devising plans for projects, the ambiguity effect can be costly indeed. For example, when considering a novel approach that offers great savings in cost and schedule, we might compare it unfavorably to a more familiar approach that's slower and more costly. The ambiguity effect causes us to favor the conventional approach more strongly than might be justified by the uncertainties of using an unconventional approach for the first time.
Mitigating the ambiguity effect requires careful estimation and objective computation, as in the sketch below.
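As one illustration of what such computation might look like, this Python sketch compares a familiar approach whose cost is nearly certain with a novel approach whose cost is uncertain. All figures and probabilities are invented for the example.

```python
conventional_cost = 1_000_000  # familiar approach: cost is (nearly) certain

# Novel approach: three plausible outcomes with estimated probabilities.
# Both the probabilities and the costs are assumptions for illustration.
novel_outcomes = [
    (0.6, 500_000),    # works as hoped
    (0.3, 800_000),    # partial rework needed
    (0.1, 1_400_000),  # fails; fall back to the conventional approach
]

expected_novel_cost = sum(p * cost for p, cost in novel_outcomes)
print(f"Expected cost of novel approach: {expected_novel_cost:,.0f}")
# 0.6*500,000 + 0.3*800,000 + 0.1*1,400,000 = 680,000, which is less
# than the certain 1,000,000, despite the ambiguity.
```

Even a rough computation like this can expose cases where the ambiguity effect, left unexamined, would steer us toward the more expensive option.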
The planning fallacy
In a 1977 report, Daniel Kahneman and Amos Tversky identified a particular cognitive bias, the planning fallacy, which afflicts planners. [Kahneman 1977] [Kahneman 1979] They distinguished two types of information used by planners. Singular information is specific to the project at hand; distributional information is drawn from similar past efforts. The planning fallacy is the tendency of planners to pay too little attention to distributional information and too much to singular information, even when the singular information is scanty or questionable. Planners tend to underestimate cost and schedule because they fail to harvest lessons from the distributional information, which is inherently more diverse and reliable than singular information.
The tendency to attend too little to distributional information afflicts us all as individuals, and it can afflict organizations as well. For example, many organizations conduct retrospectives or "lessons learned" exercises in connection with projects. But the information they collect, valuable though it might be to subsequent projects, isn't always archived in ways that facilitate its use by the leaders of those subsequent projects. It might be scattered, or stored within the project that generated it, rather than collected with other similar volumes into an organized library. In some organizations, it is actually classified and its use is restricted.
Such practices intensify the effects of the planning fallacy.
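One way to use distributional information, sketched below in Python, is in the spirit of reference-class forecasting: adjust the singular, inside-view estimate by the typical overrun observed in similar past projects. The estimate and the overrun ratios here are invented for illustration.

```python
from statistics import median

singular_estimate_weeks = 20  # the team's inside-view schedule estimate

# Ratios of actual to estimated duration for similar past projects
# (distributional information; these values are assumptions).
past_overrun_ratios = [1.1, 1.3, 1.4, 1.6, 1.8, 2.0, 2.4]

typical_overrun = median(past_overrun_ratios)
adjusted_estimate = singular_estimate_weeks * typical_overrun
print(f"Distribution-adjusted estimate: {adjusted_estimate:.0f} weeks")
# 20 * 1.6 = 32 weeks: the outside view suggests planning for far more
# time than the singular information alone would indicate.
```

Organizations that archive lessons-learned data in accessible, organized form make exactly this kind of adjustment possible.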
Knowing these patterns, and others like them, provides enormous advantages to planners. They can check their plans for these effects, and when they find indications of their presence, they can revise those plans to mitigate the effects. Of course, you have to plan on taking these steps from the outset. And that plan is itself subject to these same effects.
Footnotes
[Miller 1956] George A. Miller. "The Magical Number Seven, Plus or Minus Two: Some Limits on Our Capacity for Processing Information." Psychological Review 63:2 (1956), 81-97.
[Ellsberg 1961] Daniel Ellsberg. "Risk, Ambiguity, and the Savage Axioms." Quarterly Journal of Economics 75:4 (1961), 643-669.
[Kahneman 1977] Daniel Kahneman and Amos Tversky. "Intuitive Prediction: Biases and Corrective Procedures." Decision Research Technical Report PTR-1042-77-6, 1977.
[Kahneman 1979] Daniel Kahneman and Amos Tversky. "Intuitive Prediction: Biases and Corrective Procedures." TIMS Studies in Management Science 12 (1979), 313-327.