O.P.B. (optimistic planning bias)
Are you or your customer suffering from O.P.B. (optimistic planning bias)? If so, you are driving yourself and our delivery teams crazy, and you are not getting the results you and others expect.
How accurate are your project estimates? Research shows that most people suffer from optimistic planning bias. It is important to reverse this trend: underpromise and overdeliver. Moreover, if schedule delays manifest, they should be traceable to documented dependencies and risks. Dependencies and risks should be detailed in your weekly status reports. If and when a dependency, risk, or schedule delay manifests, link your communication about the basis of the delay to the relevant dependency/risk in your weekly reporting and Project Change Requests (PCRs).
Optimism bias is the demonstrated systematic tendency for people to be over-optimistic about the outcome of planned actions. People tend to see the future through “rose-colored glasses,” as the saying goes. Optimism bias applies to professionals and laypeople alike. Optimism bias arises in relation to estimates of costs and benefits and duration of tasks. It must be accounted for explicitly in appraisals, if these are to be realistic. Optimism bias typically results in cost overruns, benefit shortfalls, and delays, when plans are implemented.
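Accounting for optimism bias explicitly in an appraisal can be as simple as applying an uplift to the raw estimate. A minimal sketch, assuming an illustrative 40% uplift; real guidance (for example, HM Treasury's Green Book supplementary guidance listed below) tabulates uplifts by project type and stage rather than using a single figure:

```python
# A sketch: applying an explicit optimism-bias uplift to a raw cost
# estimate. The 40% default is an invented illustrative figure, not an
# official value from any guidance document.

def adjusted_estimate(raw_cost: float, uplift: float = 0.40) -> float:
    """Scale a raw, optimistic cost estimate by an explicit uplift."""
    return raw_cost * (1 + uplift)

# A $7M raw estimate becomes $9.8M once the uplift is made explicit.
print(round(adjusted_estimate(7_000_000)))
```

The point is not the arithmetic but the discipline: the adjustment is visible and documented rather than hidden inside an "optimistic" single number.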
In a study in search of the brain regions responsible for optimism, researchers noted that “humans expect positive events in the future even when there is no evidence to support such expectations.”
Risk management avoidance
Risk management avoidance is the tendency to ignore risk. When risk is ignored in project initiation and planning, the result is usually unrealistic expectations and the ripple effects that follow from them: schedule and/or budget overruns, benefit shortfalls, quality shortfalls, and disharmony among the stakeholders. When risk is ignored, the project team loses the opportunity to avoid or mitigate risks. Ignoring risk promotes reactivity; risk management supports a proactive approach.
Risk avoidance is the tendency to avoid situations that may involve loss or failure, or in which there is significant uncertainty. Risk avoidance can be a problem because it leads to missed opportunities based on emotional rather than analytical motivations. Risk management avoidance is always a problem in PM.
The Zen of PM seeks to “see things” as they are, to avoid conditioned responses, and to be as realistic as possible. Being realistic means acknowledging positive and negative possibilities, as well as the inevitable uncertainty involved in any complex effort performed over time.
At some companies, it is common to inhibit the identification and analysis of project risk because those that bring up the risks are considered pessimists and defeatists. Project proponents are eager to convince sponsors and clients to authorize their projects. To them, risk analysis gets in the way. It makes an idea harder to sell if the uncertainty of expected returns is highlighted. Some project and product champions are deluded in thinking that their concept is actually perfect. They don’t see the reason to take a “Black Hat” perspective, thinking that since the idea is perfect nothing would come out of risk assessment.
Some examples that illustrate how we suffer from optimistic planning bias and risk management avoidance:
The Denver International Airport opened 16 months late, at a cost overrun of $2 billion (I’ve also seen $3.1 billion asserted). The Eurofighter Typhoon, a joint defense project of several European countries, was delivered 54 months late at a cost of £19 billion instead of £7 billion. The Sydney Opera House may be the most legendary construction overrun of all time, originally estimated to be completed in 1963 for $7 million, and finally completed in 1973 for $102 million.
Are these isolated disasters brought to our attention by selective availability? Are they symptoms of bureaucracy or government incentive failures? Yes, very probably. But there’s also a corresponding cognitive bias, replicated in experiments with individual planners.
Buehler et al. (1995) asked their students for estimates of when they (the students) thought they would complete their personal academic projects. Specifically, the researchers asked for estimated times by which the students thought it was 50%, 75%, and 99% probable their personal projects would be done. Would you care to guess how many students finished on or before their estimated 50%, 75%, and 99% probability levels?
13% of subjects finished their project by the time they had assigned a 50% probability level; 19% finished by the time assigned a 75% probability level; and only 45% (less than half!) finished by the time of their 99% probability level. As Buehler et al. (2002) wrote, “The results for the 99% probability level are especially striking: Even when asked to make a highly conservative forecast, a prediction that they felt virtually certain that they would fulfill, students’ confidence in their time estimates far exceeded their accomplishments.”
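The check behind this result is a simple calibration measurement: for each stated confidence level, count what fraction of tasks actually finished by the date given at that level. A sketch with invented data (a calibrated estimator would score roughly 50%, 75%, and 99%; an optimistic one scores far lower):

```python
# Calibration check sketch. Each row holds the days estimated at the
# 50%, 75%, and 99% confidence levels, plus the actual days taken.
# All numbers are invented for illustration.

predictions = [
    (10, 14, 21, 25),
    (5, 7, 12, 11),
    (20, 25, 40, 38),
    (8, 10, 15, 30),
]

def hit_rate(level_index: int) -> float:
    """Fraction of tasks finished within the estimate at one level."""
    hits = sum(1 for row in predictions if row[3] <= row[level_index])
    return hits / len(predictions)

for label, idx in [("50%", 0), ("75%", 1), ("99%", 2)]:
    print(f"{label} estimates met: {hit_rate(idx):.0%}")
```

Running this over your own team's estimates versus actuals is a cheap way to see whether the planning fallacy applies to you, rather than assuming it doesn't.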
More generally, this phenomenon is known as the “planning fallacy”. The planning fallacy is that people think they can plan, ha ha.
A clue to the underlying problem with the planning algorithm was uncovered by Newby-Clark et al. (2000), who found that:
- Asking subjects for their predictions based on realistic “best guess” scenarios; or
- Asking subjects for their hoped-for “best case” scenarios…

…produced indistinguishable results.
When people are asked for a “realistic” scenario, they envision everything going exactly as planned, with no unexpected delays or unforeseen catastrophes – the same vision as their “best case”.
Reality, it turns out, usually delivers results somewhat worse than the “worst case”.
Unlike most cognitive biases, we know a good debiasing heuristic for the planning fallacy. It won’t work for messes on the scale of the Denver International Airport, but it’ll work for a lot of personal planning, and even some small-scale organizational stuff. Just use an “outside view” instead of an “inside view”.
People tend to generate their predictions by thinking about the particular, unique features of the task at hand, and constructing a scenario for how they intend to complete the task – which is just what we usually think of as planning. When you want to get something done, you have to plan out where, when, how; figure out how much time and how much resource is required; visualize the steps from beginning to successful conclusion. All this is the “inside view”, and it doesn’t take into account unexpected delays and unforeseen catastrophes. As we saw before, asking people to visualize the “worst case” still isn’t enough to counteract their optimism – they don’t visualize enough Murphyness.
The outside view is when you deliberately avoid thinking about the special, unique features of this project, and just ask how long it took to finish broadly similar projects in the past. This is counterintuitive, since the inside view has so much more detail – there’s a temptation to think that a carefully tailored prediction, taking into account all available data, will give better results.
But experiment has shown that the more detailed subjects’ visualization, the more optimistic (and less accurate) they become. Buehler et al. (2002) asked an experimental group of subjects to describe highly specific plans for their Christmas shopping – where, when, and how. On average, this group expected to finish shopping more than a week before Christmas. Another group was simply asked when they expected to finish their Christmas shopping, with an average response of 4 days. Both groups finished an average of 3 days before Christmas.
Likewise, Buehler et al. (2002), reporting on a cross-cultural study, found that Japanese students expected to finish their essays 10 days before deadline. They actually finished 1 day before deadline. Asked when they had previously completed similar tasks, they responded, “1 day before deadline.” This is the power of the outside view over the inside view.
A similar finding is that experienced outsiders, who know less of the details, but who have relevant memory to draw upon, are often much less optimistic and much more accurate than the actual planners and implementers.
So there is a fairly reliable way to fix the planning fallacy, if you’re doing something broadly similar to a reference class of previous projects. Just ask how long similar projects have taken in the past, without considering any of the special properties of this project. Better yet, ask an experienced outsider how long similar projects have taken.
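The outside view can be mechanized as crude reference-class forecasting: ignore this project's "special" details and read the estimate off the distribution of durations from broadly similar past projects. A minimal sketch, with invented historical data:

```python
import statistics

# Durations (in weeks) of broadly similar past projects -- invented data.
past_durations = [14, 22, 18, 30, 16, 25, 40, 19]

def outside_view_estimate(history, quantile=0.8):
    """Estimate from the historical distribution, deliberately ignoring
    this project's unique features. A high quantile hedges against the
    tendency of reality to deliver results worse than the 'worst case'."""
    ordered = sorted(history)
    idx = min(int(quantile * len(ordered)), len(ordered) - 1)
    return ordered[idx]

print("median of similar projects:", statistics.median(past_durations))
print("80th-percentile estimate:", outside_view_estimate(past_durations))
```

The design choice is the quantile: the median reproduces a "typical" similar project, while a higher quantile builds in the overrun margin that inside-view plans systematically omit.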
You’ll get back an answer that sounds hideously long, and clearly reflects no understanding of the special reasons why this particular task will take less time. This answer is true. Deal with it.
Agree to a project plan based on fact-based commitments. This approach has the best chance of managing expectations and improving our chances of success. You’ll note that pretence-based and opinion-based commitments are likely to result in disharmony among the stakeholders. The diagrams illustrate the consequences of escalating disharmony, something we need to avoid.
- 1. The pretence-based commitment – not appropriate/acceptable
  - Has anyone walked out of a meeting having made promises that they didn’t really intend to keep?
  - Failing to communicate what you know is relevant (not telling the truth, withholding information)
  - A dishonest commitment to perform: I do not intend to deliver, or I do not believe it is possible to deliver, or both
  - A commitment intended to avoid conflict or some other perceived threat
- 2. The opinion-based commitment – relates in part to O.P.B. (optimistic planning bias) and risk management avoidance
  - New Year’s resolutions?
  - Left disconnected from real-world challenges and from others who are impacted, affected, or involved
  - An honest report of my personal opinion, including thoughtless certainty that my view is right
  - An honest commitment to perform
  - A well-intended commitment that does not accurately confront what will really be demanded of me
  - Those who provide opinion-based commitments tend to talk big and deliver little
- 3. The fact-based commitment – where we should aim to operate from; includes specifying possible risks, many of which have manifested on past complex projects and can be leveraged in your planning
  - Designed for success
  - The minimum condition for improvement is accuracy
  - Involves delivery employees in the development of project plans, leveraging their insights to build smart plans and genuine commitment
  - Reveals facts and compares explanations for value
  - A confident commitment to perform
  - A commitment informed by conditions, clear expectations, measurable outcomes, key relationships, and available resources
  - The source of confidence is the feasibility of the commitment
  - To present a confident, fact-based commitment and a unified response to project and technical issues, the members of the project should meet to agree on decisions and open issues prior to meeting with the customer, where possible. Otherwise, take actions away from meetings and telephone conversations to ensure you have an opportunity to present a unified response later.
Challenge – getting resources assigned and then commitments from them. Resource contention often exists across multiple groups. You need to avoid “committed” schedules going to the customer that are not properly resourced. You need to get clear enough commitments from others for the things you need them to do to keep the project on schedule. This enables you to make commitments and promises that you know you can keep, rather than hope you can keep. When the project is time-sensitive, you need to hold others to their promises so that you can do the same. Moreover, given the workload on some critical resource groups, you’ll need to check in with them periodically to see how everything is going and whether there is anything you can do to help them. Each follow-up gives you another opportunity to ask for a firm commitment (there is a risk that higher-priority work might gazump your project at any time).
An effective framework for communicating your estimate while explicitly acknowledging risk:
- I think the estimate is this…
- and here’s why…
- but it could be this, if that happens…
- and here’s what can be done about it, and what it costs…
- and here’s how we’ll know if we were right or wrong
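The five-part framework above can be sketched as a data structure, so that an estimate always travels with its rationale and its risks rather than as a bare number. All names below are illustrative, not from any standard:

```python
from dataclasses import dataclass, field

# Illustrative sketch of the five-part estimate framework. Field and
# method names are invented for this example.

@dataclass
class RiskedEstimate:
    estimate_weeks: float            # "I think the estimate is this..."
    rationale: str                   # "...and here's why..."
    risks: list = field(default_factory=list)

    def add_risk(self, trigger, revised_weeks, mitigation, mitigation_cost):
        # "...but it could be this, if that happens..." plus
        # "...here's what can be done about it, and what it costs..."
        self.risks.append({
            "trigger": trigger,
            "revised_weeks": revised_weeks,
            "mitigation": mitigation,
            "mitigation_cost": mitigation_cost,
        })

    def worst_case(self):
        # "...and here's how we'll know if we were right or wrong":
        # the actual outcome is judged against this communicated range.
        return max([self.estimate_weeks] +
                   [r["revised_weeks"] for r in self.risks])

e = RiskedEstimate(6, "similar integrations took 5-7 weeks")
e.add_risk("vendor API delayed", 9, "build a stub interface early",
           "1 extra week")
print(f"{e.estimate_weeks}-{e.worst_case()} weeks")
```

Communicating the range plus rationale, instead of a single optimistic point, is what makes the later "were we right or wrong?" conversation possible.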
Estimates that turn out to be 100% accurate without a solid rationale are very rare. Focus not just on the accuracy of the numbers, but on the quality of the rationale behind them, to ensure the facts are understood.
Sources and further readings
Flyvbjerg, Bent (2003). “Delusions of Success: Comment on Dan Lovallo and Daniel Kahneman.” Harvard Business Review, December, pp. 121–122.
Flyvbjerg, Bent (2006). “From Nobel Prize to Project Management: Getting Risks Right.” Project Management Journal, vol. 37, no. 3, pp. 5–15.
Flyvbjerg, Bent and COWI (2004). Procedures for Dealing with Optimism Bias in Transport Planning: Guidance Document. London: UK Department for Transport.
Flyvbjerg, Bent, Mette K. Skamris Holm, and Søren L. Buhl (2002). “Underestimating Costs in Public Works Projects: Error or Lie?” Journal of the American Planning Association, vol. 68, no. 3, pp. 279–295.
HM Treasury (2003). Supplementary Green Book Guidance: Optimism Bias. London: HM Treasury.
Kahneman, Daniel and Dan Lovallo (2003). “Response to Bent Flyvbjerg.” Harvard Business Review, December, p. 122.
Lovallo, Dan and Daniel Kahneman (2003). “Delusions of Success: How Optimism Undermines Executives’ Decisions.” Harvard Business Review, July, pp. 56–63.
Virine, Lev and Michael Trumper (2008). Project Decisions: The Art and Science. Vienna, VA: Management Concepts. ISBN 978-1567262179.