Better Business Cases Start Here: Let’s Stop Pretending We Know the Future
A post about best practice in making predictions. Which is what a business case actually is…
Humans have always sought to predict the future…
Ancient China - Turtles
When the Shang, who lived in China around 3,000 years ago, needed to plan something, they would crack turtle shells.
Incense would be burned, music played, animals sacrificed and a fire built. Then the turtle shell would be placed into the fire. After sitting for hours in the smouldering charcoal, it would suddenly crack, with a loud bang.
The oracle would then interpret the shape of the crack, announce the result and ‘file’ the shell carefully. To file it, they would lightly paint the result onto the shell with a fine brush, inscribe it on the turtle shell with a knife and then fill the inscription with pigment. Then the shell would be added to the archive of existing shells in Anyang.
There were two key skills involved: the first was in the cracking of the shell. The turtle shell needed to sit in the fire for the right length of time at the right temperature to crack neatly. If it were too hot, it would shatter. Too cold, and it wouldn’t crack at all.
Secondly, there was the skill of reading the turtle shell. Sometimes the readings were predictions; sometimes plans. Questions posed to the turtle shells were generally presented either as true-or-false statements (for example, “In the next 10 days there will be no disasters?”) or as multiple-choice options (“To inspect the district of Lin, it should be Quin who does it or Bing who does it?”).
Thousands of carefully filed turtle shells have been found in Anyang: it is believed many more have been lost or were never found.
The ancient Chinese took their turtle shells seriously.
Ancient Greece - Stoned Women on Tripods
The oracle of Delphi was renowned for her ability to predict the future. She sat on a tripod and, in response to questions, spoke words that were believed to be the words of Apollo himself.
At first hearing, though, Apollo’s oracle seldom made a lot of sense, as the tripod was placed on top of a fissure in the rock which was a natural source of ethylene, a psychoactive drug. The oracle was basically high.
Luckily, there was a priest on hand, who would interpret the - literally - delphic words into actionable advice. It was the best improv job of the ancient world.
Around 550 BCE, King Croesus of Lydia asked whether he should invade Persia. The oracle’s babblings were translated by the priest as “Croesus, having crossed the Halys, will destroy a great empire”. So he crossed the River Halys (which bisects modern Turkey) on his way to invade Persia and, in the process, destroyed… his own empire.
When he returned to Delphi to complain, he was told that he “did not understand what was spoken or make further enquiry: for which now let him blame himself”.
We’ve all had customer service interactions like that.
Modern Times - Models
Turtles now being in short supply (presumably the Chinese burned them all) and Delphi’s natural supply of ethylene having been exhausted, we have to put our faith in alternative fortune-tellers. So we go for models.
Now, models are - of course - useful. Indeed, the most famous aphorism about models of all time is attributed to statistician George Box who said:
All models are wrong, but some are useful.
Unfortunately, despite the fact he said this fifty years ago, we’ve failed to internalise it.
We typically put blind faith into models (often far more than the creators would wish) and insert the outputs into budgets, plans, forecasts and our own annual objectives.
Then, like King Croesus, we get very grumpy when they don’t come true.
I’ve definitely been in a meeting where reality didn’t work out as the model had predicted, and the modeller (who’d done a seriously good job with Excel) told us, in slightly different language, that we “did not understand what was spoken or make further enquiry”. I think he advised that we now blame ourselves.
“Some Are Useful”
Models are immensely useful. The curious thing is that we tend to use them for the precise opposite of the thing they’re best at.
The great thing about a model is that it enables easy scenario testing. A model will typically have a whole bunch of different inputs, which are mathematically connected to produce a forecast outcome. That means you can get a good sense of what would happen if the unemployment rate goes down, or steel prices go up, or your price is 10% lower. Given that no-one knows what the future holds, it lets you identify the biggest risks. If you’re a manufacturer and a 10% rise in steel prices matters far less than a 10% fall in your selling price, then you know you really need to optimise for quality, to protect that price, rather than obsess over input costs.
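To make that concrete, here’s a minimal sketch of scenario testing in Python, using an entirely invented manufacturer’s margin model (every figure is made up for illustration):

```python
# A toy margin model for a hypothetical manufacturer. All numbers invented.
def annual_profit(units=10_000, price=500.0, steel_cost=120.0, other_costs=300.0):
    """Profit = volume x (selling price minus per-unit costs)."""
    return units * (price - steel_cost - other_costs)

base = annual_profit()

# Scenario testing: shock one input at a time and see which one bites hardest.
scenarios = {
    "steel +10%": annual_profit(steel_cost=120.0 * 1.10),
    "price -10%": annual_profit(price=500.0 * 0.90),
    "volume -10%": annual_profit(units=9_000),
}

for name, profit in scenarios.items():
    print(f"{name}: profit {profit:,.0f} (vs base {base:,.0f})")
```

Run it and the price shock dwarfs the steel shock: that gap, not the single “base case” number, is the insight the model was built to give you.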
Models are all about risks and uncertainty.
But that’s the opposite of how we use them.
We create these tools that are designed around the assumption that the future is uncertain and we use them to create certainty. How many times have you heard someone say “The model says…”.
One scenario is chosen as the ‘base’ and - by the time it’s been through the financial forecasting processes - is locked into business cases and bonus schemes as fixed. From then on, divergence is not fate, it’s fault.
How to Predict the Future
Remember George Box: models are useful but they’re wrong.
So here’s my guide to the best ways of predicting the future:
Two Way Doors - Just Do It
The simplest way to find out what will happen is to make it happen.
This is where the Amazon “One Way Door” / “Two Way Door” heuristic is very helpful.
Amazon categorise their decisions as being either a “One Way Door” (which, once you’re through, is hard to roll back) or a “Two Way Door” (if you don’t like the other side, you can step back through).
A “One Way Door” decision requires a business case. But many of the decisions we seek permission to make are “Two Way Door” decisions. In this case, just do it.
Interestingly, in a firm like Google or Amazon, you’ll get into trouble if you do seek approval before taking a Two Way Door decision. What are you wasting everyone’s time for? Just do it!
An alternative way of thinking about this is to ask if a decision is the equivalent of getting a haircut or a tattoo. How much time do you spend deciding to get a haircut? And how much time would you spend deciding on getting a tattoo? Apply the same ratios to your work.
Minimum Viable Business Cases
Some projects are big and require big budgets, so need to be treated as Big Decisions.
These are dangerous as it’s easy to fall into the temptation of thinking that you need to do a Big Model to predict what will happen.
There’s a great case study from my early career here. I realise today’s blog post is probably enough reading for one day, but do have a look if you’ve got time.
It describes how National Express spent a six-figure sum modelling a project and a seven-figure sum delivering the project, when a five-figure sum could have told them that it was not going to work.
What this case study illustrates is an absence of a Minimum Viable Business Case approach. Minimum Viable Business Cases aren’t suitable for everything: but you’d be surprised how much they can be used for.
The idea of a Minimum Viable Business Case is to only approve and spend enough money to narrow the range of possible outcomes. Then to approve a bit more. Then a bit more. Then a bit more. Each time you spend a bit more, you’re doing so having de-risked the project. Again, at risk of giving you a somewhat crazy reading list, here’s my detailed guide to Minimum Viable Business Cases.
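The staged-approval logic can be sketched in a few lines of Python. The tranche names, costs and NPV ranges below are all invented for illustration; the point is the stopping rule:

```python
# A hedged sketch of a Minimum Viable Business Case: approve spend in small
# tranches, each chosen to narrow the range of outcomes, and stop as soon as
# the range tells you the answer. All figures are invented.
tranches = [
    # (stage name, tranche cost, estimated NPV range after this work is done)
    ("desk study",   50_000,  (-2_000_000, 5_000_000)),
    ("market trial", 100_000, (-500_000, 3_000_000)),
    ("pilot",        250_000, (400_000, 2_000_000)),
]

spent = 0
for name, cost, (low, high) in tranches:
    spent += cost
    print(f"after {name} ({spent:,} spent): NPV range {low:,} to {high:,}")
    if low > 0:
        print("even the worst case pays back: approve full delivery")
        break
    if high < 0:
        print("even the best case loses money: stop now")
        break
```

Notice that either branch of the stopping rule is a win: a cheap early “stop now” is exactly the five-figure answer National Express never bought.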
Forecasting Costs and Budgets
What about the big projects that need a single answer?
Well, remembering that a Minimum Viable Business Case approach is more likely to be useful than you think, two researchers can act as our guides.
Superforecasters - Philip Tetlock
Philip Tetlock decided to find out the best ways to forecast something by persuading the American Government to fund a huge “forecasting tournament”, in which hundreds of individuals attempted to make forecasts using whatever methodologies they preferred. Then Tetlock looked to see which methods worked best. He wrote these up into “10 commandments”, which you can see here. But there are a couple of key characteristics of “Superforecasters” (those individuals who consistently achieved high levels of forecasting accuracy) which are highly relevant to forecasting a project or budget.
The Outside View
The first is that they tend to take what behavioural scientist Daniel Kahneman calls the ‘outside view’. Let’s illustrate that with an example from Tetlock’s book Superforecasting, written with Dan Gardner:
I’m going to ask you a question about the Renzetti family. The Renzettis live in a small house at 84 Chestnut Avenue. Frank Renzetti is forty-four and works as a bookkeeper for a moving company. Mary Renzetti is thirty-five and works part-time at a day-care. They have one child, Tommy, who is five. Frank’s widowed mother, Camila, also lives with the family.
My question: How likely is it that the Renzettis have a pet?
Most people, in this situation, would start thinking about what the Renzettis’ jobs tell us (do they both work outside the home?), whether Camila might look after a pet, whether Tommy is old enough to be clamouring for one, how big the house might be, and so on. They’d try to answer the question from the details of the case.
Whereas the Outside View is to start out by saying “what proportion of families have pets?”. Which, for Americans when this book was written, was 62%. That becomes the base, against which the Superforecaster then applies as many tweaks as possible.
And that leads to the second point, which is that Superforecasters break the question into the smallest possible chunks, even if they have to use estimates to do so. Tetlock quotes Peter Backus, who wanted to know how many potential life partners he had. He started with the population of London (six million), then winnowed it down by the proportion who were women (50%), the proportion single (50%), the proportion in his age range (20%), the proportion who were university graduates* (26%), the proportion he finds attractive (5%), the proportion who find him attractive (5%) and the proportion compatible with him (10%). The answer is 26.
* I’m not entirely sure why Mr Backus couldn’t imagine going out with a woman without a degree, but there we are.
The key point here is that almost all of these were guesses (he has no idea what proportion of women find him attractive, and he didn’t spend time doing a survey - thank heavens) but that didn’t stop him estimating the number in order to keep breaking down the question.
So in the case of the Renzetti family’s pet, we can start with 62% as a base. Then adjust upwards a bit because there’s an adult at home during the day. Then down a bit because she’s elderly and might be unwell. Then up a bit because there’s a child at home who might want a pet. Then down a bit because he’s only five and won’t have much influence in the family. Then down a bit further because the house is small. And so on.
Remarkably, this technique even works if we have no idea what the “Outside View” base statistic is. In this example, if there is no data on the proportion of Americans that own pets, the best result is still found by estimating that “Outside View” stat first, using it as a base and adjusting from there.
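That base-rate-plus-tweaks habit can be sketched in a few lines. The 62% base rate is the figure quoted above; every adjustment below is an invented illustration of the kind of nudge a forecaster might make:

```python
# Sketch of the Superforecaster habit: anchor on the outside-view base rate,
# then nudge it up or down for each case-specific detail.
base_rate = 0.62  # proportion of US families with a pet when the book was written

# Each adjustment is invented for illustration; the direction matters more
# than the exact size.
adjustments = {
    "adult at home during the day": +0.04,
    "she is elderly and may be unwell": -0.02,
    "a child at home who may want a pet": +0.05,
    "the child is only five": -0.03,
    "the house is small": -0.04,
}

estimate = base_rate + sum(adjustments.values())
print(f"estimated probability the Renzettis have a pet: {estimate:.2f}")
```

The discipline is in the structure, not the numbers: start from “families in general”, then earn every deviation from it.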
How Big Things Get Done - Bent Flyvbjerg
The fascinating thing is that Superforecasting is an academic book by an academic, built on academic experiments.
Bent Flyvbjerg set out to answer a very real and important question: why are project budgets always wrong?
He took a very similar approach: he compiled a database of 258 projects from all over the world, and compared their initial forecasts with the out-turns. The results won’t surprise you. Rail projects were typically 45% over budget, bridges and tunnels 34% over and roads 20% over. 90% of cost forecasts were too low.
Then he looked at the characteristics of the 10% that had been forecast accurately. His advice, based on that learning, is identical to Philip Tetlock’s advice for any other type of forecast: “Take the outside view”. As he points out:
Your project is special, but unless you are doing what has literally never been done before - building a time machine, engineering a black hole - it is not unique. It is part of a larger class of projects. Think of your project as “one of those,” gather data, and learn from all the experience those numbers represent by making reference-class forecasts.
When he talks about “reference-class forecasts”, it’s the same thing as finding out what proportion of families have pets. If you want to build a new railway station, don’t estimate the costs by… well, estimating the costs. i.e. don’t add up the costs of land acquisition, steel, lifts, etc etc. Instead, find out how much every similar new station has cost in recent years, and then make lots of micro-adjustments up and down from that number.
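A reference-class forecast can be sketched in a few lines of Python. The station costs and adjustment multipliers below are invented for illustration; only `statistics.median` is real (it’s in Python’s standard library):

```python
# A hedged sketch of a reference-class forecast for a new railway station.
import statistics

# Out-turn costs (in millions) of recent comparable stations: the reference class.
# All figures invented for illustration.
reference_class = [42, 38, 55, 61, 47, 50, 44, 58]

# Anchor on what "one of those" actually costs, not on a bottom-up estimate.
anchor = statistics.median(reference_class)

# Case-specific micro-adjustments, as multipliers (invented for illustration).
adjustments = {
    "constrained city-centre site": 1.10,
    "no land acquisition needed": 0.95,
}

forecast = anchor
for reason, multiplier in adjustments.items():
    forecast *= multiplier

print(f"anchor (reference-class median): {anchor}m")
print(f"adjusted forecast: {forecast:.1f}m")
```

The anchor does most of the work; the adjustments are small tweaks around it, exactly as in the pet example.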
Just do it!
However, and this takes us right back to the beginning: even though Bent Flyvbjerg studies huge infrastructure projects, the kind you would think have to be planned and then built in a strictly linear fashion, he disagrees:
Planning, as I see it, is not merely sitting and thinking, much less a rule-based bureaucratic exercise of programming. It is an active process. Planning is doing: Try something, see if it works, and try something else in light of what you’ve learned. Planning is iteration and learning before you deliver at full scale, with careful, demanding, extensive testing producing a plan that increases the odds of delivery going smoothly and swiftly.
i.e. the best way to know what will happen… is to make it happen.
Take Action!
Entrepreneurial organisations are biased to action. Here are actions you can take:
For team-members:
Think in Scenarios, Not Certainties
Use the model for what it’s good at: exploring what might happen under different conditions. Don’t just plug in the 'official' assumptions. Play with the inputs. What changes everything? What hardly matters? Communicate upwards in scenarios.
Break Down Forecasts
When estimating anything (cost, time, risk) start with the 'outside view' first. What happened in similar projects? Then tweak from there, even if some of your estimates feel rough. That’s better than gut instinct disguised as precision or false certainty based on the only available data.
Identify Two-Way Doors
Before writing a business case, ask yourself: “Is this a tattoo or a haircut?” If it’s a haircut, just do it. Don’t wait for permission you don’t need.
Build Minimum Viable Business Cases
Propose spending just enough to reduce uncertainty, then revisit. Ask: “What’s the smallest step we could take to learn more?”
For Leaders:
Don’t Treat the Model as a Promise
Be explicit that the base-case is a scenario, not a promise. Encourage challenge, not compliance.
Make Divergence Safe
If the future doesn’t turn out like the forecast (and it won’t), don’t look for someone to blame. Look for what you’ve learned, and build from it.
Fund Learning, Not Just Delivery
Create space (and budget) for Minimum Viable Business Cases. Don’t wait until the whole case is watertight before approving experimentation. Learning incrementally is much cheaper; false certainty is expensive.
Model Less, Decide More
Encourage teams to focus less on perfecting forecasts and more on surfacing assumptions, risks and steps to narrow the range of uncertainty. Reward momentum, not a perfect model.
👋 I'm 𝗧𝗵𝗼𝗺𝗮𝘀. I help organisations like yours drive 𝗶𝗻𝗻𝗼𝘃𝗮𝘁𝗶𝗼𝗻, deliver 𝗰𝗵𝗮𝗻𝗴𝗲, and achieve 𝗳𝗮𝘀𝘁𝗲𝗿 results, drawing on 20 years of leadership across public and private sectors.
🚀 I offer 𝘀𝗽𝗲𝗮𝗸𝗶𝗻𝗴, 𝗺𝗲𝗻𝘁𝗼𝗿𝗶𝗻𝗴, and 𝗰𝗼𝗻𝘀𝘂𝗹𝘁𝗶𝗻𝗴 that energise teams, shape strategies and remove barriers to change. Whether you aim to accelerate innovation, drive change, or inspire your people, I’m here to help. Let’s talk!