Future Planning? It’s Better to Be Vaguely Right Than Exactly Wrong

Dr. Adam Gordon writes about strategic foresight and how to do it. 

Adam Gordon, 13.01.2017


The Bank of England’s chief economist Andrew Haldane last week admitted to “a series of forecasting errors”: economic planners failed to foresee the 2008 financial crash and, more recently, greatly overestimated the negative economic impact of Brexit (so far).


The social and political gyrations of 2016 will continue to affect businesses in all sectors. And, with a Trump presidency and right-wing revivalism in the EU, to say nothing of digital strategy transformations and energy sector upheavals, the road to 2020 looks unpredictable indeed.

Haldane blamed the failure of the Bank’s forecasting models on their inability to cope with “irrational behaviour” in the modern era, while expressing concern that consistently mistaken forecasts will cause the future-thinking process to lose credibility among political and business decision-makers. 

By way of a fix, he told an Institute for Government meeting that just as weather forecasting had improved markedly in a generation by throwing more data at the problem, business and economic forecasting could improve in the same way.

But is he right?

Will more data or better modeling software lead to better predictions? Or reduce the risk of grand, expert-led foresight mistakes?

While nobody would argue against more data or better analysis, Haldane is looking in the wrong place. 

As I have demonstrated in various forums, and in my book “Future Savvy,” there are inherent limits to quantitative prediction. These limits have nothing to do with the quality of the data or how skillfully it is crunched; they are limits on where quantitative inquiry is a valid form of inquiry at all.

Simply put, forecasting via data-modeling and extrapolation brings rewards when applied within closed, stable systems — that is, in situations where it is possible to make valid assumptions about which variables drive which outcomes, and to what extent, and to be confident these assumptions will remain valid throughout the forecast period.

But where there is the possibility of an external intrusion or “shock” to the system, data-driven forecasting becomes a fool’s errand.

Even if expertly or extensively done, modeling built on a foundational assumption that turns out to be mistaken produces mistaken foresight.
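To make that concrete, here is a minimal, hypothetical sketch in Python. The numbers and the “shock” are invented for illustration, not drawn from any real model: a trend fitted to a stable history extrapolates confidently, and the moment a shock breaks the underlying assumption, forecast and reality part company.

import numpy as np

rng = np.random.default_rng(0)

# Ten quarters of stable, closed-system behaviour: steady growth plus small noise.
quarters = np.arange(10)
history = 100 + 0.5 * quarters + rng.normal(0, 0.1, 10)

# Fit a straight line to the history and extrapolate four quarters ahead.
slope, intercept = np.polyfit(quarters, history, 1)
forecast = intercept + slope * np.arange(10, 14)

# A shock (a referendum, a crash) changes how the system behaves:
# the fitted slope no longer describes what actually happens.
actual = history[-1] + np.cumsum([-1.5, -0.8, 0.6, 1.2])

print("forecast:", np.round(forecast, 2))
print("actual:  ", np.round(actual, 2))
print("error:   ", np.round(forecast - actual, 2))

The fit itself is not the problem; the extrapolation is exactly as good as its assumption that tomorrow’s system behaves like yesterday’s.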

Applying future-modeling to complex open systems is best described by the well-known adage of the man looking for his keys under the corner streetlight who, when stopped by a policeman, says he lost his keys in the park. “So why are you looking here?” asks the policeman. “Here is where the light is,” says the man.

No matter how assiduously we apply the bright light of quantitative modeling, it’s going to be useless to the task of finding our keys if they are under the bush in the park.

How do we shine light into the park?

Are there ways to usefully illuminate situations that are too systemically complex and prone to shocks to be validly tractable to quantitative forecasting? 

In fact, there are. There is a whole toolbox of qualitative foresight tools, ranging from horizon scanning to systems dynamics to scenario planning, among other options.

Prediction of complex, open, multi-variable, exponentially changing systems is not possible for anyone, nor for any machine.

The caveat is this: a qualitative analysis of the future is not designed to render a “prediction”.

A qualitative analysis does, however, illuminate the cone of important, plausible uncertainty.

The qualitative toolbox can very reliably and insightfully help decision-makers see the systemic engine that underpins alternative outcomes, separate what is more vs. less probable, and pre-think the range of plausible uncertainty by way of baseline studies and alternative scenarios. 

This becomes the basis of better management decisions going forward. When planning for the future, it’s better to be vaguely right than exactly wrong.

By way of example: the Bank of England predicted a dramatic slowdown in the UK economy after the Brexit vote. Oops. What actually happened was that it bounced back strongly.

What it modeled were elements such as how much the vote result would unnerve consumers and investors, how much a fall in the pound would create imported inflation, and by how much this would knock consumer spending, share prices and the housing market, and so on.

That was certainly one scenario. But a better foresight process for this high-complexity open-system situation would have illuminated the full envelope of possibilities, not least how systemic counterforces might produce “irrational behavior” or “unintended consequences”. In this case, how consumers would respond to a weak pound by buying now, stoking the economy, at least in the short term.  
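The difference is easy to show in miniature. Below is a hedged sketch in Python of what scenario output looks like compared with a point forecast; the scenario names, drivers and growth figures are assumptions invented for this example, not the Bank of England’s figures.

# Illustrative scenarios only: names, drivers and numbers are assumptions
# made up for this sketch, not Bank of England figures.
scenarios = {
    "baseline slowdown":  ("spending falls as import prices rise",     0.8),
    "sharp contraction":  ("confidence collapses, investment stalls", -0.5),
    "buy-now resilience": ("consumers bring purchases forward",        1.9),
}

print("Plausible envelope of outcomes (assumed GDP growth, %):")
for name, (driver, growth) in scenarios.items():
    print(f"  {name:<20} {driver:<42} {growth:+.1f}")

growth_values = [growth for _, growth in scenarios.values()]
print(f"Range to plan against: {min(growth_values):+.1f}% to {max(growth_values):+.1f}%")

The output is deliberately not a prediction; it is the envelope of plausible uncertainty within which management can pre-think its responses.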

Dr Adam Gordon is an instructor at Aalto EE’s Strategic Foresight program.
