
Upstream Thinking: Solving Problems before They Happen

Despite the truism that “prevention is better than cure”, organisations and societies invest more resources in reacting to problems than in preventing them. In "Upstream", the best-selling business writer Dan Heath proposes a way of thinking about chronic problems and suggests a framework for preventative action.

Photo Credit: Aaron Burden / Unsplash

The title "Upstream" is inspired by a public health parable about drowning children. According to the story, the rescuers were so busy fishing drowning children out of the water that they neglected to tackle the guy who was tossing them in further upstream. Johannes Kleske described the conundrum of misaligned priorities using a similar metaphor: His clients “didn’t have time to build fences, they had cows to catch.”


Heath defines Upstream efforts as those "intended to prevent problems before they happen or, alternatively, to systematically reduce the harm caused by those problems." He uses the stream metaphor to expand our thinking about solutions.


Upstream work involves systems thinking. In fact, systems thinking needs to be applied both as a diagnostic tool, to understand and articulate the problem in new and different ways, and as a design tool, to create or redesign a better (although never flawless) system in response. Russell Ackoff, one of the pioneers of systems thinking, notes:


“The best thing that can be done to a problem is not to solve it, but to dissolve it - to redesign the entity that has it or its environment so as to eliminate the problem.”

But why do we tend to favour reaction? Heath argues that it’s more tangible: downstream (i.e., reactive) work is easier to see and measure. By contrast, preventative efforts often go unnoticed. As a consequence, prevention is rarely rewarded, as Nassim Taleb observes in “The Black Swan”:


"No credit for the silent hero who prevented a crisis. […] Who gets rewarded, the central banker who avoids recession or the one who comes to “correct” his predecessors’ faults and happens to be there during some economic recovery? Who is more valuable, the politician who avoids a war or the one who starts a new one (and is lucky enough to win)? Everybody knows that you need more prevention than treatment, but few reward acts of prevention. We glorify those who left their names in history books at the expense of those contributors about whom our book is silent. We humans are not just a superficial race (this may be curable to some extent); we are a very unfair one."

There are no trophies for preventing a disaster - precisely because it didn’t happen, and because we're rarely aware of near misses or inclined to contemplate counterfactual outcomes. A crisis, on the other hand, calls for a response and presents an opportunity for heroic action. Carrying out a rescue that restores the previous state can feel emotionally rewarding. As Heath notes, “saving the day feels awfully good, and heroism is addictive.”


In addition to acts of prevention going unrecognised, doing nothing often seems the preferable course of action because it can’t get you into trouble. According to Ackoff, fear of making mistakes is another reason why so few organisations focus on preventative action (or adopt systems thinking, for that matter). Failing to do something an organisation should have done is an error of omission. Unlike errors of commission, which occur when an organisation does something it should not have done, errors of omission are often not recorded and therefore go unacknowledged. Even when they are acknowledged, accountability for them is seldom made explicit. Yet, according to Ackoff, the deterioration and failure of organisations are almost always due to something they did not do. In the past, the failure to anticipate, perceive, react to, and solve problems led to the collapse of whole societies; more recently, it has been the downfall of seemingly invincible organisations.


Beyond the fear of failure, organisations are myopic by design. They are created to give people focus. While focus and specialisation create great efficiencies, they also lead to silos - an impediment to holistic thinking and preventative Upstream action.


Heath outlines three barriers to Upstream thinking, which apply when dealing with recurring problems.


Problem blindness

Problem blindness results from the belief that negative outcomes are inevitable or out of our control. Chronic problems tend to be treated like bad weather: They are perceived as bad, but also as something you can’t do anything about. A regrettable but inevitable condition of life. We become blind to a problem when it takes the form of a slow trend concealed by wide up-and-down fluctuations (e.g., creeping normalcy, landscape amnesia) or when we grow accustomed to it. As Heath explains, “you walk into a room, immediately notice the loud drone of an air conditioner, and five minutes later, the hum has receded into normalcy.” Repetition leads to habituation. "The escape from problem blindness begins with the shock of awareness that you’ve come to treat the abnormal as normal," concludes Heath.


Lack of ownership

Despite the enormous stakes, Upstream work is often optional. “With downstream activity – the rescues and responses and reactions – the work is demanded of us. A doctor can’t opt out of a heart surgery; a day care worker can’t opt out of a diaper change. By contrast, Upstream work is chosen, not demanded,” writes Heath. What follows from that insight is that if the work is not chosen by someone, the underlying problem won’t get solved. The reasons why some problems lack owners are manifold: from the self-interest of a powerful group that benefits from the status quo, to fragmented responsibilities in complex organisations, to a perceived lack of legitimacy to address a problem.


Tunnelling

The term is borrowed from psychologist Eldar Shafir and economist Sendhil Mullainathan, who found that people adopt tunnel vision when they experience scarcity – of time, money, mental bandwidth – and are juggling many problems. Instead of trying to solve them all, they tackle the most urgent or easiest-to-solve ones. There’s no long-term planning, no strategic prioritisation of issues. People who are tunnelling can’t engage in systems thinking. They can’t prevent problems; they just react. But if you can’t systematically solve problems, you stay in an endless cycle of reaction. Tunnelling leads to more tunnelling. It is not only self-perpetuating; it can even be emotionally rewarding. Stopping a big mistake at the last minute brings a certain kind of glory. “The need for heroism is usually evidence of a systems failure,” writes Heath.


Heath proposes seven questions to guide Upstream action:


1. How to unite the right people?
  • Surround the problem by recruiting a multifaceted group of people or organisations.

  • Keep a big tent, but be sure to address all the key dimensions of the issue.

  • Organise all the efforts around a compelling and important aim.

  • Create cross-functional groups.

  • Use data for the purpose of learning, not inspection.

  • Abdicate control; let the team members hold each other accountable.


2. How to change the system?
  • Remember that "every system is perfectly designed to get the results it gets."

  • Fight for systems change; a well-designed system is the best Upstream intervention.

  • Make sure the right things happen by default, not because of individual passion or heroism.

  • Have patience; systems change takes time.


3. Where to find a point of leverage?
  • Immerse yourself in the problem.

  • Remember that the post-mortem for a problem can be the preamble to a solution.

  • Consider the risk and protective factors for the problem you are trying to prevent.

  • Interrogate all the factors that increase risk or protect against the problem; each of these factors is a potential leverage point.

  • Alternatively, consider whether your leverage point might be a specific subpopulation of people.

  • Consider costs and benefits, but reject the idea that preventative efforts must always save money; sometimes it’s about doing the right thing, like minimising your impact on the environment.


4. How to get early warning of the problem?
  • Ask whether having a warning system is justified based on the severity of the problem you’re trying to solve and whether it would give you enough time to respond.

  • Estimate the expected rate of false positives.

  • Weigh the costs of handling false positives against the possibility of missing a real problem.

  • Be willing to tolerate a high rate of false positives if the consequences of missing a problem are devastating; a rough expected-cost sketch follows this list of questions.

5. How to know you’re succeeding?
  • Consider that with Upstream efforts success is not always self-evident.

  • In lieu of direct measures, use approximations – quicker, simpler measures that will likely correlate with long-term success.

  • Beware of "ghost victories": superficial successes that conceal failure.

  • Substitute with short-term measures, but don’t lose sight of the longer timelines inherent to Upstream efforts.

  • Make sure your short-term measures align with the true mission.

  • Use your short-term measures as critical navigational aids.

  • Understand that getting the short-term measures right can be frustratingly complex, but critical.

  • Balance a quantity metric with a quality one.

  • Anticipate how your measures might be misused.


6. How to avoid doing harm?
  • Remember that when tinkering with complex systems, you can expect reactions and consequences beyond the immediate scope of your work.

  • Zoom out; spend part of your time from a vantage point that lets you see the whole system, not just the problem.

  • Realise that, especially in the short-term, changes for the good of the whole may sometimes seem to be counter to the interests of a part of the system.

  • Ask yourself whether you’re intervening at the right level of the system.

  • Consider second-order effects of your efforts: If you invest more time and energy in a particular problem, what will receive less focus as a result, and how might that inattention affect the system as a whole?

  • Beware of the “cobra effect”; it occurs when an attempted solution to a problem makes the problem worse.

  • Because the path forward might not be clear, you must experiment.

  • Remember that “the thing to do when you don’t know is not to bluff and not to freeze, but to learn. The way you learn is by experiment.” (Donella Meadows)

  • For experimentation to succeed, you need prompt and reliable feedback.


7. Who will pay for what doesn’t happen?
  • Beware of the “wrong pocket problem”: a situation in which the entity that bears the cost of the intervention does not receive the primary benefit.

  • Find a payment model that supports the preventive approach.

  • Ask these questions when contemplating paying for Upstream efforts:

    • Where are there costly problems?

    • Who is in the best position to prevent these problems?

    • How do you create incentives for them to do so?
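
To make the trade-off behind question 4 concrete, here is a minimal back-of-the-envelope sketch in Python. All figures (alert volume, false-positive rate, cost per false alarm, cost and frequency of a missed problem) are hypothetical and not taken from the book; the point is simply to show how the reasoning works, not to prescribe a method.

```python
# Rough expected-cost comparison for an early-warning system.
# Every number below is a made-up, illustrative assumption.

def expected_cost_with_warning(alerts_per_year, false_positive_rate,
                               cost_per_false_alarm, miss_rate,
                               real_problems_per_year, cost_of_missed_problem):
    """Expected yearly cost if the warning system is in place."""
    false_alarms = alerts_per_year * false_positive_rate
    missed = real_problems_per_year * miss_rate  # problems the system still misses
    return false_alarms * cost_per_false_alarm + missed * cost_of_missed_problem

def expected_cost_without_warning(real_problems_per_year, cost_of_missed_problem):
    """Expected yearly cost if every real problem is dealt with only after it strikes."""
    return real_problems_per_year * cost_of_missed_problem

# Hypothetical figures: 200 alerts a year, 90% of them false, £500 to investigate
# each false alarm, 5% of real problems still missed, 3 real problems a year,
# £1m in damage per missed problem.
with_system = expected_cost_with_warning(200, 0.9, 500, 0.05, 3, 1_000_000)
without_system = expected_cost_without_warning(3, 1_000_000)

print(f"With warning system:    £{with_system:,.0f}")    # £240,000
print(f"Without warning system: £{without_system:,.0f}")  # £3,000,000
```

In this illustrative scenario, nine out of ten alerts are false, yet the warning system still comes out far ahead, because the cost of a single missed problem dwarfs the cost of chasing false alarms - which is exactly the point of tolerating false positives when the downside is devastating.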


While "Upstream" focuses on recurring problems, one chapter is dedicated to "distant and improbable threats" - problems that are either unpreventable (e.g., earthquakes or hurricanes), uncommon (e.g., pandemics), or far-fetched (e.g., humanity being extinguished by new technologies). Dealing with these kinds of problems requires a certain degree of preparedness, which can be achieved via the systematic application of Foresight. More often, however, as Heath observes, "we just fumble our way forward and deal with the consequences." He ends on a more encouraging note and reminds us of a concept called "the prophet's dilemma": a prediction that prevents what it predicts from happening. The Y2K bug is an example of a self-defeating prediction, "a catastrophe narrowly avoided thanks to a successful global mobilisation of talent and energy."

 

Check out "Upstream" for more details and concrete examples of Upstream action undertaken by individuals, organisations, and governments. Supplement with works by Donella Meadows and Russell Ackoff.


