Think Again: How good leaders can avoid bad decisions

By Andrew Campbell and Jo Whitehead

Leaders can make good decisions or less good decisions. Several years ago, we set out to understand the causes of these less good decisions. Our main finding is that the bad decisions start when the brain lets us down.

Decision making lies at the heart of every organization. Leaders can make good decisions or less good decisions. Our interest has been in the causes of less good decisions. Recent events in global finance have provided an abundance of examples of leaders getting it wrong. However, the problem is not limited to the financial sector, or even to business. Bush and Blair made decision errors with regard to Iraq. Brigadier General Matthew Broderick judged that the levees in New Orleans had not been breached after Hurricane Katrina, delaying federal aid by 24 hours. The Queen of England chose to remain for four days at her castle in Scotland following the death of Diana, ex-Princess of Wales, despite the pleading of Prime Minister Tony Blair.

As a result, we set out, several years ago, to understand the causes of these less good decisions.  Our main finding is that the bad decisions start when the brain lets us down.


How our brain can let us down

Most of the time, our brain processes help us understand the situation we face and choose a good action plan. Less good decisions result from the same brain processes; the difference is the circumstances. In certain situations, the brain processes that normally get us to good decisions lead us instead to less good decisions. There are two processes we need to understand: pattern recognition and emotional tagging.



Pattern recognition

There are three aspects of pattern recognition worth understanding. First, it is a complex process that integrates information from many parts of the brain. Second, our brain makes guesstimates and fills in gaps based on experience. This allows us to function with incomplete information. And let's face it, if we had to wait until we had all the information, most decisions would be too slow – with potentially fatal consequences. That is why pattern recognition is a great friend to decision makers: it allows us to make (mostly) good judgments without being in possession of all the information. Third, most of the work our brain does during pattern recognition is unconscious. We recognise something or we don't. We do not know how we have arrived at the recognition, which parts of our brain have been involved or what guesstimates have been made.

These three features – the assembly of interpretations from many different parts of the brain; the filling in of gaps to produce an understanding; and the fact that most of it happens unconsciously – work well for us most of the time. But they are not foolproof. We can make mistakes. Faced with a new situation, the brain makes assumptions based on prior experiences. Take a business that is not performing well. If we have had experience with cutting costs, our brain may recognise aspects of the situation that link with our cost-cutting experience. Our unconscious may then tell us that this situation is one where the problem is too many costs. If our previous experience is with high-growth situations, our brain may recognise aspects of the situation that link to high-growth challenges. Our unconscious may then tell us that the situation is one where the challenge is growth. We know that generals often fight the last war, and that consultants are often hammers looking for nails.

How many times, on moving to a new organisation, have we instinctively resorted to solutions and approaches that were successful in our previous organisation and with our previous team, only to discover that they simply don’t work this time round? Our pattern recognition process has let us down and, because it happens unconsciously, we are often powerless to correct the error before we have made it.





Emotional Tagging

Emotional tagging is the process by which emotional information attaches itself to the thoughts and experiences stored in our memories. This emotional information tells us whether to pay attention to something or not, and it tells us what sort of action we should be contemplating (immediate or postponed, fight or flight). The latest findings in decision neuroscience suggest that our judgments are initiated by the unconscious weighing of emotional tags associated with our memories rather than by the conscious weighing of rational pros and cons: we start to feel something—often even before we are conscious of having thought anything. As a highly cerebral academic colleague recently commented, “I can’t see a logical flaw in what you are saying, but it gives me a queasy feeling in my stomach.”

Emotional tagging helps us make decisions. If the parts of our brain controlling emotions are damaged, even though we retain the capacity for objective analysis, we become slow and incompetent decision makers. Antonio Damasio, one of the world’s leading neuroscientists, provides a good example of this. He describes a discussion with one of his brain-damaged patients – a man whose cognitive functions were intact but who had no ability to use his emotions when making decisions. The two were trying to schedule the patient’s next appointment. Damasio suggested two possible dates. The patient embarked on an interminable, beautifully argued enumeration of the pros and the cons and the maybes of the different days. After some minutes, Damasio could listen no longer. He told the patient to visit on the second of the two dates. “That’s fine,” said the patient, as though there had never been any issue.2

Emotions help us push to judgment in a timely fashion – but if inappropriate emotions get tagged to the decision, this can powerfully distort the decision maker’s judgment.

Love and hate are one example of inappropriate emotions in business. Hatred contributed to the demise of Wang Laboratories, the most successful company in the word-processing industry in the 1980s. Founder An Wang believed he had been cheated by IBM, early in his career, over a new technology he had invented. His dislike of IBM led him to reject the IBM PC standard for Wang’s PCs. Instead he created a proprietary operating system, despite resistance from his management team, who could see that the IBM PC was becoming the dominant standard. This flawed decision contributed to the company’s demise in the early 1990s.

So our brains, which normally guide us to good judgments, can in certain conditions lead us astray.  Fortunately we can do something about this.  First, we can spot in advance the conditions that may disrupt our thinking.  Second, we can put safeguards into the decision process that will reduce the risk of biased thinking affecting the choice made.

We have identified four conditions – “red flag conditions” – that are likely to lead to distortions in the judgments we make.




Red flag conditions

The first is misleading experiences, which occurs when we are faced with an unfamiliar situation – especially one that appears familiar. Under these conditions we can think we recognise something when we do not. William D. Smithburg became CEO of Quaker in 1981, where he executed the successful acquisition of Gatorade – the sports drink – in 1983. In 1994, the expanding company sought to repeat the success by acquiring another successful but underexploited drinks company – Snapple. Smithburg failed to recognise that whereas Gatorade had been a rising star in its market, promoted and distributed in a traditional fashion, Snapple was a quirky, entrepreneurial organisation producing an image drink that was already losing market share. The acquisition was disastrous, leading to the downfall of both Smithburg and Quaker.

Another red flag condition is when our thinking has been primed before we begin to evaluate the situation, by previous judgments or decisions we have made that connect with the current situation. We refer to these as misleading prejudgments. Steve Russell, the CEO of Boots between 2000 and 2004, had a prejudgment that Boots needed to grow and that health-care services were an attractive opportunity. In his own words, “I had been formulating this ambition for Boots since I was merchandising director of Boots the Chemist in the late 1980s. So, when I became CEO, I was determined to make it happen.” Other managers suggested that many of the services Boots tried to enter were inherently low-margin businesses. A turbulent trading period ensued, and Russell resigned in 2004.

The third red flag condition is inappropriate self-interest, which can be a powerful and often unconscious influence even among highly ethical professionals. The prescriptions that doctors write, for example, have been shown to be influenced by the favours they have received from drug companies.

The fourth red flag condition is inappropriate attachments, such as the attachment we might feel to colleagues or a business when considering cost reductions. A striking example of inappropriate attachments is that of Sir Derek Rayner, CEO of Marks and Spencer in the 1980s. He paid $750 million for Brooks Brothers – the iconic US retail chain famous for its button-down shirts – even though his team said it was worth only $450 million. Why?

As Judi Bevan describes in her book The Rise and Fall of Marks & Spencer,4 Rayner “…was enamored with Brooks Brothers clothing, which was in large part aimed at men of Rayner’s age and taste.” Although his advisers had presented six possible acquisition targets, Rayner ignored all the others and “went straight for the preppy, upmarket Brooks Brothers chain.”

We can all cite examples from our own professional lives in which “Red Flag” conditions have existed. We urge all those involved in important decisions to consider whether Red Flags exist. If they don’t, decisions perhaps need fewer checks and balances. But if they do, the decisions with the highest stakes should be protected with robust safeguards.





We have identified many process “safeguards” – additions to any standard decision process that can counterbalance the effects of distorted thinking. Most safeguards are well known – the challenge is to pick the right ones for the particular red flag condition. For example, a presentation from an expert consultant might be a suitable safeguard for a decision maker who has misleading experiences about a new market entry. However, if that decision maker is a CEO with strong prejudgments, they might need a stronger challenge – perhaps from the Chairman or Board.

Safeguards reduce the risk that red flag conditions will lead to a bad decision: they act as a counterweight (see exhibit). While they cannot always eliminate the effect of the red flag on an individual decision maker, they provide a means whereby the decision can be tested, challenged and adjusted before it is too late.

Safeguards can be grouped into four categories:
Experience, data, and analysis. In business, there are many ways data can be collected and experience broadened. A discussion with a key customer can provide valuable feedback on a proposed new product. Market research might evaluate the risks of entering a new market. Consultants could be brought in, partly for their expertise and readily available manpower, but also because they are relatively objective. BP sometimes employs two firms of lawyers to get contrasting views for very important decisions, such as major acquisitions.

Debate and challenge. Creating a debate that challenges biases need not involve an elaborate process. It could mean simply chatting through an issue with a friend or colleague. However, in large organizations a typical approach is to form a decision group. Who is in the group, who leads it and what process it follows are all important choices. While many such groups operate with simple guidelines, there are a host of more elaborate approaches – such as splitting the authorizer, evaluator and proposer roles; allocating “hats” to different people (as suggested by the lateral thinker Edward de Bono); role-playing; or the devil’s advocate method (in which a subgroup attacks the proposed option).

Governance. Someone with power and strong prejudgments, such as Russell, may be resistant to new analysis or a group process. In this case it may be necessary to strengthen the governance process – perhaps by setting up a special subcommittee of the board to review the proposal in detail.

Monitoring. Finally, particularly when there is a risk that all these safeguards are still insufficient, it may be sensible to beef up the monitoring process – for example, by setting clear milestones, monitoring performance and adjusting the strategy accordingly.




Trusting your gut

Strengthening the decision process by adding what we term “safeguards” is sensible when there are red flags flying. But, inevitably, decisions will often still be based partly on the judgment and gut instinct of a senior decision maker – particularly if they are not major strategic decisions.
If you are the decision maker, how can you test whether your feelings about a decision are helping or hurting you? Four tests, derived from the four red flags, can provide some guidance on when you can trust your gut.


Whilst it might be discouraging to discover that our brains predispose us to some errors of judgment in our decision-making, leaders can take heart. If you are part of a decision process and are prepared to be more reflective, you can identify red flag conditions in advance of the decision. You are then able to apply safeguards to the decision process. If you are working on your own, you can use some simple tests to check whether or not your gut instincts are likely to lead you astray. If they are, involve someone else in the decision. Whilst you can’t ever eliminate the risk of errors in your decision-making, you can reduce the odds!

Further ideas are discussed in our book Think Again and on our website.

About the authors

Andrew Campbell and Jo Whitehead are directors of London’s Ashridge Strategic Management Centre and coauthors, together with Sydney Finkelstein, of Think Again: Why Good Leaders Make Bad Decisions and How to Keep It from Happening to You (Harvard Business School Press, 2009). They teach on a range of strategy programmes at Ashridge Business School, and also consult to a number of UK and international companies on business and corporate strategy issues.

Jo Whitehead previously spent 20 years at The Boston Consulting Group. Andrew Campbell worked with McKinsey and Company prior to founding the Ashridge Strategic Management Centre and is an acknowledged guru on the topic of corporate strategy.


1. For those interested in more dramatic examples of pattern recognition being a deceptive guide, V.S. Ramachandran, one of the world’s leading experts on phantom limbs, describes a number of patients who imagined things based on past experience. People who have lost a limb imagine the limb to still be there. People who suffer a scotoma (a brain injury that causes them to lose a part of their vision) will hallucinate in the area where they cannot see – the brain seeks to fill the gap with something based on their experience – but these things can be very weird, e.g., cartoons, monkeys, Hawaiian dancers. Patients suffering from this know that what they see cannot be true – but it still seems real to them. V.S. Ramachandran and Sandra Blakeslee, Phantoms in the Brain: Human Nature and the Architecture of the Mind (London: Fourth Estate, 1998), 72.

2. Antonio Damasio, Descartes’ Error: Emotion, Reason, and the Human Brain (New York: Grosset/Putnam, 1994), 193.

3. For a more recent discussion of MacLean’s theory, see Gerald A. Cory Jr., The Reciprocal Modular Brain in Economics and Politics: Shaping the Rational and Moral Basis of Organization, Exchange, and Choice (New York: Kluwer Academic/Plenum Publishers, 1999).

4. Judi Bevan, The Rise & Fall of Marks & Spencer…And How It Rose Again, 2nd ed. (London: Profile Books, 2007).



