
Studying this technical article and answering the related questions can count towards your verifiable CPD if you are following the unit route to CPD and the content is relevant to your learning and development needs. One hour of learning equates to one unit of CPD. We'd suggest that you use this as a guide when allocating yourself CPD units.

This article was first published in the September 2017 international edition of Accounting and Business magazine.

Good corporate decision-making – such as investment in new products and markets, M&As and organisational change – requires in-depth thinking, a careful assessment of pros and cons, and much evidence. Hard to argue with that, but does this model really accord with actual management practice?

We know from research that such decision-making is frequently flawed because it is swayed by a whole variety of cognitive biases that board members will not be immune to.

Cognitive biases occur when people get carried away with what they expect, or what they assume to be the case, rather than reaching conclusions on the basis of a more deductive and empirical model of decision-making. Cue cognitive bias theory, developed by psychologists Amos Tversky and Daniel Kahneman in 1974.

In his book Thinking, Fast and Slow, Kahneman highlights many experimental studies of the biases that infect human decision-making. Because decision-making relies so heavily on intuition, bias makes a real difference to economic performance and should be every FD’s concern.

Some of the characteristics of worrying biases that appear to be strongly supported by psychological experiments include:

  • a sense of cognitive ease (ie ‘I find this easy’), associated with illusions of truth, pleasure and reduced vigilance
  • a neglect of ambiguity and the suppression of doubt
  • a tendency to believe and confirm
  • an exaggeration of emotional coherence (the ‘halo effect’, where a good impression of one thing leads to the expectation that everything associated with it will be good too)
  • a focus on existing evidence and a disregard for absent evidence (‘what you see is all there is’)
  • greater sensitivity to losses than gains (‘loss aversion’)
  • decisions too narrowly framed and made in isolation of each other.

A key danger of cognitive bias is that this quick, instinctive and purely intuitive thinking leaves us open to being deceived. Social psychologists have suggested that clever people can be more vulnerable to being fooled: they are more averse to being found to have made mistakes, so their mistakes become ‘self-sealing errors’.

Then there is the danger of ‘groupthink’, where outlying opinions in a team are narrowed towards the consensus at its core, so that sound challenges to a particular strategy get shouted down.

Consider also commitment bias, where commitment to a course of action escalates along a non-linear curve: once you start putting effort into exploring a possible strategy, you are already halfway to committing to it, as no one wants to see that effort wasted.

Beware of intuition

We all recognise the importance of intuition in decision-making, but it is intuition that exposes us most to deception. Kahneman warns of the ‘good and bad news’ about intuition. The good news is it is very quick, low-effort and often reliable; the bad news is it can be profoundly dangerous, as it may omit important data and is highly vulnerable to bias.

He talks in terms of the human mind having two polar-opposite styles of thinking: thinking fast (intuitive thinking), or ‘system 1’; and thinking slow (logical and deductive), or ‘system 2’. He suggests that, with everyday tasks, most people tend to default to intuitive, fast thinking rather than the much more ponderous logical and deductive style.

But I believe the two styles can come together in a third system. For instance, you might make an intuitive jump that is not haphazard but ordered (‘getting your ducks in a row’) yet seems to happen in a few blinks.

Faced with bias, an FD has to be like a holding midfielder in a football team, bringing the team and its leader back to a deductive model of slower, evidence-based decision-making, rather than simply following the decision momentum and merrily turning the team’s collective delusions into a happy-looking set of numbers.

Indeed, the FD needs mental flexibility: capable of going along with the system 1 intuitive approach, of slowing it down to system 2, and ready to deploy my system 3, where insights seem to come in a flash but are the product of a logic chain as well. It can help to visualise these as ‘red’, ‘blue’ and ‘purple’ thinking, respectively.

I suggest practising strategic mindfulness: constantly track the state of cognitive flow that you and the team are in, and learn how to modulate it, perhaps using these colours. As the anchor person, you need to guide the system 1 and system 3 thinking by sometimes bringing it down into the slower system 2, or even jolting it into systems 2 and 3 if it is totally stuck in system 1.

Tony Grundy is a business school academic and independent strategy consultant