
Reality checkpoint: interview with Professor David Halpern
“Our conscious thought processes are, to some extent, our own internal audit processes, trying to establish what’s happening around us, and whether our perceptions of reality and risk, and the narratives we tell ourselves, are accurate,” muses Professor David Halpern, CEO of The Behavioural Insights Team.
Governments have long sought to uncover the secrets of the human brain to persuade us to do willingly what they perceive as advantageous – whether this is to pay our taxes on time, eat healthier food or to drive safely. But there are no simple answers. Many different factors come into play when you seek to influence behaviour. The way people respond to an interest rate rise, for example, depends on their internal risk assessments, which will factor in issues such as whether they have faith in the judgment of the bank, how big their mortgage is, how they feel about their job security and whether they believe they understand what is driving inflation.
Organisations may find that efforts to reinforce people’s natural inclinations prompt them to behave in a “desirable” way, for example to come to work on time and use the recycling facilities. But the same actions could have negligible (or even negative) effects if other factors have equal or greater influence. Our personal inclinations and belief in our own world view, accurate or not, can therefore have huge repercussions for office culture, recruitment and decision-making.
Internal audit sits within this landscape, so it can benefit from understanding both the advantages and the risks associated with human behaviour – and how a small “nudge” may start a ball rolling that, for better or worse, can be hard to stop.
“It’s the eternal question – how do you incentivise workers to be productive and creative?” Halpern says. “Most people think they are good at understanding human behaviour, yet many of our theories are thin at best, or completely wrong. And this isn’t a marginal issue – it’s often the most important factor.”
Managers and internal audit teams should take note. “It is estimated that 60 per cent of productive life is lost because of behavioural and lifestyle choices – and there is a similar statistic for environmental progress,” Halpern explains.
In his presentation at the Chartered IIA’s Internal Audit Conference, he told delegates that behavioural science is particularly important now because our underlying theories of classical economics are not robust. “They are based on people working out their chances and making the best of them,” he says. “It’s less Mr Spock and more Captain Kirk.”
The confidence trap
People create shortcuts to work out what’s going on around them and to make personal risk assessments, yet these are largely unconscious and prone to systematic errors. A common example is our tendency to underestimate familiar, mundane risks, such as the risk of travelling by car compared with the risk of flying.
We also tend to be overconfident about our own abilities – over 90 per cent of people think they are above average drivers and have above average intelligence (beliefs that are not tempered by driving penalties or exam results).
Add to this the human tendency to make sense of the world by creating a narrative that is often partial or false – for example, to explain why we didn’t finish the washing up this morning – and behaviour is clearly a potential minefield for internal audit. Bias, over-confidence about our own abilities and perceptions, and false narratives constantly undermine organisations’ decision-making processes, even when people are doing their best to do the “right” thing.
What’s more, our natural tendency to over-confidence increases as we ascend the hierarchy and are rewarded for our judgment and performance. We tend to forget or minimise failures and over-inflate successes, and to assume full responsibility for the latter, while attributing the former to other causes. The people who make the most serious decisions are therefore most likely to be dangerously over-confident and least likely to make adequate contingency plans for sub-optimum outcomes.
Given that leaders want to sound even more confident about their plans in order to inspire others, you have a potentially dangerous cocktail – just at the level that is hardest to challenge.
“An ideally rational leader would build in a much wider error margin around their key decisions, but leaders think they have to supply clear leadership and display confidence,” Halpern says. “Television favours people with a simple message, so they become influential pundits, but people with a simple world view tend to be terrible predictors of the future. People with a more complex view of the world give a more nuanced account that incorporates more uncertainty. You need to look at how your organisation corrects this balance and whether the top team is calibrated to reflect uncertainty and alternative outlooks.”
This is particularly relevant for internal auditors, since they may need to challenge an unrealistically optimistic stance, or at least question whether decision-makers have taken full account of the risks, whether a decision falls within the board’s stated risk appetite, and whether any back-up plans are adequate. Understanding the impetus towards over-confidence is therefore vital.
Form a “red team”
One way to ensure that the “naysayers” are listened to, and that teams and boards consider the worst-case scenario whenever they make important decisions, is formally to appoint a person or a group to be the “red team”. The term comes from the US military and refers to a team set up to think like the enemy.
“If you assign someone in a meeting to be the red team, you are asking them to spot problems and to look for where the great plan you are discussing will fail. It can be difficult to go against the trend if everyone is excited about a new idea, so you need to build it into the team design,” Halpern recommends. “Internal auditors who see that this never happens at board level should automatically ratchet up the organisation risk factors,” he says.
Test, test, test
Not only do internal auditors need to exercise professional scepticism, but they can also help to devise and monitor objective tests that reduce bias – the results can undermine many preconceptions. Such tests have long been familiar in recruitment and selection – one famous example is a US orchestra that hired more women when it introduced “blind” auditions, with candidates playing behind a screen. This is particularly important at a time when organisations are struggling to attract talent and are keen to make their workforce more diverse.
Structured interviews also help to reduce bias, but the best way to judge candidates fairly is to devise a practical test that indicates aptitude for key tasks, Halpern says. “I’m amazed how few selection processes do this,” he adds.
“If you want to ensure that you are hired for a job, you just need to ensure that the person before you performs terribly. Similarly, you are unlikely to be hired if you answer the first question badly, even if you answer the rest well,” he explains. “To reduce this kind of bias, we’ve tried breaking up interviews and running answers to each structured question past different interviewers before putting all their judgments back together. This gave very different results from showing the same panel the complete interview.”
At the very least, Halpern suggests, people on interviewing panels should be asked to form and record their opinions before they compare notes. Otherwise, it’s far too easy to be influenced by the most senior or most charismatic person in the room.
Bias is not restricted to the recruitment process. If the internal audit function spots the potential for bias, or the predominance of a particular strand of “received wisdom”, in any part of an organisation, it is well placed to call it out and suggest that the view or process is tested more rigorously.
“Question all tacit theories about what is happening and what drives behaviour in your organisation,” Halpern warns. “You will get the wrong prescription for a problem if you base it on a flawed theory.”
Cultural tells
Behavioural risks are obviously hugely important when assessing corporate culture, but Halpern warns that findings may be subtle. For example, a predominance of negative remarks between colleagues at work may indicate a toxic culture, but it takes a ratio of at least five positive comments to one negative one to indicate a positive culture.
Responses to seemingly innocuous questions such as “Do you have a best friend at work?” and “Do you view your supervisor as a boss or as a partner?” can have a surprisingly strong correlation with productivity or lack of it. Halpern says that people who say they see their supervisor as a partner experience the same improvement in their experience of work as those who receive a 30 per cent pay rise.
This matters because, he says, negative feelings at work have a direct correlation with low productivity. “When looking at culture you need to unpack behaviour to see what is really going on – and then whether you can use this as a vector for change,” he says. “Much of the productivity gap between the US and the UK comes down to the quality of management.”
Humans are complicated and don’t always respond to a “nudge” the way managers (or governments) expect – and how they respond will vary in different environments. What works in one setting will not necessarily work the same way elsewhere.
Internal audit is in an excellent position to spot opportunities to improve culture and decision-making, and to reduce bias and reliance on erroneous philosophies. “Ask how good the empirical evidence is behind what you do and what you think. Have your processes and assumptions been tested – and if so, where and how?” Halpern says. “Test your theories and ideas and record your results – and test several variations since, so often, intuition turns out to be flawed and an alternative version works better.”
Internal auditors and behavioural scientists have much to gain from each other’s expertise, Halpern adds. We all gain from more productive, happier workplaces and better decision-making – and, if all goes well, such collaboration could also improve the UK’s productivity.
What is the Behavioural Insights Team?
The concept of “nudge” in behavioural science came to popular fame in 2008 following the publication of Nudge: Improving Decisions About Health, Wealth and Happiness by Richard Thaler and Cass Sunstein. Various governments sat up and took notice of the concept that understanding behavioural science could help them to harness people’s natural ways of thinking and operating to encourage them to behave in a “desirable” way.
In the UK, David Cameron’s government set up a “Nudge” unit to explore these ideas and their uses in government. It then became The Behavioural Insights Team, which is now owned by innovation foundation Nesta and is run by Professor David Halpern.
David Halpern is CEO of The Behavioural Insights Team and was a speaker at the Chartered IIA’s Internal Audit Conference. You can view his (and other) sessions on demand on the Chartered IIA’s website.
This article was published in November 2022.