The reliance on data, metrics and statistical analysis to account for economic events can be misplaced if it ignores the human factor, says Jane Fuller
This article was first published in the October 2019 UK edition of Accounting and Business magazine.
At a time when human faces and language are being digitised and analysed as if they were numbers, there has never been a greater need to go beyond those numbers. This was brought home to me in my summer reading of books by economists who stress the human side of data analysis.
Andrew Smithers, in Productivity and the Bonus Culture, blames bonus-boosted executive pay for creating perverse incentives that deter the investment needed to improve productivity. He is particularly down on return on equity (RoE), which can be manipulated by share buybacks that flatter earnings per share but do nothing to improve underlying performance. He likens RoE targets to production targets for tractors in Soviet Russia. Just as meeting the tractor targets was undermined by the machines all too often breaking down afterwards, ‘the targets for RoE serve to improve the published figures at the expense of a decline in the information they convey’.
Smithers’ tale is one of rational responses to poorly designed incentives, but human behaviour is also shaped by the context of the times. Those executives might have thought they were maximising shareholder value – a concept that is now going out of fashion.
The shifting sands of human perception are captured in Robert Shiller’s book, Narrative Economics, with its telling subtitle, How Stories Go Viral and Drive Major Economic Events. He points out that economics and finance (he could have added accountancy) pay far less attention to narratives than history and anthropology do. He sides with the historian Jerry Muller, whose book The Tyranny of Metrics points out that the availability of more data can lead number crunchers to overestimate the importance of ‘arbitrary quantifications’.
What economists underestimate, Shiller says, is that ideas – for example, that tech stocks can only go up, that housing prices never fall, or that some companies are too big to fail – can go viral and move markets. These widely held public beliefs can tip from one back-story to another. Excitement about new tech, for instance, often alternates with fear of job-killing automation.
That fear is evident now in the pessimistic coverage of artificial intelligence and robots. Roger Bootle, in The AI Economy: Work, Wealth and Welfare in the Robot Age, points out that while the industrial revolution made many jobs obsolete, they were replaced by previously unimagined roles, and living standards went up overall.
In its current phase of development, AI enables rapid processing of huge amounts of data, spots patterns and ‘learns’ as the inputs change. But it also inherits human bias and cannot judge moral intent. So it offers humans judgment that is better informed, but not necessarily better. As accountants stretch themselves around metric-unfriendly factors, such as social impact, an appreciation of history and human behaviour may serve them better than ‘physics envy’.
Jane Fuller is a fellow of CFA Society of the UK and co-director of the Centre for the Study of Financial Innovation.