What’s gone wrong with government policy making? Political expert Ivor Gaber says the answer is quite simple... in one word: algorithms!

NO ONE, least of all me, underestimates the difficulties this government has faced in trying to deal with the first major pandemic in over a century.

They’ve had a thankless task, but they could have done better, much better.

One reason why they’ve made so many mistakes, the A-level fiasco being the latest, is that, unlike previous administrations, they have allowed their policy-making process, whether it has been about health, education or whatever, to be driven not by human intelligence but by artificial intelligence. We’ve heard a lot recently about algorithms, but what are they and how are they driving policy?

An algorithm is a complex computer programme that uses “big data” not just to offer an answer to a problem but to implement it as well, without any human intervention. For example, every time you click on a web page, or a link in your social media feed, an algorithm springs into action, holding an auction in milliseconds among potential advertisers to see who will pay the highest price to “access your eyeballs”. And algorithms drive both the auction and the decisions by the would-be advertisers, who use the thousands of bits of data they hold on all of us to decide how much they are willing to bid to pop up on your screen.
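For readers who want to see the machinery laid bare, here is a deliberately simplified sketch, written in Python, of that kind of auction. Every advertiser, bid and scrap of browsing data in it is invented for illustration; a real ad exchange weighs thousands of data points in those same few milliseconds, but the principle is the same: the data decides the bid, and the bid decides what you see.

```python
# Illustrative sketch only: a toy version of the kind of real-time ad auction
# described above. All advertiser names, bid values and profile fields are
# invented; real ad exchanges are vastly more complex.

def score_bid(advertiser, profile):
    """Each advertiser values the impression according to the data it holds on you."""
    bid = advertiser["base_bid"]
    # Bid more if your recent browsing matches the advertiser's product.
    if advertiser["product"] in profile["recent_searches"]:
        bid *= 2.0
    return round(bid, 2)

def run_auction(advertisers, profile):
    """Second-price auction: the highest bidder wins and pays the runner-up's bid."""
    bids = sorted(
        ((score_bid(a, profile), a["name"]) for a in advertisers),
        reverse=True,
    )
    winning_bid, winner = bids[0]
    price_paid = bids[1][0] if len(bids) > 1 else winning_bid
    return winner, price_paid

if __name__ == "__main__":
    profile = {"recent_searches": ["trainers"]}  # the "thousands of bits of data", reduced to one
    advertisers = [
        {"name": "ShoeShop", "product": "trainers", "base_bid": 0.40},
        {"name": "BookShop", "product": "books", "base_bid": 0.55},
    ]
    winner, price = run_auction(advertisers, profile)
    print(f"{winner} wins the impression and pays £{price:.2f}")
```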

That’s why shortly after looking at sites about trainers, adverts for trainers suddenly start appearing. And algorithms are also why your social media feed gives highest prominence to those opinions that you already agree with.

So the data and behavioural scientists advising the Government, or in the A-level case its agency Ofqual, devised an algorithm to “award” A-level results, and those results were manifestly unfair. The algorithm, using “artificial intelligence”, was eventually ditched in favour of the human intelligence of the teachers. The designers of government policy, not just in this case but across the board, appear to have forgotten three basic rules that those of us who work with statistics, but are not statisticians, have burnt into our brains:

Rule one: always remember, rubbish in, rubbish out.

Rule two: algorithms produce unambiguous answers when in fact the world is full of ambiguity.

Rule three: algorithms are good at modelling past behaviour, but when it comes to predicting the future they are no better than the rest of us (indeed sometimes worse).

The first rule is a warning that if your original data is flawed or incomplete (as was the case with the A-levels), then your results will be not just flawed but substantially flawed, because the algorithm can seize on incorrect trends and exaggerate them.
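To make that concrete, here is a crude sketch of my own devising. It is emphatically not Ofqual’s actual formula, but it shows what happens when the main input is a school’s past results rather than a pupil’s own work: the output reproduces, and entrenches, the flaw in the data.

```python
# Not the real Ofqual model: a deliberately crude illustration of "rubbish in,
# rubbish out". If the input is a school's past results rather than the pupil's
# own work, the output faithfully reproduces, and amplifies, that flaw.

def moderated_grade(teacher_grade, school_historic_average, weight=0.7):
    """Pull an individual's grade towards the school's past average.

    The 'weight' given to historic data is an invented parameter; however it is
    tuned, the flawed input dominates the individual's result.
    """
    return round(weight * school_historic_average + (1 - weight) * teacher_grade)

# A pupil predicted an A (scored 6 on a 1-6 scale) at a school whose past
# cohorts averaged a D (3) is marked down, regardless of their own performance.
print(moderated_grade(teacher_grade=6, school_historic_average=3))  # -> 4, roughly a C
```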

The second rule is a warning to beware when your algorithm starts to spew out “results” that seem to point to strong conclusions. Algorithms produce black-and-white answers, and it is all too easy to be seduced by them and forget that this is precisely what the algorithm is designed to deliver.

And third, algorithms might be “intelligent”, but they are not prophets. They cannot predict how people will react to any changes they might propose. So, for example, in the run-up to lockdown the algorithms used by the behavioural scientists led them to predict that people would rapidly tire of its restrictions, and so its introduction should be delayed. They were wrong. People abided by the rules far more stringently, and for longer, than the algorithm predicted.

So what’s new? Why have algorithms, and their close relations big data and behavioural science, come to dominate the Government’s policy making? The answer is simple: Dominic Cummings.

The Prime Minister’s chief adviser has been obsessed with these policy tools for a long time and has succeeded in convincing ministers of their efficacy, reinforced by the small army of true believers he has infiltrated into both government and its lavishly paid consultants. And why have ministers fallen for these false gods so easily? First, because algorithms can make the difficult decisions for them. Second, because they appear to absolve them of responsibility should things go wrong, while allowing them to claim the credit should things go right. And, finally, because they are, in Mrs Thatcher’s word, “frit”.

They are frightened of standing up to Cummings et al and fearful that, should they do so, they will fall out of favour with the Prime Minister, who appears to value loyalty to him as the ultimate qualification for high office. Until we have a cabinet made up of ministers with backbone, and a Prime Minister (and chief adviser) who, whilst taking his job seriously, is prepared to be challenged, algorithms will continue to stalk the land and we will all continue to pay the price.

Ivor Gaber is Professor of Political Journalism at the University of Sussex and a former political correspondent based at Westminster.