This is a complaint about bureaucratic smokeblowing, not AI
I took an algorithm to court in Sweden. The algorithm won
Charlotta Kronblad
And, well, no, not really. The local council used an algo to allocate school places without telling the algo that Gothenburg has a large river running through it. Thus as-the-crow-flies distances are markedly different from travel distances. That’s a problem with the local authority being dingbats.
Such things happen, of course. All organisations will contain dingbats, given the human propensity for being a dingbat. It’s the next bit:
The resulting algorithmic injustice is not an abstract problem, nor a problem specific to the Swedish context, it painfully echoes recent scandals across Europe. One is the Post Office scandal in the UK, where the Horizon IT system falsely accused hundreds of post office operators of theft, leading to prosecutions, bankruptcies and even imprisonment. For years, the system output was treated as near-infallible. Human testimony was bent to the authority of the machine. Another example is the childcare benefits scandal in the Netherlands, where a system deployed by the Dutch tax authority wrongly flagged thousands of parents as fraudsters. Families were plunged into debt. Many lost their homes. Children were taken into foster care. In both these cases, the algorithmic malfunctions continued for many years, as the automated systems operated behind a veil of technical complexity and institutional defensiveness. Errors multiplied. Harm deepened. Accountability lagged.
The initial Post Office error came from the original planners - the sketchers-out of the code - being dingbats. We’ve been told by those in the industry that it was simply that incomplete transactions were counted as complete, so that when the transaction was repeated - in order to complete it - two transactions were recorded, not one. That is a level of dingbatry astonishing in those trying to write banking or payments software. But such things happen because humans, you know, dingbats.
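A minimal sketch of how that class of bug plays out, assuming (purely for illustration; the real Horizon system is far more complex and the precise defect details are disputed) that a retried incomplete transaction gets posted as a fresh entry rather than replacing the stalled one:

```python
# Hypothetical illustration only: a ledger that posts a retried transaction
# as a brand-new entry instead of reconciling it against the incomplete
# original. Names and structure are invented for this sketch, not Horizon's.

ledger = []

def post_transaction(txn_id, amount, completed):
    """Append a transaction record; note there is no check for an
    existing incomplete record with the same txn_id."""
    ledger.append({"id": txn_id, "amount": amount, "completed": completed})

# A sale of 50 stalls partway through...
post_transaction("T1", 50, completed=False)
# ...so it is retried, and this time it completes.
post_transaction("T1", 50, completed=True)

# The branch total now counts the same sale twice, because the incomplete
# attempt was never voided or reconciled against the retry.
total = sum(t["amount"] for t in ledger)
print(total)  # 100, when the true takings were 50
```

In this toy version the obvious fix is to make posting idempotent: key on the transaction id and replace or reconcile the incomplete record rather than appending a second one.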
It’s not the algo, nor the original mistake, that’s the problem. It’s the years of bureaucratic smokeblowing, the refusal to acknowledge the mistake, to rectify it or even to engage with reality. It’s a problem with bureaucracy, not algos.
Which, of course, means that we don’t stop doing things with algos; we stop doing things with bureaucracies. Simply because a sufficient number of humans just are dingbats, and therefore we need an external error correction mechanism. The very thing we don’t get in bureaucratic systems and do get in market and competitive ones.
Fewer lanyards, more consequences, that’s the answer here.
Tim Worstall