Adam Smith Institute

Inverting the lesson to be learned

Not that we’ve actively got anything against John Naughton - or even The Observer - but this does seem to be an excellent example of taking the wrong lesson from events, even inverting it:

On Good Friday, a Microsoft engineer named Andres Freund noticed something peculiar. He was using a software tool called SSH for securely logging into remote computers on the internet, but the interactions with the distant machines were significantly slower than usual. So he did some digging and found malicious code embedded in a software package called XZ Utils that was running on his machine.
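Peculiar indeed - and the lovely thing is that the signal was visible to anyone who cared to measure. As a minimal sketch of the idea (assuming Python and a server one controls; the hostname below is purely hypothetical), timing the SSH handshake repeatedly is enough to spot the sort of slowdown Freund did:

    # Hedged sketch: time the SSH banner exchange against a server, looking
    # for the "significantly slower than usual" logins Freund noticed.
    # HOST is an assumption - point it at a machine you control.
    import socket
    import statistics
    import time

    HOST = "example.com"   # hypothetical target
    PORT = 22
    SAMPLES = 10

    def banner_latency(host: str, port: int) -> float:
        """Seconds from TCP connect to the server's SSH identification string."""
        start = time.monotonic()
        with socket.create_connection((host, port), timeout=5) as sock:
            sock.recv(256)  # e.g. b"SSH-2.0-OpenSSH_9.6\r\n"
        return time.monotonic() - start

    timings = [banner_latency(HOST, PORT) for _ in range(SAMPLES)]
    print(f"median {statistics.median(timings) * 1000:.1f} ms, "
          f"max {max(timings) * 1000:.1f} ms over {SAMPLES} attempts")

A rough proxy only, of course - Freund’s actual measurements were of full logins and CPU time - but the point stands: the anomaly was out in the open for anyone to find.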

That discovery, the argument then runs, means this open source software - which runs on pretty much every internet server in the world - is infected and poses a very great danger to us all, to civilisation and all that is right with the world.

This, however, is the wrong conclusion to draw from it:

Who knows? But two clear lessons can be drawn from what we know so far. The first is that we have constructed a whole new world on top of a technology that is intrinsically and fundamentally insecure. The second is that we are critically dependent on open-source software that is often maintained by volunteers who do it for love rather than money – and generally without support from either industry or government. We can’t go on like this, but we will. Those whom the Gods wish to destroy, they first make complacent.

Well, what’s the opposite of open source? Closed source, obviously. At which point a malicious actor might - imagine - introduce some similar malicious code into their proprietary software stack and we’d never know about it. Because we’d not be able to examine the source, not be able to see what they’d done. We’d just be victims, neither knowing about it nor able to do anything about it.
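Whereas with open source, examining is exactly what anyone can do. Part of the XZ Utils backdoor, for instance, hid in release tarballs in files absent from the public git tree - so even the blunt instrument of diffing the two turns up the suspicious extras. A minimal sketch (the paths are hypothetical placeholders; release tarballs legitimately carry some generated files, so the output still needs human judgment):

    # Hedged sketch: list files present in a release tarball but not in the
    # corresponding git checkout. Part of the xz backdoor lived in exactly
    # such tarball-only files. Paths below are hypothetical placeholders.
    import tarfile
    from pathlib import Path

    TARBALL = "xz-5.6.1.tar.gz"    # hypothetical local copy of a release
    GIT_TREE = Path("xz-git")      # hypothetical checkout of the same tag

    with tarfile.open(TARBALL) as tar:
        # Strip the leading "xz-5.6.1/" directory from each member's path.
        tar_files = {Path(*Path(m.name).parts[1:]) for m in tar if m.isfile()}

    git_files = {p.relative_to(GIT_TREE) for p in GIT_TREE.rglob("*")
                 if p.is_file() and ".git" not in p.parts}

    for extra in sorted(tar_files - git_files):
        print("in tarball only:", extra)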

Well, even with closed source we might be able to do something about it, we guess - if we knew about it. Which, roughly, is what we have just done: ripped Huawei kit out of the telecoms infrastructure, haven’t we, on mere suspicion that this specific closed source manufacturer might do something like that?

Which is where we profess ourselves gobsmacked. Open source found the problem, a bit of software updating followed, and we’re done. Closed source was merely suspected (no, Mr. Huawei, please don’t write in, we are not claiming you have or did) and we’ve had to physically rip kit out of the infrastructure. The gobsmacking comes from the conclusion reached - that closed source is therefore the better and more secure?

Blimey.
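And that bit of software updating really was trivial. As a minimal sketch - assuming a Unix-ish box with xz on the PATH - checking for the two releases known to carry the backdoor, 5.6.0 and 5.6.1 (CVE-2024-3094), takes a few lines:

    # Hedged sketch: check whether the local xz is one of the two releases
    # known to carry the backdoor (5.6.0 and 5.6.1, per CVE-2024-3094).
    # Actual remediation - downgrade or update - varies by distribution.
    import re
    import subprocess

    COMPROMISED = {"5.6.0", "5.6.1"}

    out = subprocess.run(["xz", "--version"], capture_output=True, text=True).stdout
    match = re.search(r"xz \(XZ Utils\) (\d+\.\d+\.\d+)", out)

    if match is None:
        print("could not parse xz version output")
    elif match.group(1) in COMPROMISED:
        print(f"xz {match.group(1)} is a known-compromised release - update now")
    else:
        print(f"xz {match.group(1)} is not one of the known-bad releases")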

Sure, we know things go wrong in open markets, but closed designs for economies go wrong more expensively, for longer and worser…