As Facebook shows, this planning stuff is difficult

There’s an amusement in that it’s the Observer which retails this story to us. It’s about the problems that Facebook has in working out what might be allowable to say on its site and what might not be. The point being that vast resources are being thrown at the problem and it’s not really being solved:

Two weeks ago, the New York Times was leaked 1,400 pages from the rulebooks that the company’s moderators are trying to follow as they police the stuff that flows through its servers.

Some rulebook there.

An examination of the leaked files, says the NYT, “revealed numerous gaps, biases and outright errors. As Facebook employees grope for the right answers, they have allowed extremist language to flourish in some countries while censoring mainstream speech in others.” Moderators were instructed, for example, to remove fundraising appeals for volcano victims in Indonesia because a co-sponsor of the drive was on Facebook’s internal list of banned groups; a paperwork error allowed a prominent extremist group in Myanmar, accused of fomenting genocide, to stay on the platform for months. And there was lots more in this vein.


Some numbers might help to put this in context. Facebook currently has 2.27bn monthly active users worldwide. Every 60 seconds, 510,000 comments are posted, 293,000 statuses are updated and 136,000 photos are uploaded to the platform.
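To see what those per-minute figures mean for the moderators, it's worth scaling them up. A quick back-of-the-envelope sketch (the per-minute numbers are from the article quoted above; the daily extrapolation is my own arithmetic):

```python
# Per-minute figures quoted in the article; daily totals extrapolated.
PER_MINUTE = {
    "comments posted": 510_000,
    "statuses updated": 293_000,
    "photos uploaded": 136_000,
}

MINUTES_PER_DAY = 60 * 24  # 1,440

for item, per_min in PER_MINUTE.items():
    per_day = per_min * MINUTES_PER_DAY
    print(f"{item}: {per_day:,} per day")
# comments alone come to over 730 million a day
```

Over 700 million comments a day, against a 1,400-page rulebook. That's the scale mismatch the cybernetician is pointing at.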

The conclusion?

To a cybernetician, though, it is merely confirmation that Facebook is no longer a viable system.

The whole thing doesn’t work because the central authorities do not, and cannot, have either the information or the rules needed to plan and manage things centrally.

Now expand that out to the entire series of economic interactions between 7 billion people, not just the internet scribblings of a minority of them. The idea of the centre being able to even monitor, let alone plan, all of that is just not going to work, is it?

The truth here being that:

Way back in the 1950s, a pioneering British cybernetician, W Ross Ashby, proposed a fundamental law of dynamic systems. In his book An Introduction to Cybernetics, he formulated his law of requisite variety, which defines “the minimum number of states necessary for a controller to control a system of a given number of states”. In plain English, it boils down to this: for a system to be viable, it has to be able to absorb or cope with the complexity of its environment. And there are basically only two ways of achieving viability in those terms: either the system manages to control (or reduce) the variety of its environment, or it has to increase its internal capacity (its “variety”) to match what is being thrown at it from the environment.

That second isn’t possible, therefore a planned economy could only ever be a simple economy. And the thing is, it’s the complexity itself which produces the riches of an economy. Thus simple economies are poor ones. We can be planned and poor, or unplanned and at least possibly rich. Choose wisely.