To remind: AI should be biased because the world is biased
This is at least the start of the correct way to deal with bias in Artificial Intelligence:
A police chief has admitted artificial intelligence used to boost crime fighting will contain bias but pledged to combat the risks.
At the heart of the point is the question, well, is reality biased? There are certainly those myriads who insist it is, yes. OK, so we want to use AI to aid us in managing reality. Therefore the AI has to start from the point that reality is biased. And those shrieking loudest about reality’s bias are the very people who should be insisting the AI recognises that bias. Because we need the results from the AI to reflect reality.
Sure, sure, we might want to reduce that bias. Even correct for it in our actions. But we’ve got to acknowledge it first, see what the results are, then go to work.
For example, should AI assume that FTSE100 CEO jobs are equally distributed by sex? Or even gender? No, obviously not: as a report today points out, those jobs are not equally distributed. So, if we want to have something useful said about reality we’ve got to start from that information about that bias in society.
Note that our ideas about bias are irrelevant here. Whether it’s a good thing or a bad thing, whether there should be more or less of it. We must start from describing that reality outside the window.
This is all, at heart, that ought/is distinction. It’s possible that society ought to be less biased than it is. It’s even possible - tho’ perish the thought, eh? - that some of what is described as bias is just the way things are and will be. But to be able to make decisions about anything we’ve got to acknowledge where we are right now.
To allow AI to assume the ought is to make the GOSPLAN mistake. Given that the plan for food production has been fulfilled, it is the empty shops that are the incorrect information - because the plan has been fulfilled, see?
A useful AI is not only going to be biased, it must be biased - for the very reason that the society around us is so.
Tim Worstall