I have repeatedly blogged about discrimination, especially against women and non-whites in labour markets. On raw numbers we often see different outcomes between groups, and since we know that discrimination goes on, we often instinctively attribute these "gaps" to discrimination, as Tim Harford does in an otherwise nice piece here. But once we drill down and get more detail, the gaps often evaporate: the more information employers or clients have about individuals, the less they rely on averages about groups.
Three recent papers looking at room rental service Airbnb find things that point towards a similar conclusion. The first, by Ben Edelman, Michael Luca and Dan Svirsky (pdf), finds that applications from guests with distinctively African-American names are 16pp less likely to be accepted. It looks like an instance of straightforward discrimination, but the authors don't test for alternative explanations (e.g. what the effect of race is after controlling for crime rates or socioeconomic status), so it's impossible to say for sure. However, they do find that African-American hosts accept African-American applicants at the same rate as white hosts do, suggesting that sheer racism alone is unlikely to be the explanation, although of course it very reasonably may not feel that way to black Americans unable to find a room.
A second paper, from Morgane Laouénan and Roland Rathelot (pdf), also finds a raw gap between races, but they have the sorts of data that can distinguish between bigotry and statistical discrimination. Guests demand ethnic minority hosts' apartments less, resulting in prices 3.2% lower on average, but once minority hosts have reviews on the system this gap mostly goes away. It does not vanish entirely, and guests do seem to have some sort of "taste" for same-ethnicity hosts, but this accounts for a relatively small portion of the gap, less than a quarter. The authors hypothesise that an even better feedback and information system would narrow the gap further.
Finally, a third paper, from Ruomeng Cui, Jun Li, and Dennis J Zhang (pdf), finds the same thing. Absent reviews, people discriminate against groups based on lower intra-ethnic trust and on facts like average crime rates. But as reviews accumulate, even mildly negative ones, the gap falls towards zero. They call on platform owners like Airbnb to do more to encourage reviewing, adding information and shifting judgements from "coarse-grained" signals like group averages towards information centred on the individual.
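The mechanism all three papers point at can be sketched as simple Bayesian updating: a host's best guess about a guest starts at the group average and is pulled towards the guest's own record as reviews accumulate. The toy model and all numbers below are my illustration, not anything from the papers themselves.

```python
# Illustrative sketch (not from the papers): a host estimates a guest's
# "quality" as a precision-weighted average of a group-level prior and the
# guest's own review history. All numbers are made up for illustration.

def posterior_quality(group_mean, group_var, reviews, review_var=1.0):
    """Combine a group prior with individual reviews.

    group_mean, group_var: the host's prior belief about the guest's group.
    reviews: the guest's individual review scores.
    review_var: assumed noise in a single review.
    """
    n = len(reviews)
    if n == 0:
        # No individual information: the guest is judged purely by the group average.
        return group_mean
    prior_precision = 1.0 / group_var
    data_precision = n / review_var
    sample_mean = sum(reviews) / n
    # As n grows, data_precision dominates and the group prior fades out.
    return (prior_precision * group_mean + data_precision * sample_mean) / (
        prior_precision + data_precision
    )

# A guest whose reviews (4.5) sit above their group's average (3.5):
print(posterior_quality(3.5, 1.0, []))          # 3.5  -> pure group average
print(posterior_quality(3.5, 1.0, [4.5]))       # 4.0  -> halfway to the individual signal
print(posterior_quality(3.5, 1.0, [4.5] * 10))  # ~4.41 -> mostly the individual record
```

The point of the sketch is the last line: with enough individual data, even a rational statistical discriminator ends up ignoring the group average, which is exactly the pattern the review data in these papers shows.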
As I said above, statistical discrimination may not feel fair. Yes, it is based on accurate group facts (otherwise competition drives it out), and yes, it is efficient in the absence of better information, but it still judges you not as an individual, but as an average of your observable characteristics. Satisfyingly, though, increasing information seems to work, driving out even statistical discrimination through robust endogenous incentives. People aren't incorrigibly discriminatory; they just need more data.