Slate - It’s This Invisible System of Harm - Interview with Cathy O'Neil

Aaron Mak spoke with Cathy O’Neil to understand how an algorithm could be systematically giving female Apple Card customers lower credit lines than men—and to discuss how the opacity of such algorithms often allows discrimination to persist.

What kind of information would we want to see in order to determine whether there was gender discrimination? 

There are two broad types of evidence that we’d want. One of them is statistical evidence that they have a definition of fairness in every direction that matters, meaning for every protected class they’ve checked that they’re not being discriminatory. They would have to define what it means to be fair with respect to gender, or fair with respect to race, or fair with respect to veteran status, or whatever the protected class is. They’d have to define fairness and they’d have to show us evidence that they’ve tested the algorithm for fairness. That’s a statistical question, and it’s something an individual probably would never be able to get at, which is why I’m not ready to say this is gender discrimination: we just have two data points. We can’t categorize this problem right now with the information we have.
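
To make the kind of test O’Neil is describing concrete, here is a minimal sketch (not from the interview; the data, function name, and threshold are hypothetical) of one such statistical check: a permutation test asking whether the gap in credit lines between two groups is larger than chance would produce. A real audit would also control for legitimate inputs such as income and credit history, and would repeat the check for every protected class.

```python
import numpy as np

def credit_line_gap_test(lines, is_protected, n_permutations=10_000, seed=0):
    """Permutation test for a difference in mean credit line between groups."""
    rng = np.random.default_rng(seed)
    lines = np.asarray(lines, dtype=float)
    is_protected = np.asarray(is_protected, dtype=bool)

    # Observed gap: mean credit line for the non-protected group minus the protected group.
    observed_gap = lines[~is_protected].mean() - lines[is_protected].mean()

    # Shuffle the group labels to simulate a world with no group effect,
    # and count how often a gap at least this large appears by chance.
    count = 0
    for _ in range(n_permutations):
        shuffled = rng.permutation(is_protected)
        gap = lines[~shuffled].mean() - lines[shuffled].mean()
        if abs(gap) >= abs(observed_gap):
            count += 1
    p_value = count / n_permutations
    return observed_gap, p_value

# Hypothetical example: six applicants, credit lines in dollars, grouped by gender.
gap, p = credit_line_gap_test(
    lines=[20_000, 18_000, 25_000, 9_000, 8_500, 12_000],
    is_protected=[False, False, False, True, True, True],
)
print(f"mean gap = ${gap:,.0f}, p = {p:.3f}")
```

With only a handful of data points, as in this toy example, such a test has almost no power, which is her point: only the company running the algorithm has enough data to do this properly.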

That’s one of the reasons the only people who can do this testing, the people who use and deploy the algorithm, are the ones who must do it. They’re the ones who are capable of doing it. Technically speaking it’s a legal requirement, but the regulators in charge of this kind of thing simply do not know how to force the companies to do it. So that’s got to change.