Algorithms are already part of the fabric of our lives in ways we don't think about or appreciate. A lot of people think of them as neutral: algorithms are machines, they learn, they don't have opinions. But in practice, algorithms end up reproducing and amplifying the biases that we know exist in our systems and in people. Governments need to establish boards that would investigate the ethical parameters of how algorithms are developed and used.
I'll give you one specific case. In one study, when men and women searched for the same jobs online, men were six times more likely than women to be shown listings for positions paying $200,000 or more. That's interesting to me because I come from an old-school newspaper background. Gender-segregated job ads were made illegal in the United States between roughly 1968 and 1972, yet what is effectively happening under our noses, in plain sight, is the re-segregation of the job market by algorithms that build on people's preferences. It may simply be that more men apply for those jobs, which means they are then shown more of them, but right now no one is paying attention to the fact that women aren't seeing them at all. That's one thing, and we know this is happening in terms of racist impacts as well.
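To make that feedback loop concrete, here is a minimal sketch of a click-optimizing ad system. The click rates and the update rule are invented for illustration, not taken from the study she mentions:

```python
# Minimal sketch of an ad system that optimizes purely on historical
# clicks. All numbers are hypothetical, not data from the study above.

click_rate = {"men": 0.06, "women": 0.05}   # assumed: men apply slightly more
exposure = {"men": 0.5, "women": 0.5}       # the ad starts out shown equally

for _ in range(20):  # twenty rounds of "optimize on yesterday's clicks"
    clicks = {g: exposure[g] * click_rate[g] for g in exposure}
    total = sum(clicks.values())
    # Reallocate exposure toward whichever group clicked more last round.
    exposure = {g: clicks[g] / total for g in clicks}

print(exposure)
# {'men': 0.97..., 'women': 0.02...} -- a one-percentage-point gap in
# click rate compounds until women barely see the listing at all.
```

Nobody codes "show this to men"; the disparity emerges from optimizing on engagement alone, which is exactly why it goes unnoticed.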
There are so many dimensions to algorithms and accountability that we could do this for weeks, but I will mention one thing. Google's Jigsaw branch has been developing a tool called "wiki detox," which is meant to assess whether language on Wikipedia is an attack, aggressive, or neutral. It's a learning tool: they had people rate samples of language, the algorithm learned from those ratings, and now it can score new language. So if somebody says something very racist, the algorithm can say, "That's an absolute attack, and we shouldn't let it into the comments on X's newspaper." But when I put in gender slurs, they all came out as neutral, not attacks. If you put in a sentence that says, "Excuse me, you're a dick," it comes out as an attack; if you put in, "You should be raped," it's 50/50.
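I don't know the internals of the Jigsaw model, but the failure mode she describes falls directly out of how any rater-trained classifier is built: it can only flag the kinds of attacks its human raters flagged. A minimal sketch, using scikit-learn and an invented toy training set (this is not Jigsaw's model or data):

```python
# Minimal sketch of a rater-trained attack classifier. The comments
# and labels below are invented for illustration only.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

comments = [
    "excuse me, you're a dick",        # raters labeled: attack
    "you are an idiot",                # raters labeled: attack
    "shut up, loser",                  # raters labeled: attack
    "thanks for fixing the citation",  # raters labeled: neutral
    "I disagree with you on this",     # raters labeled: neutral
    "see the talk page for sources",   # raters labeled: neutral
]
labels = [1, 1, 1, 0, 0, 0]  # 1 = attack, 0 = neutral

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(comments, labels)

# A bag-of-words model can only score words the raters actually labeled.
# "dick" appeared in an attack example; "raped" appears nowhere in the
# training set, so the model has almost nothing to go on and drifts
# toward a coin flip.
for text in ["excuse me, you're a dick", "you should be raped"]:
    print(text, "->", model.predict_proba([text])[0][1])
```

The point isn't the toy numbers; it's that the scores are bounded by what the raters chose to flag, so blind spots at the rating step resurface later as "neutral" verdicts on threats.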
We need to understand how these tools are being fed information, how the assessment is done, who's doing the assessing, and then how they're going to be implemented. There are really no mechanisms for that right now. Pornography plays into this because it's in the language these tools learn from, as it is on Twitter.