I'll jump in there. The other thing we have to be conscious of is that I don't think we actually want to keep kids away from all sexually explicit material. There's a lot of information about sexual activity and sexual health that kids need to know, which I distinguish from violent pornography. The idea of surveilling kids to prevent them from accessing content about sexuality would, I think, be a real problem, whether or not you can algorithmically distinguish between violent pornography, which in my view is a problem not just for kids but for adults too, and sexually explicit material that it's important for people to have access to. That's another problem.
Filters often over-filter, so you lose access to material that's important for sexual health, for example, or for developmentally appropriate sexual curiosity and self-definition. Again, going back to eGirls, the girls told us that surveillance is a problem, not a solution. I'm not sure that mechanisms that surveil or block kids are necessarily the approach we want to take, even if, scientifically, we actually could design the algorithms to do that fairly well.