The issue of non-consensual intimate partner images, what some call "revenge porn," is an example where we accept that the computer will sometimes get it wrong and we choose to be more aggressive anyway. The system might occasionally take down an image of someone who merely looks very similar to the person in the reported image. It's one of those questions about whether we accept false positives or false negatives. Having a broader definition for something like that is okay if the consequence is only that a little bit of pornography disappears from the Internet.
For topics that are more controversial, relying on censorship is much harder, because the scope, the complexity, and the diversity of ideas are so broad. That's a much harder problem for computers than simply checking whether this is the same person and the same image as what was reported. In the image case you're matching one-to-one against a known item; with ideas you'd be matching one-to-many.
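To make the one-to-one matching concrete, here is a minimal sketch in Python of how a system might check a candidate image against a reported one. It uses a simple "average hash" as an illustrative stand-in for the perceptual hashing that production systems (such as PhotoDNA) use; the function names and the threshold value are hypothetical, and the point is only how a single tunable number trades false negatives against false positives.

```python
# A minimal sketch of one-to-one perceptual matching, assuming Pillow is
# installed. The "average hash" below is an illustrative stand-in for
# production perceptual hashes; names and the threshold are hypothetical.
from PIL import Image


def average_hash(path: str, size: int = 8) -> int:
    """Reduce an image to a tiny grayscale grid and encode each pixel as
    one bit: 1 if brighter than the grid's mean, 0 otherwise."""
    pixels = list(Image.open(path).convert("L").resize((size, size)).getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits


def hamming_distance(a: int, b: int) -> int:
    """Count the bits on which two hashes disagree."""
    return bin(a ^ b).count("1")


def matches_reported(candidate: str, reported: str, threshold: int = 10) -> bool:
    """One-to-one check: is this plausibly the same image that was reported?
    A looser threshold catches more re-encoded or lightly edited copies
    (fewer false negatives) at the cost of occasionally flagging a merely
    similar image (more false positives)."""
    return hamming_distance(average_hash(candidate), average_hash(reported)) <= threshold
```

The threshold is exactly the policy dial discussed above: widening it accepts more false positives in exchange for catching more true copies, which may be an acceptable trade for this category of content but not for judging controversial speech.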