Yes, absolutely.
We've been talking a lot about adults, but this is also happening in the space of child sexual abuse material. A lot of harm comes from the limits of the systems that detect this type of material, which rely on hash values of known, real material. The fake material doesn't have matching hash values in the databases being relied on, so removing it becomes an incredible challenge.
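[Editor's note: the detection gap described here can be sketched in a few lines. This is an illustrative toy, not how production systems work; real systems use perceptual hashes such as PhotoDNA rather than cryptographic hashes, and the database and filenames below are hypothetical.]

```python
import hashlib

# Hypothetical database of hashes of previously identified material.
# Real systems use perceptual hashes (e.g., PhotoDNA), not SHA-256.
known_hashes = {
    hashlib.sha256(b"previously-identified-file").hexdigest(),
}

def is_flagged(content: bytes) -> bool:
    """Flag content only if its hash already exists in the database."""
    return hashlib.sha256(content).hexdigest() in known_hashes

# Known material matches an existing hash and is flagged.
print(is_flagged(b"previously-identified-file"))  # True
# Newly generated material has no database entry, so it slips through.
print(is_flagged(b"freshly-generated-file"))      # False
```

The sketch shows why hash matching only catches material that has already been seen and catalogued: anything newly generated produces a hash with no database entry.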
There's already a lot of CSAM out there, and now all sorts of new CSAM is being created, so we're talking about making it even more—