I think you have spoken about the concept of having a 24-hour takedown rule, so that once a platform has been notified that material is there, there would be a provision to take it down. I think that's a good idea. Of course, the trouble is that when child sexual abuse material or non-consensual images have been up for even 24 hours, they can have hundreds or thousands of viewers—millions in the case of Pornhub and MindGeek. We've heard from victims that explicit images of them were online for three years before they found out. In the case of Serena Fleites, hers were shared and downloaded all over her school before she knew. Then she got into a never-ending back-and-forth trying to get the platforms to be accountable and to take the material down.
Can you explain or enlighten us about what prevention mechanisms might actually be in place?