Thank you, Madam Chair.
I'm very honoured to be here. I apologize in advance that I have a hard deadline due to child care obligations, so let me get right to it.
I'm not an expert on the harms caused by what the committee is studying, that is, exposure to illegal explicit sexual content. The focus of my remarks today will be on the technological means by which this kind of content is distributed and on what can be done about it in compliance with the charter.
Just to frame my remarks, I think we can distinguish between two kinds of material. Certain material is per se illegal: child sexual exploitation material is always illegal. But we face a challenge with what I would call “conditionally illegal” material. I think the non-consensual distribution of intimate imagery falls into this category, because the illegality depends on whether the distribution, or the creation, for that matter, is consensual or not.
The challenge we face is in regulating the distribution of this content through channels that are general purpose. Take a social media platform, whichever one you want, such as Instagram or TikTok, or take a messaging platform such as WhatsApp. The problem with regulating the distribution of this content on those platforms is, of course, that we use them for many positive purposes, yet they can be used for ill as well.
I'd like to pivot briefly to discuss the online harms act, which is, of course, before Parliament right now and which I think offers a good approach to dealing with one part of the distribution challenge with regard to social media platforms. These are platforms that take content generated by individuals and make it available to a large number of people. I think the framework of this law is quite sensible in that it creates “a duty to act responsibly”, which gets at the systemic problem of how platforms curate and moderate content. The idea here is to reduce the risk that this kind of content gets distributed on these platforms.
The bill is, in my view, well designed, in that there is also a duty to remove content, especially child sexual exploitation material and non-consensually distributed intimate imagery, to the extent that the platforms' own moderation efforts or user reports flag that content as unlawful. This is a sensible approach that I think is, in its broad strokes, compliant with the charter.
The challenge, however, is with the effectiveness of these laws. It's very hard to determine in advance how effective they will be, because of issues with determining both the numerator and the denominator. I don't want to take us too far into mathematical territory, but the denominator here is the prevalence of this content online or on any given platform, and that is very hard to measure, in part because whether the content is legal or not depends on the conditions in which it's distributed. The numerator is how well the platforms are doing the job of getting it off, and again, we have issues with identifying what's in and what's out. This is a step forward, but the bill has limitations.
One way of understanding the limitations is with an analogy that a friend of mine, Peter Swire, who teaches at Georgia Tech, calls the problem of “elephants and mice”. The elephants are large, powerful and visible actors: your Metas and your TikToks, or even a company like Pornhub, which has a very large and significant presence. These are players that can't hide from the law. What is difficult in this space is that there are many mice. Mice are small, they're furtive and they reproduce very quickly. They move around in darkness. This law is going to be very difficult to implement with regard to those kinds of actors, the ones we find in the darker corners of the Internet.
Again, I think Bill C-63 is a very—