Evidence of meeting #124 for Canadian Heritage in the 44th Parliament, 1st Session. (The original version is on Parliament's site, as are the minutes.)

A recording is available from Parliament.

Also speaking

Vivek Krishnamurthy  Associate Professor of Law, University of Colorado Law School, As an Individual
Emily Laidlaw  Associate Professor and Canada Research Chair in Cybersecurity Law, University of Calgary, As an Individual
Carol Todd  Founder and Mother, Amanda Todd Legacy Society
Clerk of the Committee  Ms. Geneviève Desjardins
Dianne Lalonde  Research and Knowledge Mobilization Specialist, Centre for Research and Education on Violence Against Women and Children
Jocelyn Monsma Selby  Clinical Therapist, Researcher Specializing in Forensic Sexology and Addiction, and Chair, Connecting to Protect
Marc-Antoine Vachon  Lieutenant, Sûreté du Québec

June 11th, 2024 / 4:55 p.m.

Liberal

The Chair Liberal Hedy Fry

I'm calling the meeting to order, please.

Welcome to meeting number 124 of the House of Commons Standing Committee on Canadian Heritage.

I would like to acknowledge that this meeting is taking place on the unceded traditional territory of the Algonquin Anishinabe peoples.

Pursuant to Standing Order 108(2) and the motion adopted by the committee on February 14, 2022, the committee is resuming its study of online harms.

Before we begin, I want to do the usual housekeeping, mostly for the benefit of the visitors.

Please take note of the following preventative measures in place to protect the health and safety of all participants, including the interpreters.

Only an approved black earpiece may be used; the former grey earpieces must no longer be used. Keep your earpiece away from all microphones at all times. When you're not using your earpiece, please place it face down on the decal in front of you. Thanks for your co-operation.

You're not allowed to take pictures of the screen or of what is going on here. The proceedings will be posted publicly later on.

In accordance with the committee's routine motion, I think our clerk has already made sure the witnesses have completed the required connection tests in advance of the meeting. Thank you.

I want to make a few comments for the benefit of members and witnesses.

Please wait until I recognize you by name before speaking. If you're in the room, raise your hand if you wish to speak. If you're appearing virtually, raise your virtual hand. Thank you very much. All comments should be addressed through the chair.

Pursuant to the motion adopted by the committee on Tuesday, April 9, we have Claude Barraud, a psychotherapist from Homewood Health, in the room with us today. During the meeting, should you feel distressed or uncomfortable due to the sensitive nature of the committee's study, you can speak with Mr. Barraud, who is available to assist.

I now want to welcome our witnesses.

Joining us by video conference are Vivek Krishnamurthy, associate professor of law, University of Colorado law school; Emily Laidlaw, associate professor and Canada research chair in cybersecurity law, University of Calgary; Carol Todd, founder and mother, Amanda Todd Legacy Society; and Dianne Lalonde, research and knowledge mobilization specialist, Centre for Research and Education on Violence Against Women and Children.

In the room, we have, from Connecting to Protect, Dr. Jocelyn Monsma Selby, chair, clinical therapist and researcher specializing in forensic sexology and addiction; and from the Sûreté du Québec, Marc-Antoine Vachon, lieutenant.

You each have five minutes....

Monsieur Champoux, please go ahead.

5 p.m.

Bloc

Martin Champoux Bloc Drummond, QC

Madam Chair, I would like the members of the committee to agree on one thing before we start the meeting.

The bells are expected to ring in the next few minutes for us to go vote. So I would like us to make sure that the witnesses will be able to finish their opening remarks, even if the bells are ringing. If there is time for one round of questions, so much the better.

Out of respect for the witnesses who are testifying as part of our study, we must give them as much time as possible.

5 p.m.

Liberal

The Chair Liberal Hedy Fry

Thank you, Monsieur Champoux.

Is everyone in agreement? It's a reasonable request.

Mrs. Thomas, please go ahead.

5 p.m.

Conservative

Rachael Thomas Conservative Lethbridge, AB

Just to further define that, I wonder if there would be agreement around the table to allow remarks and questions to continue until, let's say, 10 minutes before the vote or five minutes before the vote, if we're willing to vote virtually.

5 p.m.

Liberal

The Chair Liberal Hedy Fry

That's traditional, yes.

Does anyone disagree with that?

5 p.m.

Liberal

Michael Coteau Liberal Don Valley East, ON

I'm not disagreeing, but if everyone's voting virtually from here, why can't we all vote and then get back to business?

We can't according to the rules. Is that right?

5 p.m.

Liberal

The Chair Liberal Hedy Fry

We can't according to the rules. We have to wait until the vote is counted.

5 p.m.

Liberal

Michael Coteau Liberal Don Valley East, ON

Then I'll just follow what was said to maximize the time.

5 p.m.

Liberal

The Chair Liberal Hedy Fry

Yes. We have to wait until the vote is counted, Michael.

Go ahead, Taleeb.

5 p.m.

Liberal

Taleeb Noormohamed Liberal Vancouver Granville, BC

Just to clarify, what Mrs. Thomas is saying, and I think I would be okay with this, is that, if it's a 30-minute bell, we do 20 minutes. People can go and vote and do what they need to do. Then we'd do as custom once the vote is counted.

5 p.m.

Liberal

The Chair Liberal Hedy Fry

There is an option, of course, for you not to go anywhere. I can suspend and you can vote here.

She's suggesting five or 10 minutes. Do you want five or 10, people? Speak now or I will make a decision, and you may not like it.

5 p.m.

Conservative

Rachael Thomas Conservative Lethbridge, AB

I say five.

5 p.m.

Liberal

The Chair Liberal Hedy Fry

Five is good. I think five is good, since we're voting virtually.

5 p.m.

An hon. member

[Inaudible—Editor]

5 p.m.

Liberal

The Chair Liberal Hedy Fry

Because we cannot go back into business until—

5 p.m.

Some hon. members

Oh, oh!

5 p.m.

Liberal

The Chair Liberal Hedy Fry

Guys, can I have one person speaking at a time?

We cannot go back into business until the votes are counted and read. We could cut short at the beginning. That's fine.

Okay. That's how we're going to do it.

Now, I've given the names of all those who will be presenting today. We've put them all into one presentation group as opposed to two separate groups. Again, that's for time.

I want to apologize to the witnesses. Votes tend to do this, and it disrupts committee a lot. Quite often, people who have come and are waiting to present find themselves unable to do so.

We will have your presentations now. You each have five minutes to present.

Mr. Krishnamurthy, we will begin with you. You have five minutes, please.

5 p.m.

Vivek Krishnamurthy Associate Professor of Law, University of Colorado Law School, As an Individual

Thank you, Madam Chair.

I'm very honoured to be here. I apologize in advance that I also have a hard deadline, due to child care obligations, so let me get right to it.

I'm not an expert on the harms caused by what the committee is studying, that is, exposure to illegal explicit sexual content. The focus of my remarks today will be on the technological means by which this kind of content is distributed and what can be done about it in compliance with the charter.

Just to frame my remarks, I think we can distinguish between two kinds of material. There's certain material that's per se illegal. Child sexual exploitation material is always illegal, but we face a challenge with material that's what I would call “conditionally illegal”. I think non-consensual distribution of intimate imagery falls into this category, because the illegality depends on whether the distribution is consensual or not—or the creation, for that matter.

The challenge we face is in regulating the distribution of this content through channels that are general purpose. Take a social media platform, whichever one you want—Instagram, TikTok—or take a messaging platform such as WhatsApp. The problem with regulating the distribution of this content on those platforms is, of course, that we use them for many positive purposes, but they can be used for ill as well.

I'd like to pivot briefly to discuss the online harms act, which is, of course, before Parliament right now and which I think offers a good approach to dealing with one part of the distribution challenge with regard to social media platforms. These are platforms that take content generated by individuals and make them available to a large number of people. I think the framework of this law is quite sensible in that it creates “a duty to act responsibly”, which gets to the systemic problem of how platforms curate and moderate content. The idea here is to reduce the risk that this kind of content does get distributed on these platforms.

The bill is, in my view, well designed, in that there's also a duty to remove content, especially child sexual exploitation material and non-consensual distribution of intimate imagery, to the extent that platforms' own moderation efforts or user reports flag that content as being unlawful. This is a very sensible approach that I think is very compliant with the charter in its broad strokes.

The challenge, however, is with the effectiveness of these laws. It's very hard to determine before the fact how effective these are, because of issues with determining both the numerator and the denominator. I don't want to take us too much into mathematical territory, but it's very hard for us to measure the prevalence of this content online or on any given platform. It's just hard to identify, in part because the legality—or not—of the content depends on the conditions in which it's distributed. Then, on the numerator, which is how well the platforms are doing the job of getting it off, again, we have issues with identifying what's in and what's out. This is a step forward, but the bill has limitations.

One way of understanding the limitations is with an analogy that a friend of mine, Peter Swire, who teaches at Georgia Tech, calls the problem of “elephants and mice”. There are some elephants in the room, which are large, powerful and visible actors. These are your Metas and your TikToks, or even a company like Pornhub, which has a very large and significant presence. These are players that can't hide from the law, but what is difficult in this space is that there are many mice. Mice are small, they're furtive and they reproduce very quickly. They move around in darkness. This law is going to be very difficult to implement with regard to those kinds of actors, the ones that we find on the darker corners of the Internet.

Again, I think Bill C-63 is a very—

5:05 p.m.

Liberal

The Chair Liberal Hedy Fry

I'm sorry to interrupt. You have 26 seconds to wrap up.

5:05 p.m.

Vivek Krishnamurthy  Associate Professor of Law, University of Colorado Law School, As an Individual

Very well.

The only thing I will say to conclude is that Bill C-63 does not deal with messaging software, with things like WhatsApp, which are a primary vector by which this kind of content moves. I think that is a good call, because of the difficulty in doing so. It's something that requires further study, a lot of work and a lot of thought on dealing with that particular piece of the distribution problem.

Thank you, Madam Chair.

5:05 p.m.

Liberal

The Chair Liberal Hedy Fry

Thank you very much.

I'm going to the next person, who is Ms. Emily Laidlaw. Before you begin, I'll let you know that I will give you a 30-second shout-out, so that you can start to wrap up. If you miss some of your presentation, you can elaborate during the question-and-answer period. Thank you.

It's over to Ms. Laidlaw for five minutes, please.

5:10 p.m.

Dr. Emily Laidlaw Associate Professor and Canada Research Chair in Cybersecurity Law, University of Calgary, As an Individual

Thank you for inviting me.

With my time, I'm going to focus on social media regulation and on Bills C-63 and S-210.

Social media has historically been lightly regulated. Online safety has been addressed only when companies felt like it or were pressured by the market. There have been some innovative solutions, and we need them to continue to innovate, but safety has generally taken a back seat to other interests.

Social media companies have also privately set rules for freedom of expression, privacy and children's rights. There are no minimum standards and no ways to hold companies accountable. That is changing globally. Many jurisdictions have passed online harms legislation. The online harms act, which is part of Bill C-63, aligns with global approaches. In my view, with tweaks, Bill C-63 is the number one avenue to address illegal sexually explicit content and sexual exploitation.

Bill S-210 would mandate age verification to access sites with sexually explicit material. It is a flawed bill, yes, but more importantly, it is unnecessary for two reasons.

First, age verification is the crucial next frontier of online safety, but it is about more than sexually explicit material; it is about child safety broadly. The technology is evolving, and if we are committed to freedom of expression, privacy and cybersecurity, how this technology is used must be scrutinized closely.

Second, age verification is only one tool in the tool box. A holistic approach is needed whereby safety is considered in product design, content moderation systems and algorithms. Let me give you a few examples of safety by design that do not involve age verification.

Child luring and sextortion rates are rising. What steps could social media take? Flag unusual friend requests from strangers and people in distant locations. Remove network expansion prompts whereby friends are recommended based on location and interest. Provide easy-to-use complaints mechanisms. Provide user empowerment tools, like blocking accounts.

The non-consensual disclosure of intimate images and child sexual abuse material requires immediate action. Does the social media service offer quick takedown mechanisms? Does it follow through with them? Does it flag synthetic media like deepfakes? How usable are the complaints mechanisms?

For example, Discord has been used to livestream child sexual exploitation content. The Australian e-safety commissioner reported that Discord does not enable in-service reporting of livestreamed abuse. This is an easy fix.

The last example is that the Canadian Centre for Child Protection offers a tool to industry, called Project Arachnid, to proactively detect child sexual abuse material. Should social media companies be using this to detect and remove content?

In my view, Bill C-63, again with tweaks, is the best avenue to address sexual exploitation generally. I think the focus should be on how to improve that bill. There are many reasons for that. I'll give two here.

First, the bill imposes three different types of responsibility. Vivek discussed this. Notably, the strongest obligation is the power of the commissioner to order the removal of child sexual abuse content and non-consensual disclosure of intimate images. This recognizes the need for the swift removal of the worst kinds of content.

Second, all of this would be overseen by a digital safety commission, ombudsperson and office. Courts are never going to be fast to resolve the kinds of disputes here, and they're costly. The power of the commissioner to order the removal of the worst forms of content is crucial to providing access to justice.

Courts are just ill-suited to oversee safety by design as well, which is necessarily an iterative process between the commission and companies. The tech evolves, and so do the harm and the solutions.

With my remaining time, I want to flag one challenge before I close, which Vivek mentioned as well. That is private messaging. Bill C-63 does not tackle private messaging. This is a logical decision; otherwise, it opens a can of worms.

Many of the harms explored here happen on private messaging. The key here is not to undermine privacy and cybersecurity protections. One way to bring private messaging into the bill and avoid undermining these protections is to impose safety obligations on the things that surround private messaging. I've mentioned many, such as complaints mechanisms, suspicious friend requests and so on.

Thank you for your time. I welcome questions.

5:15 p.m.

Liberal

The Chair Liberal Hedy Fry

Thank you very much.

Now I go to Carol Todd from the Amanda Todd Legacy Society for five minutes, please.

5:15 p.m.

Carol Todd Founder and Mother, Amanda Todd Legacy Society

I'd like to thank the committee for inviting me to speak. It's an honour to be able to share knowledge.

I'm not coming as a researcher or someone who has studied this. I'm coming as a mom, and I'm coming as a parent and as an educator with lived experience, so confining my conversation to five minutes was difficult. I've written some notes that I will read until my time is up, and I do welcome questions at the end.

I have spent the last 12 years, I guess, looking at and learning about sexual exploitation and online behaviours, and it is really hard to imagine the horrid things that are happening out there to our children. As a side note, I believe that Bill C-63 needs to be passed with some tweaks, because it is the safety net for our children and Canadians online.

This subject holds significant importance and warrants ongoing dialogue to tackle not just the ease of access to such material but also the profound harm that can be inflicted upon those who encounter sexually explicit content every day.

I am Carol Todd, widely known as Amanda Todd's mother. In addition, I am an educator in a British Columbia school district with my work primarily centred on digital literacy, online safety and child abuse prevention with a focus on exploitation and sextortion.

Empowering students, teachers and families with the knowledge and skills to navigate the digital world safely is essential, important and now a passion of mine. I will continue to talk forever about how we can keep families and children safe, because this is what we needed for my daughter, and it came a bit too late.

Amanda tragically took her life on October 10, 2012, following extensive online exploitation, tormenting harassment and cyber-abuse. Her story very much relates to what happens when there is creation, possession and distribution of sexually explicit material online and how easily others can access it as it becomes embedded online forever.

Amanda's story garnered global attention after her tragic death. To reclaim her voice while she was alive, Amanda created a video that she shared on YouTube five weeks before her passing. It has been viewed 50 million times worldwide and is now used as a learning tool for others to start the discussion and for students to learn more about what happened to her and why it's so important that we continue to talk about online safety, exploitation and sextortion.

As another side note, it has taken forever for us to catch up on the conversation of exploitation and sextortion. It was something that no one was able to talk about 12 years ago, in 2012. It has evolved because of the increase of exploitation and sextortion online, not only happening to young girls, young boys and young adults but men and women. The nefarious offenders online, because they've gotten away with it due to so many levels of the Internet these days, have increased in numbers and have caused much trauma and much harm, as this is a form of abuse and violence.

Over the past decade, we've observed rapid changes in the technology landscape. Technology used to be primarily a communication tool for email, and now we have seen the evolution of applications for fun. They were explained as safe, but now we know differently, because they have increased the chaos, concern and undesirable behaviours online for Canadians and for all.

This isn't just a Canadian problem. It's a global problem, and I have watched other countries create legislation, laws and safety commissions, just as Canada, with Bill C-63, now wants an e-safety commissioner board, and I think this is a brilliant idea. For anyone here who gets to vote, I hope that it does pass.

The prevalence of sexually explicit material has markedly increased—

5:20 p.m.

Liberal

The Chair Liberal Hedy Fry

Can you wrap up, please, Ms. Todd?