Evidence of meeting #124 for Canadian Heritage in the 44th Parliament, 1st Session. (The original version is on Parliament’s site, as are the minutes.)

A recording is available from Parliament.

Also speaking

Vivek Krishnamurthy  Associate Professor of Law, University of Colorado Law School, As an Individual
Emily Laidlaw  Associate Professor and Canada Research Chair in Cybersecurity Law, University of Calgary, As an Individual
Carol Todd  Founder and Mother, Amanda Todd Legacy Society
Clerk of the Committee  Ms. Geneviève Desjardins
Dianne Lalonde  Research and Knowledge Mobilization Specialist, Centre for Research and Education on Violence Against Women and Children
Jocelyn Monsma Selby  Clinical therapist, Researcher Specialising in Forensic Sexology and Addiction, and Chair, Connecting to Protect
Marc-Antoine Vachon  Lieutenant, Sûreté du Québec

5:20 p.m.

Liberal

Michael Coteau Liberal Don Valley East, ON

Can I make a suggestion, Chair, that we extend the witness's testimony? Can we just give her two or three more minutes, if that's okay with everyone?

Thank you.

5:20 p.m.

Liberal

The Chair Liberal Hedy Fry

Yes, all right. I see the lights flashing, so the bells have started.

The committee wishes to give you two more minutes, Ms. Todd.

5:20 p.m.

Founder and Mother, Amanda Todd Legacy Society

Carol Todd

Thank you.

The prevalence of sexually explicit material has increased due to the widespread use of the Internet. It manifests in various forms, including visual representations, photos, videos, films, written content, audio recordings and print material. The volume grows exponentially day by day. The protection that we have for our children and for our adults isn't there on the Internet. Big tech companies need to take responsibility. I know that throughout the world now, there are more and more lawsuits where big tech companies are being held responsible.

Some of the challenges we face with sexually explicit material include access to violent and explicit content that can impact sexual attitudes and behaviours, harm to children through the creation, sharing and viewing of sexual abuse material, and increased violence against women and girls, as well as sex trafficking. It can also influence men's views on women and relationships.

In my notes, I comment that we often stereotype that it is men who are violating others, but the offenders can be men and they can be women. They can also be other children—peer-to-peer violence. There is no one set rule on who is creating and causing the harm, but we know that those who become traumatized and victimized can be anyone.

What more needs to be done? I'll just go through this quickly.

As an educator, I feel strongly that increasing education is crucial. That awareness and education need to reach our children, our young adults and our families.

We need stronger regulations and laws. Bill C-63 is one of them. I know that in the province of B.C., more legislation has already been passed and is in place.

We need to improve our online platforms and make them accountable. We need to increase parental controls and monitoring, and we need to encourage reporting.

We also need to promote positive online behaviours. Social emotional learning and social responsibility are part of the awareness and education that need to follow.

We need to be a voice. We need to stand up, and we also need to do more.

Thank you for the time, and I encourage questions so that I can finish reading my notes.

Thank you.

5:20 p.m.

Liberal

The Chair Liberal Hedy Fry

Thank you, Ms. Todd. Thank you very much.

5:20 p.m.

Conservative

Philip Lawrence Conservative Northumberland—Peterborough South, ON

I'm sorry. Can I just have a brief—

5:20 p.m.

Liberal

The Chair Liberal Hedy Fry

The bells are going, I think.

5:20 p.m.

Conservative

Philip Lawrence Conservative Northumberland—Peterborough South, ON

We're going until 5:30 or five minutes—

5:20 p.m.

Liberal

The Chair Liberal Hedy Fry

Excuse me, Mr. Lawrence. I just said that the bells are going, so I need to check how much time we have to go before we get to our five minutes before we vote.

Go ahead, Mr. Lawrence.

5:20 p.m.

Conservative

Philip Lawrence Conservative Northumberland—Peterborough South, ON

First of all, I just want to commend all the witnesses for being here. I'm wondering, given their courage and the unbelievable testimony that they're bringing, if there is any way we could extend to seven o'clock tonight.

5:20 p.m.

Liberal

The Chair Liberal Hedy Fry

I don't think we have the resources to go beyond 6:30. I'm sorry. I was told that at the beginning of the meeting.

Mrs. Thomas.

5:20 p.m.

Conservative

Rachael Thomas Conservative Lethbridge, AB

Thank you.

Could you just check with the clerk? My understanding is that we have resources that actually could take us potentially to eight o'clock.

5:20 p.m.

Liberal

The Chair Liberal Hedy Fry

I'm sorry. I was told by the powers that be that we only have until 6:30 as a hard stop.

5:20 p.m.

Conservative

Rachael Thomas Conservative Lethbridge, AB

If you could confer with the clerk, that would be great.

5:20 p.m.

Liberal

The Chair Liberal Hedy Fry

Clerk, do we have 6:30 as a hard stop?

5:20 p.m.

The Clerk of the Committee Ms. Geneviève Desjardins

The hard stop we were given was 6:30 p.m. However, if the committee wishes, I can request additional time. I can't guarantee it will be accepted, but we can ask the resources if they have availability.

5:25 p.m.

Liberal

The Chair Liberal Hedy Fry

Thank you.

How many minutes do we have before the vote? Is somebody keeping track?

5:25 p.m.

The Clerk

We have 25 minutes.

5:25 p.m.

Liberal

The Chair Liberal Hedy Fry

We will go for another 20 minutes.

I will go to Dianne Lalonde from the Centre for Research and Education on Violence Against Women and Children for five minutes, please.

5:25 p.m.

Dianne Lalonde Research and Knowledge Mobilization Specialist, Centre for Research and Education on Violence Against Women and Children

Thank you for inviting me into this space.

My perspective is informed by my work with survivors in the gender-based violence sector, and I will focus on the need for a gender-based analysis when we're talking about online harms and legislation.

Specifically, I'm going to focus on two online harms—the non-consensual distribution of intimate images, which I refer to as NCIID, and then also deepfake sexual abuse—although I'm happy to speak more to further forms that haven't been necessarily brought forth as much, such as cyberflashing.

Each of these forms of violence is increasing in the Canadian context. They target marginalized individuals, and they produce gendered and intersectional harms. With the non-consensual distribution of intimate images, the violence occurs when individuals have their intimate content taken, even from their private computers, and posted online....

People do so out of a variety of motivations, many of which link into other forms of violence. They do so to control, monitor and harass a current or past intimate partner. As well, we see young boys in particular doing so because of social pressures they face relating to traditional masculinity and expectations around sexual experience—that they should have this experience and that they should be promoting it.

We have also seen NCIID used as a tactic to advertise, recruit and maintain control over individuals who experience sex trafficking. NCIID does disproportionately target women. Of the 295 Canadian cases of NCIID reported to police by adults in 2016, 92% were reported by women. Police-reported incidents from 2015 to 2020 involving youth aged 12 to 17 again found girls overrepresented as targets, at 86%, compared with boys at 11%.

Unfortunately, we are lacking intersectional Canadian data, but national studies in the United States and Australia show that NCIID also disproportionately targets Black, indigenous and 2SLGBTQ2IA+ individuals, and people with disabilities.

We see very much the same targeting when we're talking about deepfake sexual abuse. Many of these applications and technologies only work on women's and girls' bodies. A 2023 study of 95,000 deepfake videos found that 98% were sexually explicit and, of those, 99% targeted women.

When we're talking about the impacts, as you can imagine they are vast. They are emotional, economic, physical and social. Survivors have likened these forms of violence to additional forms of sexual violence wherein their autonomy is denied. They have also shared that one thing that's distinct about online harms is the way in which the harm becomes crowdsourced, and people are sharing this violent experience.

Technology-facilitated violence impacts different groups in qualitatively specific and intersecting ways. For instance, sexual double standards mean that women, in comparison to men, are more likely to be blamed, discredited and stigmatized because of sexual imagery online. 2SLGBTQ2IA+ individuals have identified that NCIID has been used as a tool to “out” their sexual orientation and gender identity. Finally, deepfake sexual abuse also impacts sex workers, especially women, who have their likenesses stolen and used to inflict violence and who then face stigma and criminalization in response.

In terms of ways to address this harm, much of the focus in legislation has been on the regulation and removal of content, and that is absolutely essential. We also need to recognize the people this is impacting, the survivors, and whom survivors are turning to. They are going to gender-based violence services in order to cope with and heal from these harms. An added dimension in addressing online harms is making sure we're supporting the gender-based violence agencies that are doing the work to support survivors and that already have robust sex education programs.

Some of this work is also outlined in the national action plan to end gender-based violence.

As well, I want to echo Carol Todd's remarks about the importance of consent-based education, especially when we're talking about deepfake sexual abuse. Sometimes there's not an understanding of it as a form of harm, so we need to have education in schools and in society that is sex-positive and trauma-informed, to share that this is a form of violence and also to fight against victim blaming.

Thank you.

5:30 p.m.

Liberal

The Chair Liberal Hedy Fry

Thank you.

You can elaborate, as I said to all witnesses, during the Q and A period.

We now go to Dr. Selby for five minutes, please.

5:30 p.m.

Dr. Jocelyn Monsma Selby Clinical therapist, Researcher Specialising in Forensic Sexology and Addiction, and Chair, Connecting to Protect

Thank you for the opportunity to be here today.

My submission to you comes from 43 years of clinical practice and research and from chairing Connecting to Protect's global summit in 2022, which involved 23 countries addressing harms stemming from children accessing pornography online.

My experience links me directly to the consequences of childhood access to online pornography, which results in problematic sexual behaviour, including difficulties in conducting relationships, destruction of the family and, in more extreme cases, criminal behaviour. Access to pornography by children who are unable to process and understand the material is like a gateway drug, setting up future abuse and all the attendant consequences.

For the last 13 years, I've treated individuals with compulsive sexual behaviour disorder and individuals who've been accessing child sexual exploitation material online. We are facing a global epidemic of online child sexual abuse and exploitation as a result of unregulated access to the Internet. We're getting it wrong and we're missing the mark in protecting children.

My colleague and I have outlined in detail what we consider to be the proposed solution in our brief for Bill S-210. We simply advocate shifting the narrative from the focus on age verification to a broader consideration of age assurance options, in conjunction with device-level controls operating at the point of online access through Google, Apple or Microsoft. This approach is technologically possible and relatively quick to implement, with far greater reach and effectiveness. Device-level controls coupled with a multi-dimensional public health approach are needed, including the implementation of protective legislation and policy.

Sadly, sexual exploitation is happening right now in Canada, feeding the production of illegal sexually explicit material online. Cybertip.ca receives millions of reports of child sexual exploitation material yearly, while 39% of luring attempts reported to Cybertip.ca in the last several years involved victims under 13 years of age. Globally, from 2020 to 2022, WeProtect's global threat assessment—and I hope you're sitting down for this—found a 360% increase in self-generated sexual imagery of seven to 10-year-olds.

How does this happen? It is wrong on so many levels. There is not a child protection expert on the planet who agrees that this is okay. It's child sexual abuse via digital images.

The harms to children due to accessing legal and illegal sexually explicit material online include trauma, exploitation, self-produced sexual images, child-on-child abuse, objectification, violence, risky sexual behaviours, depression, difficulties in forming and maintaining close relationships, anxiety disorder, panic attacks, PTSD and complex PTSD symptoms, among others. Potential health issues and addiction carry on into adulthood, causing documented long-term mental health consequences that impact personal and family relationships and the very fabric of our society, unless there is early identification and treatment of the problem.

You might be wondering how certain individuals are vulnerable to developing a problem like this or a compulsive sexual behaviour disorder. It almost always involves access to legal sexually explicit material online at an early age. The average age of exposure is 12 years old.

I want to talk to you about the erototoxic implications of sexually explicit material online. We know we need to do something—

5:35 p.m.

Liberal

The Chair Liberal Hedy Fry

Dr. Selby, you only have 16 seconds left. I'm so sorry. You can elaborate when we get to questions and answers, and you can expand—

5:35 p.m.

Conservative

Philip Lawrence Conservative Northumberland—Peterborough South, ON

I think we have unanimous consent—

5:35 p.m.

Liberal

The Chair Liberal Hedy Fry

—on some of the things you want to say.

5:35 p.m.

Conservative

Philip Lawrence Conservative Northumberland—Peterborough South, ON

—to give her another minute.