Evidence of meeting #121 for Justice and Human Rights in the 44th Parliament, 1st Session. (The original version is on Parliament's site, as are the minutes.)

A recording is available from Parliament.

2:30 p.m.

Conservative

Damien Kurek Conservative Battle River—Crowfoot, AB

Thank you very much, Madam Chair.

I look forward to continuing to hear what Mr. Brock has to say and to hear the specificity with which he is addressing the very important issues related to Bill C-270 and the motion that Mr. Maloney moved, which is actually on the agenda.

I would just note, Madam Chair, that issues of debate are continually being introduced into the conversation at hand, in particular by members from the Liberal side, in departure from the Standing Orders. I would simply suggest that those members put their names on the speaking list. I look forward to hearing from them when their names come up on the speaking list.

I would just ask that you outline again for the committee who exactly is on that speaking list. I know there's been a bit of discussion, with people going back and forth, and I know there was some discussion around a member who is present, although he is not a regular member of the committee. Perhaps we could have some clarity on that. I know you appreciate and respect having clarity and acting with precision, which is key for the smooth functioning of these parliamentary proceedings.

The Chair Liberal Lena Metlege Diab

Thank you, Mr. Kurek. I'll come back to you.

I have Madam Dhillon. She's been very patient.

Thank you very much for your support.

The floor is yours, Ms. Dhillon.

Anju Dhillon Liberal Dorval—Lachine—LaSalle, QC

Thank you, Madam Chair.

I just wanted to raise that I've been noticing for the last two hours that the Conservative members are talking down to you. They are talking aggressively to you and are making insinuations about how you are managing the committee. Under the guise of polite suggestions, they are saying things like, “may we suggest”, “may we do this”, “may we tell you how to do that”.

2:35 p.m.

Conservative

Jamil Jivani Conservative Durham, ON

We're polite.

Anju Dhillon Liberal Dorval—Lachine—LaSalle, QC

This is very condescending towards you. You pointed it out, but I was going to before you did, because it's really disturbing to see that a bunch of male Conservative colleagues are telling you how to manage the committee.

Shame on you. I am sick and tired of hearing this. Yes, you are surrounded only by males.

I'm sorry if I'm getting loud. I'm sorry to the interpreters.

This needs to stop. If we're going to do this for the next nine hours, bring it on. Who cares?

Madam Chair, I don't want to hear anyone speak condescendingly towards you.

2:35 p.m.

Conservative

Jamil Jivani Conservative Durham, ON

That's disrespectful.

2:35 p.m.

Conservative

Larry Brock Conservative Brantford—Brant, ON

Have her apologize.

Anju Dhillon Liberal Dorval—Lachine—LaSalle, QC

Points of order are points of order. It's up to you to qualify them, Madam Chair, but you're not even being allowed to listen to them.

I wish you lots of courage, Madam Chair.

The Chair Liberal Lena Metlege Diab

Thank you, Madam Dhillon, and I wish you a good recovery. I know you're not feeling well this week.

Are we finished with the points of order now?

I do appreciate everybody trying to work together again so that we can continue with the meeting.

Why don't we take a two-minute break?

Thank you. We'll suspend for a few minutes.

The Chair Liberal Lena Metlege Diab

Everybody, welcome back to our meeting.

I just want to inform you that, due to a number of events that are happening, I'm suspending for the day.

[The meeting was suspended at 2:45 p.m., Friday, November 8]

[The meeting resumed at 3:48 p.m., Monday, November 18]

The Chair Liberal Lena Metlege Diab

We are now back in session.

Good morning, everyone.

I will ask all in-person participants to read the guidelines written on the updated cards on the table, as a refresher. These measures are in place to help prevent audio feedback incidents.

This is to protect the health and safety of all participants, including interpreters.

You will also notice a QR code on the card, which links to a short awareness video.

I remind you that this is the continuation of meeting 121 of the Standing Committee on Justice and Human Rights.

The committee is meeting in public to continue its study of Bill C-270, an act to amend the Criminal Code regarding pornographic material. We are here in public to resume debate on the motion by James Maloney, which requests an extension of 30 sitting days to the period of committee consideration of Bill C-270 before reporting the bill back.

I am now ready to give the floor to members wishing to speak. I'm going to start a new list, because I'm not sure who ended last time.

Was it you, Mr. Brock?

2:45 p.m.

Conservative

Larry Brock Conservative Brantford—Brant, ON

It was.

The Chair Liberal Lena Metlege Diab

Okay. The floor is yours.

2:45 p.m.

Conservative

Larry Brock Conservative Brantford—Brant, ON

Thank you very much, Madam Chair.

Before I continue my remarks, Madam Chair, could I ask you to refresh our collective memories as to who currently is on that list, besides me?

The Chair Liberal Lena Metlege Diab

I know you are on it, because you are speaking. The floor is yours.

In terms of anyone else, members were going in and out. To be quite frank, I don't believe the list from 10 days ago exists. I don't have one, so I'm putting down names.

I have Mr. Bittle, Ms. Ferreri, Mr. Jivani and Mr. Van Popta, so far.

That's right, you were also on the list, Mr. Fortin. I'm sorry I forgot about you.

Rhéal Fortin Bloc Rivière-du-Nord, QC

I was indeed on the list, Madam Chair, but I must confess that I don't remember what I wanted to tell you, so please forget me.

The Chair Liberal Lena Metlege Diab

I remember very well now that you were on the list, after Mr. Brock.

That one I remember very clearly, because you were patiently waiting last time.

It's Mr. Brock, Mr. Fortin, Mr. Bittle, Madam Ferreri, Mr. Jivani and Mr. Van Popta.

Okay?

Thank you.

2:45 p.m.

Conservative

Larry Brock Conservative Brantford—Brant, ON

Thank you, Madam Chair.

I welcome back all colleagues after our constituency week. I hope we all had some rest. I know most of us, if not all of us, usually have schedules chock full of activities in our ridings. I was no exception to that, so it's good to be back, and it's good to be back to continue our discussion on Bill C-270.

Where I left off was providing the voice of our colleague Arnold Viersen. Clearly, there were certain members of the Liberal Party who were so eager to hear from him, but at the same time, they were not hiding the fact that they had ulterior motives in hearing from Mr. Viersen: to fully cross-examine him on his personal views.

I might reiterate, just as I started off my last intervention, how disappointing and, quite frankly, shameful the actions being taken by certain Liberal members are in voicing their ulterior motives. This is because, as I indicated at the outset, weeks have now passed since a list of key stakeholder witnesses who wanted to participate in this debate was submitted not only to the clerk, but also to you, Madam Chair, with a recommendation that the last couple of meetings be set aside to hear from witnesses, as opposed to demanding that the sponsor of the bill, Arnold Viersen, attend and speak to the matter first.

In fact, if the schedule had been adhered to, today would have been set aside for clause-by-clause consideration after we had heard from those stakeholders, who definitely want to weigh in and add their voices to this discussion. It's shameful that political gamesmanship has been resorted to instead of dealing with the substance of Bill C-270, which would stop the Internet sexual exploitation of the most vulnerable members of our community.

Continuing my train of thought of providing voices to this discussion, I want to return to one church group, the Evangelical Fellowship of Canada, which has submitted a brief that I wish to read into the record at this time. It is entitled, “Submission to the Standing Committee on Justice and Human Rights on Bill C-270”, and it is dated November 5, 2024. It reads:

The Evangelical Fellowship of Canada (EFC) appreciates the opportunity to participate in the committee’s review of Bill C-270. We believe it’s crucial for Parliament to require pornography platforms to ensure that child sexual abuse materials and intimate images shared without consent are not uploaded to their sites. It is evident many of these platforms will not take such measures unless required to and held accountable for doing so.

The acronym for Evangelical Fellowship of Canada is EFC.

The EFC is the national association of evangelical Christians in Canada. Established in 1964, the EFC provides a constructive voice for biblical principles in life and society and a forum for engagement and collaboration for the roughly 2.2 million Evangelicals who are part of our constituency.

Our approach to this issue is based on the biblical principles of respect for human life and dignity, justice and care for those who are vulnerable. These principles are also reflected in Canadian law and public policy.

Under the heading of “The impact of posted images”, it reads:

There are devastating, lifelong consequences for those whose images are uploaded and distributed online. Children and youth face severe and extensive impacts when images of their abuse and exploitation are streamed and distributed.

In its 2021 hearings on the protection of privacy and reputation on platforms such as Pornhub, the Ethics Committee heard harrowing testimony from survivors whose intimate images, including images of abuse, had been posted on pornography platforms without their knowledge or consent. Some of the witnesses whose images had been posted on Pornhub were as young as 13 years old at the time the images were taken.

One young woman told the Ethics Committee how she was pressured to send the boy she liked an intimate video of herself when she was [only] in Grade 7. She then discovered the video had been uploaded to pornography sites. This video has been viewed millions of times. This young woman dropped out of school and her social circle, became homeless, fearful, anxious and suicidal.

Madam Chair, I want to pause for a moment. I want to reflect on my former career, when I prosecuted matters such as this, particularly those dealing with the possession, distribution and making of child pornography images. A point the experts unanimously agreed on, in unison with all of the victims I had the privilege of working with and assisting in the prosecution of these matters, is that they are a special class of victim.

They are unlike victims of sexual assault, which is horrendous in its very nature. They are unlike victims of a personal injury offence. Again, this could have lifelong implications for those victims. By and large, those two classes of victim are victimized once, with long-term—sometimes lifetime—consequences. The difference with victims in this particular area of the law is this: Each and every time their image is viewed, uploaded, saved and shared, they are revictimized. It's over and over again. As my esteemed colleague Mr. Van Popta eloquently put it, once an image hits the internet, there are limited means by which you can take it down. What you can't do is stop the purveyors of this filth from resharing those images on the Internet. That's why these victims hold a special place in my heart.

In this particular case, in reference to this 13-year-old girl, imagine the legacy she is going to carry for the rest of her life because she trusted a boy and shared an image. It is disgusting.

I'm going back to the report. It says:

One witness told of her discovery that her partner had taken videos and pictures of her without her knowledge or consent which were then posted on Pornhub. She described the destructive impact on her life, emotional trauma, suicidality and the toll on her health and employment.

Another witness told the Ethics Committee about discovering a video of herself on Pornhub in which she was unconscious, with a tag that said “sleeping pills.”

The viewers, rather than being turned away by sexual assault videos, were actively searching out that content. The tags made this possible, and they knew what they were watching before they clicked. It is a profound betrayal to know that thousands of men saw your assault and not only did nothing to flag it but actively sought it out and enjoyed it.... This video is not a one-off that slipped through a filter. Sexual assault is not an anomaly on the porn sites; it is a genre. This leaves little incentive for these sites to moderate such content.

These are real people in vulnerable moments who shared with parliamentarians the devastating impacts of their abuse and intimate images being shared online.

In each of these cases, the victims found the platform either unresponsive or slow to respond to their requests to have their images taken down.

Once a person's intimate images or images of their abuse or exploitation are uploaded, what happens to those images is beyond their control. They may be downloaded, shared or reposted countless times. A report by the Office of the Privacy Commissioner of Canada in February [of this year] told of a professional take-down service that found 700 copies of one person's intimate images on more than 80 websites. The report noted the devastating effects on employment, social network and mental health.

Once these images are online it is nearly impossible to have them permanently removed. In a report by the Canadian Centre for Child Protection, survivors of recorded child sexual abuse indicated that the imagery impacted them in a different way than the initial abuse. “The information shared by the respondents to this survey makes it clear that the recording of abuse and its distribution adds an extraordinary layer of trauma for a victim”.... Survivors describe feeling powerless to stop the destruction of the images. It is ongoing trauma.

Then we have under the heading, “Scope of the Problem”:

Child sexual abuse material (CSAM) online

Over 20 million suspected images of child sexual abuse were triggered for review by the Canadian Centre for Child Protection's web crawler between 2017 and 2020.

According to Statistics Canada, 15,630 incidents of online sexual offences against children and 45,816 incidents of online child sexual abuse material were reported by police from 2014 to 2022.

Studies show that prepubescent children are at the greatest risk of being depicted in CSAM and 84.2% of these videos and images contain severe abuse.

Approximately one million reports of child sexual exploitation are received by the National [U.S.] Center for Missing and Exploited Children...CyberTipLine each month. The hotline has received, in total, more than 45 million reports.

That's just the United States.

The report continues:

Lianna McDonald, executive director of the Canadian Centre for Child Protection, described a “tsunami” of victims coming to organizations like theirs for help to get their images removed from the internet.

Non-Consensual Distribution of Intimate Images (NCDII)

Police-reported Canadian data indicate 896 cases of NCDII [have been] reported in 2022 [alone].

In police-reported incidents of NCDII, youth aged 12 to 17 years accounted for almost all (97%) victims, with the large majority (86%) of victims being girls.

NCDII may include:

-images which are recorded without consent, including images of sexual assault or rape (no consent to sexual activity, e.g., drugged or sleeping individuals) or of a person's exploitation, and then distributed; or

-images which were recorded with consent, but where no consent was given to their sharing or distribution.

The 896 police-reported cases of non-consensual distribution of intimate images in 2022 are likely a fraction of the incidents of NCDII. These numbers only reflect the images that have been discovered and reported to the police.

It begs the question:

How many Canadian women and teens don't yet know their images have been posted without their knowledge or consent, or who to approach for help if they do?

One can only imagine, on this committee, the staggering numbers that really exist in this particular area.

The report continues:

As Canada's Privacy Commissioner notes in his report, “Investigation into Aylo (formerly MindGeek)'s Compliance with PIPEDA”, Canadian adults who are the victims of NCDII face a variety of risks:

Individuals who have had their intimate content disclosed without their consent have experienced severe consequences including reputational, financial and emotional harm. These harms can come in the form of targeted harassment that occurs online or in person, loss of job opportunities and mental health impacts up to and including suicide.

One study found that young women who have experienced NCDII “revealed declines in overall mental health, anxiety, depression, post-traumatic stress, suicidal [ideation], increased alcohol and drug consumption, and low self-esteem and confidence.” Victims of NCDII also face ongoing trauma and an ongoing violation of their privacy as they live with the permanence of their intimate images on the Internet.

The following is under the heading “Generative AI”:

A new and escalating threat is the use of AI technology to generate child sexual abuse materials depicting either real or fictional children, and intimate images or pornography made of a person. “According to one study, more than 96% of AI generated pornography was produced without the consent of those featured in it....” The use of images created through AI harasses, harms and humiliates victims, like all CSAM and NCDII. We need urgent action to develop legislation that protects victims of all ages from generative AI and deepfake pornography.

A study by University of Toronto professors notes that Canada is one of the countries that has not yet taken meaningful action on this front. It also states, “These manipulations thrive in the pornography industry, where women's faces are superimposed onto others' bodies to create video illusions, resulting in non-consensual sexual image abuse and other harm.” The study's authors go on to say, “The sheer volume of CSAM that can be generated and distributed using AI tools, a number that is growing exponentially every year, far exceeds the existing capacities, resources, and abilities of law enforcement organizations, NGOs, platforms, moderators and tech companies to respond to, investigate, and address.”

Next we have under the heading, “The urgent need to act”:

Commercial pornography sites must be held responsible to ensure exploitive and non-consensual images are not uploaded in the first place.

The onus must not be on children and youth to monitor commercial pornography sites to ensure that depictions of their abuse and exploitation are not posted or, if discovered, to ensure they are swiftly removed. The onus must not be on victims of non-consensual uploads to watch for their content and ensure it is removed.

Companies must be responsible for ensuring that the content they host and profit from is not child sexual abuse material, that the people depicted in images or videos are not minors, and that they consent to their image being posted.

Bill C-270 would prevent illegal content from being uploaded in the first place. This is essential, as once the images or video are uploaded—

—as I've mentioned already—

—it is nearly impossible to control their circulation and remove them.

Testimony to the Ethics Committee and the report by the Office of the Privacy Commissioner both describe the extensive spread of such images to other platforms and the extreme difficulty in having images removed once posted. As we noted above, the Privacy Commissioner’s report told of a professional take-down service that found 700 copies of one person’s intimate images on more than 80 websites.

By requiring that the age and consent of every person depicted in sexually explicit material be verified before it is posted online, Bill C-270 puts the responsibility where it belongs.

Bill C-270 would fulfill the second recommendation in the Ethics Committee report, Ensuring the Protection of Privacy and Reputation on Platforms such as Pornhub.

We note and recommend to this committee the Privacy Commissioner’s recommendations to Aylo...as a template of what should be required of all those who create pornography for a commercial purpose. The Privacy Commissioner recommended that

the company: (i) cease allowing the upload of intimate content without first obtaining meaningful consent directly from each individual appearing in that content; (ii) delete all content that it previously collected without obtaining such consent; and (iii) implement a privacy management program to ensure that it is accountable for information under its control.

Canada’s legal frameworks must require verification of the age and consent of all individuals depicted in sexually explicit content created or hosted for a commercial purpose. This framework must also include AI-generated content.

The current version of Bill—

2:45 p.m.

Conservative

Michelle Ferreri Conservative Peterborough—Kawartha, ON

On a point of order, Madam Chair, the member opposite is heckling and mimicking everything my colleague is saying. It's hard to hear my colleague. I can't concentrate.

It's unnecessary. He doesn't have the floor.

Chris Bittle Liberal St. Catharines, ON

On the same point of order, I'm definitely not heckling. I'm just reading along word for word what—

2:45 p.m.

Conservative

Michelle Ferreri Conservative Peterborough—Kawartha, ON

You don't have the floor.

Chris Bittle Liberal St. Catharines, ON

I'm just reading along. I wasn't heckling, just to point that out—

The Chair Liberal Lena Metlege Diab

Okay. No, it—

2:45 p.m.

Conservative

Jamil Jivani Conservative Durham, ON

Maybe he forgot he's on Zoom. He can't just chime in any time he feels like it.

Chris Bittle Liberal St. Catharines, ON

On that point of order, it's very bizarre to heckle me and say I don't have the floor when everyone else is now yelling at me—