Stopping Internet Sexual Exploitation Act

An Act to amend the Criminal Code (pornographic material)

Sponsor

Arnold Viersen Conservative

Introduced as a private member’s bill. (These don’t often become law.)

Status

Report stage (House), as of Nov. 19, 2024

Summary

This is from the published bill.

This enactment amends the Criminal Code to prohibit a person from making, distributing or advertising pornographic material for commercial purposes without having first ascertained that, at the time the material was made, each person whose image is depicted in the material was 18 years of age or older and gave their express consent to their image being depicted.

Elsewhere

All sorts of information on this bill is available at LEGISinfo, an excellent resource from the Library of Parliament. You can also read the full text of the bill.

Votes

May 8, 2024 Passed 2nd reading of Bill C-270, An Act to amend the Criminal Code (pornographic material)

December 11th, 2024 / 5:30 p.m.

Director of research and analytics, Canadian Centre for Child Protection

Jacques Marcoux

Mr. Kurek, just so I'm clear, isn't Bill C-270 Mr. Viersen's private member's bill?

Damien Kurek Conservative Battle River—Crowfoot, AB

Thank you very much.

Mr. Marcoux, you're not in the room here, but we have an Amber alert that just rang, and I certainly pray that the child is found soon, safely and quickly.

It speaks to how live an issue protecting kids is. I spent some time at the justice committee discussing Bill C-270, a bill that would help ensure accountability for the distribution of non-consensual explicit material, ensure that it is taken down, and place responsibility on both those who would share it and the companies that in some cases make removal incredibly difficult for victims. There are heartbreaking stories, and I shared some of that testimony during my time at the justice committee.

Here we are discussing freedom of expression. We have an Amber alert, which highlights how this is such a live issue. I'm wondering if you can comment specifically on Bill C-270 with regard to making sure that when it comes to protecting kids, there is accountability for those who would share explicit material without consent—whether it be children or adults, maybe intimate partners or the like—and ensuring that there's accountability for both those who would share and the companies that have profited in many cases off that material.

Michelle Ferreri Conservative Peterborough—Kawartha, ON

Madam Chair, this is just for clarification for anybody who's following at home.

We had a motion on the table:

That the Committee request an extension of 30 sitting days to the period of Committee consideration for Bill C-270.

The amendment put forward was to add “and that the Committee invite the Minister of Justice to appear for one hour on the Supplementary Estimates (B) and reinvite Arnold Viersen to appear on the subject of Bill C-270.”

What that means is that we're talking about this amendment. We're debating an amendment that's been put forward. However, when we ask a minister to come in and talk about supplementary estimates, we now have a lot of latitude in terms of what we're going to discuss.

This is the justice committee. As you can imagine, there's not a person or a Canadian watching at home, I assume, who hasn't felt the impact of the increasing crime after nine years of the Prime Minister.... What we're doing now is really asking the Liberals, the NDP and the Bloc to dive into the crisis that this country is under.

I have so many things to say, obviously, as the critic for families, children and social development. This is one of the biggest impacts on families across this country. Public safety should be there for everyone, and it's not.

In my community, in Peterborough, for example, it feels like at least every day there is a headline of another stabbing or a shooting in what was once a very sleepy, sweet town. That is really what we're here to discuss. How do we improve that? That's what committees are designed to do. You can't correct a problem if you don't acknowledge a problem.

In this committee, we are tasked with bringing forward information, listening to experts and really having tough discussions about what's happening.

Let's put it into the context of data. I think everybody knows there's nobody out there who has gone outside who would say that things don't feel less safe in Canada after nine years of Justin Trudeau. That's just a fact. Total sexual assaults are up 75%. Sexual violations against children are up 119%. Forcible confinement or kidnapping is up 11%. Indecent and harassing communications are up 86.4%. Non-consensual distribution of intimate images is up 801%.

Michelle Ferreri Conservative Peterborough—Kawartha, ON

Thank you, Madam Chair.

As I was saying, what are we doing here today? Well, we're talking about a bill, and then an amendment that was put forward. Bill C-270 is much, much needed. It's an act to amend the Criminal Code related to pornographic material. What I was speaking about at the beginning of this was sextortion. For a lot of parents, you know, this is a tough conversation to have at home, but it's important that we know what this is.

What is sextortion? Well, it is where people are having a conversation online through an app. It can be Snapchat or Instagram or a lot of these applications that our children use every day and that adults use, but obviously it's a different can of worms when minors are impacted or involved. There is an exchange or an ask for an image, an intimate image. That person says okay and they send it to them. That picture or image or video, or whatever it is, is then used to extort that person. They are asked for money.

Chris Bittle Liberal St. Catharines, ON

I'm requesting an addition. The amendment would be at the end of the motion. It would be, “and that the Committee invite the Minister of Justice to appear for one hour on the Supplementary Estimates (B) and reinvite Arnold Viersen to appear on the subject of Bill C-270.”

I sent it to the clerk.

Chris Bittle Liberal St. Catharines, ON

Thank you, Chair.

Let me repeat. It's genuinely surprising. If Liberals spent hours filibustering a bill using victim testimonies, I'd be genuinely curious about what the Conservatives would say. I guess it's easier to do this job sometimes if you don't have any shame. It's been shocking to watch what the Conservatives are willing to do to prevent Mr. Viersen from testifying for an hour. It's not like he's coming for weeks and going to be grilled for weeks. Mr. Fortin is right. He goes on podcasts quite a bit, it seems. I'm sure he's been stopped recently from going on podcasts and shouting his views from the rooftops. That's great. That's why we're here. We talk about what we believe in. However, using victim testimonials to prevent Arnold Viersen from testifying is shocking.

We could have started this bill already. We could have Mr. Viersen come later. Perhaps there are some things going on. We can have him come later, at the end of the day. I noticed on his Facebook page that he was on a hunting trip last week. He's not busy, so why isn't he here? This is important to him. I know it is. I have been hearing him talk about issues like this since 2015. For nine years, he has wanted an opportunity to do this. It's probably from the leader's office, because it takes some organization to set up a filibuster over multiple meetings. Yet, here his colleagues are, continuing to prevent this.

As a side note, I hope that, when the online harms act comes up, there's the same willingness to listen to victims. I doubt there will be. I'm predicting we'll see filibusters on the other side when that comes forward and we are dealing with the issue.

I have tried to move a motion for unanimous consent, in order to get this study moving quicker. It was denied. I sent an amendment to the clerk. I will move an amendment to the motion at the end.

The whole motion will now read:

That the Committee request an extension of 30 sitting days to the period of Committee consideration for Bill C-270,

The amendment is:

and that the Committee invite the Minister of Justice to appear for one hour on the Supplementary Estimates (B) and reinvite Arnold Viersen to appear on the subject of C-270.

I think that's reasonable. Let's get on with business. Let's get Mr. Viersen here. Let's get the minister here. Let's get on with our job. I think Mr. Fortin is right. Let's do what we're here to do. Let's help the victims. Let's move things forward. I know Mr. Brock is shocked that questions may get asked of Mr. Viersen outside of the scope of something. I don't think I've ever seen a minister appear on the estimates where the questions were contained to the estimates, but let's keep things moving. Let's do what the committee is here to do. Let's get to work. Let's study this and also have the minister appear. You can ask him whatever you want on whatever topic you like, as is your right and as is the case. Let's keep this moving.

Thank you.

Rhéal Fortin Bloc Rivière-du-Nord, QC

Thank you, Madam Chair.

We've been trying for I don't know how many hours to debate Bill C‑270. It's been going on for a few days now. This is an important bill.

I listen to my Conservative colleagues, for whom I have a lot of respect, and my Liberal colleagues, for whom I also have a lot of respect, and I'm stunned. It's mind-boggling. Both sides say there are victims, and I agree. Mr. Brock has just come back to the poignant testimonies of young people who are victims of pornography. We're talking here about people under 18 appearing in photos or videos circulating on the net. We, the parliamentarians elected by the general public, could solve the problem. We agree on this and we know how to solve this problem. Mr. Viersen has tabled a bill. Each of us might want to propose certain amendments to it, when the time comes, but we all agree that this problem needs to be resolved.

I don't know how to describe our attitude. I say “our” attitude as a committee, because that includes me. I don't want to blame anyone, but it just doesn't make sense. The only reason for dithering and filibustering on this bill is that Mr. Viersen is against abortion. Everyone knows this, both in Parliament and across the country. Mr. Viersen makes no secret of it. He has given press conferences on the subject. Is he right or wrong? I have my opinion on that, but I don't think it's relevant to this bill.

On the one hand, the Conservatives don't want Mr. Viersen to testify, because they suspect the Liberals will ask him about abortion. So they are systematically obstructing him. They say he won't be heard and that another witness should be called. On the other hand, since the Liberals want to boost their election campaign by saying that Mr. Viersen is anti-abortion, they insist that he testify. So we're at war over whether or not Mr. Viersen will come to support his bill.

However, this is immaterial to us. If the victims whose testimonies Mr. Brock has been recounting were sitting here, they'd be discouraged to see us acting this way. They'd be reminding us how messed up they are and how much they need our help, when all we can do is argue about whether or not Mr. Viersen will testify. Couldn't we declare a truce, agree to pass this bill, after which we'll have plenty of time to quibble?

I'm sure no one in Canada is going to vote differently in the next election because Mr. Viersen will have come here to testify. He's going to say he's against abortion, that's for sure. He's said it in every forum. He's not going to change his mind, he's going to repeat it. What will that change? The Conservatives won't be any less well represented or any different in the next election campaign. For their part, the Liberals have nothing to gain. We know as well as anyone that Mr. Viersen is against abortion. It's all over the media. Just recently, I read a few reports about it.

What's distressing, however, is that there are victims, young people under 18 who appear in pornographic videos circulating on the web. We all agree that this makes no sense. Yet it's simple: Bill C‑270 says that, before distributing a pornographic film or publishing such images, the distributor will have to make sure that the protagonists are of age, i.e., 18 or older, and consenting. I simply can't believe that we're going to continue to bicker for weeks on end, and that at the end of the day, we're going to tell these people that they're going to continue to be victims and that we're sorry, but that it's not our fault, because that's the way things are, all because we're being stubborn.

I don't understand the reasoning behind this. Quite frankly, I find the situation very unedifying. As I've already said, I have a great deal of respect for my colleagues who, on both sides, are now systematically obstructing this bill. I believe they are intelligent men and women. Most of us are professionals, and we're all aware that the way we're acting right now makes no sense whatsoever. Couldn't we make a little effort? For my part, I'm ready. I don't know if there's anything I can do, but if there is, I'm going to do it.

Please, let's spare a thought for these victims. Instead of using them by saying that Mr. Viersen would come and say this or that, or that he would think this or that, let's think about these victims and pass Bill C‑270.

Thank you, Madam Chair.

Larry Brock Conservative Brantford—Brant, ON

Madam Chair, I'll continue:

Between the time an image is uploaded, detected and taken down, it could have been viewed, shared or reposted millions of times—even if all of this occurs within a 24-hour period. Platforms must be required to have mechanisms in place to verify age and consent of those depicted in sexually explicit material to ensure illegal content is never uploaded in the first place.

We urge the committee to support Bill C-270's measures to ensure illegal content is not uploaded in the first place. Please ensure AI-generated content is addressed.

I now want to move on and read out the personal stories of various victims, some of whom have testified at committee.

Larry Brock Conservative Brantford—Brant, ON

Thank you, Madam Chair.

I welcome back all colleagues after our constituency week. I hope we all had some rest. I know most of us, if not all of us, usually have schedules chock full of activities in our ridings. I was no exception to that, so it's good to be back, and it's good to be back to continue our discussion on Bill C-270.

Where I left off was providing the voice of our colleague Arnold Viersen. Clearly, certain members of the Liberal Party were eager to hear from him, but at the same time they were not hiding the fact that they had ulterior motives in hearing from Mr. Viersen: to cross-examine him fully on his personal views.

I might reiterate, just as I started off my last intervention, how disappointing and, quite frankly, shameful the actions being taken by certain Liberal members are in voicing their ulterior motives. This is because, as I indicated at the outset, weeks have now passed since a list of key stakeholder witnesses who wanted to participate in this debate was submitted not only to the clerk, but also to you, Madam Chair, with a recommendation that the last couple of meetings be set aside to hear from witnesses, as opposed to demanding that the sponsor of the bill, Arnold Viersen, attend and speak to the matter first.

In fact, if the schedule had been adhered to, today would have been set aside for clause-by-clause consideration after we had heard from those stakeholders, who definitely want to weigh in and add their voices to this discussion. It's shameful that political gamesmanship has been resorted to instead of dealing with the substance of Bill C-270, which would stop the Internet sexual exploitation of the most vulnerable members of our community.

Continuing my train of thought of providing voices to this discussion, I want to return to one church group, the Evangelical Fellowship of Canada, which has submitted a brief that I wish to read into the record at this time. It is entitled, “Submission to the Standing Committee on Justice and Human Rights on Bill C-270”, and it is dated November 5, 2024. It reads:

The Evangelical Fellowship of Canada (EFC) appreciates the opportunity to participate in the committee’s review of Bill C-270. We believe it’s crucial for Parliament to require pornography platforms to ensure child sexual abuse materials and intimate images shared without consent are not uploaded to their sites. It is evident many of these platforms will not take such measures unless required to and held accountable for doing so.

The acronym for Evangelical Fellowship of Canada is EFC.

The EFC is the national association of evangelical Christians in Canada. Established in 1964, the EFC provides a constructive voice for biblical principles in life and society and a forum for engagement and collaboration for the roughly 2.2 million Evangelicals who are part of our constituency.

Our approach to this issue is based on the biblical principles of respect for human life and dignity, justice and care for those who are vulnerable. These principles are also reflected in Canadian law and public policy.

Under the heading of “The impact of posted images”, it reads:

There are devastating, lifelong consequences for those whose images are uploaded and distributed online. Children and youth face severe and extensive impacts when images of their abuse and exploitation are streamed and distributed.

In its 2021 hearings on the protection of privacy and reputation on platforms such as Pornhub, the Ethics Committee heard harrowing testimony from survivors whose intimate images, including images of abuse, had been posted on pornography platforms without their knowledge or consent. Some of the witnesses whose images had been posted on Pornhub were as young as 13 years old at the time the images were taken.

One young woman told the Ethics Committee how she was pressured to send the boy she liked an intimate video of herself when she was [only] in Grade 7. She then discovered the video had been uploaded to pornography sites. This video has been viewed millions of times. This young woman dropped out of school and her social circle, became homeless, fearful, anxious and suicidal.

Madam Chair, I want to pause for a moment. I want to reflect on my former career, when I prosecuted matters such as this, particularly those dealing with the possession, distribution and making of child pornography images. A point the experts unanimously agreed on, along with all of the victims I had the privilege of working with and assisting in the prosecution of these matters, is that these are a special class of victim.

They are unlike victims of sexual assault, which is horrendous in its very nature. They are unlike victims of a personal injury offence. Again, this could have lifelong implications for those victims. By and large, those two classes of victim are victimized once, with long-term—sometimes lifetime—consequences. The difference with victims in this particular area of the law is this: Each and every time their image is viewed, uploaded, saved and shared, they are revictimized. It's over and over again. As my esteemed colleague Mr. Van Popta eloquently put it, once an image hits the internet, there are limited means by which you can take it down. What you can't do is stop the purveyors of this filth from resharing those images on the Internet. That's why these victims hold a special place in my heart.

In this particular case, in reference to this 13-year-old girl, imagine the legacy she is going to carry for the rest of her life because she trusted a boy and shared an image. It is disgusting.

I'm going back to the report. It says:

One witness told of her discovery that her partner had taken videos and pictures of her without her knowledge or consent which were then posted on Pornhub. She described the destructive impact on her life, emotional trauma, suicidality and the toll on her health and employment.

Another witness told the Ethics Committee about discovering a video of herself on Pornhub in which she was unconscious, with a tag that said “sleeping pills.”

The viewers, rather than being turned away by sexual assault videos, were actively searching out that content. The tags made this possible, and they knew what they were watching before they clicked. It is a profound betrayal to know that thousands of men saw your assault and not only did nothing to flag it but actively sought it out and enjoyed it.... This video is not a one-off that slipped through a filter. Sexual assault is not an anomaly on the porn sites; it is a genre. This leaves little incentive for these sites to moderate such content.

These are real people in vulnerable moments who shared with parliamentarians the devastating impacts of their abuse and intimate images being shared online.

In each of these cases, the victims found the platform either unresponsive or slow to respond to their requests to have their images taken down.

Once a person's intimate images or images of their abuse or exploitation are uploaded, what happens to those images is beyond their control. They may be downloaded, shared or reposted countless times. A report by the Office of the Privacy Commissioner of Canada in February [of this year] told of a professional take-down service that found 700 copies of one person's intimate images on more than 80 websites. The report noted the devastating effects on employment, social network and mental health.

Once these images are online it is nearly impossible to have them permanently removed. In a report by the Canadian Centre for Child Protection, survivors of recorded child sexual abuse indicated that the imagery impacted them in a different way than the initial abuse. “The information shared by the respondents to this survey makes it clear that the recording of abuse and its distribution adds an extraordinary layer of trauma for a victim”.... Survivors describe feeling powerless to stop the destruction of the images. It is ongoing trauma.

Then we have under the heading, “Scope of the Problem”:

Child sexual abuse material (CSAM) online

Over 20 million suspected images of child sexual abuse were triggered for review by the Canadian Centre for Child Protection's web crawler between 2017 and 2020.

According to Statistics Canada, 15,630 incidents of online sexual offences against children and 45,816 incidents of online child sexual abuse material were reported by police from 2014 to 2022.

Studies show that prepubescent children are at the greatest risk of being depicted in CSAM and 84.2% of these videos and images contain severe abuse.

Approximately one million reports of child sexual exploitation are received by the National [U.S.] Center for Missing and Exploited Children...CyberTipLine each month. The hotline has received, in total, more than 45 million reports.

That's just the United States.

The report continues:

Lianna McDonald, executive director of the Canadian Centre for Child Protection, described a “tsunami” of victims coming to organizations like theirs for help to get their images removed from the internet.

Non-Consensual Distribution of Intimate Images (NCDII)

Police-reported Canadian data indicate 896 cases of NCDII [have been] reported in 2022 [alone].

In police-reported incidents of NCDII, youth aged 12 to 17 years accounted for almost all (97%) victims, with the large majority (86%) of victims being girls.

NCDII may include:

-images which are recorded without consent, including images of sexual assault or rape (no consent to sexual activity, e.g., drugged or sleeping individuals) or of a person's exploitation, and then distributed; or

-images which were recorded with consent, but where no consent was given to their sharing or distribution.

The 896 police-reported cases of non-consensual distribution of intimate images in 2022 are likely a fraction of the incidents of NCDII. These numbers only reflect the images that have been discovered and reported to the police.

It begs the question:

How many Canadian women and teens don't yet know their images have been posted without their knowledge or consent, or who to approach for help if they do?

One can only imagine, on this committee, the staggering numbers that really exist in this particular area.

The report continues:

As Canada's Privacy Commissioner notes in his report, “Investigation into Aylo (formerly MindGeek)'s Compliance with PIPEDA”, Canadian adults who are the victims of NCDII face a variety of risks:

Individuals who have had their intimate content disclosed without their consent have experienced severe consequences including reputational, financial and emotional harm. These harms can come in the form of targeted harassment that occurs online or in person, loss of job opportunities and mental health impacts up to and including suicide.

One study found that young women who have experienced NCDII “revealed declines in overall mental health, anxiety, depression, post-traumatic stress, suicidal [ideation], increased alcohol and drug consumption, and low self-esteem and confidence.” Victims of NCDII also face ongoing trauma and an ongoing violation of their privacy as they live with the permanence of their intimate images on the Internet.

The following is under the heading “Generative AI”:

A new and escalating threat is the use of AI technology to generate child sexual abuse materials depicting either real or fictional children, and intimate images or pornography made of a person. “According to one study, more than 96% of AI generated pornography was produced without the consent of those featured in it....” The use of images created through AI harasses, harms and humiliates victims, like all CSAM and NCDII. We need urgent action to develop legislation that protects victims of all ages from generative AI and deepfake pornography.

A study by University of Toronto professors notes that Canada is one of the countries that has not yet taken meaningful action on this front. It also states, “These manipulations thrive in the pornography industry, where women's faces are superimposed onto others' bodies to create video illusions, resulting in non-consensual sexual image abuse and other harm.” The study's authors go on to say, “The sheer volume of CSAM that can be generated and distributed using AI tools, a number that is growing exponentially every year, far exceeds the existing capacities, resources, and abilities of law enforcement organizations, NGOs, platforms, moderators and tech companies to respond to, investigate, and address.”

Next we have under the heading, “The urgent need to act”:

Commercial pornography sites must be held responsible to ensure exploitive and non-consensual images are not uploaded in the first place.

The onus must not be on children and youth to monitor commercial pornography sites to ensure that depictions of their abuse and exploitation are not posted or, if discovered, to ensure they are swiftly removed. The onus must not be on victims of non-consensual uploads to watch for their content and ensure it is removed.

Companies must be responsible for ensuring that the content they host and profit from is not child sexual abuse material, that the people depicted in images or videos are not minors, and that they consent to their image being posted.

Bill C-270 would prevent illegal content from being uploaded in the first place. This is essential, as once the images or video are uploaded—

—as I've mentioned already—

—it is nearly impossible to control their circulation and remove them.

Testimony to the Ethics Committee and the report by the Office of the Privacy Commissioner both describe the extensive spread of such images to other platforms and the extreme difficulty in having images removed once posted. As we noted above, the Privacy Commissioner’s report told of a professional take-down service that found 700 copies of one person’s intimate images on more than 80 websites.

By requiring that the age and consent of every person depicted in sexually explicit material be verified before it is posted online, Bill C-270 puts the responsibility where it belongs.

Bill C-270 would fulfill the second recommendation in the Ethics Committee report, Ensuring the Protection of Privacy and Reputation on Platforms such as Pornhub.

We note and recommend to this committee the Privacy Commissioner’s recommendations to Aylo...as a template of what should be required of all those who create pornography for a commercial purpose. The Privacy Commissioner recommended that

the company: (i) cease allowing the upload of intimate content without first obtaining meaningful consent directly from each individual appearing in that content; (ii) delete all content that it previously collected without obtaining such consent; and (iii) implement a privacy management program to ensure that it is accountable for information under its control.”

Canada’s legal frameworks must require verification of the age and consent of all individuals depicted in sexually explicit content created or hosted for a commercial purpose. This framework must also include AI-generated content.

The current version of Bill—

The Chair Liberal Lena Metlege Diab

We are now back in session.

Good morning, everyone.

I will ask all in-person participants to read the guidelines written on the updated cards on the table, as a refresher. These measures are in place to help prevent audio feedback incidents.

This is to protect the health and safety of all participants, including interpreters.

You will also notice a QR code on the card, which links to a short awareness video.

I remind you that this is the continuation of meeting 121 of the Standing Committee on Justice and Human Rights.

The committee is meeting in public to continue its study of Bill C-270, an act to amend the Criminal Code regarding pornographic material. We are here in public to resume debate on the motion by James Maloney, a request for an extension of 30 sitting days to the period of committee consideration for Bill C-270 and reporting the bill back.

I am now ready to give the floor to members wishing to speak. I'm going to start a new list, because I'm not sure who ended last time.

Was it you, Mr. Brock?

Damien Kurek Conservative Battle River—Crowfoot, AB

Thank you very much, Madam Chair.

I look forward to continuing to hear what Mr. Brock has to say and to hear the specificity with which he is addressing the very important issues related to Bill C-270 and the motion that Mr. Maloney moved, which is actually on the agenda.

I would just note, Madam Chair, regarding the departures from the Standing Orders and the continual introduction of issues of debate into the conversation at hand by, in particular, members from the Liberal side, that I would simply suggest those members put their names on the speaking list. I look forward to hearing from them when their names come up.

I would just ask that you outline again for the committee who exactly is on that speaking list. I know there's been a bit of discussion, with people going back and forth, and I know there was some discussion around a member who is present, although he is not a regular member of the committee. Perhaps we could have some clarity on that. I know you appreciate and respect having clarity and acting with precision, which is key for the smooth functioning of these parliamentary proceedings.

Larry Brock Conservative Brantford—Brant, ON

Thank you.

To recap, members from the government and the NDP want to hear words from Mr. Viersen. This is what Mr. Viersen had to say at second reading:

Madam Speaker, imagine being the parent of a teenage daughter who has been missing for months and somebody discovers 50 explicit videos of that daughter being sexually abused on Pornhub, the most popular porn site in the world. Imagine how one would feel if intimate images of one's sibling were uploaded and Pornhub refused one's request to remove that content. Now, imagine if those videos of their exploited loved ones were being monetized and published for profit by Pornhub and were made available to Pornhub's over 130 daily visitors.

I think "130 daily visitors" is a typo. I would imagine it's probably in the millions worldwide.

He continues:

How would someone feel if Pornhub's only response was an auto-reply email? Understandably, one would be outraged. One would be furious, yet this happens over and over. Survivors, including a 12-year-old from Ontario, have had to seek justice through their own lawsuits because in Canada, the onus is on survivors and on law enforcement to prove, after the material has been uploaded, that the individuals depicted in those videos are either under age or have not consented to their distribution. This is a serious problem that Bill C-270, the stopping internet sexual exploitation act, seeks to fix.

It's important to note that for years survivors, child protection agencies and the police have spoken out about this exploitation. They have not been silent. Survivors have shared how pornographic companies like Pornhub have been profiting from content depicting minors, sex trafficking victims, sexual assault, intimate images and gender-based violence for years. As early as 2019, companies like PayPal cut ties with MindGeek due to the availability of exploitive and abusive content.

In March 2020, a few parliamentarians and I wrote a public letter to the Prime Minister to alert him about the exploitation that was happening on MindGeek. We followed up in November 2020 with a letter to the then Minister of Justice, urging him to ensure that our laws were adequate to prevent women and girls from being exploited by Pornhub.

It was The New York Times exposé on December 4, 2020, in a piece written by Nicholas Kristof, that finally got the public's and the government's attention. It was entitled “The Children of Pornhub: Why does Canada allow this company to profit off videos of exploitation and assault?” That article finally kicked off a firestorm of international attention on Pornhub, which is one of many pornographic websites owned by MindGeek, a Canadian company based in Montreal. About a year ago, it was bought and rebranded as Aylo by a company called Ethical Capital Partners, based in Ottawa.

A few days after that article, the House of Commons ethics committee initiated an investigation into Pornhub. I joined the ethics committee for its study on Pornhub and listened to the harrowing stories of young women who had videos of sexual assaults or intimate content shared without their consent.

I know Mr. Van Popta has shared some of those quotes.

Mr. Viersen continues:

Many of these women were minors when the videos were created and uploaded to pornography sites like Pornhub. I want to take a moment to share some of their testimony.

Serena Fleites, whose story was covered by The New York Times exposé, had videos of her at age 13 uploaded by her ex-boyfriend. After that, her whole life came crumbling down. She experienced depression and drug use. She was harassed by people at her school who found her video and sent it to family members. She was blackmailed. She had to pretend to be her mother to have the videos taken down from Pornhub. This was all while she was 13 years old. In the end, she stopped going to school. She told us:

I thought that once I stopped being in the public so much, once I stopped going to school, people would stop re-uploading it. But that didn't happen, because it had already been basically downloaded by [all the] people...[in] the world. It would always be uploaded, over and over and over again. No matter how many times I got it taken down, it would be right back up again.

It basically became a full-time job for her to just chase down those images and to get them removed from Pornhub.

Some witnesses appeared anonymously to protect their identities. One witness stated, “I was 17 when videos of me on Pornhub came to my knowledge, and I was only 15 in the videos they [were] profiting from.” She went on to say, “Every time they took it down, they also allowed more and more videos of me to be reuploaded.” That witness also said, “Videos of me being on Pornhub has affected my life so much to the point that I don't leave my house anymore. I stopped being able to work because I [am]...scared to be out in public around other people.”

Another survivor who spoke to us at committee is Victoria Galy. As a result of discovering non-consensual images and videos of herself on Pornhub, she completely lost her sense of self-worth, and at times, she was suicidal. She told us at committee, “There were over eight million views just on Pornhub alone. To think of the amount of money that Pornhub has made off my trauma, date rape and sexual exploitation makes me sick to my stomach.” She added, “I have been forced to stand up alone and fight Pornhub”.

It is a serious failure of our justice system when survivors have to launch their own lawsuits to get justice for the harms caused by companies like MindGeek. This Canadian company has not faced a single charge or consequence in Canada for publishing its videos of exploitation and for profiting from them. This is truly shameful.

Last year, a survivor named Uldouz Wallace reached out to me. Uldouz is a survivor of the 2014 iCloud hack. She is also an award-winning actress, executive producer, activist and director of Foundation RA. Uldouz had photos and videos taken in the 2014 iCloud hack and uploaded onto porn sites like Pornhub, and she fought for years to get them taken down. As a result of this, she told us, “I lost followers, I lost everything that you could think of. It was just such hard time for me. I ended up spending over a million dollars over a three-year span just to get the content taken down on me with no success.... They're making so much money off of the non-consensual uploading of images and videos. The re-uploading is also a billion dollar industry.” She added, “There's still no federal laws. There's barely any laws at all to hold anyone online accountable. There's currently foreign revenge laws but for people like me there's nothing.”

Rachel, a survivor from Alberta, said that it was devastating and that it is going to haunt her for the rest of her life. She said that she will always be someone's porn.

I want to point out the incredible courage of Victoria, Serena, Uldouz, Rachel and many other survivors who have spoken out. In the midst of one of the most difficult moments of their lives, they are fighting back against a billion-dollar industry that seeks to profit from their pain and exploitation. I thank Victoria, Serena, Uldouz, and Rachel for refusing to back down. I thank them for their courage. I thank them for their relentless pursuit of justice. I would encourage members to listen to their full testimonies, and they can do so at www.siseact.ca.

Throughout the ethics committee hearings and from the interactions I have had with survivors since, it is clear that this is a common problem. Pornographic companies are publishing and monetizing content without verifying the age and the consent of the people depicted in them. This is particularly a problem for Canada as many of those websites are hosted here.

That is a shameful legacy of this country.

He went on:

Bill C-270, the stopping Internet sexual exploitation act, would stop this. I am going to quote right from the summary of my bill. It states that the SISE act would:

...prohibit a person [including companies] from making, distributing or advertising pornographic material for commercial purposes without having first ascertained that, at the time the material was made, each person whose image is depicted in the material was 18 years old or older and gave their express consent to their image being depicted.

The SISE act would also allow individuals to revoke their consent. This is an important part to express the ongoing consent. Finally, the SISE act would provide for aggravating factors when the material created or published actually depicts minors or non-consensual activity.

I am also pleased to share that I consulted on the bill with a variety of child protection agencies, law enforcement groups and the Canadian Centre for Child Protection to ensure that there are no gaps and that police have the tools to ensure they can seek justice.

The heart of the bill is consent. No one should be publishing sexually explicit material without the express consent of everyone depicted in that material. Children cannot consent to exploitation. Victims of sex trafficking and sexual assault cannot consent. Those filmed without their knowledge cannot consent, yet pornography companies freely publish this content and profit from it because there is no onus on them to verify the age or the consent of those depicted.

That is why the second recommendation of the 2021 ethics committee report is:

That the Government of Canada mandate that content-hosting platforms operating in Canada require affirmation from all persons depicted in pornographic content, before it can be uploaded, that they are 18 years old or older and that they consent to its distribution, and that it consult with the Privacy Commissioner of Canada with respect to the implementation of such obligation.

We have heard from survivors who testified that their images of abuse would not be online if companies like Pornhub had bothered to check for age and consent. Bill C-270 would fulfill this important recommendation from the ethics committee report and, importantly, I should add that this report was unanimously supported by all parties at the ethics committee.

The recommendation also suggests consulting with the Privacy Commissioner. I am happy to share with my colleagues that on February 29, 2024, the Privacy Commissioner released his investigation into Pornhub's operator Aylo, formerly MindGeek. The report was initially scheduled to be released on May 23, but it was delayed for over nine months when MindGeek, or Aylo, and its owners, Ethical Capital Partners, took the Privacy Commissioner to court to block the release of that report.

The Privacy Commissioner’s investigation into Aylo, MindGeek, was in response to a woman whose ex-boyfriend had uploaded intimate images of her to MindGeek's website without her consent. The young woman had to use a professional service to get it taken down and to remove her images from approximately 80 websites, where they had been re-posted more than 700 times.

The report shared how the publishing of the woman’s intimate images led to a permanent loss of control of the images, which had a devastating effect on her. It caused her to withdraw from her social life and to live in a state of fear and anxiety. The Commissioner stated:

“This untenable situation could have been avoided in many cases had MindGeek obtained direct consent from each individual depicted in content prior to or at the time of upload.”

“Pornhub’s own Monthly Non-Consensual Content reports suggest that non-consensual content is still regularly uploaded and viewed by thousands of users before it is removed.”

“We find that by continuing to rely solely on the uploader to verify consent, MindGeek fails to ensure that it has obtained valid and meaningful consent from all individuals depicted in content uploaded to its websites.”

Ultimately, the Privacy Commissioner recommended that Pornhub and its owners adopt measures that would verify age and consent before any content is uploaded. I would urge all members to read the Privacy Commissioner's report on Pornhub.

While Pornhub and its owners are the biggest pornography company in the world, this bill would ensure that age verification and consent applies to all pornography companies because whether it is videos of child exploitation, sex trafficking, AI deepfakes, sexual assault or an intimate encounter filmed by a partner, once a video or image has been uploaded, it is virtually impossible to eliminate. Each video can be viewed and downloaded millions of times within a 24-hour period, starting an endless nightmare for victims who must fight to get those videos removed, only for them to be uploaded again within minutes or hours.

Canada must do more to prevent this exploitive content from ever reaching the Internet in the first place. I hope I have the support of my colleagues in ending this nightmare for so many and in preventing it for so many more. To the survivors, some of whom are watching today, we thank them. Their voices are being heard.

I want to thank the organizations that have supported me along the way in getting this bill to this point: National Centre on Sexual Exploitation, National Council of Women of Canada, Ottawa Coalition to End Human Trafficking, London Abused Women's Centre, Defend Dignity, Vancouver Collective Against Sexual Exploitation, The Salvation Army, Survivor Safety Matters, Foundation RA, Montreal Council of Women, CEASE UK, Parents Aware, Joy Smith Foundation, Hope Resource Centre Association, Evangelical Fellowship of Canada, Colchester Sexual Assault Centre, Sexual Assault and Violence Intervention Services of Halton, and Ally Global Foundation.

Those, colleagues, are the words of Arnold Viersen, whom you so passionately asked to present this bill—

Larry Brock Conservative Brantford—Brant, ON

Thank you, Madam Chair.

I think I will start my intervention by recapping some of the important points and areas that I've heard from my colleagues yesterday and today.

The most important point that I wish to reiterate is for what I trust to be thousands of Canadians who are following this and watching this particular committee. I'll give them advance notice that this committee will potentially sit until 11:30 this evening.

The important point that I wish to make—this is following up on my colleague Mr. Van Popta's earlier interventions—is that there is an overriding theme that is being developed here. It's not being propagated and established by any party other than the Liberal Party of Canada, supported by their coalition partners, the NDP.

If they truly cared about victims—I intend to go through some legislative history over my three-plus years as a parliamentarian that demonstrates the complete opposite of empathy towards victims in this country—they would not be taking the position that they are taking. This is nothing more than political gamesmanship. It is partisanship and, quite frankly, it's petty politics, which I find extremely disgusting.

As Mr. Van Popta pointed out, this particular bill reached our committee before we recessed this past summer, in June 2024. We returned to Parliament in mid-September, and committees resumed toward the end of September. While the justice committee was studying two important reports regarding the rise of both anti-Semitism and Islamophobia, Bill C-270 was always waiting in the wings. You, Madam Chair, would bring it up from time to time.

I'm also mindful of the fact that we had many meetings over the course of two-plus months that ended early. Some meetings didn't actually happen at all. I can't say with any degree of confidence that all the meetings scheduled for the justice committee since we returned this past fall have been used effectively, in terms of making full use of the resources available to us. Here we are now, with a looming deadline that we were all made aware of weeks ago.

Not to put too fine a point on it, Madam Chair, but the Conservative Party of Canada submitted a significant list of witnesses—subject matter experts in this particular area. That was last Friday. In the interim, we had a meeting on Monday. We had a meeting yesterday. We're meeting today. Perhaps we're meeting again next week, but there is absolutely no sense of urgency and no indication from this committee that it is prioritizing the hearing of witnesses.

When I listen to some Liberal members—

Tako Van Popta Conservative Langley—Aldergrove, BC

Thank you.

Let me take the opportunity to point out once again that we could have witnesses here today. Whether or not Mr. Viersen appears, we could have witnesses here today. There are a lot of important witnesses we could hear from who would give valuable testimony that would help inform our report back to the House of Commons. Again, today we are failing to take that opportunity.

The sense that I have is that the Liberals do not like Bill C-270. At second reading, they voted in favour of sending it to committee, but with “serious reservations”. What are some of those reservations? I'm just going to continue here with some more of Mr. Maloney's comments. I just want to underline, Mr. Maloney, that it was a good speech. It was well researched and contained useful information. We might just disagree on the direction that we should be going.

He said:

Individuals who informally make or distribute pornographic material of themselves and of people they know are unlikely to verify age by examining legal documentation.... They are also unlikely to secure formal written consent. It concerns me that such people would be criminalized by the bill's proposed offences, where they knew that everyone implicated was consenting and of age, merely because they did not comply with the...regulatory regime....

We're getting to the heart of their objection. They think that it is a regulatory scheme and that it's not going to work. They also prefer the government bill, Bill C-63, the online harms act, which picks up on some of the direction that the private member's bill that is before us today is taking, but it, too, creates a regulatory scheme. So they are saying, “We don't like your regulatory scheme; we prefer our regulatory scheme.” Is that what it's coming down to?

I think this is a good point to talk about what a couple of the witnesses who appeared at the ethics committee for its study in 2021 said, which goes right to the point that I'm making here. This is witness 1, unidentified, and she had this to say:

When I was 24, I met someone I thought was a really nice guy. I married him, and as soon as he thought I was stuck, he stopped being nice pretty quickly. In April 2020, I moved away from our home to be safe, and obviously, we're not together anymore.

It's going to go on for just a couple of paragraphs, but I think this is really important to get on the record to set the context.

During our relationship, I had let him take some pictures. I was uncomfortable at first, because I had never been in any picture like that, but I trusted him and I wanted to keep him happy. It wasn't until August of 2020 that I discovered those private photos had been uploaded to porn sites, including Pornhub.

Here I want to make a point, Madam Chair. She was of age and she gave consent, but not for what he did with it later, so he would have had a defence under the bill that the Liberals are suggesting would be better than Bill C-270.

She goes on:

I was upset about the photos, but it was about to get worse. Finding the photos led me to a video. I did not know the video existed. I found out about it by watching it on Pornhub.

I don't want to get into the details. It was quite distasteful, but she was drugged. In any event, she was asleep. She had no recollection of it, and she was filmed in—I'm trying to find a polite way to say it—a compromised position. This is what was on the Internet. It was all over the Internet. It was taken by her husband. She was of age. She had consented to some form of photos, but not to that and not to the uploading on Pornhub.

She goes on:

My video had been uploaded in August of 2017, so by the time I found it, it had been active on Pornhub for over three years, and I had no idea.

Then she made a comment about Pornhub and sites like that:

Sexual assault is not an anomaly on the porn sites; it is a genre. This leaves little incentive for these sites to moderate such content.

To give an idea of the scope of the spread, as of early January 2021—after the December purge, and after the RCMP had removed a bunch for me—googling the name of my Pornhub video still returned over 1,900 results....

Thanks to Pornhub, today is day 1,292 that I have been naked on these porn sites.

This is what we are trying to fight. This is what the private member's bill, Bill C-270, is all about. We think it is worth fighting for.

Now, another objection from the Liberals is that the private member's bill is apparently “not consistent with the basic principles of criminal law”, in that it does not require mens rea. Most of us are lawyers here, but for those who aren't, mens rea is the Latin term for the mental element of a crime. Not only must the Crown prove that an event happened, but the Crown also has to prove that the person who caused the criminal event to happen had a guilty mind about it and knew that what they were doing was wrong. Then they go on: “for example, that the accused knew or was reckless as to whether those depicted in the pornographic material did not consent or were not of age.”

Well, in response to that, I'm going to just read something from another person who appeared before the same ethics committee. This is someone who was known only as “Witness 2”. This is what she had to say. It's just a few paragraphs:

I'm now 19 years old. I was 17 when videos of me on Pornhub came to my knowledge, and I was only 15 in the videos they've been profiting from.

“They” means the porn sites.

When I was 15, I was extorted by a man who was unknown at the time into sending massive amounts of videos and images of me.

Why she did that.... It was probably not very wise, but she did it.

Then, two years later.... She said:

This was the first time I had any knowledge of being on their site.

During this time, I stopped eating and leaving the house, and I was even considering suicide. I started getting hundreds of follow requests daily on my social media accounts and at least 50 messages a day sending me links of videos of me on Pornhub. That's when I realized that my name and social media had been posted alongside the videos.

Tako Van Popta Conservative Langley—Aldergrove, BC

Thank you.

Just in response to Mr. Bittle's intervention, nobody on this side of the table had any objection to who appeared at the meetings. They were all active participants and added value to the discussion. I would just underline that. But there was a problem stemming from what certainly appeared to us to be two groups not having conversations with each other. The analysts did their best to create some sort of a concordance between the two reports. That took time. Now we are at a place where we are running against the clock.

I appreciate what you said, Madam Chair, that the potential witnesses hadn't been invited until recently, or the list hadn't been made available until recently. I wasn't expecting that this would have been done in September, but surely in the last four to six weeks we could have found a way to start on this very important study and get the witnesses here.

To get into the substantive part of Bill C-270 and what it's all about, I want to read briefly the summary of the bill, as follows:

This enactment amends the Criminal Code to prohibit a person from making, distributing or advertising pornographic material for commercial purposes without having first ascertained that, at the time the material was made, each person whose image is depicted in the material was 18 years of age or older and gave their express consent to their image being depicted.

There are two things here, the age requirement and the consent requirement, keeping in mind that people under age can't actually give consent. Personally, I'd never thought too much about the topic, but I was eager to get into the study. I did sit in once when the private member's bill was debated. It was debated twice at second reading, once on April 9 and once on May 7. I sat in for part of the May 7 debate, I believe. I heard some stories about victims and survivors and I became very interested in the topic.

Reading in Hansard these two hours of debate on the private member's bill, I felt a sense of multi-party co-operation on an issue that is so important to all of us—namely, preventing children from being exploited sexually online and stopping the uploading and distribution of non-consensual images. I felt a sense of co-operation among all the speakers. As I said, I was there for only one of them, but I read all the speeches from both hours of debate.

I just want to highlight a couple of them. First, MP Rempel Garner, who happens to be a co-sponsor of Bill C-270, had this to say on April 9: “I am very pleased to hear the multipartisan nature of debate on these types of issues, and that there is at least a willingness to bring forward these types of initiatives to committee to have the discussions”.

MP Garrison, from the NDP, on that same day made this positive comment about the initiative being brought forward by this private member's bill:

It is also important to remember that whatever we do here has to make our law more effective at getting those who are profiting from the images. That is really what the bill is aimed at, and I salute the member for Peace River—Westlock for that singular focus because I think that is really key.

I want to quote from MP Larouche of the Bloc Québécois. It's important to note that she also chaired the All Party Parliamentary Group to End Modern Slavery and Human Trafficking. She has a long track record of being interested in this topic and advocating for victims. She had this to say: “Let us not forget that these [online porn] companies are headquartered right in Montreal. The fact that our country is home to mafia-style companies that profit from sexual exploitation is nothing to be proud of.”

I would say, Madam Chair, that that is an understatement. That's an embarrassment for us. The New York Times picked up the story on this, and the world now knows that Canada is headquarters for mafia-style companies and child pornography. I applaud those who are fighting to combat that.

Even the Liberals supported this private member's bill at second reading, but with serious reservations. This is what MP Maloney had to say. I believe he is online, so I'm going to quote my friend and colleague, Mr. Maloney. He had this to say: “I want to say at the outset that the government will be supporting this bill, Bill C-270, at second reading, but with some serious reservations.” He then pointed out that Bill C-270 was in response to a 2021 report of the House of Commons Standing Committee on Access to Information, Privacy and Ethics. That committee, the ethics committee, commenced that study at least partially in response to the New York Times story that had run earlier that year, or it might have been the previous year.

I just want to read a couple of pieces from that report, because I think it is very relevant to what we're talking about today. I'm not going to belabour the point, because the report is available for anybody to read. These are just a couple of paragraphs from the summary of that report:

Recent reports regarding the presence of child sexual abuse material (CSAM) and other non-consensual content on the adult platform Pornhub led the House of Commons Standing Committee on Access to Information, Privacy and Ethics (the Committee) to undertake a study on the protection of privacy and reputation on online platforms such as Pornhub. [This is a Canadian company.] This study gave the Committee a window into the world of adult websites and how their content moderation practices have failed to protect the privacy and reputation of individuals online.

The Committee heard harrowing accounts from survivors who had had images and videos of themselves uploaded to the Pornhub website without their consent. Some were minors. Some were adults. All encountered difficulties in having those images and videos taken down. The Committee also heard from the executives of MindGeek and Pornhub, who told the Committee that they have appropriate practices in place and are constantly striving to improve these measures.

I, for one, do not believe that, and certainly the investigation that this committee undertook and the conclusions that they came to would underline that as well.

I just want to read one of the recommendations. This is recommendation 2 of 14 recommendations. I am not belabouring the point; I'm just picking up on some of the highlights, some of the important things to set a context for what we're talking about today.

Recommendation 2 concerning the duty to verify age and consent.

That the Government of Canada mandate that content-hosting platforms operating in Canada require affirmation from all persons depicted in pornographic content, before it can be uploaded, that they are 18 years old or older and that they consent to its distribution, and that it consult with the Privacy Commissioner of Canada with respect to the implementation of such obligation.

Madam Chair, that was recommendation 2 from that 2021 report from the ethics committee, which forms the foundation of the private member's bill that is before us now, and that was the point that Mr. Maloney was making in his speech in the House on May 7.

I have another quote from Mr. Maloney's speech, which was a good speech and is worth quoting from.