Thank you, Madam Chair.
I welcome back all colleagues after our constituency week. I hope we all had some rest. I know most of us, if not all of us, usually have schedules chock-full of activities in our ridings. I was no exception, so it's good to be back and to continue our discussion on Bill C-270.
Where I left off was providing the voice of our colleague Arnold Viersen. Certain members of the Liberal Party were clearly eager to hear from him, but at the same time they made no secret of their ulterior motive, which was to cross-examine him fully on his personal views.
I might reiterate, just as I began my last intervention, how disappointing and, quite frankly, shameful the actions of certain Liberal members are in pursuing those ulterior motives. As I indicated at the outset, weeks have now passed since a list of key stakeholder witnesses who wanted to participate in this debate was submitted not only to the clerk but also to you, Madam Chair, with a recommendation that the last couple of meetings be set aside to hear from witnesses, rather than demanding that the sponsor of the bill, Arnold Viersen, attend and speak to the matter first.
In fact, had the schedule been adhered to, today would have been set aside for clause-by-clause consideration after we had heard from those stakeholders, who definitely want to weigh in and add their voices to this discussion. It's shameful that certain members have resorted to political gamesmanship instead of dealing with the substance of Bill C-270, which would stop the Internet sexual exploitation of the most vulnerable members of our community.
Continuing my train of thought of providing voices to this discussion, I want to return to one church group, the Evangelical Fellowship of Canada, which has submitted a brief that I wish to read into the record at this time. It is entitled, “Submission to the Standing Committee on Justice and Human Rights on Bill C-270”, and it is dated November 5, 2024. It reads:
The Evangelical Fellowship of Canada (EFC) appreciates the opportunity to participate in the committee’s review of Bill C-270. We believe it’s crucial for Parliament to require pornography platforms to ensure child sexual abuse materials and intimate images shared without consent are not uploaded to their sites. It is evident many of these platforms will not take such measures unless required to and held accountable for doing so.
The acronym for Evangelical Fellowship of Canada is EFC.
The EFC is the national association of evangelical Christians in Canada. Established in 1964, the EFC provides a constructive voice for biblical principles in life and society and a forum for engagement and collaboration for the roughly 2.2 million Evangelicals who are part of our constituency.
Our approach to this issue is based on the biblical principles of respect for human life and dignity, justice and care for those who are vulnerable. These principles are also reflected in Canadian law and public policy.
Under the heading of “The impact of posted images”, it reads:
There are devastating, lifelong consequences for those whose images are uploaded and distributed online. Children and youth face severe and extensive impacts when images of their abuse and exploitation are streamed and distributed.
In its 2021 hearings on the protection of privacy and reputation on platforms such as Pornhub, the Ethics Committee heard harrowing testimony from survivors whose intimate images, including images of abuse, had been posted on pornography platforms without their knowledge or consent. Some of the witnesses whose images had been posted on Pornhub were as young as 13 years old at the time the images were taken.
One young woman told the Ethics Committee how she was pressured to send the boy she liked an intimate video of herself when she was [only] in Grade 7. She then discovered the video had been uploaded to pornography sites. This video has been viewed millions of times. This young woman dropped out of school, withdrew from her social circle, became homeless, fearful, anxious and suicidal.
Madam Chair, I want to pause for a moment to reflect on my former career, when I prosecuted matters such as this, particularly those dealing with the possession, distribution and making of child pornography images. One point on which the experts unanimously agreed, along with all of the victims I had the privilege of working with and assisting in the prosecution of these matters, is that they are a special class of victim.
They are unlike victims of sexual assault, which is horrendous in its very nature. They are unlike victims of a personal injury offence, which can also have lifelong implications. By and large, those two classes of victim are victimized once, with long-term, sometimes lifetime, consequences. The difference with victims in this particular area of the law is this: Each and every time their image is viewed, uploaded, saved and shared, they are revictimized, over and over again. As my esteemed colleague Mr. Van Popta eloquently put it, once an image hits the Internet, there are limited means by which you can take it down. What you can't do is stop the purveyors of this filth from resharing those images. That's why these victims hold a special place in my heart.
In this particular case, in reference to this 13-year-old girl, imagine the legacy she is going to carry for the rest of her life because she trusted a boy and shared an image. It is disgusting.
I'm going back to the report. It says:
One witness told of her discovery that her partner had taken videos and pictures of her without her knowledge or consent which were then posted on Pornhub. She described the destructive impact on her life, emotional trauma, suicidality and the toll on her health and employment.
Another witness told the Ethics Committee about discovering a video of herself on Pornhub in which she was unconscious, with a tag that said “sleeping pills.”
The viewers, rather than being turned away by sexual assault videos, were actively searching out that content. The tags made this possible, and they knew what they were watching before they clicked. It is a profound betrayal to know that thousands of men saw your assault and not only did nothing to flag it but actively sought it out and enjoyed it.... This video is not a one-off that slipped through a filter. Sexual assault is not an anomaly on the porn sites; it is a genre. This leaves little incentive for these sites to moderate such content.
These are real people in vulnerable moments who shared with parliamentarians the devastating impacts of their abuse and intimate images being shared online.
In each of these cases, the victims found the platform either unresponsive or slow to respond to their requests to have their images taken down.
Once a person's intimate images or images of their abuse or exploitation are uploaded, what happens to those images is beyond their control. They may be downloaded, shared or reposted countless times. A report by the Office of the Privacy Commissioner of Canada in February [of this year] told of a professional take-down service that found 700 copies of one person's intimate images on more than 80 websites. The report noted the devastating effects on employment, social network and mental health.
Once these images are online it is nearly impossible to have them permanently removed. In a report by the Canadian Centre for Child Protection, survivors of recorded child sexual abuse indicated that the imagery impacted them in a different way than the initial abuse. “The information shared by the respondents to this survey makes it clear that the recording of abuse and its distribution adds an extraordinary layer of trauma for a victim”.... Survivors describe feeling powerless to stop the destruction of the images. It is ongoing trauma.
Then we have under the heading, “Scope of the Problem”:
Child sexual abuse material (CSAM) online
Over 20 million suspected images of child sexual abuse were flagged for review by the Canadian Centre for Child Protection's web crawler between 2017 and 2020.
According to Statistics Canada, 15,630 incidents of online sexual offences against children and 45,816 incidents of online child sexual abuse material were reported by police from 2014 to 2022.
Studies show that prepubescent children are at the greatest risk of being depicted in CSAM and 84.2% of these videos and images contain severe abuse.
Approximately one million reports of child sexual exploitation are received by the National [U.S.] Center for Missing and Exploited Children...CyberTipline each month. The hotline has received, in total, more than 45 million reports.
That's just the United States.
The report continues:
Lianna McDonald, executive director of the Canadian Centre for Child Protection, described a “tsunami” of victims coming to organizations like theirs for help to get their images removed from the internet.
Non-Consensual Distribution of Intimate Images (NCDII)
Police-reported Canadian data indicate 896 cases of NCDII [were] reported in 2022 [alone].
In police-reported incidents of NCDII, youth aged 12 to 17 years accounted for almost all (97%) victims, with the large majority (86%) being girls.
NCDII may include:
-images which are recorded without consent, including images of sexual assault or rape (no consent to sexual activity, e.g., drugged or sleeping individuals) or of a person's exploitation, and then distributed; or
-images which were recorded with consent, but where no consent was given to their sharing or distribution.
The 896 police-reported cases of non-consensual distribution of intimate images in 2022 are likely a fraction of the incidents of NCDII. These numbers only reflect the images that have been discovered and reported to the police.
It raises the question:
How many Canadian women and teens don't yet know their images have been posted without their knowledge or consent, or who to approach for help if they do?
We on this committee can only imagine the staggering numbers that really exist in this particular area.
The report continues:
As Canada's Privacy Commissioner notes in his report, “Investigation into Aylo (formerly MindGeek)'s Compliance with PIPEDA”, Canadian adults who are the victims of NCDII face a variety of risks:
Individuals who have had their intimate content disclosed without their consent have experienced severe consequences including reputational, financial and emotional harm. These harms can come in the form of targeted harassment that occurs online or in person, loss of job opportunities and mental health impacts up to and including suicide.
One study found that young women who have experienced NCDII “revealed declines in overall mental health, anxiety, depression, post-traumatic stress, suicidal [ideation], increased alcohol and drug consumption, and low self-esteem and confidence.” Victims of NCDII also face ongoing trauma and an ongoing violation of their privacy as they live with the permanence of their intimate images on the Internet.
The following is under the heading “Generative AI”:
A new and escalating threat is the use of AI technology to generate child sexual abuse materials depicting either real or fictional children, and intimate images or pornography made of a person. “According to one study, more than 96% of AI generated pornography was produced without the consent of those featured in it....” The use of images created through AI harasses, harms and humiliates victims, like all CSAM and NCDII. We need urgent action to develop legislation that protects victims of all ages from generative AI and deepfake pornography.
A study by University of Toronto professors notes that Canada is one of the countries that has not yet taken meaningful action on this front. It also states, “These manipulations thrive in the pornography industry, where women's faces are superimposed onto others' bodies to create video illusions, resulting in non-consensual sexual image abuse and other harm.” The study's authors go on to say, “The sheer volume of CSAM that can be generated and distributed using AI tools, a number that is growing exponentially every year, far exceeds the existing capacities, resources, and abilities of law enforcement organizations, NGOs, platforms, moderators and tech companies to respond to, investigate, and address.”
Next we have under the heading, “The urgent need to act”:
Commercial pornography sites must be held responsible for ensuring exploitive and non-consensual images are not uploaded in the first place.
The onus must not be on children and youth to monitor commercial pornography sites to ensure that depictions of their abuse and exploitation are not posted or, if discovered, to ensure they are swiftly removed. The onus must not be on victims of non-consensual uploads to watch for their content and ensure it is removed.
Companies must be responsible for ensuring that the content they host and profit from is not child sexual abuse material, that the people depicted in images or videos are not minors, and that they consent to their image being posted.
Bill C-270 would prevent illegal content from being uploaded in the first place. This is essential, as once the images or video are uploaded—
—as I've mentioned already—
—it is nearly impossible to control their circulation and remove them.
Testimony to the Ethics Committee and the report by the Office of the Privacy Commissioner both describe the extensive spread of such images to other platforms and the extreme difficulty in having images removed once posted. As we noted above, the Privacy Commissioner’s report told of a professional take-down service that found 700 copies of one person’s intimate images on more than 80 websites.
By requiring that the age and consent of every person depicted in sexually explicit material be verified before it is posted online, Bill C-270 puts the responsibility where it belongs.
Bill C-270 would fulfill the second recommendation in the Ethics Committee report, “Ensuring the Protection of Privacy and Reputation on Platforms such as Pornhub”.
We note and recommend to this committee the Privacy Commissioner’s recommendations to Aylo...as a template of what should be required of all those who create pornography for a commercial purpose. The Privacy Commissioner recommended that
the company: (i) cease allowing the upload of intimate content without first obtaining meaningful consent directly from each individual appearing in that content; (ii) delete all content that it previously collected without obtaining such consent; and (iii) implement a privacy management program to ensure that it is accountable for information under its control.
Canada’s legal frameworks must require verification of the age and consent of all individuals depicted in sexually explicit content created or hosted for a commercial purpose. This framework must also include AI-generated content.
The current version of Bill—