Evidence of meeting #21 for Access to Information, Privacy and Ethics in the 43rd Parliament, 2nd Session. (The original version is on Parliament’s site, as are the minutes.)

A recording is available from Parliament.

Also speaking

Clerk of the Committee: Ms. Miriam Burke
Lianna McDonald, Executive Director, Canadian Centre for Child Protection
Daniel Bernhard, Executive Director, Friends of Canadian Broadcasting
John F. Clark, President and Chief Executive Officer, National Center for Missing & Exploited Children
Lloyd Richardson, Director, Information Technology, Canadian Centre for Child Protection
Deputy Commissioner Stephen White, Specialized Policing Services, Royal Canadian Mounted Police
Normand Wong, Senior Counsel, Criminal Law Policy Section, Department of Justice
Superintendent Marie-Claude Arsenault, Royal Canadian Mounted Police

11:05 a.m.

Liberal

The Vice-Chair Liberal Brenda Shanahan

I call this meeting to order. I think members are aware by now that I am chairing the meeting today due to the unavoidable absence of our regular chair, Mr. Warkentin. We certainly wish his family well during a difficult time.

This is meeting number 21 of the House of Commons Standing Committee on Access to Information, Privacy and Ethics. We are resuming our study on the protection of privacy and reputation on online video platforms such as Pornhub. I would like to remind you that today's meeting is webcast and will be made available via the House of Commons website.

Today's meeting is taking place in a hybrid format pursuant to the House order of January 25, 2021. Therefore, members may attend in person in the room or remotely using the Zoom application.

I believe, Madam Clerk, that the witnesses have been briefed on the usual procedures for the hybrid format. I need only remind all present that members and witnesses may speak in the official language of their choice. Please use the raise hand feature should you wish to speak or alert the chair, and this is a reminder that all comments by members and witnesses should be addressed through the chair.

11:05 a.m.

NDP

Charlie Angus NDP Timmins—James Bay, ON

I have a point of order.

11:05 a.m.

Liberal

The Vice-Chair Liberal Brenda Shanahan

Go ahead, Mr. Angus.

11:05 a.m.

NDP

Charlie Angus NDP Timmins—James Bay, ON

Madam Shanahan, I want to welcome you to your first running as chair, and I know it's going to be very successful. Certainly we all express our concerns for Mr. Warkentin and his family.

I am sorry to interrupt. I just want a minute for a point of clarification, because I know there will be committee business later and I may have to step out before then.

At the meeting on January 29, we passed a motion that we were going to call Mr. Victor Li, Madam Marquez and Guy Spencer Elms and that we would be issuing summons. I know we have issued the legal summons on Mr. Victor Li and Madam Marquez, but I didn't hear the status of Mr. Guy Spencer Elms, who is the key director of many of the financial operations of the Kielburger operations in Kenya. Given the really disturbing allegations that have come out, both by CBC's Fifth Estate and Bloomberg, I think his testimony will help clear the air for a lot of people, particularly around the allegations of children being beaten in the schools in Kenya, which I think we all find pretty shocking and surprising.

Could the chair tell us if Mr. Spencer Elms has agreed to come before our committee? Would that be a yes or a no?

11:05 a.m.

Liberal

The Vice-Chair Liberal Brenda Shanahan

I believe, Mr. Angus, that the clerk has been working in that regard and can provide us with an update. The clerk can tell us what she can now, but we will be addressing this in committee business in camera at 1:30.

11:05 a.m.

The Clerk of the Committee Ms. Miriam Burke

Thank you, Madam Chair.

I have been unable to reach Mr. Spencer Elms at this point.

11:05 a.m.

NDP

Charlie Angus NDP Timmins—James Bay, ON

Mr. Spencer Elms, who runs a major law firm in Kenya, has not responded. You've not been able to contact him at all.

11:05 a.m.

The Clerk

No, sir.

11:05 a.m.

NDP

Charlie Angus NDP Timmins—James Bay, ON

Okay. Thank you for that.

11:05 a.m.

Liberal

The Vice-Chair Liberal Brenda Shanahan

Thank you, Mr. Angus.

I recognize Mr. Fergus.

11:05 a.m.

Liberal

Greg Fergus Liberal Hull—Aylmer, QC

Thank you, Madam Chair.

Perhaps it was just me, because I wasn't here during the sound check. I see that Mr. Angus is participating in person in the committee room along with our clerk.

Madam Chair, could I ask the clerk which other members might be appearing in person today?

11:10 a.m.

Liberal

The Vice-Chair Liberal Brenda Shanahan

Very well.

Go ahead, Madam Clerk.

11:10 a.m.

The Clerk

Mr. Viersen is here as well.

11:10 a.m.

Conservative

Arnold Viersen Conservative Peace River—Westlock, AB

Mr. Arnold Viersen.

11:10 a.m.

Liberal

The Vice-Chair Liberal Brenda Shanahan

Welcome, Mr. Viersen.

Thank you. I appreciate clarification on that. With the hybrid format, it's not always easy to see who is in person in the room and who is on screen.

I would like to proceed now with welcoming our witnesses for today for this very important study. As the witnesses know, they have time for presentations.

From the Canadian Centre for Child Protection, we will hear from Lianna McDonald, executive director, and Lloyd Richardson, director, information technology. We also have with us, from the Friends of Canadian Broadcasting, Daniel Bernhard, executive director. From the National Center for Missing and Exploited Children, we have Mr. Clark, president and chief executive officer.

I believe that each of you has a presentation.

Ms. McDonald, the floor is yours.

11:10 a.m.

Lianna McDonald Executive Director, Canadian Centre for Child Protection

Good morning, Chairperson and distinguished members of the committee. Thank you for giving us this opportunity to present.

I am Lianna McDonald, executive director of the Canadian Centre for Child Protection, a charity dedicated to the personal safety of children. Joining me today is Lloyd Richardson, our director of technology.

By way of background, our agency operates Cybertip.ca, which is Canada’s tip line for reporting the online sexual exploitation of children. The tip line has been operating for over 18 years and currently receives, on average, 3,000 or more public reports per month.

Our agency has witnessed the many ways in which technology has been weaponized against children and how the proliferation of child sexual abuse material, otherwise known as CSAM, and non-consensual material fosters ongoing harm to children and youth. Over the last decade, there has been an explosion of digital media platforms hosting user-generated pornographic content. This, coupled with a complete absence of meaningful regulation, has created the perfect storm whereby transparency and accountability are notably absent. Children have been forced to pay a terrible price for this.

We know that every image or video of CSAM that is publicly available is a source of revictimization for the child in that image or video. For this reason, in 2017 we created Project Arachnid. Processing tens of thousands of images per second, this powerful tool detects known CSAM for the purpose of quickly identifying and triggering the removal of this illegal and harmful content. Project Arachnid has provided our agency with an important lens into how the absence of a regulatory framework fails children. To date, Arachnid has processed more than 126 billion images and has issued over 6.7 million takedown notices to providers around the globe. We keep records of all these notices we send, how long it takes for a platform to remove CSAM once advised of its existence, and data on the uploading of the same or similar images on platforms.

At this point, we would like to share what we have seen on MindGeek’s platforms. Arachnid has detected and confirmed instances of what we believe to be CSAM on their platforms at least 193 times in the past three years. These sightings include 66 images of prepubescent CSAM involving very young children; 74 images of indicative CSAM, meaning that the child in the image appears pubescent and roughly between the ages of 11 and 14; and 53 images of post-pubescent CSAM, meaning that sexual maturation of the child may be complete and we have confirmation that the child in the image is under the age of 18.

We do not believe the above numbers are representative of the scope and scale of this problem. These numbers are limited to obvious CSAM of very young children and of identified teenagers. There is likely CSAM involving many other teens that we would not know about, because many victims and survivors are trying to deal with the removal issue on their own. We know this.

MindGeek testified that moderators manually review all content that is uploaded to their services. This is very difficult to take seriously. We know that CSAM has been published on their website in the past. We have some examples to share.

The following image was detected by Arachnid. This image is a still frame taken from a CSAM video of an identified sexual abuse survivor. The child was pubescent, between the ages of 11 and 13, at the time of the recording. The image shows an adult male sexually assaulting the child by inserting his penis in her mouth. He is holding the child’s hair and head with one hand and his penis with the other hand. Only his midsection is visible in the image, whereas the child’s face is completely visible. A removal request was generated by Project Arachnid. It took at least four days for that image to come down.

The next example was detected also by Project Arachnid. It is a CSAM image of two unidentified sexual abuse victims. The children pictured in the image are approximately 6 to 8 years of age. The boy is lying on his back with his legs spread. The girl is lying on top of him with her face between his legs. Her own legs are straddling his head. The girl has the boy’s penis in her mouth. Her face is completely visible. The image came down the same day we sent the notice requesting this removal.

We have other examples, but my time is limited.

While the spotlight is currently focused on MindGeek, we want to make it clear that this type of online harm is occurring daily across many mainstream and not-so-mainstream companies operating websites, social media and messaging services. Any of them could have been put under this microscope as MindGeek has been by this committee. It is clear that whatever companies claim they are doing to keep CSAM off their servers, it is not enough.

Let's not lose sight of the core problem that led to this moment. We've allowed digital spaces where children and adults intersect to operate with no oversight. To add insult to injury, we have also allowed individual companies to decide the scale and scope of their moderation practices. This has left many victims and survivors at the mercy of these companies to decide if they take action or not.

Our two-decades-long social experiment with an unregulated Internet has shown that tech companies are failing to prioritize the protection of children online. Not only has CSAM been allowed to fester online, but children have also been harmed by the ease with which they can access graphic and violent pornographic content. Through our collective inaction we have facilitated the development of an online space that has virtually no rules, certainly no oversight, and that consistently prioritizes profits over the welfare and protection of children. We do not accept this standard in other forms of media, including television, radio and print. Equally, we should not accept it in the digital space.

This is a global issue. It needs a global coordinated response with strong clear laws that require tech companies to do this: implement tools to combat the relentless reuploading of illegal content; hire trained and effectively supervised staff to carry out moderation and content removal tasks at scale; keep detailed records of user reports and responses that can be audited; be accountable for moderation and removal decisions and the harm that flows to individuals when companies fail in this capacity; and finally, build in, by design, features that prioritize the best interests and rights of children.

In closing, Canada needs to assume a leadership role in cleaning up the nightmare that has resulted from an online world that is lacking any regulatory and legal oversight. It is clear that relying upon the voluntary actions of companies has failed society and children miserably. The time has come to impose some guardrails in this space and show the leadership that our children deserve.

I thank you for your time.

11:15 a.m.

Liberal

The Vice-Chair Liberal Brenda Shanahan

Thank you very much, Ms. McDonald.

Mr. Bernhard, you may make your presentation.

11:15 a.m.

Daniel Bernhard Executive Director, Friends of Canadian Broadcasting

Madam Chair, honourable members of the committee, thank you for inviting me to appear today.

My name is Daniel Bernhard, and I am the executive director of Friends of Canadian Broadcasting, an independent citizens' organization that promotes Canadian culture, values and sovereignty on air and online.

Last September, Friends released “Platform for Harm”, a comprehensive legal analysis showing that under long-standing Canadian common law, platforms like Pornhub and Facebook are already liable for the user-generated content they promote.

On February 5, Pornhub executives gave contemptuous and, frankly, contemptible testimony to this committee, attempting to explain away all the illegal content that they promoted to millions of Canadians and millions more around the world.

Amoral as the Pornhub executives appear to be, it would be a mistake, in my opinion, to treat their behaviour as a strictly moral failing. As Mr. Angus said on that day, the activity that you are studying is quite possibly criminal.

Pornhub does not dispute having disseminated vast amounts of child sexual abuse material, and Ms. McDonald just confirmed that fact. On February 5, the company's executives acknowledged that 80% of their content was unverified, some 10 million videos, and they acknowledged that they transmitted and recommended large amounts of illegal content to the public.

Of course, Pornhub's leaders tried to blame everybody but themselves. Their first defence is ignorance. They claim they can't remove illegal content from the platform because until a user flags it for them, they don't know it's there. In any case, they claim that responsibility lies with the person who uploaded the content and not with them. However, the law does not support this position. Yes, uploaders are liable, but so are platforms promoting illegal content if they know about it in advance and publish it anyway or if they are made aware of it post-publication and neglect to remove it.

This brings us to their second defence, incompetence. Given the high cost of human moderation, Pornhub employs software to find offending content, yet they hold themselves blameless when their software doesn't actually work. As Mark Zuckerberg has done so many times, Pornhub promised you that they'll do better. “Will do better” isn't a defence. It's a confession.

I wish Pornhub were an outlier, but it's not. In 2018, the U.S. National Center for Missing and Exploited Children received over 18 million referrals of child sexual abuse materials, according to the New York Times. Most of it was found on Facebook. There were more than 50,000 reports per day. That's just what they caught. The volume of user-uploaded, platform-promoted child sexual abuse material is now so vast that the FBI must prioritize cases involving infants and toddlers, and according to the New York Times, “are essentially not able to respond to reports of anybody older than that”.

These platforms also disseminate a great deal of illegal content that is not of a sexual nature, including incitement to violence, death threats, and the sale of drugs and illegal weapons, among others. The Alliance to Counter Crime Online regularly discovers such content on Facebook, YouTube and Amazon. There is even an illegal market for human remains on Facebook.

The volume of content that these platforms handle does not excuse them from disseminating and recommending illegal material. If widespread distribution of illegal content is an unavoidable side effect of your business, then your business should not exist, period.

Can you imagine an airline being allowed to carry passengers when every other flight crashes? Imagine if they just said that flying is hard and kept going. Yet Pornhub and Facebook would have you believe just that: that operating illegally is fine because they can't operate otherwise. That's like saying, “Give me a break, officer. Of course I couldn't drive straight. I had way too much to drink.”

The government promises new legislation to hold platforms liable in some way for the content that they promote, and this is a welcome development. But do we really need a new law to tell us that broadcasting child sexual abuse material is illegal? How would you react if CTV did that? Exactly.

In closing, our research is clear. In Canada, platforms are already liable for circulating illegal user-generated content. Why hasn't the Pornhub case led to charges? Perhaps you can invite RCMP Commissioner Lucki to answer that question. Ministers Blair and Lametti could also weigh in. I'd be curious to hear what they have to say.

Don't get me wrong. The work that you are doing to draw attention to Pornhub's atrocious behaviour is vital, but you should also be asking why this case is being tried at committee and not in court.

Here's the question: Does Pornhub's CEO belong in Hansard or in handcuffs? This is a basic question of law and order and of Canada's sovereignty over its media industries. It is an urgent question. Canadian children, young women and girls cannot wait for a new law and neither should we.

Thank you very much. I welcome your questions.

11:25 a.m.

Liberal

The Vice-Chair Liberal Brenda Shanahan

Thank you very much, Mr. Bernhard.

Mr. Clark, you may begin your presentation.

11:25 a.m.

John F. Clark President and Chief Executive Officer, National Center for Missing & Exploited Children

Good morning, Madam Chair Shanahan and honourable members of the committee.

My name is John Clark. I am the president and CEO of the U.S.-based National Center for Missing and Exploited Children, sometimes known as NCMEC.

I am honoured to be here today to provide the committee with NCMEC's perspective on the growing problem of child sexual exploitation online, our role in combatting the dangers children can encounter on the Internet, and NCMEC's experience with the website Pornhub.

Before I begin with my testimony, I'd like to clarify for the committee that NCMEC and Pornhub are not partners. We do not have a partnership with Pornhub. Pornhub has registered to voluntarily report instances of child sexual abuse material on its website to NCMEC. This does not create a partnership between NCMEC and Pornhub, as Pornhub recently claimed during some of their testimony.

NCMEC was created in 1984 by child advocates as a private, non-profit organization to help find missing children, reduce child sexual exploitation and prevent child victimization. Today I will focus on NCMEC's mission to reduce online child sexual exploitation.

NCMEC's core program to combat online child sexual exploitation is the CyberTipline. The CyberTipline is a tool for members of the public and electronic service providers, or ESPs, to report child sexual abuse material to NCMEC.

Since we created the CyberTipline over 23 years ago, the number of reports we receive has exploded. In 2019 we received 16.9 million reports to the CyberTipline. Last year we received over 21 million reports of international and domestic online child sexual abuse. We have received a total of over 84 million reports since the CyberTipline began.

A United States federal law requires a U.S.-based ESP to report apparent child sexual abuse material to NCMEC's CyberTipline. This law does not apply to ESPs that are based in other countries. However, several non-U.S. ESPs, including Pornhub, have chosen to voluntarily register with NCMEC and report child sexual abuse material to the CyberTipline.

The number of reports of child sexual exploitation received by NCMEC is heartbreaking and daunting. So, too, are the many new trends NCMEC has seen in recent years. These trends include a tremendous increase in sexual abuse videos reported to NCMEC; reports of increasingly graphic and violent sexual abuse images; and videos of infants and young children, including on-demand sexual abuse in a pay-per-view format and videos showing the rape of young children.

A broader range of online platforms are being used to access, store, trade and download child sexual abuse material, including chat, video and messaging apps; video- and photo-sharing platforms; social media and dating sites; gaming platforms; and email systems.

NCMEC is fortunate to work with certain technology companies that employ significant time and financial resources on measures to combat online child sexual abuse on their platforms. These measures include large teams of well-trained human content moderators; sophisticated technology tools to detect abusive content, report it to NCMEC and prevent it from even being posted; engagement in voluntary initiatives to combat online child sexual exploitation offered by NCMEC and other ESPs; failproof and readily accessible ways for users to report content; and immediate removal of content reported as being child sexual abuse.

NCMEC applauds the companies that adopt these measures. Some companies, however, do not adopt child protection measures at all. Others adopt half-measures as PR strategies to try to show commitment to child protection while minimizing disruption to their operations.

Too many companies operate business models that are inherently dangerous. Many of these sites also fail to adopt basic safeguards, or do so only after too many children have been exploited and abused on their sites.

In March 2020, MindGeek voluntarily registered to report child sexual abuse material, or CSAM, on several of its websites to NCMEC's CyberTipline. These websites include Pornhub, as well as RedTube, Tube8 and YouPorn. Between April 2020 and December 2020, Pornhub submitted over 13,000 reports related to CSAM through NCMEC's CyberTipline; however, Pornhub recently informed NCMEC that 9,000 of these reports were duplicative. NCMEC has not been able to verify Pornhub's claim.

After MindGeek's testimony before this committee earlier this month, MindGeek signed agreements with NCMEC to access our hash-sharing databases. These arrangements would allow MindGeek to access hashes of CSAM and sexually exploitive content that have been tagged and shared by NCMEC with other non-profits and ESPs to detect and remove content. Pornhub has not taken steps yet to access these databases or use these hashes.

Over the past year NCMEC has been contacted by several survivors asking for our help in removing sexually abusive content of themselves as children that was on Pornhub. Several of these survivors told us they had contacted Pornhub asking them to remove the content, but the content still remained up on the Pornhub website. In several of these instances NCMEC was able to contact Pornhub directly, which then resulted in the content being removed from the website.

We often focus on the tremendous number of CyberTipline reports that NCMEC receives and the huge volume of child sexual abuse material contained in these reports. However, our focus should more appropriately be on the child victims and the impact the continuous distribution of these images has on their lives. This is the true social tragedy of online child sexual exploitation.

NCMEC commends the committee for listening to the voices of the survivors in approaching these issues relating to Pornhub. By working closely with the survivors, NCMEC has learned the trauma suffered by these child victims is unique. The continued sharing and recirculation of a child's sexually abusive images and videos inflicts significant revictimization on the child. When any website, whether it's Pornhub or another site, allows a child's sexually abusive video to be uploaded, tagged with a graphic description of their abuse and downloaded and shared, it causes devastating harm to the child. It is essential for these websites to have effective means to review content before it's posted, to remove content when it's reported as child sexual exploitation, to give the benefit of the doubt to the child or the parent or lawyer when they report content as child sexual exploitation, and to block the recirculation of abusive content once it has been removed.

Child survivors and the children who have yet to be identified and recovered from their abuse depend on us to hold technology companies accountable for the content on their platforms.

I want to thank you for the opportunity to appear before this committee. This is an increasingly important topic. I look forward to answering the committee's questions regarding NCMEC's work on these issues.

11:30 a.m.

Liberal

The Vice-Chair Liberal Brenda Shanahan

Thank you very much, Mr. Clark.

We will now turn to our first round of questions.

Ms. Stubbs, you have six minutes.

11:30 a.m.

Conservative

Shannon Stubbs Conservative Lakeland, AB

Thank you, Madam Chair.

Once again, as every day on this committee, I am shocked and sick to my stomach and haunted by the amount of time this has all gone on. I thank you all for your work and your efforts and your expertise. I can't even imagine the level and the years of frustration you must have experienced. Thanks for being here today.

I hope that at the end of all of this there's actually content to combat this scourge, rather than what happens sometimes, where reports are written and then nothing occurs.

Again, I hardly even know where to start.

Ms. McDonald, you mentioned in your 2020 report on the various platforms, including Pornhub, that they include child sexual abuse material, a finding that I think is blowing many people's minds. Could you tell us whether or not there are features that actually allow the reporting specifically of child sexual abuse material on those platforms you reviewed?

11:35 a.m.

Executive Director, Canadian Centre for Child Protection

Lianna McDonald

Thank you very much. I'm going to turn this over to Lloyd Richardson, our director of technology, in one second.

The point I want to make before he gives those concrete examples is that we took on that review when we were examining the voluntary principles to address this issue, which the Five Eyes countries have now signed on to. Our agency wanted to find out how easy it was for a user or a victim to report CSAM on very well-known platforms. We were absolutely shocked at how difficult it often was even to find the term CSAM.

We noticed a number of tactics that were used to actually discourage, if you can imagine, the reporting of CSAM. We can only surmise it's because many of those companies didn't necessarily want the numbers, didn't want to show how much of this was on their platforms, because of the volume of it coming in.

Before Lloyd speaks to the examples, I also want to note the number of survivors who, as our colleague and friend John Clark mentioned, are coming to organizations such as ours right now. We have a tsunami of victims who either want to get illegal material of themselves taken down or are having a difficult time reporting it. The review was intended to shed light on a number of platforms and the inability of people to effectively and easily report.

Lloyd.

11:35 a.m.

Lloyd Richardson Director, Information Technology, Canadian Centre for Child Protection

It's important to note, when looking at these different platforms, that the ability of people to report, “Hey, this is my material. Please remove it,” is only one dynamic of the way these companies operate in this space. We need to remember that many people aren't even necessarily doing that.

When we look at reports that we send in, typically industry will use the term “trusted flagger program” and what have you. Essentially, that just means they pay more attention to child protection charities when they send a notice in. When a member of the public does it, it generally has a much lower priority. This is typical across most tech companies, including MindGeek.

Another piece that's a bit of an issue is that to actually remove something is not a one-click option. The idea that these companies allow for the upload of this material—or historically have—and that you can upload it with no sort of contact information and away you go.... The process you need to go through to actually get something removed is quite heavy. In some cases you need to provide identification. If you have your material up there, would you really want to provide your email address or contact information to a company such as MindGeek?

Certainly, some of these things have changed. MindGeek fared well compared with some of the big tech companies, but that certainly doesn't mean it's doing very well in this space.

11:35 a.m.

Executive Director, Canadian Centre for Child Protection

Lianna McDonald

There's one last point I want to make. It's really important to recognize that, when we look at all the reports, and even the information and data that organizations such as ours or NCMEC have, we only come across what we know about.

What we know—and we all understand this—is that for a young adolescent girl who has had a sexual image of herself land on sites such as this, the fear and humiliation make coming forward and approaching organizations for help incredibly difficult. What we know for sure is that our numbers vastly underestimate this type of victimization.