Evidence of meeting #21 for Access to Information, Privacy and Ethics in the 43rd Parliament, 2nd Session. (The original version is on Parliament's site, as are the minutes.)

A recording is available from Parliament.

Also speaking

Clerk of the Committee  Ms. Miriam Burke
Lianna McDonald  Executive Director, Canadian Centre for Child Protection
Daniel Bernhard  Executive Director, Friends of Canadian Broadcasting
John F. Clark  President and Chief Executive Officer, National Center for Missing & Exploited Children
Lloyd Richardson  Director, Information Technology, Canadian Centre for Child Protection
Commissioner Stephen White  Deputy Commissioner, Specialized Policing Services, Royal Canadian Mounted Police
Normand Wong  Senior Counsel, Criminal Law Policy Section, Department of Justice
Superintendent Marie-Claude Arsenault  Royal Canadian Mounted Police

11:35 a.m.

Conservative

Shannon Stubbs Conservative Lakeland, AB

This is very disturbing.

I think I only have a minute left, so if we run out of time, I hope we'll get at this for all of our witnesses.

I'm hoping you can help us understand better one matter. What I understand from testimony is that when these websites have to take down child sexual abuse material, they'll put up a notice that says, “Removed because of copyright”, instead of something such as “Taken down because of a report to NCMEC”. Could you comment on that?

Then frankly, to Mr. Bernhard's point, which I was “out loud on mute” supporting, what boggles my mind is that at least under Canadian law—and I'm glad that these things are being reported to NCMEC—it seems to me very clear that they have a responsibility to be reporting child sexual abuse material to the police.

I wonder whether any or all of you have comments on those two points.

11:40 a.m.

Liberal

The Vice-Chair Liberal Brenda Shanahan

Keep to very short answers, please.

11:40 a.m.

Conservative

Shannon Stubbs Conservative Lakeland, AB

Oh, shoot. They can get at it later, too, further in the hour. I think these are the crucial questions for our committee.

11:40 a.m.

President and Chief Executive Officer, National Center for Missing & Exploited Children

John F. Clark

I'll jump in really quickly.

At the National Center, we work with a lot of technology companies. Of course, we have encouraged their reporting, but of the thousands of Internet service providers, only about 170 are actively reporting. Of those 170, only about 20, maybe fewer, are reporting in any significant volume.

Of course, we'd like to get that part of the whole ecosystem working well first and then obviously report it to the police, because as has been noted, many, many of these instances are criminal activity. Make no mistake about it. It's criminal activity. Not to mention—

11:40 a.m.

Liberal

The Vice-Chair Liberal Brenda Shanahan

Thank you very much, Mr. Clark. You can no doubt continue in a subsequent question.

Mr. Sorbara, please, for six minutes.

February 22nd, 2021 / 11:40 a.m.

Liberal

Francesco Sorbara Liberal Vaughan—Woodbridge, ON

Thank you, Chair, and thank you to everybody for their testimony this morning. It was very enlightening. Thank you for all the work you do in this very important area in helping kids in very bad situations.

First, I just want to go over two numbers that were reported by the Canadian Centre for Child Protection. I can go back to the blues, but I just want to get this out there again. You used the number in “billions” of images. Is that correct?

11:40 a.m.

Director, Information Technology, Canadian Centre for Child Protection

Lloyd Richardson

Yes. That's images scanned. If we're talking about needles in a haystack, that's the whole haystack in terms of images that we've detected. We've sent 6.7 million notices to providers on images. The 126 billion is not all child sexual abuse material. That's just the swath of material that we've scanned.

11:40 a.m.

Liberal

Francesco Sorbara Liberal Vaughan—Woodbridge, ON

Okay. Thank you.

Mr. Bernhard, I skimmed the report, “Platform for harm: Internet intermediary liability in Canadian law”. I also saw that you had produced an opinion piece in the Toronto Star on Thursday, December 10. Thank you for all the work you're doing in holding to account providers of these images when they know they should not be up there, if I can just put it in very plain language.

I wish to ask a question, and I believe this is under your domain. In a September 2020 report, Friends concludes that existing Canadian laws should be sufficient to hold platforms such as Pornhub accountable for illegal content that appears on their platform, despite the fact that the content is user generated and is not created or uploaded by Pornhub.

First, can you explain your position in more detail? Second, do you think MindGeek's algorithms provide the company sufficient knowledge of non-consensual content to give it “knowing involvement” in their publication and dissemination?

11:40 a.m.

Executive Director, Friends of Canadian Broadcasting

Daniel Bernhard

Thank you, Mr. Sorbara. You've touched on a key point, which is the difference between the law in Canada and the law in the United States.

In the United States, there has been a lot of talk about the Communications Decency Act and section 230 of that act, which holds platforms not to be liable for user-generated content that they are distributing.

Can you hear me still?

11:40 a.m.

Liberal

Francesco Sorbara Liberal Vaughan—Woodbridge, ON

Yes.

11:40 a.m.

Liberal

The Vice-Chair Liberal Brenda Shanahan

Sorry. No. Now you're frozen.

11:40 a.m.

Liberal

Francesco Sorbara Liberal Vaughan—Woodbridge, ON

Chair, can I move on, then?

11:40 a.m.

Liberal

The Vice-Chair Liberal Brenda Shanahan

Yes, please.

11:40 a.m.

Liberal

Francesco Sorbara Liberal Vaughan—Woodbridge, ON

To the National Center for Missing and Exploited Children, welcome. Mr. Clark, thank you for availing yourself.

I'm very curious. What does a partnership with NCMEC entail?

11:40 a.m.

President and Chief Executive Officer, National Center for Missing & Exploited Children

John F. Clark

That's a good question.

We work with some of the Internet service providers to, first and foremost, make sure they have good content moderation: that there are actual human beings looking for illegal content, CSAM, and seeking to take it down immediately when it's first discovered. We try to work closely with companies that are willing and like-minded in removing apparently criminal material from their sites, period. That's what we look for when we talk about, in air quotes, "partnership": a collaborative model of working with the Internet service providers.

11:45 a.m.

Liberal

Francesco Sorbara Liberal Vaughan—Woodbridge, ON

We'll get Mr. Bernhard back on in my last minute or two of time, but staying with you, Mr. Clark, you mention Internet service providers. How many adult platform providers would you have a partnership with currently?

11:45 a.m.

President and Chief Executive Officer, National Center for Missing & Exploited Children

John F. Clark

We don't technically have a partnership with Pornhub, as I said, although they have begun to report voluntarily, but I believe they are the only one. The others we are working closely with are just large tech companies generally.

11:45 a.m.

Liberal

Francesco Sorbara Liberal Vaughan—Woodbridge, ON

That is a very important distinction to make: having a partnership with an Internet service provider that allows entities to put content onto its platform versus having a partnership with what would be called an adult platform. Understood?

11:45 a.m.

President and Chief Executive Officer, National Center for Missing & Exploited Children

John F. Clark

Understood.

11:45 a.m.

Liberal

Francesco Sorbara Liberal Vaughan—Woodbridge, ON

Okay.

Mr. Bernhard, welcome back. Perhaps you could continue with your answer.

11:45 a.m.

Executive Director, Friends of Canadian Broadcasting

Daniel Bernhard

I'm sorry. The Internet providers are conspiring against me here.

We're talking about a difference between American law, which holds companies that deal in user-generated content indemnified from that content, and Canadian law, which does not.

In Canada, as our report documents, a company becomes liable for something that somebody else said or did under two circumstances: first, if they know about it in advance and publish it anyway, and second, if they are notified about it after the fact and fail to take action.

That's the first thing. In the case of Pornhub, both appear to be true. They are notified and take a long time to remove content. Also, to address your point about algorithms and recommendation of content, we believe they have a pretty sophisticated understanding of what this content is. If a relatively small not-for-profit organization in Manitoba is able to deploy technology that can find this material in large numbers, surely a company the size of MindGeek can do the same thing.

There is a difference between hosting content and actively recommending it to people. In that sense, the platforms are arguably more liable and more responsible for the offending content than the users themselves.

11:45 a.m.

Liberal

Francesco Sorbara Liberal Vaughan—Woodbridge, ON

Chair, please interrupt me when I'm beyond my time.

Daniel, I would agree with you, because the algorithms, as we all know with regard especially to social media platforms, are very powerful. I think I have 4,000 friends on Facebook, but I only see content from 25 of them on a daily basis, when I check. We know that the AI technology that is being used for what's being recommended that people should see and be viewing is very powerful. I'm sure that this adult platform we're talking about utilizes that technology.

11:45 a.m.

Executive Director, Friends of Canadian Broadcasting

Daniel Bernhard

Yes, I think you're entirely right.

I'm glad you mentioned Facebook, because to some extent I think the sexual, shocking nature of this illegal activity can cause us to focus too narrowly. The general question is this: are these platforms responsible for illegal activity that they promote, period? The child sexual abuse material is part of it. It's terrible. I have a 10-week-old daughter. This means a lot to me.

We know, however, that there is other illegal activity, including incitements to violence, the sale of drugs and arms, and so—

11:45 a.m.

Liberal

The Vice-Chair Liberal Brenda Shanahan

I'm sorry, but I will have to stop you there, Mr. Bernhard.

11:45 a.m.

Executive Director, Friends of Canadian Broadcasting