Madam Chair, honourable members of the committee, thank you for inviting me to appear today.
My name is Daniel Bernhard, and I am the executive director of Friends of Canadian Broadcasting, an independent citizens' organization that promotes Canadian culture, values and sovereignty on air and online.
Last September, Friends released “Platform for Harm”, a comprehensive legal analysis showing that under long-standing Canadian common law, platforms like Pornhub and Facebook are already liable for the user-generated content they promote.
On February 5, Pornhub executives gave contemptuous and, frankly, contemptible testimony to this committee, attempting to explain away all the illegal content that they promoted to millions of Canadians and millions more around the world.
Amoral as the Pornhub executives appear to be, it would be a mistake, in my opinion, to treat their behaviour as a strictly moral failing. As Mr. Angus said that day, the activity you are studying is quite possibly criminal.
Pornhub does not dispute having disseminated vast amounts of child sexual abuse material, and Ms. McDonald just confirmed that fact. On February 5, the company's executives acknowledged that 80% of their content, some 10 million videos, was unverified, and that they transmitted and recommended large amounts of illegal content to the public.
Of course, Pornhub's leaders tried to blame everybody but themselves. Their first defence is ignorance. They claim they can't remove illegal content from the platform because until a user flags it for them, they don't know it's there. In any case, they claim that responsibility lies with the person who uploaded the content and not with them. However, the law does not support this position. Yes, uploaders are liable, but so are platforms promoting illegal content if they know about it in advance and publish it anyway or if they are made aware of it post-publication and neglect to remove it.
This brings us to their second defence, incompetence. Given the high cost of human moderation, Pornhub employs software to find offending content, yet they hold themselves blameless when their software doesn't actually work. As Mark Zuckerberg has done so many times, Pornhub promised you that they'll do better. “Will do better” isn't a defence. It's a confession.
I wish Pornhub were an outlier, but it's not. In 2018, the U.S. National Center for Missing and Exploited Children received over 18 million referrals of child sexual abuse material, according to the New York Times, most of it found on Facebook. That is more than 50,000 reports per day, and that's just what they caught. The volume of user-uploaded, platform-promoted child sexual abuse material is now so vast that the FBI must prioritize cases involving infants and toddlers and, according to the New York Times, is “essentially not able to respond to reports of anybody older than that”.
These platforms also disseminate a great deal of illegal content that is not sexual in nature, including incitement to violence, death threats, and the sale of drugs and illegal weapons. The Alliance to Counter Crime Online regularly discovers such content on Facebook, YouTube and Amazon. There is even an illegal market for human remains on Facebook.
The volume of content that these platforms handle does not excuse them from disseminating and recommending illegal material. If widespread distribution of illegal content is an unavoidable side effect of your business, then your business should not exist, period.
Can you imagine an airline being allowed to carry passengers when every other flight crashes? Imagine if they just said that flying is hard and kept going. Yet Pornhub and Facebook would have you believe just that: that operating illegally is fine because they can't operate otherwise. That's like saying, “Give me a break officer. Of course I couldn't drive straight. I had way too much to drink.”
The government has promised new legislation to hold platforms liable in some way for the content that they promote, and this is a welcome development. But do we really need a new law to tell us that broadcasting child sexual abuse material is illegal? How would you react if CTV did that? Exactly.
In closing, our research is clear. In Canada, platforms are already liable for circulating illegal user-generated content. Why hasn't the Pornhub case led to charges? Perhaps you can invite RCMP Commissioner Lucki to answer that question. Ministers Blair and Lametti could also weigh in. I'd be curious to hear what they have to say.
Don't get me wrong. The work that you are doing to draw attention to Pornhub's atrocious behaviour is vital, but you should also be asking why this case is being tried at committee and not in court.
Here's the question: Does Pornhub's CEO belong in Hansard or in handcuffs? This is a basic question of law and order and of Canada's sovereignty over its media industries. It is an urgent question. Canadian children, young women and girls cannot wait for a new law and neither should we.
Thank you very much. I welcome your questions.