Good evening. I'm Matt Hatfield, the executive director of OpenMedia, a non-partisan, grassroots community of over 250,000 people in Canada working for an open, affordable and surveillance-free Internet.
I'm joining you from the unceded territory of the Stó:lō, Tsleil-Waututh, Squamish and Musqueam nations.
It's a pretty remarkable thing to be here today to talk about the online harms bill. When Canadians first saw what this bill might look like as a white paper back in 2021, we didn't much like what we saw. OpenMedia called it a blueprint for making Canada's Internet one of the most censored and surveilled in the democratic world, and we were far from alone in being concerned.
For once, our government listened. The rush to legislate stopped. National consultations were organized across the country on how to get regulation right with a wide range of stakeholders and experts on harms and speech. The resulting part 1 of Bill C-63 is an enormous, night-and-day improvement. Simple-minded punitive approaches that would have done more harm than good are gone, and nuances and distinctions made throughout show real sophistication about how the Internet works and how different harms should be managed. Packaging part 1—the online harms act itself—with changes to the Criminal Code and Human Rights Act proposed alongside it badly obscured that good work. That's why, alongside our peers, we called for these parts to be separated and why we warmly welcome the government's decision to separate those parts out.
I'll focus here on part 1 and part 4.
OpenMedia has said for years that Canadians do not have to sacrifice our fundamental freedoms to make very meaningful improvements to our online safety. The refocused Bill C-63 is the proof. Instead of trying to solve everything unpleasant on the Internet at once, Bill C-63 focuses on seven types of already-illegal content in Canada, and treats the worst and most easily identifiable content—child abuse material and adult material shared without consent—most severely. That's the right call. Instead of criminalizing platforms for the ugly actions of a small number of users, which would predictably make them wildly overcorrect to surveil and censor all of us, Bill C-63 asks them to write their own assessments of the risks posed by these seven types of content and document how they try to mitigate that risk. That's the right call again. It will put the vast engineering talent of platforms to work for the Canadian public, thinking creatively about ways to reduce these specific illegal harms. It will also make them explain what they are doing as they do it, so we can assess whether it makes sense and correct it if it does not.
However, I want to be very clear: It is not the time to pass Bill C-63 and call it quits. It's just the opposite. Because the parts that are now being separated raised so many concerns, not nearly enough attention has been paid to refining part 1. I know you'll be hearing from a range of legal and policy experts about concerns they have with some of the part 1 wording, along with recommended fixes. I hope you will listen very carefully to all of them and adopt many of the fixes they suggest to you.
This is not the time to be a rubber stamp. The new digital safety commission is granted extraordinary power to review, guide and make binding decisions on how platforms moderate the public expression of Canadians in the online spaces we use the most. That's appropriate if, and only if, you make sure it carefully considers and minimizes impacts on our freedom of expression and privacy. It isn't good enough for the commission to think about our rights only in its explicit decisions. A badly designed platform safety plan could reduce an online harm but have a wildly disproportionate impact on our privacy or freedom of expression. You need to make sure platforms and the regulator make written assessments of the impact of their plans on our rights and ensure that any impact is small and proportionate to the harm mitigated. Bill C-63's protections of private, encrypted communication, and against platforms surveilling their users, need to be strengthened further and made airtight.
OpenMedia has a unique role in this discussion because we are both a rights-defending community that will always stand up for our fundamental freedoms and a community of consumer advocates who fight for common-sense regulation that empowers us and improves our daily lives. If you do your work at this committee, you can make Bill C-63 a win on both counts. Since 2021, members of our community have sent nearly 22,000 messages to government asking you to get online harms right. Taking your time to study Bill C-63 carefully and make appropriate fixes before passing it would fulfill years of our activism and make our Internet a better, healthier place for many years to come.
Thank you, and I look forward to your questions.