Good afternoon. I'm Matt Hatfield. I'm the executive director of OpenMedia, a grassroots community of nearly 300,000 people in Canada who work together for an open, accessible and surveillance-free Internet.
I'm speaking to you today from the unceded territory of the Tsawout, Saanich, Cowichan and Chemainus nations.
What is there to say about Bill C-27? One part is long-overdue privacy reform, and your task is closing its remaining loopholes and getting the job of protecting our data done. One part is frankly undercooked AI regulation that you should take out of Bill C-27 altogether and take your time to get right. I can't address both at the length they deserve. I shouldn't have to, but we are where the government has forced us to be, so let's talk privacy.
There are some great changes in Bill C-27. These include real penalty powers for the OPC and the minister's promised amendments to entrench privacy as a human right. OpenMedia hopes this change to PIPEDA will clearly signal to the courts that our ownership of our personal data is more important than a corporation's interest in profiting off that data. But any regulatory regime is only as strong as its weakest link. It does no good for Canada to promise the toughest penalties in the world if they're easy to evade in most real-world cases. The weaknesses of Bill C-27 will absolutely be sought out and exploited by companies wishing to do Canadians harm.
That's why it's critical that you remove the consent exceptions in Bill C-27 and give Canadians the right to ongoing, informed and withdrawable consent for all use of our data. While you're fixing consent, you must also broaden Bill C-27's data rules to apply to every non-governmental body. This includes political parties, non-profit organizations like OpenMedia and vendors that sell data tools to any government body. No other advanced democracy tolerates a special exemption from privacy rules for the same parties that write privacy law. That's an embarrassing Canadian original, and it shouldn't survive your scrutiny of this bill.
Privacy was the happier side of my comments on Bill C-27. Let's talk AI.
I promise you that our community understands the urgency to put some rules in place on AI. Earlier this year, OpenMedia asked our community what they hoped for and were worried about with generative AI. Thousands of people weighed in and told us they believe this is a huge moment for society. Almost 80% think this is bigger than the smartphone, and one in three of us thinks it will be as big as or bigger than the Internet itself. “Bigger than the Internet” is the kind of thing you're going to want to get right, but being first to regulate is a very different thing from regulating right.
Minister Champagne is at the U.K.'s AI safety conference this week, telling media the risk is in doing too little, not too much. However, at the same conference, Rishi Sunak used his time to warn that we need to understand the impact of AI systems far more than we currently do in order to regulate them effectively, and that no regulation will succeed if countries hosting AI development do not set their standards in close parallel. That's why the participants of that conference are working through foundational questions about exactly what is at stake and in scope right now. It's an important, necessary project, and I wish them all success with it.
If they're doing that work there, why are we here? Why has this committee been tasked with jamming AIDA through within a critical but unrelated bill? Why is Canada confident that we know more than our peers about how to regulate AI—so confident that we're skipping the basic public consultation that even moderately important legislation normally receives?
I have to ask this: Is AIDA about protecting Canadians, or is it about creating a permissive environment for shady AI development? If we legislate AI first, without learning in tandem with larger and more cautious jurisdictions, we're not going to wind up with the best protections. Instead, we're positioning Canada as a kind of AI dumping ground, where business practices that are not permitted in the U.S. or the EU can be carried out here in rights-violating and even dangerous ways. I'm worried that this is not a bug, but rather the point: that our innovation ministry is fast-tracking this legislation precisely to guarantee Canada will have lower AI safety standards than our peers.
If generative AI is a hype cycle whose products will mostly underwhelm, then this is much ado about not much and there is no need to rush the legislation. However, if even a fraction of it is as powerful as its proponents claim, failing to work with experts and our global peers on best-in-class AI legislation is a tremendous mistake.
I urge you to separate AIDA from Bill C-27 and send it back for a full public consultation. If that isn't in your power, at the very least, you cannot allow Canada to become an AI dumping ground. That's why I urge you to make the AI commissioner report directly to you, our Parliament, not to ISED. A ministry whose mandate is to sponsor AI will have a strong temptation to look the other way on shady practices. The commissioner should be charged with reporting to you yearly on the performance of AIDA and on gaps that have been revealed in it. I also urge you to mandate parliamentary review of AIDA within two years of Bill C-27's taking effect, in order to decide whether it must be amended or replaced.
Since PIPEDA reform was first proposed in 2021, OpenMedia's community has sent more than 24,000 messages to our MPs demanding urgent comprehensive privacy protections. In the last few months, we've sent another 4,000 messages asking our Parliament to take the due time to get AIDA right. I hope you will hear us on both points.
Thank you, and I look forward to your questions.