Thank you.
Thank you for inviting me to share some views about Bill C-27 on behalf of the Privacy and Access Council of Canada, an independent, non-profit and non-partisan organization that is not funded by government or by industry.
Our members in public, private and non-profit sector organizations work with and assess new technologies every day, as have I through my 30-plus-year career as a privacy adviser. For that entire time, we have all heard the same promise: Technology will provide great benefits. To an extent, it has.
We’ve also been nudged to do everything digitally, and data is now the foundation of many organizations that collect, analyze and monetize data, often without the knowledge, much less the real consent, of the people the data is about.
It's understandable that there's great support for Bill C-27, except that many of the people who support it don't like it. They figure, though, that it's taken 20 years to get this much, and we can't wait another 20 for something better to replace PIPEDA, so it's better than nothing at all.
With respect, we disagree. We do not share the view that settling for the sake of change is better than standing firm for a law that, at its heart, would definitively state that Canadians have a fundamental right to privacy. The minister's concession to add that into the bill itself and not just the preamble is very welcome.
We disagree that settling for bad law is better than nothing, and Bill C-27 is bad law because it would undermine everyone's privacy, including children's—however they're defined in each jurisdiction. It also does nothing to counter the content regulation laws that would undermine encryption, would criminalize children who try to report abuse and would make it impossible for even your private communications to be confidential, whether you consent or not.
Definitions determine outcomes, and Bill C-27 starts off by defining us all as “consumers” and not as individuals with a fundamental human right to privacy. It promotes data sharing to foster commerce, jobs and taxes. It adds a new bureaucracy that would be novel among data protection authorities and would delay individuals' recourse by years. It does not require AI transparency, and it does not restrict AI use by governments; it restricts only the private sector, and even then not those private actors that have been deputized by government, which are then sheltered by our current ATIP laws.
It won't slow AI and facial recognition from infiltrating our lives further. It won't slow the monetization of our personal information by a global data broker industry already worth more than $300 billion U.S. It doesn't impose any privacy obligations on political parties. It doesn't allow for executives to be fined—only organizations that then include the fine as a line item in their financials and move on, happy that their tax liabilities have been reduced.
Bill C-27 does allow personal information to be used for research, but it places no limit on by whom or where in the world. Big pharma using your DNA to research new medicines without your consent is just fine as long as the data has been de-identified, even though de-identified data can be easily re-identified, and ever-larger AI datasets make that more likely every day.
Bill C-27 would require privacy policies to be in plain language, and that would be great if it stated the degree of granularity required, but it doesn't. It allows the same vague language and generalities we now have, yet it still doesn't allow you to control what data about you may be shared or with whom, or give you a way to be forgotten.
It lets organizations collect whatever personal information they can from you and about you, without consent, as long as they say, in their self-interested way, that it's to make sure nothing about you is a threat to their “information, system or network security”, or if they say the collection and use “outweighs any potential adverse effect” on you resulting from that collection or use. It then leaves it to you to find out about that claim and to challenge it.
We've all heard industry's threat that regulation will hamper innovation. That red herring was invalidated when radio didn't kill newspapers, TV didn't kill radio and the Internet didn't kill either one. Industry adapted and innovated, and tech companies already do that with each new product, update and patch.
Companies that have skirted the edge of privacy compliance can adapt and innovate and can create things that, at their core, have a genuine respect for privacy, human rights, and sound ethics and morality. They can, but in almost half a century since computers landed on desktops, most haven't. Politely asking organizations to consider the special interests of minors is lovely but hardly compelling, considering that, 20 years after PIPEDA came into force, barely more than half of the Canadian companies the OPC surveyed have privacy policies or have even designated someone to be responsible for privacy.
Those are basic and fundamental components of a privacy management program, and they do not take 20 years to figure out. We don't have time to wait, but we also cannot afford legislation that is inadequate before it's proclaimed and that's not aligned with Quebec's Law 25, the U.S. executive order on AI or other jurisdictions that are well ahead of Canada on this. We also can't afford something that further erodes trust in government and industry as it freely trades away the privacy rights of Canadians for the sake of commercial gain.
I will be happy to answer your questions, and we will be detailing our views in a submission to the committee. I hope you hear us.