Thank you, Mr. Chair.
Ladies and gentlemen, members of the committee, I am pleased to be back to assist the committee in its study of Bill C‑27, the Digital Charter Implementation Act, 2022, which would enact the Consumer Privacy Protection Act, the Personal Information and Data Protection Tribunal Act, and the Artificial Intelligence and Data Act.
When I appeared before the committee three weeks ago, I delivered opening remarks about the bill and presented my office's 15 key recommendations to improve and strengthen it. Today, I want to briefly highlight and respond to the letter the Minister of Innovation, Science and Industry sent to the committee on October 3, 2023, and to answer any questions that you may still have.
I welcome the minister's stated position on the amendments being developed with respect to the proposed CPPA, in which he seems prepared to agree with four of my office's 15 key recommendations, namely by explicitly recognizing privacy as a fundamental right; by strengthening the protection of children's privacy; by providing more flexibility for my office to use compliance agreements, including through the use of financial penalties; and by allowing greater co-operation between regulators.
I also note and commend his statement of openness to further amendments following the study by this committee.
I would like to take this opportunity to highlight other ways in which the bill should be strengthened and improved in order to better protect the fundamental privacy rights of Canadians, which are addressed in our remaining recommendations to the committee.
I will briefly highlight five of our recommendations that stand out in particular in light of the minister's letter, and I would be happy to speak to all of our recommendations in the discussion that will follow.
First, privacy impact assessments, PIAs, should be legally required for high-risk activities, including AI and generative AI. This is critically important in the case of AI systems that could be making decisions with major impacts on Canadians, including whether they get a job offer, qualify for a loan, pay a higher insurance premium or are flagged for suspicious or unlawful behaviour.
While AIDA would require those responsible for AI systems to assess and mitigate the risks of harm posed by high-impact AI systems, the definition of harm in the bill does not include privacy. This means that there would be proactive risk assessments for non-privacy harms but not for privacy harms. This is a significant gap, given that in a recent OECD report on generative AI, threats to privacy were among the top three generative AI risks recognized by G7 members.
In my view, responsible AI must start with strong privacy protections, and this includes privacy impact assessments.
Second, Bill C‑27 does not allow for fines for violations of the appropriate purposes provisions, which require organizations to collect, use and disclose personal information only in a manner and for purposes that a reasonable person would consider appropriate in the circumstances. This approach would leave the federal private sector privacy law an outlier when compared with the European Union and the Quebec regime, both of which allow the imposition of fines for such important privacy violations.
If the goal is, as the minister has indicated, to have a privacy law that includes tangible and effective tools to encourage compliance and to respond to major violations of the law in appropriate circumstances—an objective I agree with—I think this shortcoming surely needs to be addressed for such a critical provision.
Third, there remains the proposed addition of a new tribunal, which would become a fourth layer of review in the complaints process. As indicated in our submission to the committee, this would make the process longer and more expensive than the common models used internationally and in the provinces.
This is why we've recommended two options to resolve this problem. The first would be to have decisions of the proposed tribunal reviewed directly by the Federal Court of Appeal, and the second would be to provide my office with the authority to issue fines and to have our decisions reviewable by the Federal Court without the need to create a new tribunal, which is the model that we most commonly see in other comparable jurisdictions.
Fourth, the bill as drafted continues to allow the government to make exceptions to the law by way of regulations, without the need to demonstrate that those exceptions are necessary. This needs to be corrected as it provides too much uncertainty for industry and for Canadians, and it could significantly reduce privacy protections without parliamentary oversight.
Fifth, and finally, the bill would limit the requirement for organizations to explain, upon request, the predictions, recommendations or decisions made about Canadians using AI to only those situations that have a significant impact on an individual. At this crucial time in the development of AI, and given the privacy risks that have been recognized by the G7 and around the world, I would recommend more transparency in this area rather than less.
With that, I would be happy to answer any questions that you may have.