Good afternoon. Thank you for inviting us to appear before you today.
I am the interim director of the privacy, technology and surveillance program at the Canadian Civil Liberties Association, an organization that has been standing up for the rights, civil liberties and fundamental freedoms of people in Canada since 1964.
Protecting privacy and human rights in our tech-driven present is no small undertaking. We commend the government for trying to modernize Canada's legislative framework for the digital age, and we commend the work that this committee is doing to get this legislation right.
We also acknowledge the procedural hurdles that may make it challenging for us to speak comprehensively to Bill C-27 and its potential amendments. However, I will highlight three amendments from CCLA's written submission that we believe must be adopted to make Bill C-27 more respectful of people's rights in Canada.
First, Bill C-27 does not give fundamental rights their due and frequently puts them in second place, behind commercial interests. It has been said before, but CCLA believes it's worth emphasizing: Bill C-27 must be amended to recognize privacy as a human right, both in the CPPA and in AIDA, since privacy is something that should be respected at every point in the data life cycle.
This bill must also be amended to recognize our equality rights in the face of data discrimination and algorithmic bias, risks that grow as more and more data is gathered and fed into AI systems that make predictions or decisions of far-reaching consequence.
Privacy, data and AI legislation the world over, such as that in the European Union, already has stronger rights-based framing and protections. Canada simply needs to catch up.
Second, there are concerning gaps in Bill C-27 around the issue of sensitive information. Sensitivity is a concept that appears often throughout the CPPA; however, it is left undefined, allowing private interests to interpret its meaning as they see fit. A lot of personal information does qualify as sensitive, and although information's sensitivity often depends on context, there are special categories of information whose collection, use and disclosure carry inherent and extraordinary risks.
I want to draw your attention to one category in particular, the collection and use of which have implications for both the CPPA and AIDA, and that is biometric data.
Biometric data is perhaps the most vulnerable data we have, and its abuse can be particularly devastating to members of equity-seeking groups. Look no further than the prevalence of facial recognition technology. Facial recognition is used everywhere from law enforcement to shopping malls, and it relies on biometric information that is often collected without people's awareness and without people's consent. The Right2YourFace coalition, of which CCLA is a member, has advocated for stronger legislative safeguards with respect to facial recognition and the sensitive biometric data that fuels it. Bill C-27 must be amended not only to explicitly define sensitive information and its many categories, but also to unequivocally classify biometric information as sensitive information worthy of special care and protection.
Third and finally, we take issue with the number of consent carve-outs in proposed section 18 of the CPPA and how these can ultimately trickle down to AIDA. These carve-outs are, by and large, an affront to meaningful consent, and thus to people's right to privacy. People should be able to meaningfully consent, or decline to consent, to how private companies gather and handle their personal data. Prioritizing a company's legitimate interest in overriding consumer consent above people's privacy is simply inappropriate, as is leaving room for additional consent carve-outs to be added in regulations later on. Bill C-27 is, frankly, porous with these exemptions and exceptions, and these gaps come at the expense of people's privacy.
There is no shortage of concerns around this bill, and I haven't really spoken to the issues that CCLA has with AIDA's narrow conception of harm, its lack of transparency requirements and its dangerous exclusion of national security institutions, whose public mandates are often carried out with privately acquired artificial intelligence technologies. We address these issues in greater depth in our written submission to the committee, but I'd be happy to expand on them during questioning.
I'd also like to note that our written submission includes an AI regulation petition that received over 8,000 signatures.
Overall, Bill C-27 needs tighter provisions that prioritize people's fundamental rights. The CPPA needs to plug its gaps around information sensitivity and consent, and if AIDA is not to be scrapped outright, reset or separated from this bill, it needs fundamental rethinking.
Thank you.