Evidence of meeting #99 for Industry, Science and Technology in the 44th Parliament, 1st Session. (The original version is on Parliament’s site, as are the minutes.)

A recording is available from Parliament.

Witnesses

Barry Sookman  Senior Counsel, McCarthy Tétrault, As an Individual
Elizabeth Denham  Chief Strategy Officer, Information Accountability Foundation
Kristen Thomasen  Assistant Professor, Peter A. Allard School of Law, University of British Columbia, Women's Legal Education and Action Fund
Geoffrey Cape  Chief Executive Officer, R-Hauz, As an Individual
Andrée-Lise Méthot  Founder and managing partner, Cycle Capital, As an Individual

3:40 p.m.

Liberal

The Chair Liberal Joël Lightbound

I call this meeting to order.

Good afternoon, everyone. Welcome to meeting No. 99 of the House of Commons Standing Committee on Industry and Technology.

I appreciate the cheery atmosphere and hope it will last for the entire meeting.

Pursuant to the order of reference of Monday, April 24, 2023, the committee is resuming consideration of Bill C‑27, An Act to enact the Consumer Privacy Protection Act, the Personal Information and Data Protection Tribunal Act and the Artificial Intelligence and Data Act and to make consequential and related amendments to other Acts.

I'd like to welcome our witnesses today.

We have with us today in person Mr. Barry Sookman, senior counsel for McCarthy Tétrault. Online, we have Elizabeth Denham, chief strategy officer with the Women's Legal Education and Action Fund. We have Kristen Thomasen, assistant professor at the Peter A. Allard School of Law at UBC, who is also joining us by video conference.

Thank you to all of our witnesses. Each will have five minutes.

Before we start, I see that Mr. Perkins has a point of order.

Mr. Perkins.

3:40 p.m.

Conservative

Rick Perkins Conservative South Shore—St. Margarets, NS

Thank you, Mr. Chair.

On a quick point of order, the motion we passed to produce the draft amendments for PIPEDA—adopted in the long meetings of compromise that we had—was also made with the understanding that, before we start AIDA, the government would table the draft AIDA amendments as well. We gave an extra six weeks for the study.

I would just like an update from the government, if possible, about the tabling of the draft AIDA amendments from the minister.

3:40 p.m.

Liberal

The Chair Liberal Joël Lightbound

Thank you, Mr. Perkins. We will follow up with the department. The clerk and I will see where the department and the minister's office are on those amendments.

It is true that we are nearing the end of the privacy part of our study on Bill C-27, so it would be good to have these amendments forthcoming. I'll reach out to the department.

Thank you, Mr. Perkins.

Ms. Thomasen, I believe you're also with the Women's Legal Education and Action Fund, so my apologies for that.

We'll start, without further ado, with Mr. Sookman for five minutes.

The floor is yours.

3:40 p.m.

Barry Sookman Senior Counsel, McCarthy Tétrault, As an Individual

Thank you very much for the opportunity to appear.

I am senior counsel at McCarthy Tétrault, with a practice focused on technology, intellectual property and privacy. I'm the author of several books in the field, including an eight-volume treatise on computer, Internet and electronic commerce law. I'm here in my personal capacity.

Although my remarks will focus on AIDA, I've submitted to the clerk published articles related to the CPPA and AIDA, which propose much-needed improvements. You can see that my submission is substantial.

My remarks are going to focus on AIDA, as I've mentioned. In my view, AIDA is fundamentally flawed.

Any law that is intended to regulate an emerging transformative technology like AI should meet certain basic criteria. It should protect the public from significant risks and harms and be effective in promoting, not hindering, innovation. It must also be intelligible—that is, members of Parliament and the public must be able to know what is being regulated and how the law will apply. It must respect parliamentary sovereignty and the constitutional division of powers and employ an efficient and accountable regulatory framework. On every one of these criteria, AIDA either fails or its impact is unknowable.

AIDA has no definition of “high-impact system” and, even with the minister's letter that was delivered, no criteria and no guiding principles for how AI systems will be regulated. We don't know what the public will be protected from, how the regulations will affect innovation or what the administrative monetary penalties will be. We know that fines for violating the regulations can reach $10 million or 3% of gross revenues, but we have no idea what the regulations will require that will trigger these mammoth fines against both small and large businesses operating in this country.

In short, none of the key criteria to assess AIDA are knowable. In its current form, AIDA is unintelligible.

AIDA is, in my view, an affront to parliamentary sovereignty. AIDA sets a dangerous precedent. What will be next? Fiat by regulation for quantum computing, blockchain, the climate crisis or other threats? We have no idea.

AIDA also creates a centralized regulatory framework that leaves all regulation to ISED. This departs from the sensible, decentralized, hub-and-spoke, pro-innovation approach taken so far in the United Kingdom and the United States, which leverages existing agencies and their expertise and avoids overlapping regulation. That approach recognizes that AI systems of all types will pervade all aspects of society and that one regulatory authority alone is ill-suited to regulate them. Rather, what is needed is a framework with a central body that sets standards and policies, coordinates regulation within Canada and internationally, and provides a mechanism for addressing any gaps.

AIDA also paves the way for a bloated and unaccountable bureaucracy within ISED. ISED will make the regulations, which will be administered and enforced by the AI and data commissioner, who, unlike the Privacy Commissioner, is not accountable to Parliament. The commissioner is also not subject to any express judicial oversight, even though the commissioner has the power to shut down businesses and to levy substantial fines.

Last, a major problem with AIDA is that its lack of intelligibility and guiding principles makes it impossible to evaluate its impact on innovation. We need to recognize that Canada is a middle country. It is risky for Canada to be out in front of our major trading partners with a law that may not be interoperable with theirs and may inadvertently and unnecessarily create barriers to trade. Our AI entrepreneurs are heavily dependent on being able to access and exploit AI models like ChatGPT from the United States. We should not risk creating obstacles that inhibit adoption, the realization of AI's full potential or the continued growth of the AI ecosystem and the high-paying jobs it will create.

AI is going to be as transformative and as important as the steam engine, electricity and the microchip in prior generations. Canadian organizations in all sectors need open access to AI systems to support adoption and innovation and to be competitive in world markets. If we fail to get this right, there could be significant long-term detrimental consequences for this country.

To go back to my first point, there is nothing in AIDA to provide comfort that these risks will be avoided. While my opening remarks, Mr. Chairman, relate to AIDA, I also have concerns about the CPPA. I would be glad to answer any questions you may have about AIDA or the CPPA.

Thank you again for the opportunity to appear.

3:45 p.m.

Liberal

The Chair Liberal Joël Lightbound

Thank you very much, Mr. Sookman.

I'll now yield the floor to Elizabeth Denham with the Information Accountability Foundation. My apologies for the mistake in my presentation. There was a bit of confusion.

The floor is yours.

November 28th, 2023 / 3:45 p.m.

Elizabeth Denham Chief Strategy Officer, Information Accountability Foundation

Thank you very much.

Good afternoon, Chair.

Good afternoon, committee members and Madam Clerk.

Thank you for the invitation to appear before you today. Hopefully my input will benefit the committee's important work.

I speak from decades of experience as a privacy professional and from 15 years as an information rights regulator in four jurisdictions. My ongoing work takes place really on the international stage, but it's backed by long-standing familiarity with our own federal and provincial privacy laws.

When I became the information commissioner for the United Kingdom in 2016, that role brought me onto the EU's oversight board that administered the GDPR's implementation. It put me in direct collaboration with all EU member states, and that experience greatly expanded a view of data protection and privacy that was first cultivated in Canada at the federal level and in Alberta and British Columbia.

During my five years as the U.K. information commissioner, I also served three years as the chair of the Global Privacy Assembly. That position greatly expanded my horizons once again and enhanced my knowledge of other laws and other cultures, including the global south, the Middle East and the Asia-Pacific. To this day, the work I do spans continents.

The issues of pressing concern are largely the same everywhere: children's privacy and safety, and the regulation of artificial intelligence.

Looking first at Canada's CPPA from a global perspective, I see a big missing piece: the legislation's language, in my view, needs adjusting so that it explicitly declares privacy a fundamental right for Canadians. Its absence puts us behind the nations that lead the way in privacy and data protection.

The legislative package goes some way towards establishing expectations for AI governance, but it lacks specific and much-needed protections for children and youth. In a study I conducted through my work with an international law firm, Baker McKenzie, which surveyed 1,000 policy influencers across five jurisdictions, we found that all those surveyed came to a single point of agreement: The Internet was not created and not designed with children in mind.

All those policy influencers felt that we need to do better to protect children and youth online. Canada is a signatory to the United Nations Convention on the Rights of the Child, and I think Canada owes it to our young people to enshrine the right for them to learn and to play, to explore, to develop their agency and to be protected from harms online.

In the U.K., I oversaw the creation of a children's age-appropriate design code, which is a statutory enforceable code, and the design of that code has influenced laws, guidance and codes around the world. I'd be happy to answer more questions about that.

Additionally, I believe the legislature should go further than it does to provide the Privacy Commissioner with robust enforcement powers. I exported my career from Canada to the U.K. in large part because I wanted to gain hands-on experience administering laws with real powers and meaningful sanctions.

In Britain, privacy harms have been treated as real harms ever since the GDPR came into effect. One result was the leap in the U.K. information commissioner's fining authority, but other enforcement tools were equally powerful: stop-processing orders, orders to destroy data, streamlined search and seizure powers, mandatory audit powers and so on.

These enforcement powers were mandated by a comprehensive law that covers all types of organizations: not just digital services but businesses of any kind, charities and political parties. By comparison with the GDPR, Bill C-27 lacks broad scope. It doesn't cover charitable organizations, which are not above misusing personal data in the name of their worthy causes. Neither does it cover political parties, which leaves data and data-driven campaigns off the table for regulatory oversight.

Serving as a privacy commissioner at the federal and provincial levels in Canada exposed me to towering figures in my field. I think of Jennifer Stoddart, the former federal privacy commissioner, and David Flaherty, the former B.C. information and privacy commissioner. Their names recall a time when Canadian regulators and Canadian law were deeply respected internationally, when our laws and our regulators served the world as a bridge between the U.S. and Europe. Although the commissioners who followed, Daniel Therrien and Philippe Dufresne, have continued to contribute internationally, Canada’s laws have fallen behind every global benchmark.

I think we can recover some ground by returning to fundamental Canadian values, by remembering that our laws once led the way in establishing accountability as the cornerstone of the law. Enforceable accountability means companies taking responsibility and standing ready to demonstrate that the risks they create for others are being mitigated. That is increasingly part of reformed laws around the world, including AI regulation. The current draft of the CPPA does not have enforceable accountability. Neither does it require mandatory privacy impact assessments. That puts us alarmingly behind peer nations when it comes to governing emerging technologies like AI and quantum.

My last point is that Bill C-27 creates a tribunal that would review recommendations from the Privacy Commissioner, such as the amount of an administrative fine, and it inserts a new administrative layer between the commissioner and the courts. It limits the independence and the order-making powers of the commissioner. Many witnesses have spoken against this development, but a similar arrangement does function in the U.K.

Companies can appeal commissioner decisions, assessment notices and sanctions to what is called the first-tier tribunal. That tribunal is not there to mark the commissioner’s homework or to conduct de novo hearings. I would suggest that, if Parliament proceeds with a tribunal, it has to be structured appropriately, according to the standard of review and with independence and political neutrality baked in.

As a witness before you today, I have a strong sense of what Canada can learn from other countries and what we can bring to the world. Today, Canada needs to do more to protect its citizens’ data. Bill C-27 may bring us into the present, but it seems to me inadequate for ensuring that emerging technologies are limited, controlled and responsible.

Thank you for hearing my perspective this afternoon. I very much look forward to your questions.

3:55 p.m.

Liberal

The Chair Liberal Joël Lightbound

Thank you very much, Ms. Denham.

I'll now yield the floor to Kristen Thomasen for five minutes.

3:55 p.m.

Dr. Kristen Thomasen Assistant Professor, Peter A. Allard School of Law, University of British Columbia, Women's Legal Education and Action Fund

Thank you so much, Mr. Chair, Madam Clerk and committee members, for this opportunity to speak with you today about centring human rights and substantive equality in Canadian AI legislation.

I am a law professor at UBC. I have been researching and writing in the areas of—

3:55 p.m.

Bloc

Sébastien Lemire Bloc Abitibi—Témiscamingue, QC

Mr. Chair, I apologize for interrupting Ms. Thomasen. We aren't getting the French interpretation.

3:55 p.m.

Liberal

The Chair Liberal Joël Lightbound

Just one second, Ms. Thomasen. We'll just make sure that the interpretation is working.

3:55 p.m.

Bloc

Sébastien Lemire Bloc Abitibi—Témiscamingue, QC

It's working now. Thank you.

3:55 p.m.

Liberal

The Chair Liberal Joël Lightbound

You can resume.

3:55 p.m.

Assistant Professor, Peter A. Allard School of Law, University of British Columbia, Women's Legal Education and Action Fund

Dr. Kristen Thomasen

Thank you.

I have been researching and writing in the areas of tort law, privacy law and the regulation of automated technologies for over a decade, with a particular focus on rights and substantive equality, including recent publications on safety in AI and robotics governance in Canada and work with the B.C. Law Institute's civil liability and AI project.

I'm here today representing the Women's Legal Education and Action Fund. LEAF is a national charitable, non-profit organization that works toward ensuring that the law guarantees substantive equality for all women, girls, trans and non-binary people. I'm a member of LEAF's technology-facilitated violence advisory committee and will speak to LEAF's written submissions, which I co-authored with LEAF senior staff lawyer Rosel Kim. Our submission and my comments today focus on the proposed AI and data act.

You've heard this before, but if we're going to regulate AI in Canada, we need to get it right. LEAF agrees with previous submissions emphasizing that AI legislation must be given the special attention it deserves and should not be rushed through with privacy reform. To the extent that this committee can do so, we urge that AIDA be separated from this bill and wholly revisited. We also urge that any new law be built from a foundation of human rights and must centre substantive equality.

If the AI and data act is to proceed, it will require amendments. We examined this law with an acute awareness that many of the harms already arising from the introduction of AI into social contexts are inequitably experienced by people who are already marginalized within society, including on the grounds of gender, race and class. If the law is not cognizant of the inequitable distribution of harm and profit from AI, then despite its written neutrality, it will offer inequitable protection. The companion document to AIDA suggests that the drafters are cognizant of this.

In our written submission, we made five recommendations, accompanied by textual amendments, to allow this law to better recognize at least some of the inequalities that will be exacerbated by the growing use of AI.

The act is structured to encourage the identification and mitigation of foreseeable harm. It does not require perfection and, in fact, is likely to be limited by the extent to which harms are not considered foreseeable to the developers and operators of AI systems.

In this vein, and most urgently, the definitions of “biased output” and “harm” need to be expanded to capture more of the many ways in which AI systems can negatively impact people, for instance, through proxies for protected grounds and through harm experienced at the group or collective level.

As we note in our submission, the introduction of one AI system can cause harm and discriminatory bias in a complex and multi-faceted manner. Take the example we cite of frontline care workers at an eating disorder clinic who had voted to unionize and were then replaced by an AI chatbot system. Through an equity lens, we can see how this would cause not just personal economic harm to those who lost their jobs but also collective harm to those workers and others considering collective action.

Additionally, the system threatened harm to care-seeking clients, who were left to access important medical services through an impersonal and ill-equipped AI system. When we consider equity, we should emphasize not only the vulnerable position of care workers and patients but also the gendered, racialized and class dimensions of frontline work and of experience with eating disorders. The act as currently framed does not seem to prompt a fulsome understanding or mitigation of the different complex harms engaged here.

Furthermore, as you've already heard, the keystone concept in this legislation, “high-impact system”, is not defined. Creating only one threshold for the application of the act and setting it at a high bar undermines any regulatory flexibility that might have been intended. At this stage in the drafting, absent a rethinking of the law, we would recommend removing this threshold concept and allowing the regulations to develop in various ways to apply to different systems.

Finally, a key challenge with a risk mitigation approach, such as the one represented in this act, is that many of the harms of AI that have already materialized were unforeseeable to the developers and operators of the systems, including at the initial decision to build a given tool. For this reason, our submission also recommends a requirement for privacy and equity audits that are transparent to the public and that direct the attention of the persons responsible toward the most extensive prevention and mitigation possible.

I would emphasize, in closing, that concerns about the resources required to mitigate harm should not dissuade this committee from ensuring that the act will mitigate as much harm and discrimination as possible. We should not look to expand an AI industry that causes inequitable harm. Among many other reasons, we need a human rights approach to regulating AI for any chance of a truly flourishing industry in this country.

Industries will also suffer if workers in small enterprises are not protected against harm and discrimination by AI.

Public resistance to a new technology is often based on an understanding that a select few stand to benefit, while many stand to lose out. To the extent possible, this bill should try to mitigate some of that inequity.

Thank you for your time, and I look forward to your questions and the conversation.

4 p.m.

Liberal

The Chair Liberal Joël Lightbound

Thank you very much.

To start the discussion, I will now yield the floor to MP Williams.

4 p.m.

Conservative

Ryan Williams Conservative Bay of Quinte, ON

Thank you, Mr. Chair.

Thank you to the witnesses for being part of this important discussion, online and in person.

Mr. Sookman, Bill C-27 introduces a large number of new terms to our privacy protection regime, and then leaves them undefined and open to interpretation. For example, the bill gives extra protection to the sensitive information of a minor, but it does not define who a minor is or what sensitive information is. The list goes on: reasonable person, legitimate business interest, appropriate purposes and appropriate circumstances for data collection.

Do we need to define these terms in the legislation? If so, how would you do so?

4 p.m.

Senior Counsel, McCarthy Tétrault, As an Individual

Barry Sookman

Thank you very much for the question. It's a very good one.

One of the concerns I have about the CPPA, in particular, is that it creates ambiguous new standards and requirements with open-ended tests, such as what would be appropriate or what a reasonable person would think. This makes it very hard for any organization that seriously wants to comply to know how to do so. “Appropriate Purposes” is a good example.

As for “minor”, I think it would be useful to have a common definition that applies throughout.

In terms of “sensitive information”, I think there is already sufficient case law on what “sensitive” means. I think “sensitive” is contextual, depending on the circumstances. There is a lot of guidance from the Privacy Commissioner on that. I don't think it would hurt to have criteria, but I think the courts are well suited to figuring that one out.

4 p.m.

Conservative

Ryan Williams Conservative Bay of Quinte, ON

Could you submit specific wording to the clerk for any of these definitions? I'll give them to you again before you go on: reasonable person, legitimate business interest, appropriate purposes and appropriate circumstances for data collection. If you have that and can submit it, it will help with our amendments to the bill, sir.

Something else we believe, on the Conservative side, is that there needs to be a balance between protecting a Canadian's fundamental right to privacy and ensuring the ability of businesses to use data for good.

Do you feel Bill C-27, as written, achieves that balance?

4 p.m.

Senior Counsel, McCarthy Tétrault, As an Individual

Barry Sookman

Thank you for that follow-up question.

I have reviewed the amendments proposed by the minister, which make it clear that the joint purposes of the act are the protection of the fundamental right to privacy and the legitimate interests of business. I think that's an appropriate way to do it. One has to understand that every fundamental right, including the fundamental rights in the Charter, is subject to the Oakes test, which is a balancing exercise. The purpose clause makes it clear that we have to balance that fundamental right and the interests of business. This gives the courts the appropriate tools to solve the problem.

I'll also point out that the “Appropriate Purposes” section is an override section. If there is something an organization does that, frankly, is offside, this trumps everything. When you put together the purposes of the act and “Appropriate Purposes”, the public is adequately protected.

I won't even get into the fact that the Privacy Commissioner also has huge discretionary powers as to how to enforce it, with very limited rights of appeal. If the Privacy Commissioner believes a fundamental right has been violated and that a purpose is inappropriate, the commissioner has all the powers required to do the proper calibration to protect the public.

4:05 p.m.

Conservative

Ryan Williams Conservative Bay of Quinte, ON

Thank you, sir.

Ms. Denham, the GDPR has been criticized for imposing a high cost of compliance on small businesses.

Do you feel Bill C-27 creates a burden for small businesses when it comes to complying with the data protection and filing obligations?

4:05 p.m.

Chief Strategy Officer, Information Accountability Foundation

Elizabeth Denham

I have heard that criticism about the GDPR being burdensome for small businesses. I think organizations like the European Data Protection Supervisor and the Information Commissioner's Office in the U.K. have really focused on small businesses and innovators to help them with their compliance.

I suppose there isn't a strict line we can draw between a small business and a medium-sized business in terms of the harms they could create. I'll just remind you that Cambridge Analytica was a small business.

Therefore, I think what matters more in context is the sensitivity and the amount of data being processed, whether by two people working in their garage or by a small political consultancy. There are many larger companies that aren't processing sensitive personal data. I think the point is to be able to delineate the potential harms and risks that a business is creating for Canadians and to make sure that those risks are properly mitigated.

4:05 p.m.

Conservative

Ryan Williams Conservative Bay of Quinte, ON

Ms. Denham, there are two sections of the GDPR that Bill C-27 copied, and it ended up with a copy that, strangely, looks as though it were AI-generated: “sensitive information” and the “legitimate interest” exemption.

In the GDPR, legitimate interest is meant to be a rare exception—not used normally, as it is in Bill C-27. The GDPR has a legitimate interest analysis that must be submitted and approved. Do we need to reform Bill C-27 to better copy the GDPR?

4:05 p.m.

Chief Strategy Officer, Information Accountability Foundation

Elizabeth Denham

Yes, I believe so. Legitimate interest is one of six legal bases that a company can use to process personal information—the others being consent, contract, legal obligation, vital interests and public task. There are many legitimate bases for processing personal information. Legitimate interest is not meant to be an exception. It's one type of legal basis for collecting and processing data.

I think what has happened in Bill C-27

4:05 p.m.

Bloc

Sébastien Lemire Bloc Abitibi—Témiscamingue, QC

Mr. Chair, I have to interrupt the discussion because we're being told we have poor-quality sound.

4:05 p.m.

Liberal

The Chair Liberal Joël Lightbound

Yes, I just heard that, Mr. Lemire. The sound doesn't seem to be good enough.

Ms. Denham, is it possible to maybe move the boom of your microphone up a little bit and say a few words?

4:05 p.m.

Chief Strategy Officer, Information Accountability Foundation

Elizabeth Denham

Is that better? Can you hear me now?