Evidence of meeting #94 for Industry, Science and Technology in the 44th Parliament, 1st Session. (The original version is on Parliament’s site, as are the minutes.)

A recording is available from Parliament.

Also speaking

Daniel Konikoff  Interim Director of the Privacy, Technology & Surveillance program, Canadian Civil Liberties Association
Tim McSorley  National Coordinator, International Civil Liberties Monitoring Group
Matthew Hatfield  Executive Director, OpenMedia
Sharon Polsky  President, Privacy and Access Council of Canada
John Lawford  Executive Director and General Counsel, Public Interest Advocacy Centre
Yuka Sai  Staff Lawyer, Public Interest Advocacy Centre
Sam Andrey  Managing Director, The Dais, Toronto Metropolitan University

3:30 p.m.

Liberal

The Chair Liberal Joël Lightbound

Good afternoon, everyone. I call this meeting to order.

Welcome to meeting no. 94 of the House of Commons Standing Committee on Industry and Technology.

Today's meeting is taking place in a hybrid format, pursuant to the Standing Orders.

Pursuant to the order of reference of Monday, April 24, 2023, the committee is resuming consideration of Bill C‑27, An Act to enact the Consumer Privacy Protection Act, the Personal Information and Data Protection Tribunal Act and the Artificial Intelligence and Data Act and to make consequential and related amendments to other Acts.

I'd like to welcome our witnesses today: Daniel Konikoff, interim director of the Privacy, Technology & Surveillance program at the Canadian Civil Liberties Association; Tim McSorley, national coordinator at the International Civil Liberties Monitoring Group; Matthew Hatfield, executive director of OpenMedia; Sharon Polsky, president of the Privacy and Access Council of Canada; John Lawford, executive director and general counsel at the Public Interest Advocacy Centre, who is joined by staff lawyer Yuka Sai; and Sam Andrey, managing director of The Dais at Toronto Metropolitan University.

Thank you for being here today.

I'm pleased that we are able to start on time.

Without further ado, Mr. Konikoff from the Canadian Civil Liberties Association, you have the floor for five minutes.

3:30 p.m.

Daniel Konikoff Interim Director of the Privacy, Technology & Surveillance program, Canadian Civil Liberties Association

Good afternoon. Thank you for inviting us to appear before you today.

I am the interim director of the privacy, technology and surveillance program at the Canadian Civil Liberties Association, an organization that has been standing up for the rights, civil liberties and fundamental freedoms of people in Canada since 1964.

Protecting privacy and human rights in our tech-driven present is no small undertaking. We commend the government for trying to modernize Canada's legislative framework for the digital age, and we commend the work that this committee is doing to get this legislation right.

We also acknowledge the procedural hurdles that may make it challenging for us to speak completely to Bill C-27 and its potential amendments. However, I will highlight three amendments from CCLA's written submission that we believe must be adopted to make Bill C-27 more respectful of people's rights in Canada.

First, Bill C-27 does not give fundamental rights their due and frequently puts them in second place, behind commercial interests. It has been said before, but CCLA believes that it's worth emphasizing that Bill C-27 must be amended to recognize privacy as a human right, both in the CPPA and in AIDA, since privacy is something that should be respected at all points throughout data's life cycle.

This bill must also be amended to recognize our equality rights in the face of data discrimination and algorithmic bias, risks that grow exponentially as more and more data is gathered and fed into AI systems that make predictions or decisions of resounding consequence.

Privacy, data and AI legislation the world over, such as that in the European Union, already has stronger rights-based framing and protections. Canada simply needs to catch up.

Second, there are concerning gaps in Bill C-27 around the issue of sensitive information. Sensitivity is a concept that appears often throughout the CPPA; however, it is left undefined, allowing private interests to interpret its meaning as they see fit. A lot of personal information does qualify as sensitive, and although information's sensitivity often depends on context, there are special categories of information whose collection, use and disclosure carry inherent and extraordinary risks.

I want to draw your attention to one category in particular, the collection and use of which have implications for both the CPPA and AIDA, and that is biometric data.

Biometric data is perhaps the most vulnerable data we have, and its abuse can be particularly devastating to members of equity-seeking groups. Look no further than the prevalence of facial recognition technology. Facial recognition is used everywhere from law enforcement to shopping malls, and it relies on biometric information that is often collected without people's awareness and without their consent. The Right2YourFace coalition, of which the CCLA is a member, has advocated for stronger legislative safeguards with respect to facial recognition and the sensitive biometric data that fuels it. Bill C-27 must be amended not only to explicitly define sensitive information and its many categories but also to unequivocally define biometric information as sensitive information worthy of special care and protection.

Third and finally, we take issue with the number of consent carve-outs in proposed section 18 of the CPPA, and how these can ultimately trickle down to AIDA. These carve-outs are, by and large, an affront to meaningful consent, and so to people's right to privacy. People should be able to meaningfully consent or decline to consent to how private companies gather and handle their personal data. Prioritizing a company's legitimate interest to violate consumer consent over people's privacy is simply inappropriate, as is leaving room for more consent carve-outs to be added in regulations later on. Bill C-27 is, frankly, porous with these exemptions and exceptions, and these gaps come at the expense of people's privacy.

There is no shortage of concerns around this bill, and I haven't really spoken to the issues that CCLA has with AIDA's narrow conception of harm, its lack of transparency requirements and its dangerous exclusions of national security institutions whose public mandates are often performed with privately acquired artificial intelligence technologies. We address these issues in greater depth in our written submission to the committee, but I'd be happy to expand on them in questioning.

I'd also like to direct the committee's attention to our written submission, which flags some of these concerns and includes an AI regulation petition that received over 8,000 signatures.

Bill C-27 overall needs tighter provisions to prioritize people's fundamental rights. The CPPA needs to plug its gaps around information sensitivity and consent, and if AIDA is not to be scrapped outright, reset or just separated from this bill, it needs fundamental rethinking.

Thank you.

3:35 p.m.

Liberal

The Chair Liberal Joël Lightbound

Thank you very much.

I'll now yield the floor to Mr. McSorley from the International Civil Liberties Monitoring Group.

3:35 p.m.

Tim McSorley National Coordinator, International Civil Liberties Monitoring Group

Thank you, Chair, and thank you for the invitation to share the perspectives of the ICLMG today regarding Bill C-27.

We're a Canadian coalition that works to defend civil liberties from the impact of national security and anti-terrorism laws. Our concerns regarding Bill C-27 are grounded in this mandate.

While we support efforts to modernize Canadian privacy laws and establish AI regulations, the bill unfortunately contains multiple exemptions for national security purposes that are unacceptable and undermine Bill C-27's stated goal of protecting the rights and privacy of people in Canada.

We have submitted a written brief to the committee with 10 recommendations and accompanying amendments. I'd be happy to speak in more detail about any of these during the question period, but for now, I'd like to make three specific points.

First, in regard to the CPPA, we are opposed to proposed sections 47 and 48 of the act, which create exceptions to consent by allowing an organization to disclose, collect or use personal information if it simply “suspects that the information relates to national security, the defence of Canada or the conduct of international affairs”. This is an incredibly low threshold for circumventing consent.

Proposed section 48 is particularly egregious. It allows an organization, on “its own initiative”, to collect, use or disclose an individual's personal information if it simply suspects that the information relates to these three areas. The concern does not even need to be connected to a suspected threat. Again, it only needs to relate, and that's not defined in the bill.

Not only are these sections very broad, they're also unnecessary. Other sections of the law would allow for more targeted disclosure to government departments, institutions and law enforcement agencies. For example, proposed section 45 allows an organization to proactively divulge information if it “has reasonable grounds to believe”—a much higher threshold—“that the information relates to a contravention” of a law that has been, is being or will be committed. We contrast that “reasonable grounds to believe” threshold with simply suspecting that it “relates”.

In that regard, we find proposed sections 47 and 48 unnecessary and overly broad. We propose, then, that proposed sections 47 and 48 simply be removed from the CPPA. Barring that, we've proposed specific language in our brief that would help to establish a more robust threshold for disclosing personal information.

Second, we're deeply concerned with the artificial intelligence and data act overall. In line with other witnesses, we believe it is a deeply flawed piece of legislation that must be withdrawn in favour of a more considered and appropriate framework. We have outlined these concerns in our brief, as well as in a joint letter shared with the committee and the minister, signed by 45 organizations and experts in the fields of AI, civil liberties and human rights.

AIDA was developed without appropriate public consultation or debate. It fails to integrate appropriate human rights protections. It lacks fundamental definitions. Egregiously, it would create an AI and data commissioner operating at the discretion of the Minister of Innovation, resulting in a commissioner with no independence to enforce the provisions of AIDA, as weak as they may be.

Finally, I'd like to address an unacceptable exception for national security that is found in AIDA as well.

Canadian national security agencies have been open regarding their interest in and use of artificial intelligence tools for a wide range of purposes, including facial recognition, surveillance, border security and data analytics. However, no clear framework has been established to regulate the development or use of these tools in order to prevent serious harm.

AIDA should present an opportunity to address this gap. Instead, it does the opposite in proposed subsection 3(2), where it explicitly excludes the application of the act to:

a product, service or activity that is under the direction or control of

(a) the Minister of National Defence;

(b) the Director of the Canadian Security Intelligence Service;

(c) the Chief of the Communications Security Establishment; or

(d) any other person who is responsible for a federal or provincial department or agency and who is prescribed by regulation.

This means that any AI system developed by a private sector actor that falls under the direction or control of this open-ended list of national security agencies would face absolutely no independent regulation or oversight.

It is inconceivable how such a broad exemption can be justified. Under such a rule, companies could create tools for our national security agencies without the need to undergo any assessment or mitigation for harm or bias, creating a human rights and civil liberties black hole. What if such technology were leaked, stolen or even sold to state or private entities outside of Canada's jurisdiction? All AI systems developed by the private sector must face regulation, regardless of their use by national security agencies.

Our brief includes specific examples of the harms that this lack of regulation can cause. I'd be happy to discuss these more with the committee. Overall, if AIDA does go ahead, we believe that proposed subsection 3(2) should simply be removed.

Thank you.

3:40 p.m.

Liberal

The Chair Liberal Joël Lightbound

Thank you very much, Mr. McSorley.

I'll now turn to Mr. Hatfield from OpenMedia, who is joining us by video conference.

3:40 p.m.

Matthew Hatfield Executive Director, OpenMedia

Good afternoon. I'm Matt Hatfield. I'm the executive director of OpenMedia, a grassroots community of nearly 300,000 people in Canada who work together for an open, accessible and surveillance-free Internet.

I'm speaking to you today from the unceded territory of the Tsawout, Saanich, Cowichan and Chemainus nations.

What is there to say about Bill C-27? One part is long-overdue privacy reform, and your task is closing its remaining loopholes and getting the job of protecting our data done. One part is frankly undercooked AI regulation that you should take out of Bill C-27 altogether and take your time to get right. I can't address both at the length they deserve. I shouldn't have to, but we are where the government has forced us to be, so let's talk privacy.

There are some great changes in Bill C-27. These include real penalty powers for the OPC and the minister's promised amendments to entrench privacy as a human right. OpenMedia hopes this change to PIPEDA will clearly signal to the courts that our ownership of our personal data is more important than a corporation's interest in profiting off that data, but any regulatory regime is only as strong as its weakest link. It does no good for Canada to promise the toughest penalties in the world if they're easy to evade in most real-world cases. The weaknesses of Bill C-27 will absolutely be searched for and attacked by companies wishing to do Canadians harm.

That's why it's critical that you remove the consent exceptions in Bill C-27 and give Canadians the right to ongoing, informed and withdrawable consent for all use of our data. While you're fixing consent, you must also broaden Bill C-27's data rules to apply to every non-governmental body. This includes political parties, non-profit organizations like OpenMedia and vendors that sell data tools to any government body. No other advanced democracy tolerates a special exception to respecting privacy rules for the same parties that write privacy law. That's an embarrassing Canada original, and it shouldn't survive your scrutiny of this bill.

Privacy was the happier side of my comments on Bill C-27. Let's talk AI.

I promise you that our community understands the urgency of putting some rules in place on AI. Earlier this year, OpenMedia asked our community what they hoped for and were worried about with generative AI. Thousands of people weighed in and told us they believe this is a huge moment for society. Almost 80% think this is bigger than the smartphone, and one in three of us thinks it will be as big as or bigger than the Internet itself. “Bigger than the Internet” is the kind of thing you're going to want to get right, but being first to regulate is a very different thing from regulating right.

Minister Champagne is at the U.K.'s AI safety conference this week, telling media the risk is in doing too little, not too much. However, at the same conference, Rishi Sunak used his time to warn that we need to understand the impact of AI systems far more than we currently do, in order to regulate them effectively, and that no regulation will succeed if countries hosting AI developments do not develop their standards in close parallel. That's why the participants of that conference are working through foundational questions about exactly what is at stake and in scope right now. It's an important, necessary project, and I wish them all success with it.

If they're doing that work there, why are we here? Why has this committee been tasked with jamming AIDA through within a critical but unrelated bill? Why is Canada confident that we know more than our peers about how to regulate AI—so confident that we're skipping the basic public consultation that even moderately important legislation normally receives?

I have to ask this: Is AIDA about protecting Canadians, or is it about creating a permissive environment for shady AI development? If we legislate AI first, without learning in tandem with larger and more cautious jurisdictions, we're not going to wind up with the best protections. Instead, we're positioning Canada as a kind of AI dumping ground, where business practices that are not permitted in the U.S. or the EU can be produced here in rights-violating and even dangerous ways. I'm worried that this is not a bug, but rather the point—that our innovation ministry is fast-tracking this legislation precisely to guarantee Canada will have lower AI safety standards than our peers.

If generative AI is a hype cycle whose products will mostly underwhelm, then this is much ado about not much and there is no need to rush the legislation. However, if even a fraction of it is as powerful as its proponents claim, failing to work with experts and our global peers on best-in-class AI legislation is a tremendous mistake.

I urge you to separate AIDA from Bill C-27 and send it back for a full public consultation. If that isn't in your power, at the very least, you cannot allow Canada to become an AI dumping ground. That's why I urge you to make the AI commissioner report directly to you, our Parliament, not to ISED. A ministry whose mandate is to sponsor AI will have a strong temptation to look the other way on shady practices. The commissioner should be charged with reporting to you yearly on the performance of AIDA and on gaps that have been revealed in it. I also urge you to mandate parliamentary review of AIDA within two years of Bill C-27's taking effect, in order to decide whether it must be amended or replaced.

Since PIPEDA reform was first proposed in 2021, OpenMedia's community has sent more than 24,000 messages to our MPs demanding urgent comprehensive privacy protections. In the last few months, we've sent another 4,000 messages asking our Parliament to take the due time to get AIDA right. I hope you will hear us on both points.

Thank you, and I look forward to your questions.

3:45 p.m.

Liberal

The Chair Liberal Joël Lightbound

Thank you.

We'll now hear from Ms. Polsky, from the Privacy and Access Council of Canada.

3:45 p.m.

Sharon Polsky President, Privacy and Access Council of Canada

Thank you.

Thank you for inviting me to share some views about Bill C-27 on behalf of the Privacy and Access Council of Canada, an independent, non-profit and non-partisan organization that is not funded by government or by industry.

Our members in public, private and non-profit sector organizations work with and assess new technologies every day, as have I through my 30-plus-year career as a privacy adviser. For that entire time, we have all heard the same promise: Technology will provide great benefits. To an extent, it has.

We’ve also been nudged to do everything digitally, and data is now the foundation of many organizations that collect, analyze and monetize it, often without the knowledge, much less the real consent, of the people the data is about.

It's understandable that there's great support for Bill C-27, except that many of the people who support it don't like it. They figure, though, that it's taken 20 years to get this much, and we can't wait another 20 for something better to replace PIPEDA, so it's better than nothing at all.

With respect, we disagree. We do not share the view that settling for the sake of change is better than standing firm for a law that, at its heart, would definitively state that Canadians have a fundamental right to privacy. The minister's concession to add that into the bill itself and not just the preamble is very welcome.

We disagree that settling for bad law is better than nothing, and Bill C-27 is bad law because it would undermine everyone's privacy, including children's—however they're defined in each jurisdiction. It also does nothing to counter the content regulation laws that would undermine encryption, would criminalize children who try to report abuse and would make it impossible for even your private communications to be confidential, whether you consent or not.

Definition determines outcomes, and Bill C-27 starts off by defining us all as “consumers” and not as individuals with a fundamental human right to privacy. It promotes data sharing to foster commerce, jobs and taxes. It adds a new bureaucracy that would be novel among data protection authorities and would delay individuals' recourse by years. It does not require AI transparency or restrict AI use by governments, only by the private sector that has not yet been deputized by government, which then gets sheltered by our current ATIP laws.

It won't slow AI and facial recognition from infiltrating our lives further. It won't slow the monetization of our personal information by a global data broker industry already worth more than $300 billion U.S. It doesn't impose any privacy obligations on political parties. It doesn't allow for executives to be fined—only organizations that then include the fine as a line item in their financials and move on, happy that their tax liabilities have been reduced.

Bill C-27 does allow personal information to be used for research, but by whom or where in the world isn't limited. Big pharma using your DNA to research new medicines without your consent is just fine if the data has been de-identified, although it can be easily reidentified, and larger and larger AI datasets make that more and more likely every day.

Bill C-27 would require privacy policies to be in plain language, and that would be great if it stated the degree of granularity required, but it doesn't. It allows the same vague language and generalities we now have, yet it still doesn't allow you to control what data about you may be shared or with whom, or give you a way to be forgotten.

It lets organizations collect whatever personal information they can from you and about you, without consent, as long as they say, in their self-interested way, that it's to make sure nothing about you is a threat to their “information, system or network security”, or if they say the collection and use “outweighs any potential adverse effect” on you resulting from that collection or use, and leaves it to you to find out about and to challenge that claim.

We've all heard industry's threat that regulation will hamper innovation. That red herring was invalidated when radio didn't kill newspapers, TV didn't kill radio and the Internet didn't kill either one. Industry adapted and innovated, and tech companies already do that with each new product, update and patch.

Companies that have skirted the edge of privacy compliance can adapt and innovate and can create things that, at their core, have a genuine respect for privacy, human rights, and sound ethics and morality. They can, but in almost half a century since computers landed on desktops, most haven't. Politely asking organizations to consider the special interests of minors is lovely but hardly compelling, considering that, 20 years after PIPEDA came into force, barely more than half of the Canadian companies the OPC surveyed have privacy policies or have even designated someone to be responsible for privacy.

Those are basic and fundamental components of a privacy management program that do not take 20 years to figure out. We don't have time to wait, but we also cannot afford legislation that is inadequate before it's proclaimed, that's not aligned with Quebec's Law 25, the U.S. executive order on AI or other jurisdictions that are well ahead of Canada on this. We also can't afford something that further erodes trust in government and industry as it freely trades away the privacy rights of Canadians for the sake of commercial gain.

I will be happy to answer your questions, and we will be detailing our views in a submission to the committee. I hope you hear us.

3:55 p.m.

Liberal

The Chair Liberal Joël Lightbound

Thank you very much.

I'll now give the floor to Mr. Lawford and Ms. Sai from the Public Interest Advocacy Centre.

3:55 p.m.

John Lawford Executive Director and General Counsel, Public Interest Advocacy Centre

Thank you, Chair.

The Public Interest Advocacy Centre is a national, non-profit and registered charity that provides legal and research services on behalf of consumers—in particular, vulnerable consumers. PIAC has been active in the field of consumer privacy law and policy for over 25 years.

My name is John Lawford. I'm the executive director and general counsel. With me today is Yuka Sai, staff lawyer at PIAC.

Bill C-27 reverses 25 years of privacy law in Canada. Businesses can now assume consent, and consumers must prove abuse. If this sounds uncomfortable from an individual rights perspective, that's because it is.

Firstly, with regard to consent, the new business activities exception in proposed subsection 18(1) makes it legal for a business to make full use of your personal information without your consent, or even your knowledge. Business activities are defined so widely and tautologically in proposed subsection 18(2) that only businesses will be able to define what a business activity is. It's ridiculous. Proposed section 18 completely reverses PIPEDA's default of an individual's informed consent for the collection or use of personal information. Do Canadians really want that?

The addition of an exception to consent and knowledge in proposed subsection 18(3), for the collection or use of additional personal information for legitimate interests, is an import from European law, but it arrives without the fundamental right to privacy that modulates it in Europe.

Secondly, with regard to de-identification, under proposed section 20, consumers also lose out on opportunities to scrutinize the use of their personal information when it is de-identified. De-identify is defined as:

to modify personal information so that an individual cannot be directly identified from it, though a risk of the individual being identified remains.

It is akin to saying that to kill means to take the life of a person directly, although a chance of their remaining alive remains. It is contradictory and meaningless.

De-identification was also clearly a “use” of personal information under PIPEDA. Treating it as a use stops the indiscriminate filling of databases with personal information from which only the most cursory tombstone identifiers have been removed. Reidentification is therefore a real risk, but even de-identified information can harm individuals when they are profiled in databases that are then used to market to them or to deny them services. Bill C-27 supercharges this outcome.

Go ahead, Yuka.

3:55 p.m.

Yuka Sai Staff Lawyer, Public Interest Advocacy Centre

Proposed section 39 facilitates a pipeline of data between industry and the public sector. The government can prescribe any purpose or public entity as “socially beneficial”, and consumers would never know to question it until issues emerge. We remind everyone of Telus giving PHAC cellphone data in 2021.

Artificial intelligence is simply rocket fuel for discrimination. The AIDA portion of this bill lacks substance on bias, systemic harms, high-impact systems and government applicability, and it denies independent oversight.

The proposed tribunal is purpose-built to kill enforcement of the new act. It enables businesses to prolong the resolution process before a soon-to-be-captured review board akin to the Competition Tribunal. It delays any class action to the point of death. The Privacy Commissioner should instead have the power to issue orders and penalties, with decisions subject to appeal before the Federal Court.

3:55 p.m.

John Lawford Executive Director and General Counsel, Public Interest Advocacy Centre

On EU adequacy, unless the EU really looks the other way, any law in this mould will certainly fail to meet the European adequacy standard.

In conclusion, this bill should be wholly rejected. Consumers are infinitely better protected under PIPEDA. The bill is a deliberate attempt to grease the rails for business and for AI.

These are our thoughts. Thank you very much, and we look forward to your questions.

3:55 p.m.

Liberal

The Chair Liberal Joël Lightbound

Thank you very much.

I'll now turn to Sam Andrey, managing director of The Dais at Toronto Metropolitan University.

4 p.m.

Sam Andrey Managing Director, The Dais, Toronto Metropolitan University

Thank you for the invitation to address the committee today.

I'm Sam Andrey. I'm the managing director of The Dais, a think tank at Toronto Metropolitan University, where we work to develop policy ideas to advance an inclusive, innovative economy, education system and democracy for Canada.

I'm going to focus my remarks today on the AI and data act. As many of my colleagues have noted, AI has the potential to have a transformative impact on our economy and our daily lives, but it also poses significant risks, including systemic forms of discrimination, psychological harms and malicious use.

The latest data from StatsCan shows that only about 4% of Canadian businesses are using AI, so to reach AI's full potential and increase adoption, we need a responsible governance framework.

Unfortunately, we think the current bill fails to adequately do that. The bill's surprise introduction and lack of public consultation since have limited the ability of folks in civil society, experts, industry and equity-deserving communities to engage with this important legislation. Our team at TMU, led by Christelle Tessono, has partnered with McGill University's Centre for Media, Technology and Democracy to engage with many of these folks over the last year and has produced recommendations for improving the bill, which we'll be sending to the committee.

I'm just going to highlight three of those today that we hope can be addressed if AIDA is moving forward.

First, the bill's definition of “harm” is very narrowly focused on individuals, but the harms of AI systems also occur at broader community and group levels. Depending on the type and context of the system in question, harm to individuals can be difficult to prove and may be evident only when assessed at a population level. Moreover, there are types of manipulative and exploitative collective harms from AI that would likely not be captured by this definition. Things like election interference, harm to the environment and collective harms to children would not be captured by a definition focused on individuals.

Second, as my colleagues have said, the proposed regulatory model does not create sufficient independence from the minister of ISED, who would have competing roles of championing the economic benefits of AI while regulating and enforcing its risks. We think that the proposed AI and data commissioner needs to be independent from the minister, ideally through a parliamentary appointment and certainly with sufficient resources to support their role.

We would also propose two additions. One is the ability for individuals to make complaints to the commissioner. Currently to launch any investigation, the minister has to have reasonable grounds to believe that an investigation is warranted, which is a very high bar. The other is for the commissioner to be able to conduct pre-emptive audits.

Third, as has been mentioned, this bill currently applies only to the private sector. Minister Champagne's proposed list of high-impact systems that would potentially be subject to regulation, which he has shared with this committee, includes a number of AI systems commonly used by public sector actors, like facial recognition used by police and in health care. However, it creates a double standard in which the private sector developers of these systems will be subject to regulation while the public servants operating them will not be.

This double standard is unlike the approach in the EU, and it fails to position the Canadian government as leading by example through legal bans and guardrails for its own responsible development and use of AI. The current structure of the bill, particularly its commissioner being an ISED departmental official, makes it poorly suited to provide oversight of all public sector AI. We acknowledge that it would not be an easy amendment job, but I would just note that Parliament needs to prioritize the development of AI regulation for the public sector, which needs to include adequate public consultation and engagement.

I want to close by saying that Canada's investments in developing AI systems and research have not yet been matched by a comparable effort to regulate the quickly evolving risks of the technology. We're encouraged that the minister and this committee are open to amendments that will strengthen the bill, and there's really a large community across Canada who wants to help.

Thank you for the opportunity.

4 p.m.

Liberal

The Chair Liberal Joël Lightbound

Thank you very much.

To get the discussion started, I'm going to give the floor to Mr. Perkins for six minutes.

4 p.m.

Conservative

Rick Perkins Conservative South Shore—St. Margarets, NS

Thank you, Mr. Chair.

Thank you, witnesses, for your excellent presentations on this important bill.

In the first few meetings, my colleagues on the government side were probably sick of hearing me say that it's a broken bill, but it is a broken bill in the—

4 p.m.

Liberal

Tony Van Bynen Liberal Newmarket—Aurora, ON

I'm on the English channel and getting French.

4 p.m.

Liberal

The Chair Liberal Joël Lightbound

Hold on for one second. We'll make sure that everything is working properly.

4:05 p.m.

Liberal

Tony Van Bynen Liberal Newmarket—Aurora, ON

It's working now. Thank you.

4:05 p.m.

Liberal

The Chair Liberal Joël Lightbound

Okay, that's perfect.

Mr. Perkins, you can start from the beginning.

4:05 p.m.

Conservative

Rick Perkins Conservative South Shore—St. Margarets, NS

I'll start from the beginning.

Thank you for coming and for your excellent presentations on this important bill.

In the first number of meetings, we were calling this bill a broken bill for a lot of the reasons that all of you outlined. You've probably been following it.

The fundamental right in the purpose clause is critical from our perspective. It's certainly critical that, in the purpose section, privacy be placed at a level of superiority to an organization's need to use personal information.

Perhaps I could start off by asking Mr. Konikoff whether he believes the words there need to be not “personal privacy and an organization's right” but some other language that makes privacy superior to an organization's interest.

4:05 p.m.

Daniel Konikoff Interim Director of the Privacy, Technology & Surveillance program, Canadian Civil Liberties Association

At present, my gripe would be with the lack of extension of the fundamental right to privacy to AIDA. I'd say that is really the biggest weak spot with regard to this. We commend the minister for including that in his proposed amendment.

As for the language, I'd need to take a moment to review that, but I'd say that perhaps the most concerning piece is the fact that it doesn't essentially trickle down from the CPPA to AIDA.

I'd happily defer to anyone else on this panel if they have any—

4:05 p.m.

Conservative

Rick Perkins Conservative South Shore—St. Margarets, NS

I will move on then to Mr. Lawford.

One thing that's come up recently about these issues.... I've spoken a lot about the issues with proposed sections 12, 15 and 18, which you outlined. Proposed section 15 sets out a plain-language requirement for consent, which obviously is not something we get a lot of.

I'm reading the latest terms and conditions from Zoom, which were released in the summer. It made the news that Zoom was actually taking the right to transcribe and own everything that is said.

The thing that really bothers me is section 15.2 of their terms and conditions, a provision that appears in almost every organization's terms. It says, “You agree that Zoom may modify, delete, and make additions to its guides, statements, policies, and notices, with or without notice to you, and for similar guides, statements, policies, and notices applicable to your use of the Services by posting an updated version on the...webpage.” They don't actually come out and say it: if they're changing the terms, they'll just post them somewhere on a mysterious web page and assume that you've consented to the fact that they're now going to transcribe and own everything you say on Zoom.

I'll leave proposed section 18 because that's a different discussion, but what can be done in proposed section 15 to fix that so that companies don't have the right to do whatever they want to the terms and conditions without an individual's knowledge?

November 2nd, 2023 / 4:05 p.m.

John Lawford Executive Director and General Counsel, Public Interest Advocacy Centre

Under the present act, if you change the purpose for which you're collecting and using personal information, you have to give people a chance to reconsent. The Privacy Commissioner hasn't always received complaints on that type of approach, but at the moment, at least, you could complain that the initial consent was based on a different set of terms and conditions. If they want to change the terms and conditions, especially if it's just a posting and your continued use counts as acceptance, then at least at the moment you could complain about that.

My concern is with proposed section 18. Zoom can say that it's the way business is done with online programs now, so you have to complain to the Privacy Commissioner. That's why I've said this reverses the onus from the present law, where consent has to be sought. You could change this bill to require new consent when the purposes have changed. That could go in proposed section 15 for certain—

4:05 p.m.

Conservative

Rick Perkins Conservative South Shore—St. Margarets, NS

In other words, it's not there. There's nothing in Bill C-27 that prevents this practice from continuing, where a business says it's changing the current terms and conditions of consent and complying with the law by posting them. If anyone challenges it, even though they'll never discover it because they won't know it happened, the business can go through a process to appeal, and under the CPPA's proposed subsection 18(3) in particular, it can say, “Too bad. We have the right because it's in our business's legitimate interest to do so.”