Digital Charter Implementation Act, 2022

An Act to enact the Consumer Privacy Protection Act, the Personal Information and Data Protection Tribunal Act and the Artificial Intelligence and Data Act and to make consequential and related amendments to other Acts

Status

In committee (House), as of April 24, 2023

Summary

This is from the published bill. The Library of Parliament has also written a full legislative summary of the bill.

Part 1 enacts the Consumer Privacy Protection Act to govern the protection of personal information of individuals while taking into account the need of organizations to collect, use or disclose personal information in the course of commercial activities. In consequence, it repeals Part 1 of the Personal Information Protection and Electronic Documents Act and changes the short title of that Act to the Electronic Documents Act. It also makes consequential and related amendments to other Acts.
Part 2 enacts the Personal Information and Data Protection Tribunal Act, which establishes an administrative tribunal to hear appeals of certain decisions made by the Privacy Commissioner under the Consumer Privacy Protection Act and to impose penalties for the contravention of certain provisions of that Act. It also makes a related amendment to the Administrative Tribunals Support Service of Canada Act.
Part 3 enacts the Artificial Intelligence and Data Act to regulate international and interprovincial trade and commerce in artificial intelligence systems by requiring that certain persons adopt measures to mitigate risks of harm and biased output related to high-impact artificial intelligence systems. That Act provides for public reporting and authorizes the Minister to order the production of records related to artificial intelligence systems. That Act also establishes prohibitions related to the possession or use of illegally obtained personal information for the purpose of designing, developing, using or making available for use an artificial intelligence system and to the making available for use of an artificial intelligence system if its use causes serious harm to individuals.

Votes

April 24, 2023 Passed 2nd reading of Bill C-27, An Act to enact the Consumer Privacy Protection Act, the Personal Information and Data Protection Tribunal Act and the Artificial Intelligence and Data Act and to make consequential and related amendments to other Acts

David Young Principal, Privacy and Regulatory Law Counsel, David Young Law, As an Individual

Thank you for the invitation to appear before this committee for its important review of Bill C-27.

This bill includes significant proposed amendments to Canada's privacy laws at the same time as it introduces a proposed oversight regime for artificial intelligence. The AIDA component warrants focused study by the committee. Certainly, as you've heard from my co-witnesses, there's a lot to consider there. However, I will restrict my comments to the privacy components.

I am a privacy and regulatory lawyer. My practice over the past 25 years has included advising private sector organizations—both for-profit and non-profit—as well as government and Crown agencies. I address all relevant areas, including individual privacy, employee privacy and health privacy.

In these introductory comments, I will focus on one impactful area of the bill, which you have heard some comments about already: de-identified and anonymized information. I'm hoping to provide some clarification as well as my thoughts on how the proposed provisions can be improved.

The proposed treatment of such information in Bill C-27 is critically important. First, it clarifies a category of information, de-identified information, that, while not fully identifiable and therefore available for specific uses without consent, is still considered to warrant protection under the law. Second, it provides for a category of anonymized information that can be used more broadly for research, innovation and policy development.

The first category, de-identified information, is governed by all of the law's privacy protections, subject to certain specific exceptions. The second category, anonymized information, is by contrast stated not to be subject to the law. However, as I will explain, this stipulation is not the end of the story: the law will, and should, continue to provide oversight over anonymized information. This is a point that is sometimes missed, and I certainly heard it raised as a concern in previous comments. It is very important to understand that, however we define the term—and we've heard a number of comments here—anonymized information will remain subject to that oversight.

I have a number of recommendations for improvement.

First, with respect to de-identified information, the definition should be amended to stipulate appropriate processes to ensure no person can be directly identified from the information. Additionally, proposed section 74 of the CPPA, which addresses technical and administrative protections, should be amended to include, as an additional criterion, the risk of re-identification.

Secondly, the definition of anonymized information should be amended to make more explicit the processes required for anonymization. With its Law 25, Quebec got it right in this area. I recommend aligning with Quebec's approach, which stipulates that the generally accepted best practices for anonymization should be those set out in appropriate regulations. Such regulations should address transparency, re-identification risk, accountability and guardrails for downstream uses. The Quebec law also recognizes that it is not possible, from a practical perspective, to guarantee that anonymized information can never be re-identified. The CPPA provision should reflect the same approach. Additionally, there should be a requirement for the organization performing any anonymization process to conduct a re-identification risk analysis, as Quebec's regulations governing anonymized information propose.

Thirdly, the applicability of the law's protections for de-identified information is a bit of a complicated area. I can certainly go into it in more detail during questions, if you like. Currently, the CPPA provides that de-identified information is personal information, except for certain provisions, where it will not be considered personal information.

This is the wrong approach. Instead, as recommended by the OPC, a simple statement should be made that all de-identified personal information remains personal information. Also, the list of exceptions in the bill is confusing. To make it simpler and clearer, many of the exceptions should be omitted entirely—they are not needed. I can explain that in more detail if you wish.

My final comment addresses, as I mentioned a couple of minutes ago, a concern voiced by some stakeholders that the statute's anonymization regime should be made expressly subject to oversight by the Privacy Commissioner. I know you've heard that from at least one witness and maybe others here. In my view, such a provision is not required. The commissioner will have oversight over an organization's compliance with the anonymization rules, whatever they are. Also, and very importantly, if anonymized information does become identifiable—and that is the whole risk of re-identification—all of the statute's protective provisions will again apply with full vigour, and the commissioner will have oversight. In effect, there are two routes whereby the commissioner will or may continue to have oversight.

In sum, my recommendations are as follows.

First, the definition of “de-identified” information should be made more rigorous, including by addressing the risk of re-identification. Secondly, the definition of anonymized information should be amended to make more explicit the processes required to achieve anonymization, and these should be set out in regulations, including a requirement for risk assessment. Finally, the regime governing the applicability of the CPPA's protections to de-identified information should be made clearer, in particular by stating that all such information remains personal information.

I will be happy to elaborate and answer any questions you have regarding these comments or any other provisions of the bill.

Carole Piovesan Co-founder and Partner, INQ Law, As an Individual

Thank you, Mr. Chair and members of the committee, for the opportunity to speak to Bill C-27.

I am the managing partner of INQ Law, where my practice focuses on data- and AI-related laws. I’m here in my personal capacity and the views presented are my own.

Every day, we are hearing new stories about the promise and perils of artificial intelligence. AI systems are complex computer programs that process large amounts of data, including large amounts of personal information, for training and output purposes. Those outputs can be very valuable.

There is a possibility that AI can help cure diseases, improve agricultural yields or even help us become more productive, so we can each play to our best talents. That promise is very real, but as you've already heard on this panel, it does not come without risk. Complex as these systems are, they are not perfect and they are not neutral. They are being developed at such a speed that those on the front lines of development are some of the loudest voices calling for regulation.

I appreciate that this committee has heard quite a bit of testimony over the last several weeks. While that testimony has certainly run the gamut of opinions, there seem to be at least two points of consistency.

The first is that Canada’s federal private sector privacy law should be updated to reflect the increasing demand for personal information and changes to how that information is collected and processed for commercial purposes. In short, it’s time to modernize PIPEDA.

Second, our laws governing data and AI should strive for interoperability or harmonization across key jurisdictions. Harmonization helps Canadians understand and know how to assert their rights, and it helps Canadian organizations compete more effectively within the global economy.

The committee has also heard opposing views about Bill C-27. The remainder of my submissions will focus on five main points to do with parts 1 and 3 of the bill.

Part 1, which would enact the consumer privacy protection act, or CPPA, proposes some important changes to the governance of personal information in Canada. My submissions focus on the legitimate interest exception to consent and the definition of anonymized data, much of which you've already heard about on this panel.

First, the new exceptions to consent in the bill are welcome. Not only do they provide flexibility for organizations to use personal data to advance legitimate and beneficial activities, but they also align Canada's law more closely with those of some of our key allies and, within Canada, with Quebec's Law 25 specifically. Critically, they do so in a manner that is reasonably measured. I agree with earlier testimony you've heard in this committee that the application of the legitimate interest exception in the CPPA should align more closely with other notable privacy laws, namely Europe's GDPR.

Second, anonymized data can be essential for research, development and innovation purposes. I support the recommendations put to this committee by the Canadian Anonymization Network with respect to the drafting of the definition of “anonymize”. I also agree with Mr. Lamb's submissions as to the insertion of existing notions of reasonable foreseeability or a serious risk of re-identification.

As for part 3 of the bill, the proposed artificial intelligence and data act, I support the flexible approach it adopts. I caution, however, that the current draft contains some major holes, and there is a need to plug those holes as soon as possible. As well, any future regulation would need to be subject to considered consultation, as contemplated in the companion document to AIDA.

Our understanding of how to effectively promote the promise of AI and prevent harm associated with its use is evolving with the technology itself. Meaningful regulation will need to benefit from broad stakeholder consultation, including, importantly, with the AI community.

Second, Minister Champagne, in the letter he submitted to this committee, proposes to amend AIDA to define “high impact” by reference to classes of systems. The definition of high impact is the most striking omission in the current draft bill.

The use of a classification approach aligns with the EU's draft artificial intelligence act and supports a risk-based approach to AI governance, which I support. When the definition is ultimately incorporated into the draft, it should parallel the language in the companion document and provide criteria on what “high impact” means, with reference to the classifications as illustrated.

Finally, I support the proposed amendments to align AIDA more closely with OECD guidance on responsible AI, namely the definition in proposed section 2 of AIDA, which has also been adopted by the National Institute of Standards and Technology in the United States in its AI risk management framework.

To the extent that Canada can harmonize with other key jurisdictions where it makes sense for us to do so, we should.

I look forward to the committee's questions, as well as to the comments from my fellow witnesses.

Scott Lamb Partner, Clark Wilson LLP, As an Individual

Thank you, Mr. Chair and members of the committee, for having me here today on the important matter of reform of our privacy legislation and Bill C-27.

I'm a partner at the law firm of Clark Wilson in Vancouver, and I'm called to the bar in Ontario and British Columbia. I've been practising in the area of privacy law since approximately 2000. I've advised both private sector organizations in a variety of businesses and public bodies such as universities in the public sector. I've also acted as legal counsel before the Information and Privacy Commissioner for British Columbia in investigations, inquiries and judicial review.

With the limited amount of time we have, I'll be confining my remarks to the proposed consumer privacy protection act, specifically the legitimate interest exception, anonymization and de-identification, and the separate review tribunal. Hopefully, I'll have a bit of time to get into the artificial intelligence and data act, AIDA, with respect to high-impact systems.

I will of course be happy to discuss other areas of Bill C-27 and questions you may have. Also, subsequent to my presentation, I'll provide a detailed brief on the areas discussed today.

Starting with the proposed consumer privacy protection act and the legitimate interest exception, it's important to point out that arguably the leading privacy law jurisdiction, the EU with its GDPR, provides for a stand-alone right of an organization to collect, use and disclose personal information if it has a legitimate interest. Accordingly, if Canada is to have an exception to consent based on an organization's legitimate interest, it's important to look, in detail, at how that will operate and the implications of that exception.

First, to reiterate, the draft provisions in proposed subsection 18(3) are an exception to the consent requirements and not a stand-alone right for an organization as set out in the GDPR.

What's the significance of this? From a purely statutory interpretation point of view, courts generally do not interpret a stand-alone right as restrictively as they do an exception to an obligation. In short, the legitimate interest exception is very likely to be narrower in scope than the GDPR's legitimate interest provisions.

A stand-alone right may be a means to circumvent or generally undercut the consent structure of our privacy legislation, which again is at the heart of our legislation and is a part of the inculcated privacy protection culture in Canada. Maintaining the legitimate interest provisions as an exception to the consent structure, on balance, is preferable to a stand-alone right.

Second, the exception applies only to the collection or use of personal information; it is not available for the disclosure of personal information to third parties. In my view, barring the exception from applying to disclosures that are in an organization's legitimate interest doesn't make sense. While I favour an exception over a stand-alone right in the first instance, I think the exception has to be expanded to cover disclosure as well.

The provisions in proposed subsection 18(3) expressly require that the legitimate interest of an organization “outweighs any potential adverse effect”. This is effectively a high standard of protection. Limiting the exception to collection and use only has significant consequences for organizations. For example, a business may have a legitimate interest in collecting and using personal information to measure and improve the use of its services or to develop a product. However, proposed subsection 18(3) prevents that organization from disclosing that personal information to a business partner or third-party vendor to give effect to its legitimate purpose.

Finally, other jurisdictions allow an organization's legitimate interest to apply to the disclosure of personal information as well as to its collection and use: not only the EU GDPR but also Singapore's law. I note that, looking at those pieces of legislation side by side, Singapore likewise frames it as an exception, and Singapore has some case law that has moved the issue forward.

I think it would give a lot of comfort to this committee to examine some of the case law from Singapore, as well as some of the more current case law from the GDPR regime. It gives a sense of what a legitimate interest means, which I appreciate may at first instance seem rather vague and could be seen as a giant loophole. My submission, however, is that that's not the case.

The next item I'd like to talk about is anonymization and de-identification. Clarity on this issue has been sought for some time, and it's reassuring that the change from Bill C-11 to Bill C-27 introduced the concept of anonymization as separate from de-identification. However, technologically and practically speaking, you're never going to reach the standard set out in the definition of anonymization, so why put it in the act in the first place? There has been some commentary on this, and I generally support the recommendation to insert into that definition a standard of whether it is reasonable to expect, in the circumstances, that an individual could be identified. If so, the data is not anonymized and is still caught by the legislation and the specific requirements for the use and disclosure of such data.

In terms of use and disclosure, I also note that proposed section 21 confines the use of de-identified information to internal use by the organization. This could markedly limit the utility of the provision compared with what our trading partners allow, because modern research and development relies on data pooling and extensive partnerships in the use of data. If use is strictly internal, we could lose an important tool in a modern technological economy that relies on it. Therefore, I recommend that this restriction be deleted as well.

Also, proposed section 39 would limit the disclosure of de-identified personal information to, effectively, public sector organizations. This is very restrictive, and consideration should be given to permitting disclosure to private sector organizations, which are fundamentally important to our modern economy and to research and development.

In terms of the separate review tribunal, I know that the Privacy Commissioner has been hostile to this and I recognize that the Privacy Commissioner performs an invaluable role in investigating and pursuing compliance with our privacy legislation. However, given the enormous administrative monetary penalties that may be awarded against organizations—the higher of 3% of gross annual revenue or $10 million—for breaches, clear appeal rights to an expert tribunal and review of penalties are required to ensure due process and natural justice standards and, frankly, to develop the law in this area.

It is also noteworthy that judicial oversight of the tribunal's decisions would follow the Supreme Court of Canada's test in Vavilov, which limits review to the reasonableness standard, a very deferential and limited review. It has been suggested that these proceedings should not be allowed to drag on forever; with judicial review so confined, they would be limited. I know there was one suggestion that review should jump straight from the tribunal to the Federal Court of Appeal. If you want to expedite the process and meet that concern, I think that's fine, and probably right, but I do like the structure of a separate review tribunal.

Finally, on artificial intelligence and high-impact systems, I think the focus on identifying the concept of high-impact systems is sound in structure and potentially well aligned with our trading partners in the EU. However, the concept cannot be left to further development and definition in regulations. It needs extensive consultation and parliamentary review.

It is recommended that the government produce a functional analysis of a high-impact system from qualitative and quantitative impact, risk assessment, transparency and safeguards perspectives.

It's further recommended that distinctions be made between artificial intelligence research and development for research purposes only and artificial intelligence that is implemented into the public domain for commercial or other purposes. What I would not want to see come out of our AIDA legislation is that we have some sort of brake on research in artificial intelligence.

We are vulnerable, and our allies are vulnerable, to other international actors that are at the forefront of research in artificial intelligence. We should not have anything in our legislation that puts a brake on that research. However, we should protect the public when artificial intelligence products are rolled out to the public domain, and ensure that we are protected. I think that's a distinction that is missing in the discussion, and it's very important that we advance it.

Those are my submissions.

Thank you.

François Joli-Cœur Partner, Borden Ladner Gervais, As an Individual

Good afternoon.

Thank you for inviting me. I'm pleased to have the opportunity to share my thoughts on Bill C‑27 with the committee.

I am a partner at Borden Ladner Gervais, BLG, and a member of the privacy practice group. I am also the national lead of BLG's artificial intelligence, AI, group. I am appearing today as an individual.

My remarks will focus on the AI provisions in the bill, in both the artificial intelligence and data act, or AIDA, and the consumer privacy protection act, or CPPA.

To start, I want to say how important it is to modernize the federal privacy regime, something Quebec, the European Union and some of the world's largest economies have done recently.

I commend the government's commitment to AI legislation. In spite of the criticisms levelled at AIDA, the bill has the advantage of putting forward a flexible approach. Nevertheless, some key concepts should be provided for in the act, instead of in the regulations. Furthermore, it is imperative that the government consult extensively on the regulations that flow from AIDA.

The first point I want to make has to do with anonymized data in the CPPA. Anonymized personal information is an important building block for AI models, and excluding anonymized information from coverage by the act will allow Canadian businesses to keep innovating.

The definition of anonymization should, however, be more flexible and include a reasonableness standard, as other individuals and groups have recommended. That would bring the definition in line with those in other national and international laws, including recent amendments to Quebec's regime.

The CPPA should explicitly state that organizations can use an individual's personal information without their consent to anonymize the information, as is the case for de‑identified information.

Lastly, AIDA includes references to anonymized data, but the term isn't defined in that act. The two acts should be consistent. AIDA, for instance, could refer to the definition of “anonymize” set out in the CPPA.

The second point I want to make concerns another concept in the CPPA: automated decisions. Like most modern privacy laws, the proposed act includes provisions on automated decisions. On request by an individual, an organization would be required to explain its use of any automated decision system to make predictions, recommendations or decisions about individuals that could have a significant impact on them.

An automated decision system is defined as any technology that assists or replaces the judgment of human decision-makers. The definition should be amended to capture only systems with no human intervention at all. That would save organizations the heavy burden of having to identify all of their decision support systems and introduce processes to explain how those systems work, even when the final decision is made by a human. Such a change would increase the act's interoperability with Quebec's regime and the European Union's, which is based on the general data protection regulation.

Turning to AIDA, I want to draw your attention to high-impact systems. The act should include a definition of those systems. Since most of the obligations set out in the act flow from that designation, it's not appropriate for the term to be wholly defined in the regulations. The definition should include a contextual factor, specifically, the risk of harm caused by the system. For example, it could take into account whether the system posed a risk of harm to health and safety or a risk of an adverse impact on fundamental rights. That factor could be combined with the classes of systems that would be considered high-impact systems, as set out in the act.

Including a list of classes of systems that would de facto be considered high-impact systems, as the minister proposed in his letter, could capture too many systems, including those that pose moderate risk.

My last point concerns general purpose AI systems. In his letter, the minister proposed specific obligations for generative AI and other such systems. While generative AI has become wildly popular in the past year, regulating a specific type of AI system could render the act obsolete sooner.

Not all general purpose AI systems pose the same degree of risk, so it would be more appropriate to regulate them as high-impact systems when they meet the criteria to be designated as such.

Thank you very much. I would be happy to answer any questions you have.

Alexander Jarvie Partner, Davies Ward Phillips & Vineberg LLP, As an Individual

Thank you very much.

Good afternoon, and thank you for the invitation to share my thoughts on Bill C-27 with the committee.

I am a partner at Davies Ward Phillips & Vineberg LLP, practising as a lawyer in the firm’s technology group. I am appearing today in a personal capacity, presenting my own views.

Recent years have seen significant technological developments related to machine learning. In part, these have come to pass because of another relatively recent development, namely, the vast amount of information, including personal information, that is now generated by our activities and circulates in our economy and our society. Together, these developments hold great promise for future innovation, but they also carry significant risks, such as risks to privacy, risks of bias or discrimination and risks relating to other harms.

I am, therefore, encouraged that a bill has been introduced that seeks to address these risks while supporting innovation. I will begin by making some remarks on the proposed consumer privacy protection act, CPPA, and by suggesting changes to certain provisions of the bill that could better support innovation involving machine learning while introducing important guardrails. I will then share some observations in relation to the proposed artificial intelligence and data act, AIDA.

In my view, improvements could be made to the CPPA consent exception framework to facilitate personal information exchange among, and collection by, private sector actors that wish to undertake socially beneficial projects, study or research. In particular, proposed sections 35, 39 and, in part, 51 could be combined and generalized so as to permit private sector actors to disclose and exchange personal information, or to collect information from the public Internet, for those purposes, provided that certain conditions are fulfilled.

Those conditions could include conducting a privacy impact assessment, entering into an agreement containing relevant contractual assurances where applicable, and providing notice to the commissioner prior to the disclosure or collection. De-identified data is sufficient for the training of machine learning models in many cases, and de-identification is a requirement in proposed section 39 as currently drafted but not in proposed section 35. I would note only that whether the information should be de-identified in a given case should be a factor in the proposed privacy impact assessment.

Suitably crafted, these changes could provide material but appropriately circumscribed support for section 21 of the proposed CPPA, which permits the use of personal information that has been de-identified for internal research and analysis purposes, and for proposed subsection 18(3), which permits use of personal information in its native form for legitimate interests, provided that an assessment has been undertaken.

With respect to the AIDA, I begin with the definition of the term “artificial intelligence system”. This definition is of fundamental importance, given that the entire scope of the act depends upon it. The current definition risks being overbroad. The minister’s letter proposes to provide better interoperability by introducing a definition that seeks to align with a definition used by the OECD, but the text provided differs from the OECD formulation and introduces the word “inference” in a suboptimal way. We also do not have the final wording.

There are also different definitions to consider in other instruments, including the European Union’s proposed AI act, the recent U.S. President’s executive order, and the NIST AI risk management framework, among others. Some of these do converge on the OECD’s definition, but in each case the wording differs.

I would urge the committee, when it begins clause-by-clause review, to survey existing definitions to determine the state of the art and to ensure that the definition ultimately chosen indeed maximizes interoperability yet remains extensible to account for new techniques and technologies.

I would also recommend that the purpose clause of the AIDA, as well as other relevant provisions, be amended to include harms to groups and communities, as these may also be adversely affected by the decisions, recommendations or predictions of AI systems.

Finally, there should be an independent artificial intelligence and data commissioner. The companion document to the AIDA notes that the model whereby the regulator would be a departmental official was chosen in consideration of a number of factors, including the objectives of the regulatory scheme. However, since the scope of what is being left to regulation is so extensive, the creation of an independent regulator to administer and enforce the AIDA will counterbalance skepticism concerning the relative lack of parliamentary oversight and thereby help to instill trust in the overall regulatory scheme.

I will submit a brief for consideration by the committee, elaborating on the matters raised here. Machine learning technologies are poised to play a significant role in future innovation. Through legislation, we can achieve meaningful support for this potential while providing effective protections for individuals, groups and society.

Thank you for your attention. I welcome your questions.

The Chair Liberal Joël Lightbound

Good afternoon, everyone.

I call this meeting to order.

Welcome to meeting number 96 of the House of Commons Standing Committee on Industry and Technology.

Today’s meeting is taking place in a hybrid format, pursuant to the Standing Orders.

Pursuant to the order of reference of Monday, April 24, 2023, the committee is resuming consideration of Bill C‑27, an act to enact the consumer privacy protection act, the personal information and data protection tribunal act and the artificial intelligence and data act and to make consequential and related amendments to other acts.

I’d like to welcome our witnesses today: Alexander Max Jarvie, partner, Davies Ward Phillips and Vineberg LLP; François Joli-Coeur, partner, Borden Ladner Gervais; Scott Lamb, partner, Clark Wilson LLP; Carole Piovesan, co‑founder and partner, INQ Law; and David Young, principal, privacy and regulatory counsel, David Young Law.

Welcome, everyone, and thank you again for joining us this afternoon.

Without further ado, I yield the floor to Mr. Jarvie for five minutes.

National Security Review of Investments Modernization Act (Government Orders)

November 9th, 2023 / noon


Bloc

Sébastien Lemire Bloc Abitibi—Témiscamingue, QC

Madam Speaker, I will agree with my colleague from Winnipeg North that our provinces have something in common. I dream of the day when I can go to a Nordiques game in Winnipeg. There is a lot of sharing that we could do.

The economy is changing. I think the member for Winnipeg North would be welcome on the committee because the points he has raised would be very useful around the table. I would like to see him get out of the House sometimes, get his hands dirty, and present these amendments in committee.

I feel that the government has indeed done a diligent job, but within the limits imposed on us by the shackles of Bill C‑34. The law needed to be modernized to meet the realities of a new economy.

Right now, the Standing Committee on Industry and Technology is examining Bill C-27. I think everyone agrees on the fundamental aspect of data protection for all Quebeckers and Canadians, and especially for children. However, when it comes to developing AI and protecting our cultural sovereignty—and here I am thinking in particular of Quebec's cultural sovereignty, our French language and our accent, which CBC values so much—we definitely need to modernize this law and go even further. This is also important for protecting our start-ups and emerging companies that have patents and those that are working on and developing AI. We have some very painstaking work to do. I thank the government for its collaboration on Bill C-34.

National Security Review of Investments Modernization Act (Government Orders)

November 9th, 2023 / 11:05 a.m.


Conservative

Cathay Wagantall Conservative Yorkton—Melville, SK

Madam Speaker, I am pleased to have an opportunity to speak to a bill that Conservatives believe is critical to the safety and security of Canadians.

At face value, Bill C-34 would amend the Investment Canada Act with the intent to bolster Canada’s foreign investment review process and increase penalties for certain instances of malpractice or contraventions of the act. Canadians could consider this bill an attempt by the Liberals to take threats posed by some cases of foreign investment seriously. However, we live in an increasingly volatile world and, as we have seen over these past few months, Canada is not immune to infiltration and manipulation from abroad.

In the past, Liberals have failed to thoroughly review transactions involving Chinese state-owned enterprises. This pattern is repeating itself through Bill C-34. Namely, clause 15 would remove the obligation for any foreign investment to be subject to a mandatory consultation with cabinet.

On this side of the floor, we believe that Canada's economic and security interests are paramount, and this bill would not go far enough to protect them. That is why we put forward 14 very reasonable amendments at committee that would have intensified the review process for business acquisitions by foreign state-owned entities. Unfortunately, the Liberals and the NDP rejected all but four of them. Those four are nonetheless critical to improving the bill, so I will touch on each of them.

First, the government was prepared to pass a bill that would have given carte blanche access to investment from state-owned enterprises, no matter their relationship with Canada. There were no provisions that would require any investment by a state-owned enterprise to be subject to an automatic national security review when the government introduced this bill. Our amendment reduced the threshold to trigger a review from $512 million to zero dollars, meaning that all state-owned enterprise investments in Canada must undergo a national security review.

Second, Conservatives introduced an amendment to ensure that the acquisition of any assets by a state-owned enterprise would be subject to the national security review process. It would guarantee that the review considers not only new business establishments, acquisitions and share purchases but also all assets, which is another very good amendment to the bill.

Third, when the government introduced the bill, it failed to address concerns regarding companies that have previously been convicted of corruption charges. This makes no sense to me at all. The Conservative amendment now, fortunately, would require an automatic national security review to be conducted whenever a company with a past conviction is involved.

Finally, the government would have been happy to pass a bill that gives more authority and discretion to the minister, despite multiple blunders over the past eight years to take seriously the real threats posed by some foreign investments. The original bill would have left it to the minister to decide whether to trigger a national security review when the threshold was met. The Conservative amendment addresses this oversight and would make a review mandatory, rather than optional, when the $1.9-billion threshold is met.

I do not understand why the government would not have included this in the bill from the start. It concerns me that so many pieces of legislation from the government give more and more authority to individual ministers rather than ensuring that, through cabinet and the oversight of the House, those decisions are truly transparent and given sober thought.

These amendments, the four that I mentioned, are crucial elements to strengthening this bill, but the Liberal-NDP government also denied Canadians further protections by rejecting some other key improvements that Conservatives really do feel should have been there.

Witnesses at the committee stressed that many Chinese enterprises operating internationally are indentured to requests from the CCP, even if they are privately owned. That almost seems like an oxymoron, does it not? Instead of taking sensitive transactions seriously, the Liberals and the NDP rejected our amendment to modify the definition of a state-owned enterprise to include companies headquartered in an authoritarian state, such as China.

In addition, the coalition chose not to provide exemptions for state-owned enterprises from our Five Eyes intelligence partners. Conservatives proposed such an exemption to prevent an overly broad review process, which the Liberals and NDP rejected. Rather than focusing on real and serious threats to safety, the government would rather spend its time and resources scrutinizing our most trusted security partners.

This makes no sense. Clearly, the government has struggled to get things done in a timely manner, and this would have been an opportunity for it to be far more efficient and to also show an improving relationship with our Five Eyes partners and allies.

Lastly, rather than supporting our amendment to create a list of sectors considered strategic to national security, the Liberals and the NDP chose to leave the process up to regulation, putting it at risk of becoming a political exercise in which stakeholders may invoke national security concerns to protect their own economic interests, something Canadians are very concerned about when it comes to this government. Clearly the government has failed over and over again to show it is truly operating in the best interests of Canadians.

I am glad to say that the amendments we were able to pass turned a minor process bill into a major shift in our nation’s approach to foreign takeovers of Canadian companies, but there is still more that could be done to improve it. As it is currently written, the bill would give the Minister of Industry and the Minister of Public Safety near-sole authority to bypass cabinet and approve projects coming into Canada.

Given past precedent, Conservatives have been sounding the alarm for years on why this would be a critical mistake. I am reminded of when the former minister neglected to conduct a full national security review of partially China-owned Hytera Communications’ purchase of B.C.’s Norsat International in 2017.

Twenty-one counts of espionage later, the United States Federal Communications Commission blacklisted Hytera in 2021 due to “an unacceptable risk to the national security of the United States”. However, it was not until 2022 that the then minister was left scrambling when the RCMP suspended its contract with Norsat for radio frequency equipment.

Shockingly, Public Services and Procurement Canada confirmed that security concerns were not taken into consideration during the bidding process for the equipment. This, of course, raises alarms. The Liberals also failed to consult Canada’s own Communications Security Establishment on the contract. Instead, the contract was merely awarded to the lowest bidder. This is also interesting because, quite often, it seems we are hearing of funds being shared by the government with organizations that simply do not do anything for Canadians with the money they are given.

Why was this allowed to happen? Why was a piece of technology meant to ensure secure communications within Canada’s national police force contracted out to a company accused of compromising national security around the world, as well as serving as a major supplier to China’s Ministry of Public Security?

Let us go back to 2020, when the government was prepared to award Nuctech with a $6.8-million deal to provide Canada’s embassies and consulates with X-ray equipment. Nuctech is, again, Chinese-based and founded by the son of a former secretary general of the CCP.

Deloitte Canada reviewed the offer and made a staggering recommendation to the government that it should only install security equipment in Canadian embassies if it originates from companies with national security clearances. Deloitte found that Nuctech’s hardware and software had advanced beyond the government’s existing security requirements to the point that its X-ray machines are capable of gathering information and accessing information networks. This raises huge alarm bells.

Global Affairs Canada did not review Nuctech for risks to national security during its procurement process, nor was the Canadian Centre for Cyber Security asked to conduct its own review. The government often says it will do better and can do better, but these things are happening over and over again. However, all this might have been too little too late, as the government has awarded four additional CBSA contracts to Nuctech since 2017. The government’s laissez-faire attitude to national security is simply beyond comprehension.

It does not end there. The government also cannot be trusted to safeguard the security of Canadians because it cannot even follow its own rules. In March of 2021, the minister updated guidelines for national security reviews for transactions involving state-owned enterprises and Canada’s critical minerals. Less than a year later, the same minister violated his own rules by expediting the takeover of the Canadian Neo Lithium Corporation by Chinese state-owned Zijin Mining. Once again, this was done without a national security review.

To make matters worse, the minister defended his decision, refusing to order Zijin to divest from Neo Lithium while ordering three other Chinese companies to divest their ownership of three other critical minerals firms. It is confusing to me that the government would be so inconsistent. The hypocrisy is astounding. The government is once again picking winners and losers, and it is disconcerting who it is choosing as winners. This time, national security is on the table. This cannot be allowed to continue.

We have seen a pattern of missteps by the government on how programs and projects are approved. Over the last eight years, there has been an unacceptable shift toward putting more power within the hands of ministers and outside advisory councils, with little to no accountability to this place. We certainly see that, and Canadians see it, too. There is less and less of a sense of responsibility in this place to Canadians. It is as though the government can simply go ahead and provide its ministers with legislation that gives them a carte blanche ability to do things, along with organizations and advisory councils that are outside of this place and do not have the proper oversight that the House of Commons, which reflects Canadians, certainly should have.

Often, we find that appointed advisory councils are established at the minister’s discretion prior to a bill even being signed into law. That just shows the incredible lack of respect of the Liberal government to due process in this place.

Other times, we see that the Liberals just cannot seem to pick a lane. With Bill C-27, for instance, the Privacy Commissioner’s new powers to investigate contraventions of the Consumer Privacy Protection Act were diminished by a personal information and data tribunal. In this tribunal, only three of its six members were required to have experience in information and privacy law—

Rick Perkins Conservative South Shore—St. Margarets, NS

In support of what Dr. Ellis said, I think it's really important to have the minister here, because ultimately he was the person who had to sign off on these contracts as the minister—he's been the minister for 34 months. Officials would have made the recommendations, but he's ultimately accountable for the $150 million that's being paid out now under the contract. He's accountable for the $223 million that was committed to go in. While the Minister of Health has a role in the process of whether or not the vaccine works, the industry minister is the one who had to fund it.

Officials aren't accountable for the dollars. Ultimately, it's the minister. I would encourage members to please keep the industry minister there. We won't have a chance, as I said, to look at this in the industry committee. We're going to be dealing with PIPEDA and Bill C-27 until February or March.

Rick Perkins Conservative South Shore—St. Margarets, NS

Thank you, Mr. Chair, for indulging me as an associate member of this committee.

My normal role, besides sitting with Mr. Hanley on fisheries, is as vice-chair of the industry committee. I've had a motion for a study on Medicago on the industry committee since the spring, but legislation takes precedence. We were dealing with Bill C-34 on the Investment Canada Act changes and Bill C-27, the privacy and artificial intelligence bill, so we've not had a chance to get to the motion.

That is why I think the motion here before the committee is so important. The industry committee did an examination, initially—it was tabled in June, since it was started in the last Parliament—of the response to COVID-19 in terms of vaccines, as, I believe, this committee did. I believe there are not only minister of health issues with regard to this study but also a large industry role. Unfortunately, the industry committee doesn't have time to discuss it.

You will note, in the appendix of the report tabled in the House on June 14 by the industry committee, that an agreement with Medicago was signed on October 23, 2020, to purchase up to 76 million doses of the vaccine. This is a vaccine to which the government initially committed up to $223 million through a couple of funds, in order to develop a non-mRNA, plant-based vaccine, which the company successfully did, receiving Health Canada approval.

The committee needs to study it for various reasons. It's not clear to us why not a single vaccine was produced, and why that contract was signed for 76 million. A great deal of provincial and federal government money went into creating that vaccine plant in Quebec City 10 years or so ago, in order to produce vaccines. My understanding, from everything I've seen, read and heard, is that, in this case, it was a successful vaccine with a fairly high efficacy rate.

This investment was made and seems to have not gone anywhere, mainly because the World Health Organization has a policy not to endorse products produced by companies that have any kind of tobacco manufacturing involvement. I think Philip Morris had 40% ownership, with Mitsubishi having the remainder. I'd love to ask both the health minister and the industry minister this: Why would you sign such a contract or even invest up to $223 million of taxpayer money to develop a vaccine with a company that you knew the WHO would not endorse for promotion around the world? This would leave it, essentially, a Canadian domestic market vaccine. I think there are a lot of questions to ask around that and the thinking leading up to it.

We know the thinking was about trying to develop, as MP Thériault said, domestic vaccine manufacturing capacity. A lot of money was going into it at a very intense time in the world and in this country. In choosing this particular company, though, the project looked doomed to failure: even if the vaccine succeeded, other countries were unlikely to acquire it, and that, in addition to our own use, would ultimately be the goal. Without a WHO “good housekeeping” seal of approval, it was unlikely to have any success in its sales.

In business, we call it a “sunk cost”; once it's spent, you can't get it back. In this case, the sunk cost was already in, so the thinking became: let's buy some of the vaccines and contract with the company.

An incredible amount of taxpayer money went into this. Where are the patents? Who owns the patents? Where have they gone?

The inability of this organization, for whatever reason, to produce the vaccines in the plant that was set up, where 400 people worked, seemed to get a ray of light in December last year, when Mitsubishi bought out Philip Morris.

When that happened, I thought, okay, this is good news. Maybe this great taxpayer-funded vaccine can be produced and marketed around the world, now that the company no longer has a tobacco company in its ownership structure. There are rumours about what Mitsubishi paid. Some have said it's as low as about $14 million, which is incredible, given that almost $200 million of federal taxpayer money had gone in, along with patents on a successful vaccine.

Nonetheless, we all lead a public, elected life. We're all optimists by nature, or we wouldn't be doing this job. I think we held out hope that somehow, it would be seen as a step forward.

Lo and behold, what happened six weeks later? Mitsubishi shut the company down, threw 400 people in Quebec out of work—after all of that taxpayer money—and then began the dance around the questions we started to ask.

What's happened? There's a contract to produce up to 76 million vaccines. I believe the cost was $20 per vaccine, so what are we on the hook for as a country, to pay for a vaccine that was never produced? Where did all that investment in that IP go?

I suspect we don't know the answers to that or whether or not Mitsubishi has chosen to actually sell the Canadian-financed patents for a plant-based COVID vaccine somewhere in the world. We don't know that. We haven't had it before this committee and we haven't had it before the industry committee. This committee has the opportunity, perhaps, with its agenda to do that, which we don't in the industry committee. I would be urging members to take a look at that, because it seems to me there are at least two flaws in this process.

The first flaw is that there wasn't any protection for Canadian taxpayers when $200 million was committed in a contract to develop the vaccine in the first place. There were no provisions securing the taxpayers' claim on the patents if something went south.

Somehow, as the financier of this, whether through university-owned patents or through the rights of the granting councils under the SIF program—or whichever ISED program paid for this, because I believe the money came out of ISED—we were obviously so poor at negotiating contracts that we didn't get an ownership stake or any protection for the taxpayer if, for example.... They must have known going in that the vaccine would have trouble being marketed because of the Philip Morris ownership. There was no protection in the contract to recover the investment from Philip Morris and Mitsubishi or, in the situation that arose, to ensure that the taxpayer would actually own the patents so that they couldn't leave this country and couldn't be sold by a foreign multinational. However, it appears that's the situation we're in.

If that wasn't bad enough, obviously, the cancellation clauses were non-existent in the contract to buy the 76 million doses of the vaccine that were never produced, because we are now on the hook for another $150 million for something that was never made. It's thin air, it's vapour, it's nothing. It's $150 million for not even an empty vial.

There was $200 million that went into developing the vaccine and $150 million for absolutely nothing. Some 400 people in Quebec City are out of work, and Mitsubishi gets to walk away with all of the patents, and all of the potential to sell them, for the small price of the few million dollars it spent buying out Philip Morris.

That's the way it appears. Maybe that's not the case. Maybe the witnesses could actually shed some light on these contracts. Maybe officials could explain to us why they signed contracts that appear to leave the Canadian taxpayer with nothing but the bill and leave a Japanese company with an innovative Canadian patented technology.

Again, because we don't have the ability to do this at the industry committee, we would like this committee to examine these things. That's why Dr. Ellis put forward the motion in the first place. I would urge committee members to vote for the amendment as amended, but not to limit ourselves to four or six meetings. You have to follow the evidence, and then get to the main motion so that the committee gets this on the agenda.

That's my opening. I'll leave it at that for members to consider. The numbers add up to quite a large loss to the Canadian taxpayer. To me, it's a bit of a scandal. I hope it's not. I hope we can actually get those patents back.

Thank you, Mr. Chair.

Prof. Fenwick McKelvey

I would say two things briefly. Bill C-27 builds in large exemptions around what types of data can be collected, for instance where the data is anonymized or is collected for legitimate business purposes. I feel that actually warrants more consideration of what it entails and of the potential impacts it has on workers.

The second part is what these exemptions really do. They are backstopped by AIDA—the artificial intelligence and data act, at the end of the bill—which raises some notable concerns, because it puts a lot of the investigative powers in a loosely defined data commissioner role. I actually feel that part of the task, ahead of the legislative agenda, is to move from treating AI simply as a matter of economic strategy to also thinking about ways of mitigating its potential negative social impacts and promoting its positive ones.

Yes, I think addressing how this impacts labour and making sure there is targeted legislation would be a boon, because this is not something that is going to be addressed by an omnibus bill.

Bonita Zarrillo NDP Port Moody—Coquitlam, BC

Thank you so much.

My question is for Mr. McKelvey.

You mentioned Bill C-27 quite a bit. It's quite extensive. I'm wondering if you think that the labour portion, the workers portion, of artificial intelligence should have its own stand-alone legislation or if you think workers will be duly protected in Bill C-27.

Prof. Fenwick McKelvey

First, there is a need to consider this in relation to Bill C-27 and the ways in which we're trying to understand privacy and data. Part of what is really important now is recognizing the power of data. What AI demonstrates is that there's power in collecting large amounts of data, because you can now mobilize it. Really, it's about thinking of privacy law and data as bigger than the traditional concerns about personal information. That's an important broader shift we've been witnessing, and AI drives it home.

I think the second thing is then trying to understand these uneven and disparate impacts. Certainly we're going to hear ample evidence about the benefits of artificial intelligence. I think it's incumbent on the government to understand and protect those marginalized and precarious workers who might be on the outside of those benefits.

That's certainly part of what's going on with generative AI, and why there's so much attention right now: it's a different class of workers, typically white-collar creative workers, who are potentially now facing greater competition from automated solutions. That's not to say the effects will be easy to predict, but we are seeing a marked shift, and that needs to be taken into consideration in how we understand this relationship between AI and the labour market.

Finally, it's to ensure that we have strong protections for workers and that this is something we value as a society and build into how we frame our legislative agenda.

Prof. Fenwick McKelvey

Yes, I've been able to review it briefly, but not in complete depth. I'd say that it certainly demonstrates the clear gaps that I see in Canada's approach to the artificial intelligence and data act. You see much more fulsome treatment of potential harms and willingness to engage in the sector-specific issues around artificial intelligence. I think it's a document worth studying just to demonstrate the complexity of the challenges facing regulators and legislators...and then in comparison to AIDA.

I would agree with Dr. Frank that there is probably a need for a harmonized approach. Canada is quite active there to some degree, whether through the Global Partnership on AI or through some of its bilateral agreements with France or the United Kingdom. Canada is going to have to position itself so that it's at least working in parallel with the United States, and I know there are efforts to discuss treaties with the EU around AI.

The one thing I would say is that with Bill C-27 and Quebec's Law 25, I think there is a big test about GDPR compliance. Really, what should be front and centre when we are talking about our legislative agenda for AI is understanding it in relationship to the movement that's happening in Europe around the AI act, and I think to a lesser degree with the United States, although I commend what that order has been able to accomplish.

Prof. Fenwick McKelvey

First, some of my comments were also drawn from the forum on AI in Montreal and some of its panel discussions around labour. I actually think it is important to recognize Quebec's leadership in addressing the social impacts of artificial intelligence. That was an important milestone in pushing an agenda that treats AI not simply as economic policy but also as social policy.

The challenge, presently, is that Bill C-27 is complex enough in itself, and AIDA is added on top. It's a really challenging moment to make very important legislation work, so having more eyes on it, particularly your committee's attention to the labour impacts of Bill C-27, would be welcome.

Given the time this committee will have to investigate the multitude of changes, I don't think there will be enough to address those effectively. That attention is an important way of coordinating AI policy across the government, something my own research has found lacking.