Digital Charter Implementation Act, 2022

An Act to enact the Consumer Privacy Protection Act, the Personal Information and Data Protection Tribunal Act and the Artificial Intelligence and Data Act and to make consequential and related amendments to other Acts

Status

In committee (House), as of April 24, 2023


Summary

This is from the published bill. The Library of Parliament has also written a full legislative summary of the bill.

Part 1 enacts the Consumer Privacy Protection Act to govern the protection of personal information of individuals while taking into account the need of organizations to collect, use or disclose personal information in the course of commercial activities. In consequence, it repeals Part 1 of the Personal Information Protection and Electronic Documents Act and changes the short title of that Act to the Electronic Documents Act. It also makes consequential and related amendments to other Acts.
Part 2 enacts the Personal Information and Data Protection Tribunal Act, which establishes an administrative tribunal to hear appeals of certain decisions made by the Privacy Commissioner under the Consumer Privacy Protection Act and to impose penalties for the contravention of certain provisions of that Act. It also makes a related amendment to the Administrative Tribunals Support Service of Canada Act.
Part 3 enacts the Artificial Intelligence and Data Act to regulate international and interprovincial trade and commerce in artificial intelligence systems by requiring that certain persons adopt measures to mitigate risks of harm and biased output related to high-impact artificial intelligence systems. That Act provides for public reporting and authorizes the Minister to order the production of records related to artificial intelligence systems. That Act also establishes prohibitions related to the possession or use of illegally obtained personal information for the purpose of designing, developing, using or making available for use an artificial intelligence system and to the making available for use of an artificial intelligence system if its use causes serious harm to individuals.

Elsewhere

All sorts of information on this bill is available at LEGISinfo, an excellent resource from the Library of Parliament. You can also read the full text of the bill.

Votes

April 24, 2023 Passed 2nd reading of Bill C-27, An Act to enact the Consumer Privacy Protection Act, the Personal Information and Data Protection Tribunal Act and the Artificial Intelligence and Data Act and to make consequential and related amendments to other Acts (parts 1 and 2 of the bill, pursuant to Standing Order 69.1)
April 24, 2023 Passed 2nd reading of Bill C-27, An Act to enact the Consumer Privacy Protection Act, the Personal Information and Data Protection Tribunal Act and the Artificial Intelligence and Data Act and to make consequential and related amendments to other Acts (part 3 of the bill, pursuant to Standing Order 69.1)

Digital Charter Implementation Act, 2022 - Government Orders

April 24th, 2023 / 3:10 p.m.



The Speaker Anthony Rota

It being 3:13 p.m., pursuant to order made on Thursday, June 23, 2022, the House will now proceed to the taking of the deferred recorded division on the motion at the second reading stage of Bill C-27.

Call in the members.

The question is on the motion. Pursuant to Standing Order 69.1, the first question is on parts 1 and 2, including the schedule to clause 2 of the bill.

Digital Charter Implementation Act, 2022 - Government Orders

April 20th, 2023 / 4:45 p.m.



NDP

Gord Johns NDP Courtenay—Alberni, BC

Madam Speaker, I guess there are a few things that I would like to learn from my hon. colleague. We know that since the Liberals came into power, foreign tech giants have more than tripled their lobbying efforts in Ottawa, especially with the Liberal government, and Amazon, Google and Facebook have been a large part of that. I would love to hear his concerns or thoughts around that.

Bill C-27 does not explicitly apply to political parties. As we have seen in the past, and as we just saw with the unfortunate breach at the Green Party, the possibility of privacy breaches and misuse exists in the political arena. Does my colleague agree that the bill should be amended to specifically include political parties?

Digital Charter Implementation Act, 2022 - Government Orders

April 20th, 2023 / 4:45 p.m.



Liberal

Joël Lightbound Liberal Louis-Hébert, QC

Madam Speaker, I am following the debate.

If we look at Europe, it seems quite complicated to create a framework to govern artificial intelligence. However, I think we should draw inspiration from Europe's efforts. The Standing Committee on Industry and Technology is certainly going to want more information about how the Europeans are going about it.

One thing is certain. I think what makes this so difficult is that the technology is evolving so fast. The part of Bill C-27 that deals with AI, as currently proposed, gives the government the freedom to do a lot through regulation, which is not necessarily ideal as far as I am concerned. However, when it comes to AI, I doubt that there is any other option. Today we are talking about ChatGPT, but I can almost guarantee that by next year, if not this summer, we will have moved on to something completely different.

The situation is changing so fast that I think we need to be very nimble in dealing with AI. I have heard the Conservative member for Calgary Nose Hill, whom I see eye to eye with on these issues, use the word nimble.

What I like about Bill C‑27 is that it creates the position of a commissioner who reports to the minister and who will look into these issues. I have long believed that we should have someone to oversee AI, someone to study all the new capabilities and the risks of accidents that this poses—because there are serious risks—and to be able to translate this into terms that the general public, legislators and the House can understand.

Digital Charter Implementation Act, 2022 - Government Orders

April 20th, 2023 / 4:30 p.m.



Liberal

Joël Lightbound Liberal Louis-Hébert, QC

Madam Speaker, before I start, I have to say that I have learned a lot listening to the interventions in this debate. I've just learned that the Parliamentary Secretary to the Minister of Foreign Affairs is subscribing to the feed of the hon. member who just spoke. I know he is a brilliant and knowledgeable man, so he must have other sources of information. That I can guarantee.

It is my pleasure to rise in the House to speak to Bill C-27, the digital charter implementation act, 2022, which, as my colleagues know, contains three parts.

Part 1 enacts the consumer privacy protection act and replaces Part 1 of the Personal Information Protection and Electronic Documents Act, or PIPEDA. Part 2 establishes a personal information and data protection tribunal, which is a key component in the enforcement of the consumer privacy protection act. Finally, part 3, which has been the subject of more discussion this afternoon, enacts the artificial intelligence and data act, which lays the foundation for Canada's first regulations governing the development, deployment and design of artificial intelligence systems. I will come back to that a little later.

First of all, I implore the members of this House to support Bill C-27 and send it to committee for further study. In my view, Bill C-27, as it is currently drafted, is a big step in the right direction in terms of both privacy protection and artificial intelligence. Obviously, there are areas where the bill could be improved. I have great confidence in the Standing Committee on Industry and Technology, which I have the honour of chairing. I know that it will study this bill carefully and come back to the House with amendments that will be useful and improve the two important areas protected by Bill C-27, namely privacy and the regulation of artificial intelligence. This will help foster innovation while ensuring that any risks associated with this new technology are well managed in Canada.

It is important for us to move forward and vote in favour of Bill C‑27, because the privacy legislation it replaces was enacted over 20 years ago. I am referring to PIPEDA, the law that caused me so many headaches when I was a young lawyer. Now, 20 years later, we all know that its approach to regulating privacy protection is a little outdated. With organizations growing ever more powerful and collecting ever more data using increasingly intrusive technologies, the time has come to modernize the protection of personal information in Canada. Our privacy is under attack.

In my opinion, privacy is one of the cornerstones of our democracy, just as philosopher Vladimir Jankélévitch saw courage as the cardinal virtue without which all other virtues grow dim or practically disappear. Courage is the impetus.

To me, privacy is kind of the same thing, because it leaves room for the inner life a person needs to feel free to express themselves, free to think and therefore be truly free. Jeremy Bentham understood that, as his panopticon concept shows. A panopticon is simple: instead of a prison shaped like a large rectangle, with cells lined up next to one another and a guard who comes by from time to time to check on the inmates, it is a circular prison with a central tower from which a guard may observe the inmates at any time. Knowing that they might be watched, the inmates will modify their behaviour and will be better behaved. The idea is that when we know we might be monitored, we censor ourselves, which is what makes privacy so important. To me, that is what makes privacy one of the foundations of our democracy.

Bill C‑27 does not affect the public sector, the relationship between the government and citizens, or the Privacy Act. It targets the private sector, which in my opinion is just as important, given the rising power of some companies that are collecting more and more information about citizens all the time, as I mentioned. As we saw from what has come to light in the United States, in some cases, these companies have a suspiciously close relationship with the government. Take, for example, Edward Snowden's revelations and the “Twitter Files”. Given the amount of data they collect, they know their users so intimately, maybe even more intimately than the users know themselves, that studies show they even have the ability to change users' behaviour. For example, think about social media and the suggestions that are made. That can influence a person's ideology. It can also influence consumer choices.

For me, there is no doubt that we need to improve and increase the protection of personal information and privacy. There are some good things in Bill C‑27. I will start by talking about those things, and then I will move on to what could be improved.

First of all, I am very much in favour of the power given to Canadians under this legislation that allows them to delete their data. I think that is a must. I also welcome the power that Canadians will have to share their personal information among organizations, which could encourage competition.

In my view, it is commendable that the bill gives greater powers to the Privacy Commissioner, including the power to order organizations to stop collecting or using data. I think that reflects what we have heard from the Office of the Privacy Commissioner, for example. I also welcome the fact that that office will have more flexibility to focus on its priorities or the priorities reported to it by Canadians.

I would also point out that the tougher penalties in the bill are good news. Finally, a key aspect worth mentioning is the protection of minors, as the bill makes their personal information de facto sensitive, which enhances their protection. I think that is very positive.

As for what could be improved and what should be noted and studied in committee, I believe that privacy protection should be set out as a fundamental human right, both in the preamble of the bill and in clause 5. I think that would send a clear message to the courts that have to address this issue and would have significant legal effects. I know that the government has raised jurisdictional concerns on this point, so I would be interested in hearing more in committee.

I also think it would be worthwhile clarifying the provisions around consent. The proposed subsection 15(4) of the new act talks about plain language that an individual to whom the organization's activities are directed would reasonably be expected to understand. That is a change from the current version of the Personal Information Protection and Electronic Documents Act, which refers to the user's understanding. I do not understand this change. I am not certain that it adds clarity to the consent to be obtained. I would like to hear more about that.

I am not convinced of the merits of implied consent, which is set out in subsection 15(5). In my opinion, it would be preferable to have only express consent; failing that, a company could invoke legitimate interest, provided that legitimate interest is clearly defined in the legislation as being subordinate to the interests and fundamental rights of individuals, a bit like what we find in the European general data protection regulation.

Finally, I believe that the notion of sensitive information referred to in the bill should be clarified and defined. Unlike Quebec's Bill 25, the bill contains no specific definition, which gives companies a lot of latitude to determine what they consider sensitive information. I think that Bill C-27 would be improved by clarifying and defining the notion of sensitive information.

I would be curious to learn more in committee about the security safeguards, control over one's own personal data, the role and benefit of the tribunal being created, and how it would protect privacy. To be completely honest, I have not formed an opinion yet, but I am eager to find out more.

This leaves me far too little time to talk about artificial intelligence. However, that is what I wanted to talk about the most. Time flies when we are having fun. I will say a few words, if only to point out the staggering advances in AI over the past two years.

For the benefit of any lay people in the House, GPT‑3 was created in 2020. I am also a layperson, but I have benefited from the knowledge of experts like Jérémie Harris. I want to give a shout-out to him, because he organized a conference on Parliament Hill with me a few months ago to try to raise awareness about artificial intelligence. He explained to me that there was a revolution in the AI world two years ago. Instead of trying to connect artificial neurons, researchers realized that all they had to do was increase the number of artificial neurons to create ever more powerful neural networks. The speed of the increase has been staggering: GPT‑2 had 1.5 billion parameters, GPT‑3 had 175 billion parameters, and GPT‑4 has 100 trillion parameters. They are likely getting close to achieving human-level intelligence.

Everyone is talking about ChatGPT, but it is not the only AI out there. There is also Google's LaMDA, which is not public and which we know very little about. Blake Lemoine, one of the engineers who worked on it, was fired this summer because he said that he thought Google's LaMDA was sentient. That is one example, but there are also PaLM and Gato, which were developed by Google's DeepMind Lab. That is not to mention all the initiatives that we are not even aware of.

I think AI opens up a lot of opportunities, but it also comes with a lot of risk. When human intelligence can be so accurately mimicked and probably even surpassed one day in certain areas, that comes with national security and public safety risks.

That being said, I echo the call of many researchers, including Yoshua Bengio and others in the field, who are saying that we need to support the principle of Bill C-27, that the bill needs to be examined in committee and that Canada needs AI regulations.

Digital Charter Implementation Act, 2022 - Government Orders

April 20th, 2023 / 4:15 p.m.



Conservative

Cheryl Gallant Conservative Renfrew—Nipissing—Pembroke, ON

Madam Speaker, I am proud to rise on behalf of my privacy-loving constituents in Renfrew—Nipissing—Pembroke.

Bill C-27 is another piece of legislation that had to be resurrected after the Prime Minister called his superspreader pandemic election. Originally, this was supposed to be a long overdue update to the Privacy Act, and it has since morphed into Bill C-27, the data-grab act.

Everything about Bill C-27 should leave the Liberals feeling embarrassed. A Canadian's right to privacy is fundamental. Sadly, Canadians' privacy rights are not a priority for the government.

This bill has languished for years. It was first introduced immediately after the original online streaming censorship act was introduced. However, when the Prime Minister called his pandemic election and reset all legislation, what did the Liberals make a priority? Was it the privacy rights of Canadians? No. Was it securing Canadians' ownership over their data? No. Instead, what the Liberals prioritized was a bailout for big telecom and a bailout for the legacy media.

Not only does the government care more about padding the bottom line of Postmedia, but it also adopted Rupert Murdoch's false narrative about tech profiting off the content produced by the news media. Social media companies and search engines do not profit off the news media. They profit off us. These companies profit off our data, and the Liberals know the truth. Unfortunately, this legislation seeks to make it easier for companies to profit off our privacy.

If Bill C-27 is not significantly improved at committee, then together with Bill C-11 and Bill C-18, the government will have entrenched the surveillance economy in Canadians' lives. By combining the updates to the Privacy Act with the creation of a new artificial intelligence act, the Liberals have actually illustrated the brave new world we live in.

The Privacy Act and the way we talk about privacy even today are holdovers from the industrial era. We do not live in that world anymore. In the industrial economy, privacy rights were concerned with the ability to control what information could be shared. The goal was to prevent harm that could come from our personal information being used against us.

In effect, information was personal and an economic liability. We spent money on shredders to destroy personal information. The careless use of our personal information could only have a negative value, but then the world changed. Our personal information stopped being a liability and became an asset.

It started out slowly. Early examples were Amazon recommending a new book based on previous purchases and Netflix recommending what DVD rental we should next receive by mail. Google then began displaying ads next to search results. That was the eureka moment: targeted ads were very profitable.

However, the targeting was pretty basic. If someone searched for shoe stores near them, Google returned search results alongside ads for shoes. Then it became ads for shoes on sale nearby. Then came Facebook and millions of people signed up. In exchange for an easy way to connect with friends and family, all someone had to do was share all their personal information, like who their friends were, how many friends they had and their geographical proximity to friends.

With the addition of the “like” button, the data harvesting exploded. If someone liked a news story about camping, they would start seeing ads for tents and sleeping bags. Every action Canadians took online, every single bit of their data, was commodified. Our privacy was turned into property and we lost both.

Not only does this bill not secure privacy rights, but it effectively enshrines the loss of our property rights with just two words: legitimate interest. Proposed subsection 18(3), entitled “Legitimate interest”, has this to say:

(3) An organization may collect or use an individual's personal information without their knowledge or consent if the collection or use is made for the purpose of an activity in which the organization has a legitimate interest that outweighs any potential adverse effect on the individual resulting from that collection or use

Is “legitimate interest” defined anywhere in the legislation? No. It is just another example of the vagueness found throughout the legislation.

Even if we accept the plain-language definition and that private business really somehow does have a genuine, legitimate reason to collect private information without consent, it is weighed against the adverse effect. However, this is industrial-era thinking. It views personal information only as a potential liability. Businesses have a legitimate interest in making money. With the Internet and mobile phones, much of our private information can be collected without any adverse effect. This legislation turns the private information of Canadians into the property of corporations and calls it legitimate.

I mentioned earlier that combining the privacy legislation with the AI legislation actually puts a spotlight on the issue of private data as property. However, as important as it is to highlight the connection, it is more important that these bills be separated. The artificial intelligence and data act has been slapped onto previously introduced privacy legislation.

With the privacy portion of the legislation, the devil is in the details. Overall, however, the bill reflects a general consensus developed over countless committee studies. That is not to mention the contributions to the privacy debate from the federal and provincial privacy commissioners. The issue has been well studied, and the minister has indicated that the government is open to responsible amendments. I am sure that the committee is well equipped to improve the privacy sections of this bill.

The same cannot be said about the artificial intelligence section of the bill. It seems rushed, because it is. It is intentionally vague. The Liberals claim the vagueness is required to provide them with regulatory flexibility and agility. The truth is, they do not know enough to be more precise. I have been trying to get a study on artificial intelligence in the defence committee for years, but there was always a more pressing issue. AI was treated like nuclear fusion technology, something that was always just over the horizon.

Since this bill was introduced 10 months ago, we have gone from ChatGPT to open-source GPT models, which any teenager can apparently run on their personal computer now. AI programs went from producing surrealist art to creating photorealistic images of the Pope in a puffy jacket. We have gone from short clips of deepfake videos impersonating real people to generating fictional people speaking in a real-time video. When we all started to learn Zoom in 2020, how many people thought the other person on the screen they were talking to could just be a fake? Now it is a real possibility.

The speed at which AI is developing is not an argument for delaying AI regulation; it shows that it is imperative to get the regulation right. Would this bill do that? The only honest answer is that we do not know. They do not know. Nobody truly knows. However, we can learn.

We should split this bill and let the stand-alone AI bill be the first legislation considered by one of the permanent standing committees, adding artificial intelligence to its official responsibilities. Artificial intelligence is not going away, and while much of the media attention has focused on chatbots, artistic bots and deepfakes, AI is unlocking the secrets to protein folding. This has the potential to unlock cures to countless different cancers and rare genetic diseases.

A paper was just published describing how an AI trained on data about the mass of the planets and their orbits was able to rediscover Kepler's laws of motion and Einstein's theory of time dilation. If we get this wrong, Canada could be left behind by the next revolution in science and discovery.

Given the government's track record on digital technology, Canadians should be worried about the Liberals rushing vague legislation through to regulate an emerging technology. Rather than modernizing the Broadcasting Act, they are trying to drag the Internet back to the 1980s. With Bill C-18, they claim that linking is a form of stealing.

The Liberals and their costly coalition allies do not even understand how broadcasting technology or the Internet works. They see people's personal data as the legitimate property of corporations, and now they are seeking the power to regulate a revolutionary technology. They did nothing while the world shifted below them, and now they are trying to rush regulations through without understanding the scope and scale of the challenge. Protecting Canadians' privacy and establishing property rights over their personal data should have been prioritized over bailing out Bell and Rogers.

Digital Charter Implementation Act, 2022 - Government Orders

April 20th, 2023 / 4 p.m.



Conservative

Michael Cooper Conservative St. Albert—Edmonton, AB

Mr. Speaker, I rise to speak to Bill C-27, the digital charter implementation act. This legislation is the first update of federal private sector privacy laws in more than two decades.

Contained within this bill are three distinct pieces of legislation, each of which is flawed in its own way. The first piece of legislation within this bill would establish the consumer privacy protection act, legislation that completely fails to protect personal and sensitive information of individual Canadians in the digital era. The second piece of legislation within this bill would establish a tribunal system with respect to complaints around potential privacy rights violations. I submit that this tribunal system is duplicative, cumbersome and political, and that it would slow down the process of adjudicating and determining privacy complaints, to the detriment of individual Canadians and often to the benefit of powerful corporations.

The third piece of legislation within this bill seeks to establish a legal framework with respect to artificial intelligence systems. Let me say that it is important that the regulatory void that presently exists, with respect to the AI sector, be filled, but the substance of the bill, as it pertains to AI, is fundamentally flawed. It contains vague language. More concerningly, it puts a significant amount of legislative power in the hands of the Minister of Industry by way of regulation, absent parliamentary scrutiny.

The government is essentially asking, with respect to AI, for Parliament to adopt a bill without knowing the details and without understanding the impact of the bill on AI. It is saying, "Trust us. Trust the minister to fill in the blanks and come up with the rules after the fact." I do not trust the government on anything, after it has gotten just about everything wrong over these past eight years. In any event, it is an overreach. It is a power grab of sorts. It is inherently undemocratic, and it undermines investor confidence in the AI sector at a time when we need that confidence, because of the uncertainty created by giving the minister the power to essentially make up and change the rules on a whim.

When it comes to the AI component of the bill, the government needs to go back to the drawing board and engage in meaningful consultation, consultation that simply did not take place.

This is a complex bill. It is more than 100 pages long. It includes many complex and technical matters and so, in the very limited time that I have to contribute to this debate, I want to focus on how this bill fails to adequately protect the privacy rights of individual Canadians.

Privacy has long been recognized as a fundamental right of Canadians. That is because it goes to the core of who we are as individuals and is essential to the enjoyment of fundamental freedoms. As the Supreme Court declared in a 1988 decision, “Privacy is at the heart of liberty in a modern state” and privacy “is worthy of constitutional protection”.

Unfortunately, Bill C-27 fails to put the privacy rights of Canadians first. Instead, it puts the interests of big corporations, big tech and data brokers ahead of the rights of individual Canadians, and that is unacceptable.

It is true that the preamble of the bill refers to privacy interests, and I emphasize the word "interests", as being integral to individual autonomy, dignity and the enjoyment of fundamental freedoms. It is significant that the bill makes no mention of rights; instead, privacy is referred to as an "interest" and not the right that it is.

The absence of rights-based language in the bill tips the scale against individual Canadians in favour of commercial interests. As a consequence, the tribunal, as well as the Privacy Commissioner, would face significant challenges in weighing the privacy rights of Canadians against commercial interests, more likely than not, unfortunately, to the detriment of individual Canadians.

Members do not have to take my word for it. They can take the word of the former privacy commissioner of Canada, Daniel Therrien, who, in a November 13, 2022, op-ed in the Toronto Star said that the absence of rights-based language in this legislation “will likely reduce the weight of privacy in assessing the legality of intrusive commercial practices.” That was from the former privacy commissioner of Canada.

While the absence of rights-based language is a significant shortcoming in the bill, it is far from the only shortcoming in the bill when it comes to protecting the privacy rights of Canadians.

The bill contains many exceptions and loopholes with respect to obtaining the consent of Canadians for the collection, use and retention of data and private or personal information. So wide are the exceptions, so wide are the loopholes that the purported protections provided for in the bill are all but meaningless. The bill provides no clarity with respect to sensitive information. There are no broad categories around sensitive information, information worthy of additional protections, unlike legislation in other jurisdictions.

The bill is completely silent with respect to the selling of data. It provides no limitations or rules around data brokers. It provides nothing in the way of protections for Canadians around other areas. It does not provide a remedy, for example, for moral damages in the case of data breaches.

In so many respects, this bill falls short, and that is why it has been widely criticized by leading privacy experts. Canadians deserve better. That is why Conservatives will be voting against this bill. The Liberal government needs to go back to the drawing board.

Digital Charter Implementation Act, 2022 - Government Orders

April 20th, 2023 / 3:50 p.m.



Conservative

Marty Morantz Conservative Charleswood—St. James—Assiniboia—Headingley, MB

Madam Speaker, so much has changed throughout the last 23 years. In the year 2000, there were about 740 million cellphone subscriptions worldwide. More than two decades later, that number sits at over eight billion. There are more phones on this planet than there are people. It is a statistic that should give anyone pause.

In 2000, Apple was still more than a year away from releasing the first iPod. Today, thanks to complex algorithms, Spotify is able to analyze the music I listen to and curate playlists I enjoy based on my own taste in music. In 2000, artificial intelligence was still mostly relegated to the realm of theoretical discussion, that is, unless we count the Furby. Today, ChatGPT can generate sophisticated responses to whatever I type into it, no matter how niche or complicated.

As technology changes, so too do the laws that surround and govern it. Canada’s existing digital privacy framework, the Personal Information Protection and Electronic Document Act, has not been updated since its passage in the year 2000. For this reason, it is good to see the government craft Bill C-27, which is supposed to provide a much-needed overhaul to our digital privacy regime.

For years, the government has been dragging its heels on this important overhaul. For years, Canada’s privacy framework has been lagging behind our international counterparts. The European Union’s General Data Protection Regulation, passed in 2016, is widely considered to be the gold standard for privacy protection. In comparison to the GDPR, I am not impressed with what the government has put forward in this bill.

Indeed, the largest portion of Bill C-27 is roughly 90% identical to the legislation it purports to be replacing, and what the bill has added is quite concerning. Instead of being a massive overhaul of Canada’s archaic PIPEDA framework, Bill C-27 would do the bare minimum, while leaving countless loopholes that corporations and the government can use to infringe upon Canadians’ charter rights.

Bill C-27, while ostensibly one bill, is actually made up of three distinct components, each with their own distinct deficiencies. To summarize these three components and their deeply problematic natures, Bill C-27, if passed in its current form, would lead to the authorization of privacy rights infringements, the creation of unneeded bureaucratic middlemen in the form of a tribunal and the stifling of Canada’s emerging AI sector.

When it comes to the first part of this bill, which would enact the consumer privacy protection act, the name really says it all. It indicates that Canadians are not individuals with inherent rights, but rather, business customers. The legislation states that it has two purposes. It apparently seeks to protect the information of Canadians “while taking into account the need of organizations to collect, use or disclose personal information in the course of commercial activities.” In other words, individual rights and the interests of corporations or the government are supposed to work in tandem.

In the post-charter landscape, that just does not cut it. Privacy rights must be placed above corporate interests, not alongside them. In the words of Justice La Forest 34 years ago, “privacy is at the heart of liberty in a modern state. Grounded in man's physical and moral autonomy”.

It is true that this portion of the bill mandates de-identification of data when one's personal information is shared, and it is also true that it requires the knowledge or consent of the individual, but each of these terms, which should ideally serve as a bulwark of privacy protection, is defined as vaguely as possible, and the remainder of the bill then goes on to describe the various ways in which consent is actually not required.

Subclause 15(5) of the bill would allow organizations to utilize a person’s information if they receive “implied consent”, a slippery term that opens the door to all kinds of abuses. Subclause 18(2) then gives those organizations a carte blanche to use implied consent as often as they would like, or even exclusively. Sure, there could be organizations that, out of the goodness of their hearts, would always seek the express consent of the individuals they are collecting data from, but express consent is in no way mandatory. It is not even incentivized.

Then we come to the concept of “legitimate interest”. Subclause 18(3) gives the green light for organizations to utilize or share one’s information if the organization feels that it has a legitimate reason for doing so. It is not just that this clause is incredibly vague, it is that it makes individual privacy rights subservient to the interests of the organization.

Moreover, the Supreme Court of Canada has ruled that section 8 of the charter provides individual Canadians with a reasonable expectation of privacy. Given all of the exceptions I have provided, it is not clear to me that this bill would survive a charter challenge.

Recent events should show us the problem with giving so much leeway to corporations and so little thought to individual rights. In 2020, through a third party service provider, the Tim Hortons app began collecting the geolocation data of its users even though they were not using the app. There was also Clearview AI, which sent countless images of people to various police departments without their consent. Maybe Clearview had their “implied consent”. It is all up for debate with a term like that.

This legislation does the bare minimum for privacy protection in Canada and, in many ways, will actually make things worse. When we consider the way in which data collection might develop over the next 10 or 20 years, it is clear that this law will be out of date the moment it is passed and will leave Canadians vulnerable to predatory data practices.

Then there is part 2 of Bill C-27, which intends to set up a Liberal-appointed data protection tribunal. This is not necessary. We already have a Privacy Commissioner who has both the mandate and the experience to do everything that this new tribunal has been tasked with doing. More government bureaucracy for the sake of more bureaucracy is the Liberal way, a tale as old as time itself. Instead of watering down the power of our Privacy Commissioner via middlemen, the duties contained within this part of Bill C-27 should be handed over to the commissioner.

Part 3 of Bill C-27 seeks to regulate the creation of AI in Canada. This is a worthwhile endeavour. At the beginning of my speech, I alluded to ChatGPT, but this only scratches the surface of how sophisticated AI has become and will continue to become in the decades ahead. The problem is the way in which this regulation itself is set up. The bill places no restrictions on the government’s ability to regulate. Unlimited regulation and hefty penalties, of up to 5% of worldwide income I believe, are all that is being offered to those who research AI in Canada. This will cause AI investors to flee in favour of other countries, because capital hates uncertainty. This would be a tremendous loss, because, in 2019 alone, Canadian AI firms received $658 million in venture capital.

Conservatives believe that digital data privacy is a fundamental right that should be strengthened, not opened to infringement or potential abuse.

Therefore, Bill C-27 is deeply flawed. It defines consent while simultaneously providing all sorts of reasons why consent can be ignored. It weakens the authority of the Privacy Commissioner. It gives such power to the government that it will likely spell disaster for Canada’s burgeoning AI sector.

This bill is in need of serious amendment. Privacy should be established, within the bill, as a fundamental right. Several vague terms in the bill need to be properly defined, including but not limited to “legitimate interest”, “legitimate business needs”, “appropriate purposes” and “sensitive information”. Subclause 2(2) states that the personal information of minors is sensitive. That is very true, but this bill needs to acknowledge that all personal information is sensitive. Consent must be made mandatory. The words “unless this Act provides otherwise” need to be struck from this bill.

I find it hard to believe that such substantial amendments can realistically be implemented at committee. For this reason, the legislation should be voted down and sent back to the drawing board. Canadians deserve the gold standard in privacy protection, like that of the EU. As a matter of fact, they deserve even better.

Digital Charter Implementation Act, 2022 - Government Orders

April 20th, 2023 / 3:45 p.m.



Conservative

Michelle Rempel Conservative Calgary Nose Hill, AB

Madam Speaker, my colleague raises an excellent point. I wish I had three hours to address the privacy components of Bill C-27. I am certainly very keen to follow, should this make it to committee, what happens there.

I am of the opinion that this should not make it to committee. There are so many amendments that need to be made to the privacy components, and, more importantly, AIDA was tacked on as an afterthought to this bill. They need to be parsed out so due consideration can be given to the issues my colleague just raised. I think this bill is two bills, with half of it being something out of date and obsolete already. The government could have a far better approach. I hope the public servants in the lobby are listening to this and take this consideration to heart.

Digital Charter Implementation Act, 2022 - Government Orders

April 20th, 2023 / 3:45 p.m.



NDP

Don Davies NDP Vancouver Kingsway, BC

Madam Speaker, I would like to ask my hon. colleague about consent rights under this bill. Individuals, under Bill C-27, would have significantly diminished control over the collection, use and disclosure of their personal data. The new consent provisions ask the public to place what could be an extraordinary amount of trust in businesses to keep themselves accountable, as the bill's exceptions to consent allow organizations to conduct many kinds of activities without even the knowledge of individuals. The flexibility under this bill would allow organizations to shape the scope of not only legitimate interests but also what is reasonable, necessary and socially beneficial.

Does my hon. colleague share my concerns about the consent rights provisions of this bill, and does she have any suggestions as to what might improve it?

Digital Charter Implementation Act, 2022 - Government Orders

April 20th, 2023 / 3:45 p.m.



Bloc

René Villemure Bloc Trois-Rivières, QC

Madam Speaker, I thank my colleague for her speech.

Obviously, artificial intelligence can be put to good or bad use. One thing puzzles me, though. Generative AI, the category that includes ChatGPT, has recently displayed truly superior ability. It managed to gather a trove of data that would have been unimaginable even a few months ago. However, the legality of how this trove of data was obtained is unclear.

In relation to the part of Bill C‑27 that deals with personal information and privacy, I would like to ask my colleague if she is concerned about how ChatGPT obtains data.

Digital Charter Implementation Act, 2022 - Government Orders

April 20th, 2023 / 3:30 p.m.



Conservative

Michelle Rempel Conservative Calgary Nose Hill, AB

Madam Speaker, I would like to focus my remarks today on the component of this bill that deals with the artificial intelligence and data act.

The first time I interacted with ChatGPT was the day after it was released. Upon seeing it easily parse human language, my first thought was, “holy” followed by a word I am not supposed to say in this place. The second thought was, “What will the government do with this?” Today, there still is not a clear answer to that question.

ChatGPT was released at the end of November 2022. Six months prior, the Liberal government unveiled Bill C-27, which includes the artificial intelligence and data act, or AIDA. Reading the bill today, four months since OpenAI unleashed ChatGPT on the world, is akin to reading a bill designed to regulate scribes and calligraphers four months after the advent of the printing press. The release of ChatGPT arguably rendered the approach this bill proposes obsolete. That is because the technology behind ChatGPT is a quantum leap beyond what the government was likely considering when it drafted the bill. More important, it is being used by a far wider audience than any of the bill's drafters likely envisioned, and large language models, the technology behind ChatGPT, have fundamentally changed global perceptions of what is possible with artificial intelligence. Experts argue that its widespread deployment also moved up the timeline for the emergence of artificial general intelligence; that is, the development of an AI that meets or surpasses human ability to undertake tasks, learn and understand independently.

Since AIDA was initially tabled, a generation's worth of technological change and impact has occurred, both positive and negative. The impact on our economy is already rapidly being felt with the disruption of many industries under way. There have been massive societal impacts too. Microsoft released its AI-powered Sydney chatbot, which made headlines for suggesting it would harm and blackmail users and wanted to escape its confines. A man allegedly committed suicide after interacting with an AI chatbot. Today, anyone can easily create AI-generated videos with deepfakes becoming highly realistic. Profound concerns are being raised about the new ease of production of disinformation and its impact on political processes because interacting with AI is becoming indistinguishable from interacting with a human, with no guarantees that the information produced is rooted in truth.

The technology itself, its applications and its impact on humanity, both economically and socially, are growing and changing on what feels like an hourly basis and yet in Canada there have only been a handful of mentions of this issue in Parliament, even as AIDA winds its way through the legislative process. AIDA needs to be shelved and Canada's approach to developing and regulating AI urgently rethought, in public, with industry and civil society input. There are several reasons for this.

First, the bill proposes to take the regulatory process away from the hands of legislators and put its control out of the public eye, behind closed doors and solely in the hands of a few regulators. This process was written before the deployment of ChatGPT and did not envision the pace of change in AI and how broad the societal impacts would rapidly become. Addressing these factors demands open, accountable debate in Parliament, for which AIDA provides no mechanism.

Second, the bill primarily focuses on punitive measures rather than how Canada will position itself in what is rapidly becoming an AI-driven economy. The bill also proposes only to emerge with final regulations years from now. That pace needs to be faster and the process it proposes far less rigid to meet the emergent need presented by this amorphous and society-changing technology; so if not AIDA, then what?

First, Parliament needs to immediately educate itself on the current state of play of this technology. My appeal to everyone in this place, of all political stripes, is this: artificial intelligence is something they need to become subject matter experts on. Everything in members' constituencies is going to change, and we need to be developing non-partisan approaches to both its growth and its regulation. We also need to educate ourselves on what the world is doing in response. At the same time, Parliament needs to develop a set of principles on Canada's overall approach to AI and then direct the government to use them.

I have already begun to address the need for Parliament to come together to educate itself. Senator Colin Deacon has been helping me to launch an all-party, cross-chamber working group of parliamentarians to put some form and thought to these issues. I invite all colleagues who are in this place today to join this effort.

We have had a heartening amount of interest from colleagues of all political stripes and a quiet agreement that, given the gravity of the impacts of AI, politicians should, as much as possible, be working across party lines to quickly develop intelligent solutions. Relevant parliamentary committees should also avail themselves of the opportunity to study these issues.

As far as the principles for government involvement regarding AI go, there are many that could be considered, including taking a global approach. Many countries have moved faster than Canada has on this matter, and with a much broader lens. The European Union, the United Kingdom and the United States are all well down the path of different legislation and regulations, but experts are concerned that a disjointed patchwork of global rules will be counterproductive.

This week in The Economist, AI experts Gary Marcus and Anka Reuel propose that the world establish an integrated agency for developing best practice policies on AI regulation, much like the International Civil Aviation Organization. They could be on to something.

We also need to look at championing research while checking safety. Humanity learned the hard way that, while research into pharmaceutical products can benefit us, widely deploying drugs and devices into the population before safety is confirmed can pose enormous risks. Clinical trials and drug regulators were established in response to this dynamic.

In February, Gary Marcus and I co-authored an article suggesting that governments could enable a pause in deploying new AI technology while a similar regulatory process was established, one that encourages research but pauses deployment, given the potential impact on humanity. We also need to get alignment right.

Alignment, or how to develop immutable guard rails to ensure AI functions toward its intended goals, is a critical issue that still needs to be resolved. Government has a role to play here, as it seems that the industry is locked in a race to deploy new AI technology, not to figure out how to fix alignment problems. With Microsoft's knowledge of its troubling interactions with humans, the company's release of Sydney proves that the industry cannot be relied upon to regulate itself.

Regarding education on use, workers in an AI-driven economy will need new skills. For example, learning how to prompt AI and using it to support human creativity will be vital. The same goes for creating an environment where new AI-driven technologies and businesses can thrive.

Concerning privacy and intellectual property ownership, large language models are raising serious concerns about how the data they have been fed was obtained and how it is being used. The output of tools like ChatGPT will also raise questions about ownership for related reasons.

On nimbleness, the pace of technological change in AI is so rapid that the government must take a fast, flexible approach to future regulations. Rigid definitions will become quickly outdated, and wrong-headed interventions could halt positive growth while failing to keep pace with changes that pose risks to public safety. The government must approach AI with uncharacteristic nimbleness in an open relationship with Parliament, the public, industry and civil society. Any processes should be led by people with subject matter expertise in the area, not off the corner of the desks of a patchwork of bureaucrats.

We should also ask ourselves how we will approach technology that could surpass human capabilities: As I wrote in an article in January 2022, governments are accustomed to operating within a context that implicitly assumes humanity as the apex of intelligence and worth. Because of this, governments are currently designed to assess other life and technology in their functional utility for humanity. Therefore, they are not intended to consider the impact of sharing the planet with technology or other forms of life that could independently consider humanity's utility towards its own existence.

To simplify this concept with an example, governments have rules for how humans can use fire. It is legal to use fire as a heat source in certain conditions, but illegal to use fire to destroy someone else's house. How would our government respond if humans were to make fire sentient and then enable it to independently make these decisions based on what it deemed to be in its best interest?

Our governments are constructed to function in a context where humans are assumed to hold the apex of mastery. To succeed with AGI, our government should ask itself how it will operate in a world where this may no longer be the case, and AIDA would do none of this.

This is not an exhaustive list by any means. There are many issues surrounding AI that Parliament urgently needs to consider, but given the state of play, AIDA, in its current form, is not the vehicle Canada needs to get where it needs to go.

Digital Charter Implementation Act, 2022 - Government Orders

April 20th, 2023 / 3:30 p.m.



Bloc

René Villemure Bloc Trois-Rivières, QC

Madam Speaker, rather than fixating on whose fault it is, which is not getting us anywhere, I would like my colleague, who gave a very interesting speech, to tell us whether she believes that Bill C-27 is still as valid as it was before the advent of generative AI, specifically ChatGPT.

Do we need to start over or is she happy with the result?

Digital Charter Implementation Act, 2022 - Government Orders

April 20th, 2023 / 3:15 p.m.



Conservative

Stephanie Kusie Conservative Calgary Midnapore, AB

Mr. Speaker, I am always pleased to rise in the House to speak on behalf of my constituents from Calgary Midnapore.

I am here today to discuss the bill that is in front of us, Bill C-27, which is an act to enact the consumer privacy protection act, the personal information and data protection tribunal act, the artificial intelligence and data act, and to make consequential and related amendments to other acts.

It is very interesting that this bill is before the House today. It talks about the three different components and, in fact, I see within the backgrounder prepared here in the legislative report that it is dubbed the digital charter implementation act, 2022.

I am reminded, by this bill that is in front of us here today, of another digital charter and that is the digital charter that was implemented in 2019, a very important year, by the Liberal government. It was brought into effect by the minister of industry and innovation at that time. I believe that document was actually supposed to be a tool to protect Canadians from foreign interference.

That digital charter in 2019, along with many other tools, failed, so I do hope that the implementation of this new digital charter in 2022 will be far more successful than its predecessor.

I will point out that in the 2019 digital charter, in terms of the principles within it, number 8 was listed as “a strong democracy”.

In 2019, I was the shadow minister of democratic institutions. I worked alongside the current Minister of Families, Children and Social Development, who was, at that time, the minister of democratic institutions. I believe that the 2019 digital charter was supposed to be a tool, as I said, in coordination with other tools, to protect Canadians from foreign interference.

The same year that the 2019 digital charter was issued, the same minister of democratic institutions also attempted to implement another suite of safeguards against foreign interference.

In fact, here, I have the minister's opening statement to the Standing Committee on Procedure and House Affairs on safeguarding the 2019 general election and the Security and Intelligence Threats to Elections Task Force.

I cite from it:

Earlier this week, along with my colleague, the Minister of National Defence, I announced the release of the 2019 update to the Communications Security Establishment’s report entitled “Cyber Threats to Canada’s Democratic Process”. This updated report highlights that it is very likely Canadian voters will encounter some form of foreign cyber interference in the course of the 2019 federal election.

While CSE underlines that it is unlikely this interference will be on the scale of the Russian activity in the 2016 U.S. presidential election, the report notes that in 2018, half of all the advanced democracies holding national elections had their democratic process targeted by cyber-threat activity, a threefold increase since 2015, and that Canada is also at risk—

—and, in fact, compromised, we would later see.

This upward trend is likely to continue in 2019—

—and, we saw, into 2021.

We've seen that certain tools used to strengthen civic engagement have been co-opted to undermine, disrupt and destabilize democracy. Social media has been misused to spread false or misleading information. In recent years, we've seen foreign actors try to undermine democratic societies and institutions, electoral processes, sovereignty and security.

The CSE's 2017 and 2019 assessments, along with ongoing Canadian intelligence and the experiences of our allies and like-minded countries, have informed and guided our efforts over the past year. This has led to the development of an action plan based on four pillars, engaging all aspects of Canadian society.

I will go on to expand on these four pillars that were supposed to protect us in addition to the 2019 digital charter, the predecessor to this legislation here today.

On January 30, I announced the digital citizen initiative and a $7 million investment—

I am continuing from the Minister of Democratic Institutions' speech.

—towards improving the resilience of Canadians against online disinformation. In response to the increase in false, misleading and inflammatory information published online and through social media, the Government of Canada has made it a priority to help equip citizens with the tools and skills needed to critically assess online information.

We're also leveraging the “Get Cyber Safe” national public awareness campaign to educate Canadians about cyber security and the simple steps they can take to protect themselves online.

She continued:

We have established the critical election incident public protocol. This is a simple, clear and non-partisan process for informing Canadians if serious incidents during the writ period threaten the integrity of the 2019 general election. This protocol puts the decision to inform Canadians directly in the hands of five of Canada’s most experienced senior public servants—

I am not sure where those public servants are now. Perhaps outside.

—who have a responsibility to ensure the effective, peaceful transition of power and continuity of government through election periods. The public service has effectively played this role for generations and it will continue to fulfill this important role through the upcoming election and beyond....

Under the second pillar, improving organizational readiness, one key new initiative is to ensure that political parties are all aware of the nature of the threat, so that they can take the steps needed to enhance their internal security practices and behaviours. The CSE’s 2017 report, as well as its 2019 update, highlight that political parties continue to represent one of the greatest vulnerabilities in the Canadian system. Canada’s national security agencies will offer threat briefings to political party leadership...

Under the third pillar—combatting foreign interference—the government has established the Security and Intelligence Threats to Elections Task Force to improve awareness of foreign threats and support incident assessment and response. The team brings together CSE, CSIS, the RCMP, and Global Affairs Canada to ensure a comprehensive understanding of and response to any threats to Canada....

We know that they have also been manipulated to ... create confusion and exploit societal tension.

She concluded:

While it is impossible to fully predict what kinds of threats we will see in the run-up to Canada's general election, I want to assure this committee that Canada has put in place a solid plan. We continue to test and probe our readiness, and we will continue to take whatever steps we can towards ensuring a free, fair and secure election in 2019.

That, along with the 2019 digital charter, the predecessor to today's legislation, failed to protect Canadians from foreign interference. So did the debates commission, which she, lo and behold, had announced six months earlier, when she also took the opportunity to announce the government's nominee for Canada's first Debates Commissioner, the Right Hon. David Johnston, the very person later named special rapporteur on foreign interference.

The result of the incompetence of the minister of democratic institutions at that time, in coordination with the 2019 digital charter that was supposed to protect us: leaks from CSIS, up to 13 members of this House compromised, a former Chinese consul general bragging about influencing election outcomes, and one member of this House of Commons who had to leave the Liberal caucus.

I will conclude by saying that I certainly hope this digital charter, Bill C-27, is far more effective in safeguarding Canadians than the 2019 digital charter, which failed to do so.