Digital Charter Implementation Act, 2022

An Act to enact the Consumer Privacy Protection Act, the Personal Information and Data Protection Tribunal Act and the Artificial Intelligence and Data Act and to make consequential and related amendments to other Acts

Sponsor

Status

In committee (House), as of April 24, 2023

Subscribe to a feed (what's a feed?) of speeches and votes in the House related to Bill C-27.

Summary

This is from the published bill. The Library of Parliament has also written a full legislative summary of the bill.

Part 1 enacts the Consumer Privacy Protection Act to govern the protection of personal information of individuals while taking into account the need of organizations to collect, use or disclose personal information in the course of commercial activities. In consequence, it repeals Part 1 of the Personal Information Protection and Electronic Documents Act and changes the short title of that Act to the Electronic Documents Act . It also makes consequential and related amendments to other Acts.
Part 2 enacts the Personal Information and Data Protection Tribunal Act , which establishes an administrative tribunal to hear appeals of certain decisions made by the Privacy Commissioner under the Consumer Privacy Protection Act and to impose penalties for the contravention of certain provisions of that Act. It also makes a related amendment to the Administrative Tribunals Support Service of Canada Act .
Part 3 enacts the Artificial Intelligence and Data Act to regulate international and interprovincial trade and commerce in artificial intelligence systems by requiring that certain persons adopt measures to mitigate risks of harm and biased output related to high-impact artificial intelligence systems. That Act provides for public reporting and authorizes the Minister to order the production of records related to artificial intelligence systems. That Act also establishes prohibitions related to the possession or use of illegally obtained personal information for the purpose of designing, developing, using or making available for use an artificial intelligence system and to the making available for use of an artificial intelligence system if its use causes serious harm to individuals.

Elsewhere

All sorts of information on this bill is available at LEGISinfo, an excellent resource from the Library of Parliament. You can also read the full text of the bill.

Votes

April 24, 2023 Passed 2nd reading of Bill C-27, An Act to enact the Consumer Privacy Protection Act, the Personal Information and Data Protection Tribunal Act and the Artificial Intelligence and Data Act and to make consequential and related amendments to other Acts
April 24, 2023 Passed 2nd reading of Bill C-27, An Act to enact the Consumer Privacy Protection Act, the Personal Information and Data Protection Tribunal Act and the Artificial Intelligence and Data Act and to make consequential and related amendments to other Acts

Raquel Dancho Conservative Kildonan—St. Paul, MB

Thank you very much, Mr. Chair.

As I was saying, I'm the vice-chair of the public safety and national security committee, and it's a real pleasure to be with industry today. I appreciate and applaud my colleagues for bringing forward this very comprehensive motion to investigate a very critical issue, which I think many Canadians are paying attention to. The Prime Minister, of course, has weighed in on this as has the Minister of Public Safety.

I would like to hear more from Minister Champagne, given that he is the industry minister lead, of course, and I think it does impact a number of different areas of government, national security and, of course, industry. We can also look at the impact this will have on setting a precedent should we allow these types of contracts to continue.

Now, the government has said that it is pausing this contract, but I do have concerns given that the company that received the contract is ultimately owned in part by the company Hytera, as mentioned by my colleague, which is based in the People's Republic of China. We know that some of that technology in this contract is already being implemented in Ontario and Saskatchewan. I have not heard from Minister Champagne or the Prime Minister or the Minister of Public Safety whether this pause of the contract will mean that this government will be insisting on the removal of that technology that's already in place, again, for surveillance purposes, for RCMP. It's quite shocking when you consider that the parent company, which is in part owned by the People's Republic of China, is now sort of responsible for the surveillance technology of our RCMP.

I would have thought that would be one of the first things they would have committed to. If there were any threat to our national security, in setting a precedent in this surveillance industry that we have in Canada, whether it's for national security or within our telecoms utilized by, for example, the Department of National Defence, you would think they would set a clear standard that this is unacceptable and it would be removed immediately.

We did see, with the Liberal government, they took about five years to commit to removing the Huawei technology, and, because it took so long, it will cost hundreds of millions of additional dollars that will be passed down to the consumer. Huawei and the 5G technology we saw have so infiltrated our telecommunications systems that it will be very hard work to remove that.

I have those same concerns with what's happening here. As my colleague mentioned, earlier this year, I believe on February 22, the U.S. Department of Justice unsealed a federal indictment showing that there were 21 charges of conspiracy to commit theft of trade secrets against Hytera.

We see that in the United States they're being very aggressive and transparent with the threat from Hytera, which again is sort of the parent of the parent company that owns Sinclair. We see the Americans taking very strong action on this, yet we have not seen the Prime Minister or the Ministry of Industry or the Minister of Public Safety make a very clear statement that this surveillance technology that is being provided by this ultimately Chinese-owned company, so to speak, will be removed in Ontario and Saskatchewan.

I'd like to hear that and I'd like the Minister of Public Safety and the Ministry of Industry to come to this committee and make that commitment.

Further to that, Mr. Chair, I am concerned that there may be other contracts like this and that has not been made clear. This was found because of very solid journalism in this country. That's great, but are there more? You would think if there's one, there are likely others. We know that recently the Minister of Foreign Affairs put forward her Indo-Pacific strategy, and that falls under the Canada-China committee, which I also sit on.

There was certainly appreciation for the tougher stance that was communicated in that Indo-Pacific strategy, but what I would say is that the government on one side is saying that procurement is independent. They're blaming the independent system of procurement of this government. They're saying it's independent and they don't agree with it, but it is independent. They're sort of blaming others for what has happened under their watch, but what I would say is that every independent agency of government certainly has to follow the ethos, the values set forward by the Prime Minister and his cabinet.

I would argue that perhaps if the Indo-Pacific strategy for which the Conservatives have been calling for quite some time had been brought forward sooner, the procurement agency would have had a better idea of the threat analysis of China and companies that are partly Chinese-owned that provide surveillance technology and other technologies. Perhaps they would have had that lens to apply to this contract.

I don't believe that it is an appropriate assessment by the Prime Minister to sort of kick this over to the independent procurement agency and say it's all on them. If they had brought forward the Indo-Pacific strategy, which makes quite a bit more clear the threat analysis of China, perhaps the independent procurement agency would have had a more clear picture in order to enter any contracts with companies like this with eyes wide open.

I know there is some discussion around whether this falls under public safety, whether this falls under foreign affairs, whether this falls under the China committee or whether this falls under industry. Certainly, Minister Champagne is bringing forward bills like Bill C-27, which is in part related to the Minister of Public Safety's Bill C-26. Bill C-26 ultimately is a bill to deal with telecommunications in this country and other companies that are providing national security critical infrastructure types of services.

I would say that both committees and both ministers play a role. Given that Bill C-26 and Bill C-27 are closely related in some ways, and given what I know about the industry committee, I think it would make sense and would not be out of scope to have the ministers come forward to this committee.

I hope that members consider that, given that this may be an industry-wide problem, even beyond telecommunications and surveillance. This could be in data management. We can see health services and the privacy information therein. There are countless industries across Canada that may very well have contracts owned in part or in full that are connected to the People's Republic of China.

This is a national security concern. My point is that it also impacts a number of industries, and that's why we're seeing similar bills under Minister Mendicino and Minister Champagne.

I do feel that it is appropriate to set the standard for industry at the industry committee that these types of contracts will not be tolerated any longer. Certainly, we must bring to the attention of the Minister of Procurement and other ministers impacted by this, that, given the very clear message—or, I would say, clearer message—set forth in the Indo-Pacific strategy, there needs to be a whole-of-government approach to reviewing all contracts provided.

The last thing I will mention is that it is not just government contracts that are of concern. There are other private contracts that are of concern in multiple different industries, or there may be. If there's one that got through the procurement vetting process with the Government of Canada, it is very likely that there are a number of private entities that have contracts that would impact our national security and that really go across a number of industries.

I appreciate the very comprehensive 106(4) motion put forward. It certainly is exhaustive, and I think that's important because we want to make sure we don't have any cracks. It is very critical that we ensure that the veil is lifted on this so to speak. By passing this 106(4), the industry committee sends a very clear message to all industries that may have contracts with the People's Republic of China—which may impact data security, surveillance and the like—to take note. The industry committee taking a leadership role in that, I think, sends a very strong message across industries that are critical to our national security.

I hope that the committee considers that. I hope it considers taking that leadership position and certainly leads by example at this committee and sets a very clear tone, so that any industry impacted by national security concerns shall be made aware.

Those would be my remarks. Thank you, Mr. Chair.

Matthew Hatfield

I don't think this bill covers that in itself. I think we need changes under privacy reform with Bill C-27 to also guarantee that.

I do want to flag something in terms of the question of what a repair is. I think focusing on original functionality might be worth looking at, rather than the exact state that the manufacturer handed it off in. Looking at the use case where a manufacturer goes out of business or stops supporting a device, you might need to “modify” the device just to provide security protection or get it up to the standard that the device is intended to operate in.

We should be looking at the consumer's relationship to the device and making sure that looks more or less the same—not necessarily the manufacturer's code.

Matthew Hatfield

That's a huge question. I think in general I'll defer the interoperability discussion to both the Bill C-294 discussion and also looking at our Competition Act—and the privacy act, for that matter, in Bill C-27.

The big picture around interoperability is that many, many digitally savvy companies are locking their consumers within walled gardens. As many people on Twitter know these days, it can be very hard to leave a company once they get you locked in, no matter how you feel about that company. In general, we want to see our government passing legislation that gives consumers real ownership of our data and makes it easy for us to see our data, take our data out of a system and put it into another system. We want them to really facilitate that transfer, because people don't have the options they deserve in terms of who to do business with anymore. A lot of us are locked into commercial relationships that we are not satisfied by.

Digital Charter Implementation Act, 2022Government Orders

November 28th, 2022 / 6:25 p.m.


See context

NDP

Leah Gazan NDP Winnipeg Centre, MB

Mr. Speaker, the private right of action would allow individuals and groups of consumers to seek compensation in court. This has been used effectively in the United States to remedy violations, but it is very burdensome in Bill C-27 to make it even usable.

For example, if the Privacy Commissioner does not investigate or rule on a complaint, an individual has no right of action. If the Privacy Commissioner does investigate and rule on a complaint but the tribunal does not uphold it, the individual has no right of action. These are a couple of examples.

Does my hon. colleague feel that this bill should be amended to fix this?

Digital Charter Implementation Act, 2022Government Orders

November 28th, 2022 / 6:05 p.m.


See context

Conservative

Tracy Gray Conservative Kelowna—Lake Country, BC

Mr. Speaker, it is always a privilege to rise on behalf of the residents of Kelowna—Lake Country. Today we are debating Bill C-27, an act that would enact the consumer privacy protection act, the personal information and data protection tribunal act and the artificial intelligence and data act.

Canadians know we no longer live in the year 2000, but unfortunately much of our digital regulation still does. We have come a long way since Canadians' primary online concern was Y2K. The last time Parliament passed a digital privacy framework was PIPEDA, or the Personal Information Protection and Electronic Documents Act, on April 13, 2000. The most popular website in Canada that month was AOL.

When Parliament last wrote these regulations, millions of homes did not have dial-up, let alone Wi-Fi. Cellular phones lacked apps or facial recognition, and people still went continually to libraries to get information, and did not have the Alexas of the world as an alternative. They also called restaurants directly for delivery. Digital advertising amounted to flashing banners and pop-up ads.

In only 22 years, we have experienced a paradigm shift in how we treat privacy online. Personal data collection is the main engine driving the digital economy. A Facebook account is now effectively required to use certain types of websites and help those websites; a laptop can create a biometric password for one's bank account, and Canadians are more concerned about privacy than ever before.

One of the most common videos I share with residents in my community of Kelowna—Lake Country is one relating to privacy concerns during my questioning at the industry committee in 2020, as many people reached out to me about privacy concerns. It was to a Google Canada representative regarding cellphone tracking. This was in the immediate aftermath of reports of Canadians' cellphone data being used to track people's locations during the pandemic.

Cellphone tracking is something I continue to receive correspondence about, and I am sure other members in the House do as well. As traditionally defined, our right to privacy has meant limiting the information others can get about us. The privacy of one's digital life should be no different from the physical right to privacy on one's property. Canadians must have the right to access and control the collection, use, monitoring, retention and disclosure of their personal data.

Privacy as a fundamental right is not stipulated in the legislation we are discussing today, Bill C-27. It is mentioned in the preamble, which is the narrative at the beginning, but that is not binding. It is not in the legislation itself. While the degree to which someone wishes to use this right is ultimately up to the individual, Parliament should still seek to update the rules using detailed definitions and explicit protections. Canadians are anxious to see action on this, and I have many concerns about this legislation, which I will outline here today.

As drafted, Bill C-27 offers definitions surrounding consent rules to collect or preserve personal information. It would mandate that when personal information is collected, tech companies must protect the identity of the original user if it is used for research or commercial purposes. The legislation outlines severe penalties for those who do not comply and would provide real powers of investigation and enforcement. It presents Canada's first regulations surrounding the development of artificial intelligence systems.

Even though Bill C-27 presents welcome first steps in digital information protection, there is still a long way to go if we are to secure digital rights to the standard of privacy regulation Canadians expect, and most importantly, the protection of personal privacy rights. As is mentioned in Bill C-27, digital privacy rights are in serious need of updating. However, they are not in this legislation.

I agree with the purpose of the legislation, but many of my concerns are about inefficient, regulatory bureaucracy being created and the list of exemptions. Also, the artificial intelligence legislation included in this bill has huge gaps and should really be its own legislation.

From a purely operational perspective, while the legislation would empower the Privacy Commissioner's office with regard to compliance, it also constructs a parallel bureaucracy in the creation of a digital tribunal. If Bill C-27 is enacted, Canada's Privacy Commissioner can recommend that the tribunal impose a fine after finding that a company has violated our privacy laws. However, the final decision to pursue monetary penalties would ultimately rest with the new tribunal. Will this result in a duplicate investigation undertaken by the tribunal to confirm the commissioner's investigation?

As someone who has operated a small business, I am all too aware of the delays and repetitiveness of government bureaucracy. While it is important to have an appeal function, it is evident in this legislation that the Liberals would be creating a costly, bureaucratic, regulatory merry-go-round for decisions.

Canadians looking to see privacy offenders held accountable need to see justice done in a reasonable time frame. That is a reasonable expectation. Why not give Canada's Privacy Commissioner more authority? Of course, Canadian courts stand available. The EU, the U.K., New Zealand and Australia do not have similar tribunals to mediate their fines.

In addition to concerns about duplications of process, I am worried that we may be leaving the definitions of offending activity too broad.

While a fairly clear definition in Bill C-27, which we are debating here today, has the consent requirement for personal data collection, there is also a lengthy list of exemptions from this requirement. Some of these exemptions are also enormously broad. For example, under exemptions for business activities, the legislation states:

18 (1) An organization may collect or use an individual’s personal information without their knowledge or consent if the collection or use is made for the purpose of a business activity described in subsection (2) and

(b) the personal information is not collected or used for the purpose of influencing the individual’s behaviour or decisions.

On plain reading, this exemption deals more with the field of human psychology than with business regulation.

Also in the legislation is this:

(3) An organization may collect or use an individual’s personal information without their knowledge or consent if the collection or use is made for the purpose of an activity in which the organization has a legitimate interest that outweighs any potential adverse effect on the individual resulting from that collection or use

There is also an exemption to consent that would allow an organization to disclose personal information without the individual's knowledge or consent for a “socially beneficial purpose”. This is defined as “a purpose related to health, the provision or improvement of public amenities or infrastructure, the protection of the environment or any other prescribed purpose.” Who determines what constitutes a socially beneficial purpose? This sounds incredibly subjective, and I have a lot of concerns when legislation is this vague.

Let me give a very simple example. Suppose a person using a coffee company app occasionally adds flavourings to their coffee while doing a mobile order. That company could recommend a new product with those flavourings already in it while a person is not physically in their business. Is this not personal information that is collected and used for the purpose of influencing an individual's decision, as in this legislation?

This example is not hypothetical. In an investigation from actions in 2020, Tim Hortons was caught tracking the locations of consumers who had the app installed on their phones even when they were not using the company's app. Tim Hortons argued that this was for a business activity: targeted advertising. However, the report from the federal Privacy Commissioner found that the company never used it for that purpose. Instead, it was vacuuming up data for an undefined future purpose. Would Tim Hortons have been cleared if the current regulations in Bill C-27 were in place and if it had argued that the data was going to be used for future business activity or for some socially beneficial purpose, which is an exemption in the legislation?

While I worry about the loopholes this legislation, Bill C-27, may create for large corporations, I am equally concerned about the potential burden it may place on start-ups as well. This legislation calls for companies to have a privacy watchdog and to maintain a public data storage code of conduct. This is vital for companies like Google, Facebook or Amazon, which have become so integral to our everyday lives and oversee our financial details and private information. Having an officer internally to advocate for the privacy of users is likely long overdue. However, while that requirement would not put much financial burden on these Fortune 500 companies, it could undermine the ability of Canadian digital innovators to get started.

Canada has seen a boom in small-scale technology companies for everything from video game and animation studios to wellness or shopping sites for almost every good or service one could imagine. Digital privacy laws should be strong enough to not require a start-up with just a few staff to have to be mandated to have such a position internally. We should ensure that a concept of scale is appropriately applied in regulating the giants of today without crushing the future digital entrepreneurial spirit of tomorrow.

I would like to address the presence of Canada's first artificial intelligence, or AI, regulations in this bill. While I do welcome the progress on recognizing this growing innovation need for a regulatory framework, I question whether it is a topic too large to be properly studied and included in this bill. In just the last few months, we have seen the rapid evolution of the ability of AI to create an online demand digital artwork, for example, thanks to the self-evolving abilities of machine learning.

The impact of AI on everything from our foreign policies to agriculture production is evident. Computer scientists observed a phenomenon known as Moore's law, which showed that the processing power of a computer would exponentially double every two years, and in the 57 years since this was proposed, this law has apparently not been broken.

I am concerned that most of the rules around AI will be in regulation and not in legislation. We have seen the Liberals do this many times. They do not want to do the hard work to put policies into legislation that will be brought to Parliament and committees to be debated and voted on. They prefer to do the work behind closed doors and bring forth whatever regulations they want to impose without transparency and scrutiny. We have seen the Liberals conduct themselves many times in this way.

Experts in the field have already made the case that Bill C-27 falls seriously short of the global gold standard, the EU's 2016 General Data Protection Regulation. Canadians deserve nothing less.

Though Conservatives agree with the premise of strengthening our digital privacy protection, this bill has many concerns and gaps. Clause 6 outlines that privacy protections do not apply with respect to personal information that has been anonymized. To anonymize is defined in the legislation as “irreversibly and permanently modify personal information, in accordance with generally accepted best practices, to ensure that no individual can be identified from the information, whether directly or indirectly, by any means.”

There are a lot of risks around this. Under this legislation, information could be disclosed in numerous ways, and that is very concerning. This goes back to what I mentioned at the beginning of my speech with respect to my questioning of Google Canada early in the pandemic about tracing the locations of people through their phones and sending it to the government.

The legislation creates more costly bureaucracy. It does not protect personal privacy as a fundamental right. It has questionable exemptions to protect the privacy of people based on ideologies. It allows the government to create large areas of regulations with no oversight or transparency and it is far from the gold standard that other countries have.

Digital Charter Implementation Act, 2022Government Orders

November 28th, 2022 / 6:05 p.m.


See context

NDP

Lisa Marie Barron NDP Nanaimo—Ladysmith, BC

Mr. Speaker, I want to thank my colleague.

Bill C‑27 does not explicitly apply to political parties. As we have seen in the past, the potential for invasion of privacy and misuse exists in the political arena. I was wondering if my colleague would agree that the bill should be amended to specifically include political parties.

Digital Charter Implementation Act, 2022Government Orders

November 28th, 2022 / 6 p.m.


See context

Bloc

Sébastien Lemire Bloc Abitibi—Témiscamingue, QC

Mr. Speaker, I thank my colleague from Winnipeg North for his remarks.

Indeed, I think such a bill was urgently needed. I commend the government's leadership and congratulate it on having understood the errors in Bill C-11 and making some improvements.

I met with the Minister of Innovation, Science and Industry in January, when it was time to think about developing this bill. I emphasized the importance of the Quebec legislation and of ensuring its primacy. I thank him for listening to me and for the respect evident in Bill C-27.

With respect to the urgent need to take action, Europe is putting a lot of pressure on us. Indeed, Europe has set guidelines and is currently threatening to withdraw its confidence in our artificial intelligence systems in Canada, particularly in the banking sector. It was necessary to act; better late than never.

I hope the principle will be adopted quickly, but more importantly, I hope that the committee work will be thorough and that the experts will be heard. This will be more than welcome.

Digital Charter Implementation Act, 2022Government Orders

November 28th, 2022 / 6 p.m.


See context

Winnipeg North Manitoba

Liberal

Kevin Lamoureux LiberalParliamentary Secretary to the Leader of the Government in the House of Commons

Mr. Speaker, I would concur with the member and the many others who are, in essence, saying that Bill C-27 is a substantive piece of legislation that is ultimately designed to ensure privacy for Canadians.

As I made reference to earlier, I think we could look at how effective the legislation of the Quebec legislation has been, which was passed just over a year ago, and what the response has been to it. I understand that was what the member was saying. Taking into consideration AI, the tribunal, digital and just how much the digital economy has grown, 20 years ago is the last time we have seen any sort of substantive changes to our privacy legislation.

I am wondering if the member could provide his thoughts in regard to why it is important that we update and modernize. After all, 20 years ago, we did not even have iPhones.

Digital Charter Implementation Act, 2022Government Orders

November 28th, 2022 / 5:50 p.m.


See context

Bloc

Sébastien Lemire Bloc Abitibi—Témiscamingue, QC

Mr. Speaker, I am pleased to speak to this bill after my colleague from Rivière‑des‑Mille‑Îles, whom I would like to congratulate. I am also pleased to be following my colleague from Trois‑Rivières, an ethics expert who enlightened us on the potential impact of this bill and the dangers involved.

Unfortunately, very few people are interested in this type of bill, and yet, in the digital age, we cannot afford not to regulate the use of personal information. We cannot deny the fact that the digital shift has exploded in Quebec and elsewhere over the last decade, and it has greatly changed our lifestyles.

It is impressive to see which path companies have chosen during the pandemic, and I think it is a timely discussion to have today. However, I would like to draw attention to the new part of the bill that deals with artificial intelligence. I think it deserves serious consideration.

Part 3 of the bill raises many questions, and opinions from experts in the field of artificial intelligence are mixed. The use of artificial intelligence is a rapidly growing field that risks expanding beyond our control and jurisdiction if we do not begin to regulate the practice and define certain concepts.

Recent developments in AI in general and deep learning in particular have led to the creation of autonomous intelligent agents, which are essentially robots capable of deciding what to do without third-party intervention. These agents' autonomy raises new questions about civil liability, so we have to think about criminal provisions that would apply if someone were put in a dangerous situation, for example.

How should we approach this, and what legal status are we granting them? What legislative framework is the best fit for these autonomous agents?

At this point, we think some important definitions are missing. The law clerks who are examining the bill's provisions from a legal standpoint told us that again today. What is a high-risk intelligence system? What is a high-impact system?

The algorithms produced in applications that use artificial intelligence enable artificial beings to create goods or services or to generate predictions or results. If we compare them to human beings and use the existing framework, how will we interpret the notions of independence and unpredictability attributable to these artificial beings? The experts will help us understand all that.

Quite a few goods already exist that have a layer of artificial intelligence built into them, and 90% of those goods should not pose a problem. Experts at Meta have even said that this technology has reached its limits, because the data to train an algorithm is insufficient in quantity and lacks depth.

Let us get back to the main problem we have with Bill C‑27. Until the department clarifies its thinking on what constitutes a high-impact system, it will be difficult to assess the scope of part 3. Let us assume that everything can be considered high risk. This would mean that many companies would be accountable. If we had greater accountability, the Googles of this world might be the only ones that could risk using artificial intelligence.

The bill does not need to cover everything a machine can do for us or everything software can do once it is developed and generates predictions and results like a calculator.

If we compare it to the European legislation, we note that the latter is currently targeting employment discrimination systems, systems that would determine whether or not a permit to study there can be granted. That is essentially the limit of what the machine can do in our place.

Although the law in this document concerning artificial intelligence is far from being exhaustive, I believe it is important that we start somewhere. By starting here, with a framework, we can lay the groundwork for a more comprehensive law.

My speech this evening will help my colleagues better understand what needs to be clarified as soon as possible so we can have an important discussion about how to regulate the applications that use artificial intelligence and how to process these systems' data.

First, we will have to implement regulations for international and interprovincial exchanges for artificial intelligence systems by establishing Canada-wide requirements for the design, development and use of AI systems. Next, we must prohibit certain uses of AI that may adversely affect individuals.

The legislation is very clear on many other aspects, including on the fact that there would be a requirement to name a person responsible for artificial intelligence within organizations that use this technology. The responsibilities are fairly extensive.

In addition to the artificial intelligence and data act, which is in part 3, Bill C‑27 also includes, in part 1, the consumer privacy protection act, as well as the amendments to the former legislation. Part 2 of the bill enacts the personal information and data protection tribunal act, while part 4 includes the coming into force provisions of the bill.

As my colleagues explained, the other sections of the bill contain a lot of useful elements, such as the creation of a tribunal and penalties. One of the acts enacted by Bill C‑27 establishes a tribunal to process complaints under litigation when it comes to the use of private data. In case of non-compliance, the legislation provides for heavy penalties of up to 3% of a multinational's gross global revenue. There are provisions that are more in favour of citizens when a company misuses digital data.

Yes, this bill does have its weaknesses. I believe those weaknesses can be addressed in committee, but they may require the introduction of new legislative measures. Public services, however, are not covered by this bill. Data in the public sector requires a greater degree of protection; this bill covers only the private sector. Take, for example, CERB fraud and the CRA. In 2020, hackers fraudulently claimed $2,000 monthly payments and altered the direct deposit information for nearly 13,000 accounts.

The government can do more to tackle fraud. Unfortunately, this bill offers no relief or recourse to those whose information has already been compromised. There are digital records of nearly every important detail about our lives—financial, medical and education information, for example—all of which are easy targets for those who want to take advantage. It has been this way for a while, and it is only going to get worse when quantum computers arrive in the very near future.

This means that we must find and develop better means of online identity verification. We must have more rigorous methods, whether we are changing our requirements for passwords, for biometrics or for voice recognition.

Recently, at the sectoral committee, we heard about how easy it is for fraudsters to call telecommunication centres and pass themselves off as someone else to access their information. We must improve identity verification methods, and we must find a way to help those who are already victims of fraud. We must do so by amending Bill C-27 or introducing an additional legislative measure.

Since this is a fairly complex bill, it will be referred to the Standing Committee on Industry and Technology, where we will have the opportunity to hear from experts in the field. At this step, I would like to recognize the leadership of the Minister of Innovation, Science and Industry and his team. We have been reassured by the answers we have received.

Since Quebec already has data protection legislation—Bill 64, which became law 25—we want to understand when the federal act will apply and whether the changes we requested to Bill C-11, introduced in the previous Parliament, were incorporated into this bill. I want to say that we are satisfied with the answers we have received so far.

We will do our due diligence because this bill includes a number of amendments. Obviously, the devil is in the details. During the technical briefings held by the department since Bill C-27 was published, we asked how much time businesses would have to adjust their ways of doing things and comply with the legislation.

We expect that there will be a significant transition period between the time when Bill C-27 is passed and when it comes into force. Since the bill provides for a lot more penalties, the government will likely hold consultations and hearings to get input from stakeholders.

In closing, I would like to say that I have just come back from Tokyo, where I accompanied the Minister of Innovation, Science and Industry to the Global Partnership on Artificial Intelligence Summit, where Quebec and France took the lead. The first summit was held in 2020. I would like to list some important values that were mentioned at this summit that deserve consideration and action: responsible development, ethics, the fight against misinformation and propaganda, trust, education, control, consent, transparency, portability, interoperability, strict enforcement and accountability. These are all values that must accompany open data and ecosystems.

Digital Charter Implementation Act, 2022Government Orders

November 28th, 2022 / 5:35 p.m.


See context

Bloc

Luc Desilets Bloc Rivière-des-Mille-Îles, QC

Mr. Speaker, I will be sharing my time with my hon. colleague from Abitibi—Témiscamingue, whom I commend for his hard work.

Today, I am pleased to speak to a bill that is as necessary as it is complex. As written, the bill has some grey areas, some things the Bloc Québécois has reservations about, but we do think it has a lot of potential.

Bill C‑27 enacts the consumer privacy protection act. Sponsored by the Minister of Innovation, Science and Industry, the member for Saint-Maurice—Champlain, the bill is at second reading. It would create three different acts: the consumer privacy protection act, the personal information and data protection tribunal act, and the artificial intelligence and data act. That last one is very interesting.

In essence, Bill C‑27 seeks to strengthen the protection of anonymity and privacy. Now that digital technology is omnipresent in our lives, it is harder than ever to make sure our privacy and personal information are protected.

Until now, organizations of every kind have taken advantage of the absence of a legal consumer protection framework. In Canada, personal information is a commodity without a legal owner.

Just look at the Cambridge Analytica scandal during the 2016 U.S. election. Bill C‑27 aims to change this sorry state of affairs, which is threatening our democracy, our privacy and social peace. The bill not only limits and restricts the excessive freedom enjoyed by organizations that collect and share our data, but it also gives them responsibilities. In short, it puts the individual and the idea of consent back at the centre of reflections on digital exchanges, and that is significant.

The Bloc Québécois supports this bill because it partially fills a legal void in Canada. I say “in Canada” because the Quebec National Assembly passed Law 25 on the protection of personal information way back in September 2021. It is a well-written law. Bill C‑27 is actually largely modelled after it, and we are very proud of that.

Given that the protection of personal information is a shared jurisdiction, it is vital to the Bloc Québécois that Bill C‑27 not take precedence over Quebec law. This does not seem to be the case at this time, but it will be up to the committee to verify this and ensure that it does not.

Speaking of the committee stage, many grey areas still need to be clarified. According to Daniel Therrien, a former privacy commissioner of Canada, Bill C-27 is too timid in its current form.

I myself have thought of something that could be studied at the committee stage, and that is image copyright. Since we are speaking about consent, the protection of anonymity, personal data and the need to adapt our legal framework to the digital era, I believe that it would be highly relevant to address this subject.

Just like the digital world, the world of photography has changed a great deal over the past 20 years. Thanks to smartphones, and the fact that just about everybody owns one, or even two, more and more photos are being taken. According to some estimates, more than three billion photographs are taken every day around the world. An image is a form of personal information. The use and sharing of images are intrinsically linked to the principle of consent. If no consent is obtained, that is a breach of privacy.

I believe that our current interpretation of image copyright is too strict, and this is detrimental to street photography and photojournalism. My father, Antoine Desilets, a photojournalist, was also a street photographer in his own way. Street photography is generally defined as photography done outdoors whose main subjects are people in spontaneous situations and in public places such as streets, parks, beaches and protests.

A good example of this kind of photography is the famous photograph The Kiss by the Hôtel de Ville, taken by the renowned French photographer Robert Doisneau. That shot has actually been the subject of multiple lawsuits, with every Dick and Jane claiming to be one of the two main figures in the picture.

Let me tell a little story from closer to home. In 1987, a Quebec photographer and friend by the name of Gilbert Duclos took a picture of a woman in the street. After the photograph was published in a magazine, the woman decided to sue Gilbert Duclos. She claimed that she was being mocked by her friends and felt that she had been wronged.

After a two-year legal saga that reached the Supreme Court, the woman won. For more than three decades, that decision, known as the Duclos decision, has been a precedent.

The debate was recently reignited by the case of a veiled woman and her husband who were photographed at a flea market in Sainte‑Foy. Since the photograph had been published without their consent, the photographer was forced to pay $3,500 to each of the two people in the photograph, even though the individuals were veiled. There is no doubt that the Duclos decision was used to bolster the plaintiffs' case.

Today, it is very easy to take a photographer to court and win. This means that many photojournalists and street photographers get sued, so unfortunately, they have to practise a form of self-censorship to protect themselves and the newspapers they work for. I believe this self-censorship has grave consequences for the arts, journalism and archive building. As it happens, on October 1, a group of 12 street photographers, led by the esteemed Jean Lauzon, published a book entitled Le droit à l'image as a commentary on this very issue.

The Bloc Québécois believes that the committee that will study Bill C-27 will have to take its time and question all the experts it needs to consult in order to come up with an ironclad law. I have a suggestion. Since we are discussing consent, privacy, the right to anonymity and personal data in the digital age, why not invite experts such as Jean Lauzon to help us understand how to modernize image copyright?

Also, when does an image of an individual taken in a public space become private? Once again, there is the need for oral or written consent on the one hand, and perhaps the definition of the concept of a subject on the other. There is a whole host of factors to consider.

For the rest, I am in favour of Bill C‑27 because it gives hope that we are going to begin to plug the gaping hole that our data is currently circulating in, allowing it to be sold and exploited.

It will be especially important to ensure that the Quebec legislation takes precedence over the Canadian legislation, as is customary in matters of shared jurisdiction.

Digital Charter Implementation Act, 2022Government Orders

November 28th, 2022 / 5:20 p.m.


See context

Green

Elizabeth May Green Saanich—Gulf Islands, BC

Madam Speaker, I acknowledge that I am standing today, as any day that I am on Parliament Hill, on the Algonquin land of the Anishinabe peoples. I say a large meegwetch to them.

I am speaking today, as we all are, to Bill C-27, which is really three bills in one. My other parliamentary colleagues have already canvassed the bare outline of this, in that we are looking at three bills: an act to create a consumer privacy protection act; a personal information and data protection tribunal act, which largely replaces some of what there was already in PIPEDA in the past; and a brand new artificial intelligence and data act.

I want to start with the artificial intelligence and data act because it is the part with which all of us are least familiar. Much of what we see in this bill was previously before Parliament in last session's Bill C-11. There is a lot to dig into and understand here.

As I was reading through the whole concept of what kinds of harms are done by artificial intelligence, I found myself thinking back to a novel that came out in 1949. The kind of technology described in George Orwell's book, famously called 1984, was unthinkable then. The dystopian visions of great writers like George Orwell or Margaret Atwood are hard to imagine. I will never forget the scene in the opening of The Handmaid's Tale, where a woman goes into a store and her debit card is taken from her. At that moment, we did not have debit cards. Margaret Atwood had to describe this futuristic concept of a piece of plastic that gave us access to our banks without using cash. No one had heard of it then.

There are words from George Orwell, written in 1949, about the ways in which artificial intelligence and new technologies could really cause harm in a dystopian sense. In 1984, he writes, “It was terribly dangerous to let your thoughts wander when you were in any public place or within range of a telescreen. The smallest thing could give you away.”

More recently, there is the song by The Police and written by Sting and others. I will never forget that once I went to a session on rights to privacy being under assault and a British jurist brought with him for his opening of the speech, “Every breath you take, And every move you make, Every bond you break, Every step you take, I'll be watching you.”

We live in a time when artificial intelligence can be enormously invasive of our privacy with things like visual recognition systems, as the hon. member for Selkirk—Interlake—Eastman was just speaking to. These are things that, for someone like me born in 1954, are all rather new, but they are new for people born in 1990 too. It is very new technology and bringing in legislation to control it is equally new and challenging for us as parliamentarians. The whole notion that we are going to be able to spot the ways in which artificial intelligence can affect our democracy is something that will take time.

We talk about harms from this kind of technology, from capturing algorithms, from invading our spaces. We do not have to look any farther than the way Cambridge Analytica was used by the Brexit forces in the U.K. to harness a public outrage against something based on a pile of disinformation, by targeting individuals and collecting their data. That kind of Cambridge Analytica concern also gets into part 1 and part 2 of this bill. We really do need to figure out how to control the digital tech giants harvesting our information.

As an example used earlier today in debate, there is the idea that big digital giants and large corporations can profit from data without the consent of Canadians who may have put a family photo on social media, never knowing that their privacy has been invaded and their personal information and photos have been used for profit without their permission. In this sense, I am going to flag that in the context of the artificial intelligence and data act, I hope we will be taking the time necessary to hear witnesses specifically on this.

We have developed a pattern in recent years, which is to say the last decade or so, of having three or four witnesses appear on panels. All of us in this place know that committees are trying to hear from a lot of people and receive a lot of evidence. It will do us a disservice in our dive into the artificial intelligence and data act if we combine panels of people who are experts on PIPEDA and people who are experts on other aspects of this bill, with panels on artificial intelligence and data.

The committees that study this bill will control their own process. Committees are the masters of their own process, but I would urge the government, the Liberal legislative managers of this piece of legislation, Bill C-27, to follow the lead of the Speaker's ruling earlier today. If we are going to vote on the artificial intelligence act as a separate piece when we come to vote, we could at least make an effort to ensure that the concentrated effort of committee members and hearing witness testimony is not diluted through several different pieces of legislation and panels with three or four witnesses.

Members' questions will inevitably and invariably go to one or two. In this format of panels and pushing witnesses through quickly, we lose a lot of content. Compared with when I worked in government back in the 1980s, which I know seems like the dark ages and no one in this room was on committees in those days, committees would hear from a witness who could speak for 15 minutes and then we would have the rest of an hour to ask that one witness questions. Now that we are into something as complicated as this area, I would urge the committee to give it that kind of attention or to ask the government to send part 3, the artificial intelligence and data act, to a different committee, so that the study can be thorough and we can educate ourselves as to the unintended consequences that will inevitably occur if we go too fast.

Turning to the parts of the bill that deal with privacy, I want to put on the record again a question that was raised just moments ago about whether privacy legislation should apply to political parties in Canada. At the moment, it does not. Political parties are exempted from the kinds of privacy protections that other organizations, NGOs and corporations must use to protect the privacy information of their customers, consumers and citizens.

The Green Party of Canada believes it is essential that political parties be added to the list of organizations that have an obligation to protect the privacy of Canadians.

I will say quickly that I tend to agree with the first analysis of one of the NGOs that are very concerned with privacy information. OpenMedia, in an article by Brian Stewart, says very clearly that this legislation could actually make things worse for some privacy protections. They give the efforts of Bill C-27's consumer privacy protection act and its personal information and data protection tribunal act a grade of D. In other words, it passes but just barely. There will be many witnesses.

I can certainly confirm that, as a Green Party member of Parliament in this place, I will be bringing amendments forward, assuming this bill gets through second reading, which I think we can assume, and ends up at committee.

In the time remaining, I want to emphasize that Canada is aware that privacy is a fundamental human right. It is part of the UN declaration on the rights of individuals. I echo some of the sentiments from the hon. member for Selkirk—Interlake—Eastman in asking why we are looking at consumer privacy. Maybe we should change that word to Canadians' rights and privacy.

I also agree with many members who have spoken today about the problems of subclause 18(3) and the number of exemptions along with the question of what is a “legitimate” reason that people's privacy can be invaded. That should be further clarified. I find “a reasonable person would expect the collection or use for such an activity” to be fine, but the exemptions seem overly broad.

If I dive into anything else I will go over my allotted time.

This is important legislation. We must protect the privacy of Canadians. I think we will call on all parties in this place to set aside partisanship and make an honest effort to review it. That is not to delay it but to make an honest effort to review the bill before it leaves this place.

Digital Charter Implementation Act, 2022Government Orders

November 28th, 2022 / 5:20 p.m.


See context

NDP

Leah Gazan NDP Winnipeg Centre, MB

Madam Speaker, Bill C-27 does not explicitly apply to political parties, and in the past we have seen the possibility of privacy breaches and misuse in the political arena.

Should the bill be amended to specifically include political parties?

Digital Charter Implementation Act, 2022Government Orders

November 28th, 2022 / 5:15 p.m.


See context

The Assistant Deputy Speaker (Mrs. Alexandra Mendès) Alexandra Mendes

For clarification, I would point out that Bill C-27 has not been divided and only the vote will be done separately.

The hon. parliamentary secretary.

Digital Charter Implementation Act, 2022Government Orders

November 28th, 2022 / 5:15 p.m.


See context

Bloc

Sébastien Lemire Bloc Abitibi—Témiscamingue, QC

Madam Speaker, I would like to thank my colleague from Halifax for his speech. I am sure he will work hard in committee to defend the integrity of this bill. He can count on the Bloc Québécois's support for the principle of the bill.

The Chair delivered a ruling earlier this afternoon about how Bill C-27 should be divided into two parts. I would like to hear his comments on that. What impact will that have on the bill? Does he think that will jeopardize certain aspects of Bill C-27? What will be the consequences?

Digital Charter Implementation Act, 2022Government Orders

November 28th, 2022 / 5:05 p.m.


See context

Halifax Nova Scotia

Liberal

Andy Fillmore LiberalParliamentary Secretary to the Minister of Innovation

Mr. Speaker, I will be splitting my time with the member for Saanich—Gulf Islands.

I am very pleased to be here to discuss Bill C-27, the digital charter implementation act of 2022. The bill would implement a new world-class regime for the protection of consumers and to ensure that Canadians have confidence that businesses are handling their personal data responsibly and are developing and deploying new technologies in a responsible and ethical way.

The bill also includes important changes that would support responsible innovation in an increasingly digital and data-driven marketplace. It would modernize Canada's regulatory framework for privacy protection in the private sector in a manner that supports innovation and is interoperable with the data protection laws of Canada's major trading partners.

The bill would also reinforce Canada's commitment to responsible artificial intelligence development, or AI development. As parliamentary secretary to the Minister of Innovation, Science and Industry, and indeed as the MP for Halifax, with its burgeoning tech sector, I can tell members from first-hand experience that Canada is a world leader in AI, with top talent and innovative companies.

In a world that is increasingly reliant on digital technologies, the bill would build on Canada's advantage by creating a foundation of trust and ensuring that companies meet the highest standards of responsibility when developing and deploying AI. We need to ensure that Canadians’ personal information is protected, but there is also a need to support Canadian businesses so that they can grow, prosper and innovate in this increasingly digital world.

We recognize that technology is growing rapidly and providing companies with large amounts of personal information. This information fuels business decisions. It informs the creation of new products and services for customers. This innovation is critical, but we absolutely have to ensure that this innovation happens in a responsible way.

Therefore, in my limited time today, I am going to focus my comments on the first and third parts of the act, with a focus on enabling and supporting responsible innovation.

I will begin with the first part.

The proposed new Consumer Privacy Protection Act, or CPPA, retains the principles-based approach of our current private sector privacy law in order to continue harnessing the success of a flexible and adaptable privacy law.

We know circumstances are changing all the time. To better reflect advances in digital technologies, including the emergence of AI and other new technologies, the CPPA contains a number of provisions to support industry innovation without compromising the protections Canadians depend on.

First, the CPPA includes a new exception to consent to cover specified business activities, and it introduces the concept of legitimate interests into Canada’s privacy framework. These updates take into consideration what we heard from stakeholders on the previous proposal that came before Parliament in 2020, back when I was parliamentary secretary to the then minister of heritage and we were considering this.

The objective is to help reduce the administrative burden on businesses and on individuals in situations in which seeking consent is not meaningful, for example, the use of personal information for the shipping of goods that have been requested by the individual.

In these situations, the customer clearly anticipates receiving a shipment, and the company should be able to undertake this shipment without the law adding an extra burden to provide this service. Importantly, this exception may not be used in situations in which the organization intends to influence the individual’s behaviour or decisions.

Moreover, given the need to consider interests and potential impacts on individuals, the organization will be required to assess the potential impacts on individuals, implement measures to eliminate or mitigate such impacts, and comply with any prescribed requirements. The Privacy Commissioner may review such assessments on request.

All in all, the inclusion of a targeted legitimate interest exception aligns the CPPA with international best practices, including those of the EU.

Second, the CPPA defines and clarifies how businesses should handle de-identified personal information, in other words, personal information that has been modified to reduce the risk that an individual could be recognized or identified.

This framework takes into account the feedback we heard from the previous proposal. The bill also defines anonymized information and confirms that information that has no risk of identifying an individual falls outside the scope of the act.

The bill before us today would incentivize organizations to de-identify personal information before using it for research, development and analysis purposes, further protecting Canadians’ privacy.

We know businesses need to invest in R and D to improve their products, which benefits customers by providing them with new and innovative products and services. This provision would allow businesses the flexibility to use de-identified data for R and D, adding value for both customers and firms. However, the CPPA confirms that this information would still stay within the protection of the act and under the oversight of the Privacy Commissioner of Canada, as one would expect.

Recent years have also shown the critical role data plays in developing evidence-based policies and responding to public crises. Whether it is responding to public health needs, meeting the challenges now posed by climate change or even planning a city, data is needed to help us rise to these challenges, but it must be used responsibly and in keeping with our values.

That is why the CPPA introduces a framework that would allow for the use of data in ways that would benefit the public good. It would do this by allowing companies to disclose de-identified data to specified public entities, such as hospitals, universities and libraries. These disclosures would be permitted only where specific criteria are satisfied. That is, the personal information must not identify an individual, and there must be a socially beneficial purpose, such as one related to health, public infrastructure or environmental protection. This would ensure that the privacy of individuals is protected, while making sure we use everything at our disposal to respond to increasingly challenging global issues.

Third, the CPPA introduces a new framework for codes of practice and certification systems that would enable businesses to proactively demonstrate their compliance with the law. For example, companies that are engaged in a particular business activity could collaborate on the development of a code of practice that outlines how they comply with the specific provisions of the law. With the approval of that code by the Privacy Commissioner, organizations would have greater certainty that they are meeting their obligations.

Similarly, the bill provides a scheme for recognizing certification systems that demonstrate compliance with the law. Organizations that choose to participate in approved certification schemes would benefit from a reduced risk of enforcement actions under the act. This would be especially helpful for small- and medium-sized entities that do not necessarily have extensive legal resources at their fingertips. These new frameworks for recognized codes and certifications would make it easier for businesses to demonstrate their compliance with the law to customers, to business partners and to the Privacy Commissioner of Canada.

I would like to move now to the third part of the legislation, the proposed artificial intelligence and data act, or AIDA, which would support responsible innovation by giving businesses a clear framework to guide the design, development and deployment of artificial intelligence systems, or AI systems. AI systems have many benefits and operate across national and provincial boundaries.

As I mentioned, Canada has become a global leader in artificial intelligence through the pan-Canadian AI strategy. However, as the technology has matured, risks associated with AI systems have also come to light, including with respect to health, safety and bias. In order for Canadian innovators to maintain this status, common standards are needed for international and interprovincial trade in AI systems.

The bill would guide innovation by building confidence in the technology and protecting Canadians against the harms such systems can cause. Specifically, AIDA would ensure that entities responsible for high-impact AI systems identify and mitigate potential harms, including bias. By aligning with internationally recognized standards, this would ensure market access for Canadian innovations.

Lastly, an artificial intelligence and data commissioner would be created, with the dual role of supporting the minister in administering the act and helping businesses understand their responsibilities and how to comply. We believe the government is paving the way for Canada to be a world leader in innovation by providing Canadians with clear rules on how AI may be developed and used.

I believe it is imperative that the House move to pass this bill. The digital charter implementation act would not only protect the personal information of Canadians and lay the ground rules for the responsible design, development, deployment and operation of AI systems in Canada, but also enable the responsible innovation that will promote a strong Canadian economy. With this bill, the government is sending a clear message that responsible innovation is critical for Canada’s future economic success and competitiveness.