Digital Charter Implementation Act, 2022

An Act to enact the Consumer Privacy Protection Act, the Personal Information and Data Protection Tribunal Act and the Artificial Intelligence and Data Act and to make consequential and related amendments to other Acts

Sponsor

Status

In committee (House), as of April 24, 2023

Subscribe to a feed (what's a feed?) of speeches and votes in the House related to Bill C-27.

Summary

This is from the published bill. The Library of Parliament has also written a full legislative summary of the bill.

Part 1 enacts the Consumer Privacy Protection Act to govern the protection of personal information of individuals while taking into account the need of organizations to collect, use or disclose personal information in the course of commercial activities. In consequence, it repeals Part 1 of the Personal Information Protection and Electronic Documents Act and changes the short title of that Act to the Electronic Documents Act . It also makes consequential and related amendments to other Acts.
Part 2 enacts the Personal Information and Data Protection Tribunal Act , which establishes an administrative tribunal to hear appeals of certain decisions made by the Privacy Commissioner under the Consumer Privacy Protection Act and to impose penalties for the contravention of certain provisions of that Act. It also makes a related amendment to the Administrative Tribunals Support Service of Canada Act .
Part 3 enacts the Artificial Intelligence and Data Act to regulate international and interprovincial trade and commerce in artificial intelligence systems by requiring that certain persons adopt measures to mitigate risks of harm and biased output related to high-impact artificial intelligence systems. That Act provides for public reporting and authorizes the Minister to order the production of records related to artificial intelligence systems. That Act also establishes prohibitions related to the possession or use of illegally obtained personal information for the purpose of designing, developing, using or making available for use an artificial intelligence system and to the making available for use of an artificial intelligence system if its use causes serious harm to individuals.

Elsewhere

All sorts of information on this bill is available at LEGISinfo, an excellent resource from the Library of Parliament. You can also read the full text of the bill.

Votes

April 24, 2023 Passed 2nd reading of Bill C-27, An Act to enact the Consumer Privacy Protection Act, the Personal Information and Data Protection Tribunal Act and the Artificial Intelligence and Data Act and to make consequential and related amendments to other Acts
April 24, 2023 Passed 2nd reading of Bill C-27, An Act to enact the Consumer Privacy Protection Act, the Personal Information and Data Protection Tribunal Act and the Artificial Intelligence and Data Act and to make consequential and related amendments to other Acts

Digital Charter Implementation Act, 2022Government Orders

November 28th, 2022 / 5 p.m.


See context

Green

Elizabeth May Green Saanich—Gulf Islands, BC

Madam Speaker, I want to the thank the hon. member for Selkirk—Interlake—Eastman for a very thoughtful speech. As a member of Parliament grappling with Bill C-27, I have to say that I am grateful that his party assigned him to this area of work sometime in the past, because this is enormously complicated.

The bill is three acts in one, and I would ask the member what we should do at this point. The Speaker has now given a ruling that says we will be able to vote separately on the AI piece of the bill, but I do not think that is good enough. I do not know if the committee will be able to set aside witnesses and only look at the AI piece in a concentrated fashion.

I would support anything we could do as opposition members of Parliament to make sure the bill is not rushed and to make sure that the artificial intelligence pieces are treated as separately as possible so that we have a good amount of time for amendments and understanding while not rushing it through.

Digital Charter Implementation Act, 2022Government Orders

November 28th, 2022 / 4:55 p.m.


See context

Conservative

James Bezan Conservative Selkirk—Interlake—Eastman, MB

Madam Speaker, I do not believe that the bill lives up to the gold standard of European Union law. The European Parliament has been very good at having general data protection regulation. That is the gold standard. The bill does not provide the types of safeguards that protect the interests of Canadians.

We need an ongoing discussion on how the personal information of Canadians is protected. Bill C-27 does not provide all the guardrails required for the protection of individual Canadians. A task should be given to the industry committee or the ethics committee to dive deeper to make sure we have an opportunity to hear from more witnesses and to provide the amendments that are so desperately needed to the bill. I think it actually needs to go back to be redrafted.

Digital Charter Implementation Act, 2022Government Orders

November 28th, 2022 / 4:55 p.m.


See context

Bloc

Julie Vignola Bloc Beauport—Limoilou, QC

Madam Speaker, my colleague from Selkirk—Interlake—Eastman mentioned some things that are not covered by Bill C-27. The law they have in Europe right now requires businesses to have two ways to identify individuals, but the trend is moving toward having three.

Does my colleague think that Bill C-27 should also legislate on the number of methods of identification that businesses should be required to use? It does not do so right now, which is why we need to carefully study it in committee.

Digital Charter Implementation Act, 2022Government Orders

November 28th, 2022 / 4:35 p.m.


See context

Conservative

James Bezan Conservative Selkirk—Interlake—Eastman, MB

Madam Speaker, it is indeed a pleasure to rise to discuss Bill C-27, an act to enact the consumer privacy protection act, the personal information and data protection tribunal act and the artificial intelligence and data act. There is a lot happening in Bill C-27. I have a lot of concerns about this bill, and that is why I will be voting against Bill C-27. It would not do the things we need to do to protect the privacy of Canadians.

I would first flag, in looking at this legislation, that the first act it would create is the consumer privacy protection act. Why is it not the Canadians' privacy protection act? Why are we talking about consumers and giving more ability to corporations to collect the privacy data of Canadians? That, to me, is very disconcerting and one of the things I want to talk about during my presentation.

The Personal Information Protection and Electronic Documents Act, PIPEDA, was the very first piece of legislation we had back in 2000, so it has been 22 years since we have updated legislation related to the issue of the privacy protection of data that has been shared online. Of course, technology has evolved significantly over the last 20 years. If we look at PIPEDA, it all rolls back to 34 years ago when the Supreme Court of Canada said, “that privacy is...the heart of liberty in a modern state”.

It said “privacy is...the heart of liberty”, and that completely falls back on the Charter of Rights and Freedoms. Concerning fundamental freedoms, subsection 2(b) of the charter says, “freedom of thought, belief, opinion and expression, including freedom of the press and other media of communication” while subsection 2(d) refers to, “freedom of association.”

We know very well that people's privacy has to be protected on anything they do online, what they do through mobile apps, what they do in their email communications and the collection of that data by service providers because, ultimately, anything we do online goes through a service provider on the Internet, and we have to ensure that our charter freedoms are protected to ensure our liberty.

We already know that under freedom of association, a lot of people who gather in Facebook groups and other fora on the Internet have already been violated by the Emergencies Act. We know that during the “freedom convoy” in the city, the government was harvesting data and that data was then shared by some means. With GiveSendGo, the data was mined off of it, shared on Google Maps and distributed across the country. People's individual financial information, the ultimate piece of privacy that should be protected, went across this country and the government failed to intervene.

Bill C-27 falls short on what needs to happen to protect privacy, recognizing how people are using the Internet and modern technologies, especially with mobile apps and everything that is happening on our phones. However, the protection of individuals is worth it and the privacy rights are worthy of constitutional protection, which Bill C-27 fails to recognize. We do not have a definition of privacy rights or a guarantee of privacy rights in Bill C-27, and that is why it fails.

I am the shadow minister of national defence, but earlier this year I served for a number of months as the shadow minister of ethics and digital information. I can say that, during my time serving on the ethics committee, it dealt with a number of issues. One of them, of course, was the use of Clearview AI, the facial recognition software that the RCMP and other police agencies use across this country. The ethics committee dug in deep and provided a report.

The Liberals let the RCMP make use of this technology under their tenure and did not say anything until it became public. Clearview AI, an American company, was scraping images off of Facebook and other social media such as Instagram to populate its database.

That information was then used, using artificial intelligence, to profile and identify people using mass surveillance techniques. We found through testimony that, not only was this done illegally, and the Privacy Commissioner ruled that Clearview AI had broken the law and that the RCMP had used it illegally, but also it was racially discriminatory as well, and it was a huge problem that people of colour and women were unfairly treated by this AI.

Bill C-27 would not regulate the use of facial recognition technology such as Clearview AI. Right now, we know the RCMP disagrees with the ruling of the Privacy Commissioner, so the question is whether CSIS, the Department of National Defence or the Communications Security Establishment are making use of similar types of technology. I will get into some of the recommendations from that report if I have time later on, but we did call as a committee, and it was adopted by the majority of members on our committee, for a federal moratorium on the use of facial recognition technology. We called for new laws, guardrails and safeguards to be built into legislation through PIPEDA and through the Privacy Act.

Bill C-27 would not provide that protection to Canadians. It would not ban or install a moratorium on the use of FRT, so that is absent.

Also, we asked that all companies be prohibited from scraping the images of Canadians off the Internet, whether it be through Facebook, Instagram, TikTok or whatever the app might be. We know that this causes potential harm to Canadians, yet Bill C-27 fails again to recognize this harm. The Liberals failed to incorporate recommendations coming from a standing committee of the House into this legislation.

One of the other things we heard about was that Tim Hortons was caught mass tracking Canadians who were using their app. If anyone who had the Tim Hortons app went to a Tim Hortons location and bought a coffee and a donut, that app was then used to track the behaviours of consumers of Tim Hortons as they were travelling for the next 30 minutes.

Again, this shows how the sharing of personal information and the mass data violation with the tracking of individual Canadians violated their privacy rights. Although Tim Hortons assures us they are not doing it now, we are not sure what happened with that data. Was it shared or sold to other corporations? Again, Bill C-27 would give companies, under clause 55 of the bill, a litany of exceptions to consent to sharing that personal information they collected through the use of their app. That would violate our privacy rights.

Although the Liberals have built in here words about consent and the ability for individuals to write in with consent or get removed, when it comes to terms and conditions, most Canadians, when they download an app and check the box to say “yes”, they have not read those terms and conditions. They do not know that some of these apps, as Tim Hortons was doing, were actually undermining their own privacy rights as they apply to the use of mobility data information, and because those terms and conditions are long, legalistic and cumbersome, people refuse to actually take the time to read it. Just because someone checks the box to say “Yes, I consent to using this app”, does not give those companies the right to violate the privacy of those individuals' outside of the commercial transaction that takes place between them and, in this situation, Tim Hortons.

The exemptions that are allowed under the bill for corporations need to be changed in the bill. There is no we can support it as Conservatives because they would be huge violation of privacy and of mobility, which are all things that are provided under our charter rights.

Under the government, we also saw the Liberal Minister of Health stand up and defend the Public Health Agency of Canada, which was caught red-handed having companies such as TELUS track the movement of Canadians via their cellphones. It said that it de-identified all the data it collected, but it wanted to know how Canadians were moving around the country underneath the auspices of the COVID pandemic and how transmission was occurring. That was a violation of privacy.

At committee, we made a bunch of recommendations, which the government has failed to implement in Bill C-27. Bill C-27 gives companies, such as TELUS and other mobile service providers, the ability to track the movement of Canadians across this country. It may want to call it “meta data” or say it has been de-identified, but we also know from testimony at committee that it can re-identify the meta data that has been turned over to the government. We have to make sure that it is done in the public interest and under the auspices of national security, public health and national defence. If that type of data is being collected, then there has to be a way to dump that data and ensure it disappears forever.

One of the other studies we undertook was of the Pegasus software system, which is very insidious. It is being used for national security. A similar type of technology is being used right now by the RCMP, CSIS and others. It has the ability to turn people's cellphones into video cameras and listening devices. It is a very cryptic, insidious spyware, or malware, that people can get on their phones by accidentally clicking on a piece of information, like opening up an email, and it will download. Then they can listen to the individuals in that place.

They do not have to bug people's houses anymore. They do not have to use high-grade technology to listen to the interests of individuals because it gives them the ability to turn cameras on to watch what they are doing, and turn microphones on to hear what they are discussing without them ever knowing it.

We want to make sure charter rights are protected. There are times we have to use this in the collection of data. There was definitely the admission by members of the RCMP that they have used it over a dozen times. They have their own system, not Pegasus, but one similar to it. We know that to use that type of technology, to protect the rights of Canadians, there should be a warrant issued to ensure there is judicial oversight, even if it is being used by the Department of National Defence and CSE, we have to make sure it is not being used against Canadians and only deals with those national threats they refer to as threats that are foreign entities. That is something that Bill C-27 fails to recognize.

I should say this as well. We heard at committee that this type of technology is being used against politicians, that there is foreign interference out there. As we have come to learn on different occasions, there are countries out there and other agencies that are interested in what we are saying as politicians, not just here in the House, but the private conversations we have in caucus, among colleagues, when we get together at committees, at pre–committee meetings, and the discussions we have in our offices. Our phones have become listening devices, so we have to be aware of that.

One of the things we have always talked about is what the gold level standard is to protect individuals, the citizens of our country, and to ensure their privacy rights are paramount in all the discussions we have. At the same time, we know there are going to be advances in technology, and the need at times to have police agencies, the Department of National Defence and the military use technology that could violate the rights of some people, but always with that judicial oversight that is provided underneath the charter. That gold standard is the European Union’s General Data Protection Regulation. We see that the gold standard goes well above and beyond what Bill C-27 is trying to do.

Bill C-27 falls way short. We heard at committee that with the data collection taking place on apps, online surveillance measures have to provide the right for data to be forgotten, or the right to data disposal or erasure, another terminology that is used. It is about making sure that data collected, even if it is for the public good or even if it is metadata, is disposed of at the end of the day.

It should not be that I consent to have my data removed from a database by checking something off or having to write in an app being used to buy coffee at the neighbourhood store, for example. It should be that it is our right to be forgotten and that after a certain time frame, data is erased forever from the database where it is being held and is not used again for commercial purposes, nor used, sold or traded among commercial entities.

The gold standard that the European Union has is not included in Bill C-27. Again, that is why we have so many concerns.

When we look at clause 55, which has already been mentioned by a number of my colleagues, it has a boatload of exemptions built in for corporations to get around the removal of privacy data. These exemptions allow them to write in, make changes and share data. We have to make sure the onus is not on Canadians to get their privacy information back or to get their privacy information removed. The onus should be on corporations to prove why they need it. The onus also has to be on the government. This is about transparency and accountability. There needs to be a realization that Canadians deserve an explanation as to why some of their data may be used, even if it is de-identified, and why it would be used for the buildup of public policy or to deal with issues like a pandemic.

Just to move forward a bit, I note that given some of things we saw at committee when we were looking at facial recognition technology, the power of artificial intelligence and the growing power of AI, we made a number of recommendations. They included that whenever the government looks at using artificial intelligence or FRT for military, defence or public safety, it needs to be referred to the National Security and Intelligence Committee of Parliamentarians for study, review and recommendation, and it needs to be reported publicly. There also needs to be a public artificial intelligence registry for the algorithmic tools being used. However, we do not see that registry for artificial intelligence companies in Bill C-27.

I have already talked about the right to be forgotten and said there needs to be a set period of time. I have talked about the prohibition on the practice of capturing images of Canadians from public platforms such as Facebook, Instagram and Twitter. We also need to make sure there is a federal moratorium on using FRT until we have proven it is needed by police agencies, the justice system has proven that it works and we are sure it is not racializing Canadians in its use. Ultimately, the Privacy Commissioner and judicial authorization have to override that.

As Daniel Therrien, the Privacy Commissioner, said about the RCMP:

[It] did not take measures to verify the legality of Clearview’s collection of personal information, and lacked any system to ensure that new technologies were deployed lawfully. Ultimately, we determined the RCMP’s use of Clearview to be unlawful, since it relied on the illegal collection and use of facial images by its business partner.

Its business partner was Clearview AI.

There is an ongoing need to ensure that charter rights and international human rights are brought together in a collaborative way in how we all form our opinions on Bill C-27. I hope the bill is taken back and redrafted, and if not, I hope there is an opportunity to make massive amendments to it so that it actually takes into consideration the privacy rights of all Canadians.

Digital Charter Implementation Act, 2022Government Orders

November 28th, 2022 / 4:30 p.m.


See context

Bloc

Julie Vignola Bloc Beauport—Limoilou, QC

Madam Speaker, I would like to take 15 seconds to congratulate my colleague on delivering half his speech in French. He has improved by leaps and bounds in less than a year.

Now, the moment we have all been waiting for, my question. Quebec has a law that protects its citizens' privacy, law 25. We talked about it earlier. In the early 2000s, PIPEDA's paragraph 26(2)(b) stated that the Governor in Council would, by order, respect Quebec's legislation. Essentially, the federal act would not apply with respect to personal information about individuals' property or their civil rights. In other words, the act would leave matters under Quebec's jurisdiction alone. Even though Quebec's law 25 already complies with EU expectations, Bill C‑27 contains no clause guaranteeing that the federal government will respect the application of Quebec's law.

My question is simple. Will my colleague work to ensure that the federal government respects Quebec's law 25 and that there will be an order to that effect?

Digital Charter Implementation Act, 2022Government Orders

November 28th, 2022 / 4:15 p.m.


See context

Bloc

Andréanne Larouche Bloc Shefford, QC

Madam Speaker, I thank my hon. colleague for his speech.

I would like to come back to the topic of adopting this motion and particularly the importance of sending Bill C-27 to committee, to make sure all the details are in place. It is important that the committee do its work properly. This is very technical.

Quebec has Bill 25. How can we ensure that there is no interference between Bill 25 and Bill C-27? How can we combine the work of both levels of government? This is a shared jurisdiction. Could my colleague comment on that?

Digital Charter Implementation Act, 2022Government Orders

November 28th, 2022 / 4:05 p.m.


See context

Liberal

Majid Jowhari Liberal Richmond Hill, ON

Madam Speaker, I will be splitting my time with my colleague, the member for Vaughan—Woodbridge.

I am pleased to rise today in support of Bill C-27, the digital charter implementation act.

Privacy is a long-standing, fundamental right for Canadians, and we have never been more reliant on the digital economy. Even though we are living in this complex technological era, the current privacy law was last updated over 20 years ago, before smart phones or any social media platforms even existed. This brings us to the cardinal step our government is taking today.

We know Canadians need to have confidence not only that their is data safe and their privacy fully respected, but also that their government is striving to enhance the protection of their privacy through the implementation of timely safeguards in an era when the digital economy is driving transformative change. These objectives are exactly what the privacy protection framework of Bill C-27 would aim to accomplish.

We are introducing new legislation to ensure our country has critical protection in place to safeguard the security of Canadians. This legislation proposes not only to increase the confidence of Canadians in emerging technologies but also to strengthen privacy protection for consumers while supporting economic development that results from the responsible use of data and artificial intelligence. It would also pave the path for governing trade and commerce in the private sector, as it relates to regulating how private organizations handle personal information and develop AI systems.

Upon enactment into law, Bill C-27 would be one of the most substantial improvements to Canadian privacy laws in decades, but it would go further by establishing a legal framework to regulate high-impact AI systems to better protect consumers. In essence, this legislation proposes the following key enactments: the consumer privacy protection act; the personal information and data protection tribunal act; and finally, the artificial intelligence and data act, or AIDA. I will expand on each one of these major enactments in detail.

The enactment of the consumer privacy protection act proposes to achieve the following: first, to enhance Canadians' control over personal information by empowering them to request its deletion, and adding new transparency requirements for organizations when obtaining consent from individuals for their information; second, to create new data mobility rights that promote consumer choice and innovation; and third, to bolster our privacy enforcement and oversight by granting the Privacy Commissioner of Canada order-making powers to compel organizations to stop the use of personal information, through administrative monitoring penalties for serious breaches of law.

This aspect of the bill is of the utmost importance to nearly 200 of my constituents in the riding of Richmond Hill who have voiced their pertinent concerns regarding privacy protection and have spoken to me personally in relation to this legislation and what it seeks to achieve for Canadians. Through the mentioned key facets, my constituents, and in fact all Canadians, can rest assured that their government's sole intention is to ensure Canadians' first-class privacy and data protection.

By enacting the personal information and data protection tribunal act, our government seeks to strengthen protection for minors' personal information, introduce greater flexibility for the Privacy Commissioner and explicitly foster more privacy expertise among key decision-makers. This would be achieved through the establishment of a new administrative tribunal to hear appeals of certain decisions made by the Privacy Commissioner.

The third and most crucial aspect of this legislation, in my point of view, would establish a new law on artificial intelligence.

According to a recent study by Nanos Research on behalf of Innovation, Science and Economic Development Canada, key industry stakeholders have expressed a range of concerns regarding artificial intelligence. As technologies have matured, risks associated with AI systems have also come to light, including with respect to health, safety and bias. These concerns speak to the need to ensure the responsible development of AI. Moreover, as companies invest in increasingly complex AI systems, Canadians need to have confidence in AI systems they use every day.

It is therefore essential that the use and collection of data follow best practices to protect the rights and freedoms of Canadians. This brings me to the very reason why I personally identify this enactment as the most crucial aspect of this legislation.

It is in response to these legitimate concerns that our government proposes to introduce a new law to promote a unique approach to AI. It is an approach that would protect Canadians from discrimination, loss of autonomy and serious harm to their health, safety and economic well-being. The newly proposed AI law contains central provisions that would protect commercially sensitive information while ensuring that AI systems do not cause adverse effects on Canadians. Consequently, this approach would establish rules aimed at promoting good data-governance practices and respect for Canadian standards and values.

This new law would support responsible innovation by giving companies a clear framework for developing AI systems; compel organizations responsible for AI systems to mitigate potential harm to Canadians, including bias; establish an AI and data commissioner to support the Minister of Innovation, Science and Industry in the administration of the act to encourage innovation in the marketplace; and, finally, impose serious penalties for all use of illegally obtained personal information.

It is also notable to mention that it would serve as a build-up on our government's previous investments and commitment to expanding the pan-Canadian AI strategy first launched in 2017 to enhance growth in Canada's digital economy.

Each of these acts would work to provide Canadians with more autonomy over their privacy and increase accountability of personal information handled by organizations, while also giving Canadians the freedom to move their information from one organization to another in a secure manner.

In quick summary, by introducing this groundbreaking piece of legislation, our government is working to strengthen and modernize our privacy laws and to protect Canadian consumers by limiting private companies' abilities to access private information in the digital sector. Most importantly, we would be creating new rules for the responsible development of Al alongside the continuation of the advancement of its implementation across Canada.

The digital charter implementation act would ensure Canadians have strong privacy protections and clear rules of the road for businesses, as well as guardrails to govern the responsible use of artificial intelligence. As I stand here today in support of this important piece of legislation, I am confident that, given our country's highly skilled workforce, with this vital step, Canada would be well positioned not only to play an important global role in the field of AI, but also to create an environment where Canadian companies could be world leaders in responsible innovations.

Most importantly, through this cardinal legislation, Canadians would be reassured that we would never compromise on trust and safety for their privacy, and that their government is wholeheartedly committed to advancing Canadian privacy protection laws while unlocking innovation that promotes a strong economy that works for everyone.

I would like to close this intervention by encouraging all my colleagues in the House to support this valuable piece of legislation. We can work together to move beyond traditional privacy protection to ensure data control for all Canadians and modernize our laws to adapt to the realities of a complex digital economy. This is the only way to advance Canadian digital technology and Canadian values across the world.

Digital Charter Implementation Act, 2022Government Orders

November 28th, 2022 / 4 p.m.


See context

Conservative

Dan Albas Conservative Central Okanagan—Similkameen—Nicola, BC

Madam Speaker, I too share concerns with Bill C-27, particularly around the artificial intelligence and data act. Specifically, I agree with her. Having one minister solely delegated the responsibility for a wide variety of different regulations that might affect private as well as public data is too much. As Parliament, we should be looking into this and setting out the parameters.

The government has basically told the private sector that it can hold it accountable for serious harm, something it does not even define in the law, in Bill C-27, while at the same time giving itself the ultimate loophole. It says it can exempt itself. Not only that, but some of the organizations are trustworthy, as it says in the bill. The minister can say that any provincial or federal commission or body he or she wants can be exempted, allowed to use artificial intelligence and held to a different standard than the private sector is.

Does the member agree that this particular section, more than anything, needs to be looked at? I believe it is too much government overreach. It has essentially given itself the ultimate loophole.

Digital Charter Implementation Act, 2022Government Orders

November 28th, 2022 / 3:35 p.m.


See context

Conservative

Cathay Wagantall Conservative Yorkton—Melville, SK

Mr. Speaker, as many of my colleagues already indicated, this is a large and complex bill, and we believe that its individual components are too important for them to be considered as one part of an omnibus bill. I am pleased with the ruling of the Speaker.

There are three separate pieces of legislation to this bill. In part 1, the consumer privacy protection act would repeal and replace decades-old measures concerning personal information protection. In part 2, the personal information and data protection tribunal act would strike a tribunal to administer penalties for violations of the CPPA. In part 3, the artificial intelligence and data act is brand new to the bill and sets up a framework for design and use of AI in Canada, which is almost entirely unregulated.

Long before the widespread use of the Internet, our Supreme Court was clear that privacy is at the heart of liberty in a modern state. The government should be taking every opportunity possible to enshrine privacy in our laws as essential to the exercise of our rights and freedoms in Canada. As Daniel Therrien stated in the Toronto Star earlier this month, “democracies must adopt robust solutions anchored in values, not laws that pretend to protect citizens but preserve the conditions that created the digital Wild West.”

The value of privacy should anchor the bill. Instead, the bill fails right out of the gate. The preamble states:

the protection of the privacy interests of individuals with respect to their personal information is essential to individual autonomy and dignity and to the full enjoyment of fundamental rights and freedoms in Canada

Placing this value in the preamble of the bill where it has no teeth raises distrust rather than confidence that the government truly respects Canadians' privacy rights. The CPPA would require organizations, companies or government departments affected by the bill to develop their own codes of practice for the protection of personal information. While these codes must be approved and certified by the Privacy Commissioner, one can only imagine the variation of protection that would result. This requirement would add significant red tape and would be yet another onerous task borne on the backs of small and medium-sized businesses, which employ most Canadians. It would also create more work for the Privacy Commissioner in parsing through complicated codes created by larger, wealthier, powerful corporations, companies or government departments that have legal teams whose sole purpose is to find creative ways to perhaps game the system.

Although it would take more time and investment up front, the better option, in my mind, would be to create a standard code of practice that all entities have to follow. This could certainly be taken on as one of the first responsibilities of the expanded Office of the Privacy Commissioner in defining the universal code of practices, where confidence in the process would be greatest and where the greatest level of concern for individual privacy actually exists.

This bill states that personal information can be transferred without Canadians' consent for purposes ranging from research to analysis to business purposes, but it must be de-identified before this can take place. At first glance, this is a positive measure until it is compared with anonymization as an alternative. According to the bill, de-identify means “to modify personal information so that an individual cannot be directly identified from it, though a risk of the individual being identified remains.” That leaves much to be desired when compared to the anonymization of personal information. In the bill, anonymize means “to irreversibly and permanently modify personal information, in accordance with generally accepted best practices, to ensure that no individual can be identified from the information, whether directly or indirectly, by any means.”

Any attempt to identify individuals from de-identified information is prohibited, except in approved circumstances. While many of these approved circumstances relate to the ability of an entity to test the effectiveness of its de-identification system, the potential for abuse still exists. This bill would be improved by eliminating those chances for abuse. We should examine replacing de-identification with anonymization wherever possible.

In comparing Bill C-27 to the EU regulations, we see there are several ways in which the CPPA does not live up to what is widely considered to be the international gold standard of privacy protection, which is the European Union's 2016 General Data Protection Regulation, or GDPR. There is a glaring example of Bill C-27's inferior protections: The GDPR processes personal data in such a manner that it can no longer be attributed to a specific individual without the use of additional information kept separately, subject to technical and organizational measures. This is a security and privacy-by-design measure of the GDPR.

Regarding what Bill C-27 considers to be sensitive information, there is nothing to indicate what sensitive information actually entails. It is also limited in its application. Only the personal information of minors is considered to be sensitive. All information Canadians surrender to any entity should be considered sensitive. On the other hand, the GDPR possesses a particular regime for special categories of personal data, including racial or ethnic origin, political opinions, religious or philosophical beliefs, trade union membership, genetic data, biometric data and data concerning health, sex life and sexual orientation.

We are happy to see that consent is better defined in Bill C-27. However, exceptions for activities not requiring consent would remain in place. Some of them are so broad that an entity could interpret them as never requiring consent. These are loopholes that Canadians should not have to endure when they are required to check the box that they have read and accept terms before they are able to interact with a digital site.

For example, legitimate interests in a given situation may be used by companies to disregard consent. There is a danger that these interests will outweigh potential adverse effects on the individual. Attempting to define legitimate interests allows for too much interpretation, and interpretation is not something that lends itself to privacy laws. The use of personal information could also be exempt from consent if a reasonable person would expect the use of their information for business activities. There is no definition as to what a reasonable person is.

The bottom line is that there are far too many loopholes and vague terms. For the savvy, wealthy or well-lawyered, the potential for abuse exists. The GDPR, conversely, is unequivocal on consent. It must be freely given, specific, informed, unambiguous and in an intelligible and accessible form, and is only valid for specific purposes. Canada should have followed that example. Canadians cannot help but wonder why Bill C-27 does not.

Under the proposed CPPA, there is no minimum age for minor consent, nor is “minor” defined. In the EU, the GDPR sets out a minimum age for a minor's consent at 16 years of age. Member states also have the flexibility to allow for a lower age, provided the age is not below 13 years.

If a breach of personal information does take place, Bill C-27 would make Canada slower to respond than its international counterparts. This bill mandates that a notification be made to the Privacy Commissioner of any breach that creates a real risk of significant harm as soon as it is feasible. The individual affected would also need to be informed, but, again, as soon as feasible.

The GDPR sets out that a mandatory notification must be made to the supervisory authority without undue delay, or 72 hours after having become aware of the incident in certain circumstances. Prior to the introduction of this bill, Canada was lagging behind internationally, and it still is, even after. The GDPR is already six years old. That is six years of extra time during which the Liberals have failed to develop this legislation to meet the robust international standard.

In Bill C-27, the Privacy Commissioner would be empowered to investigate any certified organization for contravening the act. The commissioner has been rightly asking for increased powers and responsibilities for some time, and this goes beyond a mere recommendation to violators to stop their actions. The commissioner would be able to recommend greater penalties of no more than $20 million or 4% gross global revenue for a summary offence, and no more than $25 million or 5% gross global revenue for an indictable offence.

These penalties should add more bite to what the Privacy Commissioner can do and impact how Canadians’ personal information will ultimately be treated. The penalties would also apply to a greater number of provisions, such as actions that contravene the establishment and implementation of a privacy management program and failure to ensure equivalent protection for personal information transferred to a service provider.

However, these new powers for the Privacy Commissioner hit a dead end when taken in context with the second part of this bill, which establishes a tribunal. The personal information and data protection tribunal would consist of no more than six members, and only half of those members must have experience in information and privacy law. The Privacy Commissioner would have order-making authority and the ability to make recommendations to this tribunal regarding penalties. However, the tribunal would have the power to apply its own decision instead, which would be final and binding. Except for judicial review under the Federal Courts Act, the tribunal's decisions would not be subject to appeal or to review by any court. These are powers equivalent to a superior court of record.

The existence of this tribunal would dull the new teeth given to the Privacy Commissioner. While the commissioner could recommend that a penalty be levied for violations of the CPPA, it is the tribunal that would have the power to set the amount owed by these organizations.

The cost associated with striking this tribunal is also a concern. Despite the fact that its work would likely be limited to a handful of times per year to determine penalties, it would apparently require a full-time and permanent staff of 20. I am deeply concerned as the government also has a bad habit of striking advisory councils, or so-called arm's-length regulatory bodies, in advance of bills being debated and passed in the House, long before the ink on the legislation is dry.

My memory is drawn to when a bill was being debated in the House, and I inquired about the details of the proposed environmental council. I was told with great zeal that it had already been established, and the members had been appointed before the bill was even debated in the House.

Can the current Prime Minister tell us if this tribunal would be struck only after Parliament has dealt fully with this bill? Will the Liberals be transparent with Canadians on how the appointment process would be undertaken? Can they assure Canadians that a full-time and permanent staff of 20 has not already been determined? After seven years of Liberal power, the level of patronage in this place run deep.

Part 2, which is the personal information and data protection tribunal act, should be removed as it is a bureaucratic middleman with power that would conflict and create redundancy with the Privacy Commissioner's new powers. The new powers would mean little if they were not coupled with quick and effective consequences for violators. It would prolong decisions on fines and harm Canada's reputation of holding violators accountable.

It would also not align with our friends in the EU, U.K., New Zealand and Australia that do not use a tribunal system for issuing fines. It goes to show Canadians that when it comes to making big government needlessly bigger, the Liberals do it well.

The third and final part of this bill is the only entirely new component. The artificial intelligence and data act seeks to regulate an entity, artificial intelligence, that has not been regulated before in this country.

It would set standards for the creation and use of AI systems in Canada by both domestic and international entities. More specifically, international and interprovincial trade and commerce in artificial intelligence systems would be regulated through common requirements for the design and use of those systems.

It would prohibit certain conduct pertaining to AI systems that could lead to harmful results for individuals and their personal data. There is that mention of personal data. This is a massive undertaking, attempting to regulate something that, up to this point, has been almost entirely unregulated.

I also understand that consultations on this were only initiated in June. Logic would dictate that such a bill requires careful scrutiny and time to get it right.

Requiring record keeping and human oversight are positive developments. What we find difficulty with is getting a clear picture of what the final framework would look like, as the minister alone would be empowered to establish these regulations. The minister would be able to act independently of Parliament in making rulings and imposing fines. In an age of uncertainty and new horizons for our relationship with AI, this is unacceptable. Parliament, at the very least, and independent experts and watchdogs should be central to the creation and enforcement of these rules.

It appears that once again the government has chosen to simply tack on a crucial area of concern to Canadians to an already complicated bill, and it wishes to again entrust sweeping powers to a minister to act independently of parliamentary oversight.

My final thoughts today on Bill C-27 are as follows. The Conservatives are considering this bill through a reasoned approach, and appreciate that stakeholders who have been calling for this legislation for years are watching today's debate closely.

It is absolutely clear that modern-day protection for the personal information of Canadians is required. They must have the ability to access and control its collection, use, monitoring and disclosure, and the right to delete it or the right to vanish.

How can we ensure that data is protected through watertight regulations and strict fines for abuse while also realizing that not every business affected by this bill would have the resources of Walmart or Amazon? Small and medium-sized businesses should be shielded from onerous regulation that stifles their growth. This is not to say that business interests should weigh equally with personal privacy, but there is a balance to be had, and I believe the Liberals do not have it right here.

Furthermore, in a cynical attempt to move their legislative agenda forward, the Liberals have bundled changes to privacy laws with a first-of-its-kind framework for artificial intelligence that once again intends to govern through top-down regulation and not through legislation.

The Liberals should commit today to splitting this bill up to allow Canadians a clear view of its intended impact. With that commitment, the Conservatives will be looking to do the hard work at committee to improve the long-awaited but flawed elements of this legislation. Even in an age of convenience, the world in which we live grows even more complicated by the day. Canadians deserve privacy protection worthy of 2022 realities and beyond.

Division of Bill C‑27 for the Purpose of Voting—Speaker's RulingPoints of OrderRoutine Proceedings

November 28th, 2022 / 3:30 p.m.


See context

The Speaker Anthony Rota

I am now prepared to rule on the point of order raised on November 22, 2022, by the member for New Westminster—Burnaby concerning the application of Standing Order 69.1 to Bill C-27, an act to enact the consumer privacy protection act, the personal information and data protection tribunal act and the artificial intelligence and data act and to make consequential and related amendments to other acts.

The member for New Westminster—Burnaby stated that there is a clear link between the first two parts of Bill C‑27, which respectively enact the consumer privacy protection act and the personal information and data protection tribunal act. He further noted that these elements were both part of the previous Bill C-11, which was introduced in the House during the 43rd Parliament.

However, the member argued that part 3, which enacts the artificial intelligence and data act, should be considered separately, because it does not directly concern privacy protection or the analysis, circulation and exchange of personal information. Accordingly, he asked the Chair to divide Bill C‑27 for the purposes of voting, as Standing Order 69.1 permits.

The official opposition House leader concurred. He added that, outside of clause 39 of the bill, which mentions the new consumer privacy protection act in the definition of the term “personal information”, part 3 of Bill C-27 does not refer to parts 1 or 2. Furthermore, the member for South Shore—St. Margarets stated that parts 1 and 2 of Bill C-27 deal with privacy protection, which has nothing to do with the subject of part 3, the regulation of the new industry of artificial intelligence.

On November 23, the parliamentary secretary to the government House Leader pointed out that privacy protection is the common theme that links every part of Bill C-27. In his view, the bill’s three parts constitute a framework for protecting the privacy of Canadians from the risks posed by artificial intelligence systems. He argued that dividing the bill would prevent members from considering all the risks and impacts that new artificial intelligence technologies may create for the security of personal information. He also noted that privacy laws do not adequately protect the public from new artificial intelligence systems and that, as a result, Bill C-27 should be considered as a whole.

Standing Order 69.1 gives the Chair the authority to divide the questions, for the purposes of voting, on the motions for second or third reading of a bill. The objective here is not to divide the bill for consideration purposes, but to enable the House to decide questions that are not closely related separately.

The Chair has carefully reviewed the provisions of Bill C‑27 and taken into account members' statements on the issue of dividing it for voting purposes. The Chair agrees that the bill's three parts are connected by a broad theme, namely, the use and protection of personal information. While parts 1 and 2 of the bill are closely related, this is not true of part 3.

The Chair is of the view that, given the lack of cross-references between part 3 and the preceding parts of the bill, with the sole exception being one reference to the new consumer privacy protection act—which serves to propose a common definition of the term “personal information”—dividing the bill for voting at second reading is justified.

In his intervention, the parliamentary secretary to the government House leader emphasized the common theme that links the three acts enacted by Bill C-27. In a decision on a similar matter, delivered on March 1, 2018, which can be found at pages 17550 to 17552 of the Debates, Speaker Regan said the following, at page 17551:

…the question the Chair must ask itself is whether the purpose of the standing order was to deal only with matters that were obviously unrelated or whether it was to provide members with the opportunity to pronounce themselves on specific initiatives when a bill contains a variety of different measures.

In the absence of a clear link between the three parts of Bill C-27, other than the theme of privacy protection, the Chair is willing to divide the question. Accordingly, two votes will take place at the second reading stage for Bill C-27. The first will be on parts 1 and 2, including the schedule to clause 2. The second will deal with part 3 of the bill. The Chair will remind members of this division before the voting begins.

If any part of this bill is negatived, the Chair will order the bill reprinted for reconsideration at committee.

I thank the hon. members for their attention.

Digital Charter Implementation Act, 2022Government Orders

November 28th, 2022 / 1:55 p.m.


See context

Conservative

Cathay Wagantall Conservative Yorkton—Melville, SK

Madam Speaker, for the average citizen in the digital age, we have entered uncertain times. To almost everyone, at face value, the convenience of our time is remarkable. Access to any piece of information is available at our fingertips. Any item imaginable can seamlessly be ordered and delivered to our doors. Many government services can be processed online instead of in person. Canadians have taken these conveniences for granted for many years now.

The pandemic accelerated our ascent, or descent, depending on who you ask, into the digital age. The inability to leave our homes and the necessity to maintain some rhythm of everyday life played a significant part in that, but around the world, we saw governments taking advantage of the plight of their citizens. Public health was used as a catalyst for implementing methods of tracking and control, and social media platforms, which have been putting a friendly face on exploiting our likes, dislikes and movements for years, continue to develop and implement that technology with little input or say from their millions of users.

Canadians no longer can be sure that their personal information will not be outed, or doxed, to the public if doing so would achieve some certain political objective. We saw that unfold earlier this year with the users of the GiveSendGo platform.

The long-term ramifications of our relationship with the digital economy is something Canadians are beginning to understand. They are now alert to the fact that organizations, companies and government departments operating in Canada today do not face notable consequences for breaking our privacy laws. As lawmakers, it is our responsibility to ensure that Canadians’ privacy is protected and that this protection continues to evolve as threats to our information and anonymity as consumers unrelentingly expands both within and beyond our borders.

That brings me to the bill we are discussing today, Bill C-27. It is another attempt to introduce a digital charter after the previous iteration of the bill, Bill C-11, died on the Order Paper in the last Parliament. My colleagues and I believe that striking the right balance is at the core of the debate on this bill. On the one hand, it seeks to update privacy laws and regulations that have not been modernized since the year 2000 and implemented in 2005. It would be hard to describe the scale of expansion in the digital world over the last 22-year period in a mere 20-minute speech. It is therefore appropriate that a bill in any form, particularly one as long-awaited as Bill C-27, is considered by Parliament to fill the privacy gaps we see in Canada’s modern-day digital economy.

Parliament must also balance the need for modernization of privacy protection with the imperative that our small and medium-sized businesses remain competitive. Many of these businesses sustain themselves through the hard work of two or three employees, or perhaps even just a sole proprietor. We must be sensitive to their concerns, as Canada improves its image as a friendly destination for technology, data and innovation. This is especially true as our economic growth continues to recover from the damaging impact of pandemic lockdowns, crippling taxes that continue to rise and ever-increasing red tape.

That extra layer of red tape may very well be the catalyst for many small businesses to close their operations. No one in the House would like to see a further consolidation of Canadians’ purchasing power in big players such as Amazon and Walmart, which have the infrastructure already in place for these new privacy requirements.

In a digital age, Canadians expect businesses to operate online and invest a certain amount of trust in the receiving end of a transaction to protect their personal information. They expect that it will be used only in ways that are necessary for a transaction to be completed, and nothing more.

In exchange for convenience and expediency, consumers have been willing to compromise their anonymity to a degree, but they expect their government and businesses to match this free flow of information with appropriate safeguards. This is why Bill C-27, and every other bill similar to it, must be carefully scrutinized.

As many of my colleagues have already indicated, this is a large and complex bill, and we believe that its individual components are too important for them to be considered as one part of an omnibus bill.

There are three—

Digital Charter Implementation Act, 2022Government Orders

November 28th, 2022 / 1:25 p.m.


See context

Kingston and the Islands Ontario

Liberal

Mark Gerretsen LiberalParliamentary Secretary to the Leader of the Government in the House of Commons (Senate)

Madam Speaker, it is an honour today to rise to speak to Bill C-27, the digital charter implementation act.

I think it is important to reflect on how long it has been since we last had an update to legislation regarding the privacy laws that exist around data. The last time was over 20 years ago. Twenty years might not seem like a long time, but when we think about it, 20 years ago Facebook was probably just a program Mark Zuckerberg was working on in his dorm room.

If we think of iPhones, they were pretty much non-existent 20 years ago. Smart phones were out, but they certainly did not have anywhere near the capabilities they do today. So many other technologies we have come to rely on now have been getting smarter over the years. They are acting in different manners and are able to do the work they do because of the data being collected from individual users.

Another great example would be Google. Twenty years ago it was nothing more than literally a search engine. One had to type into the Google form what one was looking for. Sometimes one had to put weird characters or a plus symbol between words in the search terms. It literally was just a table of contents accessing information for people. However, now it is so much more than that. How many of us have, at some point, said to somebody that we would love to get a new air fryer, and then suddenly, the next day or later that day, we see in Google, on Facebook, or whatever it might be, advertisements for air fryers that keep popping up. I am sure that sometimes it is a coincidence, but I know in my experience it seems it happens way too often to be a coincidence.

These are the results of new technologies that are coming along, and in particular AI, that are able to work algorithms and build new ones based on the information being fed into the system. Of course the more information that gets fed in, the smarter the technologies get and the more they are looking to feed off new data that can give them even further precision with respect to advertising and targeting tools at people.

This is not just about selling advertising. AI can also lead to incredible advancements in technology that we otherwise would not have been able to get to, such as advancements in health and the automotive industry. If we think of our vehicles, the big thing now in new cars is the lane-assist feature, which uses technology such as lidar to read signals in the road.

There is technology that, when we enter our passwords to confirm we are human beings, sometimes requires us to pick different things from pictures. When we do that, we are feeding information back into helping those images be properly placed. We are not just confirming that we are human beings; there is an incredible amount of data being used to give better evaluations to various different formulas and equations based on the things we do.

When we think of things like intelligent and autonomous vehicles, which basically drive themselves, 20 years ago would we ever have thought a car could actually drive itself? We are pretty much halfway there. We are at a point where vehicles are able to see and identify roads and know where they need to be, what the hazards are, and what the possible threats are that exist with respect to that drive.

What is more important is that, when I get into my vehicle, drive it around and engage with other vehicles, it is analyzing all of this data and sending that information back to help develop that AI system for intelligent vehicles to make it even better and more predictive. It is not just the data that goes into the AI, but also the data that it can generate and then further feed to the algorithms to make it even better.

It is very obvious that things have changed quite a bit in 20 years. We are nowhere near where we were 20 years ago. We are so much further ahead, but we have to be conscious of what is happening to that data we are submitting. Sometimes, as I mentioned in a previous question, it can be data that is submitted anonymously for the purposes of being used to help algorithms around lidar and self-driving vehicles, for example. At other times it can be data that can be used for commercial, marketing and advertising purposes.

I think of my children. My six-year-old, who is in grade one, is developing his reading quite quickly. Two years ago, even at the age of four, when he would be playing a video game and would not be able to figure out how to get past a certain level, he would walk up to my wife's iPad and basically say, “Hey, Siri, how do I do this?”

Just saying that, I probably set off a bunch of phones to listen to what I am saying, but the point is that we have children who, already at such a young age, are using this technology. I did not grow up being able to say, “Hey, Siri, how do I do this or that?”

What we have to be really concerned about is the development of children and the development of minors, what they are doing and how that can impact them and their privacy. I am very relieved to see there is a big component of this that, in my opinion, aims to ensure the privacy of minors is maintained, even though I have heard the concern or the criticism from some members today that the definition of “minor” needs to be better reflected in the legislation.

I feel as though if it is not known what a minor is, in terms of how it relates to this legislation, then I believe this is something that can be worked out in committee. It is something to which the governing members would be more than welcome, in terms of listening to the discussion around that and why or why not further clarifying the definition is important.

I would like to just back up a second and talk more specifically about the three parts of this bill and what they would do. The summary reads as follows:

Part 1 enacts the Consumer Privacy Protection Act to govern the protection of personal information of individuals while taking into account the need of organizations to collect, use or disclose personal information in the course of commercial activities.

A consequence of this first part would be to repeal other older pieces of legislation. I think this is absolutely critical, because this goes back to what I have been talking about in terms of how things have changed over the last 20 years. We are now at a place where we really do not know what information we are giving or is being used from us. I realize, as some other colleagues have indicated, 99.9% of the time, we always click that “yes, I accept the terms” without reading the terms and conditions, not knowing exactly how our information is being used and what is actually being linked directly back to us.

Through the consumer privacy protection act, there would be protections in place for the personal information of individuals while, at the same time, really respecting the need to ensure companies can still innovate, because it is important to innovate. It is important to see these technologies do better.

Quite frankly, it is important for me personally, and this will be very selfish of me, that, when I am watching on Netflix a show that I really like, I get recommendations of other shows I might really like. As the member for South Shore—St. Margarets mentioned earlier, when it comes to Spotify, it is important to me also that, when I start listening to certain music, other music gets suggested to me based on what other people who share similar interests to mine have liked, and how these algorithms end up generating that content for me.

It is important to ensure that companies, if we want them to continue to innovate on these incredible technologies we have, can have access to data. However, it is even more important that they be responsible with respect to that innovation. There has to be the proper balance between privacy and innovation, how people are innovating and how that data is being used.

We have seen examples in recent years, whether in the United States or in Canada, where data that has been collected has been used in a manner not in keeping with how that data was supposed to be used. There has to be a comprehensive act in place that properly identifies how that data is going to be used, because, quite frankly, the last time this legislation was updated, 20 years ago, we had no idea how that data would be used today.

By encouraging responsible innovation and ensuring we have the proper terminology in the legislation, companies would know exactly what they should and should not be doing, how they should be engaging with that data, what they need to do with that data at various times, how to keep it secure and safe and, most importantly, how to maintain the privacy of individuals. It is to the benefit not just of individuals in 2022, or 2023 almost, to have data that is being properly secured. It is also very important and to the benefit of the businesses, so that they know what the rules are and what the playing field is like when it comes to accessing that data.

The second part of this bill, as has been mentioned:

...enacts the Personal Information and Data Protection Tribunal Act, which establishes an administrative tribunal to hear appeals of certain decisions made by the Privacy Commissioner under the Consumer Privacy Protection Act and to impose penalties for the contravention of certain provisions of that Act.

This is absolutely critical, because there has to be somewhere people can go to ensure that, if they have a concern from a consumer perspective over the way their data is used and they are not happy with the result from the commissioner, they have an avenue to appeal those decisions. If we do not do that, and we put too much power in the hands of a few individuals, or in this case the Privacy Commissioner under the consumer protection act, if we give all that power and do not have the ability for an appeal mechanism, then we will certainly run into problems down the road. This legislation would help ensure that the commissioner is kept in check, and it would also help consumers have the faith they need to have in terms of accountability when it comes to their data and whether it is being used and maintained in a safe way.

The third part of the bill is the more controversial in terms of whether or not it should be part of this particular legislation or in a separate vote. The summary reads:

Part 3 enacts the Artificial Intelligence and Data Act to regulate international and interprovincial trade and commerce in artificial intelligence systems by requiring that certain persons adopt measures to mitigate the risks of harm and biased output related to high-impact artificial intelligence systems.

That act would provide for public reporting and authorizes the minister to order the production of records related to artificial intelligence systems. The act also would establish prohibitions related to the possession or use of illegally obtained personal information for the purpose of designing, developing, using or making available for use an artificial intelligence system in an intentional or reckless way that causes material harm to individuals.

One of the consequences of artificial intelligence, quite frankly, is that if we allow all of this biased information to be fed into the artificial intelligence systems and be used to create and produce results for important algorithms, then we run the risk of those results being biased as well if the inputs are going to be that way. Therefore, ensuring that there are proper measures in place to ensure individuals are not going to be treated in a biased manner is going to require true accountability.

The reality is that artificial intelligence, even in its current form, is very hard to predict. It is very hard to understand exactly when a person is being impacted by something being generated from an artificially intelligent form. Quite often, a lot of the interactions we already have on a day-to-day basis are based on these artificial intelligence features that are using various different inputs in order to determine what we should be doing or how we should be engaging with something.

The reality is that if this is done in a biased manner or in a manner that is intentionally reckless, people might not be aware of that until it is well past the point, so it is important to ensure that we have all of the proper measures in place to protect individuals against those who would try to use artificial intelligence in a manner that would intentionally harm them.

As I come to the conclusion of my remarks, I will go back to what I talked about in the beginning, that artificial intelligence, quite frankly, has a lot of benefits to it. It is going to transform just about everything in our lives: how we interact with individuals, how we interact with technologies, how we are cared for, how we move around by transportation, how we make decisions, as we already know, on what to listen to or what to watch.

It is incredibly important that as this technology develops and artificial intelligence becomes more and more common, we ensure that we are in the driver's seat in terms of understanding what is going into that and making sure we are fully aware of anybody who might be breaking rules as they relate to the use of artificial intelligence. It will become more difficult, quite frankly, as the artificial intelligence forms take on new responsibilities and meanings to create new decisions and outputs, and we must ensure that we are in a position to always be in the driver's seat and have the proper oversight that is required.

I recognize that some concerns have been brought forward today by different members. At first glance, when the member for South Shore—St. Margarets and others brought forward the concern around the definition of a “minor”, which is not something I thought of when I originally looked at this bill, I can appreciate, especially after hearing his response to my question, why it is necessary to put a proper definition in there. I hope the bill gets to committee and the committee can study some of those important questions so we can keep moving this along.

I certainly do not feel as though we should just be abandoning this bill altogether because we might have concerns about one thing or another. The reality, and what we know for certain, is that things have changed quite a bit in the last 20 years since the legislation was last updated. We need to start working on this now. We need to get it to committee, and the proper studies need to occur at this point so we can properly ensure that individuals' privacy and protection are taken care of as they relate to the three particular parts I talked about today.

Digital Charter Implementation Act, 2022Government Orders

November 28th, 2022 / 12:55 p.m.


See context

Conservative

Rick Perkins Conservative South Shore—St. Margarets, NS

Madam Speaker, data is used for good and data is used for evil. Data is money, data is power and data is knowledge. Data can improve our lives. Data can also harm our lives. Data tells the story of our lives, and our personal data flows globally. The amount of data in the world has doubled since 2020 and is expected to triple by 2025 according to Statista, 2022.

To understand why we need modern privacy rights in the digital world, it is important to understand that businesses have evolved from providing a specific service, like a social network such as Facebook and Twitter or a search engines such as Google or Microsoft to find things, to using data to gather information on individuals and groups, to manage and deploy people's data and to sell their information to others and sell them goods and services.

We have evolved from businesses providing these services for interest to businesses using these services for surveillance on us and making enormous amounts of money on our personal information. As legislators, we must balance the uses of data collection with an individual's right to privacy. It is a delicate balance that Bill C-27 aims to address by modernizing our privacy laws.

At the heart of this long overdue revision to our privacy laws must be the rights of the individual. In my view, commercial usage of data under privacy law should be secondary to personal privacy, and should only be focused on how business interests enhance personal needs and how commercial entities protect individual privacy rights. My remarks today will focus on why this legislation falls far short of what individuals, groups and businesses need for a clear legislative framework of data collection and management of personal information in this digital age.

First, Bill C-27 is really three bills in one omnibus bill. The first bill would update privacy law. The second bill contains a new semi-judicial body and would potentially duplicate what the Privacy Commissioner could do while removing the right to go to the courts. The third is a rushed bolt-on bill on artificial intelligence that does not, in my mind, have much intelligence in it. The Liberal legislation manages to weaken privacy and put up barriers to innovation at the same time.

Bill C-27 fails Canadians right up front in its preamble. Despite demands from privacy advocates over the last few years, the government has failed to recognize privacy as a fundamental right in the preamble. The bill states that individuals' personal information should have the “full enjoyment of fundamental rights”. This is clever language that avoids giving personal privacy the recognition that it is a fundamental right or a fundamental human right.

The wording “full enjoyment of fundamental rights” in the preamble needs to be amended from “of fundamental rights” to “as a fundamental right”. Furthermore, leaving this strictly in the preamble reduces if not eliminates any real legal impact. If privacy is a fundamental right, for it to have true force in this bill it needs to be included as well in clause 5, which notes the purpose of the bill.

Why is privacy a fundamental right? Freedom of thought, freedom of speech and freedom to be left alone are derived from privacy. The legal protections of privacy limit government's intrusion into our lives. In free and democratic societies, we consider these freedoms as essential rights. The rights to think what I want, to say what I want and to be free to choose what I do, what I am interested in and whom I interact with and where I do that in our digital world are data points. To me they are personal information and therefore are part of a fundamental right to privacy.

What does this mean? It means privacy rights under law are prioritized over commercial rights. A rights-based approach serves as an effective check on technology's potential dangers while ensuring businesses can function and thrive.

Government officials have told me this cannot be recognized in the bill the way it needs to be to have true meaning under law and force because it would intrude on provincial jurisdiction. I do not agree, and neither does the Privacy Commissioner of Canada. Both levels of government can regulate privacy and do. The federal government's role is to regulate aspects under its control, including the fact that commerce does not follow provincial boundaries and therefore requires federal oversight.

I believe that most Canadians accept and expect their data to be used to enhance their experiences and needs in our modern society. I also believe that for organizations to obtain the data of Canadians, Canadians must first consent to it, and that if these same organizations find new uses of our data, they need to get express consent as well. Canadians want their data safely protected and not used for things they did not give permission for, and if they choose to end a relationship with a service provider, they want their personal data to be destroyed.

I do not believe Canadians want their personal data sold to other entities without their express consent, and how does Bill C-27 deal with these expectations of Canadians? I think poorly. The legislation, in the summary section, states that the dual purpose of the bill is to “govern the protection of personal information of individuals while taking into account the need of organizations to collect, use or disclose personal information in the course of commercial activities.” What it would not do is place personal privacy rights above commercial interests.

The bill would require express consent in clause 15, and that is true, but a great deal of the bill goes on to describe the many ways in which consent would not be required and how it would be left up to the discretion of the organization that has collected the data if it needs consent for its usage. The bill is also weak in terms of making sure individuals understand consent when given. For consent to be meaningful, the usages proposed must be understood. The lack of definition and the placement of burden of interpretation on businesses expose those same businesses to legal action and penalties if they get it wrong. This lack of clarity may stifle innovation in Canada as a result. The bill needs to ensure that individuals understand the nature, purpose and consequences of the collection, use and disclosure of the information to which they are consenting.

In addition, the bill would give organizations the right to use information in new ways and would require businesses to get an update to consent for this information. That is good and necessary, but the bill would also enable organizations to use the implied consent in subclause 15(5). When combined with paragraph 18(2)(d), this would give businesses carte blanche to use implied consent rather than express consent.

An organization can decide on its own that the original consent implies consent for a new purpose, and they do not need to seek the individual's views. This is a version of the old negative option marketing that was outlawed in the 1990s. Either someone gives consent, or they do not. There is no such thing as implied consent, in my view, and this needs to be removed from the bill.

Additionally, the bill uses the term “sensitive information”, which companies and organizations must determine to protect data, but it does not anywhere in the more than 100 pages define what “sensitive information” is. It needs to be defined in the bill to include information revealing racial and ethnic origin, gender identity, sexual orientation and religious and other affiliations. These are just a few examples.

However, that is not the worst of it. Bill C-27 would introduce a concept called “legitimate interest”. This is a new rule that would rank an individual's interests and fundamental rights below those of the organization that gathered the information, the exact opposite of what a personal privacy bill should do. To do this, subclause 18(3) would allow an organization or business to use information if it has a legitimate interest in doing so. However, here is where it really gets goofy: To try to reduce businesses using our data under the legitimate interest clause for their own needs over ours, the Liberals have decided to limit the power under paragraph 18(3)(b). This clause could prohibit the business or organization from using our information for the purpose of influencing behaviour.

For more than 20 years, since the invention of loyalty and rewards programs, retailers have used people's data to offer products they might enjoy based on their purchasing patterns. Have members ever bought wine online or in store because it said, “If you like this, you might enjoy this alternative”? Have members ever watched a show on Netflix because it was recommended? Have members ever listened to a song on Spotify because it was recommended based on what else they had listened to? Well, guess what. Paragraph 18(3)(b) could now make this service illegal.

The Liberals cannot get express consent right, and they are allowing companies to use people's data with implied consent or no consent at all. The Liberals are also putting the business use of people's personal data above their privacy rights. That is why it is really the no privacy bill. At the same time, the Liberals are making illegal the good parts of what businesses do in enhancing the customer experience by removing the ability to study purchasing patterns and offering products that we might enjoy because of paragraph 18(3)(b). This bill makes influencing people's decisions illegal.

The minister said to me and mentioned in the House in his opening speech on the bill, as have other members today, that he is proud to be protecting children from harm in this digital bill. This 100-page legislation has only one clause related to children. Subclause 2(2), under “Definitions”, states that “information of minors is considered to be sensitive”, but the bill does not define “sensitive” nor does it define what a minor is. Officials tell me that the definition of a minor is determined by provincial law, so each province would have different rules, and companies would have to comply with the different rules in every province.

If the protection of children were really a major purpose, this legislation would devote some space to defining both what a minor is and what sensitive information is. During COVID, minors used many online apps and programs to continue their formal education. There were then and still are no protections under law as to what is done with their data. This technology would be a new normal for our education system. The online surveillance of children resulting from the COVID experience is huge and protections are zero, even with this bill.

This bill needs to define in law, not regulation, age-appropriate consent for minors, and comprehensive rules to prevent the collection, manipulation and use of any minor's data. This bill leaves it up to businesses to decide what is sensitive and appropriate for minors. It is a colossal failure on the minister's main selling point for this no privacy bill.

The bill is silent on the selling of personal data. It needs provisions on the limits and obligations of data brokers. The bill is silent on the use of facial recognition technology. The bill also prohibits using data in a way that produces significant harm and defines it inadequately. For example, psychological harm caused by a data breach and embarrassment caused by privacy loss are not included. The damages role needs to be expanded to include moral damages, since most contraventions of privacy do not involve provable, quantifiable damages.

Creating more government bureaucracy and growth is the true legacy of the Liberals in government. This bill is no exception, with the creation of a body to appeal the Privacy Commissioner's rulings to. The appointed new body of non-lawyers is called the personal protection and data tribunal, and it is the second part of the bill. Frankly, these powers, if they really are important, should be given to the Privacy Commissioner to eliminate the middle man of bureaucracy. There is no need for this tribunal.

Finally, let us turn to the ill-conceived, poorly structured and ill-defined artificial intelligence part of Bill C-27. It really needs to be removed from this legislation and puts this bill's passage into question. AI is a valid area to legislate, but only with a bill that has a legislative goal. That is why I am hopeful that the Speaker will rule in favour of the NDP's point of order, reiterated by our Conservative House leader, which would ensure that part 3 of the bill is voted on separately from part 1 and part 2.

Essentially, this part of Bill C-27 would drive all work on AI out of Canada to countries with clearer government legislation. It tells me the government has not done its homework, does not really know what AI is or will become, and has no idea how it will impact people in our country.

The bill asks parliamentarians to pass a law that defines no goals or oversight and would give all future law-making power to the minister through regulation, not even to the Governor in Council but to the minister. The minister can make law, investigate violations, determine guilt and impose penalties without ever going to Parliament, cabinet or any third party.

It is a massive overreach and is anti-democratic in an area critical to Canada's innovation agenda. Promises of consultation in the process of crafting regulations is too little, too late. It puts too much power in the hands of unelected officials and the minister.

The definition in the bill of what AI is, and therefore what it wants total regulatory power over, is a system that autonomously processes date related to human activities using a genetic algorithm, a neural network, machine learning or other networks to make recommendations or predictions. If we think this is futuristic, it is not. It is already happening in warfare to determine and execute bombings.

Without parliamentary oversight, the bill introduces the concept of “high-impact systems”. It does not define what that is, but it will be defined in regulation and managed in regulation. No regulatory power should ever be given to the minister or the Governor in Council for anything that is not defined in law.

The only thing the bill defines is the unprecedented power to rule all over this industry and the fines to those who breach the unwritten regulations. The massive financial and jail penalties that extend down to the developers and the university researchers for undefined breaches of law as part of the statute are huge.

Unless this portion of the bill is separated when members vote, this AI section is reason alone that the bill should be defeated. AI is a significant need, but it needs a proper legislative framework, one that is actually developed with consultation.

I urge all members to read the bill carefully. Current privacy laws need amendment, but the current law is preferable to this ill-defined proposal. The AI bill would drive innovation and business out of Canada's economy, making us less competitive.

It is hard to believe anyone could get this legislation so wrong, especially since this is the second time the Liberals have proposed updating our privacy laws. Without splitting the bill, without having separate votes and without considerable amendments in committee in the first two parts, the bill should be defeated.

I urge all members to consider this seriously in their deliberations as we go on to the many speeches that we will hear. While this is a critical point of updating our personal privacy, the bill, in its current state, does not do it and it gives equal if not greater rights to businesses and organizations than it does to individuals.

Digital Charter Implementation Act, 2022Government Orders

November 28th, 2022 / 12:55 p.m.


See context

NDP

Laurel Collins NDP Victoria, BC

Madam Speaker, privacy rights are so critical. When they are violated, consumers deserve to be compensated. There have been numerous examples in the United States where consumers have been compensated in the realm of hundreds of millions of dollars. For the same breach here in Canada, consumers have not been compensated.

I am wondering if the member would support amendments that would ensure that, in Bill C-27, there is parity, and for the same breach, Canadians and Americans would be getting fair compensation.