Mr. Speaker, 34 years ago, in 1988, Justice Gérard La Forest of the Supreme Court of Canada said that “privacy is at the heart of liberty in a modern state” and that it is “worthy of constitutional protection”. All Canadians are worthy of having their privacy respected.
It is our duty as parliamentarians to do our best to protect Canadians' privacy rights, especially when those rights are under such strain today.
Bill C-27, formerly Bill C-11, is designed to update Canada’s federal private sector privacy law, the Personal Information Protection and Electronic Documents Act, or PIPEDA; to create a new tribunal; and to propose new rules for artificial intelligence systems. It is a reworking of Bill C-11, and it has three components: the consumer privacy protection act; the personal information and data protection tribunal act, which creates a new tribunal; and the artificial intelligence and data act.
The bill applies to Canadians' data held by the private sector. It does not apply to CSIS, the RCMP or CSE; that and other government-held data is governed by the Privacy Act. Canada's private sector privacy laws have not been updated in 22 years, while Europe adopted the General Data Protection Regulation in 2016.
When we last updated this act, 22 years ago, the member for South Shore—St. Margarets was turning 21 years old, and society was going through big changes. The world had just gotten past the Y2K scare. We were looking at what was going to happen to computers when the clock changed from 1999 to 2000. In certain areas, we did not know if the power would go out or what would happen.
People listened to music on CD Walkmans. Apple was over a year away from launching a cutting-edge new technology called the iPod. Less than 30% of Canadians actually owned a cellphone. The most popular cellphones were Motorola flip phones and the Nokia brick phone, with texting that used the number pad and almost no web browsing capabilities. The most sophisticated app was called Snake. A fledgling Canadian telecommunications company was just starting, and it was called BlackBerry.
That is how long it has been since we updated our laws. Today, 22 years later, data collection is getting more sophisticated, and surveillance is more of the norm than the exception.
Apple announced a few weeks ago that the Apple Watch can track and tell when a woman is ovulating. What is concerning, and we are going to talk a lot about data for good and data for wrong, is that this technology can tell if a woman skips a cycle and can then identify that there has been a miscarriage or an abortion. This is very concerning.
Our Fitbits, our web history and our Apple phones can tell us how many steps we took in a day. Sometimes when we are in Parliament it is about 10, and if we are door knocking it is about 25,000. That does not sound important, but that information also lets those companies know where we have been, where we are going and where we live.
Facial recognition technology can identify a face like a fingerprint. Sometimes that is good: we have heard from law enforcement that it can be used to fight human trafficking. Sometimes that is wrong, when people on the street are identified by name, along with their data and where they have been. Let us think of Minority Report, where everywhere someone goes, they are identified. It did not matter where they were going or where they had been. That is something that could happen with facial recognition technology.
Google and Amazon listen and collect our data in our bathrooms, living rooms, kitchens and cars. How many times have we been in conversations and Siri asks, “What was that?” Siri is always listening. Amazon is always listening. Speaking of cars, they are cellphones on wheels. When we connect to a rental car, and a lot of us rent cars, we see five or six other phones in the history. That car has downloaded the data from every phone connected to it, and once we connect, it holds our information as well. It is very concerning.
There are many examples of this hurting Canadians in the last several years. Two summers ago, Tim Hortons had a privacy breach: every time someone rolled up the rim, its app told Tim Hortons where they went afterwards, whether they went home or where they were staying. It collected all that data, and it was a big problem.
In the ethics committee, we studied facial recognition technology. There was a company called Clearview AI, which took two billion images off the Internet, including many of ours, and provided them to police. There was no consent; the information simply ended up in the hands of law enforcement.
There is Telus's “data for good”. During the pandemic, Telus collected our data. It knew whether we went to the grocery store or the pharmacy, or stayed home. It gave that information to the government and called it “data for good”, relying on what it called de-identification. I will talk later about how that hurt everyone.
Lastly, there is doxing, using personal information to out people. GiveSendGo is a big example. The information of people who donated to different causes and events ended up with a U.S. company, and at one point a website using Google's mapping tools identified all those donors and showed exactly where they lived. Everyone who donated had their information identified and outed. That was terrible.
Surveillance has resulted not just in a wholesale destruction of privacy but in a mental health crisis among children and youth as well. I am glad to hear the minister speak about children and youth, because data has certainly affected them and continues to.
Canada’s federal government has repeatedly failed to take privacy seriously and construct a legal framework that protects the rights of Canadians in the digital age. This bill normalizes surveillance and treats privacy neither as a fundamental human right nor even as a matter of consumer protection. To make this point very clear, nowhere in Bill C-27 does it state that privacy is a fundamental human right. That statement should be the crux of any new privacy legislation, hammered home from the preamble to the end of Bill C-27 and carried through the entire document. It is not there. It is nowhere and, therefore, holds no weight.
This bill does not use that statement from the outset, yet it should be the pillar by which the bill is designed and led. Only a strong bill will ensure that Canadians' privacy rights are protected. Because of this omission, the bill is very weak, making it easier for industry players to be irresponsible with people's personal data. This is ironic, as Canada has signed on to the Universal Declaration of Human Rights and the International Covenant on Civil and Political Rights. That is where the bill starts and ends: with its failure to properly address privacy for Canadians.
Conservatives believe that Canadians’ digital privacy and data need to be properly protected. This protection must be a balance that ensures Canadians’ digital data is safe and that their information is properly protected and used only with their consent, while not being too onerous to be detrimental to private sector business. It is a balance.
Let us be clear. We need new privacy laws. In fact, it is essential to Canadians in this new digital era and to a growing digital future, but Bill C-27 needs massive rewrites and amendments to properly protect privacy, which should be a fundamental right of Canadians. The bill needs to be a balance between the fundamental right to privacy and privacy protection and the ability of business to responsibly collect and use data.
It also needs more nuance; parts of this bill are far too vague. The definition of tyranny is the deliberate removal of nuance, so to create more equality and fairness in privacy rights, and to ensure businesses and AI use data for good, we need more nuance, more detail and more explanation, not less. As my grandfather used to say, “If you're going to do something, make sure you do it right or don't do it at all.”
Besides the omission of privacy as a fundamental right, the bill needs a massive rewrite. First, the bill doubles down on a flawed approach to privacy, using a notice and consent model as its legal framework. The legal framework of Bill C-27 remains designed around a requirement that consent be obtained for the collection, use and disclosure of personal information, unless one of the listed exceptions to consent applies. Chief among those exceptions is “legitimate interest”.
What is scary about legitimate interest is that the businesses themselves will determine what legitimate interest means and what will be exempt. A quote on this from Canada’s leading privacy and data-governing expert, Teresa Scassa, says that this provision alone in the bill “trivializes the human and social value of privacy.” The legitimate interest provision allows Facebook, for instance, to build shadow profiles of individuals from information gathered from their contacts, even those with no Facebook access or accounts, without asking for their permission.
Have colleagues ever seen the “people you may know” feature on Facebook? Sometimes people turn up there whom one cannot remember ever meeting, and sometimes even though neither party is actually on Facebook. That is because Facebook builds profiles and shadow profiles from other members' contacts. Facebook has a feature that suggests we share our contacts, and people do: they give all their friends' information to Facebook, including emails, addresses and sometimes private phone numbers. In the U.S., that information has turned up in troubling ways. Here are a couple of examples. An attorney had a man recommended as a friend he might know who was defence counsel on one of his cases, when they had only communicated through a work email. Another time, a man who had secretly donated sperm to a couple had Facebook recommend their child as a person he should know, despite not having the couple, whom he once knew, among his Facebook contacts.
The legitimate interest exception needs more nuance. It needs to be better defined, or it is useless. As written, it allows for too much interpretation; in other words, it allows something to be permitted unless it is not. It is far too broad.
Additionally, consent is required to be “in plain language that an individual to whom the organization’s activities are directed would reasonably be expected to understand.” Yet Bill C-27 makes it hard to determine what legitimate interests are, and that goes back to the failure to treat privacy as a fundamental human right.
Compare this section to the European Union's privacy law, the GDPR, which is, as the minister stated, the gold standard. Under Bill C-27, the legitimate interest exemption is available unless there is an adverse effect on the individual that is not outweighed by the organization's legitimate interest, whereas the GDPR weighs the interests and fundamental rights and freedoms of the individual. Adverse effects such as data breaches are shocking and distressing to those impacted, yet some courts have found that the ordinary stress and inconvenience of a data breach is not a compensable harm, since breaches have become a routine part of life, probably for the last two years at least. On that standard, the legitimate interest exemption will be far too broad.
However, Bill C-27 would take something that was meant to be quite exceptional to consent in the European Union's privacy laws and make it a potentially mainstream basis for the use of data without consent. Why would it do this? It is because Bill C-27 places privacy on par with commercial interests in using personal data, something that would not happen if privacy were recognized in the bill as a fundamental right for Canadians.
Additionally, we need to be wary of consent. If consent is to be mandatory, it should be made easier to give meaningfully. Has anyone ever scrolled through the consent agreement on their iPhone? Has anyone actually read all of it? Has anyone read Google's 38 pages of terms every time they sign up for or use Google?
Consent is not easy. It is not simple, and certainly this proposed law would not make it any simpler. We need to be wary of consent, and we need to ensure that consent is consensual, both in language and intent, and that we all know exactly what we are signing up to do, to give and to receive.
There is another term I want to explain called “de-identification”. The bill talks a lot about de-identification and defines it to mean modifying “personal information so that an individual cannot be directly identified from it”, while acknowledging that “a risk of the individual being identified remains.” In other words, the direct identifiers are stripped out, but a risk of identifying the individual remains.
Members will remember my Telus data for good example. Telus gave that information to the government during COVID, even though a risk of the individual being identified remained. De-identification should be scrapped, and instead we should use the term “anonymize”, which is also in the bill and is what the GDPR does. In the bill, to anonymize “means to irreversibly and permanently modify personal information, in accordance with generally accepted best practices, to ensure that no individual can be identified from the information, whether directly or indirectly, by any means.”
I would ask members which one they would prefer. Would they like to be re-identified, as there is a possibility, or would they like no identification by any means?
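To make the distinction concrete, here is a small illustrative sketch. The names, postal codes and datasets are entirely invented for illustration; nothing here comes from the bill or any real breach. It shows how a “de-identified” record, with the name removed but quasi-identifiers left in, can be linked back to a person using an outside dataset, while an anonymized aggregate cannot.

```python
# Hypothetical example of de-identification vs. anonymization.
# All names and data are invented for illustration only.

# A "de-identified" record: the name is dropped, but quasi-identifiers
# (postal code, birth year) remain in the record.
deidentified = {"postal_code": "K1A 0A6", "birth_year": 1987,
                "visited_pharmacy": True}

# An outside dataset (for example, a public voters list) can re-identify
# the person by linking on those same quasi-identifiers.
voters_list = [
    {"name": "Jane Doe", "postal_code": "K1A 0A6", "birth_year": 1987},
    {"name": "John Roe", "postal_code": "M5V 2T6", "birth_year": 1990},
]
matches = [v for v in voters_list
           if v["postal_code"] == deidentified["postal_code"]
           and v["birth_year"] == deidentified["birth_year"]]

# One unique match: the "de-identified" record points back to Jane Doe.
assert [m["name"] for m in matches] == ["Jane Doe"]

# "Anonymized" data, by contrast, is an irreversible aggregate: a count
# with no record-level fields left to link on, so no such lookup exists.
anonymized = {"pharmacy_visits_in_region": 1}
```

The sketch is simplified, but it captures why a residual "risk of the individual being identified" matters: as long as record-level quasi-identifiers survive, linkage attacks remain possible.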
Another major flaw in Bill C-27 is the creation of a bureaucratic tribunal instead of giving the Privacy Commissioner more bite. The tribunal is a time-waster; the Privacy Commissioner should be allowed to levy fines and should be given more power and more bite. It is unclear why Canada needs one, because the EU, the U.K., New Zealand and Australia do not have tribunals mediating their fines for privacy violations. Furthermore, it would no doubt force those who have had their privacy violated to wait years for a right of action.
I will put this straight. First, the Office of the Privacy Commissioner, or OPC, would make a ruling. Then the tribunal the government proposes could reverse the Privacy Commissioner's ruling, and then the courts would be allowed to rule on the tribunal's ruling. We would have a decision, another decision and a third decision, and each one of them could be countered.
Let me guess how long that would take. What do members think? Forty-eight hours? Six months? Right now, the average is one year for the Privacy Commissioner, and we could add another year for the tribunal plus another year for appeals.
I ask this: Is it fair to have the average Canadian whose data has been breached, with their limited resources, go up against Facebook and Amazon and then spend three years in court? Does this protect fundamental privacy rights? Is this not just adding another layer of government that we certainly do not need?
The absence of rights-based language in the bill might tip the scale away from people in Canada when the OPC and the tribunal weigh the privacy interests of people against the commercial interests of companies. Again, what does this come back to? Privacy is not listed as a fundamental right of Canadians.
Lastly, the AI portion of this bill needs a complete rewrite, and it needs to be split into its own bill.
I want to commend the minister for bringing this forward. He wants to be the first in the land to bring this part of the bill forward, but to be honest, consultations only started in June. We have met with many individuals who have had no input into this bill, and although AI is addressed, many parts are missing.
First of all, there would be no independent, expert regulator for automated decision systems, nor even the shell of a framework for responsive artificial intelligence regulation and oversight. Instead, the bill says that the regulations will be determined at some future date and that decisions will come from the Minister of Innovation, Science and Economic Development or a designated official.
Again, this part includes a new tribunal and puts decisions where they should not be, onto the government, with enforcement and decision-making by the minister or the minister's designated ISED official. These would be political decisions on privacy. Does everyone feel comfortable that we are now shifting from a tribunal to the government?
This part of the bill would shift all of that to the government, to the minister or his designate. It reminds me of the old saying, “I'm from the government, and I'm here to help.”
There is also no mention of facial recognition technology in this part of the bill, despite the reports that have come from the ethics committee and the examples I gave earlier on FRT. Certainly, that is worth more study.
There are some parts of the bill that have good aspects and certainly ones we can get behind, including the protection of children's privacy. As a father, I know it is so very important. Our children now have access to all kinds of different applications on their phones, iPads and Amazon Fires.
Our children are being listened to and surveilled. There is no question that businesses are taking advantage of them, and that is something we definitely need to talk about.
The attempt to regulate AI, though, as I have stated, needs major revisions. Without a proper privacy statement, the bill lacks a balanced purpose clause establishing that the purpose of the CPPA is to set out rules governing the protection of personal information in a manner that balances the right to privacy with the need of organizations to collect, use or disclose personal information.
We should be shooting beyond the European Union's privacy law, aiming to be the world leader in balancing privacy protection with businesses and industries using data for good. In doing so, we would attract investment and technology, all the while protecting Canadians' fundamental right to privacy.
Canada needs privacy protection that builds trust in the digital economy, where Canadians can use new technologies for good while being protected from the bad: profiling, surveillance and discrimination. The minister said that he wants to seize the moment and that we need leadership in a constantly changing world. Most importantly, the minister said that trust has never been more important.
If we do not get this right, and if we do not make sure that privacy is a fundamental human right, and declare that in the document and build the document around that right, we are doing two things: We are not prioritizing Canadians' privacy, as we are certainly not putting privacy at the forefront of the bill, and we are certainly not showing leadership in an ever-changing world.
As I noted at the outset, the technologies of 22 years ago have changed significantly, and today's technologies are changing even faster. In the next 22 years, technologies will be more embedded in our lives, not less. We will have AI that does good.
One of the stakeholders we met with actually talked about AI for good. They talked about embedding AI into the government's passport system. That might actually mean we could get passports within 48 hours. Could we imagine that? Could we imagine embedding technology for good into a system that would allow Canadians to get the things they need when they need them?
We love technology and want to embrace it. We just want to make sure that, number one, privacy is protected, and that we do the hard work of building frameworks that hold Canadians' fundamental human right to privacy in equal balance with the economy, democracy and the rule of law. This bill does not do that, not yet.
Let us work to make sure we come back with a bill that does that.