Digital Charter Implementation Act, 2020

An Act to enact the Consumer Privacy Protection Act and the Personal Information and Data Protection Tribunal Act and to make consequential and related amendments to other Acts

This bill was last introduced in the 43rd Parliament, 2nd Session, which ended in August 2021.

Sponsor

Navdeep Bains  Liberal

Status

Second reading (House), as of April 19, 2021
(This bill did not become law.)

Summary

This is from the published bill. The Library of Parliament often publishes better independent summaries.

Part 1 enacts the Consumer Privacy Protection Act to protect the personal information of individuals while recognizing the need of organizations to collect, use or disclose personal information in the course of commercial activities. In consequence, it repeals Part 1 of the Personal Information Protection and Electronic Documents Act and changes the short title of that Act to the Electronic Documents Act. It also makes consequential and related amendments to other Acts.
Part 2 enacts the Personal Information and Data Protection Tribunal Act, which establishes an administrative tribunal to hear appeals of certain decisions made by the Privacy Commissioner under the Consumer Privacy Protection Act and to impose penalties for the contravention of certain provisions of that Act. It also makes a related amendment to the Administrative Tribunals Support Service of Canada Act.

Elsewhere

All sorts of information on this bill is available at LEGISinfo, an excellent resource from the Library of Parliament. You can also read the full text of the bill.

Ryan Williams Conservative Bay of Quinte, ON

Thank you very much, Mr. Chair.

Through you, and to echo the sentiments from the rest of our colleagues, it sounds like we have the right person in the role. Thank you very much for coming today.

I wanted to get a bit more into the old Bill C-11. Privacy is obviously a lot harder to protect these days, because it is digital. You mentioned looking at consent, proportionality and the GDPR. Is there anything else you've seen in your work as a law clerk on the assessment of the old Bill C-11, and how effective it is? Do you see that modelling the GDPR from Europe at this point?

Matthew Green NDP Hamilton Centre, ON

Do you have more to comment on Bill C-11? I'm glad you brought that up, because it's certainly one that we seem to have gotten bogged down on. I'm wondering if you would share any perspectives on Bill C-11, the former one.

June 13th, 2022 / 11:30 a.m.



Nominee for the position of Privacy Commissioner, As an Individual

Philippe Dufresne

My main priorities are going to be ensuring that Canadians can have better understanding and better protection. There is an expectation that the private sector law may come first; certainly it did with Bill C-11. It would be a priority to ensure that Canadians can participate in the digital economy. Canada's market—

June 2nd, 2022 / 5:05 p.m.



Privacy Commissioner of Canada, Office of the Privacy Commissioner of Canada

Daniel Therrien

Our submission on Bill C-11 dealt with that in detail.

I would say that one case that comes to mind would be research, for instance. Health was mentioned earlier in this conversation. Research for health purposes could be, within certain parameters, an exception to consent. Bill C-11 had exceptions to consent that were way too broad, but as I said, consent is not a panacea. It is normal that there would be some exceptions to consent.

Ryan Williams Conservative Bay of Quinte, ON

Thank you so much.

Mr. Therrien, I want to ask about exceptions for consent. Bill C-11 mentioned some exceptions for consent, and you mentioned telecommunication carriers. What other examples have you seen for exemptions for those looking for consent from Bill C-11?

June 2nd, 2022 / 4:25 p.m.



Privacy Commissioner of Canada, Office of the Privacy Commissioner of Canada

Daniel Therrien

I'll speak in some detail, but I would refer you to the key recommendations for a new private sector law that accompanied a letter I sent to this committee further to its study on data mobility. There are two or three pages of specific recommendations. I'll just point to the ones most relevant to your question.

When consent is appropriate—it's not always appropriate, but when consent is appropriate—it is very important that it be meaningful. Bill C-11 would have removed the requirement in the current law that consumers have the knowledge and understanding necessary for consent to be meaningful. I think that requirement of knowledge and understanding, which was not in Bill C‑11, needs to be reintroduced into the law.

Bill C-11 also allowed companies to define the purposes for which they would collect information in an almost unfettered way. Other laws provide parameters: companies can only collect information for purposes that are "specified, explicit and legitimate". That allows the regulator to then determine whether the purposes defined by a company were indeed specified, explicit and legitimate.

Another important factor is accountability. We think that accountability in Bill C-11 was defined too broadly. It is important that corporate accountability be defined by an objective standard, i.e., adopting procedures to comply with the law. Bill C‑11 simply said that so long as companies adopt procedures, that's a demonstration of accountability. That is too subjective. The law needs to set out objective standards, such as providing that accountability means adopting procedures to comply with the law.

In broad terms, the law should not refer to subjective standards defined by companies or departments. The law should define objective standards that are knowable by citizens and companies. Companies would know and would have certainty through objective standards. These objective standards could be examined by the regulator to determine whether indeed the company was accountable in such a way as to comply with the law or whether there was sufficient consent based on knowledge and understanding by the consumer.

Ryan Williams Conservative Bay of Quinte, ON

Thank you, Mr. Chair.

Mr. Therrien, thank you. I'll join the rest of the committee in thanking you for your service, sir.

You made a great statement in the text of your remarks that it is "neither realistic nor reasonable to ask individuals to consent to all possible uses of their data in today's complex information economy", and you specifically mentioned AI. You also said, "While disruptive technologies have undeniable benefits, they must not be permitted to disrupt the duty of a democratic government to maintain its capacity to protect the fundamental rights and values of its citizens."

We're going to start with a case study just to kind of go through this. What I'd like to do is to try to relate this to changes that we need to make to Bill C-11, whenever it comes back to us. Yesterday you made a statement regarding mass surveillance of Canadians through the Tim Hortons app. Canadians who downloaded this popular app learned that their movements were being tracked every few minutes. You rightly pointed out that this kind of tracking can reveal to the company where people live, work and go to school, and even where they attend medical appointments.

When it comes to Bill C-11, what changes do we need to see so that this doesn't happen further to Canadians?

June 2nd, 2022 / 4:20 p.m.



Privacy Commissioner of Canada, Office of the Privacy Commissioner of Canada

Daniel Therrien

I'll be glad to do that.

As I recall, Bill C-11 was tabled in the fall of 2020. The government has announced that a successor will be tabled in 2022, perhaps before the summer.

I thought it was important that the OPC start thinking about how it would be organized to inherit new responsibilities that the earlier Bill C-11 would have given the OPC. We don't know what the new bill will say, but there's a chance, of course, that it will have many elements of Bill C-11. The idea is to get ahead of the curve and think about how we would exercise these responsibilities, so we're not caught off guard if the transition period after the adoption of the bill is shorter than we would hope.

Among the responsibilities that Bill C-11 would have given the OPC—and we think it's likely this will continue to be the case—is order-making. It would be subject to appeal before a tribunal, which we think is unnecessary...but still order-making. That would require, we think, the setting up of an adjudication branch of arbiters or adjudicators. Right now, we have investigators who make recommendations, but with new legislation that has order-making powers, we would likely need to have adjudicators somewhat distant from investigators to ensure the fairness of processes.

That is one area we looked at.

The bill also provided for a review function for codes of practice.

We have looked at all the new authorities Bill C-11 would have given the OPC, and we have given some thought to how we would exercise these responsibilities.

Daniel Therrien Privacy Commissioner of Canada, Office of the Privacy Commissioner of Canada

Thank you very much, Mr. Chair. That is very kind.

Good morning, Mr. Chair, and members of the committee.

Thank you for the opportunity to appear before you today to discuss some of the lessons of the last eight years and some high-level recommendations on how the law should be reformed.

We are living in the fourth industrial revolution, the digital technology revolution. These technologies are disruptive.

As the pandemic has shown, there can be several benefits to this, for instance in health and education, or even the environment. Digital technologies can indeed serve the public interest.

We have also learned over the years that the consent model as a means of protecting privacy has serious limitations. It is neither realistic nor reasonable to ask individuals to consent to all possible uses of their data in today's complex information economy, for instance in some circumstances where artificial intelligence is used. The balance of power is too unequal and the asymmetry in terms of who controls personal information is too great.

In fact, consent can be used to legitimize uses that, objectively, are completely unreasonable and contrary to our rights and values. And refusal to provide consent can sometimes be a disservice to the public interest.

During my term, however, we have also seen through investigations that these technologies can not just present potential risks to privacy but also cause real harms.

For example, our Clearview AI investigation showed that the company used facial recognition technology in a way that amounted to mass surveillance. And our investigation into the RCMP's use of the Clearview technology demonstrated the growing risks posed by public-private partnerships and the absence of a legal framework governing the use of such sensitive biometric data.

The Cambridge Analytica scandal, studied by a committee composed of members of the Standing Committee on Access to Information, Privacy and Ethics and legislators from other countries, showed that privacy violations could lead to violations of democratic rights.

Finally, our investigation into Statistics Canada revealed that a government institution believed evidence-based policymaking could justify the collection of line-by-line financial records of citizens, another form of surveillance.

This leads to the following conclusion. While disruptive technologies have undeniable benefits, they must not be permitted to disrupt the duty of a democratic government to maintain its capacity to protect the fundamental rights and values of its citizens.

What we need, then, is real regulation of digital technologies, not self-regulation.

The previous Bill C‑11 would unfortunately have allowed more self-regulation by giving companies almost complete freedom to set the rules by which they interact with their customers, and by allowing them to set the terms of their accountability.

If we draw on the lessons of the last few years, we will adopt private sector privacy laws that will allow for innovation—sometimes without consent—for legitimate commercial purposes and socially beneficial ends, within a framework that protects our values and our fundamental rights.

In the public sector, we also need laws that limit the state's ability to gather information about its citizens beyond that which is necessary and proportional to achieving its objectives.

Overall, we need federal laws in the public and private sectors that are rights based, that have similar and, ideally, common principles for both sectors, that are based on necessity and proportionality, that are interoperable at both the national and international levels, and that give the regulator the audit and enforcement powers it needs to ensure compliance.

Adopting adequate privacy legislation is not sufficient in itself. The regulator must also have adequate enforcement powers, be properly funded and be given regulatory discretion to manage its workload to ensure that it can protect the greatest number of individuals effectively within limited resources.

In July, the Privacy Act extension order will come into force, giving foreign nationals abroad the same right as Canadians to request access to personal information about themselves that is under the control of federal government institutions.

The government believes that this will result in a large increase in the number of requests for access, which will trickle down by way of complaints to our office. The OPC has communicated its funding needs to the government. To date, no new funding has been provided. This is a critical issue for the OPC as it requires additional funds to perform these newly mandated duties.

As for the broader financial impact of law reform, we believe, based on the experience of other data protection authorities, that our budget would need to double, approximately, if the promised new law for the private sector were similar to the former Bill C-11. We also anticipate the expansion of advisory functions and the obligation to review industry codes of practice.

We welcome these new responsibilities as they would promote compliance with the law when programs are at the design stage. Nonetheless, we are concerned that the non-discretionary nature of these activities and of our investigative work would deprive us of the ability to risk-manage our caseload and give greater priority to matters of higher risk. We therefore urge you, when a bill is eventually presented to Parliament, to give my office greater discretion to manage our caseload by selecting its advisory and investigative files to ensure that we can protect the greatest number of Canadians effectively within our limited resources. Not only would this allow us to operate more efficiently, but we have also estimated that it would result in a cost saving of nearly $12 million per year.

As for enforcement powers, I have consistently called for quick and effective remedies, including the power to issue orders and to impose significant monetary penalties proportional to the financial gains that businesses can make by disregarding privacy. Yet further evidence of the need for these powers was provided yesterday with the result of our investigation into Tim Hortons.

Like many other data protection authorities in Canada and abroad, the OPC should also be empowered to conduct proactive audits to verify compliance with the law. The need for this was demonstrated in spades in the recent story about the Public Health Agency's use of mobility data that was obtained in modified form from private sector organizations. In a world where innovation requires trust, an important factor of trust in the population would be the assurance that an independent expert has their back, will verify and ensure compliance with the law and will take appropriate action to stop or correct non-compliant behaviour. Again, these are powers or authorities that a number of our provincial colleagues have in Canada and that a number of our international partners have, including in common-law jurisdictions such as the United Kingdom.

I would like to leave you with a few final thoughts on the future of privacy laws federally and their interoperability with the laws of other jurisdictions, both domestically and internationally.

Domestically, we see that Canada's three most populous provinces have made recent proposals towards responsible innovation within a legal framework that recognizes privacy as a fundamental right. Quebec adopted such a law in 2021.

All of these provinces confer order-making powers on data protection authorities, and they propose to give them the authority to impose monetary penalties directly, without going through an administrative appeal but subject to judicial review. We ask for similar powers, in part so that all Canadians, regardless of their jurisdiction, have access to quick and effective remedies if their privacy rights are violated, and in part to ensure that the OPC remains an influential and often unifying voice in the development of privacy in Canada. If the powers of the provincial and federal authorities are different, and if the process federally is longer than that in the provinces, I'm concerned that citizens will turn to the provincial authorities and that the influence of the federal authority will diminish.

Globally, it is also essential that Canada's laws be interoperable and not too different from international standards. Some industry stakeholders say that a made-in-Canada approach has been good for the country and that a rights-based approach would hurt innovation.

The idea that rights-based law would impede innovation is a myth. It is simply without foundation. In fact, the opposite is true. There can be no innovation without trust, and there is no trust without the protection of rights.

In our view, a made-in-Canada approach that would be too different from what is becoming the international gold standard would not be in the interest of Canadian business. To the contrary, interoperable laws are in Canada's interest.

In closing, my message to this committee is this: continue the work that you and your predecessors have been doing on these important files. As legislators, you have the power to bring meaningful change to our privacy regime and your reports to date point in the right direction.

Remember also that our laws should protect the right to privacy in its true sense: freedom from unjustified surveillance. Thus, legislation should recognize and protect the freedom to live and develop independently, free from the watchful eye of the state or surveillance capitalism.

In other words, the law should protect our values and rights, hard won over centuries, and should not be set aside in order to benefit from digital technologies.

It has been an honour working with all of you. Thank you for the extra time this afternoon.

I am happy to answer any questions you might have.

Online Streaming Act (Government Orders)

March 29th, 2022 / 4:30 p.m.



NDP

Charlie Angus NDP Timmins—James Bay, ON

Madam Speaker, I am very proud, as always, to rise in the House to speak for the incredible people of Timmins—James Bay.

We are here to talk about Bill C-11. We have to step back to the last Parliament, where we had Bill C-10, which this bill updates, and what was then Bill C-11, which was supposed to address the long-outstanding need to bring Canada's laws up to standard in dealing with the tech giants.

This Bill C-11 was the old Bill C-10, which should have been pretty straightforward. Who does not want Facebook to finally start paying tax? This is a company that made $117 billion in profit last year, up $31 billion in a single year, and it is not paying tax. That is what Bill C-10 was supposed to do, but then our current Minister of Environment, who was then the minister of heritage, turned it into a total political dumpster fire. It was so bad the Liberals had to call an election, just to get that thing off the table.

Now the Liberals have brought it back. At the time, the then Bill C-11 was supposed to be the privacy bill, a pretty straightforward thing. However, that was another dumpster fire, because the Privacy Commissioner had to come out and say that the Liberal plan to update privacy rights would actually undermine basic Canadian privacy in the realm of digital technology. In particular, the Privacy Commissioner found that this American company, Clearview AI, broke Canadian law through its illegal use of images in facial recognition technology. In response, the Liberals were going to rewrite the rules so that it would be easier for Clearview AI to break the law, rather than for the Privacy Commissioner to protect Canadians.

The Liberals had to call an election to erase all of that. Now the Liberals have been given, as they have so many times in the past, one more chance. The deus ex machina comes down and gives them a chance to do things all over again.

Now we are looking at this Bill C-11. One thing I can say about this Bill C-11 is that it fixed a lot of the problems with the previous dumpster fire, maybe by moving the minister, although God help the planet now that he is looking after the environment. Those are just my own personal thoughts from having read his ridiculous environment plan today. What he was going to do for culture, he is now doing to our environment.

Having said that, I would say that there are a couple of key issues we need to be looking at. We need to be looking at the need for Canada's legislation to actually address the right of artists to get paid in the digital realm. For too long in Canada we have sort of patted our artists on the head. We all talked about the favourite TV shows we had growing up. One of the Liberals was talking about the Polkaroo.

Arts policy should not be that we just pat our artists on the head. This is an industry. It is one of our greatest exports. We are not promoting arts as an export or promoting our artists to do the work they need to do. We saw from COVID the devastating impacts on Canada's arts industry, on theatre, on musicians and on the tech people, the highly skilled tech people who went over two years without working. We really need to address this. One of the areas where they have been so undermined is online.

Let us talk about Spotify. It is basically a criminal network in terms of robbing artists blind. The number of sales one needs to have on Spotify to pay a single bill is so ridiculous that no Canadian artist could meet it.

We have streaming services that are making record fortunes. Therefore, it is a reasonable proposition to say that they are making an enormous amount of profit and they have a market where they do not have any real competition, so some of that money, and this was always the Canadian compromise, needs to go back into the development of the arts so that we can continue to build the industry.

The one thing I have also come to realize is that what the digital realm gives us and what streaming services give us is the ability to compete with our arts internationally on a scale that we never had before, if we are actually investing. Let us not look at it in a parochial manner, like what was done with the old broadcasters, where it was one hour on prime time a week they had to have a Canadian show on. Let us actually invest so that we can do the foreign deals. Why is it I can watch an incredible detective show from Iceland on Netflix, yet people in Iceland are not seeing an incredible detective show from Canada?

This is what we need to be doing. This is a reasonable position to take. With the profits that Facebook and Google are making, they can pay into the system. That is simple. They have unprecedented market share.

I will go to the second point, which is dealing with the tech giants. It is something I worked on in 2018. Our all-party parliamentary committee came up with numerous recommendations. I have to speak as a recovering digital utopian because there was a time when I believed that when we let all these platforms come, if we stood back and did not put any regulations on them, they would create some kind of new market promised land, but what we saw was that those dudes from Silicon Valley who were making YouTube in their parents' garage morphed into an industrial power that is bigger than anything we have ever seen.

There is a term, "kill zone of innovation", where these companies have become so rich, so powerful and have such unprecedented corporate strength that it dwarfs anything we have ever seen in the history of capitalism, companies like Facebook. When Facebook gets a $5-billion fine, it does not even blink. It does not bother it. When the Rohingya are launching a 150-billion U.K. pound lawsuit for the mass murder caused by the exploitation of Facebook's platform, we realize we are dealing with companies that are so far beyond reach that they do not believe domestic law applies to them. There has to be some level of obligation. I have worked with international parliamentarians in London, and there were meetings in Washington, trying to see how we can address this unprecedented power.

There is one thing that changed fundamentally when we saw the growth of this power. There used to be a principle that the telecoms would always cite to parliamentarians, which was that they should not be blamed for what is in the content because, as they say, the pipes are dumb: they just send out the content and people choose. However, people do not choose the content on Facebook and YouTube, because of the algorithms. It is the algorithms that make those companies culpable and responsible.

I refer everyone to Congresswoman Carolyn Maloney, who demanded Facebook explain how many of these stolen bot pages were driving misinformation during the convoy crisis here in Ottawa. Congresswoman Maloney wrote that "Facebook's history of amplifying toxic content, extremism, and disinformation, including from Russia and other foreign actors" is well known. It is no wonder that some members on the Conservative backbench are so defensive about this bill. My God, this is their main source of news. What are they going to do if we start dealing with bot pages that they think are something that came down from the promised land?

As parliamentarians, we have an obligation to address bot accounts. We have an obligation to hold these companies to account. What does that mean? Number one, it is about algorithm accountability. I do not care what someone watches on Facebook or YouTube, that is their business, but if the algorithm is tweaked to show people what they would not otherwise see, Facebook is making decisions for them.

I would refer my colleagues to Tristan Harris, the great thinker on digital technology. He spoke to the committee in 2018 and said, “Technology is overwriting the limits of the human animal. We have a limited ability to hold a certain amount of information in our head at the same time. We have a limited ability to discern the truth. We rely on shortcuts” like thinking what that person says is true and what that person says is false. However, what he says about the algorithm is that the algorithm has seen two billion other people do the same thing, and it anticipates what they are going to do so it starts to show people content. What they have learned from the business model of Facebook and YouTube is that extremist content causes people to spend more time online. They are not watching cat videos. They are watching more and more extremist content. There is actually an effect on social interaction and on democracy. That is not part of this bill.

What the all-party committee recommended was that we needed to address the issue of algorithmic accountability and we needed to address the issue of the privacy rights of citizens to use online networks without being tracked by surveillance capitalism. With this bill, we need to ensure that these tech giants, which are making unprecedented amounts of money, actually put some money back into the system so that we can create an arts sector that can compete worldwide.

Carole Piovesan

The extra-jurisdictional enforcement of these types of decisions is very difficult. We've seen this raised by courts before. We draw inspiration from the General Data Protection Regulation out of the EU that is starting to impose very significant fines, not for actual activity in the European jurisdiction, but for the use of European data subjects—the use of data of European residents.

Opportunities to extend jurisdiction and enforcement are being very much explored. We've seen this in Quebec, absolutely, with the passing of new private sector reform of the privacy law. It is certainly a consideration that we saw in the old Bill C-11, which was to reform aspects of PIPEDA. We'll see what comes out of the new reform, when and if it comes.

February 28th, 2022 / 11:30 a.m.



Professor of Law, University of Ottawa and Canada Research Chair in Internet and e-Commerce Law, As an Individual

Dr. Michael Geist

Yes, on the issue of the tribunal, there was some opposition to that. A tribunal was proposed in Bill C-11. I actually had less of a problem with it. I thought that as long as it was an expert tribunal—which unfortunately Bill C-11 did not have; it had a mandate that one of the tribunal members have privacy experience, and I would think that if it's going to be authoritative, it needs to be a true expert tribunal in this area—there might well be value.

I recognize that the Privacy Commissioner has voiced some opposition to that, but I think that at a minimum we need to get a piece of legislation on the table. We can talk about what that administration looks like through committee study, but we're not even getting out of the gate on this issue.

February 28th, 2022 / 11:25 a.m.



Professor of Law, University of Ottawa and Canada Research Chair in Internet and e-Commerce Law, As an Individual

Dr. Michael Geist

I think this represents one of the really exceptional challenges. We started to see that considered in the former Bill C-11, which included references to potential consent or potential rules even around de-identified data, and so—

Michael Geist Professor of Law, University of Ottawa and Canada Research Chair in Internet and e-Commerce Law, As an Individual

Thank you very much, Chair.

Good morning. My name is Michael Geist. I'm a law professor at the University of Ottawa, where I hold the Canada research chair in internet and e-commerce law, and I'm a member of the Centre for Law, Technology and Society. I appear in a personal capacity, representing only my own views.

I'd like to thank the committee for the invitation to appear on this issue, which represents an exceptionally thorny privacy challenge. I recognize that some of your witnesses have brought differing perspectives on the legality and ethics of this collection and use of mobile data.

From my perspective, I'd like to start by noting three things. First, ensuring that the data was aggregated and de-identified was a textbook approach to how many organizations have addressed their privacy obligations—namely, by de-identifying data and placing it outside the scope of personally identifiable information that falls within the law. Second, the potential use of the data in the midst of a global pandemic may well be beneficial. Third, it does not appear that there's a violation of the law, because the data itself was aggregated and de-identified. The public notice may not have been seen by many, but that, too, is not uncommon.

I think this creates a genuine privacy quandary. The activities were arguably legal, and the notice met the low legal standard. Telus, I think, is widely viewed as seeking to go beyond even the strict statutory requirements, and the project itself had the potential for public health benefits.

Now, there could have been improvements. The Privacy Commissioner of Canada, I think, should have been more actively engaged in the process, the public notification should have been more prominent, and there should have been opportunities—and should still be opportunities—for opting out, but I'm not entirely convinced that these steps would have changed very much.

The OPC would surely have pushed for more prominent notification and some assurances on the de-identification of the data, but it seems likely that the project would still have continued. Similarly, better notices would have benefited the few Canadians who paid attention, but I think we can recognize that it's a fiction to suggest that there are millions actively monitoring privacy policies or similar web pages for possible amendments. Yet, despite all of these factors, something doesn't sit right with many Canadians.

I believe the foundational problem that the incident highlights is that our laws are no longer fit for purpose and are in dire need of reform. It's not that I think we need laws that would ban or prohibit this activity. Again, most recognize the potential benefits. Rather, we need laws that provide greater assurances that our information is protected and will not be misused, that policies are transparent and that consent is informed. That doesn't come from baking in broad exceptions under the law that permit the activity because the law doesn't apply. Instead, it means updating our laws so that they contemplate these kinds of activities and provide a legal and regulatory road map for how to implement them in a privacy-protected manner. The need for reform applies to both the Privacy Act and PIPEDA.

With respect to the Privacy Act, there have been multiple studies and successive privacy commissioners who have sounded the alarm on legislation that is viewed as outdated and inadequate. Canadians rightly expect that the privacy rules that govern the collection, use and disclosure of their personal information by the federal government will meet the highest standards. For decades, we've failed to meet that standard.

The failure to engage in meaningful Privacy Act reform may be attributable in part to the lack of public awareness of the law and its importance. The Privacy Commissioner has played an important role in educating the public about PIPEDA and broader privacy concerns. The Privacy Act needs to include a similar mandate for public education and research.

With respect to PIPEDA, I would need far more than five minutes to identify all of the potential reforms. Simply put, the issue has inexplicably been placed on the back burner. Despite claims that it was a priority, the former Bill C-11 was introduced in November 2020 and there was seemingly no effort to even bring it to committee. The bill attracted some criticism, but this isn't rocket science. If Canada is looking for a modernized privacy law and wishes to meet international standards, the starting point is the European Union's GDPR.

Notwithstanding some of the recent scare tactics from groups such as the Canadian Marketing Association, the reality is that GDPR is widely recognized as the standard. Global multinationals are familiar with its obligations. There are innovative rules that seek to address the emerging digital challenges, and there are tough enforcement powers and penalties. There's room to tweak the rules for Canada, but we should not let the perfect be the enemy of the good.

Modernized privacy rules are not some theoretical exercise. As this recent event demonstrates, failing to implement those rules leaves Canada in a difficult position, with potential conflicting rules at the provincial level, compliance strategies that may still undermine public trust, and policy implementation choices that fail to maximize the benefits that can come from better data—

February 17th, 2022 / 5:15 p.m.



Vice-President, Chief Data and Trust Officer, Telus Communications Inc.

Pamela Snively

In the context of de-identified information, the focus should be on the actual core privacy protections. I think Bill C-11 started down this path that we can do more things with de-identified data, and perhaps the space it had for codes of practice would be a great place to put some of the standards we were just talking about. How can we get comfortable that we're all talking about the same thing around de-identification and raise that standard?

I think the most important thing, as I said earlier, is not to rely on consent, because we're talking about de-identified information, but to rely on absolutely substantial privacy controls that are in place regardless of the choices or selections. We know that choices and selections are challenging, so let's just get it right.