Evidence of meeting #49 for Access to Information, Privacy and Ethics in the 42nd Parliament, 1st Session. (The original version is on Parliament's site, as are the minutes.)

A recording is available from Parliament.

Also speaking

Michael Karanicolas  Senior Legal Officer, Centre for Law and Democracy
Teresa Scassa  Full Professor, University of Ottawa, Canada Research Chair in Information Law, As an Individual
Florian Martin-Bariteau  Assistant Professor, Common Law Section, Faculty of Law, and Director, Centre for Law, Technology and Society, University of Ottawa, As an Individual

3:30 p.m.

Conservative

The Chair Conservative Blaine Calkins

Good afternoon, colleagues.

I know that many of us are anxious, as this is the last week of four before we go home for a constituency break week, but we have with us some very distinguished panellists to help us in the deliberations on our current study, which is on the Personal Information Protection and Electronic Documents Act, more affectionately known by Canadians on a daily basis as PIPEDA.

From the Centre for Law and Democracy, we are joined once again by Mr. Michael Karanicolas, a senior legal officer, by video conference.

It's good to see you again, Mr. Karanicolas.

As an individual, we again have joining us Teresa Scassa, a full professor at the University of Ottawa.

Thanks, Teresa, for joining us again. It's always a pleasure to have you here.

For the first time ever appearing before the committee, in his debut game—I mean, debut “appearance”—we have Florian Martin-Bariteau, assistant professor with the common law section of the Faculty of Law and the director of the Centre for Law, Technology and Society at the University of Ottawa.

As we normally do in this committee, we'll have a 10-minute opening statement from each of you. We'll simply go in the order in which I introduced you. I think everybody here is familiar with how this happens.

We'll start with you, Mr. Karanicolas. You have up to 10 minutes, please.

3:30 p.m.

Michael Karanicolas Senior Legal Officer, Centre for Law and Democracy

Thanks to the committee for your invitation to appear again.

I'd like to start by offering my congratulations to the standing committee for their recommendations to reform the Privacy Act, which were published late last year and which I thought were excellent.

It is, I believe, fairly clear that the current consent-based model of privacy protection is broken. The core dynamic that underlies this model and that drives much of the digital economy is that users may choose to trade their personal information for services. There are undeniable benefits to this model, which has assisted in the rapid spread of the Internet by lowering costs of entry. However, this dynamic relies on meaningful consent, which in turn requires at least a nominal understanding by the contracting party of what they're signing on to. In fact, virtually nobody reads their terms of service agreements, a state of affairs that significantly undermines the legitimacy of the consent obtained.

The OPC report points in part to the length of these agreements and the frequency with which they're presented to users as a cause of this lack of understanding, but it's also worth noting that these agreements are often drafted in a highly convoluted, confusing, and even self-contradictory manner that even technically and legally trained people struggle to understand. There's a vicious cycle at work. The fact that very few users read these agreements or use their substance as a basis for accepting or declining a service gives companies licence, and indeed an incentive, to draft them incredibly broadly. This drafting style and the lack of accessibility further depress engagement with the agreements by their signatories, and so the cycle continues.

It's also worth noting that the company that presents the agreement and offers a service may often be distinct from the ones that actually collect and process the information. Third party data brokers play an increasingly common role in the Internet's ecosystem. A 2014 study showed that of the 950,000 most popular websites, 88% of them automatically shared visitor information with third parties, an average of 9.5 different third parties per website. The vast majority of this tracking is carried out surreptitiously, with only 2% of third parties including a visible prompt alerting users to their presence.

There's a clear problem here. However, it's important to try to look for solutions that will not derail the current digital economy. Although there are pros and cons to a system where personal information is used as a major currency by which online services are procured, potential avenues forward should be crafted with an eye to maintaining the tremendous benefits that Internet access provides.

One solution, which we strongly support, is to boost the quality of consent by improving the information available to users. Better practice here may include publishing a summary or explanatory guide of the terms of service alongside the full legal version, ensuring that the agreement is easily available for review, and clearly notifying users when a substantial change to the terms of service has been made.

The OPC has an important role to play here: to promote better practice in terms of clarity and accessibility of terms of service agreements, and to audit existing agreements for their clarity and accessibility, as well as their accuracy against how information is actually collected and processed. In addition to these steps, we support the proposal to make opt-in consent the default required approach.

The move to expand transparency is another important factor in boosting the quality of consent, allowing people to look under the hood of the services and platforms they use. This may include, for example, a right to request an explanation of how their personal information has been used to customize their online experience, or what factors went into a particular decision by the company that they were subject to. However, while there is substantial room to improve the quality of user engagement and of consent, these improvements alone are not sufficient to safeguard the privacy rights of Canadians. The CLD supports the creation of clearly defined no-go zones, as well as proceed-with-caution zones, as mentioned in the OPC report. One important area to consider here is the need for greater clarity on how information can be transferred out to third parties or resold, and what rules should govern these external uses. Broader investigative powers by the OPC are also needed to promote good practice in terms of information management and security.

In terms of the de-identification or anonymization of information, while I think it should certainly be encouraged, it is not a panacea for the current privacy concerns. I would add to the commentary contained in the OPC's report by noting that as anonymization gets stronger, the commercial value of information can often decline, giving businesses an incentive to pursue incomplete solutions. Moreover, the fact that information has been, quote-unquote, anonymized may create a false sense of security, prompting companies to be less vigilant in safeguarding it and consumers to assume that threats to privacy have been nullified.

I also want to speak briefly about reputation and privacy and the right to be forgotten.

The Internet's transformative impact on our social functions has made a person's online footprint a vital aspect of his or her identity. However, the permanence and increased accessibility of online information has led to concerns from some about the Internet's impact on privacy and reputation.

There are benefits to making people's pasts more accessible. A Holocaust museum, for example, has a legitimate interest in knowing if a person it is considering for a job has a history of making racist comments. However, we are also a society that believes in giving people second chances. There can be problems with how the digital records present themselves, such as where a decision by a prosecutor to drop charges may not generate as much coverage as the initial arrest, or where an erroneous and sensational media report may attract more attention than a later retraction.

However, experiences in Europe with the right to be forgotten should be viewed as a cautionary tale about what not to do. Namely, any move to develop a right to be forgotten should be grounded in clear and limited definitions of how it applies, strong transparency, and robust due process. I will address each of these in turn.

First, the application of a right to be forgotten requires a careful balancing of freedom of expression, privacy, and the right to information. Any such balancing will have to be based on a clear test to determine where the public interest lies. People have never had a right to control or curate their reputations. Any move to create a right to be forgotten should be aimed only at the novel aspects of reputation that have come about as a result of the Internet and should be reserved for significant and demonstrably unfair circumstances, such as when a person has been wrongly arrested.

Second, transparency is a key ingredient, including making available detailed information about how decision-making processes work and how they have been applied. There should be as much information as can be provided, short of undermining the efficacy of the processes themselves.

Third, as with any restriction on freedom of expression, due process is critically important. Search engines are simply not equipped to engage in this careful balancing of rights, and unfortunately have an incentive under the current European system to err on the side of removing the information without providing the careful due process such a tricky issue should warrant. Any order to remove material or to reduce its accessibility should be left in the hands of a court or a quasi-judicial authority, including careful due process considerations.

I want to emphasize that none of the above should be interpreted as an endorsement of the right to be forgotten. Indeed, there is a strong argument to be made that the present reputational challenges will sort themselves out over time, as people will gradually become inured to the preponderance of embarrassing or unpleasant information out there and will learn to take such information with a pinch of salt. However, insofar as the right to be forgotten is being considered, it is important that we not repeat the widely criticized mistakes of the Court of Justice of the European Union in how it handled the matter.

I look forward to your questions in the discussion.

3:40 p.m.

Conservative

The Chair Conservative Blaine Calkins

Thank you very much, Mr. Karanicolas.

We now go to Ms. Scassa, please, for up to 10 minutes.

3:40 p.m.

Prof. Teresa Scassa Full Professor, University of Ottawa, Canada Research Chair in Information Law, As an Individual

Thank you for the invitation to meet with you today and to contribute to your discussion on PIPEDA. I'm a professor at the University of Ottawa in the Faculty of Law, where I hold the Canada research chair in information law. I'm appearing in my personal capacity.

We're facing what might be considered a crisis of legitimacy when it comes to personal data protection in Canada. Every day we hear new stories in the news about data hacks and breaches, and about the surreptitious collection of personal information by the devices in our homes and on our persons that are linked to the Internet of things. There are stories about how big data profiling impacts the ability of individuals to get health insurance, obtain credit, or find employment. There are also concerns about the extent to which state authorities access our personal information that is in the hands of private sector companies. PIPEDA, as it currently stands, is inadequate to meet these challenges.

My comments are organized around the theme of transparency. Transparency is fundamentally important to data protection and has always played an important role under PIPEDA. At a basic level, transparency means openness and accessibility. In the data protection context, it means requiring organizations to be transparent about the collection, use, and disclosure of personal information, and it means that the commissioner also must be transparent in his oversight functions under the act.

I'm going to also argue that it means that state actors, including law enforcement and national security organizations, must be more transparent about their access to and use of the vast stores of personal information in the hands of private sector organizations.

Under PIPEDA, transparency is at the heart of the consent-based data protection scheme. It's central to the requirement for companies to make their privacy policies available to consumers and to obtain consumer consent to the collection, use, or disclosure of personal information, yet this type of transparency has come under significant pressure and has been substantially undermined by technological change on the one hand, and by piecemeal legislative amendment on the other.

The volume of information that's collected through our digital, mobile, and online interactions is enormous, and its actual and potential uses are limitless. The Internet of things means that more and more of the devices that we have on our person and in our homes are collecting and transmitting information. They may even do so without our awareness, and they often do so on a continuous basis. The result is that there are fewer clear and well-defined points or moments at which data collection takes place, making it difficult to say that notice was provided and that consent was obtained in any meaningful way.

In addition, the number of daily interactions and activities that involve data collection has multiplied beyond the point at which we are capable of reading and assessing each individual privacy policy. Even if we did have the time, privacy policies, as was just mentioned, are often so long, complex, and vague that reading them does not provide much of an idea of what's being collected and shared, with or by whom, or for what purposes.

In this context, consent has become a bit of a joke, although unfortunately the joke is largely on consumers. The only parties capable of saying that our current consent-based model still works are those that benefit from consumer resignation in the face of this ubiquitous data harvesting.

The Privacy Commissioner's recent consultation process on consent identifies a number of possible strategies to address the failures of the current system. There is no quick or easy fix, no slight changing of wording that will address the problems around consent. This means that on the one hand there need to be major changes in how organizations achieve meaningful transparency about their data collection, use, and disclosure practices, and there must also be a new approach to compliance that gives considerably more oversight and enforcement powers to the commissioner. The two changes are inextricably linked.

The broader public protection mandate of the commissioner requires that he have necessary powers to take action in the public interest. The technological context in which we now find ourselves is so profoundly different from what it was when this legislation was enacted in 2001 that to talk of only minor adjustments to the legislation ignores the transformative impacts of big data and the Internet of things.

A major reworking of PIPEDA may be well overdue, in any event, and it might have important benefits that go beyond addressing the problems of consent. I note that if one were asked to draft a statute as a performance art piece that evokes the problem with incomprehensible, convoluted, and contorted privacy policies and their effective lack of transparency, then PIPEDA would be that statute. As unpopular as it might seem to suggest that it's time to redraft the legislation so that it no longer reads like the worst of all privacy policies, this is one thing this committee should consider.

I make this recommendation in a context in which all of those who collect, use, or disclose personal information in the course of commercial activity, including a vast number of small and medium-sized businesses with limited access to experienced legal counsel, are expected to comply with the legislation. In addition, the public ideally should have a fighting chance of reading the statute and understanding what it means in terms of the protection of their personal information and their rights of recourse. As it's currently drafted, PIPEDA is a convoluted mishmash in which the normative principles are not found in the law itself, but rather are tacked on in a schedule.

To make matters worse, the meaning of some of the words in the schedule, as well as the principles contained therein, are modified by the statute, so that it's not possible to fully understand rules and exceptions without engaging in a complex connect-the-dots exercise. After a series of piecemeal amendments, PIPEDA now consists in large part of a growing list of exceptions to the rules around collection, use, or disclosure with consent. While the OPC has worked hard to make the legal principles in PIPEDA accessible to businesses and to individuals, the law itself is not accessible.

In a recent PIPEDA application involving an unrepresented applicant—and most applicants who appear before the Federal Court under PIPEDA are unrepresented, which I think is another issue with the act—Justice Roy of the Federal Court expressed the opinion that for a party to “misunderstand the scope of the Act is hardly surprising”.

I've already mentioned the piecemeal amendments to PIPEDA over the years, as well as concerns about transparency. In this respect, it's important to note that the statute has been amended so as to increase the number of exceptions to consent that would otherwise be required for the collection, use, or disclosure of personal information.

For example, paragraphs 7(3)(d.1) and (d.2) were added in 2015. They permit organizations to share personal information between themselves for the purposes of investigating breaches of an agreement or actual or anticipated contraventions of the laws of Canada or a province, or to detect or suppress fraud. While these are important objectives, I note that no transparency requirements were created in relation to these rather significant powers to share personal information without knowledge or consent. In particular, there's no requirement to notify the commissioner of such sharing. The scope of these exceptions creates a significant transparency gap that undermines personal information protection. This should be fixed.

PIPEDA also contains exceptions that allow organizations to share personal information with government actors for law enforcement or national security purposes without the notice or consent of the individual. These exceptions also lack transparency safeguards. Given the huge volume of highly detailed personal information, including location information, which is now collected by private sector organizations, the lack of mandatory transparency requirements is a glaring privacy problem.

The Department of Innovation, Science and Economic Development has created a set of voluntary transparency guidelines for organizations that choose to disclose the number of requests they receive and how they deal with them. It's time for there to be mandatory transparency obligations around such disclosures, whether it be public reporting or reporting to the commissioner, or a combination of both. Also, that reporting should be by both private and public sector actors.

Another major change that is needed to enable PIPEDA to meet the contemporary data protection challenges relates to the powers of the commissioner. When PIPEDA was enacted in 2001, it represented a fundamental change in how companies were to go about collecting, using, and disclosing personal information. This major change was made with great delicacy. PIPEDA reflects an “ombuds” model that allows for a light touch with an emphasis on facilitating and cajoling compliance, rather than imposing and enforcing it. Sixteen years later, and with exabytes of personal data under the proverbial bridge, it's past time for the commissioner to be given a set of new tools to ensure an adequate level of protection for personal information in Canada.

First, the commissioner should have the authority to impose fines on organizations in circumstances where there has been substantial or systemic non-compliance with privacy obligations. Properly calibrated, such fines can have an important deterrent effect that is currently absent from PIPEDA. They also represent transparent moments of accountability that are important in maintaining public confidence in the data protection regime.

The tool box should also include the power for the commissioner to issue binding orders. I'm sure you're well aware that the commissioners in Quebec, Alberta, and British Columbia already have such powers. As it stands, the only route under PIPEDA to a binding order runs through the Federal Court, and then only after a complaint has passed through the commissioner's internal process. This is an overly long and complex route to an enforceable order, and it requires an investment of time and resources that places an unfair burden on individual complainants.

I note as well that PIPEDA currently does not provide any guidance as to damage awards. The Federal Court has been extremely conservative in damage awards for breaches of PIPEDA, and the amounts awarded are unlikely to have any deterrent effect other than to deter individuals who struggle to defend their personal privacy. Some attention should be paid to establishing parameters for non-pecuniary damages under PIPEDA. At the very least, these will assist unrepresented litigants in understanding the limits of any recourse that's available to them.

Thank you. I welcome any questions.

3:45 p.m.

Conservative

The Chair Conservative Blaine Calkins

Thank you very much, Ms. Scassa.

We now go to Mr. Martin-Bariteau, please, for up to 10 minutes.

3:45 p.m.

Prof. Florian Martin-Bariteau Assistant Professor, Common Law Section, Faculty of Law, and Director, Centre for Law, Technology and Society, University of Ottawa, As an Individual

Thank you, Mr. Chair.

I would like to thank you for this opportunity to contribute to your work on the review of the Personal Information Protection and Electronic Documents Act, PIPEDA, and for the chance to share my thoughts with you on an issue of importance to Canadians.

I am an Assistant Professor of Law and Technology at the Common Law Section, Faculty of Law of the University of Ottawa, where I teach Digital Economy Law, and am the Director of the Centre for Law, Technology and Society. Nonetheless, I appear before you today in my personal capacity.

My comments build on the letter sent to you by the Commissioner last December 2. I will focus on the issues of enforcement powers and reputation, then move to the scope of the act, before concluding with some reflections on its accessibility and readability.

Throughout my presentation, I will draw on the European Union's new General Data Protection Regulation, the GDPR, particularly because of the adequacy issues raised by the Commissioner.

As to the enforcement powers, I believe it is essential to strengthen the Commissioner's powers in order to ensure the effectiveness of the act, in particular by granting the Commissioner order-making powers and the authority to impose administrative monetary penalties. The ability to impose fines appears to be the most effective way to ensure protection.

As with everything, the protection of personal information is subject to a cost-benefit analysis. Right now, the choice is between investing in protection by design and accepting the possibility of a slap on the wrist. With the risk of monetary penalties, the cost-benefit analysis will favour a protection-by-design approach. Obviously, the amount of the fine will be a critical parameter for its effectiveness—a prohibitive amount is required. For example, while a $500,000 fine may seem significant—and it will be for small and medium-sized businesses—it will be an insignificant amount for companies like Amazon, Facebook, or Google. In that respect, it was by imposing a $22.5-million fine that the U.S. Federal Trade Commission succeeded in getting Google to modify its DoubleClick advertising program.

In order to prove effective against the big players, the maximum fine should be specified as a percentage of worldwide turnover—for example, 1%. To ensure that the fine is not derisory for small and medium-sized enterprises, a second limit should be provided—for example, $500,000—with the greater of the two applying. Incidentally, the GDPR is based on such a mixed approach.

In my view, this does not threaten the collaborative relationship between operators and the Commissioner. On the contrary, I am of the opinion that strengthened powers will encourage greater co-operation among actors before any damage occurs. Moreover, such powers seem necessary to obtain an adequacy decision under the GDPR.

In order to avoid the appearance of conflicts of interest, fines should be made payable to the Receiver General. To protect small businesses and not slow down innovation, we could provide a procedure for a preliminary conformity assessment. In the event of harm, sanctions would be imposed only after an issued recommendation has not been acted upon within a reasonable time.

Finally, I am of the view that none of the Commissioner's powers, including the powers to issue orders and impose sanctions, should be conditional on the receipt of a formal complaint—all of these powers evidently remaining subject to possible judicial review.

As to the rights of individuals and online reputation, many favour the creation of a “right to be forgotten”. In the way it is imagined and requested by some, I find this proposition dangerous. The Internet is the archives and the libraries of tomorrow, the new collective memory. Archives have never previously been erased because they were disturbing—at least, not legally in a democracy. This is dangerous ground, and it is similarly dangerous to delegate censorship powers to private actors or to give a select few the power to decide what should or should not be accessible. In the same vein, the right to de-index seems illogical to me, in that it would entail the removal of the index entry, but not the content itself.

Legislation protecting personal information should not be used as a reputation management tool to remove what is embarrassing, but only to remove what is unjustified or inaccurate. Otherwise, I am not sure that such a mechanism would satisfy the Charter test.

The real problem with current Canadian law is that PIPEDA recommends, but does not require, the erasure of inaccurate or unnecessary data. Admittedly, in its recent and already famous Globe24h decision, the Federal Court worked around this deficiency by relying on the illegitimate and unauthorized nature of the disclosure.

Nevertheless, the erasure of data should be compulsory—and not simply recommended—once it is no longer necessary or accurate through stricter controls of the retention of data over time. One could also provide for an actionable right of erasure of outdated and inaccurate information. I should point out that this need does not only relate to the Internet, but to all databases, computerized or not.

It seems to me that these amendments are necessary—but also sufficient—for GDPR adequacy.

As to the scope, Canadians should be assured that any harmful collection, use, or disclosure of data is subject to strict standards of protection.

The scope of the two federal statutes does not meet citizens' expectations of protection in a global and interconnected world, both in terms of the data protected and, in particular, the organizations covered.

On the organizational side, one solution would be to redefine the scope of PIPEDA so that it applies to all organizations operating under federal jurisdiction that are not covered by the public sector act or any other federal law. Of course, and by analogy with our partners, the law should retain exemptions for personal or journalistic use.

As to the issue of access to the law, while it is undeniable that the act requires modification in view of new realities, the legislator must seize the opportunity of this reform to perform a complete overhaul of the law, instead of making simple amendments.

Indeed, PIPEDA undoubtedly belongs in the hall of fame of the worst-drafted federal laws—and we know that, in that matter, there is some competition. The cornerstone of PIPEDA lies in a schedule copied and pasted from a document drafted by a private standardization organization. The act only supplements this document and other schedules by making constant references to them.

This poses a problem in terms of the public's access to the law. A rewrite of the act, clearly explaining the rights and obligations of everyone, would therefore be welcome—especially to make mandatory all that is presently only recommended.

In terms of drafting, the act should remain technologically neutral and principles-based. Such an approach is essential to enable the Canadian legal framework to adapt to future social and technological changes, including the development of robotics, the Internet of things, and artificial intelligence.

In terms of readability, limiting the legislation to the protection of personal information would be welcome. The rules on functional equivalence for electronic documents do not belong there and should be moved elsewhere.

Conversely, it would be desirable for a single act to contain the entire framework for the protection of personal information, that is, for both the private and public sectors. The concomitant reconsideration of these two acts by this committee offers that opportunity. This would also allow for the creation of a coherent framework for both the protection of personal information and the role of the Commissioner—even if that means providing separate parts should it be considered necessary to maintain a public sector exemption regime.

As a final thought, I would like to draw your attention to the need to provide statutory rights of action and damages. Equally, I would like to underline that it is necessary to update our law in order to satisfy the GDPR's adequacy test, but that we must nevertheless consider two important factors: first, that the test does not require a carbon copy of the GDPR, and second, that it applies to all of our protection frameworks, not just PIPEDA.

I hope that these few thoughts and recommendations will be useful to the committee. Unfortunately, I was not able to finalize in time a short bilingual brief with examples and recommendations. However, I can send it to you afterwards.

Thank you. I'll be happy to answer any questions that you may have.

3:55 p.m.

Conservative

The Chair Conservative Blaine Calkins

Thank you very much.

Colleagues, we'll now proceed to the seven-minute round.

Our opening time allotment goes to Mr. Bratina, please.

February 23rd, 2017 / 3:55 p.m.

Liberal

Bob Bratina Liberal Hamilton East—Stoney Creek, ON

Thank you very much.

Thank you to all.

Ms. Scassa, I loved your phrasing that it was a “convoluted mishmash”, a “piecemeal” document. Is it because of the dynamic nature of law-making, where the technology has been evolving and they've been adding things on that makes the whole thing unworkable, in the end?

3:55 p.m.

Full Professor, University of Ottawa, Canada Research Chair in Information Law, As an Individual

Prof. Teresa Scassa

I think it's really due to the legislative history of the statute. It arose at a time when there was a need to put legislation in place quickly. Europe had just passed its first data protection directive, and there were concerns about cross-border flows of data. We're in a similar situation again.

It was clear that we needed legislation. There wasn't a lot of comfort with legislation. It was decided that if it were built around the CSA model code, there would be a greater acceptance of it, both here and south of the border, in terms of the obligations it imposed on businesses.

The normative core is the CSA model code, which is in the schedule. In the legislation itself, all the exceptions and modifications are found, as are the enforcement powers and so on. For an ordinary individual who is trying to work his or her way through the statute, it's not intuitive. It's not easy to find. As amendments get made, the interaction between the two documents becomes even more complicated.

I think it's in large part due to that history that we have the legislation we have. I think we're mature enough now in our evolution in terms of our data protection that we can walk away from that and fix the statute.

4 p.m.

Liberal

Bob Bratina Liberal Hamilton East—Stoney Creek, ON

We've heard in testimony about the European Union's general data protection regulation. Relative to it, PIPEDA falls short on the right to be forgotten. I'm going to ask Mr. Karanicolas about this, because he made a comment, but I'll ask you first.

Did you review the European Union's general data protection regulation, and how do you feel about that compared to what we have?

Ms. Scassa, I'll ask you first.

4 p.m.

Full Professor, University of Ottawa, Canada Research Chair in Information Law, As an Individual

Prof. Teresa Scassa

On the right to be forgotten, I would draw a distinction between the right to be forgotten, which is talked about a great deal in the context of a particular court decision in the European Union, and the right to erasure, which I think is more what is present in the data protection directive. I think those are very different things.

The right to be forgotten, in a sense, goes so far as to talk about what search engines have to delist, so it affects how you search and how you find information on the Internet. That is very different from the right of people who no longer want a company they perhaps dealt with in the past, and which collected their personal information, to retain that information, because they no longer wish to deal with that company. They're asking to have that personal information removed and no longer dealt with.

The right to be forgotten and the right to erasure are very different things. The right to erasure seems to me to fall within the scope of PIPEDA, whereas the right to be forgotten goes beyond it, and I think, as my colleague pointed out, it implicates freedom of expression rights.

4 p.m.

Liberal

Bob Bratina Liberal Hamilton East—Stoney Creek, ON

Mr. Karanicolas, is it possible to track how data moves around? I made this comment the other day about offshore havens for money. Can there be offshore havens of data, where things can be slipped over to other servers and hidden away and used at the pleasure of those people? With the current technology, can you follow that data as it travels around?

4 p.m.

Senior Legal Officer, Centre for Law and Democracy

Michael Karanicolas

If I understand you correctly, it's possible to impose data localization rules. Different governments have experimented in different ways with those kinds of requirements.

That's not necessarily something I would recommend for Canada, but it is possible to control how information is routed. Those kinds of controls tend to raise significant concerns about the functionality and operability of the Internet as a whole, which is designed to allow information to flow by the most efficient route.

I'm not sure if I'm answering your question, or if you're—

4 p.m.

Liberal

Bob Bratina Liberal Hamilton East—Stoney Creek, ON

To me, as technology evolves so rapidly, we're trying to set in stone wording for legislation that may miss the next feature of technology, which would allow it to side-swipe that, if you will.

4 p.m.

Senior Legal Officer, Centre for Law and Democracy

Michael Karanicolas

Okay.

Technological neutrality is an admirable goal to aim for. I think what the drafters of PIPEDA were originally aiming for was to try to keep it as neutral as possible, as far as I understand it. Whether they succeeded is a different question. I think there have been some fair points brought up by my colleagues, particularly about how the Internet of things has so dramatically changed the way information is being collected. It has opened up all these new avenues that are vastly beyond what was conceived at the time PIPEDA was drafted.

As a general rule, I think that technological neutrality in legislation is a good thing to aim for. In crafting a new law or in revising a law, one should aim to avoid falling into that kind of pitfall as far as possible.

4 p.m.

Liberal

Bob Bratina Liberal Hamilton East—Stoney Creek, ON

Can I ask you about the comments you made with regard to the European Union's right to be forgotten, etc.? You didn't seem to be too supportive of its approach. We've heard the opposite testimony.

4 p.m.

Senior Legal Officer, Centre for Law and Democracy

Michael Karanicolas

The right to be forgotten is not something I strongly oppose. I'm sort of undecided on that issue specifically. I see arguments either way as to whether some sort of right is potentially a good idea, because I do see a problem and I do see a change in the way information is recorded, which the Internet has wrought.

That being said, there are huge problems with the way it's been rolled out in Europe, partly because the decision, when the European Court of Justice first handed it down, didn't provide a huge amount of clarity on how it should be applied. It provided vastly broad categories for what could be susceptible to the right to be forgotten, which led to a huge amount of confusion. I think the last I saw, something like 150,000 or 170,000 websites had been taken down as a result of that. Huge numbers of applications have been made.

I see problems with the way it has been rolled out in terms of a lack of clarity. I also question the wisdom of bundling it with the search engines themselves. As private sector actors, they're not well equipped to engage in that kind of balancing. When you impose this kind of potential for liability on them, without their necessarily having the proper processes in place to respect the freedom of expression interests that are engaged, what you end up with is a tendency to remove information whenever there's a complaint. That's a problematic approach.

4:05 p.m.

Conservative

The Chair Conservative Blaine Calkins

Thank you very much.

We now move to Mr. Jeneroux, please, for seven minutes.

4:05 p.m.

Conservative

Matt Jeneroux Conservative Edmonton Riverbend, AB

Thanks to all three of you—two of you for coming back—and to you, Mr. Martin-Bariteau, on your first time here, welcome.

My first question was going to be about what all of you think of PIPEDA, but it's pretty clear. Nobody here in the room is too pleased with it, so I'll move on to my second question.

In terms of order-making powers, Ms. Scassa, you made your thoughts clear in the last answer, but could we get the other two gentlemen on the record in terms of their thoughts on providing the Privacy Commissioner order-making powers?

You're first, Mr. Martin-Bariteau.

4:05 p.m.

Assistant Professor, Common Law Section, Faculty of Law, and Director, Centre for Law, Technology and Society, University of Ottawa, As an Individual

Prof. Florian Martin-Bariteau

I'm not sure I understand....

Like my colleague Teresa Scassa, I am fully in favour of the idea of granting the Commissioner order-making powers.

4:05 p.m.

Conservative

Matt Jeneroux Conservative Edmonton Riverbend, AB

Mr. Karanicolas.

4:05 p.m.

Senior Legal Officer, Centre for Law and Democracy

Michael Karanicolas

I'm not entirely convinced of the need for order-making powers. It's not something that I necessarily oppose, but I do think it raises some issues in terms of the procedural fairness of investigations, which the OPC itself has mentioned.

To me, the bottom line is necessity. I think the reason I'm not completely sold on the order-making power is that we've previously heard from the Privacy Commissioner that most of their recommendations are ultimately complied with. If that's the case, and if you have a system in which the recommendations are already being complied with, I'm not sure why you need a strengthening of the powers.

It was mentioned that recommendations are often very slow in being implemented, which is a significant problem. Some people have suggested a hybrid model, whereby the companies would need to apply to the court for permission to not comply within a particular time period. I'm not sure why a specific order-making power would solve the problem more than a hybrid model, which is I think why I'm not necessarily opposed to it but not fully convinced of the need for an order-making power either.

4:05 p.m.

Conservative

Matt Jeneroux Conservative Edmonton Riverbend, AB

We're going to try to pin you down on some answers here, Mr. Karanicolas. I'll skip to the right to be forgotten, where it sounds like you're equally on the fence. It sounds like the Privacy Commissioner is also struggling with the same focus as to what type of law he should put in place, if any. I find it personally fascinating. I think that somebody's right to be forgotten is somebody else's argument that, no, they should be remembered.

You gave a bit of an on-the-fence argument. I'll start with you and go around the table to see if there's any guidance or support you can provide us. The Privacy Commissioner is coming out with a position paper on this, but unfortunately not until after our study is done. We're looking for some advice or support in terms of our recommendations to the Privacy Commissioner.

4:05 p.m.

Senior Legal Officer, Centre for Law and Democracy

Michael Karanicolas

At the moment, I wouldn't make a recommendation in favour of the right to be forgotten.

The reason I'm a bit couched in that is that I do see some potential problems that could be addressed, but if you want a recommendation on whether or not to legislate that, I would be against it. I think there's a huge amount of potential to do harm, and a huge amount of potential to craft it in a way that has a negative impact on freedom of expression.

I do see the problem there, but there are a lot of ways that the legislation could be done badly, which is why I would be concerned.

4:10 p.m.

Conservative

Matt Jeneroux Conservative Edmonton Riverbend, AB

That's a little better.

Ms. Scassa.