Evidence of meeting #47 for Access to Information, Privacy and Ethics in the 42nd Parliament, 1st Session. (The original version is on Parliament’s site, as are the minutes.)

A recording is available from Parliament.

Also speaking

Daniel Therrien  Privacy Commissioner of Canada, Office of the Privacy Commissioner of Canada
Patricia Kosseim  Senior General Counsel and Director General, Legal Services, Policy, Research and Technology Analysis Branch, Office of the Privacy Commissioner of Canada
Valerie Steeves  Full Professor, Department of Criminology, University of Ottawa, As an Individual
Vincent Gogolek  Executive Director, B.C. Freedom of Information and Privacy Association

4:25 p.m.

Conservative

The Chair Conservative Blaine Calkins

Mr. Therrien, I have a quick question for you. I was talking to the analyst here and something came across my mind, and it came out of the last meeting as well.

When electronic health records were brought up by a previous witness, we found, through a bit of investigation, that if the electronic health records or the data or the doctor's records—a person's medical records—were in a doctor's private practice, those would fall under provincial or federal private sector privacy legislation. Yet if that same medical record were in a hospital, it would fall under provincial or federal government privacy legislation, depending on where that document actually was.

If I give my accountant my information for tax purposes, the relationship with my accountant, I am assuming, falls under private sector privacy legislation. Yet my accountant is going to file my taxes on my behalf to the government, which then brings that information under public sector privacy legislation.

So, with all of this overlap and confusion between private sector and public sector and information exchanging hands in this way, does it make sense that we have two sets of laws, one for the private sector and one for the public sector?

4:25 p.m.

Privacy Commissioner of Canada, Office of the Privacy Commissioner of Canada

Daniel Therrien

The short answer is yes.

4:25 p.m.

Conservative

The Chair Conservative Blaine Calkins

That's all we have time for.

4:25 p.m.

Some hon. members

Oh, oh!

4:25 p.m.

Conservative

The Chair Conservative Blaine Calkins

If you care to elaborate on that, that would be very helpful.

Colleagues, I appreciate your humouring me through this.

We thank you very much, Mr. Therrien, for coming once again. I'm sure it would be helpful, actually, at some point in time during the end of our study, once we've heard from more witnesses on this, to have you return to clear up some of the questions and concerns we'll have, so don't be surprised if you get an invitation.

We'll suspend for a few minutes, colleagues, to get ready for our next witnesses.

We're resuming now. In order to keep to the agenda this time, I'm going to be much more strict on the seven-minute and five-minute rounds of questioning. That's the only way we can get through our one-hour time sessions. I am going to get straight to it.

We have, from the B.C. Freedom of Information and Privacy Association, via videoconference, someone who is no stranger to this committee, Mr. Vincent Gogolek.

We appreciate you joining us again today, sir.

We also have Ms. Valerie Steeves, who is appearing as an individual. She is a full professor in the department of criminology at the University of Ottawa.

Ms. Steeves, you have up to 10 minutes, so go ahead, please.

4:25 p.m.

Dr. Valerie Steeves Full Professor, Department of Criminology, University of Ottawa, As an Individual

Thank you very much.

First, I'd really like to thank the committee for undertaking this study. I think it's incredibly important and very timely, given the changes we've seen since PIPEDA was first passed.

When I think back over that period of time, I always find myself thinking about three things. PIPEDA, as you know, was enacted to create trust in the information marketplace. Second, when PIPEDA was being passed, it was quite clear that the intention was to create consent as a floor and not a ceiling. Last, data protection and the provisions that are included in PIPEDA were part of a larger strategy that was designed to protect privacy as a human right. At the time, PIPEDA was seen as a necessary part of this protection, but it was not sufficient in and of itself.

In the last 20 years or so, I've spent a lot of time doing research on children's attitudes and experiences with privacy and equality in network spaces. I think that research raises real concerns about the first of those points, the success of PIPEDA to create trust in the information marketplace.

You could argue that part of it is a lack of education. I was in the field talking to 13- to 16-year-olds in October and November. We asked them about fair information practices, and none of them was able to identify a single one of them. In fact, almost none of them could remember the point at which they consented to the collection of their information when they signed up or posted material on Snapchat or Instagram.

Certainly when you talk to young people about the regulatory regime, they talk about privacy policies, and they don't talk about them in a very flattering way. From their point of view, these have been purposely written to obfuscate and confuse them, so they won't know what's happening, and so they will feel powerless.

They repeatedly—and increasingly, actually, over the years—have told us that the commercial surveillance they experience on these platforms is creepy; and “creepy” is a really important word because typically it means that someone's privacy has been invaded. It's a marker. But at the same time, since their school lives, their home lives, their work lives, and their play lives are so interpolated with technology, they really feel they don't have any choice about it whatsoever.

I think a good starting point for your study is the recognition that even though so many Canadian young people and Canadian adults have flocked to these platforms, that doesn't mean they're comfortable with the current regulatory framework.

In 2015 we surveyed 5,500 kids between the ages of 10 and 17 across the country. We asked them, “Who should be able to see what you post online?” and 83% of them said that the corporations that own the platforms where they're posting the information should not have access to it. So if I put something up on Facebook, Facebook shouldn't be looking. And 95% said that marketers should not be able to see what they post. Whether they've posted in a public place or a private place, they felt it was private to them.

Typically when kids are talking about privacy, they're not talking about non-disclosure, they're talking about audience control, and marketers were not an audience they wanted or expected. Some 96% said that companies that sell them smart phones and other devices or apps that use GPS should not be able to use it to locate them in the real world; and 99% said that marketers should never be able to use GPS to figure out where they were in the real world.

I think this brief snapshot really strongly suggests that there is a disconnect between the regulatory model and the lived experiences of the people who play, shop, go to school, and hang out on these platforms.

I think that disconnect is really related to a bit of the fiction that's embedded in PIPEDA. PIPEDA assumes that, when someone posts a photo on Instagram or is keeping a streak going at midnight on Snapchat, they knowingly and consciously are undertaking a commercial transaction, that they are trading their personal information for access to the platform.

But from the point of view of the people who live on these platforms, it's not a commercial transaction. If I'm on Snapchat, I'm chatting with my friends, I'm doing my homework, I'm signing a petition, I'm exercising my free speech, or I'm exercising my freedom of association. I don't think that's an outrageous perspective. Certainly that's the same relationship we have with our land lines. Although I spend $70 a month so Bell can put a phone line in my house and I can talk to people, I certainly don't expect Bell to listen to my phone calls.

I had a painter in the other day. I don't expect Bell to interrupt my conversation with my painter and tell me, “Home Depot has a sale on paint right now”, and sell to me in that environment. And I certainly don't expect Bell to take all that information and run it through an algorithm to figure out if I'm a criminal or not.

If we go back and look at that time period, part of reconnecting to that earlier hope for PIPEDA, I think, calls upon us to place privacy or data protection in a much broader context.

Go back to the Finestone report of 1997, in which privacy was seen as a social value, a democratic value, and a human right. I think that broader perspective provides this committee with two advantages.

The first one is that it's exactly the kind of thinking that you're going to need to use if you intend to harmonize our privacy protection regime with the European general data protection regulation that comes into force and effect in 2018. I think it's arguable that Europe has done a much better job than North America has in navigating through the challenges we've seen in network spaces over the last 15 years or so, precisely because of a strong commitment to human rights and a strong jurisprudence working on that commitment.

I also think that this broader perspective, placing data protection as a necessary but, on its own, insufficient piece of protecting privacy as a human right, will help us navigate the consent debate more effectively. As I said, when PIPEDA was passed, it was very clearly articulated that consent was intended to be a floor and not a ceiling, and it sure felt like a leaky ceiling after about six months had gone by.

Particularly given the commissioner's comments on big data, certainly there's pressure to weaken consent provisions and there's pressure to make more information publicly available precisely so corporations can sidestep the provisions that we now have. There's more pressure to de-identify and to accept de-identified information as non-personalized information for the purposes of the legislation.

It's always for the promise of big data: if we can just keep all the information, we'll be able to learn new things, because artificial intelligence will identify patterns that are hidden to us, so that we can predict behaviour, we can be more efficient, and we can be more effective. I think privacy is the best way to crack that open and to begin to examine the ethical concerns that flow from this type of information use. Big data is not predictive. This comes back to my human rights concern. Big data is never predictive; it can look only to the past. It assumes that I will do in the future what I did in the past, but even worse than that, it assumes that I will do what people like me have done in the past.

There's a deep concern around these kinds of information infrastructures, which is that we will unintentionally and unconsciously recreate biases in our information systems. We'll either program them in through false proxies, or they'll be learned by the algorithms themselves. We can look at the example in England where they identified young criminals. The youngest potential criminal they identified was three years of age, and he was identified because he was racialized, he was impoverished, and he lived in a particular area. There are discriminatory outcomes that are hidden within this information management system.

Even if we take the position that the algorithm will be able to learn, I think all you have to do is look at what happened with Microsoft's Tay to realize that an open season on information will lead to unintended consequences that will harm the most marginalized in our society.

At a practical level, I have five suggestions.

I think we need to strengthen the reasonable purposes clause. I was lucky enough to participate in the commissioner's meeting on consent, and it was quite interesting. We had quite a debate, because the representatives of the businesses I was sitting with kept saying that businesses have a right to collect information, while I kept saying, “No, businesses don't have a right.” People have rights. Businesses have needs and desires. I found it quite interesting that they kept pointing to the purpose clause. I think there's an opportunity to enrich our commitment to human rights within PIPEDA by opening up and reaffirming the need to protect individual rights against business uses, rather than business “rights”.

Second, I imagine that you're seriously considering adding a right to delink information if there's no public value. It's the right to be forgotten clause. From young people's point of view, certainly, this is absolutely crucial. When you sit down and talk to young people about the risks they're worried about online, that's it. They say, “Oh, something I did when I was 16 is going to sink me, and I will never be able to get over it.” I think that's a particularly important area to examine.

Also, young people certainly ask for regulators to mandate more technical controls so they can more easily control their audiences and take down content. I'm personally quite concerned that community standards are being created by corporations and that our elected representatives are not active in that space of setting standards for the kinds of discourse that are appropriate in Canadian context.

Fourth, I'd strongly urge you to consider mandating some form of algorithmic transparency. So many of these practices are hidden, and it's only getting worse, and so I think corporations should be required to be fully transparent with their information practices, particularly because of this concern about discriminatory outcomes.

Last, I'd ask you to consider holding corporations to account for those discriminatory outcomes if they're going to get the benefit of access to this information. It's like pollution; somebody is going to pay for the dirty water. Since we're building this system right from the get-go, we should be considering who that burden should fall on, and I would argue that it should fall on the people who profit from it.

Thank you very much.

4:40 p.m.

Conservative

The Chair Conservative Blaine Calkins

Thank you very much, Ms. Steeves.

We'll now hear from Mr. Gogolek for up to 10 minutes.

Go ahead, please, sir.

4:40 p.m.

Vincent Gogolek Executive Director, B.C. Freedom of Information and Privacy Association

My apologies first of all, but I'm strictly limited to your 2:30 deadline because we're having a bit of a problem out here in British Columbia with a privacy breach, strangely enough, one that affects both the public and the private sectors. So, I will have to go at 2:30.

I will also try to keep my comments as brief as possible to allow the maximum time for questions. I will limit myself to the four points raised by the commissioner in his letter of December 2 to the chair, as well as two extra points.

We've also had two detailed submissions that we've put in to the commissioner's process, which I believe are available, and I'd be pleased to provide them to you.

Consent for the collection, use, or disclosure of our personal information is the underpinning of PIPEDA. Attempts to move away from this or to tamper with it should be viewed with considerable suspicion. At the same time, it's important to note that, in many cases, consent is really illusory. The conditions being agreed to are often in the form of over-broad, lengthy terms of service and other contractual services. The choice offered to consumers is often to accept all conditions or to not use the service. The result of this is that, in many cases, an organization feels free to do whatever it wants with the information it collects under the guise that the individual whose information it is has, in fact, consented to this.

For example, in our 2015 study on “The Connected Car”—which was generously supported by the contributions program of the Privacy Commissioner—we found that there were multiple agreements, policies, and contracts that come into play when somebody is attempting to purchase a vehicle. The purchaser is supposed to have read and understood all of these policies. A lot of times these are not available on the Canadian website of the manufacturer. They are available only on the U.S. website, and it's not entirely clear whether or not they apply. These policies and conditions tend to have very open-ended uses and conditions that allow for “such other purposes as we see fit” or for research or for marketing. Some of these policies can, in fact, be somewhat contradictory. It's not entirely clear where these are coming from. As a result, we provide this general recommendation in our “The Connected Car” report:

Rather than relying on the fiction of choice and consent, what is needed in this industry are clear, specific and relevant limits on collection, retention, use and disclosure of personal customer data. We need industry-specific data protection regulations for the Connected Car industry.

We also had a number of specific recommendations for the automotive industry regarding consent. I'd like to refer you to four suggestions that Professor Michael Geist of the University of Ottawa put forward as a useful basis for approaching the issue of consent generally: the opt-in consent should be the default model; rules on transparency must be improved; consumers must be able to exercise a choice other than to take it or leave it; and stronger enforcement powers and penalties are required.

In terms of reputation and privacy, with the rise of the online world, considerations that were once primarily the concern of the well-heeled and the well-known—things like damage to reputation—have become much more widespread and are, in fact, concerns of pretty much everybody who is involved online. What might once have been simply neighbourhood gossip can now become part of a global campaign of vilification. Ordinary people who do not have large financial resources or access to legal resources are put in the position of trying to defend themselves and their reputation in this new world. FIPA made a submission to the Privacy Commissioner's consultation on this issue, and I would refer you to that piece of work for a more detailed discussion of some of the issues involved.

We didn't make specific recommendations, but we did outline various considerations that should be taken into account when approaching this issue.

In terms of enforcement, as we've said before, with regard to the Access to Information Act and the Information Commissioner or the Privacy Act and the Privacy Commissioner, we're also of the view in terms of PIPEDA that the Privacy Commissioner should be brought up to the same level as his provincial counterparts who have order-making power. This system has operated for more than a decade in British Columbia, and there hasn't been any systemic problem with the commissioner having order-making power. It would also ensure that, in terms of protection of people's rights, they would be able to get a more immediate remedy under the federal regime, which is not the case currently, rather than somebody, say in British Columbia, having a choice of complaining about conduct either provincially or federally.

In terms of adequacy, the order-making power would have, I think, a positive effect with regard to ensuring that PIPEDA continued to be looked upon as providing adequate privacy protections.

The two additional points that I would raise are these.

One is something that came up, I believe, during our discussions on the Privacy Act, and that is the coverage of federal political parties. It's our view that the federal political parties, which are currently not covered under any legislation protecting people's privacy and personal-information rights, should be dealt with under PIPEDA. Here in British Columbia, our substantially similar provincial act, the Personal Information Protection Act, covers the political parties in this province. Arguably it could cover provincially incorporated branches of federal parties. The commissioner has, in fact, successfully done at least two investigations and reports on the two largest parties here in British Columbia, and we continue to have parliamentary democracy here, so we don't see any impediment to federal political parties being brought under the PIPEDA regime.

Finally, I'd just like to support what Professor Steeves said in terms of algorithmic transparency. This is a very key point, and it's something that we raised previously with regard to the Privacy Act.

I look forward to your questions.

Thank you very much.

4:50 p.m.

Conservative

The Chair Conservative Blaine Calkins

Thanks a lot, Mr. Gogolek.

As I said, colleagues, I'm going to hold the line on the seven minutes this time; otherwise, we're not going to get through the full two rounds.

Mr. Bratina, please go ahead for seven minutes only.

4:50 p.m.

Liberal

Bob Bratina Liberal Hamilton East—Stoney Creek, ON

Thank you.

Ms. Steeves, there seem to be two different behaviours to be addressed, on the consumer side and on the corporate side. On the consumer side, there's education, and on the corporate side, there's enforcement.

It's staggering, really, to hear you talk about young people's sense of what this is all about and the fact that they don't understand that they're really making a deal with the devil, if you will, by pushing that accept button. What serious measures could we take to address that?

4:50 p.m.

Full Professor, Department of Criminology, University of Ottawa, As an Individual

Dr. Valerie Steeves

In the last review of PIPEDA, PIAC suggested that there be different levels, by age, of what could be collected from young people and no-go zones in which information would not even be collected from those under 13. Certainly developmentally speaking, you see that younger kids tend to be very mature and not put much out there. It tends to be the 13- to 15-year-olds who are most at risk.

I'm not sure if education is necessarily.... Certainly we do a lot of it. I do a lot of it myself, but I'm not sure if that's a fair response, because kids will say, “We're forced to use this technology at school. My mom makes me go on Facebook to check out my cousins so I can tell her what's going on, and at the same time I'm yelled at and told I shouldn't put any information out there.” In the studies we've done with young people, it's quite clear that the platform is designed to create incentives to disclose.

I think we have to look at those incentives and really evaluate them, and this goes to the comment earlier about the need to really limit purposes. We create honey pots, especially with young people, and corporations collect everything because of these very broadly crafted clauses. If we were much more careful about the purposes of the collection, not just from a transparency point of view but by saying, “No, there are some things you just can't do”, particularly with young people, I think that would go a long way.

4:55 p.m.

Liberal

Bob Bratina Liberal Hamilton East—Stoney Creek, ON

You referred to the Finestone report, which was 20 years ago.

4:55 p.m.

Full Professor, Department of Criminology, University of Ottawa, As an Individual

Dr. Valerie Steeves

Yes, I've been in this game too long.

4:55 p.m.

Liberal

Bob Bratina Liberal Hamilton East—Stoney Creek, ON

Well, no....

I guess I'll have to dig it out and read it over and see how a 20-year-old report on this very subject resonates with today's reality.

4:55 p.m.

Full Professor, Department of Criminology, University of Ottawa, As an Individual

Dr. Valerie Steeves

What's interesting about it is that it provides that broader context.

One of the things that I found when PIPEDA was passed was that prior to PIPEDA, the federal government exercised a great deal of leadership and put a lot of money behind public access points for technology. It supported non-commercial spaces like SchoolNet, which was a phenomenal site, and it created places where people could communicate and participate in public discourse without this deal with the devil, as you said. Once PIPEDA was passed, within two years, all of that was gone.

The federal government kind of exited from that type of leadership. I think it would be an interesting moment to go back and say, “Wow, what we meant to do was to create one piece of the patchwork that would deal with data protection within this broader quilt that looked at privacy as a human right.”

The fact that we did not do it has actually put us behind the eight ball when it comes to a number of different issues, from national security to education. The stuff that's going on with educational software is terrifying.

4:55 p.m.

Liberal

Bob Bratina Liberal Hamilton East—Stoney Creek, ON

Mr. Gogolek, we've had lots of great interventions from you, and I have to ask you this, because my time will run out soon.

For God's sake, if we don't do anything else, what should we be seriously looking at in terms of your priorities as to what needs to be done?

4:55 p.m.

Executive Director, B.C. Freedom of Information and Privacy Association

Vincent Gogolek

It's the question of consent and ensuring that it is in fact meaningful consent, informed consent.

We're very concerned about attempts to expand implied consent, where “you ought to have known that we would be using this.” Somebody is saying “I agree” in order to use a service or a piece of equipment, and suddenly it's showing up in strange new places and having possibly very serious negative effects on them.

First of all, it's the notion that consent be real consent, as opposed to the idea that you checked the box so you opened yourself up to pretty much anything.

4:55 p.m.

Liberal

Bob Bratina Liberal Hamilton East—Stoney Creek, ON

It's interesting. Sometimes I push the accept on a hand-held device that I can hardly see in my own hand, never mind find the button, but there's also a paragraph or two that goes along with that acceptance.

4:55 p.m.

Executive Director, B.C. Freedom of Information and Privacy Association

Vincent Gogolek

Yes, or sometimes there's more.

4:55 p.m.

Liberal

Bob Bratina Liberal Hamilton East—Stoney Creek, ON

Sometimes there's more.

Thanks, Mr. Chair.

4:55 p.m.

Conservative

The Chair Conservative Blaine Calkins

Thank you, Mr. Bratina.

Now we'll move on to Mr. Jeneroux, please.

4:55 p.m.

Conservative

Matt Jeneroux Conservative Edmonton Riverbend, AB

Mr. Gogolek, it's good to see you, virtually. You're now in high definition, I think. It's a little clearer picture than we've seen of you before. You're looking good, sir.

4:55 p.m.

Executive Director, B.C. Freedom of Information and Privacy Association

Vincent Gogolek

Better than live.

February 16th, 2017 / 4:55 p.m.

Conservative

Matt Jeneroux Conservative Edmonton Riverbend, AB

Thank you, Mr. Chair.

Thank you both for being here.

Mr. Gogolek, it's good to have you back. I want to touch on the right to be forgotten. You didn't mention it too much in your speech, but I am curious as to whether you have an opinion on where we go.

I want to put the concept out there that Ms. Steeves mentioned about this being a real concern for young people, the millennials. They do something, X, at the age of 16, and that then impacts Y later on in their life.

There are certain times.... I guess I can understand the one side, but there's also the other side of that too. There are instances in which X would have a significant impact on Y, and we see this in politics. We saw it during the election campaign. I believe that a number of candidates in each party were impacted by something in their past or whatnot. When someone is running for public office, sometimes those things are important to know about.

I will open it up.

Mr. Gogolek, would you mind touching on the right to be forgotten? I'll ask Ms. Steeves for her response as well.

4:55 p.m.

Executive Director, B.C. Freedom of Information and Privacy Association

Vincent Gogolek

As an organization we don't have an official position on the right to be forgotten. We are not intervenors in either the Equustek-Google case or the Facebook case. In our submission to the Privacy Commissioner's consultation, we did set out some conditions that are important and some concerns that we have about how this is currently being done in Europe.

One concern is that the intermediaries, such as Google and others, are being handed either quasi-legislative or quasi-judicial powers to decide what is or is not being removed from what is almost a utility. Google is now used as a verb. If something is not there, it tends to be considered not to exist. People don't go to page 12 or page 112 to try to find some report on this. They play an important role, but they shouldn't be handed the authority to determine this. That's one consideration.

We do have others, but we want to make sure that if something is removed there's some sort of notation, some sort of indication that what you're getting.... When you look something up, you're assuming you're getting what is there. If things have been removed—and I'm afraid I can't provide you with a detailed description of what that would look like—there should be an indication that what you're getting as a result of this search is not everything, if this is a road we are heading down.