Evidence of meeting #42 for Access to Information, Privacy and Ethics in the 41st Parliament, 1st Session. (The original version is on Parliament’s site, as are the minutes.)

A video is available from Parliament.

Also speaking

Teresa Scassa  Canada Research Chair, Information Law, Faculty of Law, Common Law Section, University of Ottawa
Michael Geist  Canada Research Chair, Internet and E-commerce Law, University of Ottawa, As an Individual
Valerie Steeves  Associate Professor, Department of Criminology, University of Ottawa

11:45 a.m.

NDP

The Chair NDP Pierre-Luc Dusseault

You may ask a brief question.

11:45 a.m.

Conservative

Dean Del Mastro Conservative Peterborough, ON

I am interested in the distinction between aggregate and specific statistics. The use of aggregate statistics, it would seem to me, is basically what everybody is doing, and it does not necessarily interfere with the privacy of an individual. For example, if Google said that people who searched for this gave these as their top 10 responses, I don't see that as a privacy invasion. I see that as useful information.

When we get down to specific kinds of tracking, I think that's where most Canadians would be concerned.

Can you speak a little about the difference between aggregate tracking and specific tracking?

11:45 a.m.

NDP

The Chair NDP Pierre-Luc Dusseault

I will allow one person to answer quite quickly.

11:45 a.m.

Associate Professor, Department of Criminology, University of Ottawa

Dr. Valerie Steeves

Okay. When you use this data, you're collecting all this personal information. You're tracking population trends. Then you divide everybody up into categories, and then you treat them differently because they belong to a category. Earlier a concern was raised that this type of technology is very important for democratic debate. You can use those categories, once people identify themselves, to change the environment around them.

I was doing research on MSN, and while I had not identified myself as any particular person, I was surrounded by the news of the day. As soon as I registered as a 16-year-old girl living in Vancouver—which I was not, as you might have guessed—the news of the day disappeared and it was replaced with celebrity news, dieting ads, and plastic surgery ads. It wasn't that they knew I was Val the 16-year-old girl living in Vancouver; they knew I was someone who fit that category.

Therefore, there are issues of discrimination that flow from that, as Professor Scassa mentioned, but they are even more insidious, because they change the environment around a person based on assumptions about who that person is and what category they fit into. So that would not fall within PIPEDA protections on the use of personal information, but it's highly problematic from a privacy point of view, because it fractures the public spaces that are necessary for democratic debate, and it opens up vulnerable populations to discrimination.

11:45 a.m.

NDP

The Chair NDP Pierre-Luc Dusseault

Thank you.

Your time is up, Mr. Del Mastro.

Ms. Murray, you have seven minutes.

11:45 a.m.

Liberal

Joyce Murray Liberal Vancouver Quadra, BC

Thank you very much for presenting to the committee your ideas about what should be done.

What struck me when I was listening to you was that in some ways Canada is falling behind. At the same time, given some of the budget cuts, other organizations are impeded from helping to keep us from falling further behind.

With the incredible complexity of what you've just presented and the potential for different interest groups to have different ideas about how to move forward, I'd like you to comment on whether the tools we as government have, in the form of laws and the regulations that enable them, are up to the challenge when we have such a fast-paced and dynamic environment. Or is it the case that what we're trying to bring to bear as Parliament and government just has to be totally rethought if we are to catch up and do something that is in real time with respect to the risks and the concerns? It's a pretty broad question.

11:50 a.m.

Canada Research Chair, Information Law, Faculty of Law, Common Law Section, University of Ottawa

Dr. Teresa Scassa

Yes, it is a very challenging environment. One of the things I talked about—and I think Professor Geist mentioned this as well—is that the problems are now so multi-dimensional and complex that it may be the case that they simply can't be slotted into one particular box of data protection legislation under federal jurisdiction. It may be that there are other dimensions that implicate other regimes, whether it's competition law or human rights law, or that implicate the provinces as well. So it may be that there's a need for a more multidisciplinary, multi-faceted approach to some of these issues, and that it's not necessarily to our advantage to treat or deal with the issues in specific silos.

11:50 a.m.

Canada Research Chair, Internet and E-commerce Law, University of Ottawa, As an Individual

Dr. Michael Geist

I have a couple of thoughts on that. The first is to say that I don't think it's the role of government to come charging in saying, “We're the new sheriff in town when it comes to social media, and we're going to fix everything that has to do with the choices these private companies and individuals are making”.

Frankly, it's tough to keep pace with what's happening. As we've heard, we're not even sure, necessarily, what the business models are sometimes. We don't know if there is a business model in some of these instances. So I think taking the approach that government knows and is going to fix everything would be foolish. That said, there is unquestionably a role for government and regulators to set certain parameters about what is appropriate and to ensure that it reflects Canadian values about what's right from a privacy perspective and what's right in terms of an obligation from a security perspective, as well as about the range of different issues that arise.

In that context, I find I'm a bit more optimistic about the prospect that government can engage in that broad rule-setting. PIPEDA, in many respects, was designed, at least initially, with the best of intentions to try to do just that. As Professor Steeves noted, we've now had more than 10 years of experience, and that experience has shown that there is a need for adaptation of the law. So it's not that we're changing something every 10 weeks. But surely every 10 years is enough time to say that there are shortcomings within the legislation on the privacy side that we can fix to ensure that the sorts of broad parameters around some of this activity better reflect what Canadians expect when they venture online.

11:50 a.m.

Liberal

Joyce Murray Liberal Vancouver Quadra, BC

I have another, associated question. Perhaps, Professor Steeves, you could wind your remarks into both of them.

It was mentioned that there is an attempt to find a balance between privacy and access to data, and that this is critical for business and for the competitive issues that come up. I'd like to have positive and negative comments about the impact on small businesses—not the big data businesses—of what's going on.

I'd also like to know whether there is a country that has a framework for addressing these issues that could be a suitable model for Canada, or whether it's about unique values and principles in Canada and that we must have a made-in-Canada approach.

11:50 a.m.

Associate Professor, Department of Criminology, University of Ottawa

Dr. Valerie Steeves

As was mentioned earlier, online privacy issues are really nested in broader concerns about marketing, citizenship, human rights, social interaction, democracy, democratic dialogue, and those types of things. If you go back to the history of data protection, it was always assumed that it would be the last step. That's the floor, not the ceiling, approach. It was assumed that there would be mechanisms whereby governments would interrogate uses of information and ask whether the public interest was served by these practices. Only if it was would we go ahead with that kind of thing. We would use fair information practices once the horse was out of the barn, to provide some redress in case something happened.

I think the reliance on fair information practices perhaps reflects a naivety that it will be enough. It might be a necessary but insufficient condition.

I would suggest that the jurisdictions that have approached these issues from a broader perspective and come up with solutions that better capture these broader human rights interests are places in Europe, for example, which have a human rights approach to privacy and where there are strong human rights protections for privacy, for the inviolability of the personality. There are a number of situations in Iceland and Germany where courts have been able to come up with creative solutions, interrogate those purposes, and call those purposes to some form of public judgment through broader understandings.

I agree with what Professor Geist said about consent. Consent is never going to be your solution. I think it's an important piece of the puzzle, but it's a small piece. We need another mechanism to interrogate these broader purposes. That's why I pointed you to section 3 of PIPEDA.

It was argued before your predecessor committee that we needed section 3 because, that way, you could look at purposes and say that something is not what a reasonable person would consider appropriate under the circumstances. And if it's not, then you shouldn't be doing it. That provision gives you quite a lot of power to think more carefully about restricting certain uses of information.

11:55 a.m.

Canada Research Chair, Internet and E-commerce Law, University of Ottawa, As an Individual

Dr. Michael Geist

Often the question is put: Who does it better, or who does it best, and can we emulate them?

When PIPEDA was first established I think there was a view among many that it was the best practice. It looked at a lot of what was taking place in Europe and at what had emerged in the United States. In many ways, it tried to bridge the two different approaches. There can be disagreement over whether there could have been some tinkering here or there, but it genuinely tried to do that.

A number of countries looked to Canada as a model for how, on the one hand, to respect some of the views on privacy that have come out of Europe and at the same time to reflect some of the business considerations and enforcement elements that we've seen in the United States.

I would say that over the last 10 years we've really fallen behind. We've seen Europe, in some ways, get more aggressive on some of these issues, and we haven't kept pace. We've seen the U.S., frankly, do a far better job on the enforcement side than we have. There are real penalties there. If you screw up from a privacy perspective in the United States, you're going to pay. They are also the ones that came up with mandatory security breach disclosure requirements, which we now see in states across the U.S. We're seeing it, as I mentioned, in moving toward “do not track”. We're seeing it with respect to the misuse of social media, which I referenced as well.

I think it's about picking and choosing some of the very best that we've seen, from an enforcement perspective in the United States and from a values perspective from what we see elsewhere, to create an environment where we're not saying that we're like them but that we want other countries saying that they're like Canada. Over the last decade, we've failed to identify what it means to ensure that we have privacy legislation that keeps pace with this changing world.

11:55 a.m.

NDP

The Chair NDP Pierre-Luc Dusseault

Thank you.

Your time is up, Ms. Murray.

Mr. Butt has the floor for seven minutes.

11:55 a.m.

Conservative

Brad Butt Conservative Mississauga—Streetsville, ON

Thank you very much, Mr. Chair.

Thank you all for being here today. I found your three presentations to be just excellent.

My daughters are 12 and 8. My 12-year-old daughter has decided that she, unlike Mr. Angus, likes Twitter. She has decided to set up her own little Twitter account and she does text, mainly to her little school chums.

As a parent, I am concerned about whether there's private information that is going to be accessed in some way, shape, or form.

Are you of the view that we can, or should, be looking at privacy measures for minors in a different way than we would for adults? Should we make the assumption that adults should know better? Adults are adults, and they should be smarter and should know better.

Should we look at strengthening privacy provisions to protect minors who are users of social media, or should we, in your view, treat everybody the same, regardless of their age?

11:55 a.m.

Canada Research Chair, Information Law, Faculty of Law, Common Law Section, University of Ottawa

Dr. Teresa Scassa

Maybe Val could start.

11:55 a.m.

Associate Professor, Department of Criminology, University of Ottawa

Dr. Valerie Steeves

Sure, I'll take that one.

There were recommendations with the first PIPEDA review to have a tiered consent mechanism that recognized differences in ages. The suggestion was that under a certain age, companies shouldn't be able to collect any information at all. Then as kids become older, they can opt into programs where they can say the companies can have that information and can flash them a few ads. But it put real restrictions on what they would be able to do. Probably most importantly, there was a suggestion that once somebody turned 18, there should be a big delete button so that the information was forgotten.

If you look at how kids use technology, they use it to meet their developmental needs. When you talk to 11-year-olds, younger kids, they're actually the ones who make me the most comfortable. They sound the most mature. They say that they don't do any of the social networking stuff, certainly not in the broad world, because that's for older kids. They're very aware of the risks, and they manage them quite well.

When they hit 13 and 14, they're at a different developmental stage. They're exploring their identities through performance. They tend to do outrageous things, writ large, for a couple of years.

Then when they hit 15 to 17, right up to the early 20s, they explore their identities through social networks. If you think of it from their point of view, these technologies are fabulous, because they give them an opportunity to meet their needs as they become individuals and grow to be adults.

I certainly would not want to have to look at anything I wrote when I was 14 in any kind of public environment. Certainly for kids, yes, I think you need a forget button. There is definitely something different when you're a minor.

One of the interesting things that's come out of the research is that there was this belief that these digital natives would be different from us. Ironically, when they hit about 29, they start acting just like you and me, and they use technology the same way we do. They grow up, in other words.

So yes, they are different. I share the same concerns about using consent as a mechanism to provide that protection, because you have to identify an age for that system to work.

I was launching some research yesterday with a youth panel, and an 11-year-old told CBC all about how all of his 11-year-old friends in grade 6 have Facebook accounts. They know that they're supposed to be 13, but they just click the right button. I think we do a disservice to kids if we say that we have to put them under surveillance to make sure that they're old enough. That won't help. Certainly, having broader restrictions that say that kids are kids, so don't collect their information, and when they get older, don't use it in particular ways....

There was the Nexopia complaint, for example. Nexopia was the most popular social networking site for kids. One of the commissioner's recommendations was that they not retain information over a certain period of time. Nexopia just said, “Sorry, we're keeping it. There's a lot of money in this stuff”. You're talking about 12-, 13-, and 14-year-old kids.

The other thing is the use the information is put to. I don't have time to go into any details, but I can point to some research we're doing with young girls. The site is embedded with marketing material that uses very stereotypical images, particularly for gender. I've just done some really fascinating qualitative research with young women. They talk about how this restricts what they can do, and they're constantly trying to force it back. It's actually narrowing the kinds of people they can be rather than broadening the world for them.

Yes, we do have to think about kids differently. I think the way to do that is to look at the uses of the information and just say that it's not reasonable to collect information from eight-year-olds and then use it to try to sell them anything.

Noon

Conservative

Brad Butt Conservative Mississauga—Streetsville, ON

Go on.

Noon

Canada Research Chair, Internet and E-commerce Law, University of Ottawa, As an Individual

Dr. Michael Geist

Professor Steeves is the expert in this area, so I hate to take a different position. But I have to say that we've seen an attempt to try to target kids, from a privacy perspective, in the United States, with COPPA, the Children's Online Privacy Protection Act, which sought to have specific protections, and, essentially, parental oversight and consent for kids under 13. This legislation is a joke.

My kids are actually similar in age to yours, although I have one more. They're in this world as well. The notion that a company would say, “Hold on a second, we're not going to collect any of that information until we get your parents' consent. We're not going to collect anything at all...”.

The truth is, there are peer pressures. There's a desire to be there. Frankly, there's an awful lot of good that comes from this environment as well.

We've seen for almost 10 years the idea that we can set specific rules that say they're simply not going to collect, or that they're going to get stronger consent. There was a legislative attempt in the United States. I think it fails miserably, because the kids are smart enough to know that they can get around it if they want, and the companies will just look the other way even though they know it's happening.

From my perspective on these issues, we need tough standards that are enforceable. We need real order-making power from the Privacy Commissioner's perspective, with the potential for penalties when people overstep. And it would apply to all.

Noon

Conservative

Brad Butt Conservative Mississauga—Streetsville, ON

Thank you.

I'm sure my time must be up. That must be five minutes.

Noon

NDP

The Chair NDP Pierre-Luc Dusseault

You have one minute.

Noon

Conservative

Brad Butt Conservative Mississauga—Streetsville, ON

That was the main question I wanted to ask. If someone else wants to take the extra minute, I'm leaving the committee anyway.

Noon

Conservative

Blaine Calkins Conservative Wetaskiwin, AB

Sure, I will go.

Mr. Geist, I was listening to your answers a little while ago. One of your comments was about a reasonable expectation that users might have of how their personal information might be treated. That sounded like a legal definition. Is that defined anywhere in the current legislation? Does it need to be defined, or is the definition outdated? Is it outdated in PIPEDA? Is it a case law definition?

It sounded to me as though this was some kind of standard verbiage that is used in the industry, and I'd like some more clarification on that.

12:05 p.m.

Canada Research Chair, Internet and E-commerce Law, University of Ottawa, As an Individual

Dr. Michael Geist

It is common language that they use, and I think it's highly problematic language. It's true that I used it, but there is a problem with relying on a reasonable expectation of privacy—which you actually see crop up very regularly in labour cases and other sorts of cases where they talk about what someone can reasonably expect. If there are privacy policies saying you shouldn't expect any sort of privacy, and if you have received clear notifications that they're going to collect all the information they can about you and will do absolutely everything they can to try to monetize it—they typically don't put it in that straightforward language, though that is essentially what they are often saying—then when you ask about what your reasonable expectation of privacy should be, the response is akin to the infamous Sun Microsystems response, “You have no privacy. Get over it”. In that case, you have no reasonable expectation of privacy because you were told that you didn't have any, so get over it.

So in setting appropriate boundaries and standards and ensuring that we have effective tools to enforce those, we get away from the paradigm of saying, “You only get what you expect, and you shouldn't expect everything”, to saying “No, there are some minimum standards about what's appropriate and we have the tools to ensure that they're there and that they're going to be enforced”.

12:05 p.m.

NDP

The Chair NDP Pierre-Luc Dusseault

Thank you.

Ms. Borg has the floor for a five-minute question and answer period.

12:05 p.m.

NDP

Charmaine Borg NDP Terrebonne—Blainville, QC

Thank you very much.

I would also like to thank the witnesses for coming here today. The testimony we have heard is interesting; we are opening a Pandora's box of issues and questions.

My first question goes to Ms. Steeves.

You said that when you registered as a 16-year-old girl, you got advertisements specifically targeted to 16-year-old girls. Can you tell us what effect advertising on social networks has on the behaviour of those young users?