Evidence of meeting #46 for Access to Information, Privacy and Ethics in the 41st Parliament, 1st Session. (The original version is on Parliament’s site, as are the minutes.) The winning word was PIPEDA.

A video is available from Parliament.

Also speaking

Sara Grimes  Assistant Professor, Faculty of Information, University of Toronto
Tamir Israel  Staff Lawyer, Canadian Internet Policy and Public Interest Clinic
Adam Kardash  Managing Director and Head of AccessPrivacy, Heenan Blaikie

11:40 a.m.

Conservative

Blaine Calkins Conservative Wetaskiwin, AB

Thank you, Chair.

Thank you very much to our witnesses. We've heard some fascinating information here.

I don't even know where to begin, but I'm going to just start, Ms. Grimes, with you.

You said that unbeknownst to most Canadians—I think this is fairly common knowledge—online activities are surveilled. We have data-mining going on out there. We have spiders. We have bots. We have all kinds of things that are downloaded onto people's computers unwittingly. We have spyware, malware, adware, and whatever you want to call it tracking people's activities, whether they're on a laptop or a mobile device.

In these user agreements, we agree to allow our information to be collected and used. It's in the settings on our devices whether or not we want to allow cookies, for example, on our computers. It's in our settings on our iPods and our iPads. We get push notifications. We can turn these kinds of things on or off. An educated user will have to make a little bit of an effort to do that. We can get third-party software that will help us protect, for example, our computers at home that our children are on when they're trying to do their homework, so that I as a parent can get notification on what kinds of activities my children may or may not be doing online.

And that's going to be a question I have for you: Do you think my child has the right to be able to do that on a computer, without me knowing what my child is actually doing? I'll save that question for the end.

In all of these agreements, I have one choice: I either accept the terms of the agreement in its entirety or I don't. That's the choice I have. I don't have the option to parse parts out.

My question, broadly, for all three of you is do you think there should be a legislative or a regulatory requirement to have these kinds of agreements parsed out in such a way that an end-user can actually have the ability to select which parts they're going to agree to, or which parts they're not going to agree to? Most of these things set defaults on how my information is going to be shared with a company like Acxiom, which frankly has me terrified.

I know how these things work, because I used to be a database administrator. I understand how these data points are collected, and many of these things are collected without my knowledge. I'm sure my name's in Acxiom, because I'm an avid computer user, or if it's not in Acxiom it's somewhere else. Somebody has information about me and my browsing habits and my user habits, and so on. So this is a very frustrating thing.

Why can I as a user not have the ability to choose which parts of the agreement I want to agree with and which parts I don't? Is that a reasonable thing, from a regulatory environment point of view, for a government to be involved in?

11:45 a.m.

Assistant Professor, Faculty of Information, University of Toronto

Dr. Sara Grimes

I'll leave the broader parts of the question to my co-presenters here.

In terms of kids and user agreements, there definitely need to be some changes. I've read a lot of end-user licence agreements for services directed to kids. They include all the same types of clauses that you find in any of these documents. There are all kinds of complications when kids are involved. Not only are there words that most kids can't understand, but most adults have trouble understanding them as well. Service contracts actually describe relationships in terms that younger kids just can't understand yet. They still have some developing to do before they can grasp complex concepts like property exchange or the different economic processes that are being described in these terms of service and use.

Changes to the kids' area, which is the area I'm an expert in, are definitely needed with regard to things like the terms used in contracts so that they are understandable to kids and parents. I think as a government, as a country, we need to start thinking about how we're going to deal with kids entering into contracts, because minors' contracts are very tricky legally. They're voidable and there are all kinds of strange precedents to wade into.

I'm not a legal expert, and it hurts my head even to think about how complicated this all becomes when you start thinking about it in those terms. But we need to start dealing with that. We need to start thinking about it seriously and think about what we are expecting kids to be held to when they agree to terms of service that are 15 pages long, are full of all kinds of jargon, and include processes that are so far beyond what they're capable of understanding that we couldn't possibly expect these contracts to actually be upheld.

So, yes, I would love to see a more à la carte type of design for terms of service, terms of use, and end-user licence agreements, including some terms that have been delineated as appropriate for younger kids, and a framework for figuring out who signs on, who agrees to it, and how involved parents will be, because they clearly will have to be.
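
As a concrete illustration of the à la carte design discussed in this exchange, here is a minimal sketch of how a terms-of-service document could be modelled as separately acceptable clauses, some flagged as suitable for minors. The clause names, fields, and structure are hypothetical, invented for illustration rather than drawn from any existing agreement or statute.

```python
from dataclasses import dataclass, field

@dataclass
class Clause:
    """One separately consentable part of a terms-of-service document."""
    clause_id: str
    summary: str              # plain-language description shown to the user
    required: bool = False    # the service cannot function without this clause
    suitable_for_minors: bool = True

@dataclass
class ConsentRecord:
    """Which optional clauses a given user (or parent) has accepted."""
    user_id: str
    accepted: set = field(default_factory=set)

    def accept(self, clause: Clause) -> None:
        self.accepted.add(clause.clause_id)

    def permits(self, clause: Clause) -> bool:
        return clause.required or clause.clause_id in self.accepted

# Hypothetical clauses, split out instead of bundled into one take-it-or-leave-it agreement.
CLAUSES = [
    Clause("core-service", "Store your account data to run the service", required=True),
    Clause("analytics", "Use your activity to improve the product"),
    Clause("third-party-sharing", "Share your profile with advertising partners",
           suitable_for_minors=False),
]

record = ConsentRecord(user_id="example-user")
record.accept(CLAUSES[1])                 # user opts in to analytics only
assert record.permits(CLAUSES[0])         # required clause always applies
assert not record.permits(CLAUSES[2])     # sharing stays off unless explicitly chosen
```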

11:45 a.m.

Conservative

Blaine Calkins Conservative Wetaskiwin, AB

Thank you.

11:45 a.m.

Managing Director and Head of AccessPrivacy, Heenan Blaikie

Adam Kardash

The question's excellent, because it illustrates how you can't address privacy in a meaningful fashion with just an upfront consent process, especially for platforms that get more complicated.

There are two approaches to dealing with all sorts of different contexts, not just in the pure technology sector but even more broadly. In addition to a meaningfully drafted notice upfront about what the user is engaging in, the most important thing has become twofold. One part is making sure users have appropriate control, and know where they can exercise that control. The other is—and this is an absolutely critical point and the emerging theme in practice over the last ten years—that we've seen a move from concepts of consent and notice being the important parts of privacy protection to the concept of privacy governance and a much more holistic approach to how you address these issues.

I think two or three weeks ago the Office of the Privacy Commissioner of Canada and the Alberta and B.C. privacy regulatory authorities issued a joint 26-page guidance document on their expectations for effective privacy management programs. Those expectations set out obligations for organizations to step back and look at privacy across the whole life cycle of data from a risk perspective, in a manner through which they continually improve their practices and address things like controls and transparency. Most importantly, they address it in a much more detailed format.

I encourage the committee to refer to that document. There are at least 110 expectations set out that really go to the heart of your question. If companies are relying solely on those long-winded consent forms.... I used to draft those things. I know what they're about. They're not effective for privacy compliance. It's privacy governance that's exactly at the heart of what you're raising.

11:50 a.m.

Staff Lawyer, Canadian Internet Policy and Public Interest Clinic

Tamir Israel

It is an excellent question. I fully agree it is important to do a little bit more to simplify privacy policies. There's been talk of trying to standardize certain terms that have similar meanings for different companies but that are described in different ways in order to make it easier for consumers to compare privacy policies, but I agree with my colleague that doing that can't be the end of the process.

It's very important to have accountability and to have organizations put in place processes that take into account privacy concerns at all stages of the development of their services. I think our federal Privacy Commissioner and some of our provincial privacy commissioners have done a really good job at instilling that.

In addition, though, it's very important to make sure the substance of what is being embedded into these development processes is also reflective of user expectations and privacy. Historically there's been a divide internationally among what the European Union does, what Canada does, and what the U.S. does. The U.S. had this sort of open framework where there was not too much regulation in place, but they're moving very far away from that, towards where we are now, and also adopting these types of last-minute, just-in-time notifications, where you provide more notification and more control in line with the decisions you're actually making. That helps adjust elements of the privacy policy so that users have greater control over which parts of it they're okay with and which parts they're not.
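
To make the just-in-time notification idea concrete, the sketch below shows one way a service might prompt for consent at the moment a data-using feature is invoked, rather than in an upfront agreement. The decorator, purpose string, and feature are hypothetical, not any particular platform's API.

```python
from typing import Callable

# Hypothetical record of per-purpose decisions the user has already made.
_decisions: dict[str, bool] = {}

def just_in_time_consent(purpose: str, prompt: Callable[[str], bool]):
    """Ask the user about `purpose` the first time a feature needs it."""
    def wrap(func):
        def inner(*args, **kwargs):
            if purpose not in _decisions:
                # The notice is shown in the context of the actual action,
                # not buried in an upfront agreement.
                _decisions[purpose] = prompt(f"This feature will {purpose}. Allow it?")
            if not _decisions[purpose]:
                raise PermissionError(f"User declined: {purpose}")
            return func(*args, **kwargs)
        return inner
    return wrap

def demo_prompt(message: str) -> bool:
    print(message)   # in a real app this would be an interactive dialog
    return True      # auto-approve for the demonstration

# Hypothetical feature gated by a just-in-time prompt.
@just_in_time_consent("share your location with nearby friends", prompt=demo_prompt)
def show_nearby_friends(user_id: str) -> str:
    return f"nearby friends for {user_id}"

print(show_nearby_friends("example-user"))  # the prompt appears on first use only
```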

11:50 a.m.

NDP

The Chair NDP Pierre-Luc Dusseault

Thank you, Mr. Calkins.

11:50 a.m.

Conservative

Blaine Calkins Conservative Wetaskiwin, AB

I have a couple of minutes left, Mr. Chair.

11:50 a.m.

NDP

The Chair NDP Pierre-Luc Dusseault

Unfortunately, Mr. Calkins, your time is up.

Although it is very interesting, I must now recognize Mr. Andrews, for seven minutes.

11:50 a.m.

Liberal

Scott Andrews Liberal Avalon, NL

Thank you, Mr. Chair.

Mr. Calkins, I'm going to try to get back to your questions around parents and the role of children in this.

Sara, my question's going to be directed at you. There are kids being born today who will know nothing but Facebook. They will grow up. We're at a very critical stage of this right now as these kids are now realizing what Facebook is.

As parents we sometimes like to show our kids off, and we put them on Facebook even before they're born. Only yesterday I saw a picture of an ultrasound on Facebook.

What role do parents have in this whole debate? We start it often before they're even born. Then when they're born we put their lives out there. I guess we bear some of the responsibility here.

Is it an education thing we need to be doing here? Is there any way to stop it, or is this genie already out of the bottle and there's no moving back on it?

11:50 a.m.

Assistant Professor, Faculty of Information, University of Toronto

Dr. Sara Grimes

It depends on what the genie includes, I guess, in terms of being public and living online. It's hard to know if that's even something we should be trying to prevent and prohibit. We're trying to figure out the best way to do it, the best way to welcome kids into this world they're being born into. There's not really an alternative. They're getting school lessons and homework that get them on social media. A lot of social interaction happens there. There are opportunities to be political, to find out about important events. There are a ton of benefits. It's how you balance the benefits and the risks, I think, instead of just focusing on one or the other.

In terms of parents' involvement, young kids and parents often come as a unit. A lot of these things are family processes that families are going through together. How families negotiate those is really important, but it can't be left completely to the families to make these types of decisions. As you say, not all parents know everything there is to know. This is new and fast-moving. It's hard for people to keep up. It's a huge burden to expect parents to be able to monitor and regulate every single thing their kids do. And if they offload that responsibility onto something like a cyber-nanny program, a number of those programs have actually been investigated for doing a huge amount of data-mining on the very kids they're supposed to be protecting from particular sites and whatnot.

Approaching this as a family issue is definitely a useful way to think about it, but families also need support. Families need guidelines. Families need experts and politicians and lawyers on their side as well to think about how best to manage these things and support the best practices that do emerge and not to put all the burden on individual families to come up with solutions to very complex problems.

11:55 a.m.

Liberal

Scott Andrews Liberal Avalon, NL

Often people put the expectation on governments, and that has been done throughout the years. That's why we have an age for drinking. That's why we have an age for smoking. So it begs the question: Do we need an age restriction for this, and if we do, is it impossible to police, and have we already gone past that point of doing that now?

11:55 a.m.

Assistant Professor, Faculty of Information, University of Toronto

Dr. Sara Grimes

Comparing it to something like alcohol is a bit tricky, because there are proven health risks. There are adverse effects on development and physical development and all kinds of terrible things that can happen to young children if they're exposed to alcohol too young. I think about online space very much as being like public space. We don't have age restrictions on public spaces, on parks and the streets. Kids are allowed to be in public. If we think about being online as just an extension of being in public, then putting an age restriction on that I think is really problematic.

That doesn't mean that we can't have rules and legislation that would help guide what the practices should be and how kids should be addressed and treated. There are parts of this online public that are definitely inappropriate. I'm thinking about extreme examples here, just as in our physical world we also wouldn't let kids wander into certain stores and certain places. So I think having the same types of more nuanced, more considered rules and approaches would be applicable here, as opposed to having age restrictions. They rarely work in an online world, because age restrictions do put a lot of the onus on kids and families.

That's why I think COPPA has been deemed.... I don't think the Children's Online Privacy Protection Act in the States is a complete failure. I think it did result in a lot of good things. Other people have deemed it a failure because kids are able to bypass the age restrictions. That's maybe taking too much of a blanket approach, and it also puts the onus on kids and parents to do all the monitoring. The age restriction is there, and if they're not abiding by it, then they're kind of punished.

11:55 a.m.

Liberal

Scott Andrews Liberal Avalon, NL

We're dealing with minors, and there are laws about minors. How do parents like Mr. Calkins and me integrate ourselves into this? Do we have any rights to contact these companies and say "My child is on here and I need to do something, I need to monitor this"? Is that a responsibility? Is that something we as parents should be concerned about?

11:55 a.m.

Assistant Professor, Faculty of Information, University of Toronto

Dr. Sara Grimes

One of the very positive aspects of COPPA was that one section of it said parents had the right to contact companies, find out all the data that a company had collected on their child, and put in a request for the data to be destroyed. According to people who have followed up and studied how the act has functioned and not functioned over the past 10 or 12 years, apparently that right was not exercised very often. But that was one way. Presumably kids could also demand the same thing and find out what data had been collected on them and have some sort of recourse, an established solution, so that if they wanted the data to be destroyed, it could be.
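
A minimal sketch of how a service might implement the parental access-and-deletion right described above. COPPA establishes the right itself; the data store, verification flag, and method names here are assumptions made for illustration only.

```python
class ChildDataStore:
    """Hypothetical store of data a service has collected about child users."""

    def __init__(self):
        self._records: dict[str, list[dict]] = {}

    def collect(self, child_id: str, item: dict) -> None:
        self._records.setdefault(child_id, []).append(item)

    def parental_access_request(self, child_id: str, parent_verified: bool) -> list[dict]:
        """Return everything collected about the child, once the parent is verified."""
        if not parent_verified:
            raise PermissionError("Parent identity must be verified first")
        return list(self._records.get(child_id, []))

    def parental_deletion_request(self, child_id: str, parent_verified: bool) -> int:
        """Destroy the child's records on a verified parental request."""
        if not parent_verified:
            raise PermissionError("Parent identity must be verified first")
        return len(self._records.pop(child_id, []))

store = ChildDataStore()
store.collect("child-42", {"type": "survey", "answer": "likes dinosaurs"})
print(store.parental_access_request("child-42", parent_verified=True))
print(store.parental_deletion_request("child-42", parent_verified=True))  # -> 1 record destroyed
```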

11:55 a.m.

Liberal

Scott Andrews Liberal Avalon, NL

That was COPPA, did you say?

11:55 a.m.

Assistant Professor, Faculty of Information, University of Toronto

Dr. Sara Grimes

Yes: the Children's Online Privacy Protection Act in the U.S.

11:55 a.m.

Liberal

Scott Andrews Liberal Avalon, NL

Okay.

Maybe to Mr. Kardash, how does that dovetail into our PIPEDA law? Are we both there on the same page?

11:55 a.m.

Managing Director and Head of AccessPrivacy, Heenan Blaikie

Adam Kardash

Canada doesn't have specific legislation dealing with children's privacy, but by implication.... In PIPEDA there are, among other things, consent rules. Children, and certainly children under the age of 13, would not have the legal capacity to provide their consent. By virtue of certain activities they would engage in, the consent of a parent or a guardian would be required in order for them to legally engage there. So the parent, in that vein, steps in the shoes of the child.

Probably more robust than COPPA, we have in addition to the consent rules the full range of other requirements that an organization would still have to comply with—the security, the notice to the parent, the retention practices and the destruction of personal information, and the rights of access that would be afforded to the individual parent.

So you have a full complement of legal requirements and obligations that would still apply, but you have fundamentally that consent provision that someone under the age of 13 wouldn't have the legal capacity to provide.
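
A minimal sketch of the consent rule described above, assuming an age-of-capacity threshold of 13 as stated in the testimony: where the user is under that age, the service must look to a parent or guardian for valid consent. The function and field names are hypothetical, not drawn from PIPEDA itself.

```python
from dataclasses import dataclass

ASSUMED_AGE_OF_CAPACITY = 13  # assumption taken from the testimony, not from statute text

@dataclass
class ConsentRequest:
    user_id: str
    age: int
    purpose: str

def obtain_consent(req: ConsentRequest, user_agrees: bool, guardian_agrees: bool) -> bool:
    """Return True only if consent was given by someone with legal capacity."""
    if req.age < ASSUMED_AGE_OF_CAPACITY:
        # The child cannot provide valid consent; the parent or guardian
        # "steps into the shoes" of the child.
        return guardian_agrees
    return user_agrees

# A 10-year-old's own click-through does not count; the guardian's decision does.
assert obtain_consent(ConsentRequest("kid-1", 10, "create account"),
                      user_agrees=True, guardian_agrees=False) is False
assert obtain_consent(ConsentRequest("kid-1", 10, "create account"),
                      user_agrees=False, guardian_agrees=True) is True
```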

11:55 a.m.

Liberal

Scott Andrews Liberal Avalon, NL

How difficult is it for—

Noon

NDP

The Chair NDP Pierre-Luc Dusseault

Thank you. Unfortunately, your time is up, Mr. Andrews.

In order to be fair, everyone must be able to speak for the same amount of time.

Ms. Davidson, you have seven minutes.

Noon

Conservative

Patricia Davidson Conservative Sarnia—Lambton, ON

Thank you very much, Mr. Chair.

Thank you very much to each of our presenters today. This is an extremely interesting topic that we're studying here. Your input today I think raises a lot more questions and concerns that we need to address.

In particular, Ms. Grimes, you talked about some of the sites for kids and about some of the things that are available for kids. I think you said that their privacy rights can be infringed upon for commercial gain by some of these companies. You talked about other countries having safeguards. Some jurisdictions ban those younger than 13 years of age from being on these sites.

I agree, I think the consents that are required are totally inappropriate for kids. I think they're totally inappropriate for adults in most cases as well. I just fail to understand how anybody can be assured, just because there is a ban on children under the age of 13, that this ban can be enforced. I mean, anybody can say they're 13 years of age or over. If a kid is determined to go on a site because their peers are on it, or for whatever reason, then they're going to indicate that they're over 13. I think it's just ludicrous to even think that it could give anybody any type of comfort.

I'm wondering if you could talk a little bit about the other countries. You talked slightly about the U.S. and some of the safeguards they have in place. Are there other things in other countries we could look at?

As well, do you have any examples of social networks where children are specifically targeted and perhaps being used for commercial gain? And do you think kids themselves are concerned about their privacy rights?

Noon

Assistant Professor, Faculty of Information, University of Toronto

Dr. Sara Grimes

There have been some developments and some recommendations in the EU. I'm not up to date enough on that to know where they are in that process. They did a huge study in the EU, which ended recently. Academics and a number of government agencies studied these types of issues with kids of various ages online in something called the EU Kids Online project. After the reports came out, I know that discussions started about industry guidelines and implementing new guidelines and implementing potential regulations. Where they are in that process, I'm not entirely sure, but that would be one place to look. I know they have been considering it, and they've also grounded a lot of what they've been doing in research, which is great.

In terms of examples of children being specifically targeted and used for commercial gain, one of the big problems of studying this area and these processes is that there's a lack of transparency. Data is collected and you can read the terms of service and you kind of see the data coming out in different places, but it's not always clear what the links are and how data is being transferred and how it's being used. The examples I've looked at to see how this process can work tend to be sites that actually sell the data to other companies and that are quite open about selling the data to other companies. They function as a social media space, but they also do data-mining and data-brokering in-house.

An example from a few years ago was that of Neopets, which is an online community for kids. They sold the market research they had done to various different companies and had an annual report in AdAge, which is a big advertising industry trade publication in the U.S. They would include surveys and pretty easy-to-identify market research strategies within the site.

A more recent example is Habbo Hotel, which is based out of Finland but is popular all around the world. Most of the people who use it are between the ages, I think, of 13 and 18, but they do have a significant number of users who are 11 to 13, as well. They offer a similar type of service, called Habbel, through which they package data and sell it to other companies. Through that service, companies can also hire them in advance to sort of spy on conversations that kids might be having about a particular product, and tell them not just what the kids are saying about the product but the larger context within which that conversation emerges—what kinds of likes those kids have, what areas of the site they are gravitating towards, what other things they talk about, what time of day they are there, where they plan to go after if they plan to meet up in real life, because a lot of kids who meet and communicate in social media actually do know each other in real life and go to the same schools and that kind of thing. It can be very detailed information.

The only reason we know how detailed they are and we know about these kinds of processes is that they're openly selling the data. But in many cases they're not selling the data. They're keeping it or they're selling it through more covert means, so it's not as obvious what's happening to it.

Are kids concerned about privacy? Definitely. There's been a lot of talk about the different concepts of privacy that kids have. I think this comes back to Mr. Andrews' comment earlier about kids being born in this age of Facebook, and not knowing any different type of environment and having pictures of themselves online before they're even old enough to go online themselves.

They may have slightly different concepts of privacy, but a lot of them are very similar to traditional concepts of privacy. In study after study, what comes out the most is that they're most concerned with privacy infringements that impact them on an immediate level: friends infringing on their privacy, parents infringing on their privacy, or perceiving that their parent is infringing on their privacy. These more abstract forms are at a distance; they don't seem to impact them on that day-to-day basis. They are dealing with these privacy issues in ways that we have yet to fully appreciate. They might not seem as concerned about these things, but oftentimes they just don't really understand how they're going to impact them and where. Frankly, because so many of us also don't understand how those types of privacy infringements are impacting us and where, we're worried about what might happen, but we're not completely seeing the consequences yet. It's more difficult to find out how they feel about that.

There is a new study of Canadian children and youth that has come out just recently and has explored these issues. Increasingly, kids are even able to articulate these concerns about abstract privacy infringement, which I think is a really important development. They're learning about it more, they're experiencing it more, and they're able to communicate more about how it makes them feel and whether they feel their rights are infringed.

The sad thing is that I'm not sure if they feel there's an escape, a solution, or an alternative. There certainly isn't one being presented to them right now.

12:05 p.m.

NDP

The Chair NDP Pierre-Luc Dusseault

Your time is up, because it includes the question and the answer.

Ms. Borg, you have five minutes.

12:05 p.m.

NDP

Charmaine Borg NDP Terrebonne—Blainville, QC

My thanks to the witnesses for being here today.

My first question goes to Mr. Israel. According to what I have read, when the commissioner makes recommendations about a privacy protection policy, some companies completely change their platforms so that the recommendations become redundant.

Could you comment on that? Could that justify the argument that the commissioner needs more powers to impose financial penalties?

12:05 p.m.

Staff Lawyer, Canadian Internet Policy and Public Interest Clinic

Tamir Israel

Thank you. It's a very good question.

I think that in many contexts we do get a good level of compliance from industry, but the problem is that sometimes in the social networking context and the Internet context, some of the mechanisms that it takes to comply with the Privacy Commissioner's recommendation take a while to implement—to develop and to put in place. We've seen this in the United States with the Federal Trade Commission in a number of the privacy complaints they've looked at. We've seen it in Canada a little bit.

The problem is that the mechanism we have in place under PIPEDA is not very well suited for the Privacy Commissioner to have ongoing control of that issue. Forty-five days after they issue their recommendation, they're faced with a decision on whether to take the issue to the Federal Court—to start from scratch and to do it in the context of a trial, which is not a very flexible context to be in when you're trying to do privacy governance—or to enter into really undefined arrangements.

In one case we had, it was basically almost a contractual arrangement that was entered into with the party. In the United States you're seeing similar things, where it's a settlement agreement between the Federal Trade Commission and companies to do certain things over a certain number of years. But there is not necessarily a very clear enforcement mechanism or process in place to deal with those types of compliance processes.