Evidence of meeting #46 for Access to Information, Privacy and Ethics in the 41st Parliament, 1st Session. (The original version is on Parliament’s site, as are the minutes.)

A recording is available from Parliament.

Also speaking

  • Sara Grimes  Assistant Professor, Faculty of Information, University of Toronto
  • Tamir Israel  Staff Lawyer, Canadian Internet Policy and Public Interest Clinic
  • Adam Kardash  Managing Director and Head of AccessPrivacy, Heenan Blaikie

11 a.m.

NDP

The Chair Pierre-Luc Dusseault

It is 11:00 a.m., so let's start the meeting right away.

My thanks to Mr. Kardash, from Heenan Blaikie, and Ms. Grimes, from the University of Toronto, for being with us today. We are waiting for two other witnesses who were supposed to be here at 11:00 a.m. Our information is that they are on their way.

We will start with the presentations, for which you each have 10 minutes. Then there will be a question and answer period.

So let's start right away with Ms. Grimes.

11 a.m.

Dr. Sara Grimes Assistant Professor, Faculty of Information, University of Toronto

Thank you for this opportunity to speak with you today.

Over the past several years my research has focused on some of the social media sites most popular among children, from online communities like Neopets to virtual worlds like Club Penguin. These types of sites don't look very much like Facebook, but they nonetheless do allow for many of the same types of social interactions and activities we identify as characteristic of social media.

Privacy issues are of enormous relevance within these environments. The research shows that since the very early days of the World Wide Web, kids' privacy rights have been infringed upon for commercial purposes within certain online social forums. This happens with much greater frequency than most of the other risks associated with kids online. It's also something that in other countries has led directly to the establishment of child-specific privacy legislation. The key example here is the U.S. Children's Online Privacy Protection Act, or COPPA, which was initially created in response to the then growing practice of soliciting names and addresses from children in order to direct-market to them.

Today the type of data collected from kids and the purposes for which it's used have both expanded significantly. The article that was circulated to you in advance of my appearance here today describes this shift in detail, explaining industry trends toward data mining, in which children's conversations, behaviours, and ideas can become fodder for market research and product development.

In my work in this area, I have observed that within social media forums, when children are considered at all, concern for their rights often plays second fiddle to narrowly defined notions of risk. Children are still most often seen either as potential victims or, conversely, as potential criminals in the online environment. As such, the emphasis is placed on protecting them from strangers, from each other, and from themselves, rather than supporting and empowering them as citizens.

This tendency has greatly impacted the way in which social media companies address child users. The first and most common response has been to simply ban children under the age of 13 from participating in social media sites. This was the strategy found until very recently on Facebook, and it remains common throughout other popular social media as well. Although some children may, and often do, bypass these bans—by lying about their age, for instance—a formalized age restriction still has deep impacts on how and where children use social media. It also serves as a way of deflecting some of the public and regulatory scrutiny that can be associated with sites that do openly allow children or invite children to participate.

While in some cases age restrictions may very well be appropriate—there are many sites where they would be—in others, the no-children-allowed approach has more to do with wanting to avoid the risks and complications that kids might bring than with the actual content or activities that unfold there. As a result, younger children are frequently banned from participating fully and inclusively in online culture, and from reaping many of the benefits and opportunities that social media presents, simply because it has been deemed too much work, too expensive, or simply too risky to accommodate them.

Another increasingly common response is the creation of tightly controlled child-specific social media, found in social networking sites, virtual worlds, and online communities designed and targeted specifically to children, usually under the age of 13. In my research I've found that in many of these cases the emphasis on risk has put privacy front and centre. Privacy concerns integrated at the level of design are quite apparent. They surface in legal documents such as privacy policies and terms of use, and they appear in the marketing of the sites themselves.

However, a number of areas are in dire need of improvement. As mentioned, there is continued evidence that children's online interactions are being surveilled and data-mined, most often without the full knowledge or consent of the kids involved, or that of their parents and guardians. Kids are regularly asked to agree to these kinds of activities through the privacy policies and terms of use they must accept in order to participate, yet even on sites designed and targeted to younger children, these documents are long and extremely complex. They describe a wide variety of data collection activities and include a number of terms that are inappropriate, and even inapplicable, for children to be asked to agree to.

This raises important questions about informed consent, an issue that's particularly pressing when the users consist of young children with widely varying literacy levels and emerging capacities for understanding complex legal relationships. Best practices would include providing a child-friendly version of both of these documents to ensure that children and their parents know exactly what they're agreeing to. While there are definitely some really great examples of this practice out there, overall very few sites for kids bother to do it. When they do, the child-friendly versions are rarely comprehensive: most don't explain the full reasons for user data collection or only describe items that present the social media company in a positive light.

The misrepresentation of children's privacy as a matter of online safety is also becoming an increasingly prevalent trend. Now, don't get me wrong here. A broader consideration of how rules and design features aimed at protecting children's privacy rights might also offer protection from online predators and bullies has some very real benefits for children's safety and for their enjoyment of social media. But so far, in many cases this dual function has been realized in ways that work primarily to obscure the underlying commercial practices that privacy policies are actually meant to address. By reframing children's privacy as predominantly a matter of online safety—which is, in these cases, defined as safe from other users—the more mundane and less obviously risky threats to children's privacy, such as corporate surveillance and invasive market research, are sidelined.

A related emerging trend is to commercialize the safety features themselves, as I discovered in a recent study of kids' virtual worlds. Some kids' virtual worlds come with a “safe chat” version, where chat between users is limited to selecting preconstructed sentences from a drop-down menu. In one case, the “safe chat” version limited kids' options to a mere 323 different phrases, 45 of which were cross-promotional and 30 of which promoted third-party ads. As you might have guessed, none of these phrases were in the least bit negative. Kids could chat about how much they loved the brand but were prohibited, by design, from saying anything critical about it.

Among the many potentially negative effects this can have on children is the impact on their rights. These examples reveal that an unfortunate trade-off is taking place, as limited approaches to children's privacy and safety can place undue restrictions on their other rights, such as the right to freedom of expression or the right to participate freely in cultural life.

Now, it's important to note that what I've described here are general trends, mostly found in commercial social media sites that are considered to be popular among children. Not all social media companies follow these practices. And there are, in fact, a number of Canadian companies that have come up with some pretty brilliant alternative strategies for balancing kids' privacy, safety, self-expression, and cultural participation. There is potential for real leadership here, but there's currently a lack of the kind of regulatory and government support that would be necessary for these types of individual, small-scale, ethical, rights-based approaches to develop into widespread industry practice.

In the time I have left, I'd like to outline four key take-aways or recommendations.

First, there is a clear and growing need for child-specific regulation on the collection, management, and use of children's data. In so doing, however, we'll need to avoid making the same mistakes that have plagued certain previous attempts, such as COPPA, in the U.S., which resulted in kids losing access to certain very important social spaces and/or widespread lying about their ages. We'll also need to expand this regulation in ways that better reflect current and emerging online data collection practices.

Second, we need a much clearer articulation of the ethics of informed consent where children of various ages are involved.

Third, we need to strive for a better balance between children's privacy rights and other rights, such as freedom of expression and the right to participate in cultural life, both within our discussions of these issues and within regulations, either amended or new.

Last, we need to establish clearer leadership and stronger enforcement of these child-specific rules, which would include acknowledging and supporting the innovative, ethical, rights-based examples that certain independent and small Canadian social media companies are already working to build.

I look forward to discussing these issues further with you during the question period.

Thank you.

11:10 a.m.

NDP

The Chair Pierre-Luc Dusseault

Thank you very much.

I now give the floor to Mr. Israel.

You have 10 minutes.

11:10 a.m.

Tamir Israel Staff Lawyer, Canadian Internet Policy and Public Interest Clinic

Good morning.

My name is Tamir Israel, and I am a staff lawyer with the Samuelson-Glushco Canadian Internet Policy and Public Interest Clinic, CIPPIC for short. CIPPIC is grateful for this opportunity to present its views on the privacy implications of social media to the committee.

CIPPIC is a legal clinic based at the University of Ottawa Centre for Law, Technology and Society. We advocate in the public interest on issues arising at the intersection of law and technology.

Since its inception, CIPPIC has taken an active part in legal and policy debates about online privacy, both domestically and internationally. Our clinic filed a complaint that led to the first comprehensive investigation of international social networks' privacy practices.

11:10 a.m.

NDP

The Chair Pierre-Luc Dusseault

Just a moment, Mr. Israel. Could you read a little more slowly, please, so that the interpreters can do their job properly?

11:10 a.m.

Tamir Israel Staff Lawyer, Canadian Internet Policy and Public Interest Clinic

I'll talk a little slower.

The growing importance and benefits of social media to Canadians cannot be overstated. These are far-reaching and permeate every aspect of our individual, social, and political lives. The innovative and commercial growth of such networks should not be unduly restricted. At the same time, Canadians should not be forced to choose between their privacy rights and their right to participate in this new interactive world.

PIPEDA, which forms the backbone of privacy regulation in Canada, provides a flexible set of principles that cater to the legitimate needs of businesses while providing safeguards for user privacy. While PIPEDA has largely withstood the test of time, the privacy landscape has changed substantially since its enactment, and a decade of experience has exposed a number of shortcomings that should be addressed if the statute is to continue to meet its objectives.

I will quickly say a few words about the shifting privacy landscape and proceed to elaborate on four areas that I think need immediate attention.

In recent testimony before this committee, Professor Valerie Steeves pointed to research indicating growing lack of trust in online companies. A survey conducted for Natural Resources Canada in late 2009 similarly found that respondents' level of trust in different types of organizations to keep their personal information secure is moderate to low. The least trusted were small private sector businesses and social networking sites.

The study similarly found that the ability to control the context in which information is shared increased levels of trust. In another study conducted by researchers at Annenberg and Berkeley, 67% of Americans agreed or strongly agreed that users have lost all control over how personal information is collected and used by companies.

Feeding this sense of lost control is an increasingly complex ecosystem where the scope and nature of data collected increases daily, even as the sophistication of information collection and analysis mechanisms keeps pace. While Google and Facebook have been at the forefront of debates on these issues, numerous other companies are involved. Acxiom, a data broker based in Arkansas, has reportedly collected an average of 1,500 data points on each of its 500 million active user profiles.

Few of these users have heard of Acxiom, let alone had any direct interaction with the company. Yet the profiles, which data brokers such as Acxiom sell, are populated with their browsing habits; the Facebook discussions they have with their friends and family; their sensitive medical and financial information; their ethnic, religious, and political alignments; and even real-world locations visited. All this data is collected, analyzed, and refined into a sophisticated socio-economic categorization scheme, which Acxiom's customers use as the basis of decision-making.

The sheer complexity of the ecosystem that fuels databases such as Acxiom's defies any attempt to articulate it within the confines of a privacy policy. A number of jurisdictions are looking at ways of addressing the need for greater transparency and choice. I'll point out as well that the nature of the data being collected in this ecosystem is also increasing in sensitivity: newly emerging capacities aim to incorporate real-time location and even emotional state into the categories of information available for targeting. I will focus on four changes that I think are relevant specifically to PIPEDA. The first is transparency.

Greater transparency is needed. To this end, the United States Federal Trade Commission has recently stated it will push data brokers to provide centralized online mechanisms that will help users discover which data brokers have collected their data. This can serve as the basis for the exercise of other user rights.

Informing users can be achieved in a number of contexts through greater integration of notification into the service itself. This not only allows for greater flexibility and nuance in notification, but also increases privacy salience by reminding users in context of the privacy decisions they are making. In addition, elements of privacy policies can be standardized, but care must be taken not to oversimplify data practices that are in reality complex. The dangers of oversimplification are that organizations will begin to rely on blanket and categorical consent, which are simple but do not provide customers or advocacy groups the details they need to properly assess their practices.

Another area I'd like to touch on is privacy by default, and its counterpart, privacy by effort.

Transparency alone is not enough to protect privacy in this interconnected age. In a recent consultation process on online privacy, it was noted that many online services are public by default and private by effort. New users will rarely know how to configure the complex web of often conflicting privacy controls offered when they first sign on. Settings constantly shift and change as new ones are introduced, old ones are replaced, or new features are added to existing services. Simply maintaining a constant level of privacy is a never-ending effort.

Compounding such efforts is a tendency for social networking sites to make occasional tectonic shifts in the constitution and nature of their services. These are often imposed on ingrained users as “take it or leave it” propositions. At other times, pre-selected defaults are used to nudge users in directions that are very different from the service they have grown accustomed to.

As you've heard from other experts, the devil is indeed in the defaults. Stronger protections are needed to ensure new services and settings are introduced with privacy-friendly defaults that reflect the expectations of users and the sensitivity of the data in question, not whatever configuration is best fitted to the service provider's business model.

Under PIPEDA, the form of consent should already be tailored to user expectations and the sensitivity of the data that might be affected. However, in order to firmly ingrain this concept in service design, privacy by default should be explicitly adopted as a principle under PIPEDA.

Another area I want to touch on briefly is enforcement and process.

The committee has heard from a number of parties about the importance of ensuring that the Office of the Privacy Commissioner can enforce its recommendations. Adding bite to PIPEDA is critical for a number of reasons. First, it is necessary in order to provide incentives for compliance. Currently there are very few penalties for non-compliance. In most cases the most an organization can expect is the threat of being publicly shamed for non-compliance. Second, having these powers in place will assist the Office of the Privacy Commissioner in its interactions with large multinational organizations so it can carry out its mandate in protecting the privacy of Canadians.

In addition to adding penalties, procedural changes to the OPC's investigative and compliance framework should be explored. Compliance with OPC recommendations in a social networking context may be a long and complicated road, requiring changes to system design. However, under PIPEDA the OPC's legal mandate to exercise its powers over a particular complaint ends 45 days following the issuance of an official finding. The mechanism lacks the flexibility necessary to ensure Privacy Commissioner recommendations are carried out adequately.

Finally, I'll touch briefly on breach notification requirements.

Canada is in dire need of a breach notification obligation. Such an obligation will improve incentives to build stronger technical safeguards and provide users with opportunities to redress harm, such as identity theft and the potential humiliation that may result from a breach of their data.

Bill C-12, which is currently in first reading, provides a workable framework for breach notification, but it requires fixes and a commitment to introduce penalties for non-compliance if it is to be effective.

I would be happy to elaborate further on any of these points. CIPPIC plans to file a more detailed brief with the committee at a later point.

Thank you very much for your time and attention.

11:20 a.m.

NDP

The Chair Pierre-Luc Dusseault

Thank you for providing us with your presentation.

We now move to Mr. Kardash, also for 10 minutes.

11:20 a.m.

Adam Kardash Managing Director and Head of AccessPrivacy, Heenan Blaikie

Good morning, Mr. Chair and honourable members. Thank you for the opportunity to speak with you today.

My name is Adam Kardash. I am a partner at the national law firm of Heenan Blaikie, and chair of the firm's national privacy and information management practice. I am also managing director and head of AccessPrivacy, a Heenan Blaikie consulting and information service focusing on privacy and information-related matters.

I appear before this committee in a personal capacity, representing only my own views. However, my views are based upon my experience at Heenan Blaikie and AccessPrivacy.

Over the past ten years I have focused almost exclusively on advising private sector organizations on privacy and information management matters. I regularly consider the privacy law implications of new technologies and platforms.

In my opening remarks I will offer a number of comments that centre on a single theme; namely, that our federal private sector privacy law, the Personal Information Protection and Electronic Documents Act, or PIPEDA, works very well. Since coming into force in 2001, and despite all sorts of criticism from a range of stakeholders across the Canadian privacy arena when first introduced, the statute has stood the test of time. In my view, PIPEDA has worked and continues to work particularly well in addressing privacy challenges raised by new technologies.

The act sets out a comprehensive set of requirements that regulates an organization's collection, use, disclosure, storage, and management of personal information. One of the reasons the statute remains effective today is because it was drafted in a technologically neutral fashion. PIPEDA's core rules are mainly set out in plain language as broad principles, and therefore can be applied to any new technology, new application, or new system that involves the processing of personal information, including social media platforms.

It is precisely because PIPEDA does not focus on any particular type of technology that it is so well suited to addressing seemingly novel privacy issues that may be raised by new technological developments. In this regard, it is important that PIPEDA remains drafted in a technologically neutral manner. Given the increasingly rapid pace of technological innovation, any statute that is drafted focusing on a certain technology or platform, whether social media or otherwise, will be obsolete, out of date, by the time it comes into force.

In my experience, technology-based issues, privacy or otherwise, are most effectively addressed through self-regulatory frameworks that work in concert with the statutory regime. Compared to statutes or regulations, self-regulatory frameworks are far easier to develop, implement, supplement, or revise in order to remain current with changing technological developments.

Notably, under PIPEDA, a self-regulatory framework developed by way of a meaningful consultation process would have legal value under the statute. Self-regulatory frameworks establish industry standards, and well-developed industry standards inform the meaning of PIPEDA's overarching reasonable person test. This is in subsection 5(3) of the act, which provides that organizations may only collect, use, or disclose personal information for a purpose that a reasonable person would consider appropriate in the circumstances.

When advising clients, as a matter of practice I do not refer to PIPEDA as merely a set of legal rules. Rather, the statute sets out a useful framework for organizations to proactively address privacy concerns in a manner that balances individual privacy with the collection, use, and disclosure of personal information in the course of legitimate business activities. PIPEDA's rules are dynamic, in that they apply to the entire life cycle of data, from the collection or creation to the ultimate destruction of personal information held by an organization.

All of these rules fall under the principal feature of PIPEDA: the accountability principle. The accountability principle is a simply worded but very powerful requirement. It provides that organizations are responsible for personal information in their possession or control.

Notably, PIPEDA's accountability model is now being referred to around the world, by foreign data protection authorities, foreign governmental bodies, and global privacy think tanks, as the enlightened statutory model for the protection of personal information. PIPEDA's framework, in large part due to its accountability model, is specifically cited in these international fora as being well positioned to appropriately address the privacy concerns that may arise in the online sector, and otherwise in the technological context.

There are a number of published letters of findings from the Office of the Privacy Commissioner of Canada that clearly demonstrate the OPC's effectiveness, under PIPEDA's existing framework, in considering and appropriately resolving emerging privacy issues raised by new technologies. They include several letters of findings issued in the social media context.

One of the central and in my view critical features of PIPEDA is the ombudsman model incorporated into the act. The Privacy Commissioner is vested with the role of ombudsman in carrying out her duty to oversee the personal information practices of organizations subject to PIPEDA, with recourse to the Federal Court where issues remain unresolved.

The ombudsman model is hardly new. It is typically employed by governments to regulate public administration. But PIPEDA applies the ombudsman model, in a novel fashion, as a means of regulating private sector activity. In my experience dealing with the OPC when advising clients across all sectors, the OPC's ombudsman model has proven over time to be very effective and generally well received by private sector organizations.

An ombudsman model is particularly well suited to facilitating effective privacy compliance, since meaningful privacy protection is not just about an organization satisfying legal rules. Rather, privacy interests are addressed meaningfully when a privacy mindset is fostered within an organization in a manner that's tailored to the reality of an organization's business context. Experienced chief privacy officers understand that privacy is about enhancing trust. And building trust requires engaged discussion with stakeholders within an organization, within industry sectors, and across the privacy arena. The OPC plays an important part in this discussion, and the ombudsman model facilitates flexible and collaborative interaction with private sector organizations.

Commissioner Jennifer Stoddart eloquently described the nature of her role as ombudsman in a 2005 speech in which she considered the merits of the ombudsman model. She stated:

It must be underscored that the Ombuds-role is not simply remedial, but transformative in nature. The aim is the resolution of individual complaints, but it is also the development of a lasting culture of privacy sensitivity among parties through their willing and active involvement in the process itself. In order to achieve these twin goals, the process must necessarily be flexible, participative and individuated in its approach.

Recently there have been calls from various stakeholders in the Canadian privacy arena, including from Commissioner Stoddart, for PIPEDA to be amended to provide the OPC with greater enforcement powers. Based on my experience in the privacy arena over the last ten years, it is not clear that any such amendments are necessary.

To their credit, Commissioner Stoddart and the more recently appointed assistant commissioner, Chantal Bernier, have been remarkably successful in carrying out their mandate in the ombudsman model context. They have done so with an arsenal of several powers under PIPEDA. In particular, they have the power to publicly name organizations that are in breach of PIPEDA, the power to self-initiate investigations or audits of an organization's personal information practices, and, as I noted, the power to refer complaints to the Federal Court.

The OPC has been highly respected in the international privacy arena for years, but it enhanced its reputation considerably among foreign data protection authorities as a result of its highly publicized investigation of Facebook's personal information practices. As a direct result of the OPC's enforcement activities, Canada is now regarded as one of the leading jurisdictions globally in exploring privacy issues associated with new technologies, including in the social media context. The OPC's achievements in this regard have been accomplished without order-making power or other enforcement mechanisms, such as the ability to levy fines. Notably, Commissioner Stoddart has made public statements to the effect that the mere public threat by the OPC of potential Federal Court action against a given organization has almost always resulted in the organization satisfying the OPC's concerns.

Innovative new technologies, such as social media platforms, offer Canadians tremendous value. As we continue to engage with and take advantage of new technologies, and we all provide our personal information in the course of doing so, privacy will continue to play an increasingly integral part of private sector organizations' trust relationship with individuals.

As we consider emerging privacy issues, it is of course important to reflect upon whether the existing privacy regulatory framework serves to ensure that individual privacy is appropriately addressed. With PIPEDA, we're fortunate: we have a technologically neutral, principle-based statutory framework that has served us exceedingly well in ensuring the protection of privacy in a balanced fashion.

As the committee continues its study, I respectfully offer the following concluding suggestions for its consideration of whether, and the extent to which, PIPEDA needs to be amended to address challenges posed by new technologies, in particular amendments that would provide enhanced enforcement powers.

First, as individuals we all have a responsibility to be careful with how we use our personal information in public contexts. Public outreach and regular training and awareness by privacy regulatory authorities and relevant private sector organizations are critical in this regard. No amendments to PIPEDA would be required to enhance our collective efforts in this fashion.

Second, I respectfully submit that the committee carefully consider the costs of moving to an enforcement model under PIPEDA. To accommodate new enforcement powers such as order-making power, structural changes to the OPC will be required, and key benefits afforded by the ombudsman model will be lost.

Third, as part of a national strategy to ensure growth of our domestic technology sector, we need to ensure that any legislative change or initiative be carefully considered in a manner that ensures we don't impose unnecessary impediments to legitimate business activity. In short, in my view, the economic costs of privacy regulatory change need to be carefully considered. We need a regulatory framework that fosters innovation. In the privacy arena, PIPEDA provides us now with an appropriate model that has served us well in this regard.

Finally, the constitutional impact of any legislative change to PIPEDA, in particular with respect to new enforcement powers, needs to be carefully reflected upon. The recent Supreme Court of Canada decision in the securities reference, a case that considered the constitutionality of a national securities administrator, serves as an important reminder that constitutional considerations need to be a part of any study of privacy legislative reform.

Thank you again for the opportunity to speak with you this morning. I would be pleased to respond to any questions from the committee.

11:30 a.m.

NDP

The Chair Pierre-Luc Dusseault

Thank you.

Now it's time for questions and comments.

Mr. Angus, you have seven minutes.

11:30 a.m.

NDP

Charlie Angus Timmins—James Bay, ON

This is another fascinating day of discussion.

I would like to start with you, Madam Grimes, because I think what you're suggesting seems to be somewhat counterintuitive to the message we've been given, which is that we have to limit young people on social media, that we need to have these limits for 13-year-olds. I don't know any kid under 13 who isn't on Facebook or on social media. They're not allowed to do it at school. They're not allowed to go to YouTube at school. So they're supposed to be staying off social media, but we'll create these little walled gardens for them to protect them.

These walled gardens are run by corporate interests that, you're telling us, mine their data and sell their data and are basically using this as a bit of a commercial predatory space. Would it be better to establish some clear rules limiting companies' ability to go after this kind of right, the privacy of children's information? Should we actually move young people onto general social media with some better rules? Would that be a better solution than these walled gardens that are being set up now?

11:30 a.m.

Assistant Professor, Faculty of Information, University of Toronto

Dr. Sara Grimes

Yes, I definitely think so. In pointing out the two trends, I meant to show that both models have resulted in some pretty clear infringements of kids' rights and some huge problems. Banning kids from Facebook hasn't kept kids off Facebook, as you've said, and creating the walled gardens has created a false sense of security and safety for parents and for children who are seeking those types of alternatives, while a lot of other processes go on unchecked.

Again, it's not that every social media site designed specifically for kids is doing this to the same extent, but this trend has been spreading and deepening as time goes on. So yes, I in no way meant to suggest that banning kids or creating walled gardens was the ideal, but these are the things that have happened over the past ten years.

This is the state of affairs: neither model works. So now I definitely think we need to start looking at alternatives, at the possibility of creating a better framework that would give different social media companies a guideline and baseline to work from. It would be based not purely on reacting to public outcries and parental concerns about risk, but on something broader, a more democratic sensibility about rights in general. It would weigh all the different benefits that kids can get from participating in social media, along with the risks. I think a lot of companies would really benefit from having those kinds of guidelines and frameworks in place.

11:35 a.m.

NDP

Charlie Angus Timmins—James Bay, ON

Thank you.

Mr. Kardash, I'm hearing two very different views, one from Mr. Israel and one from you. He says we need breach notification, we need compliance orders, administrative monetary penalties. You tell us that the market works best when it's left to do what it wants to do, and Commissioner Stoddart is perfectly happy with the state of affairs.

You don't believe there should be better compliance rules?

11:35 a.m.

Managing Director and Head of AccessPrivacy, Heenan Blaikie

Adam Kardash

As I mentioned in my remarks, PIPEDA currently establishes, in my view, a very effective model for the protection of privacy, one that balances the interests of both individuals and businesses in the collection and use of data in the course of legitimate business activities.

There is currently a series of powers that the commissioner does have, which, on the evidence of the last several years, have been remarkably successful, to their credit, in addressing these very issues.

11:35 a.m.

NDP

Charlie Angus Timmins—James Bay, ON

What powers? She said her powers aren't sufficient.

11:35 a.m.

Managing Director and Head of AccessPrivacy, Heenan Blaikie

Adam Kardash

In my view, the current model and powers are entirely sufficient.