Evidence of meeting #100 for Access to Information, Privacy and Ethics in the 42nd Parliament, 1st Session. (The original version is on Parliament’s site, as are the minutes.)

A video is available from Parliament.

Also speaking

Kevin Chan  Global Director and Head of Public Policy, Facebook Canada, Facebook Inc.
Robert Sherman  Deputy Chief Privacy Officer, Facebook Inc.

8:50 a.m.

Conservative

The Chair Conservative Bob Zimmer

Thank you, everybody, for coming today to meeting number 100 of the Standing Committee on Access to Information, Privacy and Ethics. Pursuant to Standing Order 108(3)(h)(vii), we are studying the breach of personal information involving Cambridge Analytica and Facebook.

With us today are Kevin Chan, global director and head of public policy, Facebook Canada, and, via teleconference from California, Robert Sherman, deputy chief privacy officer.

I would like to start with Mr. Chan. We, and I as chair, were disappointed that Mr. Zuckerberg declined our request. We don't take that lightly, but we appreciate your being here today.

Mr. Chan.

8:50 a.m.

Kevin Chan Global Director and Head of Public Policy, Facebook Canada, Facebook Inc.

Thank you very much.

Again, yes, our CEO does apologize that he could not be here today in person with the committee. I am here with my colleague Rob Sherman on his behalf. Thank you for that note, sir.

Mr. Chair and members of the Standing Committee on Access to Information, Privacy and Ethics, thank you for the invitation to appear before you today. My name is Kevin Chan, and I am the head of public policy for Facebook Canada. I am joined via video conference by my colleague Rob Sherman, Facebook's deputy chief privacy officer.

Before I start, I want to acknowledge our offer earlier this week to pre-brief committee members on the Cambridge Analytica situation. Over the past few weeks, we have made a large volume of announcements for which we have done pre-briefs to U.S. lawmakers prior to last week's congressional hearings. We want to extend that same offer as a courtesy to members of this committee. I regret that our intentions may have been unclear.

I want to begin by sharing that while we do not yet have all the facts surrounding the situation with Cambridge Analytica, what is alleged to have occurred is a huge breach of our users' trust, and for that we are very sorry.

Given the scale of our service, with more than 23 million Canadians using Facebook every month—and more than 2 billion people globally—we recognize the role we play in people's lives and the need to take greater responsibility for that.

It goes without saying that the events of recent weeks involving the protection of personal data are of concern to us all. With hindsight, it is clear that Facebook had not invested enough in the security of our platform, and, for that, we are responsible. We have a duty of extreme vigilance, and we are going to do everything we can to make the required corrections in order to regain the trust of those who use the platform.

The events of the last few weeks have taught us some important lessons. Trust in our service is at the core of what we do at Facebook. As our CEO Mark Zuckerberg recently said, “We have a responsibility to protect your data, and if we can't then we don't deserve to serve you.”

As Facebook has grown, people have gotten powerful tools to stay connected to those they care about, make their voices heard, and build communities and businesses, but it's clear now that we didn't do enough to prevent these tools from being used for harm as well. We didn't take a broad enough view of our responsibility, and that was a mistake.

In Canada and around the world, we know we have a lot of work to do, and this is just the beginning. We are of course also fully co-operating with the Office of the Privacy Commissioner of Canada as it conducts its investigation into the matter.

I would like to turn now to my colleague Rob Sherman, who can take you through some of the facts as we know them today and the actions we are taking to prevent abuse from happening on our platform going forward.

8:50 a.m.

Robert Sherman Deputy Chief Privacy Officer, Facebook Inc.

Thank you, Kevin.

Thank you to the committee for having me here today.

As Kevin mentioned, I'm Facebook's deputy chief privacy officer. I want to apologize for not being able to join today's committee hearing in person. I'm hosting a summit today in California with many leading privacy experts, a summit that had been scheduled for some time. I appreciate the committee's attention to this important matter, and we appreciate the opportunity to provide information to support your study.

I'd like to spend just a few minutes on the specifics of this situation and what we're planning to do going forward.

In 2015 we learned from a report in The Guardian that a Cambridge University researcher named Aleksandr Kogan had shared data from a quiz app that he operated on the Facebook platform, This Is Your Digital Life, with Cambridge Analytica. It is against our policies for developers to share data without people's consent, so we immediately banned Dr. Kogan's app from our platform and demanded that Dr. Kogan and certain other entities he had relationships with, including Cambridge Analytica, delete any information they had received.

Several weeks ago we saw press reports alleging that some of this information may not have been deleted as Dr. Kogan, Cambridge Analytica, and others had certified. Based on our own data, we estimated a total of 305,000 people around the world had installed the app This Is Your Digital Life and that an additional 86.3 million were friends of people who had installed that app and were therefore potentially affected by data sharing.

While the vast majority of these people were in the United States, we estimate that 272 people in Canada installed the app, potentially affecting 621,889 additional Canadians. This represents 0.7% of the people affected across the world.

We take each case with the utmost seriousness, and that is why we're informing people if there is even a possibility that they may have been affected.

We have a responsibility to make sure that what happened with Cambridge Analytica does not happen again, so we've undertaken a series of steps to increase the protections we're providing for people's information. Here are some of the steps.

First, we need to make sure that developers like Dr. Kogan who got access to a lot of information in the past cannot get access to as much information anymore. We already made changes to the Facebook platform in 2014 to dramatically restrict the amount of data that app developers can receive and to proactively review apps before they can use our platform. Because of these 2014 changes, a developer today would not have access to the same amount of data that Dr. Kogan was able to obtain.

However, there is more that we intend to do to limit the information developers can access and to put more safeguards in place to prevent abuse. For instance, we're removing developers' access to your data if you haven't used their app in three months. We're reducing the data you give an app, when you use the new version of Facebook login, to only your name, your profile photo, and your email address. That's a lot less than is available to developers on any other major app platform. If a developer wants to use Facebook login to obtain more information than this—for example, access people's posts or other private data—we'll require them to sign a separate contract with us that imposes strict requirements.

Second, we're in the process of investigating every app that had access to a large amount of data before we locked down our platform in 2014. If we detect suspicious activity, we'll do a full forensic audit. If we find that someone is improperly using data, we'll ban them and we'll tell everyone affected.

Finally, we're making it easier to understand which apps you've allowed to access your data. This past week we started showing everyone a list of the apps they've used and an easy way to revoke permissions they've granted to those apps in the past. This is something you can already do in your privacy settings, but we're putting it at the top of the news feed to be sure everyone sees it.

We've also announced proposed updates to our data policy and terms of service to provide more information about our data practices and the choices people have. We hope this will better enable people to make informed decisions about their privacy and to better understand how we use data across Facebook, Instagram, Messenger, and our other services.

I'd now like to turn it back to my colleague Kevin to talk a bit about what we are doing with respect to election integrity in Canada.

8:55 a.m.

Global Director and Head of Public Policy, Facebook Canada, Facebook Inc.

Kevin Chan

Thanks, Rob.

We recognize that the situation involving Cambridge Analytica raises broader questions about the use of Facebook and the integrity of elections. I would like to conclude with some comments on the subject, because we are working hard to do our part to protect the integrity of the federal election in 2019. We know that your leaders and your political parties continue to use Facebook as a key platform for citizen engagement, so it is important that the matter be taken seriously.

As you may know, the Communications Security Establishment published a report last year outlining various cyber-threats to the next federal election and identified two areas in which Facebook sees a role for itself: cybersecurity, meaning the hacking of the online accounts of candidates and political parties; and the spreading of misinformation online. In response, we launched our Canadian election integrity initiative last fall, which consists of five elements.

First, to address cybersecurity, we launched the Facebook “Cyber Hygiene Guide”, created specifically for Canadian politicians and political parties. It provides key information on how everyone who is administering a political figure or party's Facebook presence can help keep their accounts and pages secure. I have brought copies of the guide with me, Mr. Chair, and with your permission, later I will circulate them to members.

Second, we are offering cyber-hygiene training to all the federal political parties.

Third, we launched our cyber-threats email line for federal politicians and political parties. This email line is a direct pipe into our security team at Facebook and will help fast-track responses for compromised pages or accounts.

To address misinformation online, we've partnered with MediaSmarts, Canada's Centre for Digital and Media Literacy, on a two-year project to develop thinking, resources, and public service announcements on how to spot misinformation online. This new initiative, which we are calling “Reality Check”, will include lesson plans, interactive online missions, videos, and guides that reinforce the idea that verifying information is an essential life and citizenship skill.

We also launched our ads transparency test, called “View Ads”, here in Canada last November. This test, which is ongoing, allows anyone in Canada to view any and all Facebook ads, including ads for which you were not the intended audience. All advertisers on Facebook are subject to “View Ads”, and we recognize that it is an important part of our civic engagement efforts. Candidates running for office and organizations engaged in political advertising should be held accountable for what they say to citizens, and this feature gives people the chance to see all the things a candidate or organization is saying to everyone. This is a higher level of ad transparency than currently exists for any type of advertising, online or offline.

As we answer your questions, Rob and I hope that we can tell you more about our efforts to protect personal information and the integrity of elections. We recognize that, in the past, we have been too idealistic about the use of our technologies and have not concentrated sufficiently on preventing abuse on our platform. We are in the process of making major changes to the operation of our company in order to improve our approach in that regard.

Thank you again for the opportunity to appear before you today, and we would now be pleased to answer your questions.

8:55 a.m.

Conservative

The Chair Conservative Bob Zimmer

Thank you, Mr. Chan and Mr. Sherman.

First up, for seven minutes, is Mr. Erskine-Smith.

April 19th, 2018 / 9 a.m.

Liberal

Nathaniel Erskine-Smith Liberal Beaches—East York, ON

Thank you very much.

First, have you notified Canadian users, and if so, exactly how have you notified them?

9 a.m.

Deputy Chief Privacy Officer, Facebook Inc.

Robert Sherman

Thank you very much for the question.

We're in the process of notifying people in Canada and globally about the situation. The way we will let them know is through information at the top of their news feeds that will explain that they have access to information about which apps have received their information. If they are affected by Cambridge Analytica, they will be notified about that as well.

9 a.m.

Liberal

Nathaniel Erskine-Smith Liberal Beaches—East York, ON

If people have deleted their Facebook accounts, they wouldn't have any ability to be notified, in all likelihood.

If a company cared more about users than its share price, and it learned about a breach in 2016, wouldn't it have notified its users in 2016?

9 a.m.

Deputy Chief Privacy Officer, Facebook Inc.

Robert Sherman

I think it's important to note that the trust of people who use Facebook is paramount, and it's critical not only for our ethical obligations but for our business obligations as well, because we realize that if people don't feel that their information is protected on Facebook, they won't feel comfortable using our services. So while certainly information about—

9 a.m.

Liberal

Nathaniel Erskine-Smith Liberal Beaches—East York, ON

So why didn't you notify users in 2016?

9 a.m.

Deputy Chief Privacy Officer, Facebook Inc.

Robert Sherman

I think what our CEO Mark Zuckerberg has said is that, in retrospect, we should have done that. Going forward, if a situation like this occurs, then we will certainly do that.

9 a.m.

Liberal

Nathaniel Erskine-Smith Liberal Beaches—East York, ON

In the international context, 350,000 or so people consented to using an application and allowed the app developer to access 87 million user profiles. In Canada, if I have it right, 272 people accessed it, giving access to 620,000-plus Canadian user profiles. How is that in compliance with the existing law?

9 a.m.

Deputy Chief Privacy Officer, Facebook Inc.

Robert Sherman

I think those numbers are generally correct, but it's important to note that we have taken a conservative approach here. We don't have perfect information about exactly which information was transferred at which time. What we have aimed to do is err on the side of caution and notify more people rather than fewer people.

9 a.m.

Liberal

Nathaniel Erskine-Smith Liberal Beaches—East York, ON

I appreciate that in terms of the numbers, but in terms of our legislation here in Canada, PIPEDA, which requires consent—usually explicit consent, and in some cases implied consent—where was the consent of 620,000 users?

9 a.m.

Deputy Chief Privacy Officer, Facebook Inc.

Robert Sherman

The approach we took at the time...and as I mentioned in my opening statement, we've made significant changes to the platform since this information was available to restrict the information that app developers can receive.

For the 272 people who specifically authorized the app, there was a screen that popped up that would have notified them of what information the developer wanted to receive, and they would have clicked it to accept—

9 a.m.

Liberal

Nathaniel Erskine-Smith Liberal Beaches—East York, ON

They can't consent on behalf of other people, right?

9 a.m.

Deputy Chief Privacy Officer, Facebook Inc.

Robert Sherman

With regard to the—

9 a.m.

Liberal

Nathaniel Erskine-Smith Liberal Beaches—East York, ON

Sorry. You've made changes, and perhaps you're in compliance with the law now, but it seems pretty clear that you weren't in compliance with the law previously. Is that fair?

9 a.m.

Deputy Chief Privacy Officer, Facebook Inc.

Robert Sherman

With regard to the people who are friends of those who were using the app, our data policy and our disclosures at the time were very clear that this was how the platform worked. It's important to note that as our changes in 2014 reflect, we don't think that's the right way for a platform to operate, and it's not the way the platform operates today.

This is something that at the time and since, we've been in discussions with the Privacy Commissioner of Canada about. So while we think it's not the appropriate way for a platform to operate, we also want to make sure we're in compliance with all applicable laws.

9 a.m.

Liberal

Nathaniel Erskine-Smith Liberal Beaches—East York, ON

Not only is it not an appropriate way, but the way you previously designed the system is also contrary to our law.

Mr. Zuckerberg has noted that you're open to regulation. You've taken some additional steps. What regulations specifically do you think would fix the problems that you've experienced?

9 a.m.

Deputy Chief Privacy Officer, Facebook Inc.

Robert Sherman

There are a number of different steps that need to be taken, and the first one, as you pointed out, is that Facebook needs to take responsibility. We hope that we have, and we need to continue to do work to make sure that people's information is safe on our platform. That's something we've invested in and that we have a responsibility to do, over and above the law. As Kevin mentioned in his opening comments, we have a responsibility to take a broader view of what we should do.

From my conversations about privacy regulation in Canada and around the world, I think taking a principles-based approach that provides strong privacy protections to Canadians and to people everywhere is important. That's something that exists in PIPEDA today.

I know this committee is undertaking a study and has published a report with recommendations regarding PIPEDA, and there's a lot in that study that's worth considering, but I think PIPEDA's fundamental principles-based approach and giving the Privacy Commissioner broad authority and discretion in how to apply that to new technologies and new situations is an appropriate model.

9:05 a.m.

Liberal

Nathaniel Erskine-Smith Liberal Beaches—East York, ON

Though, interestingly, we had a principles-based approach previously, when Facebook disrespected those principles and failed to abide by our existing legislation.

In 2014 you made changes, but all of those app developers who have previously collected information still have that information. Can you give a sense to Canadians of exactly what detailed information that entails?

My understanding is that app developers would have had access to the education, work affiliation, personal relationships, friend lists, likes, location. What else?

9:05 a.m.

Deputy Chief Privacy Officer, Facebook Inc.

Robert Sherman

Obviously, the specific information that's affected depends on the specific app.

9:05 a.m.

Liberal

Nathaniel Erskine-Smith Liberal Beaches—East York, ON

What's the worst situation, the most personal information that would have been shared with app developers?

9:05 a.m.

Deputy Chief Privacy Officer, Facebook Inc.

Robert Sherman

App developers would have been able to receive information that people have shared on their profiles—things such as their likes, their city, where they live, and that kind of information.

We've made changes since then, but those were pieces of information that were shared under the privacy settings of the person affected. You would have had the ability to choose whether to share the information in the first place. You would have had the ability to choose whom to share it with, so you might have shared it with some friends but not others. And you would have had the ability to choose whether those friends could bring that information to apps.

As I mentioned, since then we've significantly restricted the amount of information that's available to apps.