Evidence of meeting #153 for Access to Information, Privacy and Ethics in the 42nd Parliament, 1st Session. (The original version is on Parliament's site, as are the minutes.)

A video is available from Parliament.

Also speaking

Ian Lucas  Member, Digital, Culture, Media and Sport Committee, United Kingdom House of Commons
Kevin Chan  Global Policy Director, Facebook Inc.
Neil Potts  Global Policy Director, Facebook Inc.
Derek Slater  Global Director, Information Policy, Google LLC
Carlos Monje  Director, Public Policy, Twitter Inc.
Damian Collins  Chair, Digital, Culture, Media and Sport Committee, United Kingdom House of Commons
Colin McKay  Head, Government Affairs and Public Policy, Google Canada
Edwin Tong  Senior Minister of State, Ministry of Law and Ministry of Health, Parliament of Singapore
Hildegarde Naughton  Chair, Joint Committee on Communications, Climate Action and Environment, Houses of the Oireachtas
Jens Zimmermann  Social Democratic Party, Parliament of the Federal Republic of Germany
Keit Pentus-Rosimannus  Vice-Chairwoman, Reform Party, Parliament of the Republic of Estonia (Riigikogu)
Mohammed Ouzzine  Deputy Speaker, Committee of Education and Culture and Communication, House of Representatives of the Kingdom of Morocco
Elizabeth Cabezas  President, National Assembly of the Republic of Ecuador
Andy Daniel  Speaker, House of Assembly of Saint Lucia
Jo Stevens  Member, Digital, Culture, Media and Sport Committee, United Kingdom House of Commons
James Lawless  Member, Joint Committee on Communications, Climate Action and Environment, Houses of the Oireachtas
Sun Xueling  Senior Parliamentary Secretary, Ministry of Home Affairs and Ministry of National Development, Parliament of Singapore
Michele Austin  Head, Government and Public Policy, Twitter Canada, Twitter Inc.

Noon

Director, Public Policy, Twitter Inc.

Carlos Monje

We are working to adjust our policies. We have things that are against our rules and we have things that aren't against our rules but that people don't like. We call it, internally, the gap. What we've been doing and what our teams have been doing is trying to break those issues down, portion by portion, and understand where the expectations of our users don't match the experience they're getting.

Our approach is, again, very consistent. We want people to feel comfortable, to feel safe to come online. We also don't want to squelch public expression, and these are issues that we care about deeply and take very personally.

Noon

Conservative

The Chair Conservative Bob Zimmer

Thank you.

Next up, we'll go to the Kingdom of Morocco for five minutes.

Noon

Mohammed Ouzzine Deputy Speaker, Committee of Education and Culture and Communication, House of Representatives of the Kingdom of Morocco

Thank you, Mr. Chair.

I would also like to thank the kind team doing us the honour of being here today: Kevin Chan, Derek Slater, Neil Potts, Carlos Monje and Michele Austin. We would have liked Mark Zuckerberg to be with us, but he let us down. We hope he will return some other time.

I have been very attentive to two propositions from Mr. Chan. I would like to make a linguistic clarification for the interpreters: when I use the word “proposition”, I mean the English term “proposition”, not “proposal”.

In presenting the issues raised by his company, Mr. Chan said that it was not just Facebook's responsibility to resolve them. We fully agree on this point.

And then, again on these issues, he added that society must be protected from the consequences. Of course, these platforms have social advantages. However, today we are talking about the social unrest they cause; this is what challenges us more than ever.

Facebook, Twitter and YouTube were initially intended to be a digital evolution, but this has turned into a digital revolution. Indeed, it has led to a revolution in systems, a revolution against systems, a revolution in behaviour, and even a revolution in our perception of the world.

It is true that today, artificial intelligence depends on the massive accumulation of personal data. However, this accumulation puts other fundamental rights at risk, as it is based on data that can be distorted.

Beyond the commercial and profit aspect, wouldn't it be opportune for you today to try a moral leap, or even a moral revolution? After allowing this dazzling success, why not now focus much more on people than on the algorithm, provided that you impose strict restrictions beforehand, in order to promote accountability and transparency?

We sometimes wonder whether you are as interested when misinformation or hate speech occurs in countries other than China, or in places other than Europe or North America, among others.

It isn't always easy to explain why young people, or even children, can upload staged videos that contain obscene scenes, insulting comments or swear words. We find this unacceptable. Sometimes, this is found to deviate from the purpose of these tools, the common rule and the accepted social norm.

We aren't here to judge you or to conduct your trial, but much more to implore you to take our remarks into consideration.

Thank you.

12:05 p.m.

Global Policy Director, Facebook Inc.

Kevin Chan

Thank you very much, Mr. Ouzzine.

Again, please allow me to answer in English. It isn't because I can't answer your question in French, but I think I'll be clearer in English.

I'm happy to take the first question with respect to what you were talking about—humans versus machines or humans versus algorithms. I think the honest truth on that is that we need both, because we have a huge amount of scale, obviously. There are over two billion people on the platform, so in order to get at some of the concerns that members here have raised, we do need to have automated systems that can proactively find some of these things.

I think to go back to Mr. Collins's first question, it is also equally important that we have humans that are part of this, because context is ultimately going to help inform whether or not this is malicious, so context is super important.

If I may say so, sir, on the human questions, I do think you are hitting on something very important, and I had mentioned it a bit earlier. There is this need, I think, for companies such as Facebook not to make all of these kinds of decisions. We understand that. I think people want more transparency and they want to have a degree of understanding as to why decisions were arrived at in the way they were in terms of what stays up and what goes down.

I can tell you that in the last few months, including in Canada, we have embarked on a global consultation with experts around the world to get input on how to create an external appeals board at Facebook, which would be independent of Facebook and would make decisions on these very difficult content questions. At least in our current thinking, as we have put it out there, those decisions would be publicly binding on Facebook. That is the way we have imagined it; we are receiving input and will continue to consult with experts. Our commitment is to get this done by 2019.

Certainly, on our platform, we understand that this is challenging. We want a combination of humans and algorithms, if you will, but we also understand that people will have better confidence in the decisions if there is a final board of appeal, and we're going to build that by 2019.

Of course, we're all here today to discuss the broader question of regulatory frameworks that should apply to all services online. There, once again obviously, the human piece of it will be incredibly important. So thank you, sir, for raising that, because that's the nub, I think, of what we're trying to get at—the right balance and the right framework per platform but also across all services online.

12:05 p.m.

Conservative

The Chair Conservative Bob Zimmer

Thank you, Mr. Chan.

Next up, we will go to Ecuador for five minutes.

12:05 p.m.

Elizabeth Cabezas President, National Assembly of the Republic of Ecuador

[Delegate spoke in Spanish, interpreted as follows:]

Thank you very much.

I want to talk about some of the concerns that have already been mentioned at this meeting, and also express great concern regarding tweets and Twitter, on which there is a proliferation in the creation of false accounts that are not detected. They definitely remain active for a very long time on social networks and generate, in most cases, messages and trends that are negative and against different groups, both political and those that are linked to businesses or unions in many different areas.

I don't know what mechanisms you have chosen to verify the creation of these accounts, because they are linked to troll centres or troll farms, which in the case of Ecuador have cropped up very frequently and continue to do so. They have been spreading malicious messages on a massive scale that counteract real, true information and really twist points of view.

More than continuing to mention the issues that have already been mentioned, I would urge you to think about fact-checking mechanisms that can detect these accounts in a timely manner, because definitely you do not do it quickly enough or as quickly as is necessary. This allows damaging messages to proliferate and generate different thoughts, and they distort the truth about a lot of subjects.

I don't know what the options are, in practice, or what you're going to be doing in practice to avoid this or prevent this, and to prevent the existence of these troll centres and the creation of accounts that are false, of which there are many.

12:10 p.m.

Director, Public Policy, Twitter Inc.

Carlos Monje

Thank you. That is exactly the right question to ask, and one that we work on every day.

I'll just note that our ability to identify, disrupt and stop malicious automation improves every day. We are now catching—I misspoke earlier—425 million accounts, which we challenged in 2018.

Number one is stopping the coordinated bad activity that we see on the platform. Number two is working to raise credible voices—journalists, politicians, experts and civil society. Across Latin America we work with civil society, especially in the context of elections, to understand when major events are happening, to be able to focus our enforcement efforts on those events, and to be able to give people more context about people they don't understand.

I'll give you one example because I know time is short. If you go onto Twitter now, you can see the source of the tweet, meaning, whether it is coming from an iPhone, an Android device, or from TweetDeck or Hootsuite, or the other ways that people coordinate their Twitter activities.

The last piece of information or the way to think about this is transparency. We believe our approach is to quietly do our work to keep the health of the platform strong. When we find particularly state-sponsored information operations, we capture that information and put it out into the public domain. We have an extremely transparent public API that anybody can reach. We learn and get better because of the work that researchers have undertaken and that governments have undertaken to delve into that dataset.

It is an incredibly challenging issue, I think. One of the things you mentioned is that it's easy for us to identify instantaneous retweets and things that are automated like that. It is harder to understand when people are paid to tweet, or what we saw in the Venezuelan context with troll prompts, those kinds of things.

We will continue to invest in research and invest in our tooling to get better.

12:10 p.m.

Conservative

The Chair Conservative Bob Zimmer

We'll move on to the last on our list and then we'll start the sequence all over again.

To Saint Lucia, please go ahead for five minutes.

12:10 p.m.

Andy Daniel Speaker, House of Assembly of Saint Lucia

Thank you, Mr. Co-chair.

My questions are to Neil Potts, global policy director. I have two questions.

The first one is that I would like to understand and to know from him and from Facebook, generally, whether or not they understand the principle of “equal arms of government”. It would appear, based on what he said earlier in his opening remarks, that he is prepared and willing to speak to us here, and Mr. Zuckerberg will speak to the governments. It shows a failure to realize the very significant role that we play as parliamentarians in this situation.

My next question is with reference to Speaker Nancy Pelosi's video, as well as to statements made by him with reference to Sri Lanka. He said that the videos would only be taken down if there were physical violence.

Let me just make a statement here. The Prime Minister of Saint Lucia's Facebook account has been, whether you want to say, “hacked” or “replicated”, and he is now struggling out there to try to inform persons that this is a fake video or a fake account. Why should this be? If it is highlighted as fake, it is fake and it should not be....

Let me read something out of the fake...and here is what it is saying, referring to a grant. I quote:

It's a United Nation grant money for those who need assistance with paying for bills, starting a new project, building homes, school sponsorship, starting a new business and completing an existing ones.

the United nation democratic funds and human service are helping the youth, old, retired and also the disable in the society....

When you put a statement out there like this, this is violence against a certain vulnerable section of our society. It must be taken down. You can't wait until there is physical violence. It's not only physical violence that's violence. If that is the case, then there is no need for abuse when it is gender relations, or otherwise. Violence is violence, whether it is mental or physical.

That is my question to you, sir. Shouldn't these videos, these pages, be taken down right away once it is flagged as fake?

12:15 p.m.

Global Policy Director, Facebook Inc.

Neil Potts

If it is the case that someone is misrepresenting a member of government, we would remove that if it is flagged. I will follow up with you after this hearing and make sure that we have that information and get it back to the team so that we can act swiftly.

Perhaps to address a few of the other conversations here, there's been this kind of running theme that Mr. Zuckerberg and Ms. Sandberg are not here because they are eschewing their duty in some way. They have mandated and authorized Mr. Chan and me to appear before this committee to work with you all. We want to do that in a very co-operative way. They understand their responsibility. They understand the idea of coequal branches of government, whether that's the legislative branch, the executive branch or the judicial branch. They understand those concepts and they are willing to work. We happen to be here now to work on—

12:15 p.m.

Conservative

The Chair Conservative Bob Zimmer

With respect, Mr. Potts, I'm going to step in here.

With respect, it is not your decision to select whether you're going to come or not. The committee has asked Mr. Zuckerberg and Ms. Sandberg to come, plain and simple, to appear before our international grand committee. We represent 400 million people, so when we ask those two individuals to come, that's exactly what we expect. It shows a little bit of disdain from Mark Zuckerberg and Ms. Sandberg to simply choose not to come. It just shows there's a lack of an understanding about what we do, as legislators, as the member from Saint Lucia mentioned. The term “blowing us off”, I think, can be brought up again, but it needs to be stated that they were invited to appear and they were expected to appear, and they're choosing not to. To send you two individuals in their stead is simply not acceptable.

I'll go back to Mr. Daniel from Saint Lucia.

12:15 p.m.

Global Policy Director, Facebook Inc.

Neil Potts

Thank you, Mr. Zimmer. I want to be clear. I'm not familiar with the procedures of Canadian Parliament and what requires appearance. I respect that, but I do want to get on record that they are committed to working with government, as well as being responsible toward these issues.

Additionally—

12:15 p.m.

Conservative

The Chair Conservative Bob Zimmer

I would argue, Mr. Potts, if that were the case, they would be seated in those two chairs right there.

Continue on.

12:15 p.m.

Global Policy Director, Facebook Inc.

Neil Potts

Additionally, just to address another question that I think is permeating, about how we go about identifying and removing content, we do remove content for a number of abuse types. It's not just violence. In the specific case of misinformation, such as the appearance of some of the tropes that circulated in Sri Lanka and other countries, we removed that on the grounds that it could lead to violence. But we also have policies covering things like hate speech, where violence may not be imminent, as well as personally identifiable information and bullying, which we take very seriously. These may not lead directly to violence, but we do enforce those policies directly, and we try to enforce them as swiftly as possible.

We now have 30,000 people globally working on these issues. There was a comment earlier about having people with the right amount of context to really weigh in. For all the countries that are represented here, I just want to say that, within that 30,000 people, we have 15,000 content moderators who speak more than 50 languages. They work 24 hours a day, seven days a week. Some of them are located in countries that are here before us today. We take that very seriously.

Additionally, we do have a commitment to working with our partners—government, civil society and academics—so that we are arriving at the answers that we think are correct on these issues. I think we all recognize that these are very complex issues to get right. Everyone here, I think, shares the idea of ensuring the safety of our community, all of whom are your constituents. I think we share those same goals. It's just making sure that we are transparent in our discussion and that we come to a place where we can agree on the best steps forward. Thank you.

12:15 p.m.

Conservative

The Chair Conservative Bob Zimmer

Thank you.

It was just brought to my attention, too, the inconsistency in your testimony, Mr. Potts.

On one hand, Mr. Collins had asked you about the Pelosi video, which you're not going to pull down. Then within 30 minutes or within an hour you just answered the member from Saint Lucia that it would come down immediately. I just would like you to be completely aware that it's expected that you completely tell the truth to this committee at this time and not to be inconsistent in your testimony.

12:20 p.m.

Global Policy Director, Facebook Inc.

Neil Potts

Mr. Zimmer, if I was inconsistent, I apologize, but I don't believe that I answered the question differently. If I had a transcript, obviously I would flag where my discrepancy was and correct it immediately.

Again, on misinformation that is not leading to immediate harm, our approach is to reduce the spread of that information, inform users that it is perhaps false, and remove inauthentic accounts. If someone is being inauthentic, representing that they are someone else, we would remove that. Authenticity is core to our principles: we require authentic individuals on our platform. That's why we require real names.

The question that I believe Mr. Collins was asking was about the video itself. It's not that the user was inauthentic in sharing the video; the user is a real person, or there's a real person behind the page. If it were a troll account or a bot or something like that, we would remove it.

If I misspoke, I apologize, but I want to be clear that I don't think my—

12:20 p.m.

Conservative

The Chair Conservative Bob Zimmer

I don't think it's any clearer to any of us in the room, but I'll move on to the next person.

We'll go to Mr. Baylis, for five minutes.

12:20 p.m.

Liberal

Frank Baylis Liberal Pierrefonds—Dollard, QC

Thank you. I'll start off with Mr. Slater.

A couple of weeks ago, we had another one of your gentlemen in, telling us that Google would not comply with Bill C-76, our new election campaign law. I asked, “Why not?” He said, “Well, we can't get the programming done in six months' time.”

I pointed out that Facebook can, and he said, “Well, our systems are more difficult and it's more complicated.”

He said, “We can't do it in six months”, so I asked him, “Okay, how much time do you need? When can you get it done?” He said he didn't know.

Can you explain that?

12:20 p.m.

Global Director, Information Policy, Google LLC

Derek Slater

They've given you extensive information on this front, but just to add a bit: yes, we are a different service, and while, regrettably, we knew we would not be in a position to offer election advertising in this time frame, we will look to do so in the future.

12:20 p.m.

Liberal

Frank Baylis Liberal Pierrefonds—Dollard, QC

If you can say you can't do it in six months' time but you don't know how long it will take, how do you know you can't do it in six months, when Facebook can do it in six months?

12:20 p.m.

Global Director, Information Policy, Google LLC

Derek Slater

We are very different services with different features of various types.

12:20 p.m.

Liberal

Frank Baylis Liberal Pierrefonds—Dollard, QC

How do you not know how long it will take?

12:20 p.m.

Global Director, Information Policy, Google LLC

Derek Slater

In part, it's because things continue to change over time. It's a rapidly evolving space, both legally and in terms of our services. Therefore, in terms of exactly when it will be ready, we wouldn't want to put—

12:20 p.m.

Liberal

Frank Baylis Liberal Pierrefonds—Dollard, QC

You know you can't do it in six months.

12:20 p.m.

Global Director, Information Policy, Google LLC