Evidence of meeting #135 for Access to Information, Privacy and Ethics in the 44th Parliament, 1st Session. (The original version is on Parliament's site, as are the minutes.)

A recording is available from Parliament.

Also speaking

Jeanette Patell  Director, Government Affairs and Public Policy, Canada, Google Canada
Rachel Curran  Head of Public Policy, Canada, Meta Platforms Inc.
Lindsay Hundley  Global Threat Intelligence Lead, Meta Platforms Inc.
Steve de Eyre  Director, Public Policy and Government Affairs, TikTok Canada
Wifredo Fernández  Head of Government Affairs, United States of America and Canada, X Corporation
Justin Erlich  Global Head of Policy Development, TikTok

5:45 p.m.

Conservative

Frank Caputo Conservative Kamloops—Thompson—Cariboo, BC

What's the threshold for “significant harm”?

5:45 p.m.

Director, Public Policy and Government Affairs, TikTok Canada

Steve de Eyre

My colleague Justin could describe that for you.

5:45 p.m.

Global Head of Policy Development, TikTok

Justin Erlich

We remove a wide variety of harmful content, including election misinformation, content undermining civic integrity, medical misinformation that may lead to significant physical harm or death, and content that may cause public panic or large-scale property damage. Those are a few examples.

5:50 p.m.

Conservative

Frank Caputo Conservative Kamloops—Thompson—Cariboo, BC

We're all parliamentarians here. We all use social media, but many of us really don't enjoy the process, because there's so much misinformation and, frankly, a lot of really mean-spirited comments. However, during an election campaign, which in Canada typically runs 35 to 36 days, Mr. Erlich, is there heightened awareness on TikTok's part with respect to these significant harms, or is it just business as usual because there's so much content to deal with?

5:50 p.m.

Global Head of Policy Development, TikTok

Justin Erlich

In the midst of elections, we take our responsibility to protect the integrity of the platform incredibly seriously, and we spin up task forces to prepare for them and enforce all of our policies.

Certainly we do scenario planning and assess the various types of things that may happen. We have dedicated teams moderating and enforcing content policies around election misinformation, hate and harassment. We also partner closely with fact-checkers, whom we rely on to assess the veracity of any claims that come in.

5:50 p.m.

Conservative

Frank Caputo Conservative Kamloops—Thompson—Cariboo, BC

Thank you.

5:50 p.m.

Conservative

The Chair Conservative John Brassard

Thank you, Mr. Caputo.

Ms. Khalid, before I go to you, I just want to circle back to your earlier intervention about asking for documents.

I've looked back into the book. It's clear that committees.... We've had several requests throughout the course of this meeting. I know that Mr. Housefather has made a request. There have been others that the clerk has noted. We'll follow up with whomever that request has been asked of, but it does say that we usually obtain papers simply by requesting them from their authors or owners. If the request is denied after the ask has been made, however, and the standing committee believes there are specific papers that are essential to its work, it can use the power to order the production of papers by passing a motion to that effect. Typically, the method is to ask. If we're not satisfied after that, we can move a motion.

I just wanted to make that very clear before you started.

Iqra Khalid Liberal Mississauga—Erin Mills, ON

Chair, to clarify, I wasn't making a motion. I was just generally asking for those documents. I hope that is taken as a serious request.

5:50 p.m.

Conservative

The Chair Conservative John Brassard

Okay. We will certainly follow up with Mr. Fernández on that. The clerk has noted the request. She will check the blues to make sure the request is accurate.

Ms. Khalid, you have five minutes. Go ahead, please.

Iqra Khalid Liberal Mississauga—Erin Mills, ON

Thank you very much, Chair.

I'll direct my next couple of questions to Meta, if that's okay.

Ms. Curran, on December 20, 2023, Human Rights Watch published a report entitled “Meta: Systemic Censorship of Palestine Content”. Hundreds of Palestinian users have reported being shadow-banned or having their accounts suspended without any explanation. Does Meta acknowledge this as a form of censorship that deprives people supporting the Palestinian cause of their fundamental right to express their opinions online?

5:50 p.m.

Head of Public Policy, Canada, Meta Platforms Inc.

Rachel Curran

I'm not familiar with the report. I'm sorry about that.

Look, last year we implemented a number of additional policy measures to address a spike in harmful content on our platforms. Those are still in place today. We've taken extensive steps over the past 12 months to keep people safe—

Iqra Khalid Liberal Mississauga—Erin Mills, ON

I understand that. I would appreciate it if you could perhaps share those documents with us in written format. I would like to reclaim my time, if that's okay.

I would like a little bit of understanding as to how the suppression of content happens on Meta's platforms, Facebook and Instagram.

5:50 p.m.

Head of Public Policy, Canada, Meta Platforms Inc.

Rachel Curran

We have content policies called community standards that are published publicly. They're available at our transparency centre.

Iqra Khalid Liberal Mississauga—Erin Mills, ON

My question really is this: How is it that if, for example, there are two sides to an issue, one side gets more repressed than another side?

5:50 p.m.

Head of Public Policy, Canada, Meta Platforms Inc.

Rachel Curran

Our content policies are enforced fairly across partisan divides, across political divides—

Iqra Khalid Liberal Mississauga—Erin Mills, ON

How do you ensure that there is fairness in the marketplace of ideas that you provide to people, not just in Canada but also across the world?

5:50 p.m.

Head of Public Policy, Canada, Meta Platforms Inc.

Rachel Curran

That's a really good question.

Our content review teams are always looking at content decisions and making sure our policies are fair and are enforced fairly.

We also recently set up something called the Oversight Board, a really interesting model that examines the decisions we make around content. If we make decisions to remove content or to leave content up that our users disagree with, they can appeal those decisions to the independent Oversight Board. The Oversight Board will take a second look at our decisions.

Iqra Khalid Liberal Mississauga—Erin Mills, ON

Where does the Oversight Board operate from?

5:55 p.m.

Head of Public Policy, Canada, Meta Platforms Inc.

Rachel Curran

I'd have to get back to you on that. It's an independent board. It's independent of Meta. I'll get back to you on that.

Iqra Khalid Liberal Mississauga—Erin Mills, ON

If you could do that, please, I'd appreciate it.

In your comments to a colleague here who asked a question with respect to the Meta employees, I think you mentioned that there were 60,000 globally. How many are physically working from within Canada?

October 24th, 2024 / 5:55 p.m.

Head of Public Policy, Canada, Meta Platforms Inc.

Rachel Curran

I don't know the answer to that. We have 40,000 people working in safety and security. I will get back to you on how many of those people are located in Canada.

Iqra Khalid Liberal Mississauga—Erin Mills, ON

Thank you.

You recently cut 21,000 jobs, including in trust and safety and customer service, over multiple rounds of layoffs:

...[your] company dissolved a fact-checking tool that would have let news services like The Associated Press and Reuters, as well as credible experts, add comments at the top of questionable articles as a way to verify their trustworthiness. Reuters is still listed as a fact-checking partner, but an AP spokesperson said the news agency's “fact-checking agreement with Meta ended back in January.”

How do you justify these cuts, especially when 2024 has been dubbed the election year and people rely on you for a lot of the information they seek?

5:55 p.m.

Head of Public Policy, Canada, Meta Platforms Inc.

Rachel Curran

We have the largest global fact-checking network of any of the online platforms. I don't know about Reuters specifically, but in Canada, we use Agence France-Presse, and we're looking at bringing on another organization for the election specifically. We work with a range of organizations—over 90 now—to do independent fact-checking.

Iqra Khalid Liberal Mississauga—Erin Mills, ON

Thank you.

If there is any additional data that you can provide that would confirm what you've said today, I'd really appreciate that.

Mr. Fernández, I'll go back to you quickly on the blue check marks and on the apparent ability on X now to buy legitimacy for a sum of money and to amplify your voice, regardless of how accurate or truthful that voice is. It could be misinformation, disinformation, hate speech, etc., but you can purchase the blue check mark that then amplifies your voice.

Has there been any study done within X as to whether the blue check mark and the accounts associated with it have any correlation with misinformation or disinformation campaigns, or with fact-checking efforts on your platform?