Evidence of meeting #123 for Public Safety and National Security in the 44th Parliament, 1st Session. (The original version is on Parliament’s site, as are the minutes.)

A video is available from Parliament.

Also speaking

David Agranovich  Director of Threat Disruption, Meta Platforms Inc.
Steve de Eyre  Director, Public Policy and Government Affairs, Canada, TikTok
Lindsay Doyle  Head of Government Affairs and Public Policy for Canada, YouTube
John Hultquist  Chief Analyst, Mandiant Intelligence, Google, YouTube
Rachel Curran  Head of Public Policy, Canada, Meta Platforms Inc.
Justin Erlich  Global Head, Policy Development, TikTok
Anthony Seaboyer  Assistant Professor, Royal Military College of Canada, As an Individual
Adam Zivojinovic  Journalist, As an Individual
Clerk of the Committee  Mr. Simon Larouche

4:20 p.m.

Conservative

Glen Motz Conservative Medicine Hat—Cardston—Warner, AB

Thank you very much.

The witnesses we've had so far in this study have indicated that Russian interference is used across the political spectrum. There is no right or left. In fact, contrary to what my colleagues across the way have alluded to in the past, it's not a far-right phenomenon. Media reports in fact have indicated that, of the 90 key accounts that promote pro-Moscow sentiments, 33% are controlled by people on the far left.

YouTube officials, what does this mean from your perspective with respect to Russian tactics and how you're seeing this deployed, maybe in other countries?

4:20 p.m.

Head of Government Affairs and Public Policy for Canada, YouTube

Lindsay Doyle

Thank you so much for the question and for the opportunity to weigh in.

I will ask my colleague, John, to share some of his insights here.

4:20 p.m.

Chief Analyst, Mandiant Intelligence, Google, YouTube

John Hultquist

It's actually fairly common to see these actors work both sides of the political spectrum. It's important not to over-index any specific operation, because many of these operations have a counterpart that's focused on the opposing audience.

I'll give you an example. The Internet Research Agency is the group that was so active in 2016 in the elections in the United States. They're still around, or remnants of that organization are still around, and actually, there is a very recent operation that was sort of right-facing. However, prior to that, they were doing a left-facing operation or a left wing-facing operation, so it's not uncommon to see them work from both angles.

The other thing I think is really important when it comes to this activity is that there is no favoured version of the truth. The point is often to flood the zone with narratives that are often completely contradictory to each other.

Glen Motz Conservative Medicine Hat—Cardston—Warner, AB

Thank you very much for that.

I want to ask this across the platforms that are here.

There seems to be a tone that, rather than taking responsibility for its own role in dealing with foreign interference, it would be easy for government to point the finger at the social media platforms, as if they were solely responsible for making sure that no foreign interference, disinformation or misinformation is ever available to the Canadian public.

Having said that, what do you think governments should do—it was alluded to just recently—about misinformation or disinformation campaigns so they don't reach Canadians, without interfering with their right to expression? Are there things that government can do to work with social media platforms to ensure that the right to expression is still upheld, but we do some work to prevent disinformation and misinformation?

I'll start with Meta.

4:20 p.m.

Head of Public Policy, Canada, Meta Platforms Inc.

Rachel Curran

That's a really interesting question. I'm going to turn it over to my colleague David Agranovich, who can talk about the work we have done with governments elsewhere. I'm happy to chime in with the work we've done with the Canadian government as well.

4:20 p.m.

Director of Threat Disruption, Meta Platforms Inc.

David Agranovich

Thank you for the question.

Maybe I'd break my answer into three key points.

The first is that the more aggressively we enforce against these networks on our platforms, the more we see them nebulize across the Internet and increasingly rely on tactics that look more like Cold War-style traditional espionage tradecraft. The people who have the best visibility into that activity are often governments or law enforcement organizations. We welcome information sharing from security organizations that might have better insight into nation-state intelligence services or their proxies, so that we can use that information to key our own investigations on our own platforms.

Second, we're very careful in our own public disclosures both to avoid speculating about the potential for influence activity and to make sure that we're reporting critically about the effectiveness of those operations. For example, a Russian network known as DoppelGänger has been the focus of quite a bit of public reporting recently. It is less focused on reaching real people and more focused on making itself look like it's really good at reaching real people. Oftentimes, these organizations are selling a story to their bosses or their funders as much as they might be trying to sell a story to their targets. Governments, in partnership with us, civil society organizations and media, can be really careful about how we talk about these efforts so that we do not, for example, make Russia sound as all powerful as they'd like us to think they are.

Third, there are concrete tools that governments have that could make these operations meaningfully more difficult. First, governments can levy geopolitical power, whether that's diplomatic sanctions, financial sanctions or information sharing, as we saw with the Tenet Media indictment. That can enable other actors to take action. Second, we've noted that influence networks increasingly rely on off-platform web domains. Those are websites that we can't take down; even if we block them on our platforms, the content persists across the Internet. We published a report last year with some concrete recommendations to governments thinking about—

The Chair Liberal Ron McKinnon

I'm sorry, sir. I have to cut you off. Thank you.

We'll go now to Mrs. Zahid.

Go ahead, please, for five minutes.

Salma Zahid Liberal Scarborough Centre, ON

Thank you, Chair. Thanks to all the witnesses for appearing before the committee.

For my first question, I would like to have answers from all three witnesses. I would appreciate a verbal answer.

Did any witness here today discuss their testimony with any members of this committee or staff before this meeting?

I would like to start with Ms. Curran.

4:25 p.m.

Head of Public Policy, Canada, Meta Platforms Inc.

Rachel Curran

No, we did not.

4:25 p.m.

Head of Government Affairs and Public Policy for Canada, YouTube

Lindsay Doyle

We reached out to all members of the committee to offer to answer any questions in advance of our testimony today.

Salma Zahid Liberal Scarborough Centre, ON

You shared your testimony with all members?

4:25 p.m.

Head of Government Affairs and Public Policy for Canada, YouTube

Lindsay Doyle

We did not share our testimony, but we did reach out, including to you.

4:25 p.m.

Director, Public Policy and Government Affairs, Canada, TikTok

Steve de Eyre

We did not.

Salma Zahid Liberal Scarborough Centre, ON

Thank you.

My next question is for Mr. de Eyre from TikTok.

In 2023, ByteDance reported approximately $120 billion U.S. in revenue. How much of this revenue was reinvested into your teams working on clearing your sites of misinformation and disinformation?

4:25 p.m.

Director, Public Policy and Government Affairs, Canada, TikTok

Steve de Eyre

I don't have an exact figure.

Perhaps my colleague Justin can speak a bit about how we address misinformation and disinformation.

4:25 p.m.

Global Head, Policy Development, TikTok

Justin Erlich

Thanks, Steve, and thanks for the question.

First, I want to underscore how invested we are in keeping the platform safe. This year alone, we're investing over $2 billion on trust and safety. In particular, we have several teams working on misinformation and covert influence to make that a top priority for us. We've been working in this space across 150 elections around the world over the last four years and continue to invest more resources for each election we have.

As I said earlier, we basically take a three-part approach to protecting our community: removing content, empowering our community through user literacy campaigns and partnering with experts, and working closely with independent fact checkers, who help assess the veracity of the content on the platform.

Salma Zahid Liberal Scarborough Centre, ON

Thank you.

My next question is also for you. We know that inflammatory material generates more engagement, and more engagement leads to more ad revenue. Is it fair to say that your platform benefits financially when and if you allow the dissemination of misinformation and disinformation on your platform?

4:25 p.m.

Director, Public Policy and Government Affairs, Canada, TikTok

Steve de Eyre

I disagree with that statement. As Justin mentioned earlier, our goal at TikTok is to create a place for creativity and joy. People come to TikTok for authentic and engaging content, but positively engaging content.

When you spend time on TikTok, if you engage positively with a video, whether you watch the whole video, comment on it, like it or share it, that gives us a signal of the type of content you may like. When I talk to people, that's what they say they come to TikTok for.

I'm sure that you heard earlier this year about Keith Lee going to a number of restaurants in Scarborough. He's an American food TikTok creator. There were lineups around the block to go to these restaurants. There was a shawarma restaurant and a jerk restaurant. That was the biggest thing over the summer on TikTok in Toronto. That's the type of experience we try to cultivate.

Salma Zahid Liberal Scarborough Centre, ON

Mr. Chair, I will share the rest of my time with Mr. Gaheer.

The Chair Liberal Ron McKinnon

Go ahead, Mr. Gaheer.

Iqwinder Gaheer Liberal Mississauga—Malton, ON

Mr. Chair, how much time do I have left?

The Chair Liberal Ron McKinnon

You have a minute and a half.

Iqwinder Gaheer Liberal Mississauga—Malton, ON

Mr. Chair, I wish I were in the room so I could have a longer conversation with Ms. Curran afterward. I can't believe some of the testimony I've heard today. As a generation [Technical difficulty—Editor], I've seen the effects of Facebook and other social media on my generation.

Ms. Curran, did you just say there are 90 fact checkers for your platform, a platform with hundreds of millions of users? It's probably in the billions. The platform is available in every single language, and you have 90 fact checkers for 60 languages. What you're saying, then, is that there is a fact checker and a half, on average, for every language you're fact-checking.

Do you have a statistic on how many posts are being produced on your platform per minute, and do you think 90 fact checkers is an adequate number for checking that many posts?

4:30 p.m.

Head of Public Policy, Canada, Meta Platforms Inc.

Rachel Curran

I will get back to the committee with an answer on posts per minute. I don't have that answer now. We have more than 90 independent fact checkers checking content on our platforms in 60 different languages. Yes, those are the numbers currently.

Iqwinder Gaheer Liberal Mississauga—Malton, ON

I'm very scared by that statistic. You have 90 fact checkers for I don't know how many hundreds of millions of users in every single country in the world. How many posts are being produced per minute? That's a very scary thought given the state of misinformation and disinformation around the world.