Evidence of meeting #123 for Public Safety and National Security in the 44th Parliament, 1st Session, held on October 10, 2024. (The original version is on Parliament’s site, as are the minutes.)

A video is available from Parliament.

Also speaking

David Agranovich  Director of Threat Disruption, Meta Platforms Inc.
Steve de Eyre  Director, Public Policy and Government Affairs, Canada, TikTok
Lindsay Doyle  Head of Government Affairs and Public Policy for Canada, YouTube
John Hultquist  Chief Analyst, Mandiant Intelligence, Google, YouTube
Rachel Curran  Head of Public Policy, Canada, Meta Platforms Inc.
Justin Erlich  Global Head, Policy Development, TikTok
Anthony Seaboyer  Assistant Professor, Royal Military College of Canada, As an Individual
Adam Zivojinovic  Journalist, As an Individual
Clerk of the Committee  Mr. Simon Larouche

The Chair Liberal Ron McKinnon

I call this meeting to order.

Welcome to meeting number 123 of the House of Commons Standing Committee on Public Safety and National Security. Today's meeting is taking place in a hybrid format.

I would like to remind participants of the following points. Please wait until I recognize you by name before speaking. All comments should be addressed through the chair. Members, please raise your hand if you wish to speak, whether participating in person or via Zoom. The clerk and I will manage the speaking order as best we can.

Pursuant to Standing Order 108(2) and the motion adopted on September 19, 2024, the committee resumes its study of Russian interference and disinformation campaigns in Canada.

I would like to welcome our witnesses for the first hour.

From Meta Platforms Inc., we have David Agranovich, director of threat disruption, by video conference; and in the room we have Ms. Rachel Curran, head of public policy, Canada.

From TikTok, we have Steve de Eyre, director of public policy and government affairs, Canada; and by video conference, we have Justin Erlich, global head of policy development.

From YouTube, we have Lindsay Doyle, head of government affairs and public policy for Canada; and John Hultquist, chief analyst, Mandiant Intelligence for Google, appearing by video conference.

I thank all the witnesses for being here today and for helping us with our study.

I would now invite Mr. Agranovich to make an opening statement of up to five minutes.

Please go ahead, sir.

David Agranovich Director of Threat Disruption, Meta Platforms Inc.

Thank you so much, and thank you for the opportunity to appear before you today.

My name is David Agranovich. I am the director of threat disruption at Meta.

My work is focused on coordinating our cross-company efforts to identify, disrupt and deter adversarial threats on our platforms. I've worked to counter these threats at Meta for the past six years. Previously, I worked in the U.S. government on Russian interference issues, culminating as the director for intelligence and director for Russia at the National Security Council.

I'm joined today by Rachel Curran, who is our head of public policy for Canada.

At Meta, we work hard to identify and counter adversarial threats. These include hacking, spyware and cyber espionage operations, as well as influence operations or what we call “coordinated inauthentic behaviour”, or CIB, which we define as any coordinated effort to manipulate public debate for a strategic goal in which fake accounts are central to the operation.

At Meta, our community standards prohibit inauthentic behaviour, including by users who seek to misrepresent themselves, use fake accounts or artificially boost the popularity of content. This policy is intended to protect the security of users and our services and create a space where people can trust the people and the communities that they interact with on our platforms.

We also know that threat actors are working to interfere with and manipulate public debate, exploit societal divisions, promote fraud, influence elections and target authentic social engagement. Stopping these bad actors is one of our highest priorities. This is why we have invested significantly in people and technologies to combat inauthentic behaviour. The security teams at Meta have developed policies, automated detection tools and enforcement frameworks to tackle deceptive actors, both foreign and domestic. These investments in technology have enabled us to stop millions of attempts to create fake accounts every day and to detect and remove millions more, often within minutes of their creation.

Just this year, Meta disabled more than two billion fake accounts, the vast majority of which, over 99%, were identified proactively before receiving any report from a user.

Our strategy to counter these adversarial threats has three main components. The first is expert-led investigations to uncover the most sophisticated operations. The second is public disclosure and information sharing to enable cross-societal defence. The third is product and engineering efforts to build the insights derived from our investigations into more effective scaled and automated detection and enforcement.

A key component of this strategy is our public quarterly threat reports. Since we began this work, we've taken down and disclosed more than 200 covert influence operations. These originated from 68 different countries and operated in at least 42 different languages, from Amharic and Urdu to Russian and Chinese.

Sharing this information has enabled our teams, investigative journalists, government officials and industry peers to better understand and expose Internet-wide security risks, including those ahead of critical elections. We also share detailed technical indicators linked to these networks in a public-facing repository hosted on GitHub, which contains more than 7,000 indicators of influence operations activity across the Internet.

I want to very briefly share the key trends we've observed in the course of our investigations into influence operations around the world.

First, Russia continues to be the most prolific source of CIB. We've disrupted more than 40 operations from Russia that targeted audiences all over the world. Second, Iran remains the second most active source of CIB globally. Third, while historically China-origin clandestine activity was limited on our platforms, we've seen a shift by Chinese operations in the past two years to target broader, more global audiences in languages other than Chinese.

Across the different geographic operations, we've seen an increasing reliance on private firms selling influence as a service; the use of generative AI tools—though, I would note, with little impact on our investigative capabilities; and finally, amplification through uncritical media coverage of these networks.

I'd be happy to discuss these operations in more detail throughout our discussion today.

Countering foreign influence operations is a whole-of-society effort, which is why we work with our industry peers—including some of the folks represented here today—as well as independent researchers, investigative journalists, government and law enforcement.

Thank you for your focus on this work. I look forward to answering your questions.

The Chair Liberal Ron McKinnon

Thank you, Mr. Agranovich.

We go now to Mr. de Eyre and Mr. Erlich to make an opening statement of up to five minutes.

Steve de Eyre Director, Public Policy and Government Affairs, Canada, TikTok

Good afternoon, Mr. Chair and committee members.

My name is Steve de Eyre, and I'm the director of public policy and government affairs for TikTok Canada. I'm joined today by my colleague, Justin Erlich, the global head of policy development for TikTok's trust and safety team, who's joining us virtually from California.

Thank you for the invitation to meet today to speak about the important issue of protecting Canadians from disinformation. The topic of today's hearing is important to us, to the foundation of our community and to our platform. TikTok is a global platform where an incredibly diverse range of Canadian creators and artists have found unprecedented success with global audiences, where indigenous creators are telling their own stories in their own voices and where small business owners like Caitlin Campbell, who spreads a message of positivity while caffeinating Canadians with Street Brew Coffee, are finding new customers not just across Canada but around the world.

Canadians love TikTok because of the authenticity and the positivity of the content, so it's important and in our interest to maintain the security and integrity of our platform. To do this, we invest billions of dollars into our work on trust and safety. This includes advanced automated moderation, security technologies and thousands of safety and security experts around the world, including content moderators located in Canada. We also employ local policy experts who help ensure the application of our policies and consider the nuances of local laws and culture.

When it comes to outside manipulation and foreign interference, TikTok takes an objective and robust approach. To start, our community guidelines prohibit misinformation that may cause significant harm to individuals or society, regardless of intent. To help counter misinformation and disinformation, we work with 19 independent fact-checking organizations to enforce our policies against this content.

In addition, we invest in elevating reliable sources of information during elections and unfolding events and on topics of health and well-being. We relentlessly pursue and remove accounts that break our deceptive behaviour rules, including covert influence operations. We run highly technical investigations to identify and disrupt these operations on an ongoing basis. We've removed thousands of accounts belonging to dozens of networks operating from locations around the world, and we regularly report these removals in our publicly available transparency centre.

Addressing disinformation is an industry-wide challenge that requires a collaborative approach and collective action, including both platforms and government. As an example, TikTok has joined forces with other companies to combat the deceptive use of AI in elections. We became the first video-sharing platform to implement technology from the Coalition for Content Provenance and Authenticity that automatically labels AI-generated content. We endorse the International Foundation for Electoral Systems' voluntary guidelines for election integrity for technology companies, which provide a shared set of expectations and practices for companies and election authorities to promote election integrity.

Such collaboration is also critical as we approach the next federal election. In 2021, TikTok worked with Elections Canada to build an in-app hub that provided authenticated information on when, where and how to vote. That year, TikTok was also the only new platform to sign on to the PCO's Canada Declaration on Electoral Integrity Online. As we approach the next election, we will be building upon these efforts and leveraging learnings and best practices from other elections taking place around the world, including in the U.S.

Before I conclude, I want to provide the committee with information regarding TikTok's actions related to the revelations made by the U.S. Department of Justice on Tenet Media. Following evidence presented by the U.S. DOJ and our own investigation, we've removed accounts belonging to Tenet Media, its founder Lauren Chen, and a fake news outlet for violating our policies on deceptive behaviour and paid political promotion.

I also want to note that TikTok removed accounts associated with Rossiya Segodnya and TV-Novosti for engaging in covert influence operations on TikTok, which violates our community guidelines. We label other state-affiliated media accounts on our platform to provide the community with important context about the source of the information.

Thank you again for the invitation to speak with the committee, and we look forward to sharing more with you about how we are addressing these important issues.

The Chair Liberal Ron McKinnon

Thank you, sir, for your remarks.

I now invite Ms. Doyle and Mr. Hultquist to make an opening statement of up to five minutes.

Please go ahead.

Lindsay Doyle Head of Government Affairs and Public Policy for Canada, YouTube

Mr. Chair and members of the committee, my name is Lindsay Doyle, and I am head of government affairs and public policy for YouTube in Canada.

I'm pleased to be joined remotely by my colleague John Hultquist, chief analyst at Mandiant Intelligence.

Responsibility is our first priority at YouTube. More than 500 hours of video are uploaded to YouTube every minute. The scale and our global reach demand that we take seriously the importance of protecting free expression while also ensuring we are doing the right thing for our users, creators and advertisers.

A critical aspect of our responsibility efforts is doing our part to protect the integrity of democratic processes around the world. That's why we have long invested in capabilities and tools to address threats to electoral integrity. We recognize the importance of enabling the people who use our services, in Canada and abroad, to speak freely about the political issues most important to them. At the same time, we continue to take steps to prevent the misuse of our tools and platforms, particularly attempts by foreign state actors to undermine democratic elections and political discourse.

As it relates to Russia, since the invasion of Ukraine in 2022, YouTube has blocked thousands of channels and millions of videos from Russian state-sponsored organizations, including channels directly tied to RT and Sputnik. So far in 2024, we have terminated more than 11,000 YouTube channels linked to coordinated influence operations with ties to Russia. We also continue to terminate channels belonging to Russian entities and individuals subject to sanctions.

Following a U.S. Department of Justice indictment, issued on September 4, regarding covert Russian support for a U.S.-based media company, we terminated Tenet Media's channels, channels owned or operated by its owners, and material that was cross-posted to other channels. We also removed copies and re-uploads of Tenet Media content from additional channels. Our investigation is ongoing, as are our efforts to combat coordinated influence operations.

In recent weeks, Canada, the United States and the United Kingdom sanctioned RT for engaging in both direct disinformation and covert influence operations. These recent developments highlight the importance of receiving information from law enforcement, government and trusted flaggers, which add to the signals we can observe about activity on our platforms. We continue to ensure compliance with applicable sanctions while upholding our terms of service.

Finally, over the last two years, the Russian government has periodically throttled access to YouTube. In the last two months, we have seen frequent efforts to throttle and even block YouTube in Russia. YouTube has long been one of the last remaining sources of independent media inside Russia, and it has refused to comply with a number of Russian government demands to remove political speech and similar content.

To help advance our work against foreign interference and state-sponsored activity, Google created the threat intelligence group. I will ask my colleague to briefly introduce his work.

The Chair Liberal Ron McKinnon

I'm sorry, but can we just pause for a moment?

Rhéal Fortin Bloc Rivière-du-Nord, QC

We have an unmuted microphone, and that causes interference.

The Chair Liberal Ron McKinnon

Mr. Hultquist, if you can mute yourself for the moment, we'll see whether that solves the problem...unless you were about to speak.

John Hultquist Chief Analyst, Mandiant Intelligence, Google, YouTube

I am about to speak, that's why. I'm sorry.

Rhéal Fortin Bloc Rivière-du-Nord, QC

Perhaps we should balance the sound. When I listen to Ms. Doyle, I have to turn the volume all the way up. Then it's the reverse when Mr. Hultquist starts speaking. I think we have a minor volume balance problem, but I don't know how to solve it because I know nothing about these things.

The Chair Liberal Ron McKinnon

That's fine; we'll continue with Mr. Hultquist.

By the way, for those of you online who are not familiar with our Zoom system, if you look at the bottom of your window, there's an interpretation button. You can choose whether to listen to English, French or the original.

Jennifer O'Connell Liberal Pickering—Uxbridge, ON

I hope these witnesses are tech-savvy.

The Chair Liberal Ron McKinnon

I'm an IT guy, but I am a little behind on the technology these days.

Mr. Hultquist, please go ahead.

John Hultquist Chief Analyst, Mandiant Intelligence, Google, YouTube

Thank you for the opportunity to address this important issue and discuss our work.

Within our mandate at Mandiant Intelligence, we identify, monitor and tackle threats, including coordinated influence operations and cyber-espionage campaigns. Our teams disrupt activity on a regular basis and publish our findings. We also provide expert analysis on threats originating from countries like Russia, China, Iran and North Korea, as well as from criminal organizations.

Russia has a vast covert apparatus that includes its intelligence services, as well as contractors from its private sector. These organizations have differing capabilities, which range from complex intrusion operations to coordinated inauthentic behaviour on social media platforms. Though these threats are serious, we have been successful in disrupting this type of activity on our platforms quickly and effectively.

Russian information operations activity has been used in a number of contexts to support Russia's strategic and tactical concerns, but it is most consistently focused on undermining democratic society by highlighting polarizing political and social issues. Since the launch of Russia's full-scale invasion of Ukraine, this activity has prioritized narratives designed to erode western support for Ukraine.

Our team is constantly on the lookout, because this activity is always adapting. The actors develop new techniques to blend in with real users or scale their operations, or, in the case of intrusion actors, to gain illicit access to systems. We continue to monitor and adapt to the use of new techniques so that we can proactively tackle new threats.

Lindsay Doyle Head of Government Affairs and Public Policy for Canada, YouTube

We, our users, industry, law enforcement and civil society all play important roles in safeguarding democracy and combatting disinformation. At YouTube, we are committed to doing our part to keep the digital ecosystem safe, reliable and open to free expression.

We appreciate the committee convening this important hearing, and we look forward to hearing your questions.

The Chair Liberal Ron McKinnon

Thank you for your remarks.

We'll start our questions now with Ms. Dancho.

Please go ahead for six minutes.

Raquel Dancho Conservative Kildonan—St. Paul, MB

Thank you, Mr. Chair.

Thank you to the witnesses for being here. I appreciated the testimony of Facebook and Meta, YouTube and Google, and TikTok as well. I appreciate that you've brought your foreign interference intelligence experts and that you each have your own designated branch to tackle this growing issue.

I also appreciate that you each mentioned how you've been dismantling any reach of Tenet Media. I think that's priority number one, and I appreciate that you've all taken action on that.

Certainly, Conservatives are of the position that any actor taking money from a foreign government to undermine the Canadian interest should be held fully accountable and, of course, your platforms have a very strong role in ensuring that is done. Given your technology, I would imagine you would know even sooner than government, in many circumstances, when that's being done. It sounds like you're being quite proactive.

I would like to understand better what government has done tangibly to assist your platforms. We've heard a lot proclaimed by the current Liberal government that they are taking foreign interference seriously.

We could start with Meta. What tangible efforts have been made to assist Facebook, for example, in these efforts to combat foreign interference?

Rachel Curran Head of Public Policy, Canada, Meta Platforms Inc.

We have not had any specific outreach from the Government of Canada on this issue. We do engage with government departments when we think there is information that's relevant and necessary to their mandates, and we do brief them on our work, including work related to foreign interference.

We have done those briefings for government agencies in the last year or two, but we have not had specific outreach from government departments or government agencies on this issue in the last, I would say, 12 to 24 months.

Raquel Dancho Conservative Kildonan—St. Paul, MB

Thank you.

Just to confirm, you have proactively, of your own accord, reached out to brief government, but government, in the last two years, has not reached out or provided any tangible resources regarding foreign interference. I'm just confirming that.

Rachel Curran Head of Public Policy, Canada, Meta Platforms Inc.

That's correct, yes.

Raquel Dancho Conservative Kildonan—St. Paul, MB

It's surprising, given Facebook's reach and, of course, the fact that China, Iran, Russia and others are trying to utilize your platform, that no action really has been taken, but thank you for taking the initiative.

YouTube, do you have anything different or similar to add?

Lindsay Doyle Head of Government Affairs and Public Policy for Canada, YouTube

We do, again, regularly brief government, as well as parliamentarians, on our efforts, especially as they relate to foreign interference.

With respect to some of our internal teams, our threat analysis group, which is a team of experts and security analysts who regularly detect and disrupt foreign campaigns, would also likely be the ones to coordinate and discuss these matters directly with law enforcement, but these have been proactive briefings on our part.

Raquel Dancho Conservative Kildonan—St. Paul, MB

As Ms. Curran has said, you have not received proactive outreach from government; it's been you initiating those efforts. Is that correct?

Lindsay Doyle Head of Government Affairs and Public Policy for Canada, YouTube

We do provide those briefings proactively, yes.