Thank you.
Distinguished members of the committee, it's an honour to address you today.
My name is Nina Jankowicz, and I lead a U.S. non-profit, the American Sunlight Project, which is dedicated to increasing the cost of lies that undermine democracies.
I'm also the author of How to Lose the Information War, a book that examines European responses to Russian disinformation.
I've spent a decade studying this topic. I teach a graduate-level course on it at Syracuse University's Maxwell School of Citizenship and Public Affairs and I have advised governments, including Ukraine's, on their responses to the Kremlin's influence campaigns.
My message to you today is not optimistic. Despite increased awareness of foreign-backed online influence campaigns, democracies like Canada and the United States are more vulnerable to them today than they were eight years ago.
The Kremlin continues to actively exploit deepening fissures in our societies in order to amplify discord within our democracies. Social media companies have rolled back their efforts to address disinformation on their platforms and have restricted access to their data, making it difficult to hold them to account. Researchers studying this phenomenon, including me, have been baselessly attacked as censors, enduring harassment and violent threats for our public interest investigations. Despite these headwinds, my own organization has seen evidence of Russia's continued attempts to manipulate democratic societies.
American Sunlight recently identified what we call the “sleeper agent network” on X. It consists of over 1,100 likely automated accounts that post hundreds of times per day and that repeatedly retweet overt Russian propaganda within 60 seconds of its being posted.
Despite Elon Musk's promise to rid his platform of bots, some accounts in this network have been active for over a decade, springing into action at key moments. In that time, they have generated over 100 million posts on divisive issues, from the war in Ukraine to disinformation on the recent hurricanes.
They've also become involved in Canada's information space. In the past six months alone, they have amplified false narratives about the “freedom convoy” and about Deputy Prime Minister Chrystia Freeland hundreds of times.
More evidence of Russia's continued online influence campaigns includes the recent indictment from the U.S. Department of Justice. The DOJ identified a scheme in which two Canadian nationals allegedly set up Tenet Media, a shell company that ferried $10 million U.S. from Russian propaganda network RT to conservative YouTube influencers with millions of collective subscribers.
The influencers posted about divisive issues, from alleged racism against white people to censorship to trans rights. Canada is mentioned over 300 times in the videos, while Prime Minister Justin Trudeau is mentioned 60 times. The genius of this scheme is that RT was paying influencers to create divisive content they were already producing for a built-in audience; Russia was simply adding fuel to the fire.
These two case studies show that Russia is still active in undermining our democracies and that the current paradigm of playing “whack-a-troll”—focusing on stopping Russian disinformation and influence efforts at the source—is not the best use of our resources. Russia increasingly attempts to dupe users into trusting local, authentic, seemingly independent sources of information. Conveniently, these are sources that social media platforms are much less likely to moderate.
What, then, can Canada do to respond to Russian and other foreign disinformation campaigns while preserving freedom of expression?
One effective reform is to simplify the declassification process so that Canada's intelligence agencies can quickly release information related to exigent national security threats, election security or foreign state-backed disinformation campaigns.
The U.S. and the U.K. governments found success with this tactic when declassifying information about Russian troop movements prior to the full-scale invasion of Ukraine. This helped to shore up public support for Kyiv.
A public notification process like this also undermined the effects of the so-called “Macron leaks” during France's 2017 election, essentially prebunking the claims made by Russia-enabled disinformers.
Second, Canada should strengthen and clarify its laws governing influencers and online political content. Neither the Canada Elections Act nor the Competition Act requires influencers paid to create political content to disclose the source of their funding unless that source is a political entity. This is a loophole that bad actors like Russia can exploit.
Finally, Canada should continue to invest in robust information literacy programs. It's important for these programs to be targeted to local communities and delivered by trusted local messengers, educating not only school-age children but voting-age adults as well.
In particular, Parliament should consider earmarking funding for programs that marry existing local efforts, such as tech literacy courses, with information literacy education. This programming should not label content as good or bad, trustworthy or not trustworthy, but give citizens the objective tools they need to navigate today's polluted information environment. They would then be better equipped to approach content from Russia or elsewhere with healthy skepticism, protecting democracy from the front lines.
Thank you.