Thanks for inviting me.
My name is Samantha Bradshaw. I'm an assistant professor in new technology and security at American University, where I also direct the Center for Security, Innovation, and New Technology.
For the past eight years, my research has examined how state actors co-opt and weaponize social media to achieve political goals. Some of this research has focused on Russian interference, so I'll spend most of my time discussing that work here today.
There's no doubt that emerging and digital technologies have expanded the scope, scale, reach and precision of disinformation campaigns. State actors like Russia have learned to use these technologies to reach across their borders and influence individuals in ways that can undermine democracy and the expression of human rights.
Since 2017, platforms have taken down hundreds of these campaigns. We've seen state-backed disinformation campaigns removed by Facebook, Twitter and YouTube. These activities have also been documented across other kinds of platforms, such as chat applications like WhatsApp and alternative platforms like Parler. Disinformation and propaganda on these platforms are, of course, used to influence online audiences in ways that advance Russia's geopolitical ambitions.
Sometimes these operations rely on more covert tactics, such as the use of fake social media accounts, bots and online troll farms to spread false information or other harmful narratives discreetly. Other times, they rely on more overt propaganda strategies through state-sponsored media outlets like RT and Sputnik, which openly disseminate pro-Kremlin narratives.
Many of the strategies we see today reflect a longer Cold War history, wherein Soviet leadership undertook many efforts to alter audience attitudes, opinions and perspectives on events and issues around the world. Back then, in addition to promoting overt and attributable content, Soviet entities employed news agencies and sympathetic newspapers abroad, and courted journalists as sources to spread unattributable messages. Today we're seeing many of these strategies play out in the creation of fake websites and fake journalist personas, the development of front media organizations, and the co-opting of social media influencers.
Some of my more recent work looks at Russian state-backed media coverage of the Black Lives Matter protests in the U.S. over the summer of 2020. We investigated elements of this Russian-affiliated media landscape and its digital presence. We found that many of these front media organizations developed and tailored content to different segments of English-speaking users. Much of this content was about playing both sides and emphasizing the racial divides in American politics, with some outlets expressing support for the Black Lives Matter protesters and others emphasizing support for the police and the Blue Lives Matter movement.
By tracking the ownership of these media companies, along with the movement of staff and journalists affiliated with known Russian news agencies, we found extensive connections in the incorporation, funding and personnel of media outlets that claim to be independent of the Russian government. While editorial independence can of course be subjective, funding and ownership relations are key criteria in any such evaluation.
These strategies involving state media, influencers and front organizations have appeared in information operations in other countries around the globe, including countries in Africa and across the Sahel, where I worked on platform data investigating Russian activities. In those cases, we saw the co-opting of local influencers, who were often paid by Russian actors to generate a veneer of legitimacy around the content being produced and amplified on social media. While the specific goals of any influence operation vary, many are designed to disrupt the social fabric of society. In the Sahel, Russian disinformation campaigns often highlighted anti-Western and anti-colonial narratives that fed into localized and generational memory to amplify divides within and across societies.
This brings me to the final point I want to make in my opening remarks about contemporary Russian information operations: many don't rely on what we traditionally consider to be disinformation. Many of the more effective campaigns we're seeing don't rely on false information, claims that can be easily fact-checked, but on identity-based disinformation and tropes around racism, sexism, xenophobia, or even who we are as political citizens. These tropes are then used to polarize, suppress and undermine our institutions of democracy.