If we look at the U.S. and junk news being spread in swing states, this is based on Twitter alone. It wasn't a Facebook analysis, just Twitter and what people were sharing as news and information.
We analyzed a couple of million tweets in the 11 days leading up to the vote. On average, the URLs that users shared in swing states pointed to junk news and information at higher rates than those shared in uncontested states. So part of this is the organic spread of misinformation. It's not necessarily coming through advertisements; it's being spread organically through the platforms by users, or perhaps by bots, which did play a role in amplifying a lot of those stories.
The way we determined where the accounts were coming from was by using geo-tagged data. If a user reported being in Michigan, for example, which was one of the swing states, that's how we determined where the information was flowing and where the junk news was concentrated.
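The measurement described above can be sketched in a few lines: given geo-tagged tweets, compute the share of shared URLs pointing to junk-news domains, broken down by state. Everything here is a hypothetical illustration, including the domain list, the state codes, and the sample tweets; the study itself used its own hand-coded catalogue of sources, not this toy classifier.

```python
from collections import defaultdict
from urllib.parse import urlparse

# Hypothetical set of junk-news domains (a stand-in for a coded source catalogue).
JUNK_DOMAINS = {"example-junk.com", "fakenews.example"}

def junk_share_by_state(tweets):
    """tweets: iterable of (state_code, url) pairs from geo-tagged accounts.
    Returns {state_code: fraction of shared URLs pointing to junk domains}."""
    counts = defaultdict(lambda: [0, 0])  # state -> [junk count, total count]
    for state, url in tweets:
        domain = urlparse(url).netloc.lower()
        counts[state][1] += 1
        if domain in JUNK_DOMAINS:
            counts[state][0] += 1
    return {s: junk / total for s, (junk, total) in counts.items()}

# Hypothetical sample: one swing-state user shares junk, others share mainstream news.
sample = [
    ("MI", "http://example-junk.com/story"),
    ("MI", "http://news.example.org/article"),
    ("CA", "http://news.example.org/article"),
]
shares = junk_share_by_state(sample)
print(shares["MI"])  # 0.5 (half of Michigan's shared URLs in this tiny sample)
```

Comparing these per-state fractions between swing states and uncontested states is the kind of aggregate contrast the analysis reports.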
There's the organic side of it, but there's also the targeted-advertisement side of things. We have a lot of information on Russia, thanks to Facebook's disclosures about Russian operatives buying political advertisements and targeting them at voters based on their identities or values. They homed in on groups such as gun-rights activists and the Black Lives Matter movement.
They also tended to play both sides of the political spectrum. It wasn't only about supporting Trump; they also supported candidates such as Jill Stein and Bernie Sanders. They never supported Clinton, though. They would always launch ad attacks on her.
The material that comes from the political parties themselves is really hard to trace. That relates back to your earlier question about laws and what we can do to improve some of this targeting.
We talked to and interviewed a lot of the bot developers who worked on campaigns for various parties. They were the ones who created the political bots to amplify certain messages. It's hard to trace their work back to a political party because campaign finance laws only require reporting up to two levels. Generally, these contracts go out like this: a big contract goes to a big strategic-communications firm, which outsources to, say, a specialized Facebook firm, which in turn outsources work to a bunch of independent contractors. As you go down the chain, you eventually get to the bot developer, whom we interviewed.
We don't have any specific data on exactly which parties these groups worked for, at least none that I can share because of our ethics agreements with these developers. The big problem here is that we're unable to trace this work back to its source because of campaign finance laws.