Yes. When I was talking about the principles supporting the algorithms, I was talking more about social media algorithms and the way those systems tailor or deliver content to users. They deliver content based on virality: if something gets a lot of clicks, the algorithm takes that to mean that a lot of people find it interesting and, therefore, it might trend. That's what I was talking about there.
Instead of principles around virality, maybe we want principles that favour factual information from professional news outlets over sources that constantly produce misleading or fake news. Professional news should perhaps be prioritized in the algorithms. That's what I was referring to when I was talking about designing algorithms for democracy: it's changing those sorts of principles.
When it comes to the actual bot developers, that wasn't my core research project, but I could point you to one of the researchers who did many more of these interviews than I did: Sam Woolley. What he found was that bot developers are just like any other tech developers. They're creating software designed to mimic human behaviour. A bot might amplify a certain story, or it might converse with actual users online.
Bots do a whole range of different things; it really depends on the developer's goals. Developers might actually build their own ideals or principles into the bots. Many of them see these bots as good for democracy, because they help amplify a message that might not otherwise get heard or trend without the bot's help.