Of course. Automation can cover a number of separate activities. Often we might discuss algorithms, which certainly are part of automation. However, in our experience, the biggest challenge that small platforms face isn't in the basics but in the workflow. Content moderation automation is a simple mechanism in principle. It's identifying content that may fall afoul of the law or of terms and conditions. It's then assessing whether that content does cross those thresholds. It's taking action, recording that action and reporting on it. It's also providing an opportunity for a user to appeal that decision. When it comes to those workflows, the complex ones, and with smaller platforms in particular, most of our activity in supporting platforms is with that basic infrastructure.
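As an illustration of how simple that loop is in principle, the sketch below walks through the steps just described (identify, assess, act, record, appeal) in Python. Every name in it, including ModerationRecord, assess, moderate, appeal and the placeholder banned terms, is a hypothetical example for this answer rather than any particular platform's system.

    # Minimal sketch of the basic moderation workflow: identify -> assess ->
    # act -> record -> (appeal). All names and rules here are illustrative.
    from dataclasses import dataclass
    from datetime import datetime, timezone

    @dataclass
    class ModerationRecord:
        content_id: str
        reason: str                     # e.g. "terms_of_service" or "none"
        action: str                     # e.g. "removed" or "left_up"
        decided_at: str
        appealed: bool = False
        appeal_outcome: str | None = None

    RECORDS: list[ModerationRecord] = []

    def assess(content: dict) -> tuple[bool, str]:
        """Very simple assessment: flag if any banned term appears in the text.
        A real platform would combine several signals at this step."""
        banned_terms = {"banned_phrase_1", "banned_phrase_2"}  # placeholder list
        text = content.get("text", "").lower()
        violates = any(term in text for term in banned_terms)
        return violates, "terms_of_service" if violates else "none"

    def moderate(content: dict) -> ModerationRecord:
        violates, reason = assess(content)
        record = ModerationRecord(
            content_id=content["id"],
            reason=reason,
            action="removed" if violates else "left_up",
            decided_at=datetime.now(timezone.utc).isoformat(),
        )
        RECORDS.append(record)  # recording the action supports later reporting
        return record

    def appeal(content_id: str, upheld: bool) -> None:
        """Give the user a route to challenge the decision."""
        for record in RECORDS:
            if record.content_id == content_id:
                record.appealed = True
                record.appeal_outcome = "upheld" if upheld else "reversed"

    # Example: one post is flagged and removed, then the decision is appealed.
    moderate({"id": "post-1", "text": "contains banned_phrase_1"})
    appeal("post-1", upheld=False)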
You could argue that this is all about automation. It's about trying to ensure that small platforms are able to accurately identify and moderate content in a scalable way. Unlike big platforms, smaller platforms have very small teams. They often have no or limited revenue or profitability, and they tend not to have particularly sophisticated technical infrastructure. That partly explains why terrorists and violent extremists will often use smaller platforms: they know it's so much harder for those smaller platforms to remove the material.
When we're working with smaller platforms, we provide a number of recommendations about how they can best use technology and automation to make the content moderation process more accurate and, as a result, more successful. Automation can include various other mechanisms, such as hashing or hash-sharing. It can also include searches for keywords and terminology, and it could ultimately involve more sophisticated mechanisms to detect whether a symbol appears in an image or a video.
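To give a sense of what the simpler end of those mechanisms looks like, here is a minimal sketch of hash matching and keyword matching in Python. The hash list, the keyword list and the function names are assumptions made for illustration; in practice, hash-sharing schemes often rely on perceptual hashes for images and video rather than the exact digest shown here.

    # Minimal sketch of hash-based and keyword-based matching. The shared hash
    # list and keyword list below are hypothetical.
    import hashlib

    # Hypothetical shared hash list, e.g. received from a hash-sharing scheme.
    KNOWN_BAD_HASHES = {
        "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",  # sha256 of b"test"
    }

    KEYWORD_LIST = {"example banned phrase"}  # hypothetical terminology list

    def matches_known_hash(file_bytes: bytes) -> bool:
        """Exact re-uploads of known material match the shared hash list."""
        digest = hashlib.sha256(file_bytes).hexdigest()
        return digest in KNOWN_BAD_HASHES

    def matches_keywords(text: str) -> bool:
        """New text is screened against the keyword and terminology list."""
        lowered = text.lower()
        return any(phrase in lowered for phrase in KEYWORD_LIST)

    print(matches_known_hash(b"test"))                                   # True
    print(matches_keywords("this contains an Example banned phrase"))    # True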
However, most small platforms lack the capacity or capability to build complex automation. The automation that we typically help them with is fairly simple: it's about helping them make the right decisions and record the decisions that they're making. An important principle in all content moderation, at least in our view, is transparency. Therefore, we recommend that platforms of all sizes invest in transparency reporting, and, for that, automation is required to understand what has been removed and what has been left up.
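As an example of what that recording makes possible, the short sketch below aggregates a made-up decision log into the kinds of counts a transparency report needs. The field names and the data are assumptions for illustration only.

    # Minimal sketch of the automation behind a transparency report: count what
    # was removed, what was left up, and how many appeals were reversed.
    from collections import Counter

    decision_log = [
        {"action": "removed", "reason": "terms_of_service", "appeal_outcome": None},
        {"action": "removed", "reason": "illegal_content", "appeal_outcome": "reversed"},
        {"action": "left_up", "reason": "none", "appeal_outcome": None},
    ]

    def transparency_summary(log: list[dict]) -> dict:
        return {
            "actions": dict(Counter(entry["action"] for entry in log)),
            "removal_reasons": dict(
                Counter(entry["reason"] for entry in log if entry["action"] == "removed")
            ),
            "appeals_reversed": sum(
                1 for entry in log if entry["appeal_outcome"] == "reversed"
            ),
        }

    print(transparency_summary(decision_log))
    # {'actions': {'removed': 2, 'left_up': 1},
    #  'removal_reasons': {'terms_of_service': 1, 'illegal_content': 1},
    #  'appeals_reversed': 1}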