Good morning, and many thanks for the invitation to speak at the committee hearing today.
I'm Adam Hadley, executive director at Tech Against Terrorism. Over the next few minutes, I'd like to explain who we are and what we do at Tech Against Terrorism, and provide some clarity on our position on some of the discussion points.
Tech Against Terrorism is a not-for-profit based in the U.K. We are a public-private partnership, established with UN CTED, the United Nations Counter-Terrorism Committee Executive Directorate, in April 2017. Our mission is to work with the global tech sector, in particular smaller tech platforms, to help them tackle terrorist use of their services while respecting human rights. Our work is recognized in a number of UN Security Council resolutions, including resolutions 2354 and 2395. As a public-private partnership, we work with the major democracies, including the governments of Canada, the U.S., the U.K., Australia and New Zealand, alongside the tech sector, which includes both big tech and smaller tech platforms.
The reason we focus on smaller technology platforms is that many of them have limited capacity and capability to deal with terrorist use of their services. Our mission is to support these smaller platforms, free of charge, to improve their response to terrorist activity and terrorist content. In particular, over the past two or three years, we've seen a significant migration of terrorist activity from very large platforms to smaller ones. This represents a strategic vulnerability in the response to terrorist use of the Internet.
Tech Against Terrorism monitors over 100 tech platforms on an hourly basis. We also monitor around 200 terrorist-operated websites. Overall, we work with 150 platforms, providing a number of services to help improve their response. We also work alongside other organizations focused on online counterterrorism, such as the Global Internet Forum to Counter Terrorism.
Our work at Tech Against Terrorism begins with understanding the nature of the threat. Using open-source intelligence, we establish in detail how terrorists use particular platforms. We then use this intelligence and insight to build relationships with those platforms, reach out to them and evaluate the extent to which we can provide support.
This results in a mentorship service that we offer free of charge to platforms. The mentorship service is designed to build capacity, and we deliver it alongside the GIFCT. Of note, we've developed software called the terrorist content analytics platform, or TCAP, which alerts smaller platforms to the existence of terrorist content on their services. The TCAP has so far been funded by the Government of Canada. It has resulted in 30,000 URLs, each an individual item of terrorist content, being referred to platforms, and more than 90% of this content on smaller platforms has been removed. We've also built a knowledge-sharing platform, which is designed to share best-practice information and guidance with smaller platforms. We also actively work to have terrorist-operated websites removed from the Internet.
I should stress that we focus on violent Islamist extremist organizations and, of course, the extreme far right. The basis of our work is typically designation. In upholding the rule of law, we believe that designation is a critical mechanism for ensuring that platforms remove content in a timely fashion. We therefore applaud the Government of Canada for its pioneering work in designating organizations from across the terrorism and violent extremism spectrum.
In summary, we call on governments to focus on the rule of law in how they regulate, and to provide definitional clarity to tech companies so that they can improve their response. We believe that designation is a crucial tool for providing that clarity, so that small tech platforms get better at dealing with terrorist activity.
Finally, we would stress that proportionate measures are important. Regulation in this area is often focused primarily on big tech. We understand the concern there. However, the current threat picture is such that there is a significant amount of terrorist activity, from across the ideological spectrum, on smaller platforms. Regulation often fails to take this into account, and it fails to account for adversarial shift: the way terrorist activity changes or adapts in response to the measures used to counter it.
Once again, many thanks for the invitation to speak today. I look forward to participating in the session.
Thank you.