I think you're also focusing specifically on terrorist and violent extremist content, which is a part of GIFCT as well. I believe we hold ourselves to a standard of taking that content down within two hours.
As I stated in my remarks, with proprietary technology, 91% of that content never makes it to the platform. We now have a better understanding of where it's being posted and who is posting it. Between the time you hit "post" and the time it comes through our servers, we can tag it.
It's a very interesting and important question that you ask. We're very proud of the work we've done with regard to terrorism and violent extremist groups, but when we go to conferences like the Oslo Freedom Forum or RightsCon (I don't know if you know that conference; it was held in Toronto two years ago), we get feedback from groups like Amnesty International, which is here in Canada, and Witness, which is not, that are a little worried we're too good at it. They want to see the insignias on videos. They want to see the conversation that is happening in order to be able to follow it and eventually prosecute.
We're trying to find a balance between the requests of governments to stop this kind of hate and these terrorist actions from happening, which, again, we've been very successful at, and the needs of civil society to track and prosecute them.