With all due respect, I think what I just went through in my opening statement is more than just a plan. We are actually implementing things on the platform, including “view ads”. When it launched in November as a product that lets anybody see all the ads running on Facebook, nothing like it existed anywhere else in the world.
The Canadian election integrity initiative is not a plan. It has, in fact, been implemented. We have done the same thing in Ontario, but let me return to that in a moment.
To answer your specific question, now that I have a copy of this in front of me, my understanding is that this content is no longer on Facebook. With respect to general content on Facebook, we are governed by a set of community standards that are universal in nature, and you can read up on them at facebook.com/communitystandards. The standards do, in fact, prohibit hate speech and bullying. They also prohibit the glorification or promotion of violence and terrorism.
What I would say to you, sir, is that when people are confronted with content that may be in violation of the community standards, the platform is designed so that anybody can report it to Facebook. I would respectfully disagree that it's about who you know; anyone is able to report these things. That's the whole point of having a global platform. If content violates the community standards, it will be taken down. That is how it works.
I would say, more broadly speaking, and I think you alluded to it, sir, that the challenge with a distribution platform is that we want to be very careful about giving people the opportunity to express themselves, to have a platform that is for all voices, and yet be mindful of our community standards, which set out certain things that are not permitted on the platform.
That is, certainly in our experience, challenging. In terms of people's ability to express themselves, it is very rarely black and white; there are a lot of grey zones. You're absolutely right that enforcement of our community standards is a challenging enterprise. We have committed to hiring: by the end of this year, we will have 20,000 people on the team working on security issues like the ones you mentioned.
I would also say that we have already deployed artificial intelligence technology to better detect prohibited content and remove it at scale, without human review. Obviously there is ongoing progress that needs to be made, and I would never say that we are perfect. But I want to make sure that you and other members of the committee understand that we take this very seriously and have already invested significantly in these efforts.