I don't think the day will ever come when we say that we're done with safety, that we are a safe platform now, so everybody just enjoy it.
I mentioned before that when we first launched you couldn't upload images. Therefore we were not seeing the violations of privacy that we would see in this day and age.
I think Twitter as a company was maybe surprised by the reach of its platform. The platform grew in a way that the company itself didn't. Perhaps we were not “safety by design” from the get-go. I actually find myself being the party-pooper in the meetings with engineering at times, where, if they say we're going to enable the sharing of images in direct messages, I am the one who has to put up a hand and say, how about child sexual exploitation?
I think there's been a really big shift within the company where we now think about safety first. This is for any feature that I have seen discussed in the last year, and I'm not just talking about safety, I'm talking about anything that is rolled out on the site. When we rolled out Twitter Moments, we asked, how can Twitter Moments be abused? How do we make sure that they don't get abused? How do we build a reporting mechanism within Twitter Moments? That shift has taken place, and I think it's a normal shift.
I mentioned having worked at Google and Facebook before. When I joined Facebook I was the second person in Europe. There were no rules. There were no reporting mechanisms. I do think they're one of the safer platforms out there right now. I think this is the regular progression of a platform. I completely agree that we have a responsibility not just to our users but to people who encounter our content. My grandmother is not on Twitter, but is following hashtags left, right, and centre, and pinging me about hate speech that she sees on those hashtags.
We need to empower the users with better controls, but we need to take more severe actions when violations have taken place.
I started talking about the rules by explaining that while we empower people to speak truth to power, that means little if they are scared. You can expect to see more changes in the next six months. It's a complete overhaul of how we have processed abuse reports before. I mentioned working on transparency of reporting. We want to make sure that users know what happens when they click “report”, and what action we're taking, so they can appeal decisions. It's only going to get better.
I know that we have asked the world to be too patient. It's been too long. It's not acceptable. We did not want our platform to become a platform of abuse. I can assure you that every time I'm back in San Francisco, and I sit down with the abuse team, and I escalate content to them, it breaks their hearts. These people are working 24/7 to make sure that the abuse is not online, it's not live on the platform, and their own families don't have to see it.
I apologize for any re-victimization, and to anybody who has been abused through the platform. I think working constructively with civil society and government is what's going to lead to a safer platform, and most importantly, to a safer society, because unfortunately some of these prejudices do exist offline. They are very hard to eradicate.