Thank you, Lauren.
Thank you, Madam Chair, for the invitation to appear today.
I want to commend your committee for tackling this issue and for the work it is doing with respect to this study, especially your interest in the digital world and how it relates to human rights and gender-based violence.
It is important for me to begin with a personal story that really informs so much of my work here at Google.
From 2012 to 2014, I was cyberstalked by a former colleague. This individual aggressively stalked me online, created false websites about me, and sent shaming emails to former colleagues at the Department of Justice, at the White House, and to my funders. He invented false identities through which he further harassed me.
After many rejections when I asked law enforcement for help, I finally found a detective who would take my case, but here's the thing. At one point during my conversation with that detective, an intern of mine overheard me. The intern approached me afterward and disclosed that she, a first-year law student, had been a victim of revenge porn. Those revenge porn images were essentially her only digital footprint. As a result, no firm would hire her for the summer.
I realized that while the cyberviolence done to me had real emotional consequences, I already had a digital footprint that counterbalanced all of the wreckage, but this young woman did not. As with all forms of gender-based violence, there are emotional as well as economic consequences of the violence against us as women and girls.
Google was founded on the principle that the free flow of information is crucial and must be preserved and protected culturally, socially, and economically. The free flow of information is essential to creativity and innovation, and leads to economic growth for countries and companies alike. However, there are legitimate limits we must look at, even where laws strongly protect free expression and we have clear processes for removals if content violates local laws.
Beyond what is legally required, we want our products to enable positive community interaction, so we have policies about what content we do and do not allow on our platforms. Assessing controversial content can require hard judgements, and there isn't always one clear answer, but we do our best to balance free expression with safety.
I know algorithms have been of particular interest to the committee. For a typical search query, there are thousands if not millions of web pages with helpful information. Algorithms are computer processes and formulas that take your questions and turn them into answers. Google search algorithms rely on more than 200 unique signals or clues that make it possible to guess what you might really be looking for.
Our philosophy is that a search should reflect the whole web, so while we comply with laws and remove content from search results in response to valid legal requests, we only go beyond that for a few narrow categories. For example, if a user searches for child sexual abuse imagery, or what we call child porn, we block the content. We also remove nude or sexually explicit images of individuals shared publicly without their consent, commonly known as revenge porn: we review removal requests from the people affected and demote websites dedicated to revenge porn.
We prohibit revenge porn on all Google-hosted platforms, including YouTube, Blogger, G+, and Play.
But remember, removing controversial content from Google Search does not necessarily remove this content from the Internet. Even if Google deletes particular URLs from search results pages, the web page hosting the content in question still exists.
We provide resources so that users understand that webmasters control individual sites and the content on them. We help users contact webmasters in order to seek removal of content from the source. This is the only way to actually get the content removed from the web. We think of Google Search like your public library. Taking the index card out of the card catalogue doesn't remove the book from the library. Removing the search result won't eliminate the source material.
We also rely on our community to send us signals when content violates our guidelines, much like an online Neighbourhood Watch program. On YouTube, for example, people can use the flagging feature located beneath every video and comment to help report content they believe violates our community guidelines. In 2015 alone, we removed 92 million videos for violation of our policies through a mix of user flagging and our spam detection technology.
We are always looking to new technologies to help counter hate speech online. Jigsaw, Google's think tank, is working on a set of tools called Conversation AI, which is designed to use machine learning to automatically spot the language of abuse and harassment far more accurately than keyword filters and far faster than any team of human moderators.
Creating a positive, safe online experience for kids and families is an absolute priority for us, and we do this in a number of ways.
First, we want to ensure parents and children have the tools and knowledge they need to make smart and responsible choices online. We are committed to building an informed and responsible generation of digital citizens. We have several programs that train kids and teachers in the basics of privacy, security, and conscientious behaviour online.
We deeply believe that companies like Google have a responsibility to ensure all the products and services we provide to families offer safe and secure experiences for them online. We build features into our products that put families in the driver's seat, such as safety settings on Search and YouTube, so that users have a way to filter out more explicit content. We've also built products with families in mind, such as YouTube Kids, our stand-alone app that makes it easier for children to find videos on topics they want to explore, in a safe and age-appropriate way.
A huge amount of progress has been made, and the technologies developed and shared by our industry are making a real difference in keeping women and girls safe online, but there is still so much more work to be done.
We look forward to continuing to work with industry, non-profits, and governments to protect all people from harm while keeping the Internet free and open for everyone.
Thank you for the opportunity to provide these comments to the committee. I look forward to answering questions.