Thank you to the committee chair and to all the members of the committee for this opportunity to speak with you today.
I've devoted my career both as an academic and now as Google's director of privacy to one primary goal, which is to make it intuitive, simple, and useful for Internet users to take control of their privacy and security.
This is really the central challenge of privacy engineering. Products and services, particularly on the Internet, constantly evolve. Valuable new services, from social networking to online video to mobile computing, are constantly changing the way in which we interact with each other and use information.
These services, which are built in part from the information that providers learn from their users, offer tremendous value. Our goal is to offer our users innovative products that help them understand the world in new and exciting ways.
In order to do what we do, in order to provide great user experiences, we rely on our users' trust. It is our greatest asset. The information our users entrust to us enables us to better match searchers to the information they seek, to fight off those who would scam our users or undermine the usefulness of our search results, and to create new services, such as translation, speech-to-text, and many others.
We focus on building transparency, user control, and security into our products. We constantly review, innovate, and iterate to make sure we are honouring our users' privacy expectations and security needs. Because our users' trust is so critical to us, it's very important to us to note that we do not sell our users' personal information.
The Google Dashboard is a cornerstone of our efforts. If you haven't seen this tool, I invite you to take a look at www.google.com/dashboard. We developed the dashboard to provide users with a one-stop, easy-to-use control panel for the personal information associated with their Google accounts, from Gmail to Picasa to Search, and to more than 20 other Google products.
With the dashboard, a user can see, edit, and delete the data stored with her individual Google account. She can change her privacy settings, see what she is sharing and keeping private, and click into the settings for any individual product.
I was adamant when we created the dashboard that we not present it as strictly a privacy tool. Above all, I wanted it to be a useful tool that our users would come back to and interact with even when they weren't consciously thinking about privacy.
We took a similar approach with our advertising network. Our ads preferences manager, which is linked from every ad in our advertising network, allows users to opt out of ad targeting and learn about our privacy practices. Equally important, it allows users to look at the categories of ads they will see, select new interest categories, and remove ones that don't match their interests.
By offering this useful service, we hope to get more people to understand and confirm their privacy settings. Interestingly, we have seen that for every one user who visits this page and opts out, four choose to edit their preferences, while ten view the page and choose to do nothing.
These are great examples of transparency and control designed into products in a way that is prompting individual users to learn more about how to control their information, and we're proud of this track record.
However, despite our best efforts, on occasion we have made mistakes. As this committee is well aware, in May, Google disclosed that we had mistakenly included code in the software on our Street View cars that collected samples of Wi-Fi payload data—information that was sent over open, unencrypted Wi-Fi networks. To be clear, Google never used this mistakenly collected data in any product or service, and there was no breach or disclosure of personal information to any third party. As soon as we learned about this incident, we disclosed what had happened and acknowledged our mistake.
Google is working hard to fully and completely address this incident. We recognize that we need to do better.
My colleague Jacob Glick spoke to you in November about some of our plans to strengthen our internal privacy and security practices. These plans include additional responsibilities for me, which I would appreciate the opportunity to tell you a bit about today.
I'm excited by the opportunity to bring greater robustness to our privacy and security practices in my new role. With my expanded responsibilities, I will have the chance to oversee and work with both the engineering and the product teams to help ensure that privacy and security considerations are built into all of our products.
While the duties that go with this role are big, I am confident that I will have the resources and internal support needed to help Google do better. Further, I believe that Google's commitment to redouble its efforts around staff training will go a long way.
Mr. Glick mentioned this when he appeared before this committee on November 4, and I'm happy to elaborate on this further for you. We want to deputize every Googler in this effort. We want to make certain that each product we roll out meets the high privacy and security standards that our users expect of us.
We are an innovative company, creating new products each year that are helping to transform how we organize information and relate to each other as people. Our users' trust is the foundation that Google's business is built upon. We are committed to not taking that trust for granted.
I look forward to answering your questions.
Thank you.