We do. We take the safety of underage users, that is, minors, on Facebook very seriously. We actually have a dedicated manager on my team who focuses exclusively on those issues. The reason is that this is important not just to those of us who work on privacy, but to everyone at the company.
One example of a way that we try to create a safe environment for teenagers who use Facebook is the default settings, which we talked about earlier. The default settings are, in general, more limited for teenagers. The thinking is that adults should make their own decisions about whom they want to share with, but we want to start minors in a place that's a bit more limited, so that they are sharing with a smaller community.
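To make that idea concrete, here is a minimal sketch of how an age-based default audience might be represented. The type names, function name, and age threshold are assumptions for illustration only, not Facebook's actual settings or code.

```python
# Hypothetical sketch: more limited default sharing audience for minors.
# "Audience" and "default_audience" are illustrative names, not a real API.
from enum import Enum


class Audience(Enum):
    PUBLIC = "public"    # visible broadly
    FRIENDS = "friends"  # visible to a smaller community


def default_audience(age: int) -> Audience:
    """Return the default sharing audience applied when an account is created."""
    if age < 18:
        # Minors start out sharing with a smaller community by default.
        return Audience.FRIENDS
    # Adults can make their own choices; a broader default applies until they do.
    return Audience.PUBLIC
```

For example, default_audience(15) would return Audience.FRIENDS, while default_audience(30) would return Audience.PUBLIC; in either case the user could later change the setting.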
We don't monitor the content of our users in general, but we do have reporting functionality. For example, we use a tool called “social reporting”, which allows people who are concerned about content on Facebook to engage in a conversation. If you see content that concerns you as a user, you can report it to the user who posted it or to a trusted third party, for example an adult you know. You can also report the content directly to Facebook. We have a team of professionals who review reported content and make judgments about what steps we should take. There are also some technological measures we use, independent of teenagers' communications, to look at the ways adults communicate, in order to help keep teenagers safe when they use Facebook.
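As an illustration of that reporting flow, here is a minimal sketch that routes a report either to the person who posted the content, to a trusted contact, or to a review queue. Every name and structure here is a hypothetical assumption, not Facebook's actual system.

```python
# Hypothetical sketch of a "social reporting" flow: a report can go to the
# poster, to a trusted third party, or to a team of professional reviewers.
from dataclasses import dataclass
from typing import Literal

Route = Literal["poster", "trusted_contact", "review_team"]


@dataclass
class Report:
    content_id: str   # identifier of the reported content
    reporter_id: str  # identifier of the person making the report
    route: Route      # where the reporter chose to send it
    note: str = ""    # optional message accompanying the report


def handle_report(report: Report) -> str:
    """Describe the next step for a report, based on the chosen route."""
    if report.route == "poster":
        # Start a conversation with the person who posted the content.
        return f"Notify the author of {report.content_id} and open a conversation"
    if report.route == "trusted_contact":
        # Share the concern with a trusted adult for advice.
        return f"Forward {report.content_id} to the reporter's trusted contact"
    # Otherwise, queue the content for human review and possible action.
    return f"Queue {report.content_id} for the professional review team"
```

The point of the sketch is simply that the same piece of content can follow different paths depending on whom the reporter wants to involve, with direct reports ending up with human reviewers who decide what steps to take.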