Great. I'd be happy to start on those few questions. I appreciate it and the opportunity to be here.
First, I'm not familiar with the figure you cited. What I can tell you is that we issue a community guidelines enforcement report, which is like a transparency report, every quarter. The most recent one, covering the second quarter of 2023, just came out last week. In that report, we identified that we removed over 18 million accounts globally of users who were suspected of being under 13.
I think my colleague David identified to Mr. Kurek some of the tools and some of the challenges but also some of the ways in which we work to identify users who may not be old enough to use the platform. When we do find those accounts, they are removed.
Your second question, about an age-appropriate experience, is an excellent question. We work with non-profits around the world. In Canada those are groups like MediaSmarts, Digital Moment and Kids Help Phone, which are doing leading research on the experience of youth online and how we should approach it. We take that feedback and build policies on it. In the past few years, we have introduced such things as age-appropriate content labelling. There are types of videos that will be labelled and that will not be recommended to a user who is under 18. Take cosmetic surgery, for example. If somebody posts a video about cosmetic surgery, that video is ineligible to be recommended to users under the age of 18.
Does that answer your question?