Yes. I'm so glad you brought this up.
There are a number of issues to be concerned about, so I'm going to try and figure out how to formulate my response.
One way to look at this, if you think about protecting children, is this: Marc Andreessen, the co-founder of Netscape, has this insight that software is eating the world. That means that in every single industry or domain, whether it's the way children consume media or the way we get around in Ubers versus taxis, technology, once you introduce it into that domain, will do the thing more efficiently. So software will continue eating the world. However, we don't regulate software, so what that really means is that “deregulation is eating the world”.
I don't know how it works in Canada, but in the United States I think we still have protections about Saturday morning cartoons. We recognize there is a particular audience, which is to say, children, and we want to protect them. We don't want to let advertisers do whatever they want during the Saturday morning cartoon period.
As soon as you offload that regulated channel of television and formal Saturday morning programming and say, let's just let YouTube Kids handle it, then you get algorithms, just machines, where the engineers at YouTube have no idea what they're putting in front of all of those 2.2 billion channels, of which several hundred million are for children.
That's how to see the problem. We have a five-second delay on television for a reason. There are 100 million people or 50 million people on one side of the screen and a couple of people who are monitoring the five-second delay, or the editorial. If some gaffe happens, or there is profanity or something like that, and you want to protect...you have some kind of filtering process.
Now we have 2.2 billion channels. This is the same whether, on the other side of that channel, there is a child or a vulnerable person in Myanmar who just got the Internet and is exposed to harmful things. The unified way of seeing this problem is that there is a vulnerability in the audience, whether that audience is a child, someone in Myanmar, or someone in an election. If we don't acknowledge that vulnerability, then we're going to have a huge problem.
The last thing I'll say, just to your point about children, is that when the engineers at Snapchat or Instagram—which, by the way, make the most popular applications for children—go to work every day, these are 20- to 30-year-olds, mostly male, mostly engineers, computer science or design-trained individuals, and they don't go to work every day asking how to protect the identity development of children. They don't do that. That's not what they do. The only thing they do is go to work and ask, “How can we keep them hooked? Let's introduce this thing called a ‘follow button’, and now these kids can go around following each other. We've wired them all up on puppet strings, and they're busy following each other all day long because we want them just to be engaged.”