We addressed it in a number of different ways. We certainly have to balance individuals' right to freedom of expression, and we spent some time already talking about the importance of free speech.
There's freedom of expression on one side and illegal content on the other. What's the grey area in between, which is about misinformation and fake news, whether it's targeted collectively or at individuals?
It certainly wasn't explicit in our terms of reference to deal with that. Many of these issues transcend domestic boundaries because the platform providers operate globally. We really felt it was important for the government, in effect, to undertake a separate initiative to determine the right legislative and regulatory model for addressing the social harm issues. These are the issues associated with misinformation, targeted bullying, and sexist comments: all the content that doesn't actually cross the line into illegal content, where the Criminal Code applies, but that still requires real effort to understand and address. In a world of big data and artificial intelligence in particular, what is the responsibility of those platform providers for the content they allow to be shared or disseminated online?
That question of what's called “intermediary liability” is one that is evolving internationally, and one where we think the government needs to take direct action through a separate process.