Thank you for the question.
To start, we've just done a report on health and science disinformation, which we have not published yet but will publish, and we will have a webinar on June 11 on that question.
I don't think there's anything unique to these forms of disinformation. It doesn't matter whether you go through a public health door, a harm to democracy door, or a public safety door. You end up with the same issues, which come down to, I guess, the responsibility that platform companies should or should not have for their content.
There are two arguments, two models. One is that they're a telephone company and nobody interferes with telephone calls or tries to regulate free speech. The other is that they are a publisher. I would subscribe to the second view, for sure. They do things that are similar to what an editor does, except that the algorithms do it rather than humans. They decide that you will see something different from what I will see, for instance. That's an intervention in the process that doesn't occur on the telephone.
When I was editor-in-chief at The Globe and Mail, the publisher, the proprietor and I were legally responsible for every image and every word that was in there—for defamation purposes, for hate speech, for obscenity and for whatever laws might be applicable. It seems not unreasonable to me that platform companies should bear those same legal responsibilities for what appears on their sites.
As well, of course, as I mentioned, we're seeing competition law being used to try to force them to pay for the content they use. We'll see how successful that is. That's being attempted in Australia and France in various ways.
In “The Shattered Mirror”, our report that you're obviously familiar with, we suggested that, as in cable and television, as money moved from the producer to the distributor, a policy came in to try to rebalance that. A 5% levy on revenues was placed on cable and satellite companies, and that went into a fund to help pay for production. That seems to me to be a not unreasonable model to be exploring again.
I think there's a series of remedies, which we continue to explore, and which I would urge parliamentarians and governments to do as well, because, as you say, a society that.... Jim Balsillie said to me a couple of years ago that he felt the disinformation problem was even worse than the climate change problem, because you can't even have a debate on climate change if you don't have good information to start with. I think that's the base and the foundation of public debate and political discourse, and we need to clean up those pollutants.