I think that's the soft bigotry of low expectations that we've been conditioned to expect from these companies.
In a sort of twisted irony, my organization has shown that Meta has allowed state-controlled media in authoritarian countries to pay for advertisements containing disinformation. In 2022, we published a report on Chinese state media buying Facebook ads to push disinformation about the conflict in Ukraine. While Meta is punishing Canadian news publishers by removing their ability to operate on the platform, it has profited from disinformation content paid for by state-controlled media elsewhere in the world.
The truth is that we've come to expect the platforms, and Meta in particular, to behave in the worst way possible. Not only are these regulations.... I think that Bill C-18 has its value, but one thing we've urged on Canadian ministers when I've met them, and will urge today, is that there needs to be a more comprehensive framework surrounding these platforms.
Ultimately, if they can find a way to squeeze out of taking responsibility and retaliate against anything you do try, they will do so. That's why a more comprehensive framework—based on what we call the STAR framework at CCDH—includes safety by design and transparency of the algorithms, economics and content-enforcement policies. It includes real, meaningful accountability to democratic bodies like your own, and also shared responsibility for the harms they create. The negative externalities these companies are imposing are an unbearable cost on our societies, whether that's destroying the news media industry or harming young girls and young children with self-harm and eating disorder content.
In all of those respects, I think the lesson you need to take from this is that these companies will wiggle out of everything they can. They will act in the worst possible way, and that's why more comprehensive legislation—a framework, as your previous speaker said—is necessary.