This is a wonderful question and one that is extremely pressing, because, as you said, we have unelected officials—owners of these platforms—making content moderation decisions not only for the American town square but also for the global town square.
This, again, is where I think oversight and transparency mechanisms can play an important role in shining a light on what makes it into our feeds and in educating people. I think a lot of people who use Twitter or other platforms like it don't understand that Elon Musk, who claims to be a free speech crusader, is actually suppressing content for his authoritarian buddies in places like India and Turkey and, frankly, pumping up Donald Trump's content as well. Oversight and transparency are key to that.
Unfortunately, though, it's not necessarily a quick fix. This is more of a generational thing. Until we have a viable alternative in a more democratically minded social media platform—I mean small-d "democratically", not a partisan platform—there's not much we can do. There are some regulatory regimes in other countries. Australia, for example, has transparency powers that hold Musk and others to account for the business decisions they make in surfacing some content while suppressing other content. I like those systems. I don't know what they would look like in the Canadian context, necessarily.
Relying on those oversight and transparency mechanisms, rather than putting the burden of liability on the platforms—which might over-moderate and remove legitimate speech—is probably the best solution.