Thank you very much. It's an honour to be here.
Hello from Edmonton and Treaty 6 territory.
This is a subject I feel extremely passionate about. The spread of misinformation is one of the greatest challenges of our time. Research shows that this concern is shared not only by experts but by people around the world.
It's not hyperbole to say that misinformation is killing people. Misinformation is having a tremendous impact on democracies around the world. This is certainly something that we all need to address.
The battle against misinformation is itself very controversial, starting with the very definition of what misinformation is.
I want to emphasize to the committee that even if we focus on things that are demonstrably false—about elections, vaccines, climate change or immigrants—we can, as a society, make a real difference.
This is a topic that I've been studying for a very long time, and as you heard from our last expert, I've never seen anything like what we're seeing right now. I just want to highlight a couple of challenges, building on the points she made, that make the current moment particularly difficult.
Number one, there is social media, absolutely, but in addition to that there is AI. AI is going to make the spread of misinformation more challenging. It's going to generate realistic, rapidly produced content that is very difficult to distinguish from reality. Many people believe they can spot AI-generated content and deepfakes, but research consistently tells us they cannot, even when they're warned that such content may be present.
The second thing that I find incredibly challenging right now is the politicization of misinformation and the connection of misinformation with political identity and polarization. This is a trend that is increasing and is doing incredible harm. It's not only horrible for democracy, but we also know that once misinformation becomes part of a person's political identity, it becomes more difficult to change their mind.
The third challenge is the degree to which state actors are pushing misinformation. The goal of many state actors and, by the way, of many misinformation-mongers, is to create distrust. The distrust that we see in institutions today is largely—not entirely, but largely—created by the spread of misinformation. Those spreading misinformation are trying to create distrust and information chaos. Alas, they are succeeding.
How do we respond? What can we do?
This is a generational problem. You've probably heard these recommendations over and over again, but we must come at this with a multipronged approach.
What does that mean? It's teaching critical thinking skills and media literacy and doing this across.... I wrote an article in which I suggested we start in kindergarten. We have to teach these skills throughout the life cycle, as they do in many countries.
We have to pre-bunk. We have to debunk. We have to figure out the best way to place labels and warnings on things like AI-generated content. Yes, we have to work with the social media platforms and other tech companies. Yes, there are regulatory tools that can be adopted.
The other thing I want to emphasize, which I think is so relevant to this committee, is the spread of misinformation about the fight against misinformation. As I've already said, much of the distrust that we see in society has been created by fake news and by the spread of misinformation. By the way, research consistently shows that.
We also have to recognize that fighting misinformation is not just about curtailing people's voices. On the contrary, most of the tools that we can use in a liberal democracy to fight the spread of misinformation can be used within the marketplace of ideas. Pre-bunking, debunking and education are things that work within the spirit of liberal democracies.
Yes, regulating can be a challenge. It's something that I welcome questions about.
I think this is an essential challenge that we must all band together to fight.
Thank you very much. I look forward to your questions and comments.