Evidence of meeting #129 for Access to Information, Privacy and Ethics in the 44th Parliament, 1st Session. (The original version is on Parliament's site, as are the minutes.)

A video is available from Parliament.

Also speaking

Mireille Lalancette, Professor, Political Communication, Université du Québec à Trois-Rivières, As an Individual
Timothy Caulfield, Professor, Faculty of Law and School of Public Health, University of Alberta, As an Individual
Marcus Kolga, Director, DisinfoWatch
Yoshua Bengio, Founder and Scientific Director, Mila - Quebec Artificial Intelligence Institute

3:35 p.m.

Conservative

The Chair Conservative John Brassard

Good afternoon, everyone.

I call this meeting to order.

Welcome to meeting number 129 of the House of Commons Standing Committee on Access to Information, Privacy and Ethics.

Pursuant to Standing Order 108(3)(h) and the motion adopted by the committee on Tuesday, February 13, 2024, the committee is resuming its study on the impact of misinformation and disinformation on the work of parliamentarians.

I'd like to welcome the witnesses with us today for the first hour of the meeting.

As an individual, we have Mireille Lalancette, professor, political communication at Université du Québec à Trois‑Rivières.

Also as an individual, we have Timothy Caulfield, professor, Faculty of Law and School of Public Health at the University of Alberta.

Ms. Lalancette, you have up to five minutes for your opening remarks. You may begin.

Mireille Lalancette Professor, Political Communication, Université du Québec à Trois-Rivières, As an Individual

Good morning, everyone.

Thank you for your time.

I would like to take a fairly broad look at disinformation and elections.

In my opinion, it's a bit like a perfect storm. Why is it a perfect storm? This can be attributed to five factors: the declining role of the media, the increase in—

3:35 p.m.

Conservative

The Chair Conservative John Brassard

Just a moment, Ms. Lalancette. We will suspend for a few seconds because the sound in the room is not loud enough. We'll turn it up a little so that people in the room can hear you properly.

It's a little better now.

Ms. Lalancette, I'll reset the clock and you can start again.

3:35 p.m.

Professor, Political Communication, Université du Québec à Trois-Rivières, As an Individual

Mireille Lalancette

Perfect.

Thank you for inviting me.

I think disinformation and elections go hand in hand right now, because of what I call a perfect storm related to the role of the media, the rise of digital social media, the decline in partisan affiliations, the rise of populism and the increased incidence of election campaigns.

More and more media outlets are in financial trouble. Trust in traditional media is also waning, and news consumption is happening less and less there; more and more people are heading to digital social media platforms like Facebook, TikTok and Instagram to get their news.

This perfect storm is also related to the media funding crisis. More and more media outlets are shutting down, and this is creating a media void that's being filled by digital social media. However, a whole bunch of questions could be raised about the reliability of sources and the variety of content found on digital social media. There's also no code of ethics or journalistic standards governing social media or influencer content. It's still challenging to regulate social media platforms.

All of this is also the result of what we call the decline in partisan affiliations. Fewer and fewer people carry a party membership card and identify with one political party. That has led to electoral volatility, coupled with what we call the rise of populism. During the trucker convoy, many groups showed their dissatisfaction; Canada was split into two blocs, east and west, and regionalism made a comeback. Populism is often protest-based or identity-based. People denounce the elites or focus on certain identities. Founding peoples are pitted against immigrants, for example.

Other factors in the storm are fixed-date elections, which actually don't have fixed dates, and something we call the permanent campaign in my field of research. Candidates no longer campaign only when an election is called; they campaign all the time. So disinformation can be concentrated during elections, but it can also happen anytime.

Where does that disinformation end up? Most of it goes to digital social media, because people get their news on the Internet and because it's easy to use these platforms to create content. In some cases, it's impossible to determine where the content on these platforms comes from. We're seeing more and more deepfakes and fake news. This is being spread not only by people engaging in foreign interference, but sometimes also by political parties themselves. Right now, we're seeing politicians themselves talking about fake news and criticizing the media, copying current practices in the United States, including those of the Republicans.

How can we fight disinformation in this context?

From my perspective as a researcher, I believe it's important to acquire good media literacy and show Canadians how to distinguish false information from information that might be true.

It's also important to ensure that platforms are moderated. We saw an example of this last week, when the mayor of Montreal decided to disable user comments on her posts on X, formerly Twitter.

In addition, it's important for states to draw inspiration from what's being done elsewhere, particularly in Europe, to regulate practices and digital social media platforms. Ethical issues and problems related to information and disinformation need to be raised, particularly when it comes to electoral politics.

I'm ready to answer your questions.

Thank you.

3:40 p.m.

Conservative

The Chair Conservative John Brassard

Thank you for your presentation, Ms. Lalancette.

Mr. Caulfield, I want to welcome you to the committee.

You have up to five minutes to make your opening statement. Go ahead, sir.

Timothy Caulfield Professor, Faculty of Law and School of Public Health, University of Alberta, As an Individual

Thank you very much. It's an honour to be here.

Hello from Edmonton and Treaty 6 territory.

This is a subject that I feel extremely passionate about. The spread of misinformation is one of the greatest challenges of our time. Research shows that this is something that not only experts believe but also something that people around the world believe.

It's not hyperbole to say that misinformation is killing people. Misinformation is having a tremendous impact on democracies around the world. This is certainly something that we all need to address.

The battle against misinformation itself is very controversial, even when you look at ideas about what the definition of misinformation is.

I want to emphasize to the committee that even if we focus on things that are demonstrably false—about elections, vaccines, climate change or immigrants—we can, as a society, make a real difference.

This is a topic that I've been studying for a very long time, and as you heard from our last expert, I've never seen anything like what we're seeing right now. I just want to highlight a couple of challenges that build on the points she made, challenges that make this moment particularly difficult.

Number one, there is social media, absolutely, but in addition to that there is AI. AI is going to make the spread of misinformation more challenging. It's going to enable realistic, rapidly produced content that is very difficult to distinguish from reality. Studies have shown that many people believe they can spot AI and deepfakes, but research consistently tells us they cannot, even when they're warned that AI might be coming.

The second thing that I find incredibly challenging right now is the politicization of misinformation and the connection of misinformation with political identity and polarization. This is a trend that is increasing and is doing incredible harm. It's not only horrible for democracy, but we also know that once misinformation becomes part of a person's political identity, it becomes more difficult to change their mind.

The third challenge is the degree to which state actors are pushing misinformation. The goal of many state actors and, by the way, of many misinformation-mongers, is to create distrust. The distrust that we see in institutions today is largely—not entirely, but largely—created by the spread of misinformation. Those spreading misinformation are trying to create distrust and information chaos. Alas, they are succeeding.

How do we respond? What can we do?

This is a generational problem. You've probably heard these recommendations over and over again, but we must come at this with a multipronged approach.

What does that mean? It's teaching critical thinking skills and media literacy and doing this across.... I wrote an article in which I suggested we start in kindergarten. We have to teach these skills throughout the life cycle, as they do in many countries.

We have to pre-bunk. We have to debunk. We have to figure out the best way to set labels and warnings on things like AI. Yes, we have to work with the social media platforms and other tech companies. Yes, there are regulatory tools that can be adopted.

The other thing I want to emphasize, which I think is so relevant to this committee, is the spread of misinformation about the fight against misinformation. As I've already said, much of the distrust that we see in society has been created by fake news and by the spread of misinformation. By the way, research consistently shows that.

We also have to recognize that fighting misinformation is not just about curtailing people's voices. On the contrary, most of the tools that we can use in a liberal democracy to fight the spread of misinformation can be used within the marketplace of ideas. Pre-bunking, debunking and education are things that work within the spirit of liberal democracies.

Yes, regulating can be a challenge. It's something that I welcome questions about.

I think that this is an essential topic that we must all band together to fight.

Thank you very much. I look forward to your questions and comments.

3:45 p.m.

Conservative

The Chair Conservative John Brassard

Thank you, Mr. Caulfield.

We're going to go to our six-minute rounds. Mr. Barrett will go first.

Go ahead, please.

3:45 p.m.

Conservative

Michael Barrett Conservative Leeds—Grenville—Thousand Islands and Rideau Lakes, ON

I have a quick question for both the witnesses, Madam Lalancette and Mr. Caulfield.

Are you familiar with the news articles from August talking about a bot campaign backing the leader of the Conservative Party?

We'll start with you, Madam Lalancette.

Can you just indicate yes or no?

3:45 p.m.

Professor, Political Communication, Université du Québec à Trois-Rivières, As an Individual

3:45 p.m.

Conservative

Michael Barrett Conservative Leeds—Grenville—Thousand Islands and Rideau Lakes, ON

Mr. Caulfield, are you familiar with it?

3:45 p.m.

Professor, Faculty of Law and School of Public Health, University of Alberta, As an Individual

Timothy Caulfield

Yes, and I'm familiar with other bot campaigns.

3:45 p.m.

Conservative

Michael Barrett Conservative Leeds—Grenville—Thousand Islands and Rideau Lakes, ON

There was a ton of media coverage. I want to read one of the headlines. This is from CTV News. It says, "Conservatives reject online bot allegation after Poilievre rally". The article is set up so that it has the Conservatives denying a charge that has been made against them.

Now, I want to fast-forward you both to August 16—no, we'll say August 28, when a Canadian Press story was published across different outlets. I'm looking at it in the National Post, where it was titled, “No evidence Conservatives were behind social media bot campaign that praised Poilievre: study”. This study was done by the Canadian Digital Media Research Network, and their findings were that there was “no evidence that indicates a political party or foreign entity employed this bot network for political purposes.”

With that precursor—before I ask my question—Mr. Caulfield, are you familiar with the reporting I'm citing from the National Post?

3:45 p.m.

Professor, Faculty of Law and School of Public Health, University of Alberta, As an Individual

3:45 p.m.

Conservative

Michael Barrett Conservative Leeds—Grenville—Thousand Islands and Rideau Lakes, ON

Okay, that's perfect.

3:45 p.m.

Professor, Faculty of Law and School of Public Health, University of Alberta, As an Individual

Timothy Caulfield

I don't know the depth of the issue.

3:45 p.m.

Conservative

Michael Barrett Conservative Leeds—Grenville—Thousand Islands and Rideau Lakes, ON

Madame Lalancette, are you familiar with it?

3:45 p.m.

Professor, Political Communication, Université du Québec à Trois-Rivières, As an Individual

Mireille Lalancette

I know the work of this research centre, but I haven't read the report.

3:45 p.m.

Conservative

Michael Barrett Conservative Leeds—Grenville—Thousand Islands and Rideau Lakes, ON

We're talking about the spread of misinformation. We had two political parties, the Liberals and the NDP, that went out...and the media printed as sure fact, without having verified the claims, that this was something that was paid for. There were allegations that it was orchestrated through foreign state actors.

Now, look, we're at a time in our country when we have an inquiry happening into foreign interference in our democracy. We have real state actors who are spreading disinformation. It's being propagated, of course, in the political discourse, but also in media.

Here we have an example of, for once, an independent group disproving a claim that was printed without it having been proven. Then we have political parties, the Liberals and the NDP, that didn't withdraw the allegation or say, "Oh, I stand to be corrected." We believe this is because it was again leveraged for partisan purposes.

Let me read you a quote from the author of the report, who said, “The finger-pointing without evidence is actually quite destructive and leans into this hyper-partisan, hyper-polarized information ecosystem that we find ourselves in today in Canada.”

The study says that the initial bot campaign received very little attention because it came from sock puppet accounts, which don't have a real following, but it was then amplified, with millions of impressions, by Canadian political actors who didn't have altruistic intentions.

I'll start with you, Madame Lalancette. Is this type of misinformation that's being propagated, in this case by the Liberals and the NDP, part of the problem? Certainly that's what the report's author suggested.

3:50 p.m.

Conservative

The Chair Conservative John Brassard

We have a minute.

3:50 p.m.

Professor, Political Communication, Université du Québec à Trois-Rivières, As an Individual

Mireille Lalancette

This is what I spoke about in my presentation.

Disinformation comes not only from bots or foreign countries but also from the way political actors are communicating and want you to believe something. They are communicating that they feel this is foreign interference and that the party bought the bots in order to get attention.

Yes, this is part of the problem. It's coming from everywhere right now in the chamber. It doesn't have a party association. You can see it going in every direction, I think.

3:50 p.m.

Conservative

The Chair Conservative John Brassard

Thank you, Mr. Barrett.

Mr. Caulfield, you'll have time to weigh in. There will be some other questions, I'm sure.

Ms. Khalid, go ahead for six minutes, please.

Iqra Khalid Liberal Mississauga—Erin Mills, ON

Thank you very much, Mr. Chair.

Thank you to the witnesses for being here today and for your very important testimony.

Building on some of the comments my honourable colleague made, I don't think it's about which political party has or has not done anything maliciously. When we're talking about foreign interference, it's about who is vulnerable to it. The bots example that Mr. Barrett raised is a prime example of how a political party in Canada is vulnerable to being used by Russian bots in order to disrupt democracy in Canada.

I want to question you, Professor Caulfield.

You talked specifically about what tools we can use to prevent that kind of interference within our democratic systems. You talked about teaching, but I also want to talk about accountability.

When we're talking about how to prevent this from happening, other than through teaching and raising awareness, how do we build accountability within political parties to ensure that, for example, the Leader of the Opposition is not susceptible to bots taking over his political campaign? How do we build on that to ensure the Canadian democratic system is protected?

3:50 p.m.

Professor, Faculty of Law and School of Public Health, University of Alberta, As an Individual

Timothy Caulfield

This is a great question.

I'd like to start by highlighting a very recent study. I believe it came out yesterday or the day before. It's a study done in the United States asking people what kind of misinformation the public is most concerned about. The number one response was misinformation from politicians. I think, by the way, that this data would be replicated in Canada. The public does not want to hear misinformation emanating from politicians, even though they know it's there. They want steps taken to stop it.

The other thing that is very important to emphasize is that this is where there is some degree of agreement across the political divide—stopping the use of things like AI and bots in the context of elections and political discourse. There was a survey done by EKOS Research that found very high support, for example, for the use of some type of regulatory response to stop the use of AI in the context of a political campaign.

I think this tells us that the Canadian public really values honesty in the political domain, even though they're realistic about it. They're not naive. They welcome the potential use of regulatory measures in this space. They're less comfortable with regulatory responses—or there's more of a divide—when we talk about regulating misinformation, because that feels like infringing on freedom of expression. There are genuine legal challenges there. However, when you're talking about protecting the integrity of democracy and our elections, I think there's room for a regulatory response.

Iqra Khalid Liberal Mississauga—Erin Mills, ON

Thank you for that, Professor.

We saw, with respect to regulations, how much misinformation our online harms bill received. What is the boundary between online harms and the quelling of freedom of expression?

You mentioned, in your comment just now, the Canadian public wanting honesty. I recently came across an X account—I keep wanting to say “Twitter account”. It's called @PierreIsLying and it highlights, on a daily basis, how many times the Leader of the Opposition lies in public during question period. They outline it and put down stuff like how many lies there are per minute. It's that kind of thing.

When we're talking about accountability—about that honesty and that teaching moment for Canadians—how important is it to fact-check the data or information that politicians and political discourse are providing to Canadians?

3:55 p.m.

Professor, Faculty of Law and School of Public Health, University of Alberta, As an Individual

Timothy Caulfield

I think it's very important. Again, it's something that you see the public say they want. The problem, of course, is that because this has become so politicized, and because the fight against misinformation has become so politicized, people trust fact-checkers less. They'll say it's partisan.

Some really interesting research has come out about the degree to which the response to fact-checking differs when an issue has become politicized. As I said earlier in my opening statement, several studies have shown that once this becomes about political identity, whether it's misinformation about vaccines or immigrants, people become more resistant to fact-checking. We need to do more research on this exact topic. In fact, this is something we're researching right now, exploring what kinds of tools we can use when misinformation has become part of political identity.

The other problem that's happening, of course, is that once a bit of misinformation becomes part of a political platform, it becomes an ideological flag. Once that happens—we've seen that happen with, for example, vaccine misinformation, something that we study—it becomes very resistant to change.

There is some suggestion of tools that can be used, such as pointing to what the scientific consensus says, what the body of evidence actually says, and making it clear what that body of evidence is, but there's no doubt that because this has become so political, it has become more challenging.