Evidence of meeting #134 for Access to Information, Privacy and Ethics in the 44th Parliament, 1st Session. (The original version is on Parliament’s site, as are the minutes.)

Also speaking

Jacob Suelzle  Correctional Officer, Federal, As an Individual
Michael Wagner  Professor and William T. Evjue Distinguished Chair for the Wisconsin Idea, University of Wisconsin-Madison, As an Individual
Samantha Bradshaw  Assistant Professor, New Technology and Security, As an Individual
Karim Bardeesy  Executive Director, The Dais at Toronto Metropolitan University

René Villemure Bloc Trois-Rivières, QC

Just as an aside, Mr. Bardeesy, in Trois-Rivières we have a lot of media outlets. However, since Facebook blocked news access, newsrooms have been struggling. We still have a lot of media outlets, but few journalists. As a result, the news may not be as reliable and people don't trust it as much. This goes a bit beyond the subject before us today, but it is still a problem that we should look into.

Mr. Chair, I understand that my time is up. I will end on that note.

Thank you.

5:40 p.m.

The Chair Conservative John Brassard

Thank you, Mr. Villemure.

Mr. Green, for two and a half minutes, go ahead, please.

Matthew Green NDP Hamilton Centre, ON

Thank you very much.

Ms. Bradshaw, you probably listened to my exchange there. I know you're Canadian, so you have context for this.

My friend Mr. Villemure talked about cognitive warfare. Steve Bannon, chief far-right extremist strategist with connections to Canada's far-right extremist movement, also talked about cognitive warfare: in essence, flooding the zone with a word that I can't say because it's unparliamentary.

Could you comment on the ecosystem of third-party political actors and fake news, quite literally these fake online platforms that pop up? You can probably list them; there are probably a dozen that I can think of off the top of my head.

These online platforms look like news. You talked about Buffalo News. There's also the Western Standard. There's a whole bunch of these far-right extremist outlets. How do average people sift through all of that stuff in an electoral cycle in order to make informed decisions based on the facts?

5:40 p.m.

Assistant Professor, New Technology and Security, As an Individual

Dr. Samantha Bradshaw

First, maybe I'll talk a little bit about the disinformation-for-hire groups and the industry backing a lot of disinformation campaigns. It reminds us that we should not only focus on political and cognitive warfare but also recognize that many of these actors are financially incentivized. Not just the platforms but the creators of disinformation themselves are incentivized because they can generate advertising revenue or business deals with governments.

When thinking about policy responses, we could also look at raising the costs of engaging in these kinds of activities by making them less profitable: taking the typical scam and fraud approaches and applying them to disinformation and to the groups that try to generate advertising revenue by creating fake news stories and getting people to visit websites that show them ads. This is all part of the broader ecosystem of challenges.

When it comes to what citizens should do to navigate this complicated environment, I do think that we don't always give citizens enough credit for the diversity of media they already consume. We don't just get our news from social media. It does play an increasingly important role, but so does what we read in newspapers, what we watch on TV, who our social circles are and who's immediately around us in our community. All of these things play really important roles in shaping our political knowledge and then, therefore, our behaviours.

When we're thinking about solutions, we can home in on the social media angle, but we can also think about building more robust social institutions that empower people, through other kinds of media, to have that diverse knowledge and to generate their own political knowledge and opinions.

5:45 p.m.

The Chair Conservative John Brassard

Thank you, Mr. Green.

Thank you, Ms. Bradshaw.

Before I ask you a question, Mr. Bardeesy, I want to apologize for asking it in English, even though your French is very good.

I want to congratulate you for that.

I have a question. None of this has been touched on today, but we heard from prior witnesses about the impact that artificial intelligence is going to have on the propagation of disinformation and misinformation, so I'm going to give you both an opportunity, in a minute or less, to share your thoughts with the committee on that.

I'll start with you, Mr. Bardeesy, if you don't mind.

5:45 p.m.

Executive Director, The Dais at Toronto Metropolitan University

Karim Bardeesy

Sure.

Kevin Kelly, who was this Internet guru back in the day, said that the Internet is fundamentally a giant copying machine, and AI has the ability to create copies of things at incredible speed, at incredibly low cost and in incredible volume.

I'm specifically concerned about AI-generated audio content as opposed to visual content. There's some evidence that audio content is harder to discern as a deepfake. One thing to be aware of as we prepare people on this issue is having a specific line of inquiry about audio-related deepfake content.

I also commend to this group Bruce Schneier, a Harvard Kennedy School researcher who has these great little articles on 16 ways that AI can be useful or interesting for democracy. It's not all bad. There may be some specific uses around the edges of AI that could help us and could help you do your job, but it is definitely an area of concern.

5:45 p.m.

The Chair Conservative John Brassard

Thank you.

Ms. Bradshaw, go ahead.

5:45 p.m.

Assistant Professor, New Technology and Security, As an Individual

Dr. Samantha Bradshaw

I will plus one everything Professor Bardeesy has said.

For me, the greatest challenge with AI is the way we talk about it in public and in the media. Coming back to the idea of trust, when we're constantly telling people that they can't trust anything they see, read or hear, we're not creating resilient citizens who are able to participate effectively in society. Going forward, it will be really important to create digital literacy programs that don't build too much overt skepticism or the sense that we can no longer trust anything we read, see or hear.

We do need some level of trust, so I'm all in there.

5:45 p.m.

The Chair Conservative John Brassard

That's interesting, because one of the things we have heard is exactly what you've talked about: critical thinking and digital literacy. Finland has been used as the model for education in grade school. I suspect, though I can't prejudge any conclusions or recommendations of this committee, that it may be a big part of what we provide by way of a recommendation going forward.

I want to thank you both for being here today. I invite you, if you have any afterthoughts, to submit them to the clerk, because oftentimes, as I said earlier, you walk away and you're sitting there in bed at night and you think, “Ah, I should have said this,” so I'm giving you that opportunity to provide that to the committee. If you could do so by five o'clock on Friday, that would be helpful. We like to have deadlines at this committee.

For the sake of committee members, before I conclude, I want to let you know that Thursday we have TikTok, Google, Meta and X coming for this study, so prepare your questions.

That's it for today. Thank you, everyone: our technicians, the clerk, the analysts and our witnesses.

The meeting is adjourned.