Evidence of meeting #114 for Access to Information, Privacy and Ethics in the 44th Parliament, 1st Session. (The original version is on Parliament’s site, as are the minutes.)

A recording is available from Parliament.

Also speaking

Jakub Kalenský  Deputy Director, COI Hybrid Influence, European Centre of Excellence for Countering Hybrid Threats
Aengus Bridgman  Assistant Professor, Media Ecosystem Observatory
Kenny Chiu  Former Member of Parliament, As an Individual
Patrick White  Associate Professor of Journalism, Media School, UQAM, As an Individual
Kathryn Hill  Executive Director, MediaSmarts
Matthew Johnson  Director of Education, MediaSmarts

11:55 a.m.

Conservative

The Chair Conservative John Brassard

Thank you, Mr. Chiu and Mr. Fisher.

Go ahead, Ms. Gaudreau. You have two and a half minutes.

11:55 a.m.

Bloc

Marie-Hélène Gaudreau Bloc Laurentides—Labelle, QC

Thank you very much, Mr. Chair.

As I have only two and a half minutes of speaking time, I think I have more requests than questions.

Mr. Bridgman, you mentioned a number of studies conducted in the United States, and we at the committee would like to have them, along with any additional information such as changes in level of commitment to democracy. That's what concerns us. The bottom line is that we want to find ways to address things that might happen.

There was discussion about reducing the amount of disinformation. Can you suggest to the committee any approaches that might help us come up with better legislation?

Mr. Kalenský, I understand the best practices for defence. At the end, you talked about ways of fixing weaknesses. If we run out of time, I would also ask you to send us additional information, given that you didn't have enough time to fully explain everything in your opening remarks.

I'll give you the next minute to tell me as much as possible.

11:55 a.m.

Deputy Director, COI Hybrid Influence, European Centre of Excellence for Countering Hybrid Threats

Jakub Kalenský

If you're interested in more detail, I'd be more than happy to share with you a report on these four lines of defence. It's about 20 pages.

In this repairing of the systemic weaknesses, I think we have tools like media literacy. We see in countries where they have a higher level of media literacy—Finland, Sweden, Denmark—that there is a smaller problem with disinformation. It's not a zero problem, but it's a smaller one.

Definitely, strategic communication campaigns that try to increase the audience's trust in their institutions have to be depoliticized efforts; they cannot be a promotion of the current political leadership. In countries with a functioning strategic communication system, we again see that the audience's trust is higher. Such a system also works on decreasing polarization: the differences between the capital and the countryside, and between people with higher incomes and those with lower incomes. Again, we see that in countries with a lower level of polarization, the problem with disinformation is smaller.

These would be the parts about repairing the weaknesses, but in case you are interested in more detail, I would be more than happy to share the text.

Noon

Bloc

Marie-Hélène Gaudreau Bloc Laurentides—Labelle, QC

Thank you very much.

Noon

Conservative

The Chair Conservative John Brassard

Thank you very much, Ms. Gaudreau.

Mr. Green, you have two and a half minutes.

Go ahead, please.

Noon

NDP

Matthew Green NDP Hamilton Centre, ON

Thank you.

I want to allow more space for this so that we get solid recommendations coming out of the study, Mr. Kalenský, so I will ask you, with specificity, what legislative or regulatory measures in Europe or elsewhere have been successful in addressing disinformation campaigns, especially when that affects parliamentarians, whether during election periods or throughout the year.

Noon

Deputy Director, COI Hybrid Influence, European Centre of Excellence for Countering Hybrid Threats

Jakub Kalenský

I'm not really sure you would find legislative measures targeted only at disinformation campaigns around elections. I think such measures would be broader and would apply regardless of the election cycle.

We saw the most aggressive measures, like outright bans. Most of them have been in Ukraine, but in the EU there was also a ban on Russia Today and Sputnik. Ukraine has gone further. It also banned channels not owned by the Russian state but still spreading the same disinformation that was being spread by the channels Russia owns—channels owned by a Ukrainian oligarch, Viktor Medvedchuk.

Most of Europe has not done that so far, but these outright bans would probably be the most aggressive solution.

Noon

NDP

Matthew Green NDP Hamilton Centre, ON

On that point, though, let's be clear. Regardless of who owns it, particularly the private sector—you look at Meta, you look at X—if that information is for sale anyway, is it your assertion that we ban all platforms?

I know that in the United States, the Republicans, and even some Democrats, I think, are pushing for the banning of TikTok, yet you look at Cambridge Analytica and the lead-up to January 6, and that insurrection certainly wasn't based on TikTok.

I wonder if you could comment about whether or not the outright bans of these platforms are more theatre than an actual application of a sound policy that wouldn't just see them migrate to other commercial interests like Meta or X.

Noon

Deputy Director, COI Hybrid Influence, European Centre of Excellence for Countering Hybrid Threats

Jakub Kalenský

We definitely see the information aggressors adapting to these measures and migrating to different platforms, but there has been some research, although unfortunately just anecdotal, that they always lose at least some of the audience, not all of it, but at least some of it, and this is sometimes—

Noon

NDP

Matthew Green NDP Hamilton Centre, ON

Just quickly, before we end, I want to go back to the parliamentarian thing.

Would you care to comment? Do you think there should be an opportunity for us to look at the way political parties use these? If we're talking about bans, do you think we might want to look at legislation so that partisan political parties could not use these types of tools when it comes to profiling and targeting of people based on algorithms and misinformation?

Noon

Deputy Director, COI Hybrid Influence, European Centre of Excellence for Countering Hybrid Threats

Jakub Kalenský

I'm afraid that with five seconds to think, I can't really give you a proper answer.

Noon

NDP

Matthew Green NDP Hamilton Centre, ON

Could you submit something for us, for the benefit of the committee?

Noon

Deputy Director, COI Hybrid Influence, European Centre of Excellence for Countering Hybrid Threats

Jakub Kalenský

Yes. I will be happy to think about it.

Noon

NDP

Matthew Green NDP Hamilton Centre, ON

Thank you so much.

Noon

Conservative

The Chair Conservative John Brassard

Thank you.

We like to work on timelines here, Mr. Kalenský. If you could get it to us by Friday, I would appreciate it on behalf of the committee. We have very limited time for our study, so we have to make sure all the information comes in.

That concludes our first panel for today.

Mr. Kalenský, Mr. Bridgman and Mr. Chiu, thank you for taking the time to be here today and share your information with the committee. It was very helpful.

We're going to suspend for a few minutes while we change over the panel.

The meeting is suspended.

12:05 p.m.

Conservative

The Chair Conservative John Brassard

I'm now calling the meeting back to order.

I'd like to welcome the witnesses who will be appearing during the second hour of the meeting. We have Mr. Patrick White, associate professor of journalism at the Université du Québec à Montréal Media School, appearing as an individual.

From MediaSmarts, we have Matthew Johnson, who is the director of education, and Kathryn Hill, who is the executive director.

Mr. White, we're going to start with you.

You have five minutes for your opening address.

12:05 p.m.

Patrick White Associate Professor of Journalism, Media School, UQAM, As an Individual

Good afternoon, everyone.

I'd like to thank the committee members for the invitation.

I've been a journalist since 1990 and a professor of journalism at Université du Québec à Montréal for five years.

I believe that 2024 represents a crossroads for disinformation and misinformation. Content automation has proliferated since the launch of ChatGPT, an AI chatbot based on GPT-3.5, in 2022. Not only that, but a Massachusetts Institute of Technology study published in 2018 showed that false news circulates six times faster on Twitter than fact-checked news. That's cause for concern.

Things have gotten worse on X, formerly Twitter, in the 18 months since businessman Elon Musk took it over, as a result of several changes. These include the possibility of acquiring a blue checkmark, meaning verified status, simply by paying a few dollars a month, and the reinstatement of accounts like that of former U.S. President Trump, who is himself a major vector of disinformation.

These social network algorithms clearly promote content that generates the most traffic, meaning comments, “likes” and sharing, which amplifies the spread of extreme ideas that we've been seeing in recent years.

One current concern is Meta's blocking of news on Facebook and Instagram in Canada since the summer of 2023, which further fuels the growth of disinformation and misinformation by suppressing news from Canadian media, except for sports and cultural news.

A recently published study that was quoted by Reuters says:

comments and shares of what it categorised as “unreliable” sources climbed to 6.9% in Canada in the 90 days after the ban, compared to 2.2% in the 90 days before.

On the political side of things, I believe efforts should be made to get the news back on Facebook and Instagram by the end of 2024, before Canada's federal elections. The repercussions of this disinformation are political. For example, on Instagram, you now have to click on a tab to see political publications. They've been purposely blocked or restricted by Meta for several months now.

The experience is unpleasant for Canadians on Facebook, because more and more content of interest to them from major Canadian media outlets is being replaced by junk news. This reduces the scope of what people are seeing, is harmful to democracy, and also leads to less traffic on news sites. According to a recently published study from McGill University, to which our colleague who testified earlier contributed, news is being replaced by memes on Facebook. It reports the disappearance of five million to eight million views per day of informational content in Canada.

The Canadian government will also have to take rapid action on the issue of artificial intelligence by prohibiting the dissemination of AI-generated content like deepfake images and audio. Bill C-63 is a partial response to harmful content, but it doesn't go far enough. More transparency is needed with respect to AI-generated content.

Oversight is also urgently needed for intellectual property. The Montreal newspaper Le Devoir ran an article about that this morning. What are the boundaries? I encourage you to quickly develop legislation to address this issue, rather than wait 30 years, as was the case for Bill C-11.

Canadian parliamentarians also need to declare war on content farms that produce false news on request about our country and other countries. Foreign governments like China's and Russia's often use that strategy. We mustn't forget that 140 million people were exposed to false news in the United States during the 2020 election. That's clearly very troubling in view of the coming U.S. election this fall. I am also amazed that Canada has been allowing the Chinese Communist Party to continue spreading propaganda press releases on the Canadian Cision newswire for years.

To conclude, I'll be happy to answer your questions. Canada needs to be on a war footing against disinformation, whether generated by artificial intelligence or manually. Stricter rules are required for generative artificial intelligence and for the protection of intellectual property owned by Canadian media and artists, who should be benefiting from these technological advances over the coming years.

Thank you.

12:10 p.m.

Conservative

The Chair Conservative John Brassard

Thank you for your address, Mr. White, and for having kept to your speaking time.

Ms. Hill, you have five minutes to address the committee.

Go ahead, please.

12:10 p.m.

Kathryn Hill Executive Director, MediaSmarts

Good afternoon, members of the committee. My name is Kathryn Hill. I am proud to serve as the executive director of MediaSmarts. Our office is located on unceded Algonquin Anishinabe territory. We are grateful for the invitation to appear today as part of this study.

I'm joined today by MediaSmarts' director of education, Matthew Johnson.

MediaSmarts—if you haven't heard of us—is Canada's centre for digital media literacy. We are a not-for-profit charitable organization, and our vision is that all people in Canada be empowered to engage with all forms of media confidently and critically.

To achieve this goal, we advance digital media literacy through world-class research, education, public engagement and outreach. Through our programs, people in Canada learn to become active, engaged and informed digital citizens.

Digital media literacy is essential to an informed and engaged populace and electorate. Canada is especially in need of a coordinated approach that moves beyond only access and skills-based understandings of digital media literacy.

The recent increase in visual disinformation, manipulated images, bots and artificial intelligence, or what we talk about as deepfakes, requires that we seriously engage in countering disinformation.

A recent report from StatsCan confirms that about 43% of people in Canada are feeling overwhelmed by these massive shifts in technology and information. For example, photographs and videos used to serve as proof that something occurred or happened in a particular way are no longer reliable. Research shows that people of all ages and beliefs are vulnerable to misinformation and disinformation. People in all sectors, including parliamentarians like you, need to know how to verify information and how to tell the difference between reliable and unreliable sources.

We need to promote information verification as a social norm and habit in Canada. Knowing and practising verification skills empowers citizens to mitigate the potential impact of disinformation and other online harms they encounter.

Digital media literacy education has been shown to be an effective approach to addressing misinformation. Around the world, there have been successful interventions with audiences ranging from elementary students to seniors. Our own Break the Fake program and materials have been found to be effective in both our own evaluations and those done by independent evaluators.

The last five years have also shown that not all approaches are equal. Most importantly, it is essential to focus on discernment over just debunking. Many interventions aimed solely at teaching people to recognize misinformation have a side effect of reducing trust in reliable sources, essentially teaching people to be cynical instead of skeptical.

As well, evaluations have identified three essential elements of a successful digital media literacy intervention. First is a focus on critical thinking and intellectual humility. Second is practical instruction in information triage. Finally, successful interventions recognize that in the networked world that we are all a part of, we are not just consumers of information but also broadcasters of information. Digital media literacy is essential to combat this misinformation and disinformation.

For parliamentarians, as elected public figures, the stakes of authenticating and verifying information online are even higher, given that you have a wide public reach and are considered trusted sources of information. When a trusted source or leader makes a misstep and spreads misinformation, the effects can reach a large and broad audience of Canadians and can erode people's trust in institutions, specifically the government.

Parliamentarians and their staff need support to build their digital media literacy skills when it comes to verifying information online.

Given all of this, I would like to conclude by providing two recommendations.

First, we recommend that Parliament, in both the House and the Senate, require mandatory training for all parliamentarians and their staff on how to verify information and combat misinformation and disinformation.

Second, as we have recommended consistently for 15 to 20 years, we recommend that the Government of Canada develop a digital media literacy strategy that would include supporting all people in Canada in developing the skills to navigate the online information ecosystem confidently and critically.

Thank you for your attention.

12:15 p.m.

Conservative

The Chair Conservative John Brassard

Thank you, Ms. Hill. Thank you for being on time as well.

We're going to start our first six-minute round with Mr. Brock.

Go ahead.

12:15 p.m.

Conservative

Larry Brock Conservative Brantford—Brant, ON

Thank you, Mr. Chair, and thank you to the witnesses for their attendance.

I'm going to start with you, Mr. White.

I'm reading from an article entitled “AI-powered disinformation is spreading—is Canada ready for the political impact?” It starts by talking about a story regarding Slovakia's national election last fall:

Just days before [the] election last fall, a mysterious voice recording began spreading a lie online.

The manipulated file made it sound like Michal Simecka, leader of the Progressive Slovakia party, was discussing buying votes with a local journalist. But the conversation never happened; the file was later debunked as a “deepfake” hoax.

On election day, Simecka lost to the pro-Kremlin populist candidate Robert Fico in a tight race.

While it's nearly impossible to determine whether the deepfake file contributed to the final results, the incident points to growing fears about the effect products of artificial intelligence are having on democracy around the world—and in Canada.

According to Caroline Xavier, head of the Communications Security Establishment, “This is what we fear...that there could be a foreign interference so grave that then the electoral roll results are brought into question.” She continued, “We know that misinformation and disinformation is already a threat to democratic processes. [AI] will potentially add to that amplification. That is quite concerning.”

What is Canada currently doing, in your opinion, to address this threat, or what should it be doing?

April 30th, 2024 / 12:20 p.m.

Associate Professor of Journalism, Media School, UQAM, As an Individual

Patrick White

Canada is already working hard, with what it did with Bill C-18 and Bill C-11 for Canadian content, and with Bill C-63 it's going to fight misinformation and harmful content as well. Are we doing enough? Probably not, but AI is an opportunity as well as a threat.

As far as deepfakes are concerned, I would strongly urge the government to legislate on that matter within the next 12 to 18 months, especially on deepfake videos and deepfake audio, as well, which you mentioned.

We have a lot to work on in the next 12 months on that issue, taking into context the upcoming federal election in Canada.

12:20 p.m.

Conservative

Larry Brock Conservative Brantford—Brant, ON

That's correct.

I'll turn now to Ms. Hill and Mr. Johnson.

Thank you for your attendance. I enjoyed our discussion in my office a few weeks ago.

I listened to your opening statement very carefully, Ms. Hill. You talked about some suggestions for parliamentarians moving forward: mandatory training and a digital media strategy for the government as a whole. Can you add a little more meat to that particular discussion, please?

12:20 p.m.

Executive Director, MediaSmarts

Kathryn Hill

Certainly. Would you like it around the digital media literacy strategy?

12:20 p.m.

Conservative

Larry Brock Conservative Brantford—Brant, ON

I'd like it around both.