Evidence of meeting #130 for Access to Information, Privacy and Ethics in the 44th Parliament, 1st Session. (The original version is on Parliament’s site, as are the minutes.)

A recording is available from Parliament.

Also speaking

Jon Bateman  Senior Fellow and Co-Director, Technology and International Affairs Program, Carnegie Endowment for International Peace
Benjamin Fung  Professor and Canada Research Chair, McGill University, As an Individual
Clerk of the Committee  Ms. Nancy Vohl

4:15 p.m.

Professor and Canada Research Chair, McGill University, As an Individual

Benjamin Fung

Do you mean the non-state actors like companies?

Matthew Green NDP Hamilton Centre, ON

Correct, and not just companies but ideologically motivated people, whether they're attached formally to think tanks or whether they're lone actors, hackers or people I envision in a dark basement who are really trying to manufacture consent around something they care about.

How widespread is this technology, and how usable is it in its current forms? Maybe you could talk about the ways that it's accessible.

4:15 p.m.

Professor and Canada Research Chair, McGill University, As an Individual

Benjamin Fung

Currently, some non-state actors work together to boost disinformation. For example, they will "like" the same posts together. This is one of the ways we can detect this type of activity.

Let's say that there are thousands of posts about television, for example. Even if both of us like to talk about television, it is very unlikely that we will co-like or co-comment on the same set of posts. We can use this type of information to detect this type of coordination.
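The co-engagement signal Professor Fung describes can be sketched as a pairwise overlap check. This is a minimal illustration, not his actual detection system; the account names, post IDs, and threshold are all hypothetical.

```python
from itertools import combinations

def flag_coordinated_pairs(likes, threshold=0.5):
    """Flag account pairs whose liked-post sets overlap suspiciously.

    likes: dict mapping account -> set of post IDs the account liked.
    Genuine users who merely share an interest rarely like the *same*
    set of posts, so a high Jaccard overlap suggests coordination.
    """
    flagged = []
    for a, b in combinations(sorted(likes), 2):
        inter = likes[a] & likes[b]
        union = likes[a] | likes[b]
        if union and len(inter) / len(union) >= threshold:
            flagged.append((a, b))
    return flagged

# Hypothetical accounts: two coordinated boosters and one organic user.
likes = {
    "booster1": {"p1", "p2", "p3", "p4"},
    "booster2": {"p1", "p2", "p3", "p5"},
    "organic":  {"p2", "p9", "p10"},
}
print(flag_coordinated_pairs(likes))  # [('booster1', 'booster2')]
```

The two boosters share three of five distinct posts (Jaccard overlap 0.6), while the organic account overlaps with each booster on only one post, so only the booster pair is flagged.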

Matthew Green NDP Hamilton Centre, ON

Mr. Bateman, again, I think about Elon Musk, his takeover of Twitter and the way in which he's shaping the discourse of this digital public forum. Can you talk about non-state actors and the potential threat of undermining our democracy?

4:15 p.m.

Senior Fellow and Co-Director, Technology and International Affairs Program, Carnegie Endowment for International Peace

Jon Bateman

Yes. I would say that actors other than foreign states are the main sources of mis- and disinformation. If you think about the perspective of an individual voter going through an election cycle, what's all the political information that he or she encounters? Almost none of it would be from any foreign actor. It would be from friends, families, community leaders, national politicians, local politicians and the news media. That is really the information environment in sum and substance. If any of those actors are spreading mis- and disinformation, as is frequent, that would be the primary problem facing democracies.

Matthew Green NDP Hamilton Centre, ON

Mr. Bateman, in your work you've identified 10 policy interventions, but you've also stated that there's no silver bullet.

With some specificity, given the contemplation that we have in this committee for recommendations, what might you suggest as a series of policy interventions that might be helpful, notwithstanding the 10 that you've already provided? Maybe you want to highlight a couple from the 10.

4:20 p.m.

Senior Fellow and Co-Director, Technology and International Affairs Program, Carnegie Endowment for International Peace

Jon Bateman

I would highlight at least two on there and one that's not on there.

The two that I would highlight are supporting local journalism and supporting media literacy programs. I mention those not because they're better or worse than the others, but because they have a higher ceiling. They could accomplish more over time than many of the other more small-bore measures that we're already highly invested in, which are more tactical in nature.

The third recommendation that I would make is a meta-recommendation. It's that we need to get better at informing ourselves about these informational dynamics and threats. That would start with, for example, helping researchers get better access to information from tech platforms and creating grants and other pathways to ensure this research actually occurs.

Matthew Green NDP Hamilton Centre, ON

Would that include algorithmic transparency?

4:20 p.m.

Senior Fellow and Co-Director, Technology and International Affairs Program, Carnegie Endowment for International Peace

Jon Bateman

It could include algorithmic transparency. It could also include transparency about major accounts, interactions and platforms.

There's a whole host of data. A colleague of mine at Carnegie has compiled some of that. I'd be happy to pass that on.

Matthew Green NDP Hamilton Centre, ON

On that theme, it was noted that it's not being studied by independent researchers in meaningful ways—I'm talking again about algorithms—and market viability of such changes is uncertain since the core business model for all major platforms is based on optimizing engagement.

Is it not the case that our major platforms have an incentive for what they call “clickbait” or “rage clicking”, which is often fed by misinformation and disinformation?

4:20 p.m.

Senior Fellow and Co-Director, Technology and International Affairs Program, Carnegie Endowment for International Peace

Jon Bateman

Unfortunately, yes.

Every platform is based on the business model of maximizing people's time and interest on the platform. That means the content that does well algorithmically is content that intrigues, outrages, upsets or amuses. False content is often more inclined to be sensational, outrageous and clickbaity.

We do have a conflict of interest here with the platforms. They are designed, in many ways, to spread disinformation.

Matthew Green NDP Hamilton Centre, ON

Thank you.

4:20 p.m.

Conservative

The Chair Conservative John Brassard

Thank you, Mr. Bateman, Mr. Fung and Mr. Green. That concludes our first round of questioning.

We're going to go to two five-minute rounds, followed by two-and-a-half-minute rounds for Mr. Villemure and Mr. Green.

Mr. Barrett, you have five minutes. Go ahead, sir.

4:20 p.m.

Conservative

Michael Barrett Conservative Leeds—Grenville—Thousand Islands and Rideau Lakes, ON

Mr. Fung, given what we know about the disinformation campaigns targeting our elections, particularly the campaign perpetrated by the communist dictatorship in Beijing against former Conservative member of Parliament Kenny Chiu, the current Liberal government has failed to adequately address this. This is borne out by the fact that we have to have a commission to look into foreign interference and that the legislation that's been put forward has not come in the first, second, third, fourth, fifth, sixth or seventh year of their mandate.

What should government be doing today to prevent hostile foreign state actors, like the communist dictatorship in Beijing, from putting their thumb on the scale of Canadian democracy?

4:20 p.m.

Professor and Canada Research Chair, McGill University, As an Individual

Benjamin Fung

There are two different ways. One way is to set up government agencies on the U.S. model, like the Global Engagement Center, and give them the authority to monitor social media posts related to the democratic process. They have the technological capability and the legal authority to stop some of this disinformation.

Another approach is to create a not-for-profit organization and give it the authority to do similar work, but one that is more independent of the government.

Michael Barrett Conservative Leeds—Grenville—Thousand Islands and Rideau Lakes, ON

For anyone who's not familiar, and for the purpose of our report, can you provide some examples of the types of actions the CCP, the communist dictatorship in Beijing, has used to try to influence our elections?

If you have another example of any countermeasure that could be applied to that, please add it.

4:25 p.m.

Professor and Canada Research Chair, McGill University, As an Individual

Benjamin Fung

One example is that it is not just social media disinformation. When we talk about disinformation, we often talk about social media, but the CCP is not working only on social media. It also works with traditional media, meaning the Chinese-language media operating in Canada. There are newspapers. There are radio stations in Vancouver and Toronto. They are collaborating with the CCP and with different Chinese organizations operating in Canada.

One of the questions raised previously was about the difference between Russian disinformation and Chinese disinformation here. It is the economic power: China can use advertising to directly control the content that radio stations and newspapers put out and which commentators they invite onto the radio stations. It can use its local economic power to control that, which is not the case with Russia.

To fight against this type of collaboration, I think Bill C-70 will play a part by trying to identify the foreign agents in this case.

4:25 p.m.

Conservative

Michael Barrett Conservative Leeds—Grenville—Thousand Islands and Rideau Lakes, ON

Thank you.

Mr. Bateman, you're someone who has experience assessing a foreign state's senior leadership and cybersecurity. Are you able to identify any systems or countermeasures that the current government could have put in place up to this point, or could quickly put in place now, to protect us from malicious state actors and protect our national security, our economy and our democracy?

4:25 p.m.

Senior Fellow and Co-Director, Technology and International Affairs Program, Carnegie Endowment for International Peace

Jon Bateman

I'm not familiar with gaps within the Canadian system specifically, but I can offer some best practices from other systems.

There are a variety of tools available to governments to fight foreign interference. There are naming and shaming sanctions and indictments. There are targeted, technical actions, such as cyber-operations that could be carried out to disrupt the foreign activity, especially during a sensitive, temporary period, such as before or after an election. There are others, as well, like Professor Fung mentioned, that are simply public disclosure and public information.

One path would be to build capacity in each of those areas, but another path would be to build connectivity across these areas and make sure that they're working together, which is something the U.S. government has done.

I will say that, in the end, it's not clear how effective any of these policies are. We've been naming and shaming, indicting, sanctioning and disrupting these adversaries for some time. It probably has some operational impact on them, but it doesn't stop the activity.

4:25 p.m.

Conservative

Michael Barrett Conservative Leeds—Grenville—Thousand Islands and Rideau Lakes, ON

Thanks very much for your response.

4:25 p.m.

Conservative

The Chair Conservative John Brassard

Thank you, Mr. Barrett.

Thank you, Mr. Bateman.

They seem to be even more emboldened now.

Mr. Fisher, we're going to go to you for five minutes. Go ahead.

Darren Fisher Liberal Dartmouth—Cole Harbour, NS

Thank you very much, Mr. Chair.

Thank you very much to our witnesses for being here today.

Mr. Bateman, I want to tag on to some of the things Mr. Green brought up. I notice you have your publication there, proudly to your right. I think it's great that you are proud of that document. We're hoping to have some of the social media platforms here for this study, which would be very interesting.

As you know, social media platforms do everything they can to hold and maximize your attention for as long as possible. I think you said to Mr. Green that they're “maximizing people's time”. Of course, they use algorithms to place in the feed the content you are most likely to be interested in, focus on or interact with.

In your publication, you talk about the importance of—or maybe it's a recommendation—finding a way to change the way algorithms are used. I'm interested in your thoughts on how that's possible. Make us understand what that would look like, how it might help solve the problem and why it hasn't been studied by independent researchers yet.

Is it just beginning, or do you expect that to happen in the near term?

4:25 p.m.

Senior Fellow and Co-Director, Technology and International Affairs Program, Carnegie Endowment for International Peace

Jon Bateman

I'd be happy to.

As we've discussed, the major social media platforms today have recommendation algorithms and other design choices, such as the way their buttons and apps look, feel and interact, that are designed to maximize engagement, but you could maximize something else. You could maximize, for example, the civility of discourse so that, if there were a long series of posts going back and forth on a controversial issue, you could actually bring to the top the one that seems the most clear and helpful and is achieving some amount of support or balance from both sides.

Other people have explored using algorithms to deter or dissuade people from posting toxic content by trying to nudge them in a more positive direction.

There are many options here. In essence, we just have to maximize for something other than engagement or for a combination of engagement with something else.
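The alternative objectives Mr. Bateman describes can be sketched as a re-ranking rule that blends engagement with another signal. This is a hypothetical illustration; the weights, the civility scores, and the post data are all invented for the example, not drawn from any real platform.

```python
def rank_posts(posts, engagement_weight=0.3, civility_weight=0.7):
    """Rank posts by a weighted blend of engagement and civility.

    Each post is a dict with hypothetical 'engagement' and 'civility'
    scores in [0, 1]. A pure engagement ranking surfaces the most
    provocative post first; blending in civility can reorder the feed.
    """
    def score(p):
        return (engagement_weight * p["engagement"]
                + civility_weight * p["civility"])
    return sorted(posts, key=score, reverse=True)

posts = [
    {"id": "outrage",  "engagement": 0.9, "civility": 0.1},
    {"id": "balanced", "engagement": 0.5, "civility": 0.9},
]
print([p["id"] for p in rank_posts(posts)])  # ['balanced', 'outrage']
```

With the civility-heavy weights, the calmer post outranks the provocative one (0.78 vs. 0.34); setting `engagement_weight=1.0` and `civility_weight=0.0` recovers the engagement-maximizing order that platforms use today.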

Why hasn't this been done? It's because engagement is how you attract eyeballs, and eyeballs are how you attract advertisers or subscribers and, thereby, make money.

There are academics who are experimenting with what are sometimes called civically oriented platforms. It's a worthwhile effort, but it's unlikely that these would ever be commercially viable alternatives because people actually want the high-engagement platforms.

Darren Fisher Liberal Dartmouth—Cole Harbour, NS

I'm going to stretch this a little bit. If they were to look for the positive and try to encourage the positive.... We used to use the line “if it bleeds, it leads” for news stories. It's easier to enrage. It's easier to get people to complain or to post something or to pay attention to something a little more toxic.

You touched on this because you said that might reduce the advertising funds these social media platforms would get if they decided to change the algorithms to go in a more positive direction. I'm interested in your thoughts on what that might look like monetarily. Is that something you've even looked at?

Is it potentially going to cut their profits in half? Is it that significant?

4:30 p.m.

Senior Fellow and Co-Director, Technology and International Affairs Program, Carnegie Endowment for International Peace

Jon Bateman

I don't have an estimate, but I think it would be extremely significant. I think the most important factor from a competitive point of view is whether this is something that a platform would be endeavouring to do by itself, thus falling behind in the competitive marketplace, or something that would happen collectively.

I'll give an example that has been playing out in real life. I believe the European Union now has a regulation that requires platforms to at least offer an option for a chronological feed instead of an algorithmically curated feed. It's not that big a deal to just offer that as an option—most users do choose the default—but that form of regulation then creates a level playing field so that all platforms would have that as an option.