Evidence of meeting #115 for Access to Information, Privacy and Ethics in the 44th Parliament, 1st Session. (The original version is on Parliament’s site, as are the minutes.)

A recording is available from Parliament.

Also speaking

Ben Nimmo, Threat Investigator, OpenAI, As an Individual
Joel Finkelstein, Founder and Chief Science Officer, Network Contagion Research Institute
Sanjay Khanna, Strategic Advisor and Foresight Expert, As an Individual

1 p.m.

Conservative

The Chair Conservative John Brassard

Thank you, Mr. Finkelstein.

I'm finding it difficult to cut you off. I find the information that you're providing fascinating.

1 p.m.

Founder and Chief Science Officer, Network Contagion Research Institute

Joel Finkelstein

John, would you mind talking to my wife about that?

1 p.m.

Voices

Oh, oh!

1 p.m.

Conservative

The Chair Conservative John Brassard

If you talk to mine, yes.

Before I turn the floor over to Mr. Villemure, I want to let you know that he will be asking his questions in French. I mention it for the witnesses whose preferred language is English.

I'm just going to give a second for Mr. Khanna to set up his earbud.

The information being shared with the committee today is very important, so I want all the witnesses to understand what we are saying.

Go ahead, Mr. Villemure. You have six minutes.

1 p.m.

Bloc

René Villemure Bloc Trois-Rivières, QC

Thank you, Mr. Chair.

Thank you to the witnesses for being with us today.

Mr. Finkelstein, I'll start with you.

You said that “we're being fed a conflict.”

I'd like to look at the situation at a broader level. Oftentimes, when I talk to people in Trois‑Rivières about the Ukraine-Russia war, for instance, they tell me that everything they're hearing indicates that Ukraine is good and Russia is bad. The same can be said of many other conflicts.

To some extent, we are all targets of a narrative being pushed on us by social media or, I fear, sometimes even news agencies or media organizations with a wider reach.

How are ordinary people supposed to navigate that to get a clear sense of the issue? In the example I just gave, the message is that Ukraine is in the right and Russia is in the wrong. That may well be true, but how are people who aren't experts on the issue supposed to make up their minds?

1 p.m.

Conservative

The Chair Conservative John Brassard

Which witness is that for?

1 p.m.

Bloc

René Villemure Bloc Trois-Rivières, QC

The question is for Mr. Finkelstein.

1 p.m.

Founder and Chief Science Officer, Network Contagion Research Institute

Joel Finkelstein

It's a great question.

My father once had great advice. He said that it's not good to put people on pedestals because eventually everyone needs to pee.

It's sound advice in this case. It's important that we understand our own faults, and it's important that we understand the faults of our allies. That's part of what it means to be able to have an honest conversation: looking at uncomfortable facts and being able to deliberate on them in ways that don't leave us in fear of uncertainty of.... What if people can't handle the truth?

When we can stare at the uncomfortable facts, warts and all, then we're always in a better position to manage threats more strategically. That means we need a vote of confidence to be able to talk about those things, and to amplify and elevate honest and hard conversation.

It's really ironic. People are so worried about what happens if we know the truth about what's happening and maybe our allies aren't as good as we are—

1 p.m.

Bloc

René Villemure Bloc Trois-Rivières, QC

Thank you, Mr. Finkelstein. That answers my question. Sorry to cut you off, but I have more questions to ask.

It's said that the atomic bomb is what ended the Second World War. Today the claim seems to be that artificial intelligence will be the tool of choice in the next war, with its capacity to spread disinformation.

Are we at war?

1 p.m.

Founder and Chief Science Officer, Network Contagion Research Institute

Joel Finkelstein

That's such a good question.

Going back to what we were saying about whether or not we trust each other, the real question is whether we trust ourselves.

The danger that AI poses, especially generative AI, is that it can trigger what's called an authenticity crisis. You won't know if you're talking to a real human. You won't know if the response you got was from somebody who's deliberate or if it's the most elegant, dexterous and successful autopilot that's ever been created. You won't know the difference. You won't be able to tell who you're talking to online. You won't be able to tell whether or not there's a real person who's agreeing with you or disagreeing with you.

I think that's—

1:05 p.m.

Bloc

René Villemure Bloc Trois-Rivières, QC

Thank you. That is definitely scary.

My next question is for Mr. Khanna.

What does the future look like as far as misinformation and disinformation go? As it is, weak signals are being amplified. Information is spreading more and more quickly, and the idea of the truth tends to get lost. People are practically willing to replace truth with likelihood, the almost truth. At the very least, people are more likely to believe what they're told than to try to figure out what's true.

What's the outlook, then, when it comes to weak and strong signals?

1:05 p.m.

Strategic Advisor and Foresight Expert, As an Individual

Sanjay Khanna

If we don't really think about this deliberatively, cohesively and with strategies both for Parliament and for citizens, we're going to move more deeply and more quickly into the world that Mr. Finkelstein has mentioned.

I've done work looking at the emergence of these technologies now for 20 years. We are seeing things get to the point where they are starting to cause confusion in the public. They're starting to cause confusion among youth.

We have to think about children, I think, fundamentally. Are we doing the things today that will serve children in being able to understand and trust the environments they're living in and the context they're living in?

That's the futures orientation that I think we need to have. However, we do require a structured approach to think about what these emerging scenarios are and what we can do today to protect and mitigate against those risks so that we are resilient enough to not be manipulated at individual scales as citizens or more broadly in various groups that we participate in, and to make decisions and use the resources we have to take action to both protect ourselves and find opportunities in a changing world.

1:05 p.m.

Bloc

René Villemure Bloc Trois-Rivières, QC

Thank you.

Mr. Finkelstein, I'm coming back to you with a quick question. Is TikTok a tool for disinformation?

1:05 p.m.

Founder and Chief Science Officer, Network Contagion Research Institute

Joel Finkelstein

I think TikTok has profoundly different uses from other platforms. It is primarily a platform where children and young adults can put up zany videos for 15 seconds of themselves doing obscure and nutty things, but in addition to that, it obviously serves an incredibly unusual purpose in distorting reality for 1.5 billion people at a massive scale.

This appears to be deliberate, and that makes it unparalleled, because, where you can have mob mentality and prevailing political interests that are somewhat recognizable on the platforms that most people inhabit, here you have something that's quite distinct. It's a different animal. The things it produces are of a higher scale, and we're seeing far different outcomes psychologically for its users. Those concerns were crucial in our speaking to Congress and also to the Senate as they passed legislation in the United States that arrived at this conclusion: that there was something anomalous about the behaviour of the platform that merited significant concern.

Now, the process for managing that is imperfect, because we still don't know how to gather and make deliberative decisions about these large-scale platforms and how they're influencing us, but the actions of parliamentarians have to be informed by sober knowledge about the threats that are growing on these platforms. That is especially the case when those threats are coming from near enemies. My sense is that—

1:05 p.m.

Conservative

The Chair Conservative John Brassard

Mr. Finkelstein, I'm sorry.

Mr. Villemure's time is up.

I didn't want that to go unanswered, because I thought it was an important part of the discussion today.

Mr. Green, you have six minutes and a bit.

Go ahead, sir.

1:05 p.m.

NDP

Matthew Green NDP Hamilton Centre, ON

Thank you.

I want to note and go on the record to say that my round and that of the previous speaker were limited given that we weren't able to meet the time requirements of Mr. Nimmo. I'd request that my questions be put to him in writing for response.

What I would like to do, sir, for the good and welfare of the committee, is to split the time between both witnesses present, beginning with Mr. Khanna, on high-level recommendations that he could provide within three minutes to this committee for the good and welfare of our report.

May 2nd, 2024 / 1:05 p.m.

Strategic Advisor and Foresight Expert, As an Individual

Sanjay Khanna

I would agree with Mr. Finkelstein on the question of having an investigatory capacity. I think that's important.

More broadly, I think it's really about looking at the offices of Parliament and thinking about how to protect your ability as parliamentarians to ensure that what you're working with as your ground truth is based on fact: how to do that, how to train your staff and how to build their capacity to be resilient, so that everyone who then interacts with you, whether it's your constituents or others, knows that you're at least a trusted source. I'm not talking about your policy positions. I'm talking about the ground truth that you're using to make decisions. I think parliamentary staff are going to be more targeted by these technologies, such as deepfake videos, manipulated voices and those sorts of things.

The other piece, then, is how you are going to protect the body of Canadian society that is your constituents and how you are going to protect the next generations. This is why I was suggesting a Canadian charter of digital rights and freedoms that outlines both the responsibilities and protections of Canadian citizens. I know that there's been a lot on the online harms act, but I don't think it clarifies to citizens what their responsibilities are and what protections may be available to them.

I'll stop there because I know we have limited time, but thank you very much for that question.

1:10 p.m.

NDP

Matthew Green NDP Hamilton Centre, ON

If, with reflection and more time, you do have more, I encourage you to submit it. We can only draft our study report based on recommendations from the testimony of expert witnesses such as you.

It looks like I'm at two minutes and roughly 30 seconds, Mr. Finkelstein. You have three minutes to provide a synopsis, if you could, on high-level recommendations that you would put to this committee for our consideration at our report writing stage.

1:10 p.m.

Founder and Chief Science Officer, Network Contagion Research Institute

Joel Finkelstein

Thank you so much for that.

We are not a policy organization and we have refrained in our work...we have focused specifically on having a neutral attitude toward policy so that we can have a more sober capacity for risk assessment. We usually leave the policies in the hands of the experts—that's you all.

I have said this before, but I think it bears repeating. I spoke to a four-star marine general here in the United States, who commanded NATO's forces. His name is General John Allen. I asked him, “General, have you ever won a battle without a map of the battlefield?” He said it has never happened. That's what's happening with the current attempts to control social media. We have no idea what to control. We can't determine signal from noise.

Parliamentarians like you are being deliberately misled by the platforms you're supposed to be managing. As the threats emerge on the platforms, the incentives for the platforms to manage those threats are limited. They're limited because managing threats isn't their business model. They're limited by their shareholders, so I don't blame them. It's not totally their fault. Okay, I blame them a bit, but I will say that, really, the conversations we need to have need to be informed by data, and Parliament has the right to demand that data. It has the right to be able to see how the things.... Its job is to manage. It needs to be able to see those things.

I think the most important part of how we manage the threats of the future relies on a capacity for rapid research. That is the function I would most recommend that Parliament adopt—

1:10 p.m.

NDP

Matthew Green NDP Hamilton Centre, ON

I feel like I need to be putting the question to you in the form of an AI prompt to have you imagine yourself as a legislator. What might be some of the things that you'd do as a legislator?

You talked about the commission. There was earlier commentary around the information we use as a base level of truth. Could you hypothetically comment on what accountability might look like for those that surreptitiously use these models, algorithms and platforms for nefarious use?

1:10 p.m.

Founder and Chief Science Officer, Network Contagion Research Institute

Joel Finkelstein

I would say there are two cases that I brought up to the Parliament for this reason.

The first is that you have bad actors who exist in closet spaces and basements and are capable of opportunistically upending...and causing mayhem and murder on massive scales. We need complete visibility of where those actors are. We need an alert system that can bring that information to lawmakers before they even know to ask for it, and we need to create a scouting capacity to understand where these emerging threats are happening so that they can be managed before they spill out into the real world.

That means we need complete platform access. We need to have the same access that platforms have, without running the risk of privacy....

1:15 p.m.

NDP

Matthew Green NDP Hamilton Centre, ON

Just for the record, with the last 20 seconds, I think you suggested it should be at arm's length from the government.

1:15 p.m.

Founder and Chief Science Officer, Network Contagion Research Institute

Joel Finkelstein

Yes. It's crucial.

1:15 p.m.

NDP

Matthew Green NDP Hamilton Centre, ON

We're not talking about a Big Brother model or, potentially, authoritarian capture, but an independent commission that would have.... Is that right, just for clarity?

1:15 p.m.

Founder and Chief Science Officer, Network Contagion Research Institute

Joel Finkelstein

One hundred per cent. If I had to tell a democratic public that their government was responsible for managing all of their information, count me out. I can imagine citizens not reacting well to that, and for good reason.

Having independence is really crucial. The reason that's true is that it's just like a referee. You need someone to blame, and that person has to be willing to wear the stripes and be willing to be blamed.