Evidence of meeting #120 for Access to Information, Privacy and Ethics in the 42nd Parliament, 1st Session. (The original version is on Parliament's site, as are the minutes.)

A recording is available from Parliament.

Also speaking

Claire Wardle  Harvard University, As an Individual
Ryan Black  Partner, Co-Chair of Information Technology Group, McMillan LLP, As an Individual
Pablo Jorge Tseng  Associate, McMillan LLP, As an Individual
Tristan Harris  Co-Founder and Executive Director, Center for Humane Technology
Vivian Krause  Researcher and Writer, As an Individual

Noon

NDP

Charlie Angus NDP Timmins—James Bay, ON

Thank you.

Mr. Black and Mr. Tseng, on the issue of deepfakes and the legal powers: under the Copyright Act in Canada, we have notice and notice, as opposed to notice and takedown. There has been pushback on imposing notice and takedown, because critics say that you could be unfairly interfering with someone's rights or unfairly targeting a competitor.

On the question of deepfakes, are there specific legal issues that we have to look at in terms of their effect on, say, upending an election?

What are the legal parameters? If someone has been the subject of a deepfake, they could go the libel route. There are a number of traditional mechanisms in place that may be sufficient. But if it happens in the middle of an election, it could upend the democratic system.

Are there specific remedies that would be better able to address the threat of deepfakes upending elections?

Noon

Liberal

The Vice-Chair Liberal Nathaniel Erskine-Smith

Please answer very briefly.

Noon

Partner, Co-Chair of Information Technology Group, McMillan LLP, As an Individual

Ryan Black

I'm not sure that deepfake technology would be an appropriate target for any specific action, only because it is one tool in a very large belt of tools available to people who are trying to manipulate others through social media. As both of the other speakers have described, our brains are wired to solve heuristically the problems we can't possibly solve logically, because there is so much information being thrown at us at all times.

Truthfully, I worry more about the intent behind misinformation and disinformation than about the specifics of deepfake video. This is only because—again, I go back to my security camera footage—you don't need a very sophisticated or fake video to convince people that something has happened. You don't need a very convincing photo. You can use a real image just as easily as a fake one.

Noon

Liberal

The Vice-Chair Liberal Nathaniel Erskine-Smith

Thanks very much.

Our last seven minutes go to Mr. Picard.

Noon

Liberal

Michel Picard Liberal Montarville, QC

Thank you.

My first question will be for Mr. Harris.

You said, and we agree, that there is a gigantic volume of information thrown at people, to the point that it's almost impossible for us to see clearly through the information we get.

Are you saying that this enormous volume of information limits our capacity to see what's real and what's not? Does it prevent us from cross-checking information, to the point that, as Dr. Wardle said, we suffer damage...like losing an election? It will affect our behaviour, and there will be nothing we can do to prevent ourselves from being influenced.

12:05 p.m.

Co-Founder and Executive Director, Center for Humane Technology

Tristan Harris

Yes.

Obviously, people have some amount of free choice to double-confirm everything they're reading, and so on. As a sort of behavioural scientist, I try to look at the reality of human behaviour: what do most people do most of the time? The challenge is that when we are so overloaded, our attention is so finite, and we're constantly anxious and checking things all the time, there really isn't time to realistically double-check everything.

There are two kinds of persuasion. There's the kind where, if I, the magician, tell you how the trick works, it suddenly stops working, because you know it's a technique. There are forms of advertising where that's happened. The second kind of persuasion still works on you even after I tell you what I'm doing. A good example, from Dan Ariely, the famous behavioural economist, is flattery: if you tell someone, “I'm about to flatter you and I'm making it up,” it still feels really good when you hear it.

A second example of this is if you put on a virtual reality helmet. I know that I'm here in San Francisco in this office, but in the virtual reality helmet, it looks like I'm on the edge of a cliff. If you push me, even though my mind knows that I'm here in San Francisco, millions of years of evolution make me feel like I should not fall over.

What we have to recognize is that socio-psychological instincts, such as those that arise when children are shown an infinite set of photos of their friends having fun without them, still have a psychological impact even when we tell ourselves, “I know that is a highlight reel; I know that is a distortion.” The same is true of the kinds of toxic information or malinformation that Claire is talking about.

12:05 p.m.

Liberal

Michel Picard Liberal Montarville, QC

If I still have a small capacity to tell the difference between what is false and what is true, the big difference today is.... If I go back decades, to the 1940s and 1950s, priests in Quebec told their people that hell is red and the sky is blue, referring to the colours of the political parties running in the next election. At that time, the only ways to make people aware of what was going on were by mail, so you had to buy stamps, or on TV or radio, so you had to buy advertising. Nowadays, when you advertise, you use media: you can send messages to millions and millions of people with one click and at no cost. It's the same game, but the volume is totally different; the tools are the same as in the 1950s, but with more technology.

As a government, we have to regulate something, somehow, somewhere. What do we regulate? Do we regulate the right to say stupid things in the media, or do we have to regulate people, because apparently they're not able to see the light through all the blackness of the dark side of the web?

12:05 p.m.

Co-Founder and Executive Director, Center for Humane Technology

Tristan Harris

You have described it. We've decentralized vulnerabilities, so that now, instead of having to pay to publish something, I basically ride the waves of decentralized chaos and use people's socio-psychological vulnerabilities to spread things that way.

In terms of regulation, one thing we need to think about is at what point a publisher is responsible for the information it is transmitting. If I'm The New York Times and I publish something, I'm responsible for it because I have a licence and I've trained as a journalist and could lose the credibility of being a trusted organization.

One thing the technology companies do is make recommendations. We've given them a safe harbour provision: they're not responsible for the content that people upload, because they can't know what people are uploading. That makes sense, but increasingly, what people watch is driven by recommendations; on YouTube, for example, 70% of what people watch is driven by the recommendations on the right-hand side. Increasingly, the best way to get your attention is to calculate what should go there.

If you're making recommendations that start to veer into the billions (Alex Jones's Infowars conspiracy theory videos, for example, were recommended 15 billion times), at what point is YouTube, not Alex Jones, responsible for basically publishing that recommendation? I think we need to start differentiating when you are responsible for recommending things.

12:10 p.m.

Liberal

Michel Picard Liberal Montarville, QC

With the amount of information available to me, which I can't control, do I have to rely only on artificial intelligence to help me see clearly through all of this?

12:10 p.m.

Co-Founder and Executive Director, Center for Humane Technology

Tristan Harris

The reality is that most people don't even know anything about what we're talking about. They think YouTube is just showing them stuff. They don't realize that when their mind lands on that YouTube video, they have just entered a chess match with a supercomputer pointed at their brain, in which their brain is the chessboard, and it knows far more moves ahead on that chessboard than they do. I think most people are not even aware of this, and that's what we have to change.

12:10 p.m.

Liberal

Michel Picard Liberal Montarville, QC

As a final note, I have a comment, Mr. Chair.

I'll just mention to my honourable and very respected colleague, MP Kent, that for something to be money laundering, it must be known that the money originates from a criminal source or criminal activity. Before accusing anyone of money laundering, we have to be careful.

Thank you.

12:10 p.m.

Liberal

The Vice-Chair Liberal Nathaniel Erskine-Smith

Thank you very much, Mr. Picard.

We'll move to our five-minute round. The first five minutes go to Mr. Gourde.

October 16th, 2018 / 12:10 p.m.

Conservative

Jacques Gourde Conservative Lévis—Lotbinière, QC

Thank you, Mr. Chair.

My question is for all the witnesses.

From one meeting to the next since the start of this study, it has been chilling to hear everything that can be done digitally to influence people in an election. In my opinion, it is clear that we will have to legislate on this sooner or later.

Do you think it would be possible to do that effectively, in the short or medium term? To my mind, that means we would have to be ready for the election in 2019. Otherwise, would we have to ban all use of advertising and social media in the next election in order to at least be fair and equitable to all the political parties and independent candidates who are running?

12:10 p.m.

Researcher and Writer, As an Individual

Vivian Krause

Is that question for me? If I understood correctly, you want to know if all social media have to be eliminated.

12:10 p.m.

Conservative

Jacques Gourde Conservative Lévis—Lotbinière, QC

If we try to bring in effective legislation, in the short or medium term, will we have to consider banning social media in the upcoming election, to be fair and equitable to everyone running?

12:10 p.m.

Researcher and Writer, As an Individual

Vivian Krause

That does not seem feasible to me. Furthermore, since there are so many ways of using social media effectively, I do not think banning their use makes any sense. The issue is not eliminating them, but rather looking at how they are used. I think regulations are needed. I can only imagine how much people would object to that idea.

It would be like banning free speech.

I don't think we can do that.

12:10 p.m.

Conservative

Jacques Gourde Conservative Lévis—Lotbinière, QC

That is very interesting, but we need equitable legislation that provides an avenue for action. An election campaign lasts between 35 and 40 days. When those information networks are used to disseminate fake news or fake videos, they can influence Canadians tremendously. We would never have the time, during an election campaign, to tell people that fake news had been disseminated and that people had been affected. It will come out, but not until after the election. If we are unable to monitor the information and take action when it is fake, why do we have to accept that?

12:10 p.m.

Researcher and Writer, As an Individual

Vivian Krause

It is the funding that has to be controlled, not what people say. Freedom of speech is very important, especially during an election. What we need to eliminate is outside funding so the outcome of the election is decided by Canadians alone.

12:10 p.m.

Conservative

Jacques Gourde Conservative Lévis—Lotbinière, QC

I would like to hear from the other witnesses, please.

12:10 p.m.

Partner, Co-Chair of Information Technology Group, McMillan LLP, As an Individual

Ryan Black

If I may, in my view, the quicker route to pulling back the curtain on this and taking meaningful government action is public education, far more than legislation. I worry that any legislative tool would be a very unpopular and broad hammer that would restrict legitimate uses of social media.

However, we have seen the effectiveness of education campaigns in other domains: campaigns that teach the public, for example, not to share their passwords, not to be phished online, and to protect their personal information and social insurance numbers. These are all things that can be done to educate people and, as the other witnesses have said, to pull back the curtain on what these technology companies are doing.

I do not believe there is legislation that could protect us from manipulation through social media, because if you were to ban political ads.... We used the example of the Russian video in which a person was pouring bleach. That wasn't a political ad at all. It was just a viral video of someone doing something on the Internet, and it provoked a reaction against feminists and the left wing, which in turn provoked a reaction against the right wing.

To me, we should educate people to take that second step: to try to verify, to step back from the lizard brain deep within us telling us something is true, and to say, “Let me apply some rational thinking to this.” I feel that would be a more effective approach.

12:15 p.m.

Liberal

The Vice-Chair Liberal Nathaniel Erskine-Smith

Thanks very much for that.

Our next five minutes go to Ms. Vandenbeld.

12:15 p.m.

Liberal

Anita Vandenbeld Liberal Ottawa West—Nepean, ON

Thank you for being here and for your testimony.

I'd like to go back to the fact that we are legislators. What we're very much interested in, in this committee, is what government can do, particularly in terms of legislation, but also in other areas.

In your testimony, I'm hearing about things like a video made in Russia, which is outside of our jurisdiction; the social construct of reality; psychological persuasion; and the fact that we are not rational actors.

How do we legislate? I would like all of you to respond to this. What are legislative actions that we might be able to take that could help mitigate this?

12:15 p.m.

Harvard University, As an Individual

Dr. Claire Wardle

One thing I would say, as somebody mentioned, is that this isn't new. In an election campaign, somebody can, the night before the election, send leaflets with a false rumour about a candidate to a whole constituency. The idea that we would legislate around content is just not possible, because a lot of this material sits in a grey, murky, misleading space.

I do think there is something specific about content that disrupts the election system itself. For example, we were monitoring the election in Brazil two weeks ago. On election day, a great many rumours were circulating that the machines weren't working and that you could stay at home and vote via SMS. If we're talking about content, that's the kind of space where there is room to say that if the harm is specifically to the election, then something could be done.

I think we need more transparency around behaviours, not content. The platforms are moving in this direction, but more pressure needs to be placed on them in terms of which observable behaviours we would all agree are a problem, whether it's automation, IP addresses outside the Canadian border, or people using fake accounts.

I think behaviour is worth looking at, but the content part of this is much more challenging. We need more pressure on the platforms to be more transparent about those behaviours, because we don't know what decisions they're making. It's completely opaque at the moment.

12:15 p.m.

Liberal

Anita Vandenbeld Liberal Ottawa West—Nepean, ON

Okay, go ahead.

12:15 p.m.

Associate, McMillan LLP, As an Individual

Pablo Jorge Tseng

Speaking to Ryan's point about education, we still feel that the baseline for any good legislation is good education disseminated to the public. That education can then be supplemented by carefully crafted legislation, which should not be drafted in haste. We've seen examples in the past of what happens when legislation is drafted on a whim; it's a nightmare for everyone. Legislation should be treated as sacred, and analyzed and carefully thought out before it actually comes into force.

An example of legislation that could be expanded is what Parliament did with the Canada Elections Act in section 480.1, which is what we were talking about earlier regarding impersonation. To give you brief background, that section says, “Every person is guilty of an offence who, with intent to mislead, falsely represents themselves” or causes someone else to be falsely represented. It then lists a number of people: the Chief Electoral Officer, an election officer, people authorized to act on behalf of the office, people authorized to act on behalf of a registered party, and a candidate.

That's a good scope with regard to impersonation, but it's an example of a section that could perhaps be expanded to explicitly include other forms, maybe false information that's being disseminated. This is not to say the section was crafted in haste—it did target what it was intended to do—but there is room to amend it to increase its scope.