Evidence of meeting #116 of the Standing Committee on Access to Information, Privacy and Ethics, 42nd Parliament, 1st Session, September 25, 2018. (The original version is on Parliament’s site, as are the minutes.)

A recording is available from Parliament.

Also speaking

Taylor Owen  Assistant Professor, Digital Media and Global Affairs, University of British Columbia, As an Individual
Fenwick McKelvey  Associate Professor, Communication Studies, Concordia University, As an Individual
Ben Scott  Director, Policy and Advocacy, Omidyar Network

12:10 p.m.

Prof. Fenwick McKelvey

Well, part of the concern is that there was access provided without clear oversight on how they were going to use that data. This is one of the things that's creating a challenge for academic research too. Facebook and many other social media platforms have tightened up their APIs and their data access. That was often done without much transparency—and that's what Facebook has admitted—about how that data was going to be used, so I think it's twofold: it's basically knowing who has access to it and also making sure they're subject to accountability about what they're doing with the data.

12:10 p.m.

Liberal

Michel Picard Liberal Montarville, QC

Mr. Scott, would you comment?

12:10 p.m.

Director, Policy and Advocacy, Omidyar Network

Dr. Ben Scott

I want to point to two interesting provisions in Europe's General Data Protection Regulation. We are not sure yet how they are going to be adjudicated and applied in the market.

One of those provisions says the user should have more control over the consent they give to different kinds of information. Right now, when I sign the Facebook privacy agreement, it's all or nothing. I either agree to whatever is in that 80-page document or I don't use the service. The GDPR says you can't do that anymore. You have to give people meaningful choices when it comes to controlling their own data, especially sensitive data such as that which shapes political views.

I think there's a key question about giving consumers more ability to control what data is collected and how it's used. The German antitrust regulator, interestingly, has launched an inquiry into Facebook. It says that the market power a company like Facebook has over a segment of social networking is so strong that effectively their privacy agreement is a coercion—that it's all or nothing. There's no way for the consumer either to know or to have an incentive to know what's in there, because to say “no” is to abandon the service altogether and not get access to something that two billion of the people on the planet are using.

To me, this points to the fundamental problem. Exactly as Professor McKelvey says, you need to know what they're collecting, and not only do you need to know how they're using it, but you need to have a say in how they're using it. That is what I mean by consumer control over the application of my data. That's the key piece I think we're wrestling with in privacy policy, but it has implications in competition policy as well, because market power plays a big role.

12:10 p.m.

Liberal

Michel Picard Liberal Montarville, QC

Thank you.

Mr. Owen, you said in your opening speech that this impacts democracy and that our electoral system is at risk. It sounds good in political speech, but in reality, what is the problem with it? People say whatever they want about any candidate. Is the problem that our system has been hacked, or can people make up their own minds in cross-checking information they get?

12:10 p.m.

Prof. Taylor Owen

Well, I don't think it's been hacked. I think it's just that the marketplace for our information is structured very differently than it used to be. In that old model, we had all sorts of ways and mechanisms for limiting and regulating speech during elections, for restricting foreign money going into the media market, and for forcing broadcasters to disclose who's paying for what ad during an election.... These are the ways we regulated speech in order to protect our public sphere during the time of an election, noting that this was a particular moment in our society when quality information was important.

Those regulations and laws aren't very applicable in this new ecosystem. The question is, do we think they need to apply? Do these same principles need to apply in this new ecosystem? I would argue that they do, but that the regulations need to look different because the structure of that ecosystem is different.

12:10 p.m.

Conservative

The Chair Conservative Bob Zimmer

Thank you, Mr. Picard.

Next up for five minutes is Mr. Kent.

12:10 p.m.

Conservative

Peter Kent Conservative Thornhill, ON

Thank you very much, Chair.

Before Christopher Wylie became a whistleblower, in pitching the ability to affect election or referendum outcomes, he said, essentially, that “we can trigger the underlying dispositional motivators that drive each psychographic audience”.

Dr. McKelvey, I know that you have said you're a bit skeptical of the psychographic microtargeting concept, but we understand from Chris Vickery and others that, rather than the half-dozen or dozen data points that many advertisers use to target responses when they observe the browser history of an individual, Cambridge Analytica, in this case—and ultimately AggregateIQ in Victoria—was working with as many as 500 data points on individuals to exploit their vulnerabilities, such as their sexual preferences, perhaps, or their fears or anxieties.

Do you completely disregard this concept of psychographic microtargeting? Or do you believe there is a line that should be drawn on how much data can be used in targeting advertising?

12:15 p.m.

Prof. Fenwick McKelvey

Part of my research is historical. In the 1980s, the Claritas Corporation was already using geodemographics and psycho-demographics. In one sense, I think one of two things can be true: either psycho-demographics is something relatively new, even though you first encounter it in the literature in the 1980s, or it's a myth the advertising industry has been using to sell its products for 30 years. I'm in the latter camp.

I think it's a good way of selling their categories. I think that's where I actually have.... My opinion is that I'm not convinced it works. I'm not convinced that you need to collect all this information. I'm not convinced that psycho-demographics is really that effective. In particular, I also think that when you're looking at campaigns with limited resources, they're not writing ad copy for 500 different categories.

Now, there's a certain threat that AI might change that, but for right now, if you tend to think this doesn't work and it probably isn't great, why are we enabling all this data to be collected? If you look at the literature, it says that three or four variables are really good predictors of actual voter intent. Beyond that, I think the question is why we are enabling all this other data collection if there's limited benefit to it.

I'm not against the idea that it might work; I'm skeptical of its overstated claims.

12:15 p.m.

Conservative

Peter Kent Conservative Thornhill, ON

Go ahead, Mr. Owen.

12:15 p.m.

Prof. Taylor Owen

I don't think we should be making regulatory change based on whether the claims of one particular company to do one particular thing using one particular database at one particular moment were effective or not.

I think principles such as the consent and knowability that Ben just mentioned are protections against the possibility of that kind of misuse. If we consent regularly to the use, sharing and amalgamation of our personal data—if we have the right to that consent—and if we have the right to know how that data is being used, whether for psychographic profiling, for an AI-driven microtargeting campaign or for anything else, that protects and inoculates us against the potential risk of these technologies in the future, not just against how they were used at one moment in time by one group.

12:15 p.m.

Director, Policy and Advocacy, Omidyar Network

Dr. Ben Scott

To me, the takeaway from the Cambridge Analytica episode is not that Cambridge Analytica had some special sauce of psychographic manipulation; it's that they were basically using the same tools of microtargeting that Facebook makes available to everybody. They overstated that dramatically in their marketing materials, but I think microtargeting to find audiences that are responsive to particular messages is effective. Facebook makes $40 billion a year in revenue for a reason. I don't think you have to imagine a splashy new way of doing that called Cambridge Analytica to make that meaningful. I think data-driven targeting is the name of the game in advertising today, and we ought to be regulating at the root rather than at the fancy branches.

12:15 p.m.

Conservative

Peter Kent Conservative Thornhill, ON

In the absence of regulation, and in the North American or Canadian context, there's recognition that the individual owns their own personal data. You've all spoken to the need for educating users.

When I speak to high school classes or seniors' groups, they take the cautions about participating in polls, playing games online, or guarding their browsing history almost with a grain of salt. Would any of you recommend that the social media companies set aside large amounts of money, not to provide the education service themselves, but for third parties or independent groups to better educate social media users from the early school grades right through life?

12:15 p.m.

Prof. Taylor Owen

I think digital literacy campaigns are incredibly important, but only if done at scale. Who funds that scale? The platform companies could be incentivized to put money into it, and there's a real government role there, too, for a large-scale digital literacy campaign. It's not just about separating blatantly true from blatantly false information; that dichotomy is very rarely presented to a user. Rather, users need to understand the system in which they are participating: why they're receiving what they're receiving, what data is being used about them and how that shapes the content they're getting. If we embed those kinds of conversations in our digital literacy campaigns at scale, then we can make some progress.

12:20 p.m.

Conservative

The Chair Conservative Bob Zimmer

Thank you, Mr. Kent.

Next up for five minutes is Madam Fortier.

12:20 p.m.

Liberal

Mona Fortier Liberal Ottawa—Vanier, ON

Frank is taking my turn.

12:20 p.m.

Conservative

The Chair Conservative Bob Zimmer

Mr. Baylis, go ahead.

12:20 p.m.

Liberal

Frank Baylis Liberal Pierrefonds—Dollard, QC

Start the clock now.

I want to come back to the concept of news, information and data. It's a simple question we've always asked ourselves: who's selling you your news? We've always bought news. If we take the Internet and throw it away, we've got, say, Fox and CNN on TV. To your point, Mr. Scott, when you talk about your list of magazines, I turn on the TV, and if I know I want to hear a certain story about a certain president, I'll watch CNN. If I want to hear the same story told a totally different way, I'll watch Fox. That has nothing to do with the Internet; I'm making a choice as a consumer to buy my news. Now you're saying I can buy it on the Internet with my eyeballs.

A lot of people give it to me for free if I just watch their ads or spend time with them. Other times they'll say that if I want to get, say, The New York Times or The Wall Street Journal, I've got to pay for a subscription.

Using that as a background, another concept we worry about is filtering. Before, we had filters. They were the editor, the publisher, and ultimately the owner of a newspaper. All kinds of people like me—politicians—would have to go and, quite frankly, suck up to these guys so they'd write something nice about us. That's the reality of it. They've actually been weakened.

Great, positive things have come through with the Internet. Twitter has allowed us to speak directly to our people, unfiltered. As you said, Mr. Scott, there are also nefarious things that can come out of this.

You've spoken about transparency. Is transparency the issue? We are always going to buy our news. We are always going to go to a source that can tell us what we want to hear. In that sense of looking at news, written news, looking at TV, and now looking at the Internet, what is the one thing we should be doing there?

Go ahead, Mr. Scott. I'll start with you.

12:20 p.m.

Director, Policy and Advocacy, Omidyar Network

Dr. Ben Scott

I think transparency is only one piece of the puzzle.

I'm a big believer in the decentralization of the communications system. It's a good thing that we have more voices, more journalists, more reporting. The fact that it is no longer a viable business is a big problem, and we need to address that as a systematic issue in the market.

There is a second piece to this. Consumers are at the beginning of a long process of learning how to consume information on the Internet, in the same way that it took us decades to figure out how to consume information on broadcast channels. In the early days of radio, you could see a similar debate playing out. People said, “Wow, everybody is being misled by this new thing called broadcasting. It's completely different from newspapers. You hear it over the radio and it seems true, and people just take it.” That was considered incredibly alarming.

Now, as you have clearly pointed out, we all know how to differentiate what we want on broadcast. That will come eventually on digital media. The trick here is that it's push versus pull. Instead of my turning on the TV and selecting CNN or Fox, Facebook content is being pushed at me.

There are 10,000 different news items that are sitting in my Facebook account that Facebook could choose to show me, but I'm only going to see about 5% of them. Facebook decides which 5% I'm going to see. It decides that based on what it thinks I want, not what I choose.

That may be a business that I'm willing to sign up for, but I need to understand much more about why that happens, and why I'm getting what Facebook has decided I should get. Right now, we don't have that. That's why people are so vulnerable to misinformation.

12:20 p.m.

Liberal

Frank Baylis Liberal Pierrefonds—Dollard, QC

What's the one thing you would do to give us that?

12:20 p.m.

Director, Policy and Advocacy, Omidyar Network

Dr. Ben Scott

I think it's transparency in the algorithm, an increase in quality journalism, and digital literacy. Without all three of them, you're not going to move the needle substantially.

12:20 p.m.

Prof. Taylor Owen

Can I make a point on the quality journalism aspect of this question?

12:20 p.m.

Liberal

Frank Baylis Liberal Pierrefonds—Dollard, QC

Go ahead, Mr. Owen.

12:20 p.m.

Prof. Taylor Owen

The reason we've seen the precipitous decline of the financial viability of the journalism sector as it was previously constructed is that the advertising revenue that it once depended on is gone. That is the reality. If that had led to a fertile digital ecosystem of vibrant digital start-ups, doing better journalism than their legacy institutions were, we wouldn't have a problem. That is not what has happened, at least in Canada, yet.

If that's what we want to create and enable, then we need to look at policies that can support that emerging journalism production. Maybe we're okay with the amount of journalism being produced in our democracy now, but I argue we shouldn't be. For example, there are around 100 newspapers left in Canada, and their total revenue is now lower than the revenue of the CBC. I personally don't think that's a healthy ecosystem. There are a host of journalism-related policies we could talk about to help enable this new ecosystem.

12:25 p.m.

Conservative

The Chair Conservative Bob Zimmer

Go ahead, Mr. Masse, for three minutes.

We do have some time afterwards if there are further questions to be asked of the group. We have them until 1 p.m. Let me know if you have a question.

12:25 p.m.

NDP

Brian Masse NDP Windsor West, ON

What would be the quick fix, if there is one, going into the next election that we have coming up? Time is running out.

What should be the consequences for those who break whatever rules we have? Should they be highly punitive, or should it be a carrot-and-stick approach?