Evidence of meeting #134 for Access to Information, Privacy and Ethics in the 42nd Parliament, 1st Session. (The original version is on Parliament's site, as are the minutes.)

A recording is available from Parliament.

Also speaking

David Carroll, Associate Professor, Parsons School of Design, The New School, As an Individual
Chris Vickery, Director of Cyber Risk Research, UpGuard, As an Individual
Jason Kint, Chief Executive Officer, Digital Content Next

5:20 p.m.

Director of Cyber Risk Research, UpGuard, As an Individual

Chris Vickery

I recommend a model going forward that defines the terms in very strict ways. If something can be accessed without your permission, you are not the gatekeeper to it; you don't own it and you don't control it. If the government has a criminal record of person X, person X does not own that criminal record. It is a record kept about person X. If Walmart has a shopping history of that person, on the other hand, you could make laws saying that the person has to consent to Walmart's collecting it. It's a very different beast, and you've got to let people know that the government holds data on you that you do not own and control. Owning and controlling it is just not possible.

5:20 p.m.

Liberal

Anita Vandenbeld Liberal Ottawa West—Nepean, ON

Thank you.

5:20 p.m.

Conservative

The Chair Conservative Bob Zimmer

Last up is Mr. Erskine-Smith.

February 5th, 2019 / 5:20 p.m.

Liberal

Nathaniel Erskine-Smith Liberal Beaches—East York, ON

Mr. Carroll, you have talked a lot about the right to know, or the right of access. As a politician, I will use examples here that may be far afield. In the election, maybe Mr. Zimmer wants to know who owns a gun and maybe I want to know who has a pet, and we're going to be knocking on doors and trying to target people. I think the right of access has a necessary moderating influence, in the sense that I'm going to be less likely to collect a mountain of information if I know that you, as a voter, are going to be able to see what I've collected about you. I'm probably not going to collect the fact that you're divorced, even if I have a great Divorce Act amendment that I think is going to be really good for you, if you're able to access that information. I think that's a really important right as far as it goes. As politicians, I think we should all subject ourselves to that right of citizens.

The right to correct makes a lot of sense to me as well. It's better for me. It's better for you as a voter.

The right to delete.... I don't know if you have turned your minds to other rights. As someone running for office, I can access certain information from Elections Canada that isn't given to people who aren't running. I don't think that information should be deleted.

With respect to other information, how far should that right to delete go? If I know you are really concerned about the global compact for migration and I don't really want to get you out to vote because I think that's a crazy position to take, should I delete that if you ask me to delete it?

5:20 p.m.

Associate Professor, Parsons School of Design, The New School, As an Individual

David Carroll

I don't mean to dodge the question, but I will answer it in a way that relates to what the information commissioner in the U.K. has ruled in relation to her investigation. Inferred data is considered personal data when it is attached to an identity. It's the idea that when you create predictions about people and attach them to their voter file, those predictions constitute personal data.

I think it gets to your question about campaigns that are trying to predict the behaviour of potential voters based on predictions rather than verifiable, deterministic facts. That could be one boundary that needs to be further negotiated. For example, if I have a gun licence, you have a verifiable fact that I support gun rights. If, instead, you're using my social media chatter to infer my feelings about gun rights, that's a different threshold that needs to be defined.

5:20 p.m.

Liberal

Nathaniel Erskine-Smith Liberal Beaches—East York, ON

Mr. Kint, you said you had two question marks next to recommendations in our report. What was the other one?

5:20 p.m.

Chief Executive Officer, Digital Content Next

Jason Kint

I knew that would happen. It was around content moderation. It was this idea that within a certain set period—I think you suggested 24 hours, maybe—it should be possible to eliminate content. I know that's being rolled out in a couple of different countries in Europe. I think it's probably wise to watch and study it.

5:20 p.m.

Liberal

Nathaniel Erskine-Smith Liberal Beaches—East York, ON

In the first recommendation, you have a question mark beside the wording on taking into account engagement for democratic purposes, which, hopefully, answers the concerns you had.

5:20 p.m.

Liberal

Nathaniel Erskine-Smith Liberal Beaches—East York, ON

Since you like reading Canadian parliamentary studies—

5:20 p.m.

Voices

Oh, oh!

5:20 p.m.

Liberal

Nathaniel Erskine-Smith Liberal Beaches—East York, ON

—I would point out that before we got to the Cambridge Analytica scandal, we published a report on our privacy law, PIPEDA. One of the recommendations we made addressed this issue of consent for secondary purposes. Our recommendation was to require explicit consent for secondary uses, which I think would address a lot of the concerns you have raised.

5:25 p.m.

Chief Executive Officer, Digital Content Next

Jason Kint

Absolutely.

5:25 p.m.

Liberal

Nathaniel Erskine-Smith Liberal Beaches—East York, ON

In terms of future lines of study for this committee, a few witnesses have raised the antitrust issues, along with this idea of ethical AI. I'll finish with something that Mr. Baylis was talking about, which I think touches on both in a way. When I post a music video or whatever on Facebook and Facebook is able to monetize that, it's that monetization—that pushing of it into news feeds, where they have now acted as editor—where it seems to me that safe harbour rules maybe ought not to apply in the same way.

I don't know if you have a view on that.

5:25 p.m.

Chief Executive Officer, Digital Content Next

Jason Kint

I do. I think that's an interesting point in the recommendations. I think it was in the evidence from Tristan Harris.

The safe harbour has been used widely, including under our own safe harbour rules in the U.S. There has been a lot of press recently around the harms that are happening within recommendation systems—within YouTube and its AI, which is clearly aligned with profit and designed by humans. There's an issue there.

5:25 p.m.

Liberal

Nathaniel Erskine-Smith Liberal Beaches—East York, ON

I guess the idea is that as algorithms replace editors, we have to hold those who use algorithms for profit to account in the same way.

5:25 p.m.

Chief Executive Officer, Digital Content Next

Jason Kint

The auditing of algorithms is a concept you have in there too, which I think is very wise to look at.

5:25 p.m.

Liberal

Nathaniel Erskine-Smith Liberal Beaches—East York, ON

Thanks very much.

5:25 p.m.

Conservative

The Chair Conservative Bob Zimmer

As the chair, I'm just going to ask one question.

Not to be overly simplistic, but I see the solution as quite simple. Penalties are one thing we heard about from the information commissioner in the U.K., and something we need to have on this side as well, along with limiting data collection and understanding that data is a multi-level thing, not just data in a general sense.

You have the floor with us. We're listening to exactly what you say. What I'm going to ask all three of you is simply this. If you have one last thing you really want us to hang onto, what would it be? What would you leave us with in regard to what we're talking about today on data collection and government services, or just in general?

We'll start with Mr. Vickery.

5:25 p.m.

Director of Cyber Risk Research, UpGuard, As an Individual

Chris Vickery

I would leave you with a very bleak current outlook on it. Things need to change dramatically.

Right now, things are 10 times worse than you think they are, and we need action. We need less talk and more action.

5:25 p.m.

Conservative

The Chair Conservative Bob Zimmer

We do have time for this, but what do you mean by "worse than we think"? Could you explain that a bit?

5:25 p.m.

Director of Cyber Risk Research, UpGuard, As an Individual

Chris Vickery

Everybody has breach fatigue because they see all of the data breaches in the news. The number of data breaches that get mentioned in articles is abysmally small. The number of breaches that actually occur, and the amount of data being passed around, whether you count employees sharing too much or sending data to their personal email in the course of business, is 10 times, 100 times larger than anybody on this committee has their head wrapped around. It is horrifying how bad it is out there right now.

5:25 p.m.

Conservative

The Chair Conservative Bob Zimmer

Thank you, Mr. Vickery.

Mr. Carroll, and then Mr. Kint.

5:25 p.m.

Associate Professor, Parsons School of Design, The New School, As an Individual

David Carroll

The United States, Canada and other countries need to adopt some version of the GDPR, some adaptation of that model. The California act moves the needle in the United States, and there will be an aggressive race to pre-empt state law with a national privacy act of some sort before the California act comes into force.

It's really a key moment for the United States and Canada to lead and, in a sense, catch up with two decades' worth of data protection that our friends across the Atlantic have achieved.

5:25 p.m.

Conservative

The Chair Conservative Bob Zimmer

Mr. Kint.