Evidence of meeting #122 for Access to Information, Privacy and Ethics in the 42nd Parliament, 1st Session. (The original version is on Parliament’s site, as are the minutes.)

A video is available from Parliament.


Also speaking: Colin McKay, Head, Public Policy and Government Relations, Google Canada

October 23rd, 2018 / 12:45 p.m.

Liberal

Anita Vandenbeld Liberal Ottawa West—Nepean, ON

Thank you very much.

I'd like to go back to the part you discussed at the beginning, when you outlined the key areas of controls, particularly, in this case, not the Google search engine but YouTube. Obviously, certain things on YouTube that go against the user policy are automatically removed, or the person is informed that content has been removed because it violates the policy.

I would be very interested in knowing how those determinations are made. For instance, is it an individual person who looks at those? Is it an algorithm? Are you using some sort of AI? Are there keywords that you're looking at?

There are a couple of things that I'm a bit concerned about. After the testimony at this committee of Mr. Vickery, I posted my questions back and forth, just like you and I are doing right now. It's televised; it's on ParlVu. One of the questions that I asked was about the fact that some of this data had been found on a Google drive. When I went to post that intervention, which was from a parliamentary television site, it was found to violate the YouTube.... The only caption was the name of our study, which is the breach of personal information involving Cambridge Analytica and Facebook. It was removed, and I was told that I would have penalties. I went for a review, and of course, after a review it was posted back on again.

I know of another member of Parliament who asked a question in question period about cannabis, and that was removed because it was said that he was promoting the use of drugs.

How are these determinations made? What are the algorithms or terms, or how do you do that with YouTube? There are, at the same time, an awful lot of things on YouTube that promote hate, that are anti-democratic, that are perhaps even put there by interests that have links to international crime.

I worry that the way these algorithms are being used might not necessarily be capturing what we really want to remove, while free speech in an environment like this, which is a parliamentary committee, has been actually caught in this net.

12:50 p.m.

Head, Public Policy and Government Relations, Google Canada

Colin McKay

You're describing two very specific cases that I'll dive into right after an observation.

We have intensive review processes that are driven by algorithmic decision-making as well as a focus on keywords, and also on details about the account itself. For example, if a brand new account attempts to post flagged material, you may see that you're not able to post it publicly before it's reviewed. You may see that you're not able to monetize it until we've actually reviewed it. We go through this process to ensure that we're checking it against the corpus of material that we are aware of.

I say that specifically because in your first instance, it's quite possible that when you posted a video of televised CPAC material.... CPAC has worked with us to register CPAC material as material that needs to be reviewed under our content ID guidelines. That's our protection for copyright holders who want to prevent television programs, movies, and music from being posted by users without their permission.

12:50 p.m.

Liberal

Anita Vandenbeld Liberal Ottawa West—Nepean, ON

It's a very common thing, CPAC. I've posted things from committees all the time.

12:50 p.m.

Head, Public Policy and Government Relations, Google Canada

Colin McKay

That's my supposition as to why that happened.

12:50 p.m.

Liberal

Anita Vandenbeld Liberal Ottawa West—Nepean, ON

That wasn't the reason given.

12:50 p.m.

Head, Public Policy and Government Relations, Google Canada

Colin McKay

Okay. I didn't have that information.

On the cannabis example, this is a relatively fast-breaking space, and we're trying to adjust our internal systems to differentiate between the promotion and use of cannabis in the illegal context, especially in an international arena, versus what's happening in Canada and the fact that it's now legal. With our advertising systems, as you might imagine, we're learning internally, through that algorithm, through manual review, and through flagging by our users, that there's certain content that is now allowed and certain content that isn't.

It's an iterative process that combines technology and human intervention, and that's especially important in sensitive areas. What happens in the case of violent or extremist content is that once we're given an opportunity to recognize that a video or audio clip is objectionable and illegal, we can use that same content ID flagging system to identify it as soon as it's uploaded and make it unavailable, and in some instances we shut down that account.

12:50 p.m.

Liberal

Anita Vandenbeld Liberal Ottawa West—Nepean, ON

How many resources do you have in terms of actual people who are able to review this material? How much of this is just done automatically by algorithms—

12:50 p.m.

Head, Public Policy and Government Relations, Google Canada

Colin McKay

Hundreds.

12:50 p.m.

Liberal

Anita Vandenbeld Liberal Ottawa West—Nepean, ON

—and how much is it actually...? Do you have sufficient resources—

12:50 p.m.

Head, Public Policy and Government Relations, Google Canada

Colin McKay

I can get you a full number, but it's a continuing level of investment, both on a technological basis and a human basis, to provide the review.

12:50 p.m.

Liberal

Anita Vandenbeld Liberal Ottawa West—Nepean, ON

Thank you.

I will give the questions to Mr. Erskine-Smith if he wants them.

12:50 p.m.

Liberal

Nathaniel Erskine-Smith Liberal Beaches—East York, ON

I just have one question.

There were two topics that I failed to cover. One is algorithmic transparency. We've had witnesses come before us and say there need to be third party auditors who step into Google's offices and Facebook's offices and assess whether the algorithms you're employing are being used in an appropriate way. Would you have any problem with that?

12:50 p.m.

Head, Public Policy and Government Relations, Google Canada

Colin McKay

To me, that's looking at the wrong end of the processes, because in many cases that have been identified so far, it's the data that has demonstrated bias. If you're running an algorithm against biased data or an unrepresentative sample, then you're going to get erroneous or erratic results.

In some cases the algorithm is proprietary commercial technology, and I don't know if an auditor would have the capacity to evaluate what the algorithm is intended to do, or how they would evaluate working versus non-working and under what standard.

12:50 p.m.

Liberal

Nathaniel Erskine-Smith Liberal Beaches—East York, ON

Potentially, then, it's a practical problem for us to solve, but in theory there's not necessarily an objection.

The last thing is just on regulating speech. The right to be forgotten, potential privacy torts, defamation, harassment, hate.... There are reasons for material to be taken down.

Do you have a preferred model of how this information should be taken down? If Google doesn't want to be the police, who should be? Should Google pay into a fund to have a public body administering and making these decisions? Do you have a view as to how this should work?

12:50 p.m.

Head, Public Policy and Government Relations, Google Canada

Colin McKay

I'll just make one observation before I answer, because you mentioned the right to be forgotten.

There's an ongoing conversation around the right to be forgotten in Canada. One thing the Privacy Commissioner highlighted in his consultation document and his guidelines was that there's an inherent conflict with the charter on freedom of expression and that there needs to be a full-throated conversation in public, ideally at this committee, around how the right to be forgotten should be exercised in Canada.

From our point of view, these processes exist. They're called the courts. The courts have an understanding of both the standards and the public expectations. They have the tools available to them to—

12:55 p.m.

Liberal

Nathaniel Erskine-Smith Liberal Beaches—East York, ON

We can't expect the courts to take down content all the time, though. There are so many pieces of content that should be taken down. You can't expect someone to pay a lawyer like me $400 an hour to run off to court and take down a little comment here and a little comment there. What would be a better system?

12:55 p.m.

Head, Public Policy and Government Relations, Google Canada

Colin McKay

The alternative that you're describing to me is the process that exists in Europe, where you ask us to make an evaluation about whether your protected expression violates—

12:55 p.m.

Liberal

Nathaniel Erskine-Smith Liberal Beaches—East York, ON

Does it cost too much? Doesn't Google make enough to pay for it?

12:55 p.m.

Head, Public Policy and Government Relations, Google Canada

Colin McKay

—a fundamental charter right, and then you are giving us the responsibility of an administrative court. The question is whether you want to do that or not.

12:55 p.m.

Conservative

The Chair Conservative Bob Zimmer

We have a couple of minutes left. If we can divide that up, it would be great.

12:55 p.m.

Conservative

Peter Kent Conservative Thornhill, ON

I have one very quick question.

Fortune magazine tells us that Google outspent every other company in North America last year in lobbying Washington. I'd like to know how much Google has spent in Canada lobbying governments at the federal, provincial and municipal level, and how many registered lobbyists you have in Canada.

12:55 p.m.

Head, Public Policy and Government Relations, Google Canada

Colin McKay

We have three registered lobbyists. That's my team. They work here in Ottawa. I can't give you the exact expenditures. They're not sizeable. What we do with that lobbying is this sort of interaction, whether on an individual level or at committee or with broader society. The reality of that number is that we are trying to be transparent about our interaction on what is obviously an increasingly complex set of subjects.

12:55 p.m.

Conservative

Peter Kent Conservative Thornhill, ON

Would Google Canada separate its lobbying efforts from those of Sidewalk Labs?

12:55 p.m.

Head, Public Policy and Government Relations, Google Canada

12:55 p.m.

Conservative

The Chair Conservative Bob Zimmer

Go ahead, Mr. Angus, for one minute.