Evidence of meeting #148 of the Standing Committee on Access to Information, Privacy and Ethics in the 42nd Parliament, 1st Session. (The original version is on Parliament's site, as are the minutes.)

A video is available from Parliament.

Witnesses

Colin McKay (Head, Public Policy and Government Relations, Google Canada)
Jason Kee (Public Policy and Government Relations Counsel, Google Canada)

4:30 p.m.

Liberal

Frank Baylis Liberal Pierrefonds—Dollard, QC

I would like to see the exact calculations that were done when they predicted they couldn't meet this date. Does that make sense or not? Is this some kind of bad question I'm asking? I'm just curious.

4:30 p.m.

Public Policy and Government Relations Counsel, Google Canada

Jason Kee

Given the fact that as of June 30, legal obligations came into force and that [Inaudible-Editor]

4:30 p.m.

Liberal

Frank Baylis Liberal Pierrefonds—Dollard, QC

They couldn't meet them, I got that.

4:30 p.m.

Public Policy and Government Relations Counsel, Google Canada

Jason Kee

The question was a binary one, simply “Can you do it by this date?” The answer was no. That was where we were.

4:30 p.m.

Liberal

Frank Baylis Liberal Pierrefonds—Dollard, QC

Okay, so how did they get to the “no”? They were asked, “Can you do it in six months?” They said, “No, we can't.”

How did they get there? Did they just say, “No, can't do it”, or did they make some type of calculation?

It's a simple question. I'm asking you, “Can you get it done in this time frame?” “No.” Did you just say no off the top of your head, or did you do some type of work to say when you can't get it done by?

Facebook said, “Well, our engineers are maybe a little smarter,” or “Our systems are clearly not as complex as their wonderful systems,” or “We have more money than Google to do it.” I don't know what they did, but they did some calculations and said, “Yeah, we can do it.”

You guys did some calculations and said no, or did you just say off the top of your head, “No, we can't do it”?

Did you make it up or did you at least do some type of work? That's what I'm asking; and if you did the work, I'd like to see it.

4:30 p.m.

Public Policy and Government Relations Counsel, Google Canada

Jason Kee

As I said, we examined the requirements and it became clear that we simply couldn't comply within the time frame. To be clear, Microsoft has also said it can't do it; Yahoo! is still undetermined, but is likely in the same position; and many others are not going to be able to meet the requirement.

4:30 p.m.

Liberal

Frank Baylis Liberal Pierrefonds—Dollard, QC

Facebook can do it; and Microsoft, you said....

Yahoo! is doing some type of calculations to determine if they can do it.

You did some calculations, or did you make it up?

It's a simple question: Did you calculate it, or did you just decide you can't do it?

4:30 p.m.

Public Policy and Government Relations Counsel, Google Canada

Jason Kee

We were advised by our engineering teams that we would not be able to meet the requirement. That is—

4:30 p.m.

Liberal

Frank Baylis Liberal Pierrefonds—Dollard, QC

Did your engineering team just say it off the top of their heads, or did they do some type of work?

Why are you hiding from this? If you did the work, just say, “Look, Frank, we looked at it; it's going to take us 2.2 years and four months.”

Why are you so upset about it? Just tell me. Why are you hiding from it?

4:30 p.m.

Public Policy and Government Relations Counsel, Google Canada

Jason Kee

It's not a matter of being upset. It's just more the fact that there was a hard deadline and the question was a binary yes or no. The answer—

4:30 p.m.

Liberal

Frank Baylis Liberal Pierrefonds—Dollard, QC

Okay, but how did you get the answer to that question?

4:30 p.m.

Public Policy and Government Relations Counsel, Google Canada

Jason Kee

That is what I'm saying we'd have to look into.

4:30 p.m.

Liberal

Frank Baylis Liberal Pierrefonds—Dollard, QC

Did they just make it off the top of their heads, “We can't do it”, or did they do some calculations? You said Yahoo! is doing some calculations.

4:30 p.m.

Public Policy and Government Relations Counsel, Google Canada

Jason Kee

I've already told you that we would have to inquire.

4:30 p.m.

Liberal

Frank Baylis Liberal Pierrefonds—Dollard, QC

Are you going to come back with the dates—

4:30 p.m.

Public Policy and Government Relations Counsel, Google Canada

Jason Kee

I have to inquire.

4:30 p.m.

Liberal

Frank Baylis Liberal Pierrefonds—Dollard, QC

—and the calculations?

4:30 p.m.

Public Policy and Government Relations Counsel, Google Canada

Jason Kee

I have to inquire. I can't commit to coming back with calculations.

4:30 p.m.

Liberal

Frank Baylis Liberal Pierrefonds—Dollard, QC

If they've done—

4:30 p.m.

Conservative

The Chair Conservative Bob Zimmer

Thank you, Mr. Baylis. We are going to have more time, so if you want some more time, you can ask for that.

Next we'll have Mr. Dusseault.

Folks, we do have quite a bit of time. We have 55 minutes. We have these witnesses for the rest of the time. Anyway, just let me know. Typically, we try to end by five on Thursdays, but we'll see where it goes.

Mr. Dusseault, you have three minutes.

4:30 p.m.

NDP

Pierre-Luc Dusseault NDP Sherbrooke, QC

Thank you, Mr. Chair.

This time I will focus more on YouTube, a very popular and influential platform, just like Google, of which it is a part. I was talking about it earlier. YouTube sometimes directs users to extreme, unreliable content that reports or occasionally praises conspiracy theories. YouTube makes this content look like real information.

I was wondering whether you have any details about the algorithm applied once users are on a web page displaying, say, political content, since that is the subject of our discussion today. The algorithm gives them suggestions for other videos to the right of the page they are viewing, or under the video if they are using a mobile phone. What algorithm is used, and how transparent is this algorithm that suggests content to users when they are on a particular web page?

What mechanism is there to ensure that this content does not praise conspiracy theories or spread fake news, unreliable information or, perhaps, unbalanced information, in other words, information that may just promote a single idea or vision, or a political party?

What degree of transparency and what mechanism have you put in place to ensure that the content that is suggested to users is quality content, that it is balanced in terms of public policy, political parties and political ideas as well?

4:35 p.m.

Public Policy and Government Relations Counsel, Google Canada

Jason Kee

A number of factors go into the recommendation system. It's worth noting that the weighting applied to news and information content is different from, say, entertainment content. Initially, the recommendations were built more for entertainment content such as music, and they work extremely well for that.

When it was applied to news and information, it became apparent that there were some challenges, which is why we changed the weighting system. What that means is that once we evaluate a video, we look at its authoritativeness: we overweight authoritative sources and underweight information that isn't necessarily going to be authoritative or trustworthy.

That is very contextual. It depends on whether you're signed in or not, on the information available from your watch history, on the video you're watching, on the kinds of videos other people have liked to watch, and on what other videos people who liked this video go on to watch. This is why it is dynamic and will constantly evolve and change.
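To illustrate the kind of reweighting described in this testimony, here is a minimal Python sketch. All names, weights, and the scoring formula are hypothetical assumptions for illustration; Google has not published its actual ranking code.

```python
# Hypothetical sketch: overweight authoritativeness for news content
# while leaving entertainment ranking engagement-driven, as described
# in the testimony above. Names and weights are invented.

from dataclasses import dataclass

@dataclass
class Video:
    video_id: str
    category: str             # e.g. "news" or "entertainment"
    authoritativeness: float  # 0.0-1.0, from an assumed source-quality model
    engagement: float         # 0.0-1.0, watch-time / co-watch signal

def recommendation_score(video: Video, watch_affinity: float) -> float:
    """Combine engagement and viewer affinity; boost authority for news."""
    base = 0.5 * video.engagement + 0.5 * watch_affinity
    if video.category == "news":
        # News/information is reweighted: authoritative sources are
        # boosted, low-authority sources are demoted.
        return base * (0.5 + video.authoritativeness)
    return base  # entertainment keeps the engagement-driven score

videos = [
    Video("a", "news", authoritativeness=0.9, engagement=0.4),
    Video("b", "news", authoritativeness=0.1, engagement=0.8),
]
ranked = sorted(videos, key=lambda v: recommendation_score(v, 0.6), reverse=True)
print([v.video_id for v in ranked])  # the authoritative video ranks first here
```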

In addition to making tweaks and changes to that system over time to ensure that we're providing more authoritative information in the case of news, we're also adding contextual pieces: clear flags, labels and contextual boxes to indicate when subject matter or individuals are frequently the target of misinformation. For example, there's the conspiracy theory issue that you raised.

Essentially, if you see a video that may be suggesting vaccine hesitancy, you'll get information saying that this is not confirmed by science, along with more contextual information about what that video is covering. The same thing applies to things like 9/11 conspiracy theories. This will be an ongoing process.

Mostly, we want to make sure that even if a user is seeing information, they're actually given context so they can properly evaluate it themselves.
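The flags and contextual boxes described here can be pictured as a simple topic lookup that attaches an information panel to videos touching flagged subjects. This is a hypothetical sketch under assumed names, not YouTube's implementation.

```python
# Hypothetical sketch: attach a context panel when a video touches a
# topic known to attract misinformation, per the testimony above.

from typing import Optional

CONTEXT_PANELS = {
    "vaccines": "Vaccine safety: context from a public health authority.",
    "9/11": "The September 11 attacks: context from an encyclopedic source.",
}

def attach_context_panel(video_topics: list[str]) -> Optional[str]:
    """Return a context panel for the first flagged topic, if any."""
    for topic in video_topics:
        if topic in CONTEXT_PANELS:
            return CONTEXT_PANELS[topic]
    return None  # no panel: the video touches no flagged subject

print(attach_context_panel(["vaccines", "parenting"]))
```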

4:35 p.m.

NDP

Pierre-Luc Dusseault NDP Sherbrooke, QC

I would like to use the seconds I have left to discuss the transparency of this algorithm. Earlier, I think Mr. McKay talked about this issue of transparency with respect to advertising, that is, why you are shown certain ads. There even seems to be a new feature in the Chrome browser that allows users to see why a given ad has been shown to them.

Is it possible to have the same functionality for content recommended to YouTube users? This would give them a better understanding of why certain content is suggested to them rather than other content.

4:35 p.m.

Public Policy and Government Relations Counsel, Google Canada

Jason Kee

Certainly, finding means by which we can actually increase transparency and users can understand the context in which they're being served information is something that we're constantly working on.

We produced a report (I think it was 25 pages) on how Google fights disinformation. It includes an entire dedicated section on YouTube that explains much of what I described to you, as well as the general factors that go into the system. It's something we'll keep striving towards, as we're doing for ads on the YouTube platform as well.

4:40 p.m.

Head, Public Policy and Government Relations, Google Canada

Colin McKay

Just to add a supplement to what Mr. Kee said, on your Google account writ large, you can go into "myaccount" and it will show what we've identified as your interests across all of our products and services. You can go to myaccount and it will say in general terms that you like 1980s music and racing videos. It will give you that general observation, which you can then correct. You can delete that information or add additional interests so that across our services we have a better understanding of what you're interested in, and also of what you don't want us to serve.

On the page itself, there is a little three-dot bar beside the videos, the specially recommended videos (on mobile, as you mentioned), where you can signal that you're not interested in that content or that you would like more of it. There is granular control over not seeing that content in your video feed, your news feed or across Google services.
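The two controls Mr. McKay describes, the editable interest list on the account page and the per-video "not interested" signal from the three-dot menu, could be modelled roughly as follows. All class and method names are hypothetical, invented for illustration.

```python
# Hypothetical sketch of the user-facing personalization controls
# described above: an editable interest list plus a per-video mute.

class UserProfile:
    def __init__(self) -> None:
        self.interests: set[str] = set()
        self.not_interested: set[str] = set()  # video IDs the user muted

    def add_interest(self, interest: str) -> None:
        self.interests.add(interest)

    def delete_interest(self, interest: str) -> None:
        self.interests.discard(interest)

    def mark_not_interested(self, video_id: str) -> None:
        # Downstream, a recommender would filter these out of the feed.
        self.not_interested.add(video_id)

profile = UserProfile()
profile.add_interest("1980s music")
profile.add_interest("racing videos")
profile.delete_interest("racing videos")   # the user corrects the profile
profile.mark_not_interested("video123")    # the three-dot "not interested" signal
print(profile.interests, profile.not_interested)
```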