Evidence of meeting #155 for Justice and Human Rights in the 42nd Parliament, 1st Session. (The original version is on Parliament’s site, as are the minutes.)

A video is available from Parliament.

Also speaking

Colin McKay  Head, Government Affairs and Public Policy, Google Canada

4:25 p.m.

Liberal

Ron McKinnon Liberal Coquitlam—Port Coquitlam, BC

I understand your point about anonymity being sometimes desirable, and many people might need it in certain circumstances.

What I'm looking for as an end-user is to be able to, say, exclude from my feed, by my choice, content that was not authenticated, perhaps. Right?

4:25 p.m.

Head, Government Affairs and Public Policy, Google Canada

Colin McKay

To further refine my point, it's important to realize that YouTube is a platform for content creators, so essentially the people who are on YouTube are people who are trying to build an audience and a community around shared interests. The vast majority of them have given you the information to be able to verify and have some level of certainty about their authority if not their actual identity, whether you're talking about repairing small engines, doing model trains or political commentary, and whether you're talking about traditional news organizations or purely online news organizations.

For the vast majority of material or content, you are operating in an environment where you are able to identify and then qualify what you are looking at. A lot of my opening statement, as well as a lot of our online and automated processes, is based on that immediate response to a crisis or that attempt to incentivize violent extremism or hatred online, where we're trying to do exactly what you're trying to describe. That is to say, wait a second, what's the outlier, what's the uploader, who is the user of our service trying to pursue a negative outcome, and can we identify them and qualify them and then apply our policies against them, whether it means limiting the availability of those videos or taking them down from the system?

We are trying to pursue that goal, not just within the context of authentication.

4:25 p.m.

Liberal

Ron McKinnon Liberal Coquitlam—Port Coquitlam, BC

I've got a few seconds left.

Perhaps you can talk to us about bots. I understand that sometimes they're beneficial and other times they're malicious. I understand you can identify when a bot is at play. Perhaps you could tell us about standards and what action could be taken to control them.

4:25 p.m.

Head, Government Affairs and Public Policy, Google Canada

Colin McKay

I think from our point of view, I'd back away from bots to a wider perspective, which is that, across our system, we've long had experience with automated attacks on our infrastructure as well as our services. What we have focused on over time is providing the signals to our users that they are being subjected to an automated attack that is trying to either compromise their Google account, their Gmail account, or to present misinformation or disinformation to them. That goes all the way back to providing notices to users that they could be subject to a state-sponsored attack on their Gmail account.

Through this sort of deep-level analysis that I described, which analyzes videos writ more broadly across our infrastructure, we are trying both to identify when we see systemic attempts to breach the security of our systems and also to raise the profile and popularity of content, whether it's on search or whether it's on YouTube, to battle that. From our point of view, it's a very different context from the other services, but it's something in which we've historically invested a lot of money and time, both in combatting these attacks and in providing flags to our users so they're aware that they're being subjected to these attacks or that there's an attempt to try to influence them in this way.

June 4th, 2019 / 4:25 p.m.

Liberal

Ron McKinnon Liberal Coquitlam—Port Coquitlam, BC

Thank you.

4:25 p.m.

Liberal

The Chair Liberal Anthony Housefather

Thank you.

Ms. Moore, you have the floor.

4:25 p.m.

NDP

Christine Moore NDP Abitibi—Témiscamingue, QC

Thank you.

I'd like to invite my colleagues to pick up their phones, if they wish.

If you do a quick Google search of the phrase “how to pimp,” there are countless videos available that inform people of how to take part in human trafficking. Why haven't these sites and videos been removed?

4:25 p.m.

Head, Government Affairs and Public Policy, Google Canada

Colin McKay

I'm sorry. What was the phrase?

4:30 p.m.

NDP

Christine Moore NDP Abitibi—Témiscamingue, QC

How to pimp.

4:30 p.m.

Head, Government Affairs and Public Policy, Google Canada

Colin McKay

How to pimp, okay.

The reality is that we're constantly fighting against issues and content like this. It's a constant effort to identify the context within which there are comments in a video, and then to create what we call a “classifier”, which is an automated system to identify them on a broad scale.

If you see content like that, there is the opportunity to flag it right there on the YouTube video on your mobile device or the page. We use that as a signal to recognize that, wait a second, there is behaviour here and content that needs to be removed. Obviously, that's something we don't want in our system.

4:30 p.m.

NDP

Christine Moore NDP Abitibi—Témiscamingue, QC

Okay. So when you remove those videos and websites explaining how to pimp, is there any information that is transmitted, for example, to police forces or local authorities? If someone has a complete guide explaining how to engage in human trafficking, I think it involves criminal activity. Do you flag the police of that country, for example, and say, “Maybe that guy is involved in something criminal, and you could take a look,” or do you just remove the video because there are just too many of them all the time and you don't have the time to follow up?

4:30 p.m.

Head, Government Affairs and Public Policy, Google Canada

Colin McKay

I'm not sure in this case. I can follow up with you.

4:30 p.m.

NDP

Christine Moore NDP Abitibi—Témiscamingue, QC

Okay.

In addition, the English keywords are accurate, because most people use that language. However, what about algorithms in other languages that may be used less often, such as French? Is it easier to spread hate if you use a language other than English, because the algorithms for keywords are not as well developed?

4:30 p.m.

Head, Government Affairs and Public Policy, Google Canada

Colin McKay

I think we have addressed that broadly, because we've long focused on providing our services in many languages. We have actually developed the artificial intelligence translation systems to be able to translate upwards of 200 languages. Our systems aren't focused solely on English terms and English challenges. It's broader. It's international, and the review teams that I described are also international.

Our team that reviews content is made up of 10,000 people distributed around the world, specifically so that they can have the linguistic, cultural and societal background to understand the context within which they are seeing comments and material, and make decisions about whether or not the particular content or account needs to be taken down.

We recognize that challenge, and we're using a combination of automated processes, some of the best individual language specialists and some of the best language translation software in the world as part of that process.

4:30 p.m.

NDP

Christine Moore NDP Abitibi—Témiscamingue, QC

How quickly are you able to remove a video that has already been removed and then modified, for example, with sped-up audio? They do that. I've often seen that with my daughter. There are people producing Paw Patrol videos played a little faster so that they are not recognized by the system and they are able to publish their video.

In terms of hate videos, are you able to quickly remove a video that has already been removed once and has been modified just to avoid those controls?

4:30 p.m.

Head, Government Affairs and Public Policy, Google Canada

Colin McKay

Yes, we are.

I recognize the example you described. I've seen that as well. That is one of the challenges, especially immediately after a crisis. We're seeing content being uploaded and they are playing with it a little bit to try to confuse our systems.

What we do, particularly in the case of hate content and violent content online, is to tighten the standards within which we identify videos so that we're taking them down even more quickly.

Even in the context of Paw Patrol, I think your daughter will likely find that if she goes back to the same channel two weeks later, they may not have the Paw Patrol content because it will have been recognized and taken down.

4:30 p.m.

Liberal

The Chair Liberal Anthony Housefather

You have one minute left.

4:30 p.m.

NDP

Christine Moore NDP Abitibi—Témiscamingue, QC

Okay.

I would like to know a little bit more about the process of reviewing flagged videos, and who reviews them when it's not done by a computer.

Also, are the workers reviewing these videos provided with any services? Having to listen to these kinds of things all the time causes a lot of distress to people. What services are you providing to these workers to make sure they do not go crazy from listening to all of these things all the time?

4:30 p.m.

Head, Government Affairs and Public Policy, Google Canada

Colin McKay

To begin with the process itself, as I mentioned, especially in the context of hate content, we are dealing with such a quantity that we rely on our machine learning and image classifiers to recognize content. If the content has been recognized before and we have a digital hash of it, we automatically take it down. If it needs to be reviewed, it is sent to this team of reviewers.

They are intensely trained. They are provided with local support, as well as support from our global teams, to make sure they are able to deal with the content they're looking at and that they have the supports they need. That is so that as they look at what can be horrific content day after day, they are in a work environment and a social environment where they don't face the same sorts of pressures that you're describing. We are very conscious that they have a very difficult job, not just because they're trying to balance rights versus freedom of expression versus what society expects to find online, but also because they have the difficult job of reviewing material that others do not want to review.

For us, whether they're based in one office or another around the world, we are focused on giving them training and support so they can do their job effectively and have work-life balance.

4:35 p.m.

Liberal

The Chair Liberal Anthony Housefather

Thank you very much.

Ms. Khalid.

4:35 p.m.

Liberal

Iqra Khalid Liberal Mississauga—Erin Mills, ON

Thank you, Mr. McKay, for coming in today.

I'm going to follow up Madame Moore's line of questioning.

How many reviewers do you have to review specifically Canadian content within Google Canada?

4:35 p.m.

Head, Government Affairs and Public Policy, Google Canada

Colin McKay

We have a global team that doesn't treat the content by jurisdiction or region. Depending on what the pressure point may be or where the flow of content may be coming from, they will deal with that as a flow.

Where they get their insight and their expertise on Canada is in part from guidance from my team and my colleagues who work for Google in Canada. Also, we have a sophisticated mechanism for ensuring that the cultural, social and political context within which content is being reviewed is recognized within that review process.

4:35 p.m.

Liberal

Iqra Khalid Liberal Mississauga—Erin Mills, ON

How long does it take you to remove something once it's reported or flagged to you? What's the specific timeline?

4:35 p.m.

Head, Government Affairs and Public Policy, Google Canada

Colin McKay

It varies, depending on the context and the severity of the material.

We've already had examples in our conversation today about whether or not it's commentary or it's news reporting, or it's actual video of a violent attack. In the context of the Christchurch attack, we found that there were so many people uploading the videos so quickly that we had to accelerate our artificial intelligence review of the videos and make on-the-fly decisions about taking down video, based on its being substantially similar to previous uploads.

In that process, the manual review was shortened extremely because we were facing a quantity.... In a case where there's broader context to be considered, there's still a commitment to review it quickly, but we do need a process of deliberation.

4:35 p.m.

Liberal

Iqra Khalid Liberal Mississauga—Erin Mills, ON

In your opening remarks, you spoke about different countries having different legislation. This is something we've heard before this committee: that our government needs to set requirements for providers such as yourselves to remove all posts that would constitute hate speech, and that failure to do so in a timely manner would result in accountability measures, significant fines, etc.

Can you talk a little about what other jurisdictions are doing? How do you keep your global team updated with all the varying legislation within the different countries?