Evidence of meeting #38 for Status of Women in the 42nd Parliament, 1st Session. (The original version is on Parliament’s site, as are the minutes.) The winning word was content.

A recording is available from Parliament.

Also speaking

Lauren Skelly, Senior Policy Analyst, Google Canada
Malika Saada Saar, Senior Counsel, Human and Civil Rights, Google

3:50 p.m.

Senior Policy Analyst, Google Canada

Lauren Skelly

We do not serve pornography unless you're explicitly searching for it. There are obviously some terms where there is some overlap, for example, “cougar”. We make sure to demote the pornographic sites for that term so that you don't stumble upon them without having intended to.

With child sexual abuse imagery online, we've partnered with a number of non-profits to develop a pretty robust database that allows us to identify those images or those pieces of content with something called “hashes”. That's a partnership with Facebook, Microsoft, and NCMEC, the National Center for Missing & Exploited Children, in the States. We don't serve that content at all, globally.
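The hash-based matching described here can be sketched in a few lines. This is an illustrative simplification only: production systems such as Microsoft's PhotoDNA use perceptual hashes that survive resizing and re-encoding, whereas the cryptographic hash below matches only byte-identical files. All function names and sample byte strings are invented for the example.

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    """Return a hex digest identifying this exact byte sequence."""
    return hashlib.sha256(image_bytes).hexdigest()

# A shared database of fingerprints of known abusive content.
# Partners exchange only these digests, never the images themselves.
known_bad_hashes = {fingerprint(b"known-abusive-image-bytes")}

def is_known_bad(image_bytes: bytes) -> bool:
    """True if an upload matches a fingerprint in the shared database."""
    return fingerprint(image_bytes) in known_bad_hashes
```

The key design point is that platforms share only the fingerprints, not the images, so a common database can be maintained across companies without redistributing the abusive content itself.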

With underage people accessing pornography, we do have settings in our products that you can use so that it never surfaces. There's something called SafeSearch: no pornography will surface when it is turned on, and turning it on automatically enables safety mode on YouTube as well.

There are controls that you can access. As a parent, having more of those choices is better.

3:50 p.m.

Conservative

Rachael Thomas Conservative Lethbridge, AB

Right.

You said that they won't automatically be taken to a pornographic website, let's say, but what about pornographic images? It's really not that hard for a pop-up to appear on my screen, so is there a way that algorithms can be used to prevent that?

3:50 p.m.

Senior Policy Analyst, Google Canada

Lauren Skelly

Yes, but it has to be very narrowly targeted, because if you think about content like breastfeeding, if you just take down all images with breasts, we're going to miss out on all of those educational resources. It's like a race to the bottom; it's a very slippery slope.

Knowing that those tools exist, I think parents can implement enough of what we have in place that this content isn't going to surface, but in some ways it's almost inevitable. If someone really wants to find it, they'll be able to find it. Even if we take it off Google Search, the website still exists.

3:50 p.m.

Conservative

Rachael Thomas Conservative Lethbridge, AB

Okay.

Is there a way that algorithms can be used specifically to mitigate or thwart violence, like violent acts, violent words, or any sort of violence that takes place against women? Is there a way of using algorithms for that, and is there a way that you're currently doing that right now?

3:50 p.m.

Senior Policy Analyst, Google Canada

Lauren Skelly

On YouTube specifically, we do not allow violence on the platform whatsoever. It is against our community guidelines, and such videos are taken down. That is done through user flagging: if someone flags a video as inappropriate, we have a team that reviews those flags 24/7.

Malika, do you know of any other instances?

3:50 p.m.

Senior Counsel, Human and Civil Rights, Google

Malika Saada Saar

I think Conversation AI will be another opportunity to regulate and monitor for hate speech in comments.
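As a rough illustration of how a comment-scoring system like Conversation AI can feed a moderation flow: a trained model assigns each comment a toxicity score, and comments above a threshold are routed for human review. The toy word-list scorer below merely stands in for the real machine-learned model; the lexicon, threshold, and function names are all invented for this sketch.

```python
# Hypothetical stand-in for a learned toxicity model.
ABUSIVE_TERMS = {"idiot", "trash"}  # invented lexicon, not the real model

def toxicity_score(comment: str) -> float:
    """Fraction of words that appear in the abusive-term lexicon."""
    words = comment.lower().split()
    if not words:
        return 0.0
    return sum(w.strip(".,!?") in ABUSIVE_TERMS for w in words) / len(words)

def needs_review(comment: str, threshold: float = 0.2) -> bool:
    """Route high-scoring comments to human moderators, not auto-removal."""
    return toxicity_score(comment) >= threshold
```

Routing to reviewers rather than deleting automatically reflects the concern raised earlier in the testimony: overly broad filters sweep up legitimate content, so the score is a triage signal, not a verdict.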

3:50 p.m.

Conservative

Rachael Thomas Conservative Lethbridge, AB

Is there a way that algorithms can possibly be used to help redirect the narrative in terms of the way that women are perceived?

3:55 p.m.

Senior Counsel, Human and Civil Rights, Google

Malika Saada Saar

I think that's part of the work with Conversation AI, and I think that's part of the question we have before us. We in the States have been doing a lot of work with women's rights groups around this issue. How do we make sure that the algorithms don't reproduce misogyny, and how do we use algorithms in a way to help forge a new language of respect?

I think this is emerging, and for us what's been very important is to bring in those different women's rights organizations to think through how we use this new technology in a way that is thoughtful.

3:55 p.m.

Senior Policy Analyst, Google Canada

Lauren Skelly

One other way that I think we can counterbalance this is with counterspeech, by encouraging these organizations to flood these places with counterspeech against hate and violence.

We've done some things with our YouTube creators called Creators for Change. We created a fund for them to create counterspeech content with creators who are highly influential, with tons of followers and stuff like that, and I think that's a very effective way of beating them at their own game.

3:55 p.m.

Conservative

The Chair Conservative Marilyn Gladu

Excellent.

We have Ms. Malcolmson for seven minutes.

3:55 p.m.

NDP

Sheila Malcolmson NDP Nanaimo—Ladysmith, BC

Thank you.

Thank you to the witnesses, and I'm looking forward to learning more from you.

I want to start with this question. We're hearing a lot of thanks from all of the corporate witnesses we've had, and an indication of how reliant you are on some of the NGOs that are doing cyberviolence-prevention and digital literacy work, and what good work they're doing. We heard from them, when they came to testify, that they have far more work than they can do well, and that operational funding is a real barrier for them.

I can also imagine that if they're funded by Google or Twitter, the optics of that may not look great from an arm's-length perspective.

I wonder, either on the international side or the Canadian side, if you have any perspectives on what we can do as a federal government to make sure that those independent NGOs have the capacity to do the work that we as a society are relying on them for.

3:55 p.m.

Senior Policy Analyst, Google Canada

Lauren Skelly

Malika, maybe you can speak, since you were in one of these organizations until recently, and I can talk to the Canadian context.

3:55 p.m.

Senior Counsel, Human and Civil Rights, Google

Malika Saada Saar

When I was on the NGO side, the lion's share of my work was around child sex trafficking, and we worked with Google on a campaign we launched called No Such Thing—that there is no such thing as a child prostitute, only victims and survivors of child rape.

What was powerful about having Google as a funder was that it signalled to other funders that we were a legitimate, strong, viable organization. There was no question that the support from Google led to support from other foundations as well as other corporations. I think that's part of the importance of a Google, a Facebook, a Twitter supporting these types of NGOs. It's because it signals to other funders and individual donors that these are important organizations to be supported.

3:55 p.m.

Senior Policy Analyst, Google Canada

Lauren Skelly

I completely agree with what Malika said, and we've heard that from our partners here as well. Brands like Google, Twitter, and Facebook carry a certain weight that I think gives viability to a lot of these organizations. Obviously the funding is very, very important, but we can also offer areas of expertise that these organizations might not have.

For example, I know we second a lot of our engineers to the National Center for Missing & Exploited Children in the U.S. We lend them our engineering expertise to help them build out their databases and their technical capacity. I think there's always more we can do in this area. We will never be doing enough.

3:55 p.m.

NDP

Sheila Malcolmson NDP Nanaimo—Ladysmith, BC

I have another question on the regulatory side, because we're looking at what we can actually change on the ground from the federal legislative side. Are there protections for young women and girls that you're surprised Canada doesn't have? That's as a comparison, because you're an international company. Are there any best practices or best models around what we can do, insofar as legislation goes, when things go sideways?

3:55 p.m.

Senior Policy Analyst, Google Canada

Lauren Skelly

I'm not familiar with this part of the law in depth, and I don't really feel that I'd be comfortable in making a comparison.

Malika, do you have any insight into the Canadian law?

4 p.m.

Senior Counsel, Human and Civil Rights, Google

Malika Saada Saar

What I would say, from part of what we've learned here and from my own lived experience, is that it's very important for law enforcement to be trained around these forms of cyberviolence, for law enforcement to understand that being cyberharassed is the same as being physically harassed and warrants consideration and response.

I went to a number of different police departments begging them to help me. The first place I went to I was laughed at. It was only because of my own persistence and good fortune that I was able to find a detective who did understand that this is a form of violence against women and girls.

I think it's absolutely critical that in the U.S. and in Canada, we really invest in training law enforcement to be able to understand this and be responsive. I see it as how we did the work around domestic violence. We named domestic violence as a form of violence against women. We also had to do the hard work of training law enforcement to understand that and to have thoughtful, responsive protocols. We have to do the same thing around this issue.

4 p.m.

NDP

Sheila Malcolmson NDP Nanaimo—Ladysmith, BC

A recommendation I can imagine coming out of this work is that, because we have in Canada a real patchwork of municipal, regional, first nations, provincial, and territorial police forces, the federal government could take on the role of encouraging all of those jurisdictional partners to adopt that standard of training, so that a victim of cyberviolence, wherever in the country they report it, has some expectation of a similar level of education. Police work should be trauma-informed and digitally literate.

4 p.m.

Senior Counsel, Human and Civil Rights, Google

Malika Saada Saar

That's exactly right. I would also say that it is good to be linked to the schools as well.

Part of what we saw play out, and continue to see play out, in the U.S. is that when girls are cyberbullied, it's often called slut shaming. It's not even contemplated as a form of cyberviolence, right? It's instead about regulating and judging the girls' behaviour. It's so critical for schools to be able to recognize that when girls are cyberbullied, often in a very sexualized manner, it is a form of violence done to them, not a matter of judging or regulating their sexual behaviour.

4 p.m.

NDP

Sheila Malcolmson NDP Nanaimo—Ladysmith, BC

That's very helpful and very consistent with testimony we've had from front-line NGOs, from victims, and from survivors, so thank you all for your work and your time.

4 p.m.

Conservative

The Chair Conservative Marilyn Gladu

Excellent.

Now we'll go to Mr. Fraser for seven minutes.

December 7th, 2016 / 4 p.m.

Liberal

Sean Fraser Liberal Central Nova, NS

Thank you very much, everyone, for being here. I really appreciate your testimony.

First, I hope we can cover this one quickly so I can move on. I think it was Ms. Saada Saar who mentioned that there were tools to remove offensive content from the search results, but of course Google can't remove content from someone else's site. When you do identify controversial or abusive content, is that shared with law enforcement or somebody who could potentially have the content removed from another host?

4 p.m.

Senior Counsel, Human and Civil Rights, Google

Malika Saada Saar

Conversation AI is in the process of being created and is not a finished product, but here's an example of this issue around removing content that I've been able to witness up front. When we have child sexual abuse imagery—and we scrub for that on all our platforms—we then reach out to NCMEC, the National Center for Missing & Exploited Children, and we red-flag that content with them. If it is child sexual abuse imagery, that content is removed.

4 p.m.

Liberal

Sean Fraser Liberal Central Nova, NS

Is there any reason we couldn't do a similar thing with explicit content that is shared without the consent of the subject of that content?

4 p.m.

Senior Counsel, Human and Civil Rights, Google

Malika Saada Saar

That's revenge porn.