Evidence of meeting #38 for Status of Women in the 42nd Parliament, 1st Session. (The original version is on Parliament’s site, as are the minutes.)

A recording is available from Parliament.

Also speaking

Lauren Skelly Senior Policy Analyst, Google Canada
Malika Saada Saar Senior Counsel, Human and Civil Rights, Google

3:30 p.m.

Conservative

The Chair Conservative Marilyn Gladu

I call the meeting to order.

We are very fortunate this afternoon to have representatives from Google and Google Canada with us.

By video conference, we have Malika Saada Saar. Here in the room we have Lauren Skelly, who is a senior policy analyst, and Jason Kee, who is counsel for public policy and government relations.

We're going to allow them to open up with their remarks; then we'll go to our round of questions.

We'll begin with you, Lauren.

3:30 p.m.

Lauren Skelly Senior Policy Analyst, Google Canada

Thank you very much.

Members of the committee, thank you for inviting Google to contribute to this very important study.

My name is Lauren Skelly. I am the senior policy analyst for Google Canada.

I'm joined today by two colleagues: Jason Kee, public policy counsel for Google Canada, and Malika Saada Saar, who, thanks to technology, is able to join us today from Washington, D.C., via video conference.

Malika is Google's senior counsel on civil and human rights. Before joining Google, Malika was co-founder and executive director of the Human Rights Project for Girls, Rights4Girls, a human rights organization focused on gender-based violence against young women and girls in the United States. Today, at Google, Malika provides strategic leadership on critical civil and human rights issues that may impact the company, its users, and the digital world in which we operate.

With that, I will ask Malika to take the lead on Google's remarks to the committee. We look forward to your questions.

3:30 p.m.

Malika Saada Saar Senior Counsel, Human and Civil Rights, Google

Thank you, Lauren.

Thank you, Madam Chair, for the invitation to appear today.

I want to commend your committee for tackling this issue and for the work it is doing with respect to this study, especially your interest in the digital world and how it relates to human rights and gender-based violence.

It is important for me to begin with a personal story that really informs so much of my work here at Google.

From 2012 to 2014, I was cyberstalked by a former colleague. This individual aggressively stalked me online, created false websites against me, and sent shaming emails to former colleagues at the Department of Justice, at the White House, and to my funders. He invented false identities through which he further harassed me.

After law enforcement rejected many of my requests for help in my situation, I finally found a detective who would take my case, but here's the thing. At one point during my conversation with that detective, an intern of mine overheard me. The intern approached me afterward and disclosed to me that she, a first-year law student, had been revenge-porned. Those revenge-porn images were essentially her only digital footprint. As a result, no firm would hire her for the summer.

I realized that while the cyberviolence done to me had real emotional consequences, I already had a digital footprint that balanced all of the wreckage done to me, but this young woman did not. As with all forms of gender-based violence, there are emotional as well as economic consequences of the violence against us as women and girls.

Google was founded on the principle that the free flow of information is crucial and must be preserved and protected culturally, socially, and economically. The free flow of information is essential to creativity and innovation, and leads to economic growth for countries and companies alike. However, there are legitimate limits we must look at, even where laws strongly protect free expression and we have clear processes for removals if content violates local laws.

Beyond what is legally required, we want our products to enable positive community interaction, so we have policies about what content we do and do not allow on our platforms. Assessing controversial content can require hard judgements, and there isn't always one clear answer, but we do our best to balance free expression with safety.

I know algorithms have been of particular interest to the committee. For a typical search query, there are thousands if not millions of web pages with helpful information. Algorithms are computer processes and formulas that take your questions and turn them into answers. Google search algorithms rely on more than 200 unique signals or clues that make it possible to guess what you might really be looking for.

Our philosophy is that a search should reflect the whole web, so while we comply with laws and remove content from search results in response to valid legal requests, we only go beyond that for a few narrow categories. For example, if a user searches for child sexual abuse imagery, or what we call child porn, we block the content. We also remove nude or sexually explicit images of individuals shared publicly without their consent—for example, revenge porn—by reviewing requests from people to remove images shared without their consent from search, and by demoting websites dedicated to revenge porn.

We prohibit revenge porn on all Google-hosted platforms, including YouTube, Blogger, G+, and Play.

But remember, removing controversial content from Google Search does not necessarily remove this content from the Internet. Even if Google deletes particular URLs from search results pages, the web page hosting the content in question still exists.

We provide resources so that users understand that webmasters control individual sites and the content on them. We help users contact webmasters in order to seek removal of content from the source. This is the only way to actually get the content removed from the web. We think of Google Search like your public library. Taking the index card out of the card catalogue doesn't remove the book from the library. Removing the search won't eliminate the source material.

We also rely on our community to send us signals when content violates our guidelines, much like an online Neighbourhood Watch program. On YouTube, for example, people can use the flagging feature located beneath every video and comment to help report content they believe violates our community guidelines. In 2015 alone, we removed 92 million videos for violation of our policies through a mix of user flagging and our spam detection technology.

We are always looking to new technologies to help counter hate speech online. Jigsaw, Google's think tank, is working on a set of tools called Conversation AI, which is designed to use machine learning to automatically spot the language of abuse and the language of harassment far more accurately than other keyword filters and far faster than any team of human moderators.

Creating a positive, safe online experience for kids and families is an absolute priority for us, and we do this in a number of ways.

First, we want to ensure parents and children have the tools and knowledge they need to make smart and responsible choices online. We are committed to building an informed and responsible generation of digital citizens. We have several programs that train kids and teachers in the basics of privacy, security, and conscientious behaviour online.

We deeply believe that companies like Google have a responsibility to ensure all the products and services we provide to families offer safe and secure experiences for them online. We build features into our products that put families in the driver's seat, such as safety settings on Search and YouTube, so that users have a way to filter out more explicit content. We've also built products with families in mind, such as YouTube Kids, our stand-alone app that makes it safer and easier for children to find videos on topics they want to explore, but in a safe and age-appropriate way.

There has been a huge amount of progress made, and the technologies developed and shared by our industry are making a real difference in keeping women and girls safe online, but there is still so much more work that needs to be done.

We look forward to continuing to work with industry, non-profits, and governments to protect all people from harm while keeping the Internet free and open for everyone.

Thank you for the opportunity to provide these comments to the committee. I look forward to answering questions.

3:40 p.m.

Conservative

The Chair Conservative Marilyn Gladu

Excellent.

We'll start our first round of questioning with Ms. Damoff.

3:40 p.m.

Liberal

Pam Damoff Liberal Oakville North—Burlington, ON

Thank you so much for appearing here today.

You mentioned algorithms. After our last meeting, one of my colleagues said, “Watch what happens when you type this in”, so we typed in “are blacks”. The second response that came up was “dumber”. That's not what I was looking for, but that's obviously an algorithm, right?

How do you combat that? Obviously someone has moved that up in the search engine for their own reasons. How do you deal with these kinds of things?

3:40 p.m.

Senior Counsel, Human and Civil Rights, Google

Malika Saada Saar

I want my colleagues to respond as well.

I think it is an ongoing challenge to create algorithms that are corrective and that do not reproduce attitudes, language, and behaviours of bias. That is an ongoing issue. It is certainly an issue that we are tackling around Conversation AI, to make sure that the algorithms being created are not reproducing bias and are, in fact, corrective.

3:40 p.m.

Liberal

Pam Damoff Liberal Oakville North—Burlington, ON

We heard at the last meeting about something called...“brigading”, I think, was the term, whereby actions by groups can move things up or down on a Google search. Are you familiar with that?

3:40 p.m.

Senior Counsel, Human and Civil Rights, Google

Malika Saada Saar

We are starting to become familiar with it. Again, this is why it's so important for this conversation to happen with the public. We rely on the community to flag these issues so that we are able to respond to make a corrective measure in the way the search is done.

3:40 p.m.

Liberal

Pam Damoff Liberal Oakville North—Burlington, ON

Okay.

Something else we've heard a lot about is the need for more digital literacy. I was quite pleased to hear you talk about how you're already doing some of that. Can you expand on the programs that you run and to whom they are available?

3:40 p.m.

Senior Counsel, Human and Civil Rights, Google

Malika Saada Saar

I can talk about what we do in the U.S. and I would love my colleagues to talk about what we are doing in Canada as well.

In the U.S. we have a program called Applejacks. We go into the schools and do this type of hands-on digital literacy with students as well as with teachers, because those of us who are parents know that sometimes our children are quicker and more intuitive around online behaviours and interactions. It's important to also work with teachers in helping to promote digital literacy and responsibility in how we use online products.

3:40 p.m.

Liberal

Pam Damoff Liberal Oakville North—Burlington, ON

I'd love to know what we're doing in Canada.

3:40 p.m.

Senior Policy Analyst, Google Canada

Lauren Skelly

In Canada we support many organizations that you have already heard from, such as MediaSmarts, the Canadian Centre for Child Protection, and the Missing Children Society of Canada. We also believe in empowering youth and students to become digital creators and citizens themselves by encouraging them to learn digital skills beyond just safety and privacy, such as how to code, or programming. We're supporting organizations such as Ladies Learning Code and Actua so that they understand how to use the Internet for good and how to empower themselves.

3:40 p.m.

Liberal

Pam Damoff Liberal Oakville North—Burlington, ON

When you talk about supporting them, do you mean you're giving them financial support?

3:45 p.m.

Senior Policy Analyst, Google Canada

Lauren Skelly

There are a number of things. Yes, we provide funding to MediaSmarts and the Missing Children Society of Canada. We provide foundational support.

Colin McKay, who is the head of government relations, is on the board of MediaSmarts. With Actua we have developed a program called Codemakers, which is a coding program that we've delivered to over 100,000 Canadians, a majority of them being girls.

3:45 p.m.

Liberal

Pam Damoff Liberal Oakville North—Burlington, ON

That's great.

We had a witness who talked about the creation of an e-safety commissioner, something they have in Australia right now. It would be someone within government who would be responsible for responding to complaints and basically being the coordinator for e-safety.

Are you familiar with that in Australia? What are your thoughts on it?

3:45 p.m.

Senior Policy Analyst, Google Canada

Lauren Skelly

I'm not familiar with an e-safety commissioner in Australia.

I think any work that the government can do to promote education among children, because I think that's probably the biggest key here, and to coordinate with industry would be welcome.

Malika, do you have any thoughts there?

3:45 p.m.

Senior Counsel, Human and Civil Rights, Google

Malika Saada Saar

Yes, I agree. It is absolutely important to create these spaces where we are having ongoing conversation, dialogue, and action around safety. As you all have seen, this is a dynamic space, so it is important to be in constant communication and action around what we see cropping up with online issues and our children and safety.

3:45 p.m.

Liberal

Pam Damoff Liberal Oakville North—Burlington, ON

Going back to your programming just for a moment, MediaSmarts was one of our best witnesses in terms of what they're doing and the information they could provide to us. Certainly, ongoing funding is an issue for a group like that.

I think any assistance that companies such as yours can provide to step it up.... I'm not asking for a commitment, but government can't provide all the funding for that. They were talking about the fact that it's a multi-billion-dollar industry in terms of algorithms.

Do you have any other suggestions on who could provide funding for groups such as MediaSmarts, besides Google?

3:45 p.m.

Senior Policy Analyst, Google Canada

Lauren Skelly

I think that tech companies writ large have to step up to these things together. There are obviously areas in which we compete, but this is not one of them. We all have a shared interest in keeping our children and families safe online. I think a more coordinated effort on our part is something that we can probably do a bit better on. I think that MediaSmarts is doing tremendous work in this country, and it deserves more support.

3:45 p.m.

Liberal

Pam Damoff Liberal Oakville North—Burlington, ON

Okay.

This is probably my last question, because I didn't start my time right away. One of our witnesses—and it was to do with Twitter, but it would apply to your algorithms as well—suggested having more women involved in the process of developing the algorithms. I'm wondering how many women Google has doing that, and if you do see the need for having more diversity in the people who are developing and monitoring the algorithms online.

3:45 p.m.

Senior Policy Analyst, Google Canada

Lauren Skelly

Yes. Google was one of the first companies to publicly publish their diversity numbers, because in technology we have a huge diversity problem with regard to women, people of colour, and immigrants. This is why we invest in organizations like Ladies Learning Code, Actua, and a number of organizations in the States.

The engineer who leads our search personalization team is a woman, which is great, but there's obviously more that we can do. I think that the impetus for investing in diversity is that we need to have our team reflect the users of our products. We need to understand what they're looking for and searching for, so we're in agreement there, but there's still a lot of work to do.

3:45 p.m.

Conservative

The Chair Conservative Marilyn Gladu

All right. That's your time.

We're going to go now to Ms. Harder for seven minutes.

3:45 p.m.

Conservative

Rachael Thomas Conservative Lethbridge, AB

Thank you very much.

I'm wondering if you can comment a little further on the economic impact that sexual violence has on women. Can you elaborate on that just a wee bit?

3:45 p.m.

Senior Counsel, Human and Civil Rights, Google

Malika Saada Saar

The reality is that our online lives are as real and relevant and meaningful as our physical lives, so as women and girls, our ability to feel our full participation and voice online is absolutely imperative. It is imperative economically as well.

That's why I was so stunned by that example, and by understanding that revenge porn has not only an emotional cost for a woman or girl but an economic one as well. We've heard this from women journalists who have been attacked online and have been trolled on Twitter. It makes them feel that they don't want to be online. Shutting down an online voice for them has real implications for their careers.

A young woman who simply wants to be able to be hired by a law firm and who wants to appear in the full recognition of the hard work that she's done in law school is not allowed to have that because her digital footprint is one of sexual images without her consent. There is that real economic consequence, and she's not alone in that. We heard that from so many women and girls at the beginning of their careers when they had that as their digital footprint.

I think, in general, when we hear about how women and girls are disproportionately cyberstalked and cyberharassed, we have to understand not just that it's emotionally wrong and dangerous, but also what it means for women's and girls' voices online.

3:50 p.m.

Conservative

Rachael Thomas Conservative Lethbridge, AB

Thank you.

I'm wondering if those who are listening at the table here can comment on whether algorithms could be used in a positive way to help mitigate the risk of underage children accessing pornography.