Evidence of meeting #37 for Status of Women in the 42nd Parliament, 1st Session. (The original version is on Parliament’s site, as are the minutes.)

A recording is available from Parliament.

Also speaking

Jane Bailey  Professor, Faculty of Law, University of Ottawa, As an Individual
Matthew Johnson  Director of Education, MediaSmarts
Sandra Robinson  Instructor, Carleton University, As an Individual
Corinne Charette  Senior Assistant Deputy Minister, Spectrum, Information Technologies and Telecommunications, Department of Industry

4:05 p.m.

Professor, Faculty of Law, University of Ottawa, As an Individual

Jane Bailey

Do you want to go first, Matthew?

4:05 p.m.

Liberal

Pam Damoff Liberal Oakville North—Burlington, ON

Because that seems to be one of the things that would help. If you even knew this was going on, you'd be looking at it, whether in search engines or your news feed, with a different lens than we are now.

4:10 p.m.

Director of Education, MediaSmarts

Matthew Johnson

There are a number of things that I think can be done at the federal level. One of those is supporting the development of digital literacy resources. Most of the provinces that are adopting digital literacy in the curriculum don't necessarily have resources for teachers to use, and certainly that can help them. One of the big risks in digital literacy education is “silo-ization”; that is, efforts and energy are being wasted because the same wheel is being reinvented 13 times across the country. I think federal efforts can certainly help to prevent that.

Also, certainly, there is making the public in general more aware of digital literacy as an issue, and of digital literacy skills as life skills that all of us need at all stages of life. We also need to incorporate it in early childhood education, making sure, again, that it begins as soon as young people are using digital devices.

4:10 p.m.

Liberal

Pam Damoff Liberal Oakville North—Burlington, ON

Thank you.

Jane.

4:10 p.m.

Professor, Faculty of Law, University of Ottawa, As an Individual

Jane Bailey

The other thing at the federal level, of course, is that we have the Office of the Privacy Commissioner of Canada. We could think about giving them more power, about giving them real enforcement authority and the authority to deal with algorithmic curation kinds of issues as well. Compared to many jurisdictions in the world, our PIPEDA is an important piece of legislation in terms of public control over private organizations and what they do with data. From a federal perspective, strengthening the powers and the jurisdiction of that body is certainly something, I think, that should be on the table.

4:10 p.m.

Liberal

Pam Damoff Liberal Oakville North—Burlington, ON

Thank you.

4:10 p.m.

Conservative

The Chair Conservative Marilyn Gladu

We'll go to Ms. Harder for seven minutes. She'll be sharing her time with Ms. Vecchio.

December 5th, 2016 / 4:10 p.m.

Conservative

Rachael Thomas Conservative Lethbridge, AB

Thank you so much.

I apologize for not being here for your presentations. Unfortunately, I was called out for a moment. I am certainly one of the strongest advocates for having you at the table today, and I appreciate your time.

That said, I have a few questions for you.

My first question is for each of you, if you don't mind. What can be done to prevent pornographic images or videos from coming up on my newsfeed when I'm just searching? My nieces search innocent things all the time and come up with crazy images that pop up on the screen. I understand that my colleague Karen has asked a similar question, but can you expand on this? What can be done to prevent these things from happening to our children?

4:10 p.m.

Director of Education, MediaSmarts

Matthew Johnson

Technically, we have a number of easy steps that can be partially effective.

If you're talking on an individual level, almost every search engine has a “SafeSearch” setting. There are also content filters available; most ISPs make those available, and there are commercial filter programs as well. These are never going to be 100% effective, particularly when you broaden your definition of inappropriate content beyond just nudity. There are certainly things that we recommend, especially free options like the SafeSearch setting.

This is one of the reasons why we approach digital literacy in a holistic way. This is why things like authentication and search skills address content issues as well. One of the best ways to avoid finding this content is having sufficient search skills: being able to craft a search string that narrows results to exactly what you're looking for and screens out things you don't want.

There are certainly also steps that you can take to avoid having a profile built. If, for whatever reason, you watch a video that has inappropriate content algorithmically connected with it, and no profile of you is being built, it's going to have less of an effect. There are measures like using search engines that don't collect data on you, using an IP proxy, using the incognito mode of a browser in some cases, or activating the browser's do-not-track function.

All of those, again, are incomplete on their own. Again, that's why we say that you can never entirely shield young people. That's why we have to talk about these issues. Those are all effective steps that you can take to reduce the odds of those things happening.
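[As an illustrative aside: the individual-level measures described above, SafeSearch settings and narrowed search strings, can be sketched in code. The `safe=active` parameter and the `-word` exclusion syntax used here are assumptions modelled on common search-engine conventions, not any one provider's documented API.]

```python
from urllib.parse import urlencode

def build_search_url(base, terms, exclude=(), safe=True):
    """Build a search URL that narrows results.

    The exclusion syntax ("-word") and the safe-search parameter
    ("safe=active") follow common search-engine conventions and may
    differ between providers; treat both as illustrative assumptions.
    """
    query = " ".join(terms) + "".join(f" -{w}" for w in exclude)
    params = {"q": query}
    if safe:
        # Ask the engine to apply its own SafeSearch filtering.
        params["safe"] = "active"
    return base + "?" + urlencode(params)

# A narrowed, SafeSearch-enabled query for an innocent search.
url = build_search_url(
    "https://example.com/search",
    ["science", "fair", "projects"],
    exclude=["adult"],
)
```

Neither measure is complete on its own, which is consistent with the point above that filtering only reduces, and never eliminates, exposure.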

4:15 p.m.

Conservative

Rachael Thomas Conservative Lethbridge, AB

Jane, your brief comments, please.

4:15 p.m.

Professor, Faculty of Law, University of Ottawa, As an Individual

Jane Bailey

The sorts of things we're doing as damage control, which Matthew gave us a really good rundown on, have other implications, but aside from that, I think my answer to everything, really, as you may remember—I said this the last time too—is to end patriarchy, and then we won't have to see unwanted pornography. There are root causes here that are the issue. We can and we have to do damage control, because we want to educate our kids and we want to protect our kids. Kids need to know how to deal with this content, and they need to be able to think critically about this content as well.

At the end of the day, if violent pornography is an issue, that's a systemic issue. We have to take care of misogyny.

4:15 p.m.

Conservative

Rachael Thomas Conservative Lethbridge, AB

In your estimation, is it an issue?

4:15 p.m.

Professor, Faculty of Law, University of Ottawa, As an Individual

Jane Bailey

Is violent pornography an issue?

4:15 p.m.

Conservative

Rachael Thomas Conservative Lethbridge, AB

Yes.

4:15 p.m.

Professor, Faculty of Law, University of Ottawa, As an Individual

Jane Bailey

Certainly.

4:15 p.m.

Conservative

Rachael Thomas Conservative Lethbridge, AB

What exactly is the issue, if you were to get to the heart of it? You have 30 seconds.

4:15 p.m.

Professor, Faculty of Law, University of Ottawa, As an Individual

Jane Bailey

The heart of the issue is misogyny. The heart of the issue is representation of rape or sexual assault as sex. We shouldn't be confused about that. That should not be confused. That's the heart of it. It's overlain by all kinds of other intersections, such as racism, classism, and ableism. It's overlaid by all of those things, but if the heart of an industry is to make money from enacting sexual violence against women, then we have some hard questions to ask ourselves about what society we're living in and what kinds of industries we're supporting.

4:15 p.m.

Conservative

The Chair Conservative Marilyn Gladu

All right.

We're going to Ms. Sahota for five minutes.

4:15 p.m.

Liberal

Ruby Sahota Liberal Brampton North, ON

I want to get a bit more information about upvoting content and downvoting content. Those are terms I'm not familiar with. Can you break it down for me in terms of popular sites that we would search on and how we would do such things?

4:15 p.m.

Director of Education, MediaSmarts

Matthew Johnson

I'm using upvoting and downvoting in a generic sense. I'm using “upvoting” to mean taking any action that boosts content, that spreads it, and particularly that makes it seem more relevant to the algorithm, and “downvoting” to mean anything that does the opposite, that limits the reach or makes it seem less relevant.

Each platform does that in a different way. An easy example would be “liking” something on Facebook, which is a way of upvoting it, because in future, things that you “like” will be seen as more relevant to you. Facebook is more likely to show you that content if you've selected “most relevant” rather than “most recent”. You do have the option on Facebook of toggling to a straight timeline, but the default is to be shown what the algorithm deems relevant to you.

The reddit platform, for example, has pure upvoting and downvoting: each user can literally vote a post up, boosting its visibility, or vote it down, suppressing it. In the case of reddit, that's also a big issue in terms of what appears on the front page of the site, which is to say what you see when you just go to reddit.com rather than to one of the many subreddits. That is something we know hate groups have manipulated. They have made an effort to get certain hateful messages onto the home page by getting enough people to upvote them, and when they've decided to target particular critics, they have tactically downvoted them in the same way.
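[To illustrate the generic mechanism described above: a minimal relevance-ranking sketch in which upvotes push an item up a feed and downvotes bury it. The names and the net-score rule are hypothetical, not any platform's actual ranking algorithm.]

```python
def rank_feed(items, votes):
    """Order items by net vote score, highest first.

    `votes` maps item -> (upvotes, downvotes). Coordinated upvoting
    pushes an item toward the top of the feed; tactical downvoting
    buries it, which is the manipulation pattern described above.
    """
    def score(item):
        up, down = votes.get(item, (0, 0))
        return up - down
    return sorted(items, key=score, reverse=True)

feed = rank_feed(
    ["post-a", "post-b", "post-c"],
    {"post-a": (3, 1), "post-b": (10, 0), "post-c": (2, 9)},
)
# post-b ranks first (net +10); post-c is buried (net -7)
```

Real platforms weight many more signals than raw votes, but the core dynamic, aggregate user actions determining visibility, is the same.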

4:15 p.m.

Liberal

Ruby Sahota Liberal Brampton North, ON

I've heard about the delivery issue that you were talking about in certain areas in the U.S., that being a case that was brought up. What disturbed me a bit in the presentation was the effect on certain outcomes of a person's life, not just with regard to the content that you're viewing currently, but in regard to the long-term effects that this could possibly have.

Can you shed more light on how somebody could consciously upvote or downvote something to perhaps get rid of something like this, when it is so subconscious and maybe no one is doing that themselves on the computer intentionally...? It's very difficult for me to understand how these people are getting targeted, especially if they're so young. You're talking about criminal record checks versus law school advertisements. How would that ever end up happening? Would it just be the demographics of where the person lives?

4:20 p.m.

Professor, Faculty of Law, University of Ottawa, As an Individual

Jane Bailey

Yes, that's the thing: it's a kind of tragedy-of-the-commons problem. When you and I make individual decisions in individual situations, we think we're fine because we've agreed to what we've done, but the implications of our choices can become part of what aggregates. It's that algorithmic sort of aggregation.

I'll give you an example from Latanya Sweeney's research. She did research in the United States which showed that black-sounding names were more likely to have pop-up advertising for services that allowed you to get a criminal record check than white-sounding names were. The advertising itself reflected embedded prejudice.

Then the question became, how did that happen? The search engine said, “Well, it's not us; we didn't program in a prejudice.” They said the algorithm must have been reflecting societal prejudice: in the databases being searched, people were more likely to be searching for a criminal record check on a black-sounding name than on a white-sounding name. So they put it back as a reflection of consumers.

Part of the answer is that we won't necessarily know, but it's a powerful indicator of how it can happen, whether or not.... The algorithm curates our aggregate bias and our aggregate discrimination and feeds it back to us in ways that obviously have disparate impacts on members of marginalized communities, impacts that are not felt by members of the majority. It's complicated.
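[A toy model of the feedback loop described above, in which a delivery system that always exploits historical click counts feeds aggregate behaviour back as future ad delivery. This is an editor's illustrative sketch under stated assumptions; the group labels, ad names, and counts are hypothetical and this is not the mechanism Sweeney's study itself measured.]

```python
def pick_ad(group, click_history, ads):
    """Return the ad with the highest recorded click count for a group.

    Always exploiting past clicks feeds aggregate behaviour straight
    back into future delivery, so an early statistical skew in the
    data becomes self-reinforcing for everyone in that group.
    """
    return max(ads, key=lambda ad: click_history.get(group, {}).get(ad, 0))

# Hypothetical history in which searches on one group of names were
# more often followed by clicks on record-check ads.
history = {
    "group-a": {"record-check": 7, "generic": 3},
    "group-b": {"record-check": 2, "generic": 8},
}
ads = ["record-check", "generic"]
```

No individual here chose a biased outcome; the disparity emerges purely from the aggregation, which is the point being made above.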

4:20 p.m.

Conservative

The Chair Conservative Marilyn Gladu

All right. That's your time.

We're going to go to Ms. Harder for five minutes.

4:20 p.m.

Conservative

Rachael Thomas Conservative Lethbridge, AB

I'm wondering if it's possible to change algorithms to pick up on buzzwords. For example, Twitter once picked up the buzzword “slut”, and said that it was a negative word, that anytime it was used, it was negative. Then they began to realize that most of the time when people were making use of the word “slut”, it wasn't necessarily negative, so they changed their algorithm.

My question, then, is this: if algorithms can be changed in that way in order to guide us as users, would it be possible to change them so as to prevent underage Internet users from having access to pornography?

4:20 p.m.

Director of Education, MediaSmarts

Matthew Johnson

I think that's a question that's a little more technical, if I can speak for Jane, than either of us is qualified to answer.

What I will say is that I'm not certain that there is any power in the world that will prevent teenagers from accessing pornography. I don't mean to be flippant with that. Certainly, there are tools that can be used, and I know there are tools that are being used. This is being discussed in the U.K. right now. I believe legislation has just been passed on that very topic. There's a lot of discussion going on, in terms of Internet safety in the U.K., about how to actually make this work and whether it is worth doing.

Certainly, there are ways of identifying or guessing people's ages. That's a big part of what algorithms do, because part of your profile is how old you are, but there are a lot of technical challenges to something like that. Like all other blocking or filtering tools, it's never going to be 100% effective, and there's a very good chance that it will result in a lot of false positives, like most blocking and filtering tools do. At the very best, it would be a complement to a digital and media literacy approach to pornography.
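[The false-positive problem mentioned above can be illustrated with a toy keyword filter. The blocked-word list and example strings are hypothetical; the over-blocking shown, flagging innocent text that merely contains a blocked substring, is the classic failure mode of naive blocking and filtering tools.]

```python
def naive_filter(text, blocked=("sex",)):
    """Flag text if any blocked word appears as a substring.

    Substring matching over-blocks: innocent words that merely
    contain a blocked string are caught too (a false positive),
    which is why such tools are never 100% effective.
    """
    lowered = text.lower()
    return any(word in lowered for word in blocked)

flagged_innocent = naive_filter("Sussex county fair")   # True: a false positive
passed = naive_filter("Local news roundup")             # False: passes
```

This is why the testimony above frames filtering as, at best, a complement to digital and media literacy education rather than a substitute for it.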