Evidence of meeting #37 for Status of Women in the 42nd Parliament, 1st Session. (The original version is on Parliament's site, as are the minutes.)

A recording is available from Parliament.

Witnesses

Jane Bailey  Professor, Faculty of Law, University of Ottawa, As an Individual
Matthew Johnson  Director of Education, MediaSmarts
Sandra Robinson  Instructor, Carleton University, As an Individual
Corinne Charette  Senior Assistant Deputy Minister, Spectrum, Information Technologies and Telecommunications, Department of Industry

4:25 p.m.

Professor, Faculty of Law, University of Ottawa, As an Individual

Jane Bailey

I'll jump in there. The other thing we have to be conscious of is that I don't think we actually want to be keeping kids from sexually explicit material. I think there's a lot of information that's necessary for kids to know about sexual activity and sexual health, which I distinguish from violent pornography. The idea of surveilling kids to prevent them from accessing content about sexuality would, I think, be a real problem, whether or not you can algorithmically distinguish between violent pornography—which in my view isn't just a problem for kids but a problem for adults too—and sexually explicit material, which is important for people to have access to. That's another problem.

Filters often over-filter, so that you don't have access to material that's important for sexual health, for example, or for developmentally appropriate sexual curiosity and self-definition. Again, going back to eGirls, the girls told us that surveillance is a problem, not a solution. I'm not sure that mechanisms that are surveilling kids or blocking kids are necessarily the approach we want to take, even if scientifically we actually could design the algorithms to do that fairly well.

December 5th, 2016 / 4:25 p.m.

Director of Education, MediaSmarts

Matthew Johnson

I would add as well that most of the well-documented negative effects of pornography are also found to be caused by other forms of sexualized media that aren't explicit. Most of the things that we see in youth that we are fairly confident are caused or influenced by pornography are also caused by sexualized advertising, sexualized music videos, and so on. In some ways, pornography really is just the most extreme end, but blocking it is only going to have a very limited effect on those issues. We really need to take a broader media literacy look at gender and sexuality and those related issues to be effective.

4:25 p.m.

Conservative

The Chair Conservative Marilyn Gladu

This has been a wonderful session. I want to thank both of our witnesses for being here and for enlightening us. Now we have even more questions, I'm sure, but we're out of time. Thanks for coming. We hope to see you again in the future.

I'm going to suspend while we switch panels.

4:30 p.m.

Conservative

The Chair Conservative Marilyn Gladu

I will call the meeting back to order. We are going to start our second panel discussion. I have a couple of announcements before we get to that.

I want to remind members that tomorrow is the National Day of Remembrance and Action on Violence against Women. You will remember that years ago the most savage, violent attack in Canada happened at École Polytechnique, and women engineers—I have to say that they were my sisters—were killed in an act of horrific gender violence. Please remember tomorrow. I know that we're not meeting because of votes in the evening, but I'm sure there will be other activities going on to remember that by.

The other thing I want to let you know is that when we were discussing our next study at committee and how we were going to move forward, we were going to have a bunch of the economic development area networks come and speak first. They've all declined to appear—amazing—so we have an opportunity instead to have one panel discussion with ISED, ESDC, and StatsCan, along with Status of Women. We could have that whole bunch come and talk to us in the first hour. For the second hour, the analysts have agreed to get our work plan ready by Friday and sent out to us, so that we can start talking about the work plan and at least agree on some of the initial meetings in the new year. Unless there's an objection, I'm going to suggest that we do that.

Without any further ado, we want to welcome our witnesses for this panel discussion. We have with us Sandra Robinson, who is an instructor at Carleton University. I will just let you know that Sandra wants to be sure she can hear your questions, so if you would ask them loudly and enunciate, that would be very good. We also have with us, from the Department of Industry, Corinne Charette, Senior Assistant Deputy Minister, Spectrum, Information Technologies and Telecommunications Sector.

Welcome, ladies. We are going to give each of you seven minutes for your remarks.

We'll start with you, Sandra.

4:30 p.m.

Dr. Sandra Robinson Instructor, Carleton University, As an Individual

Thanks to the committee for the invitation today. It's a pleasure and a privilege to appear before you.

I am a full-time faculty member at Carleton University in communication and media studies. I teach in the areas of media and gender; law, communication and culture; and, on the more technical side, algorithmic culture and data analytics. I'd like to share some concerns and considerations about the role of algorithms in the context of networked communications, such as those for social media and search, and, in particular, what is broadly conceived as automatic content curation by algorithms.

There's been some discussion of this already, obviously, so I'll focus on three things: defining algorithms and their operations; the trade-off between user interfaces and the increasing complexity of software; and, the impact of algorithmic content curation.

I want to be clear at the start about what I mean when I refer to an “algorithm”. In very simple terms and in the context of information systems and networked communication, it can be thought of as a series of computational steps or procedures that are carried out on information as an input to produce a particular output. For example, a search term typed in as input to “Google Search” produces an output in terms of search results.
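To make that definition concrete, here is a toy sketch of an algorithm as a series of steps from input to output. The code is hypothetical, for illustration only, and not any real search engine's implementation.

```python
# A toy "algorithm" in the sense just described: a fixed series of
# computational steps carried out on an input (a search term) to
# produce an output (an ordered list of results).

def toy_search(term, documents):
    term = term.lower()                                    # step 1: normalize the input
    matches = [d for d in documents if term in d.lower()]  # step 2: select matching items
    return sorted(matches, key=len)                        # step 3: order the output

print(toy_search("algorithm", ["Algorithms in media", "Cats", "One algorithm"]))
# -> ['One algorithm', 'Algorithms in media']
```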

Also, they don't operate in isolation. Algorithms are part of a complex network of digital devices, people, and processes constantly at work in our contemporary communication environment.

Embedded in any algorithmic system is a capacity for control over the information it analyzes, in that it curates or shapes the output, based on multiple factors or capacities the algorithm uses to generate the outputs. Again, in the case of Google Search, their suite of algorithms takes in the search term, personal search history, similar aggregated history, location, popularity, and many other factors to generate a particular set of filtered results for us.
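A multi-factor ranking of that kind can be sketched as follows; the signals and weights here are invented for illustration and are not Google's actual formula.

```python
# Hypothetical multi-signal ranking: each result is scored on several
# weighted factors, and the output is ordered by score.

def score(result, query, user):
    s = 3.0 * (1.0 if query.lower() in result["text"].lower() else 0.0)  # text relevance
    s += 2.0 * result["popularity"]                                      # aggregate popularity (0..1)
    s += 1.5 * (1.0 if result["topic"] in user["history"] else 0.0)      # personal search history
    s += 1.0 * (1.0 if result["region"] == user["region"] else 0.0)      # location
    return s

def rank(results, query, user):
    return sorted(results, key=lambda r: score(r, query, user), reverse=True)

results = [
    {"text": "Media literacy guide", "popularity": 0.9, "topic": "education", "region": "CA"},
    {"text": "Media gossip roundup", "popularity": 0.4, "topic": "celebrity", "region": "US"},
]
user = {"history": {"education"}, "region": "CA"}
print(rank(results, "media", user)[0]["text"])  # -> "Media literacy guide"
```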

The rather amazing thing about any of the algorithms incorporated into our contemporary communication is that these computational systems know much more about us than we know about them. They're often mysterious and non-transparent, as has been mentioned: a black box that governs our information landscape, persistently at work to shape information flows, determining what information we see and in what order we see it, and then nudging us towards certain actions by organizing our choices.

Algorithms do govern content automatically, but they do so because they have been designed that way. The capacity of algorithms to curate or sort information has been designed to sit behind the user interface of our popular search and social media applications, so we don't directly interact with the algorithm. We can sometimes see curation and filtering of information happening, but it's not entirely clear how it is happening. Our side of the interaction has been simplified to things like swiping, tapping, and clicking icons in our mobile apps: highly simplified behaviour.

The extraordinary complexity of algorithms in automated curation is thus deeply hidden in the software and digital infrastructure necessary for networked communication, and this leads to a sort of distancing effect between us as human users and the complexity in the systems we are interacting with, such as Google Search, for example. It becomes difficult for us to connect our simple button choices or search queries to any wider effect. We don't necessarily think that our own individual actions are contributing to the ranking and sorting of other information searches or the popularity of a particular newsfeed post.

Social media companies tell us that reaction buttons like “Like” and “Don't Like”, or love or angry icons, are a way to give feedback to other users, stories, and posts, and to connect with the issues, ideas, and people we care about, but this effectively trains us to input information that feeds the algorithm so that it can generate its output, including ranking posts and shares based on these measures.

I was recently reminded of the powerful ways algorithmic curation happens. In the context of a group of Facebook users making a few original and offensive posts, the situation quickly escalated over a week, and hundreds of reactions or clicks on all those “like”, “angry”, or “haha” buttons continually moved that cyber-bullying incident up in people's newsfeeds. As Facebook itself notes on the relevancy score of its newsfeed algorithm, “we will use any Reaction similar to a Like to infer that you want to see more of that type of content”. These simple actions literally feed the algorithm and drive up the issue.
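That feedback loop can be sketched in a few lines; the relevancy scoring here is invented for illustration and is not Facebook's actual newsfeed code.

```python
# Every reaction, whatever its emotion, is treated as interest and
# raises a post's relevancy, which pushes it higher in the feed.

REACTION_WEIGHT = {"like": 1.0, "love": 1.2, "haha": 1.0, "angry": 1.0}

def react(post, reaction):
    # Note that "angry" counts as positively as "like" does: outrage
    # at an offensive post still amplifies its reach.
    post["relevancy"] += REACTION_WEIGHT[reaction]

def build_feed(posts):
    return sorted(posts, key=lambda p: p["relevancy"], reverse=True)

posts = [{"id": "offensive post", "relevancy": 0.0}, {"id": "other post", "relevancy": 0.0}]
for r in ["angry", "haha", "angry"]:
    react(posts[0], r)
print([p["id"] for p in build_feed(posts)])  # the offensive post now ranks first
```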

I also find Google's auto-complete algorithm even more troubling. While Google likes to make grand public assurances that their auto-complete algorithm—the drop-down of suggestions you see when you're searching—is completely objective and won't link personal names with offensive auto-completes, it still drives users to problematic content via its complex and comprehensive knowledge graph.

Google's knowledge graph combines search results in one page with images, site links, stories, and so on, but it can still surface information that is problematic. For example, the Google auto-complete algorithm still points us to details of the late Ms. Rehtaeh Parsons' horrific case that were propagated by Internet trolls and continue to feature in Google's “searches related to” suggestions that appear at the bottom of the search page, pointing to images and other problematic content.
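The mechanism behind such suggestions can be sketched simply; this assumes a bare popularity count, not Google's actual implementation.

```python
# Popularity-driven auto-complete: suggestions are past queries that
# share a prefix, ranked by how often they have been searched. The
# loop to notice: heavily searched queries get suggested, attract more
# clicks, and so become more heavily searched still.

from collections import Counter

query_log = Counter()  # how often each full query has been searched

def record(query):
    query_log[query.lower()] += 1

def autocomplete(prefix, n=3):
    prefix = prefix.lower()
    hits = [(q, c) for q, c in query_log.items() if q.startswith(prefix)]
    return [q for q, _ in sorted(hits, key=lambda x: -x[1])[:n]]

for q in ["cyberbullying video"] * 5 + ["cyberbullying help"] * 2:
    record(q)
print(autocomplete("cyber"))  # -> ['cyberbullying video', 'cyberbullying help']
```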

Recent changes to automated curation techniques point to our need for sustained efforts to build digital literacy skills, as discussed earlier, that steer young people toward thinking more critically and being ethically minded about what's going on. I would argue that we also need, then, a specific effort to educate young people about what algorithms are: not in their mathematical complexity, but in how they operate through these simplified user actions that young people are so eager to participate in.

Visibility, publicity, shares, and various Snapchat scores are part of the new social accounting that young people value, and it's driven by an increasingly subtle yet complex infrastructure: an algorithmic milieu of communication and control that leaves very little in the hands of users.

Algorithmic sorting, ranking, and archiving is persistent and ceaseless. It churns away continuously as social media and search users navigate, click, view, search, post, share, retweet, @mention, hashtag, and react. To us as users, these actions and their immediate results feel dynamic and vital. At its best, this system affords us efficiencies in information retrieval and communication; at its worst, it amplifies some of our most problematic and prejudicial expression, action, and representation online.

Thank you. I look forward to your questions.

4:40 p.m.

Conservative

The Chair Conservative Marilyn Gladu

That was excellent.

Corinne, you have seven minutes.

4:40 p.m.

Corinne Charette Senior Assistant Deputy Minister, Spectrum, Information Technologies and Telecommunications, Department of Industry

Thank you very much, Chair. Thank you for inviting Innovation, Science and Economic Development to address the issue of big data analytics and its applications to algorithm-based content curation, to the detriment, in some cases, of young girls and women.

This is an important issue for me not only because of its impact on my work, but also because I am a woman engineer who was in Montreal during the events at the Polytechnique, which were devastating for me.

Following graduation as an electrical engineer, I was very fortunate to have many great roles in technology with a lot of leading organizations, including IBM, KPMG, and FINTRAC, our money-laundering detection agency. I was the government CIO until I took up my current post as the SADM of SITT. Over 30 years of working in technology, I've seen the adoption of many great technology trends, including the Internet and big data analytics.

Now, as senior assistant deputy minister, my job is to use key tools (policies, programs, regulations, and research) to advance Canada's digital economy for all Canadians.

Briefly, my sector is responsible for a wide range of programs, including the radio frequency spectrum, helping to maintain the security of our critical telecommunications infrastructure, and building trust and confidence in the digital economy. We safeguard the privacy of Canadians through two key pieces of legislation: the Personal Information Protection and Electronic Documents Act, or PIPEDA, Canada's private sector privacy legislation, and Canada's anti-spam legislation. In my capacity, I can affirm that the Government of Canada is committed to seizing the benefits of big data analytics through the discovery, interpretation, and communication of meaningful patterns in data, while protecting the privacy of Canadians.

Today, I would like to share with the committee two linked ideas about predictive analytics and algorithm-based content curation.

The first relates to the personal stewardship of our digital information, and the second is about the Government of Canada's commitment to building trust and confidence in the digital economy.

To begin, I would like to note that what citizens, business, and government do online generates a massive amount of data about our world and about us as individuals.

Every day, businesses and consumers generate trillions of gigabytes of data, structured and unstructured, in texts, videos, and images. Data is collected every time someone uses their mobile device, checks their GPS, makes a purchase electronically, and so on. This data can provide beneficial insights on developing new products and services, predicting preferences of individuals, and guiding individualized marketing.

This is a tremendous opportunity for Canadian innovation. According to International Data Corporation, the big data analytics market is expected to be worth more than $187 billion in 2019. The amount of data available to analyze will double very quickly and progressively; however, there are growing concerns about whether the benefits of big data analytics could be overshadowed by the accompanying pitfalls and risks.

Studies demonstrating biased results and decisions that impact whether people can access, for example, higher education or employment opportunities, are increasing. We do need to better understand how biases towards individuals are generated and how we can guard against this. This phenomenon may be explained in part by algorithms that are poorly designed—poorly from a user's perspective—or data that is poorly selected, incorrect, or not truly representative of a population.

It is easy to see that spotty data and mediocre algorithms could lead to poor predictive analysis, which can be very detrimental to individuals.
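A deliberately trivial sketch, with invented data and no resemblance to any real screening tool, shows how unrepresentative data alone yields biased output.

```python
# A "model" that simply predicts the most common outcome in its
# training data. Trained on a skewed history, it reproduces the skew.

def train_majority(labels):
    return max(set(labels), key=labels.count)

biased_history = ["reject"] * 90 + ["accept"] * 10  # unrepresentative sample
model = train_majority(biased_history)
print(model)  # -> "reject", for every future applicant, however qualified
```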

I share my colleague's comment that step one in terms of risk mitigation involves better digital literacy for all Canadians as an increasingly important tool to ensure that we know what bread crumbs we are leaving behind online. It can give Canadians the knowledge and tools to understand how to use the Internet and technology effectively, critically, and responsibly.

Personal stewardship of our online information can help all Canadians, especially young women and girls, but it needs to also be supported by my second point, which is about the frameworks that preserve our privacy. Now I will talk about PIPEDA. Canada's federal private sector privacy law, PIPEDA, sets out a flexible, principles-based regulatory framework for the protection of individual privacy.

The principles set out in PIPEDA are technologically neutral and are based on the idea that individuals should have a degree of control over what information businesses collect about them and what they use it for, regardless of the circumstances.

Of course, some information, such as demographics, geographic location, etc., can be a determinant of targeted advertising. This is the data that these algorithms use to produce these recommendations, but it can have significant implications for the privacy of individuals, especially given the lack of transparency of the privacy policies of many online sites, and the lack of awareness among young people—and also older Canadians—about the data that they are freely sharing.

We need to strike the right balance between privacy and the economic opportunities resulting from the collection of personal information.

In conclusion, Innovation, Science and Economic Development Canada works with a number of other government departments to promote the use of big data analytics and other digital technologies by the government and the private sector.

We need to promote an increased understanding of both the opportunities and the risks of our digital world, of how our data can be used, and of the privacy obligations of predictive analytics users, so that the benefits can be enjoyed by all, especially young women and girls.

I want to thank you for making this issue a part of your important work.

Thank you.

4:45 p.m.

Conservative

The Chair Conservative Marilyn Gladu

Thank you very much.

We will now begin the question and answer period.

You have the floor, Ms. Ludwig.

You have seven minutes.

4:45 p.m.

Liberal

Karen Ludwig Liberal New Brunswick Southwest, NB

Thank you.

Thank you for your presentations.

I want to share a bit of my personal background. When I was working on my Ph.D. in education, one of the areas I focused on was technology. When I was teaching curriculum development to teachers, what I found was that it was very hard at times to get the message across—this is for Dr. Robinson—about the use of technology in the classroom and its impact, when we have so many teachers in the K-to-12 system whose original introduction to education never included technology; it just kind of lands in the classroom.

What recommendation would you give to our committee about how we could help teachers, for example, to understand the implications of technology in the K-to-12 system? Also, where should that be incorporated? Should it be in a sex education class, or in a course in general about the implications of algorithms?

4:45 p.m.

Instructor, Carleton University, As an Individual

Dr. Sandra Robinson

That's a great question. Thank you.

From my perspective, it's interesting because I catch up with youth in their first year of university. I think I get a sense, then, of that lack of digital literacy. They can Snapchat the heck out of the world, but they are struggling to understand how some of the pieces make that technology happen.

For teachers, perhaps, in those years prior to their students bursting out onto the world, I think we need to make a concerted effort. One recommendation is to organize appropriate training for teachers. I even think that teachers who feel they have a facility with technology should maybe be marked out as champions within their schools or within their program, to help lead and to encourage their colleagues. People fear technology and, in most studies, women more so than men. I think there has to be a very safe and encouraging environment to consider what kind of participation can happen.

I think we need to tackle not just the surface level of software applications, the how do we use this.... We've come full circle with the Internet, and now it's time to come back and say, “Hang on a second.” These simple user interfaces are masking a very complex ecosystem of software, and we can't escape trying to make an effort to understand it and then to share that understanding with youth and among ourselves, pulling each other into the 21st century. I think it absolutely has to happen before students reach the upper level of their school training, in high school and whatnot. I think that in some ways it's never too early, given where you see young people and kids with cellphones.

4:45 p.m.

Liberal

Karen Ludwig Liberal New Brunswick Southwest, NB

Thank you.

One of the things I want to follow up on, Dr. Robinson, is that you mentioned “champions”, as in the champions of technology. You're probably well aware of adoption theories for taking and incorporating technology. The early adopters are the ones who are always rewarded, particularly in education. Those who hang back and ask why or what's in it for them are seen as the resisters. Thank you for that.

Marshall McLuhan is well known for coining “the medium is the message”. Listening to the presentations today, I was certainly reminded of that. For example, how a communication is conveyed can be more important than its content. In so much of what we've heard regarding cyber-bullying, we have anonymous abusers. From what I've learned today, I think the algorithms themselves set up the system for, as Ms. Charette mentioned, targeted marketing. In many respects, it almost creates an environment for targeted victims. Would either one of you agree?

4:50 p.m.

Senior Assistant Deputy Minister, Spectrum, Information Technologies and Telecommunications, Department of Industry

Corinne Charette

I would say that it's the amount of information freely available on the Internet that can be so easily aggregated by using the most basic of tools, such as search engines of any kind, that makes it easy for malicious actors to aggregate information on individuals or groups of individuals and to exploit that. It's not always algorithmically based.

Fundamentally, the algorithms will not work unless the data is available online. Really, the amount of data that most people deposit online wittingly—but mostly unwittingly—every single day is staggering. Over time, there are very many bread crumbs, and the search engines and other tools seek them out and aggregate them.
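In skeleton form, with entirely invented data, that aggregation is just a join of scattered records on a common identifier.

```python
# Small, separately harmless "bread crumbs" from different services,
# keyed by the same identifier, combine into a revealing profile.

crumbs = [
    {"user": "a@example.com", "source": "store",  "data": "bought running shoes"},
    {"user": "a@example.com", "source": "maps",   "data": "morning route near Elm St"},
    {"user": "a@example.com", "source": "social", "data": "posts daily at 7 a.m."},
]

def aggregate(crumbs):
    profiles = {}
    for c in crumbs:
        profiles.setdefault(c["user"], []).append(f'{c["source"]}: {c["data"]}')
    return profiles

print(aggregate(crumbs))  # one profile assembled from scattered traces
```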

4:50 p.m.

Liberal

Karen Ludwig Liberal New Brunswick Southwest, NB

If I may ask this, then, in looking at the bread crumbs, let's say there's a family of five. They all share the same computer. There is one log-in, or potentially no log-in, and maybe there are varying ages within the family. Is it quite likely, then, that if a child goes off to do homework, the child could, unknown to the parents or anyone else in the home, be confronted with images and messages that were never intended, given whatever interests there are in that home? Because of the bread crumbs that are left, could that be presented on the screen to whoever is using that shared unit?

4:50 p.m.

Senior Assistant Deputy Minister, Spectrum, Information Technologies and Telecommunications, Department of Industry

Corinne Charette

There's no doubt that malicious actors find ways to infiltrate networks, home networks, and home computers and are going to find children, as well as adults, using those computers. Today, most households have more than one Internet-enabled device. They have at least one cellphone, probably some kind of tablet or computer for the kids to do their homework on—a lot of homework is online now—and they may have other Internet-enabled devices, including their home thermostat, their smart TV, and so on. All of those devices will basically stream data out of that home multiple times during the day, based on who's using what.

4:50 p.m.

Liberal

Karen Ludwig Liberal New Brunswick Southwest, NB

On that, if I may add, here's my next question. As a committee that has been studying cyber-bullying and violence against young women and girls, I'm sure we've all been doing our own Google searches. Are we also leaving bread crumbs about violence against women and violence against girls because of sites we may have been researching? Are we sending those bread crumbs out there as well, to be added to our profiles?

4:50 p.m.

Senior Assistant Deputy Minister, Spectrum, Information Technologies and Telecommunications, Department of Industry

Corinne Charette

I'll let Sandra do that one.

4:50 p.m.

Instructor, Carleton University, As an Individual

Dr. Sandra Robinson

I think that's something that I'm glad you've raised, because I—

4:50 p.m.

Conservative

The Chair Conservative Marilyn Gladu

You're out of time. I'm sorry.

I'm going to have to go to Ms. Harder for seven minutes.

4:50 p.m.

Conservative

Rachael Thomas Conservative Lethbridge, AB

If you can do so in under a minute, you're welcome to finish that thought.

4:50 p.m.

Instructor, Carleton University, As an Individual

Dr. Sandra Robinson

Sure. Even when we search for things and we have the best of intentions, our searches are ramping up and feeding the algorithms. That's the answer to that, unfortunately.

4:50 p.m.

Conservative

Rachael Thomas Conservative Lethbridge, AB

Okay.

I've asked witnesses this question before, and I'll ask it again, because I'm interested in your thoughts. With regard to the use of algorithms, you're using the word “curate”. I actually really like that word; I think it's a good one. Or there's “steer traffic”, or whatever you want to say. I asked the other witnesses this question: could algorithms be used, then, in order to prevent access to pornography for those who are under age in the same way that algorithms are used on, say, Twitter? Algorithms are used to detect the word “slut” or the word “bitch”, etc., right? In the same way, could algorithms be used to positively steer young viewers?

4:50 p.m.

Senior Assistant Deputy Minister, Spectrum, Information Technologies and Telecommunications, Department of Industry

Corinne Charette

The problem with that is that the Internet doesn't distinguish the age of the user online unless you have some form of search engine that requires your age to be disclosed. You might have parental filters on your home Internet connection that would prevent your youngster from getting to these sites from your home PC, just like there are filters in business and government that prevent users from going to any malicious sites and so on. Unfortunately, a search is generated by an anonymous user.
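A minimal sketch of the word-detection approach raised in the question, assuming a simple blocklist, looks like the following; it is not any platform's actual system, and the second example shows the over-blocking problem raised in the earlier panel.

```python
# Block a page if it contains a flagged word. The second example shows
# how blunt this is: legitimate health, news, or research content can
# trip the same filter.

BLOCKLIST = {"slut", "porn"}

def naive_filter(page_text):
    words = (w.strip(".,!?") for w in page_text.lower().split())
    return any(w in BLOCKLIST for w in words)

print(naive_filter("Sexual health resources for teens"))  # False: passes
print(naive_filter("Research report on porn and youth"))  # True: blocked
```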

4:55 p.m.

Conservative

Rachael Thomas Conservative Lethbridge, AB

Okay.

Do you have any additional thoughts?

4:55 p.m.

Instructor, Carleton University, As an Individual

Dr. Sandra Robinson

I would agree. I think the other thing is that because there are so many mobile devices now, the difficulty with that sort of “one fix” is that the platforms work quite differently—the mobile technology platform versus the browser-based desktop or laptop—so it's difficult for any single measure to serve as an extensive utility to prevent access.

4:55 p.m.

Conservative

Rachael Thomas Conservative Lethbridge, AB

Thank you.

Here's another question, then, for each of you to answer. I guess the basic question is this: could legislation be brought in to build some parameters around how algorithms are used, in an effort to essentially safeguard our young people in particular? That would be my main interest. Could that be done?