Evidence of meeting #21 for Status of Women in the 42nd Parliament, 1st Session. (The original version is on Parliament's site, as are the minutes.)

A recording is available from Parliament.

Also speaking

Rena Bivens  Assistant Professor, School of Journalism and Communication, Carleton University, As an Individual
Valerie Steeves  Associate Professor, Department of Criminology, University of Ottawa, As an Individual
Angela MacDougall  Executive Director, Battered Women's Support Services
Rona Amiri  Violence Prevention Coordinator, Battered Women's Support Services
Dee Dooley  Youth Programs Coordinator, YWCA Halifax

3:45 p.m.

Conservative

The Chair Conservative Marilyn Gladu

Good afternoon, colleagues. I hope you had a good summer.

Welcome back. We're going to get right to work on our cyber-bullying study.

We're pleased today to welcome Rena Bivens, assistant professor, from the School of Journalism and Communication at Carleton University, and Valerie Steeves, associate professor, from the Department of Criminology at the University of Ottawa.

We are excited to see you. I believe you both have 10 minutes for opening statements. Then we'll proceed to our questions.

Rena, we'll start with you.

3:45 p.m.

Rena Bivens Assistant Professor, School of Journalism and Communication, Carleton University, As an Individual

What I will share today is informed by a number of different research projects, ranging from a study on the use of social media by anti-violence non-profits to investigations of gender-related programming practices in popular social media platforms and in mobile phone apps designed to prevent sexual violence.

One issue I've encountered relates to terminology. Many terms in this area, as you know, have histories, and this baggage enters the room when we use the term. For some, “violence against women” evokes the deep-seated racism, ableism, heterosexism, and cissexism that taint early iterations of the women's movement. For others, “gender-based violence” can be problematic because it has been employed by some as a way of neutralizing the differences between men's and women's experiences of sexual violence.

In my research with non-profits, I've heard that some organizations prefer to avoid umbrella terms altogether. Instead, they narrowly focus on what they are doing at that particular moment. It may be transmisogyny one day and consent the next. This approach is seen as more genuine and honest since it has the capacity to focus on the intersections arising out of a particular situation while resisting the impulse to include everything within one label, thus obscuring the specific ways in which power operates.

As we know, violence against young women and girls occurs in settings that blend off-line and online elements, but when we focus on technology as part of this mixture, it's important to ask questions about design, in addition to questions about how people are using technologies. Still, we have to be clear that technology itself is not a cause of the violence that people experience. That's what we would call “technological determinism”, whereby technology is taken out of a social context, seemingly appearing out of thin air, and blamed for society's ills. At the same time, it's possible to focus on technological development and design since these processes aren't simply technical but are social too.

My research interests centre around questions of design and begin with the premise that technology is not neutral. I explore values and norms that become embedded in technology by designers, programmers, stakeholders, and other actors in processes of technological development.

I think particularly interesting and important for the committee's study are the ways in which technological design is a social and political act that has recursive consequences for society; that is, design decisions can, often inadvertently, solidify social relations. For example, of the 215 mobile phone applications designed to prevent sexual violence that my colleague Amy Hasinoff and I examined, the vast majority reinforce prevalent rape myths by placing the responsibility for preventing sexual violence on the victim. Only four apps out of that 215 target perpetrators, and there is an assumption that strangers are the most likely perpetrators.

Since technological design and development processes are never just technical or social, they're a viable target for policy intervention. There are a number of issues here to discuss.

First, software has many layers. Some are more visible to us as users. Think of Facebook and its blue-and-white interface. Then there are others, such as the database where Facebook collects information about each user. I have argued that software has the capacity to conceal the ways in which it enacts violence. Think about the changes to Facebook's user interface in 2014. Suddenly, people were able to identify beyond the traditional categories of “men” and “women”. They could be two-spirit, genderqueer, gender questioning, etc.

In my study, I discovered there was a difference between the progressive moves that the company made on the surface of the software, moves that worked towards dismantling oppressive conceptions of gender as binary—that there are only men and women in the world—versus the decisions they made in deeper layers of the software, layers inaccessible to most of us. To accommodate this modification they made on the surface, programmers developed a way for the software to translate these non-binary genders into a binary categorization system by focusing only on the pronoun that a user selects.

We know that people with non-binary genders experience disproportionate levels of discrimination and violence. A 2014 study from the Canadian Labour Congress, cited by the ongoing federal strategy on gender-based violence, notes that rates of intimate partner violence for transgender participants are almost twice as high as those for women and men: 64.9% lifetime prevalence rates were recorded. We also know, from the U.S. context, that transgender women of colour are targets of violence at even higher rates than their white counterparts, making up most of the murders committed against transgender people.

While the act of misgendering someone is often experienced as violence in and of itself, it's also symptomatic of the broader social systems that contribute to transphobia. What I'd like us all to consider, then, is the ways in which programming practices can be violent by reproducing and calcifying dominant regimes of gender control. Concealing this violence, by, for instance, storing that gender as “female” for someone in the database who has indicated on the surface that they are genderqueer but happen to prefer the pronoun “she”, is a cause for concern, particularly when that gendered information does not simply remain in the database but is accessed by other sets of users like advertisers and marketers. So while social pressure may have led to the surface, superficial modification, it was a corporate logic that motivated Facebook to design their software in a way that misgenders users.

We're also witnessing mergers between different social media platforms, such as when Facebook acquired Instagram. This has led to an exchange of data between different platforms, so one platform doesn't even have to collect identifiers any more if it can access them from another platform. Digital delegation means being asked to sign up for Instagram through Facebook, with your Facebook information used to do that. With my colleague Oliver Haimson, I have examined popular social media platforms to determine both how gender has been programmed into user interfaces and how gender has been programmed into the spaces designed for advertisers, the advertising portals. We argue that social media platforms have become intermediaries in a bigger ecosystem that includes advertising and web analytics companies.

As a result, though, social media platforms get entrusted with a lot of control over how gender and other identifiers are categorized, and these design decisions are shaping how the public and the advertising industry understand identity. These systems they are building are like another layer of society that could promote progressive social change but instead is reifying inequalities.

I want to try to translate this into two quick points. First, the technology sector is well known for its lack of diversity, and that impacts who is making things and who designers think the user is. It's not only about adding women to the sector and stirring. Funding education that targets engineering and other related disciplines, that is informed by feminist, queer, race, and even disabilities studies lenses, is needed to open up the design process. Finally, incentives for the technology sector to support social change objectives in their design and ongoing development of technologies could also be helpful.

Thank you.

3:50 p.m.

Conservative

The Chair Conservative Marilyn Gladu

Excellent. Thanks very much.

I was remiss earlier on in not informing the committee that we have a new member of our committee. Welcome to Marc Serré. We look forward to engaging you in delightful conversation over this next session.

3:50 p.m.

Liberal

Marc Serré Liberal Nickel Belt, ON

Thank you.

3:50 p.m.

Conservative

The Chair Conservative Marilyn Gladu

Welcome as well to Filomena, who is joining us today, and Garnett Genuis, who is with us.

We'll begin our questioning with my Liberal colleagues, starting with Ms. Nassif.

Oh, wait, I'm wrong. I have to let Valerie speak first. I'm sorry. I'm out of practice, you see, over the summer.

Valerie, you have 10 minutes and you can begin. Thanks.

3:50 p.m.

Dr. Valerie Steeves Associate Professor, Department of Criminology, University of Ottawa, As an Individual

Thank you very much for the invitation. It's a privilege to be here, and I'm delighted that you're undertaking this study. I'm really curious to see what comes out of it and quite encouraged by the process itself.

For the past 20 years, a large part of my research agenda has been looking at how kids use network technologies, how they experience them, and what their perspectives about those uses and experiences are. It's really grounded in my belief that good policy should be founded on a solid understanding of those lived experiences, because I think the policies we're trying to enact are designed to provide young people with the support they need to successfully navigate the network world.

When I was thinking of what I could contribute in my 10 minutes before we get to questions, three things came to mind, and I think these are three things that the girls and young women whom I've spoken to over the last 20 years would want you to know, or would want you to take into consideration.

The first one is that surveillance isn't a solution to cyber-violence or cyber-harassment; in fact, surveillance makes things worse for them and makes it harder for them to navigate through this online world. Unfortunately, if you look back at how we've responded to a lot of these policy questions, surveillance has been a standard response.

My research partner Jane Bailey and I, a number of years ago, started a review of all of the interventions before Parliament whenever kids and technology were mentioned. So starting right back from the information superhighway forward—if any of you are old enough to remember as I do—we started with this really strong narrative that kids are savvy, natural technology users, and that they're innovators and they're going to create wealth.

The lesson we drew from that was not to regulate the technology, because that would shut down innovation. But at the same time as we were advancing through this policy arc over the past 20 years or so, we started to talk about kids as being “at risk”. Kids were at risk of seeing offensive content; they could see pornography online. The solution was to put them under surveillance to make sure they wouldn't.

Then we talked about kids being at risk because they're naive. They get out onto these technological platforms, and they don't really understand the bad things that can happen to them. The solution was to put them under surveillance.

Lastly, especially once we started talking about behaviours like sexting, we started to talk about kids being at risk because kids are badly behaved, so we have to put them under surveillance because we need to protect them from their own choices.

Now, from the kids' points of view this just doesn't work. From their point of view, the main problem with surveillance is that the lesson of surveillance is that adults don't trust them. They don't trust them to use their judgment; they don't trust them to make mistakes and learn from them. What they glean from this is that they can't trust adults. We've rolled out surveillance through schools and through public libraries. We're encouraging parents all the time to make sure they have their kid's Facebook password and rifle through their accounts. All of these strategies, which were designed I think in a well-intentioned way to help children, have backfired, because they have eroded the relationships of trust that are at the heart of our being able to help kids confront cyber-harassment and cyber-misogyny when they occur. I have all sorts of research findings to support this, stories of kids saying “just when this terrible thing happened to me, I couldn't go to my teacher, because then I knew the cops would be called in, and I can't trust adults not to go crazy, because they don't understand my life.”

I think that's a really important lesson. Surveillance isn't a solution. Surveillance really complicates things and makes it harder for girls and young women to cope with cyber-harassment and misogyny.

I think the second thing that they would like to say, and this really resonates with Rena's comments about design, is that the problem isn't them; the problem is the environment, and we adults are the ones who are responsible for the design of that environment.

Kids, for example, often complain that adults force them to use network technologies, and they really resent it. So, again, if you think about how we often talk about kids, we say they're natural; they're savvy; they love technology; they're online all the time. Doing research over the last 20 years with kids all across the country, we have heard very different stories. We've heard that technology actually often causes them a lot of problems.

For example, I was talking to a group of youth in Toronto just this past weekend at the CCLA, and the first question they asked was, “How can we tell our school to stop forcing Microsoft tablets on us? Now, I have to do all of my science work in class on this darn tablet, and I don't like it.” They felt it was a bad way to learn. They're actually right. All sorts of research indicates that computing technology actually reduces learning outcomes, but what they were worried about was that the commercial design of that technology made disclosure the default. As soon as they used it, they had no control over the information they input into that tool.

They knew that this information then made them more visible to their peers and to their teachers in ways that they are uncomfortable with. It's the lack of privacy they experience in network spaces that makes it harder for them to navigate through all of the cyber-misogyny and the harassment that exist in those spaces, and it actually sets them up for conflict with peers.

They also find that the lack of privacy built into the environment means that they are held to account for every mistake they make. It's harder for them to figure out what is and what isn't acceptable behaviour. It tends to magnify bad behaviours and silence good behaviours in really strange ways. That's the second thing. The problem is the environment. Look at the design.

I think the third thing they would want to say is that if you're going to take these issues seriously, move away from surveillance as a knee-jerk response and critically analyze the environment. Then start examining the commercial agenda behind the technology and think about how that commercial agenda plays into and magnifies stereotyping, cyber-harassment, and cyber-violence.

When I sit down with kids, they bring up misogynist trolling. Slut shaming is a huge part of the problems they face online, along with threats of rape and other kinds of sexualized violence. When I ask where they think that's coming from, they very readily point the finger at mediatization. They say the online environment that they learn and play in, that they connect with their grandmother in, is wallpapered with gender stereotypes through ads, videos, and audio files that are everywhere. They know that's part of a commercial model where everything they do online is constantly collected about them and fed back into those images and intensifies the effect of those stereotypes.

Certainly the visual nature of the environment or the media makes it much harder for girls to resist those stereotypes. We live in an age of cheat days, where five days of the week you're supposed to restrict what you eat and two days of the week you're allowed to eat what you want, which is one of the things that is coming up in public schools among girls. The girls we've talked to tell us they try to conform, at least to some extent, to these very narrow stereotypical ways of performing gender. If they don't, they are subjected to incredibly harsh judgment from their peers, and that grows into conflict, which grows into harassment and threats.

When they find that it gets to the point where they need someone to help them and they go to adults, they are judged by the adults because they've broken the rules about disclosure: “Well, you shouldn't have posted that picture. What were you doing talking to your friend about that and using that language on the Internet?” Their argument is that the whole environment is designed to make them do that. All of the incentives in that environment are for them to disclose information, to portray a certain kind of femininity, to perform according to a particular kind of identity as a girl, whether they're a learner or hanging out with friends, or just trying to find out what the adult world is like.

Given Rena's comments about the importance of layers and how that database level is so key, and how software can conceal how we as a society enact violence, I think this problem is only going to be magnified by big data algorithms that sort kids into categories for commercial purposes. We already know that those algorithms intensify inequalities. They hide these biases and sources of inequality in the algorithm, and once they're there, it's very hard to hold anybody to account.

If we look at these three things that I think girls and young women would want me to say on their behalf, I think part of the solution has to be taking responsibility for creating public spaces that are not commercialized, places where kids can gather for social interaction, for learning, and for exploring the world.

Ironically, I think before we passed the Personal Information Protection and Electronic Documents Act, the federal government actually demonstrated a lot of leadership in this regard. These were places like SchoolNet, and public access points for rural and impoverished populations. These initiatives were equality-driven and value-driven, and they were designed to promote a healthy networked public sphere. Once PIPEDA was passed, all of that funding was pulled.

I think as you listen to all of this different information and talk to different intervenors, I would urge you to keep in mind that the role of government is to create conditions that provide equal access to free speech and to support a public sphere where community norms are both articulated and respected in ways such that we hold each other to account for violence and discrimination.

Thank you.

4 p.m.

Conservative

The Chair Conservative Marilyn Gladu

Excellent. Thanks very much.

We'll begin, then, with my Liberal colleague, Ms. Nassif.

4:05 p.m.

Liberal

Eva Nassif Liberal Vimy, QC

Welcome back, everyone. I hope you have a good session.

My thanks to the witnesses for their presentations.

I would first like to turn to Ms. Bivens.

Based on your online profile, monitoring practices for misogyny on social media are among your interests and current projects. We hear a great deal about those female social media professionals or users who become the targets of misogynist behaviours and receive some of the most hateful comments that they have ever read or heard. Those behaviours are seen particularly on social media.

Just think of the recent Gamergate controversy. Many women have channels on Twitch or YouTube and they are all trying to participate in healthy social discussions online or they simply want to do something they love. Could you address this trend in particular? In fact, there is a strong link, though not exclusive, with women who try to break into fields, professions or recreational activities that are men's turf right now, such as the jobs of sports commentators and analysts, the video game industry or online gaming.

Do you think this trend has something to do with the current vitriol on social media when people try to overcome social and cultural obstacles?

My question has another part. Is this cyber-violence really different from other forms of harassment and intimidation, or is this simply a new medium that enables people to continue to perpetrate these crimes relatively easily under the cover of a degree of anonymity?

That's my first question. It is long.

4:05 p.m.

Assistant Professor, School of Journalism and Communication, Carleton University, As an Individual

Rena Bivens

I apologize; I didn't have the translation on from the very beginning of your question, but one thing I heard was about women getting into the design of video games and the design of technologies, and about the kinds of obstacles they might find when they're trying to participate in these careers.

Some research has found that the obstacles they face include the demand to fit into this masculine culture and to shed some of their own feminine identity in order to do so. Some critiques of that have pointed out that most people have some feminine characteristics and some masculine characteristics. There's a spectrum there; it's not quite straightforward. But there is a heavily masculine-dominated culture that people feel they have to fit into, even when they have been excluded from it from the beginning.

4:05 p.m.

Liberal

Eva Nassif Liberal Vimy, QC

The second part of my question was whether cyber-violence is really that different from other forms of harassment and intimidation, or whether it is a new medium that enables people to perpetrate those crimes with relative ease and anonymity.

4:05 p.m.

Assistant Professor, School of Journalism and Communication, Carleton University, As an Individual

Rena Bivens

I think it's an important question to ask. Many people critique the term cyber-violence as well and ask if it is something new or just more of the same. Certainly we know that off-line and online are blurred. I think you're aware of all these things.

We try to look at the technology and ask what specifically is new or different about it. Some people like to focus on the anonymous angle—you can be anonymous—but many of these platforms are encouraging people to be “authentic”, as they call it, and to use their real names, etc. There's a lot of push-back on that as well.

I don't know how different it is in the off-line and the online environments. That's where I'd leave that.

4:05 p.m.

Liberal

Eva Nassif Liberal Vimy, QC

Do you think social media and the digital world itself is the only medium that can be used to curb this kind of behaviour? How, in your opinion, would the government best operate in these circumstances to curb cyber-violence?

I ask this question because children at age 10 and sometimes even younger have their own computers, tablets, or phones, which, unless monitored carefully by their parents, expose them to this kind of online treatment.

4:05 p.m.

Assistant Professor, School of Journalism and Communication, Carleton University, As an Individual

Rena Bivens

From what I'm hearing, you're asking about whether technology is the answer to try to resolve this problem, and about what we do when young people need to be surveilled because they have so much access.

I think what Valerie was saying speaks to a lot of that. They're losing our trust when we put them under so much surveillance, so I don't think that is the answer. That's not the way we should move. I think one big important issue here is about how we design these spaces. Valerie was speaking to this as well, and quite eloquently, in terms of trying to create spaces that don't pressure people to disclose everything and lose all of their privacy.

One scholar I admire a lot talks about how these networks are created in ways that are “leaky”. They're actually called “promiscuous” networks. You never hear “monogamous network” being used. They're meant to capture everything and then use and store what they need. They're created to be leaky. That's how they've been created. So we can try to do better in terms of designing these systems in the ways that, from what we're hearing from the eQuality Project, for instance, young people are asking for.

4:10 p.m.

Conservative

The Chair Conservative Marilyn Gladu

You have 30 seconds left.

4:10 p.m.

Liberal

Eva Nassif Liberal Vimy, QC

It's okay.

4:10 p.m.

Conservative

The Chair Conservative Marilyn Gladu

All right.

We'll go to my Conservative colleagues, and we'll begin with Ms. Vecchio.

4:10 p.m.

Conservative

Karen Vecchio Conservative Elgin—Middlesex—London, ON

Wonderful.

I'd like to thank you both for coming today. This will be an excellent discussion.

I'd like to start with you, Ms. Steeves. First of all, thank you for all the research you've done with the eGirls Project. That's phenomenal.

In one of your publications, you highlight the slippery slope for girls between private experimentation and public performance in online social media. How does this phenomenon play into the hypersexualization of women, as seen in traditional and social media? Is this public performance mindset becoming more prevalent or even accepted among young people? How do we stop this from manifesting?

4:10 p.m.

Associate Professor, Department of Criminology, University of Ottawa, As an Individual

Dr. Valerie Steeves

I'll preface this by picking up on one of the earlier questions. A lot of this is age-driven. If you look at human development, kids up to about 11 and 12 tend to form their sense of identity through their relationship with their family. Once they hit 12 or 13, things start to shift a bit, generally speaking. The usual path is that we're then trying to break away from the family, get out into the world, explore different identities, and find out who we want to be as an adult. It's fraught with difficulty and lots of mistakes.

To a certain extent, that's also a performativity. One of the reasons you see so many 13- to 22-year-old kids and young adults hyper-performing is that they are developmentally predisposed to try on different identities, get them out there, see what the reaction is, and then retreat into a private space to figure out if that works for them or not.

I think the thing you've raised is that when you do this in a commercialized surveillance space, then certain kinds of identities are privileged—hypersexualized identities, for example. With the eGirls data, and similarly with the work we've been doing on the eQuality Project as well, kids tell us that instead of finding a whole range of ways of being a girl in network spaces, there's just this very narrow hypersexualized identity that's available to them, and performing it is almost protective—i.e., “I have to have a friend on my friends list who does it, or I have to do a little bit of it, because if I don't, I'm trolled.” Then they have to deal with all this incredible negativity.

I think it's interesting to see how the technology does interface with these very old stereotypical concerns around gender and problems of equality. Especially with the eGirls data, girls would tell us things like, “You know, when I'm at school, I don't feel pressure to have the makeup on and do the hair and all this type of thing, but I have friends who went online, just took pictures of the way they normally look, and got attacked immediately.” They were told they were fat and they were told they were ugly.

It's very heterosexist; it's very normative; it's very gendered, and it's very misogynistic. When they're online, they're very careful about performing in a particular way.

As well, our data actually is drawn from a really diverse group of girls. I agree with Rena on everything she said about intersectionality. It's really important to understand how race plays out with gender and how socio-economic status plays out with gender, yet all of our diverse participants indicated that they had to negotiate with this. To go back to my opening comments, they point the finger at the media stereotypes that are embedded everywhere. It's easier for them to push against the stereotype in the real world. Once you're online, it's really hard.

4:15 p.m.

Conservative

Karen Vecchio Conservative Elgin—Middlesex—London, ON

Thank you so much.

Ms. Bivens, thank you for joining us today. We've heard from previous witnesses about both real and exaggerated concerns over sexting. You have a chapter in your book called, “Quit Facebook, Don't Sext, and Other Futile Attempts to Protect Youth”.

What do you perceive to be the biggest misconception about sexting, and how do you protect youth from dangers such as online stalking or predators without those efforts becoming futile?

4:15 p.m.

Assistant Professor, School of Journalism and Communication, Carleton University, As an Individual

Rena Bivens

I would say the biggest misconception about sexting would be that you can't take pleasure from it, and that only young people are doing it. There are people of all ages who take pleasure in sexting. I think that's one thing that we have to keep in mind, and we have to listen to the people who are doing the sexting. The question is how to do it so that it's not futile.

4:15 p.m.

Conservative

Karen Vecchio Conservative Elgin—Middlesex—London, ON

How do we keep our children from being in danger? How do we make sure that our children remain safe and that they're not putting themselves in harm's way?

4:15 p.m.

Assistant Professor, School of Journalism and Communication, Carleton University, As an Individual

Rena Bivens

It's tricky. In the chapter you mentioned, I'm looking at particular advertising campaigns, and how it can be futile to just say, “Oh, don't do that.” I haven't done a lot of research talking with youth themselves, but I'm really curious about what would be dangerous and what we need to protect them from, because it feels as though often we are trying to protect children, trying to protect certain types of children. We think girls are the most at risk and you hear this about so many new technologies. When trains came out, people were worried that women's uteruses would fly out of them. When electricity came in, people thought men would be able to see women and young girls in their homes and then would break in.

These are normal reactions, I think, to new technologies, so I guess I would ask what dangers we're concerned about.

4:15 p.m.

Conservative

Karen Vecchio Conservative Elgin—Middlesex—London, ON

Fantastic.

Ms. Steeves, part of your work is teaching kids to spot online marketing strategies. What correlation do online marketing strategies have with the actual behaviour and protection of girls online, and how does online marketing adversely affect the girls? Could you share some more ideas on that?

4:15 p.m.

Associate Professor, Department of Criminology, University of Ottawa, As an Individual

Dr. Valerie Steeves

I might spin this in a bit of a different direction. More and more, kids are aware of these strategies. They know they're not in a private space, and they feel disempowered to do anything about it. I think the disconnect is between how they perceive privacy and protective initiatives and how we've legislated privacy and protective initiatives. The focus is on non-disclosure. Advocating non-disclosure is completely out of keeping with the way kids think about privacy. Their attitude is that putting something out there doesn't mean anyone should be able to look at it and judge them for it. If they put it out there for their friends, it's meant for their friends. It's not meant for a corporation.

4:15 p.m.

Conservative

The Chair Conservative Marilyn Gladu

Thank you.

We're going to move to Ms. Malcolmson.