Thank you very much for the invitation. It's a privilege to be here, and I'm delighted that you're undertaking this study. I'm really curious to see what comes out of it and quite encouraged by the process itself.
For the past 20 years, a large part of my research agenda has been looking at how kids use network technologies, how they experience them, and what their perspectives about those uses and experiences are. It's really grounded in my belief that good policy should be founded on a solid understanding of those lived experiences, because I think the policies we're trying to enact are designed to provide young people with the support they need to successfully navigate the networked world.
When I was thinking of what I could contribute in my 10 minutes before we get to questions, three things came to mind, and I think these are three things that the girls and young women whom I've spoken to over the last 20 years would want you to know, or would want you to take into consideration.
The first one is that surveillance isn't a solution to cyber-violence or cyber-harassment; in fact, surveillance makes things worse for them and makes it harder for them to navigate this online world. Unfortunately, if you look back at how we've responded to a lot of these policy questions, surveillance has been a standard response.
My research partner Jane Bailey and I, a number of years ago, started a review of all of the interventions before Parliament whenever kids and technology were mentioned. So starting right back from the information superhighway forward—if any of you are old enough to remember as I do—we started with this really strong narrative that kids are savvy, natural technology users, and that they're innovators and they're going to create wealth.
The lesson we drew from that was not to regulate the technology, because that would shut down innovation. But at the same time as we were advancing through this policy arc over the past 20 years or so, we started to talk about kids as being “at risk”. Kids were at risk of seeing offensive content; they could see pornography online. The solution was to put them under surveillance to make sure they wouldn't.
Then we talked about kids being at risk because they're naive. They get out into these technological platforms, and they don't really understand the bad things that can happen to them. The solution was to put them under surveillance.
Lastly, especially once we started talking about behaviours like sexting, we started to talk about kids being at risk because kids are badly behaved, so we have to put them under surveillance because we need to protect them from their own choices.
Now, from the kids' points of view this just doesn't work. From their point of view, the main problem with surveillance is that the lesson of surveillance is that adults don't trust them. They don't trust them to use their judgment; they don't trust them to make mistakes and learn from them. What they glean from this is that they can't trust adults. We've rolled out surveillance through schools and through public libraries. We're encouraging parents all the time to make sure they have their kid's Facebook password and rifle through their accounts. All of these strategies, which were designed I think in a well-intentioned way to help children, have backfired, because they have eroded the relationships of trust that are at the heart of our being able to help kids confront cyber-harassment and cyber-misogyny when they occur. I have all sorts of research findings to support this, stories of kids saying “just when this terrible thing happened to me, I couldn't go to my teacher, because then I knew the cops would be called in, and I can't trust adults not to go crazy, because they don't understand my life.”
I think that's a really important lesson. Surveillance isn't a solution. Surveillance really complicates things and makes it harder for girls and young women to cope with cyber-harassment and misogyny.
I think the second thing that they would like to say, and this really resonates with Rena's comments about design, is that the problem isn't them; the problem is the environment, and we adults are the ones who are responsible for the design of that environment.
Kids, for example, often complain that adults force them to use network technologies, and they really resent it. So, again, if you think about how we often talk about kids, we say they're natural; they're savvy; they love technology; they're online all the time. Doing research over the last 20 years with kids all across the country, we have heard very different stories. We've heard that technology actually often causes them a lot of problems.
For example, I was talking to a group of youth in Toronto just this past weekend at the CCLA, and the first question they asked was, “How can we tell our school to stop forcing Microsoft tablets on us? Now, I have to do all of my science work in class on this darn tablet, and I don't like it.” They felt it was a bad way to learn. They're actually right. All sorts of research indicates that computing technology actually reduces learning outcomes, but what they were worried about was that the commercial design of that technology made disclosure the default. As soon as they used it, they had no control over the information they input into that tool.
They knew that this information then made them more visible to their peers and to their teachers in ways that they were uncomfortable with. It's the lack of privacy they experience in network spaces that makes it harder for them to navigate through all of the cyber-misogyny and the harassment that exist in those spaces, and it actually sets them up for conflict with peers.
They also find that the lack of privacy built into the environment means that they are held to account for every mistake they make. It's harder for them to figure out what is and what isn't acceptable behaviour. It tends to magnify bad behaviours and silence good behaviours in really strange ways. That's the second thing. The problem is the environment. Look at the design.
I think the third thing they would want to say is that if you're going to take these issues seriously, move away from surveillance as a knee-jerk response and critically analyze the environment. Then start examining the commercial agenda behind the technology and think about how that commercial agenda plays into and magnifies stereotyping, cyber-harassment, and cyber-violence.
When I sit down with kids, they bring up misogynist trolling. Slut shaming is a huge part of the problems they face online, along with threats of rape and other kinds of sexualized violence. When I ask where they think that's coming from, they very readily point the finger at mediatization. They say the online environment that they learn and play in, that they connect with their grandmother in, is wallpapered with gender stereotypes through ads, videos, and audio files that are everywhere. They know that's part of a commercial model in which information about everything they do online is constantly collected and fed back into those images, and that intensifies the effect of those stereotypes.
Certainly the visual nature of the environment or the media makes it much harder for girls to resist those stereotypes. We live in an age of cheat days, where for five days of the week you're supposed to not eat and for two days of the week you're allowed to have meals; that is one of the things coming up among girls in public schools. The girls we've talked to tell us they try to conform, at least to some extent, to these very narrow, stereotypical ways of performing gender. If they don't, they are subjected to incredibly harsh judgment from their peers, and that grows into conflict, which grows into harassment and threats.
When it gets to the point where they need someone to help them and they go to adults, they are judged by the adults because they've broken the rules about disclosure: “Well, you shouldn't have posted that picture. What were you doing talking to your friend about that and using that language on the Internet?” Their argument is that the whole environment is designed to make them do that. All of the incentives in that environment are for them to disclose information, to portray a certain kind of femininity, to perform according to a particular kind of identity as a girl, whether they're a learner, hanging out with friends, or just trying to find out what the adult world is like.
Given Rena's comments about the importance of layers and how that database level is so key, and how software can conceal how we as a society enact violence, I think this problem is only going to be magnified by big data algorithms that sort kids into categories for commercial purposes. We already know that those algorithms intensify inequalities. They hide these biases and sources of inequality in the algorithm, and once they're there, it's very hard to hold anybody to account.
If we look at these three things that I think girls and young women would want me to say on their behalf, I think part of the solution has to be taking responsibility for creating public spaces that are not commercialized, places where kids can gather for social interaction, for learning, and for exploring the world.
Ironically, I think before we passed the Personal Information Protection and Electronic Documents Act, the federal government actually demonstrated a lot of leadership in this regard, through initiatives like SchoolNet and public access points for rural and impoverished populations. These initiatives were equality-driven and value-driven, and they were designed to promote a healthy networked public sphere. Once PIPEDA was passed, all of that funding was pulled.
As you listen to all of this different information and talk to different intervenors, I would urge you to keep in mind that the role of government is to create conditions that provide equal access to free speech and to support a public sphere where community norms are both articulated and respected, in ways that allow us to hold each other to account for violence and discrimination.
Thank you.