What I will share today is informed by a number of different research projects, ranging from a study on the use of social media by anti-violence non-profits to investigations of gender-related programming practices in popular social media platforms and in mobile phone apps designed to prevent sexual violence.
One issue I've encountered relates to terminology. Many terms in this area, as you know, have histories, and that baggage enters the room when we use them. For some, “violence against women” evokes the deep-seated racism, ableism, heterosexism, and cissexism that tainted early iterations of the women's movement. For others, “gender-based violence” is problematic because it has been employed by some as a way of neutralizing the differences between men's and women's experiences of sexual violence.
In my research with non-profits, I've heard that some organizations prefer to avoid umbrella terms altogether. Instead, they focus narrowly on what they are doing at that particular moment. It may be transmisogyny one day and consent the next. This approach is seen as more genuine and honest since it can attend to the intersections arising out of a particular situation while resisting the impulse to fold everything into one label, which would obscure the specific ways in which power operates.
As we know, violence against young women and girls occurs in settings that blend off-line and online elements, but when we focus on technology as part of this mixture, it's important to ask questions about design, in addition to questions about how people are using technologies. Still, we have to be clear that technology itself is not a cause of the violence that people experience. That's what we would call “technological determinism”, whereby technology is taken out of a social context, seemingly appearing out of thin air, and blamed for society's ills. At the same time, it's possible to focus on technological development and design since these processes aren't simply technical but are social too.
My research interests centre on questions of design and begin with the premise that technology is not neutral. I explore values and norms that become embedded in technology by designers, programmers, stakeholders, and other actors in processes of technological development.
I think particularly interesting and important for the committee's study are the ways in which technological design is a social and political act that has recursive consequences for society; that is, design decisions can, often inadvertently, solidify social relations. For example, of the 215 mobile phone applications designed to prevent sexual violence that my colleague Amy Hasinoff and I examined, the vast majority reinforce prevalent rape myths by placing the responsibility for preventing sexual violence on the victim. Only four of the 215 apps target perpetrators, and most assume that strangers are the most likely perpetrators.
Since technological design and development processes are never merely technical but are also social, they're a viable target for policy intervention. There are a number of issues here to discuss.
First, software has many layers. Some are more visible to us as users. Think of Facebook and its blue-and-white interface. Then there are others, such as the database where Facebook collects information about each user. I have argued that software has the capacity to conceal the ways in which it enacts violence. Think about the changes to Facebook's user interface in 2014. Suddenly, people were able to identify beyond the traditional categories of “men” and “women”. They could be two-spirit, genderqueer, gender questioning, etc.
In my study, I discovered there was a difference between the progressive moves that the company made on the surface of the software, moves that worked towards dismantling oppressive conceptions of gender as binary, the idea that there are only men and women in the world, and the decisions they made in deeper layers of the software, layers inaccessible to most of us. To accommodate the modification they made on the surface, programmers developed a way for the software to translate these non-binary genders back into a binary categorization system by focusing only on the pronoun that a user selects.
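To make that mechanism concrete, here is a minimal, hypothetical sketch in Python of the kind of pronoun-based collapse I am describing; the option list, function name, and stored values are my own illustrative assumptions, not Facebook's actual code.

```python
# Hypothetical illustration: the surface interface accepts many gender
# identities, but a deeper layer reduces them to a binary (or neutral)
# value keyed solely off the user's chosen pronoun.

SURFACE_GENDER_OPTIONS = [
    "woman", "man", "two-spirit", "genderqueer", "gender questioning",
    # ...many more options offered in the user interface
]

def store_gender(surface_gender: str, pronoun: str) -> str:
    """Collapse a surface gender selection into the value kept in the
    database, looking only at the selected pronoun."""
    if pronoun == "she":
        return "female"   # misgenders, e.g., a genderqueer user who prefers "she"
    if pronoun == "he":
        return "male"
    return "neutral"      # catch-all for users who select "they"

# Example: a user identifies as genderqueer but selects "she" as their pronoun
stored_value = store_gender("genderqueer", "she")
print(stored_value)  # -> "female", despite the non-binary surface identity
```

The point of the sketch is simply that the non-binary identity visible on the surface never reaches the deeper layer; only the binary value does, and that is the value other parties can access.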
We know that people with non-binary genders experience disproportionate levels of discrimination and violence. A 2014 study from the Canadian Labour Congress, cited by the ongoing federal strategy on gender-based violence, notes that rates of intimate partner violence for transgender participants are almost twice as high as those for women and men, with a lifetime prevalence rate of 64.9%. We also know, from the U.S. context, that transgender women of colour are targeted by violence at even higher rates than their white counterparts and account for the majority of murders of transgender people.
While the act of misgendering someone is often experienced as violence in and of itself, it's also symptomatic of the broader social systems that contribute to transphobia. What I'd like us all to consider, then, is the ways in which programming practices can be violent by reproducing and calcifying dominant regimes of gender control. Concealing this violence, by, for instance, storing a gender of “female” in the database for someone who has indicated on the surface that they are genderqueer but happens to prefer the pronoun “she”, is a cause for concern, particularly when that gendered information does not simply remain in the database but is accessed by other sets of users, such as advertisers and marketers. So while social pressure may have led to the superficial modification at the surface, it was a corporate logic that motivated Facebook to design its software in a way that misgenders users.
We're also witnessing mergers between different social media platforms, such as when Facebook acquired Instagram. This has led to an exchange of data between platforms, so one platform doesn't even have to collect identifiers any more if it can access them from another platform. Digital delegation means being asked to sign up for Instagram through Facebook, with your Facebook information used to create the account. With my colleague Oliver Haimson, I have examined popular social media platforms to determine both how gender has been programmed into user interfaces and how it has been programmed into the spaces designed for advertisers, the advertising portals. We argue that social media platforms have become intermediaries in a bigger ecosystem that includes advertising and web analytics companies.
As a result, social media platforms are entrusted with a great deal of control over how gender and other identifiers are categorized, and these design decisions shape how the public and the advertising industry understand identity. The systems they are building are like another layer of society, one that could promote progressive social change but is instead reifying inequalities.
I want to try to translate this into two quick points. First, the technology sector is well known for its lack of diversity, and that affects who is making things and who designers imagine the user to be. It's not only about adding women to the sector and stirring. Funding for education in engineering and other related disciplines that is informed by feminist, queer, race, and disability studies lenses is needed to open up the design process. Second, incentives for the technology sector to support social change objectives in the design and ongoing development of their technologies could also be helpful.
Thank you.