Thank you for inviting me into this space.
My perspective is informed by my work with survivors in the gender-based violence sector, and I will focus on the need for a gender-based analysis when we're talking about online harms and legislation.
Specifically, I'm going to focus on two online harms—the non-consensual distribution of intimate images, which I refer to as NCIID, and deepfake sexual abuse—although I'm happy to speak to further forms that haven't necessarily been brought forward as much, such as cyberflashing.
Each of these forms of violence is increasing in the Canadian context. They target marginalized individuals, and they produce gendered and intersectional harms. When we're talking about the non-consensual distribution of intimate images, the violence occurs when individuals have intimate content taken, even from their private computers, and posted online.
People do so for a variety of motivations, many of which link to other forms of violence. They do so to control, monitor and harass a current or past intimate partner. We also see young boys in particular doing so because of the social pressures they face relating to traditional masculinity and expectations around sexual experience—that they should have this experience and that they should be promoting it.
We have also seen NCIID used as a tactic to advertise, recruit and maintain control over individuals who experience sex trafficking. NCIID does disproportionately target women. Of the 295 Canadian cases of NCIID reported to police by adults in 2016, 92% were reported by women. Police-reported incidents involving youth aged 12 to 17, from 2015 to 2020, again found girls overrepresented as targets, at 86%, compared with boys at 11%.
Unfortunately, we are lacking intersectional Canadian data, but if we look at national studies in the United States and Australia, we see that NCIID also disproportionately targets Black, Indigenous and 2SLGBTQIA+ individuals, and people with disabilities.
We see very much the same targeting when we're talking about deepfake sexual abuse. Many of these applications and technologies only work on women's and girls' bodies. A 2023 study of 95,000 deepfake videos found that 98% were sexually explicit and, of those, 99% targeted women.
When we're talking about the impacts, as you can imagine, they are vast. They are emotional, economic, physical and social. Survivors have likened these forms of violence to additional forms of sexual violence in which their autonomy is denied. They have also shared that one thing that's distinct about online harms is the way the harm becomes crowdsourced, with people sharing in this violent experience.
Technology-facilitated violence impacts different groups in qualitatively specific and intersecting ways. For instance, sexual double standards mean that women are more likely than men to be blamed, discredited and stigmatized because of sexual imagery online. 2SLGBTQIA+ individuals have identified NCIID as a tool used to “out” their sexual orientation and gender identity. Finally, deepfake sexual abuse also impacts sex workers, especially women, who have their likenesses stolen and used to inflict violence, and who then face stigma and criminalization in response.
In terms of ways to address this harm, I think much of the focus in legislation has been on the regulation and removal of content, and that is absolutely essential. We also need to recognize the people this is impacting, the survivors, and where survivors are turning. They are going to gender-based violence services in order to cope with and heal from these harms. An added dimension when we're talking about addressing online harms is making sure we're supporting the gender-based violence agencies that are doing the work to support survivors and that already have robust sex education programs.
Some of this work is also outlined in the national action plan to end gender-based violence.
As well, I want to echo Carol Todd's remarks about the importance of consent-based education, especially when we're talking about deepfake sexual abuse. Sometimes there isn't an understanding of it as a form of harm, so we need education in schools and in society that is sex-positive and trauma-informed, that shares that this is a form of violence and that also fights against victim blaming.
Thank you.