Thank you.
Good afternoon, committee members, and thank you for this opportunity to speak with all of you.
MediaSmarts has been working in the field of online hate for nearly two decades. Our research has consistently found that Canadian youth are frequently exposed to racist and sexist content online and that they feel it is important to do something about it, but also that they are not prepared to critically engage with hate content or to push back when they encounter it.
Our research with youth examined their attitudes and experiences with hate online—specifically, why they do or don’t intervene. We found that cultures of hatred, communities in which racism, misogyny and other forms of prejudice are normalized, are more common than overt hate. When hate online goes unchallenged, users may come to believe that intervention is an overreaction. A community's norms are largely set by the most committed 10% of its members.
When cultures of hatred are masked as consensus and the behaviour is not seen as harmful, the majority of witnesses may not believe intervention is worth the risk of social exclusion. Youth are particularly vulnerable because they are worried about disrupting social harmony, losing their social capital or status with their peers and drawing unwanted attention to themselves.
Hate groups take advantage of this as well as the digital architecture of online spaces, working to make hate appear more mainstream and acceptable to expand their pools of potential recruits and create an online environment hostile to their targets. Our most recent study with young Canadians shows that 2SLGBTQ+ youth are almost twice as likely to report having been bullied and to have seen racist and sexist content online.
Our study on algorithmic awareness highlights how design, defaults and artificial intelligence are shaping our online spaces. Recommendation algorithms can diminish our capacity to verify whether or not something is true online, as users may perceive content that is delivered algorithmically and curated for them as more trustworthy.
Online hate has the power to change what we know, and how we know it, about scientific and historical facts, social norms and even our shared reality. As youth overwhelmingly turn to the Internet as a source of information, they run the risk of being misled by hate content. If that misinformation goes unchallenged and users do not have the critical thinking skills to challenge it themselves, some youth may come to hold dangerously distorted views.
Youth need to be supported in developing the skills and knowledge to recognize online hate. This means learning general critical thinking and digital media literacy skills, as well as how to recognize the techniques and ideologies of hate. In order to talk about controversial topics and have healthy debate, users need to be able to distinguish between arguments based on facts and those that appeal to dehumanization and fear of the other.
Youth also need clear examples of how they can respond when they encounter hate and prejudice online. Interventions should emphasize that even small efforts to push back against online hate can have a profound impact by motivating others to intervene. Youth need to feel that their opinions and experiences matter and will be considered by those with decision-making capacity.
Youth believe platforms and technology companies have a responsibility to set clear rules and community standards, to make it easier for users to report hate, and then to respond to those reports through publicized enforcement metrics. They also feel that policy interventions should give youth and the trusted adults in their lives more opportunities to learn digital media literacy in Canadian classrooms, homes and communities.
I'll conclude my comments by expanding on that final point.
The value of an educational approach to online hate cannot be overstated. While governments and online platforms have important roles to play, we cannot legislate, moderate or design our way out of these challenges. We need to ensure that all people in Canada have the tools and critical capacities to safely and positively engage as ethical digital citizens.
In this way, digital media literacy is a preventative measure and a harm reduction approach to ideologically motivated violent extremism. This approach does not let either platforms or regulators off the hook by laying the burden of the challenge on the shoulders of individual users. Rather, what’s needed is a whole-of-society approach that holds platforms and governments accountable, both in combatting online harm and in supporting digital media literacy.
MediaSmarts has been advocating for a national digital media literacy strategy for over 15 years, a recommendation consistently endorsed by key stakeholders and community partners and reconfirmed in our report on building a national “Digital Media Literacy Strategy for Canada”, released last month. This strategy would provide experts, advocates and service providers with a unified but flexible approach for preventing and responding to online harm—