Thank you to the committee for inviting MediaSmarts to testify on this issue.
Our research suggests that algorithms, and the data collection that makes them work, are poorly understood by youth. Only one in six young Canadians feel that the companies that operate social networks should be able to access the information they post there, and just one in 20 think advertisers should be able to access that information. Yet almost half of youth appear to be unaware that this is how most of these businesses make money.
With support from the Office of the Privacy Commissioner, we've been creating resources to educate youth about this issue and to teach them how to take greater control of their online privacy.
Algorithmic content curation is relevant to cyber-violence and youth in a number of ways. When algorithms determine what content users see, they can make it a challenge to manage one's online privacy and reputation. Because algorithms are typically opaque in how they work, it can be hard to manage your online reputation if you don't understand why certain content appears at the top of searches for you. Algorithms can also present problems in how they deliver content, because they can embody their creators' conscious or unconscious biases and prejudices.
I believe Ms. Chemaly testified before this committee about how women may be shown different want ads than men. There are other examples that are perhaps more closely related to cyber-violence: auto-complete features that won't complete the words “rape” or “abortion”, for example, or Internet content filters, often used in schools, which may prevent students from accessing legitimate information about sexual health or sexual identity.
This is why it remains vital that youth learn both digital and media literacy skills. One of the core concepts of media literacy is the idea that all media texts have social and political implications, even if those weren't consciously intended by the producers. This is entirely true of algorithms as well, and may be particularly relevant because we're so rarely aware of how algorithms are operating and how they influence the content we see.
Even if there is no conscious bias involved in the design of algorithms, they can be the product and embodiment of our unconscious assumptions, such as the algorithm that led one delivery service to exclude minority neighbourhoods in the United States. Similarly, algorithms that are designed primarily to solve a technical problem, without any consideration of the possible social implications, may lead to unequal or even harmful results entirely by accident.
At the same time, a group that is skilled at gaming algorithms can amplify harassment through what's called “brigading”: boosting harmful content in ways that make it seem more relevant to the algorithm, which can place it higher in search results or make it more likely to be delivered to audiences as a trending topic. This was an identified problem in the recent U.S. election, where various groups successfully manipulated several social networks' content algorithms to spread fake news stories. The same technique could easily be used to greatly magnify the reach of, for example, an embarrassing or intimate photo that was shared without the subject's consent.
Manipulating algorithms in this way can also be used to essentially silence victims of cyber-violence, especially on platforms that allow downvoting content as well as upvoting.
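To make the mechanism concrete, here is a minimal sketch of how coordinated voting can game a simple vote-driven ranking. The scoring formula below is hypothetical, a simplified stand-in for the proprietary ranking systems platforms actually use, and the vote counts are invented for illustration.

```python
# A hypothetical, simplified vote-based ranking score. Real platforms'
# algorithms are proprietary and far more complex; this sketch only
# illustrates how coordinated votes shift what ranks as "hot".
import math
from datetime import datetime, timezone

def hot_score(upvotes: int, downvotes: int, posted: datetime) -> float:
    """Log-scaled net votes, discounted by the post's age in hours."""
    net = upvotes - downvotes
    magnitude = math.log10(max(abs(net), 1))   # each 10x more votes adds 1
    sign = (net > 0) - (net < 0)               # +1, 0, or -1
    age_hours = (datetime.now(timezone.utc) - posted).total_seconds() / 3600
    return sign * magnitude - age_hours / 12   # older posts sink

now = datetime.now(timezone.utc)
# A brigade of a few hundred coordinated upvotes outranks ordinary posts...
print(hot_score(500, 20, now))   # brigaded harmful post: about  2.68
print(hot_score(40, 10, now))    # typical post:          about  1.48
# ...while coordinated downvotes bury a victim's reply below both.
print(hot_score(15, 400, now))   # silenced victim:       about -2.59
```

The point is not this particular formula: any ranking that treats votes as a signal of relevance can be pushed up or down by a coordinated group.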
In terms of digital literacy, it's clear that we need to teach students how to recognize false and biased information. Our research has found that youth are least likely to take steps to authenticate information that comes to them via social media, which, of course, is where they get most of their information. We need to educate them about the role that algorithms play in deciding what information they see. We also need to promote digital citizenship, both in terms of using counter-speech to confront hate and harassment, and in terms of understanding and exercising their rights as citizens and consumers. For example, there have been a number of cases where consumer action has successfully led companies to modify algorithms that were seen to embody racist or sexist attitudes.
Thank you.