Thank you.
My name is Monique St. Germain, and I am general counsel for the Canadian Centre for Child Protection, which is a national charity with the goal of reducing the incidence of missing and sexually exploited children.
We operate Cybertip.ca, which is Canada's tip line to report the online sexual exploitation of children.
In 2024 alone, Cybertip processed over 29,000 reports, most of which involved child sexual abuse and exploitation material, also known as CSAM. The next most common reporting category was tied to online luring or sextortion.
To tackle the explosive growth in online CSAM, we launched Project Arachnid in 2017. It is an innovative, victim-centered set of tools that targets the detection and removal of CSAM online.
Operating at scale, Project Arachnid issues roughly 10,000 requests for removal every day, and some days it's over 20,000. To date, over 67 million notices have been issued to over 1,500 service providers worldwide. It is because we operate Project Arachnid that we understand the challenges of content removal and the immense harm to children when content is not promptly removed. It is through Cybertip.ca that we hear every day from Canadian children and families impacted by something happening online.
In addition to processing those reports, in 2024 alone, we managed nearly 2,800 requests from survivors, youth and caregivers for assistance and support. This unique lens equips us to understand how children are being targeted, victimized and sextorted on the platforms they use every day.
We understand the focus of this committee to be specifically on the issue of child influencers and social media harms to children. On the issue of child influencers, while these accounts can be a source of income for the child and their family, this comes at a personal cost to the child and their safety.
The followers of these types of accounts tend to overwhelmingly be men with a sexual interest in children. In addition, the images child influencers share are often reposted in online forums and chats amongst groups of users who comment on and sexualize these children. This heightens the risk to the individual child and to children generally.
The way social media works makes it easy for those who have a sexual interest in children to not only find child social media accounts, but also to connect with like-minded individuals who share their sexual desires about children. Images of these children are then shared within these groups to fuel sexual discussions about the child. This is likely to have repercussions for the child, extending into adulthood.
Adding fuel to the fire are algorithms. Once a user of a social media platform engages with content in some way, such as liking it or sharing it on their own account, the algorithms are tuned to ensure that the user will see even more of that type of content. The algorithms effectively amplify the content within certain user groups and connect users together who may not otherwise have been connected.
The reality is that social media is focused on ways to increase user engagement, as that is what makes these companies money. To maximize engagement—and thus profits—social media companies have developed these sophisticated algorithms, which help ensure that users see more of the content they like.
This is a complex child safety issue. As such, we welcome measures by the federal government to tackle the companies' role in this directly, as well as measures by provincial governments to tackle it through existing child welfare and labour legislation.
On the federal side, we need to impose a duty of care on platforms that have Canadian users or that use content depicting Canadians. Mandating basic safety and design expectations that need to be adhered to is critical. We also need to mandate the detection and removal of known CSAM. Young people need easily understandable and readily accessible ways to have content involving them removed quickly.
Bill C-63, introduced in the last Parliament, while not expressly dealing with child influencers, would have started Canada in a positive, meaningful direction toward tackling issues like this. It would have imposed a duty of care on companies. It contemplated the development of an age-appropriate design code for children, and it included specific measures to ensure that certain types of sexual content were promptly removed.
We would also add that there is an urgent need to implement age assurance tools and to increase the use of tools like Project Arachnid to enhance removal and to prevent the re-upload of CSAM.
Thank you for allowing us to be part of this committee's study.