Good morning, Chairperson and distinguished members of the committee. Thank you for giving us this opportunity to present.
I am Lianna McDonald, executive director of the Canadian Centre for Child Protection, a charity dedicated to the personal safety of children. Joining me today is Lloyd Richardson, our director of technology.
By way of background, our agency operates Cybertip.ca, which is Canada’s tip line for reporting the online sexual exploitation of children. The tip line has been operating for over 18 years and currently receives, on average, 3,000 or more public reports per month.
Our agency has witnessed the many ways in which technology has been weaponized against children, and how the proliferation of child sexual abuse material, otherwise known as CSAM, and non-consensual material causes ongoing harm to children and youth. Over the last decade, there has been an explosion of digital media platforms hosting user-generated pornographic content. This, coupled with a complete absence of meaningful regulation, has created a perfect storm in which transparency and accountability are missing. Children have been forced to pay a terrible price for this.
We know that every image or video of CSAM that is publicly available is a source of revictimization for the child in that image or video. For this reason, in 2017 we created Project Arachnid. Processing tens of thousands of images per second, this powerful tool detects known CSAM so that this illegal and harmful content can be quickly identified and removed. Project Arachnid has provided our agency with an important lens into how the absence of a regulatory framework fails children. To date, Arachnid has processed more than 126 billion images and has issued over 6.7 million takedown notices to providers around the globe. We keep records of every notice we send, of how long it takes a platform to remove CSAM once advised of its existence, and of the reuploading of the same or similar images to platforms.
At this point, we would like to share what we have seen on MindGeek’s platforms. Arachnid has detected and confirmed instances of what we believe to be CSAM on their platforms at least 193 times in the past three years. These sightings include 66 images of prepubescent CSAM involving very young children; 74 images of indicative CSAM, meaning that the child in the image appears pubescent and roughly between the ages of 11 and 14; and 53 images of post-pubescent CSAM, meaning that sexual maturation of the child may be complete and we have confirmation that the child in the image is under the age of 18.
We do not believe the numbers above are representative of the scope and scale of this problem. They are limited to obvious CSAM of very young children and of identified teenagers. There is likely CSAM involving many other teens that we would not know about, because we know that many victims and survivors are trying to deal with removal on their own.
MindGeek testified that moderators manually review all content that is uploaded to their services. This is very difficult to take seriously. We know that CSAM has been published on their website in the past. We have some examples to share.
The following image was detected by Arachnid. This image is a still frame taken from a CSAM video of an identified sexual abuse survivor. The child was pubescent, between the ages of 11 and 13, at the time of the recording. The image shows an adult male sexually assaulting the child by inserting his penis in her mouth. He is holding the child’s hair and head with one hand and his penis with the other hand. Only his midsection is visible in the image, whereas the child’s face is completely visible. A removal request was generated by Project Arachnid. It took at least four days for that image to come down.
The next example was also detected by Project Arachnid. It is a CSAM image of two unidentified sexual abuse victims. The children pictured in the image are approximately 6 to 8 years of age. The boy is lying on his back with his legs spread. The girl is lying on top of him with her face between his legs. Her own legs are straddling his head. The girl has the boy’s penis in her mouth. Her face is completely visible. The image came down the same day we sent the notice requesting its removal.
We have other examples, but my time is limited.
While the spotlight is currently focused on MindGeek, we want to make it clear that this type of online harm is occurring daily across many mainstream and not-so-mainstream companies operating websites, social media and messaging services. Any of them could have been put under this microscope as MindGeek has been by this committee. It is clear that whatever companies claim they are doing to keep CSAM off their servers, it is not enough.
Let's not lose sight of the core problem that led to this moment. We've allowed digital spaces where children and adults intersect to operate with no oversight. To add insult to injury, we have also allowed individual companies to decide the scale and scope of their moderation practices. This has left many victims and survivors at the mercy of these companies, which alone decide whether or not to take action.
Our two-decades-long social experiment with an unregulated Internet has shown that tech companies are failing to prioritize the protection of children online. Not only has CSAM been allowed to fester online, but children have also been harmed by the ease with which they can access graphic and violent pornographic content. Through our collective inaction, we have facilitated the development of an online space that has virtually no rules, certainly no oversight, and that consistently prioritizes profits over the welfare and protection of children. We do not accept this standard in other forms of media, including television, radio and print. Equally, we should not accept it in the digital space.
This is a global issue. It needs a globally coordinated response with strong, clear laws that require tech companies to do the following: implement tools to combat the relentless reuploading of illegal content; hire trained and effectively supervised staff to carry out moderation and content removal tasks at scale; keep detailed records of user reports and responses that can be audited; be accountable for moderation and removal decisions and for the harm that flows to individuals when companies fail in this capacity; and finally, build in, by design, features that prioritize the best interests and rights of children.
In closing, Canada needs to assume a leadership role in cleaning up the nightmare that has resulted from an online world that lacks any regulatory or legal oversight. It is clear that relying upon the voluntary actions of companies has failed society and children miserably. The time has come to impose some guardrails in this space and show the leadership that our children deserve.
I thank you for your time.