Honourable Chair Diab and all members of the Standing Committee on Justice and Human Rights, thank you for the opportunity to be here today.
My first point is that, in Canada, our current legal framework addresses child sexual abuse and exploitation through the Criminal Code and legislation for the protection of children from sexual exploitation. However, we should not be relying on a broad duty of care by any Internet platform. There should be a law requiring the identification of, and immediate action to report and take down, illegal sexually explicit images. We need regulation that is fit for purpose and built on safety by design.
My second point is this. Bill C-63 reads, “reduce harms caused to persons in Canada as a result of harmful content online and ensure that the operators of social media services...respect...their duties under that Act.” This is a glitch. All Internet platforms need accountability, not just social media sites. It takes just three clicks to find child sexual abuse imagery or child sexual exploitation material on the regular Internet, and this includes images generated by artificial intelligence found on many, many online platforms, including the dark web. These images are disguised within websites and embedded in emojis and hidden links, requiring the viewer to follow a digital pathway that can disappear as quickly as the next link is clicked.
In 2022, the IWF found a 360% increase in reports of self-generated child sexual abuse imagery involving seven-year-olds to 10-year-olds, now more prevalent than non-self-generated content. This trend continued into 2023, when the IWF hashed 2,401 self-generated sexually explicit images and videos of three-year-olds to six-year-olds. Of those images, 91% were of girls showing themselves in sexual poses and displaying their genitals to the camera. It is normal for children to be curious, explore their bodies or experiment sexually, but that is not what the IWF found. What is shocking is the unsupervised access children have to digital devices.
My third point concerns guidelines for the protection of children in relation to regulated services, the age of consent to data processing and the use of social media. There is a duty to make certain content inaccessible. Caution should be used in passing regulation based on precedents set in other countries. We need to look in turn at all the international laws, treaties and conventions. A single guiding principle is found in article 5 of the UNCRC, concerning the importance of having regard for an individual child's “evolving capacities” at any moment in time in their interactions with the online world.
My fourth point concerns the establishment of a digital safety office of Canada, a digital safety commission and a digital safety ombudsperson. Could Canada benefit from establishing an online safety office and a children's commissioner or ombudsperson? The answer is yes, and several countries have been blazing a trail for us. These countries are part of a global online safety regulators network that aims to create a coordinated approach to online safety issues. Canada, sadly, is not at the table.
Last week, I was invited to attend a global summit in Abu Dhabi, sponsored by WeProtect and the UAE government. I was the only child protection representative from Canada, and I am a self-funded, third-party voice.
I have a few final thoughts.
It took 50 years from the development of the Gutenberg press to print 20 million books. It took Ford 10 years to produce 10 million Model Ts. It took Playboy approximately two years to sell over a million copies each month. It took the global Internet, in 1995, two years to reach 20 million users. It took Facebook 10 months to reach one million users. Today, Meta's ecosystem, including Instagram, WhatsApp and Messenger, has approximately 2.93 billion daily active users.
We need to close the gap between the rapid development of, and access to, the Internet and the regulation needed to govern it. We cannot continue a partisan approach, lacking civility, to developing the important regulations needed to protect children and vulnerable individuals.