Good morning, Madam Chair Shanahan and honourable members of the committee.
My name is John Clark. I am the president and CEO of the U.S.-based National Center for Missing and Exploited Children, sometimes known as NCMEC.
I am honoured to be here today to provide the committee with NCMEC's perspective on the growing problem of child sexual exploitation online, NCMEC's role in combatting the dangers children can encounter on the Internet, and NCMEC's experience with the website Pornhub.
Before I begin with my testimony, I'd like to clarify for the committee that NCMEC and Pornhub are not partners. We do not have a partnership with Pornhub. Pornhub has registered to voluntarily report instances of child sexual abuse material on its website to NCMEC. This does not create a partnership between NCMEC and Pornhub, as Pornhub recently claimed during its testimony.
NCMEC was created in 1984 by child advocates as a private, non-profit organization to help find missing children, reduce child sexual exploitation and prevent child victimization. Today I will focus on NCMEC's mission to reduce online child sexual exploitation.
NCMEC's core program to combat online child sexual exploitation is the CyberTipline. The CyberTipline is a tool for members of the public and electronic service providers, or ESPs, to report child sexual abuse material to NCMEC.
Since we created the CyberTipline over 23 years ago, the number of reports we receive has exploded. In 2019 we received 16.9 million reports to the CyberTipline. Last year we received over 21 million reports of international and domestic online child sexual abuse. We have received a total of over 84 million reports since the CyberTipline began.
A United States federal law requires a U.S.-based ESP to report apparent child sexual abuse material to NCMEC's CyberTipline. This law does not apply to ESPs that are based in other countries. However, several non-U.S. ESPs, including Pornhub, have chosen to voluntarily register with NCMEC and report child sexual abuse material to the CyberTipline.
The number of reports of child sexual exploitation received by NCMEC is heartbreaking and daunting. So, too, are the many new trends NCMEC has seen in recent years. These trends include the following: a tremendous increase in sexual abuse videos reported to NCMEC, reports of increasingly graphic and violent sexual abuse images, and videos of infants and young children. These include on-demand sexual abuse in a pay-per-view format, and videos showing the rape of young children.
A broader range of online platforms is being used to access, store, trade and download child sexual abuse material, including chat, video and messaging apps, video- and photo-sharing platforms, social media and dating sites, gaming platforms and email systems.
NCMEC is fortunate to work with certain technology companies that devote significant time and financial resources to measures to combat online child sexual abuse on their platforms. These measures include large teams of well-trained human content moderators; sophisticated technology tools to detect abusive content, report it to NCMEC and prevent it from even being posted; engagement in voluntary initiatives to combat online child sexual exploitation offered by NCMEC and other ESPs; foolproof and readily accessible ways for users to report content; and immediate removal of content reported as child sexual abuse.
NCMEC applauds the companies that adopt these measures. Some companies, however, do not adopt child protection measures at all. Others adopt half-measures as PR strategies to try to show commitment to child protection while minimizing disruption to their operations.
Too many companies operate business models that are inherently dangerous. Many of these sites also fail to adopt basic safeguards, or do so only after too many children have been exploited and abused on their sites.
In March 2020, MindGeek voluntarily registered to report child sexual abuse material, or CSAM, on several of its websites to NCMEC's CyberTipline. These websites include Pornhub, as well as RedTube, Tube8 and YouPorn. Between April 2020 and December 2020, Pornhub submitted over 13,000 reports related to CSAM through NCMEC's CyberTipline; however, Pornhub recently informed NCMEC that 9,000 of these reports were duplicative. NCMEC has not been able to verify Pornhub's claim.
After MindGeek's testimony before this committee earlier this month, MindGeek signed agreements with NCMEC to access our hash-sharing databases. These arrangements would allow MindGeek to access hashes of CSAM and sexually exploitative content that have been tagged and shared by NCMEC with other non-profits and ESPs to detect and remove content. Pornhub has not yet taken steps to access these databases or use these hashes.
Over the past year, NCMEC has been contacted by several survivors asking for our help in removing sexually abusive content of themselves as children from Pornhub. Several of these survivors told us they had contacted Pornhub asking it to remove the content, but the content remained on the Pornhub website. In several of these instances, NCMEC was able to contact Pornhub directly, which resulted in the content being removed from the website.
We often focus on the tremendous number of CyberTipline reports that NCMEC receives and the huge volume of child sexual abuse material contained in these reports. However, our focus should more appropriately be on the child victims and the impact the continuous distribution of these images has on their lives. This is the true social tragedy of online child sexual exploitation.
NCMEC commends the committee for listening to the voices of the survivors in approaching these issues relating to Pornhub. By working closely with survivors, NCMEC has learned that the trauma suffered by these child victims is unique. The continued sharing and recirculation of a child's sexually abusive images and videos inflicts significant revictimization on the child. When any website, whether it's Pornhub or another site, allows a child's sexually abusive video to be uploaded, tagged with a graphic description of their abuse, and downloaded and shared, it causes devastating harm to the child. It is essential for these websites to have effective means to review content before it's posted, to remove content when it's reported as child sexual exploitation, to give the benefit of the doubt to the child, parent or lawyer who reports content as child sexual exploitation, and to block the recirculation of abusive content once it has been removed.
Child survivors and the children who have yet to be identified and recovered from their abuse depend on us to hold technology companies accountable for the content on their platforms.
I want to thank you for the opportunity to appear before this committee. This is an increasingly important topic. I look forward to answering the committee's questions regarding NCMEC's work on these issues.