Good afternoon.
Madam Clerk and members of the committee, I am very glad to be here this afternoon.
My name is Francis Fortin, and I am an assistant professor at the Université de Montréal's School of Criminology, as well as a researcher at the International Centre for Comparative Criminology. The focus of my research is cybercrime and the sexual exploitation of children on the Internet. Before moving into research, I spent 12 years working in cyber investigation and criminal intelligence at the Sûreté du Québec. I've authored a number of scholarly articles and three books, as well as a dozen or so chapters on cyberpedophiles.
Since my time is limited, I have divided my presentation into three parts. First, I will discuss options to encourage corporate compliance. Second, I will talk about ways to support and guide victims. Third and finally, I will address prevention and research.
Before I get into that, though, I want to say a few words about the current context. If you ask law enforcement agencies to break down the cases they deal with, two main categories emerge. The first category involves minors, and in those cases, a fast lane of sorts exists. Canada has a series of legal measures that make it easier to remove some child pornography content.
The second category involves adults, and the law is more vague in relation to those cases. For example, an adult who files a police complaint can be told that their case is a civil litigation matter. One of the witnesses gave such an example earlier. Basically, it's considered a civil matter, and the burden of taking the necessary steps falls on the complainant. As I see it, that's problematic.
Keep in mind that the revenge porn trend emerged a few years ago and shows no signs of slowing. As far as I know, Canada still has no active measures that allow authorities to take action in those cases.
Now I will turn to solutions, or ways to encourage corporate compliance. The key is to hold adult content providers accountable. One of this morning's witnesses mentioned the use of digital signatures. A number of worthwhile initiatives already exist, deployed mainly by law enforcement: police keep child pornography databases and rely on digital signatures of the images. Someone alluded to electronic fingerprinting earlier. It must be possible to share those signatures across all platforms, including the GAFA platforms: Google, Apple, Facebook and Amazon. I know that Google and Facebook use lists they obtain in the United States. These platforms should be required to block content that has previously been deemed illegal.
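As a rough illustration of the idea only, and not any particular platform's actual system, the sketch below shows how an uploaded file's signature might be checked against a shared list of signatures of content already deemed illegal. The function names and the sample hash are hypothetical, and real matching systems such as PhotoDNA rely on perceptual hashes that tolerate re-encoding rather than the exact cryptographic hash used here.

```python
# Minimal sketch of signature-list matching (hypothetical names throughout).
# Real deployments use perceptual hashing, not the exact SHA-256 match shown here.
import hashlib

def file_signature(path: str) -> str:
    """Return a SHA-256 digest of the file's bytes (an exact-match signature)."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def should_block(path: str, known_illegal_signatures: set[str]) -> bool:
    """Block the upload if its signature appears on the shared list."""
    return file_signature(path) in known_illegal_signatures

# Example usage: a platform loads the shared list once, then screens each upload.
# known = {"3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b"}
# if should_block("upload.jpg", known):
#     ...  # suspend access to the content and report it to the authorities
```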
The requirement to report content is another option, although it remains a thorny issue. A tremendous effort is needed to educate web giants on the importance of reporting. The current approach tends to involve removing the content and claiming that nothing can be done. Things are even worse on the web giants' platforms, where they refuse even to remove the content. That is a far cry from relying on the platforms for co-operation and encouraging them to report issues to the authorities. Reporting is essential for investigating suspects who repeatedly engage in this behaviour.
Another option is to prevent content from being shared anonymously. It's easy to see how knowing and validating the identity of the individuals who spread this content would significantly decrease the risks associated with illegal content. That would give platforms trustworthy content providers, since new users would have to undergo verification before earning the platform's trust.
Litigation is another avenue, as one of the witnesses mentioned. One of the benefits of involving the police is that they assess the complaint to determine whether it is founded.
I think that's an important step. I don't think platforms, content providers or anyone else should be doing an assessment of the complaint, especially in cases where there is a consensus. I'll come back to that point later.
The prompt removal of the content in question is an important consideration.
In all the cases you've heard about, there's one thing to remember: it's a race against time. In order for the parties to satisfy their legal obligations, it may be appropriate for companies to suspend access to the content immediately, as soon as it is confirmed that there are reasonable grounds for doing so, even before guilt has been established. In this scenario, reasonable grounds alone would trigger the prompt suspension of access to the content.
I think it's important to consider issuing an operating licence as a way to support all of these measures. Companies would have to satisfy those compliance requirements in order to operate. It could be done through the adoption of an ISO standard or the issuing of a licence to operate in Canada.
The second thing I'd like to talk about is support and guidance for victims.
It's clear from their stories that they found themselves fighting the situation on their own. They were up against something that they didn't understand, something that had never happened to them. Obviously, that's extremely difficult.
Basically, there has to be a shift towards victim support. That means creating a new position, a victim liaison of sorts, who would help and guide victims. As soon as problematic content on a platform was flagged, that liaison would get involved.
Whenever a new case came to the attention of police or other front-line workers, they would contact the person designated to guide and support the victim. That person's role would be to quickly assess the complaint and respond swiftly and appropriately. Establishing such a role would help victims because the liaison worker would be familiar with the process, know what steps to take and know who to contact at the main providers. That would prevent the cat-and-mouse game the victim gets caught up in, figuring out on her own what to do and who is responsible under the law. There would be a single person dealing with the different platforms.
A list could be drawn up outlining the steps to take when an incident of this nature occurs, similar to the process in the case of an accident. On one hand, police handle the investigation and deal with the criminal aspect, and on the other, the liaison steps in to manage the accident, so to speak. Furthermore, that person could—should, in fact—have the necessary powers to be effective.
The liaison could work with police and organizations involved in preventing sexual exploitation. In fact, I could readily see victims' groups, even the Office of the Privacy Commissioner of Canada, taking on that role in the future.
A novel approach would be to establish a special victims task force, which would bring together police and liaison workers, and have all of the necessary legal tools to track down content. The task force would, of course, uncover information about suspects, but would not be responsible for the follow-up. The information would be turned over to the appropriate investigative authorities, and the task force would focus on tracking down content and ensuring platforms comply with the new measures. If Canada were to introduce an operating licence system, as I mentioned earlier, it would make the task force's job easier, as would having the contact information of those in charge.
That brings me to my third point. I want to underscore the importance of focusing on prevention in schools.
A continued focus on awareness is needed to make sure young people understand the significance of pictures and videos. Victims readily put their trust in people or technology. Many cases involve young people who trusted apps and sites like Snapchat because they felt secure knowing that the content would be removed. They ended up realizing, however, that their pictures and videos were shared without their consent.
Lastly, I want to stress how relevant research is.
In Canada and the U.S., we have no solid body of evidence on this phenomenon. The sexual exploitation of children on the Internet is hard to measure. I recommend that the government adopt measures to make it easier to access data, so that researchers like me can build a body of evidence to effectively inform public policy.
I have been working on this problem for nearly 20 years now. I don't think we can rely on the industry to regulate itself. That's quite clear from the stories you've heard.