Evidence of meeting #38 for Access to Information, Privacy and Ethics in the 43rd Parliament, 2nd Session. (The original version is on Parliament’s site, as are the minutes.) The winning word was online.

A video is available from Parliament.

On the agenda

MPs speaking

Also speaking

Charles DeBarber  Senior Privacy Analyst, As an Individual
Arash Habibi Lashkari  Assistant Professor, Faculty of Computer Science, University of New Brunswick and Research Coordinator, Canadian Institute for Cybersecurity, As an Individual
Melissa Lukings  Juris Doctor Candidate and Advocate and Cybersecurity Researcher, As an Individual

11:55 a.m.

Conservative

The Chair Conservative Chris Warkentin

Colleagues, we will suspend our meeting just for a moment until we get the next witnesses lined up, then we will call this meeting back to order.

The meeting is suspended.

Noon

Conservative

The Chair Conservative Chris Warkentin

[Technical difficulty—Editor] We have a number of witnesses. We have Charles DeBarber, who is a senior privacy analyst.

We have Arash Habibi Lashkari, who is an assistant professor in the Faculty of Computer Science at the University of New Brunswick and a research coordinator at the Canadian Institute for Cybersecurity.

I'd like to welcome back Melissa Lukings, as well, who is a juris doctor candidate and an advocate for cybersecurity research.

I know you'll have some opening statements, so we'll turn to Mr. DeBarber to begin.

Noon

Charles DeBarber Senior Privacy Analyst, As an Individual

Hello. Good afternoon. My name is Charles DeBarber and I'm a senior privacy analyst with Phoenix Advocates and Consultants. My background is U.S. Army cyber-intelligence and cybersecurity.

I began my work with victims of non-consensual pornography, or NCP, in 2015, when I worked for the elite firm Fortalice. As the program manager for open source intelligence, I assisted victims of NCP through our reputation services. Since departing Fortalice in 2018, I have done freelance work on behalf of victims of revenge porn, extortion schemes and cyberstalking, and on purging content for victims of human trafficking. I've written bespoke information guides for clients to help protect their digital privacy and to reduce the chances of their being a target of successful doxing.

My background gives me deep insight into the sources of content on the Internet, and today I want to share with you guys some knowledge about the surface web, deep web and dark web. In addition, I'd like to share some research about the sources of adult NCP on these three layers.

As a disclaimer, I want to be clear that my data regarding NCP is limited in a few ways. First, my data is limited to the 90-plus cases that I've undertaken since 2019. You'll see these are sourced as “PAC Research 2016 to 2021”. I recognize there's a selection bias to that data due to it being from only our casework. Second, much of my information on NCP involving children is largely anecdotal, as I've never produced statistics on it. In addition, the bulk of my work has been with adult victims. Third, I am discussing the concepts of surface web, deep web and dark web and how they relate to the volumes and types of NCP often found on them. This is not to paint any of these layers as good or bad. The dark web has an especially heinous reputation, but remember that there are people who use the dark web to subvert censorship or express their free speech in countries where freedom of speech is very limited.

You'll see in the handout the beautiful iceberg graph that is commonly used to explain the three layers. You have surface web, deep web and dark web. We'll start with the surface web.

The surface web is basically the Internet content indexed by search engines themselves and things you can directly jump to from search engines. It's aggregated web content that can be found with web crawlers, also known as spider bots or spiders. Make note of that, because it is very important for one of the points I'll make later. The surface web is the minority of online content, around 4% to 5%.

What's the deep web? That's the majority of the web, more than 90% of it. It's Internet content that's not part of the surface web and is not indexed in search engines. It's mostly content that is not readily accessible through standard means, such as search engines. As I said, it's the majority of content on the Internet.

Then there's the dark web. It's part of the deep web, but what makes it different is that you have to use encryption software and special software to access it—things like Tor Browser or Freenet or Freegate. It's also used interchangeably with dark net. It can be called both.

NCP comes in many forms. Some of the key forms for adult victims include revenge porn, non-consensual surveillance, human trafficking and data or device breaches. We have the following statistics from our casework. The majority of adult NCP, 73.5% of our cases, was found on the surface web. We believe that the reason for this is that adult NCP pornography easily blends in with amateur pornography. The ease of use and popularity of video- and image-sharing sites on the surface web is the main cause of this.

On top of that, the deep web accounts for about 23.2%. These are often private forums for pirated content, BitTorrent sites, and VoIP and messaging apps like Discord communities. The more compartmented nature of the deep web leads to a lower volume of content that is also less viral.

The dark web accounts for little of our content. Content there, in our experience, includes things that we consider highly illegal, things you would find only on the dark web because they are highly illegal. This could be things like hidden bathroom cam footage, extremely violent content, child pornography and bestiality. NCP blends in with amateur pornography and is readily available on upper layers. There's no reason to go to the dark web for it. Only a minority of Internet users have enough expertise and knowledge of the dark web to use it anyway. The even more compartmentalized nature of the dark web just keeps people off it. This results in more extreme and illegal content being relegated to the dark web.

In our casework, only about 3.3% is dark web content.

There are a few observations I would like to share with the committee. I've removed over 100,000 pieces of NCP content in the last five years. My average client has between 400 to 1,200 pieces of content, and that could be the same picture, video or handful of pictures, but it's shared on many different sites. Viral content itself can be upwards of 6,000 pieces of content and above. Very rarely do I utilize the NCP removal processes created by search engines such as Google or Bing or social media like Facebook, Twitter or Reddit.

I normally use the copyright removal process here in the United States, known as the Digital Millennium Copyright Act. The NCP process often is more complicated and takes longer for victims who have to follow it for every piece of content. Imagine, if you have 400 pieces of content out there, that might be 400 different applications you have to put out. These companies, frankly, respect intellectual property more than victims, because the copyright process is so much easier.

The removal process is costly in both time and resources. I utilize automation, which is not cheap. For a client with more than 400 pieces of content, it would usually cost $2,000 for automated removal and $5,000 for bespoke removal services, and that just mitigates the problem. Victims using it manually require a certain level of understanding of information systems, search engines and web caching, and that is if the victim can find most of the content without using automated aggregators. My junior analysts, some of them with information systems and computer science backgrounds, take up to a month of hands-on work to learn how to effectively purge content. The average victim is expected to have this expertise if they cannot afford professional services. The tools for victims to effectively mitigate their digital footprint of content aren’t readily available.

Great strides have been made to get Silicon Valley to recognize the issue, and I don’t wish to demean those efforts or that recognition. Laws in my home country are now in 48 states and two territories to protect victims of NCP. However, picking up the pieces after NCP floods surface web sites is still an uphill battle. We’ve worked tirelessly so clients can google their name without NCP coming up. One of our clients lives in fear of her 10-year-old using the computer and googling her name. Others have lost job opportunities, housing opportunities and relationships. Many of our clients have contemplated or attempted suicide.

Finally, video upload sites that allow pornography, such as Pornhub or Xvideos, have exacerbated the problem. This is one of the big points I want to make. Content goes viral a lot faster with these sites, and these sites use what is called search engine optimization to flood Google with their content. Even if the content is deleted within 72 hours, it often takes days, frankly, for a victim to even find out that they're a victim. Smaller video upload sites then aggregate this material from search engines and repost it, making this a feedback loop that keeps feeding the search engines and makes it a viral issue.

The issue has become so significant that when a victim’s name is posted in a video title that they're aggregated in and it's then used in search engine keywords for porn sites that don't even have their content, it just becomes a random keyword—their name—and God forbid you have a unique name. Imagine googling your name, and hundreds of porn sites coming up because your name is a keyword empowered by SEO techniques.

We need to find a balance between verification and privacy. That's very easy for me to say, but sites having a reasonable policy for age verification is required. I compliment Pornhub in adopting a verified content policy in late 2020. I'm very angry [Technical difficulty—Editor] and I badly want them held accountable for that, but I want to make sure it's also not so cumbersome that sex workers who are free agents can't operate without reasonable privacy.

Search engines—and this is a key one, and I would recommend you put this forward, or at least encourage them to change their policies—shouldn't allow indexing from adult video image upload sites that do not come from verified accounts. This means that, with verified accounts, the spiders can be turned on so that they can feed into Google, Bing and so on. However, spiders should be turned off on any website where any Joe Schmo can come and upload content, whether it be videos or images. They should be turned off on that content until it is verified. That keeps it from hitting search engines in 72 hours.

Remember, with all NCP, you're really fighting time, and that keeps it from going viral a lot more quickly, quite frankly. It makes the clean-up process significantly better, and it can mitigate it. Furthermore, it would probably protect the intellectual property of other sex workers. As I said, Pornhub and other major tube sites have more or less put NCP into the express lane via SEO techniques.

Finally, the doxing of victims and sex workers is a very serious issue. Despite many of my clients being Jane Does, I can't get Google to delist web pages that post the real names of victims. I wish there was a policy that allowed the delisting of the real names of Jane Does, of sex workers, that exist on sites such as the defunct Porn Wikileaks, which were very dangerous for them and were made for doxing victims.

I'm very open to questions you may have and appreciate your welcoming me today. I'm honoured to be here.

Thank you.

12:10 p.m.

Conservative

The Chair Conservative Chris Warkentin

Thank you, Mr. DeBarber.

Professor Lashkari, we'll turn to you for your opening statement.

12:15 p.m.

Dr. Arash Habibi Lashkari Assistant Professor, Faculty of Computer Science, University of New Brunswick and Research Coordinator, Canadian Institute for Cybersecurity, As an Individual

Thank you so much.

Good afternoon, everyone. I think Mr. DeBarber mentioned most of the content that I wanted to share with you, but maybe I'm talking from another perspective, as a researcher. I'm also going share some of my latest findings, which I have already published.

As a short bio, I am Arash Habibi Lashkari, assistant professor in the faculty of computer science at UNB, research coordinator at the Canadian Institute for Cybersecurity and also a senior member of the IEEE.

In the past two and a half decades, I have been involved in different projects related to designing, developing and implementing the next generation of detecting and preventing disruptive technologies in academia and industry.

Actually, on the academic side, I can share with you that I have over 20 years of teaching experience spanning several international universities. On the research side, I have published 10 books and around 90 research articles on a variety of cybersecurity-related topics. I have also received 15 awards in international computer security competitions, including three gold medals. In 2017, I was recognized as one of the top 100 Canadian researchers who will shape the future of Canada. My main research areas are Internet and Internet traffic analysis, malware detection and also threat hunting.

As has been requested here, today I am talking about the dark and deep web and also the dark and deep net, but I'm trying to make it simpler so that it's possible to easily visualize and so that everybody can imagine it.

We have three layers, and the first one, which is the common layer, we call the “surface web”. This is everything that is available and open, everything that can be found as you search the different search engines such as Google, Bing, Baidu and others. We call this the “indexed web”, which means the websites that have been indexed by the search engines.

The second one is the deep web, which is the portion of the Internet that is hidden from the search engines, and we call this “unindexed web”. It includes mainly personal information, such as payment information, medical records and corporate private data, or when, for example, we are using a VPN, a virtual private network, to connect to these contents.

The third one is the dark web, and this portion is certainly hidden from search engines and actually includes the www content that exists on darknets. These websites can be accessible to special software and browsers that allow the users and also the website operators to remain anonymous and untraceable. There are several projects going on here to support the dark net, such as Tor, The Onion Router; I2P, the Invisible Internet Project; and also Riffle, which is the collaborative project between MIT and EPFL in response to the problems we have with the Tor network.

What is the source of the basic darknet? In 1971 and 1972, two Stanford students, using an ARPANET account at the AI laboratory, tried to engage in a commercial transaction with their counterparts at MIT. This means that before Amazon and before eBay, the seminal act of e-commerce was a drug deal, and the students used this network to quietly arrange for the sale of an undetermined amount of marijuana through the precursor to the Internet we know today.

What is the new version of the darknet, or the modern darknet? In 1990 the lack of security on the Internet—and its ability to be useful in tracking and surveillance—became clear, and in 1995 three guys from NRL, which is the U.S. Naval Research Lab, asked themselves if there was any way to create Internet connections that didn't reveal who was talking to whom, even to someone, for example, monitoring the network. The answer was onion routing.

The goal of onion routing was to have a way to use the Internet with as much privacy as possible, and the idea was to route traffic through multiple servers and encrypt it each step of the way, making it completely anonymized.

In 2000, one student from MIT—Roger—had already started to work with one of these guys at the NRL and created a new project named Tor, or The Onion Router. After that, in 2006, another student or classmate joined this team. They received funds from the EFF, and officially in 2006 they opened this non-profit organization.

My latest research results—all of them have been published in 2016, 2017 and 2020—show that it is possible to actually detect users who are connecting to the dark or deep web in a short period of time—around 10 to 15 seconds. Also, we can detect the type of software or application they are using, but from their machine, not from the Internet. From the Internet, everything is completely anonymized, but from the actual user's machine it is possible to detect their activity somehow.

I am completely ready for any question if the committee asks.

Thank you.

12:20 p.m.

Conservative

The Chair Conservative Chris Warkentin

Thank you, Professor.

We're going to turn to you, Ms. Lukings. Thanks so much for joining us again this morning.

June 7th, 2021 / 12:20 p.m.

Melissa Lukings Juris Doctor Candidate and Advocate and Cybersecurity Researcher, As an Individual

Hello, friends. I feel like most of us have met before, but in case we haven't, I'll quickly introduce myself.

My name is Melissa Lukings. I'm a juris doctor candidate in the University of New Brunswick's faculty of law. I'm also a cybersecurity law and legal researcher, an alumnus of Memorial University of Newfoundland with a B.A. in linguistics, and a social justice and legal reform advocate. I have intersectional lived experience as related to previous testimonial evidence, which was invited to be heard by this committee before. I sent in some handouts. Everyone can read about my background there. I don't really want to waste time on that. I just want to go right into what I wanted to say.

My message to you today, basically, is one of concern at the overbroad and ambiguous nature of some of the proposed legislation that has been put forward.

Here are the issues.

We're being told that the rationale behind the proposed regulations and the push for digital content censorship is to prevent the prevalence and dissemination of non-consensual pornographic material, child pornography and other abusive material, which tends to pop up mostly on the surface web, as we heard earlier. We also want to deter and detect illegal material, prevent it from being uploaded and, optimistically, reduce the instances of human trafficking done via a connection in Canada, and/or with some ties to Canada.

The last time I was here, I expressed my concern that creating more intensive regulations of any sort on surface web content will inevitably push fringe traffic onto dark forums, which are much more difficult to detect and where an influx of user access would saturate an already challenging area for law enforcement. As Dr. Lashkari pointed out, whereas you can detect dark web traffic from the user source computer, it cannot be detected in the Net, from inside, which presents a challenge.

We have some graphics that we've created. They're all in your handouts. They explain how all the different aspects of the dark web work, so if you have any questions, we have illustrations for that.

When I was last here, the response was that it's not the intention of the federal government to push human trafficking, sexual exploitation, illegal content, violence, child porn and all of that onto the dark web. That's great.

Also, as a side note, I really enjoyed being a professor for, like, a minute in your last meeting. Thanks. That was super fun. I made a GIF.

True, we don't want to push these things onto the dark web, and that's great. You wouldn't want to sweep these under the metaphorical rug that is the hidden Internet, yet we're continuing to discuss the creation of additional regulations as if there's not a direct consequence of doing so, even though there is. It's not just a matter of NIMBY or not in my backyard when it comes to illegal content. Hiding it doesn't make it go away. It just hides it from sight, which isn't really a way to address these issues.

On point number four on my notes, when I was last here, I found it really frustrating that the adult entertainment issue and sex work in general had been conflated with sexual exploitation, abuse and trafficking within discussions at this very committee.

Indeed, MP Arnold Viersen was so taken by the emailed testimony of people with common experiences in commercialized sexual activity that he felt it was appropriate to waste his speaking time reading out victim porn-type emails from unknown persons, rather than engaging with the spoken testimony of people who also had common experiences in commercialized sexual activity and who had been invited to be heard at the committee hearing.

That's not okay. Hearings are usually for being heard. You're supposed to be hearing from the people who you invite and who are to be heard at your hearing. That's why it's called a “hearing”. Anyway, that's that.

Through highly inaccurate media portrayals, the dark web has become nearly synonymous with illegal activities. However, it is also used....

12:20 p.m.

An hon. member

[Inaudible—Editor]

12:20 p.m.

Juris Doctor Candidate and Advocate and Cybersecurity Researcher, As an Individual

Melissa Lukings

Chris, are you okay? Do you want me to stop?

12:25 p.m.

Conservative

The Chair Conservative Chris Warkentin

We were just.... Pardon me. I apologize.

12:25 p.m.

Juris Doctor Candidate and Advocate and Cybersecurity Researcher, As an Individual

Melissa Lukings

No worries.

The committee is dealing with a Canadian-controlled private corporation, a CCPC, which is a private commercial organization based in and operating with headquarters located in Canada. It is a Canadian company. We know this, and that's fine. Commercial organizations in Canada are bound by the Personal Information Protection and Electronic Documents Act. PIPEDA outlines the rules and remedies, including the fines and other penalties, for corporations that fail to abide by the provisions specified in the act.

Beyond the corporate level, we also have the Criminal Code of Canada, which outlines the criminal offences and punishments for committing such offences. We have these. We need to apply them. Everyone is bound by the Criminal Code of Canada.

Why, then, do we need additional regulations? Why do we need more oversight when we have not yet tried to simply apply the law we already have? We have these laws. We can use them, so let's use them. That's what they're for. What's the point in even having these statutes if you're not going to apply them when they're needed? What are we doing here?

We're here because a portion of those involved have decided to conflate the issue of corporate negligence with highly sexualized and emotive criminal activity—read again, child rape porn testimony. It elicits an emotional response—the sympathetic nervous system and all of that. It doesn't matter. This is about a corporation and user-generated content. It does not matter what is depicted in the content as much as it matters that the content, whatever it may be, should not have gotten past the corporation's screening system before being made live on the site. When the issue was brought to its attention, the corporation responded inadequately at first, so we need corporate law. We need to look at liability and feasibility standards.

Why has this become a forum for grandstanding religious ideologies? I'm sure you've all heard about Exodus Cry in the news, if you've been following it. Exodus Cry is a fundamental Christian organization founded on religious ideologies stemming from the United States. Why is it relevant to a question of corporate liability in Canada? It isn't. It doesn't make any sense.

Why are we arguing about exploitation? Why are we discussing mass censorship? Is that not a massive overreaction to a simple corporate negligence question? It seems glaringly obvious to me, so why are we not discussing reasonable options for encouraging corporations to better serve their users?

Also, I have some opinions about the genderedness of this. You can read about it in my notes.

When it comes down to it, you can't eliminate sex. We're humans, and there is always going to be a demand for sex. You can't eliminate sex work because the demand exists. You can't eliminate extramarital sex or porn or masturbation or demand for sexual services, but sexual assault is illegal, even when that person is your spouse. We need it to be that way. We want to protect people. If you're saying you can do certain things only within the context of marriage, you're setting yourself up for failure. It's true.

Yes, I said “masturbation” in a hearing. Oh my God.

You cannot eliminate base human desires, so you can't eliminate sex. That would be silly. It's okay to not like these things, and just because you don't like a thing or you feel that a thing is not for you, it doesn't mean it's inherently evil and should be eliminated. It doesn't work that way. It's not about and should not be about pornography or the actual content of online material here. This is about creating reasonable laws that work for Canada, Canadian corporations and everyone residing within Canada. We don't need new regulations; we don't need a new regulator, and we don't need online censorship. We need to use the tools we already have, which were designed for a reason. Why be redundant?

That is my diatribe.

Thank you for having me. I will take any questions you throw at me.

12:25 p.m.

Conservative

The Chair Conservative Chris Warkentin

Thank you.

Colleagues, we will begin with rounds of questions.

I want to highlight that I am getting notice that there is a possibility there will be a vote in the House of Commons. I will proceed with questions through the bells if there is consent from committee members. As we get closer to the vote, we'll suspend if need be, but I am hopeful that will not be the case.

Mrs. Stubbs, we'll begin with you for the first round.

12:30 p.m.

Conservative

Shannon Stubbs Conservative Lakeland, AB

Thanks, Chair.

Melissa, thanks for your testimony and for being here today.

I share your perspective that it is crucial to distinguish between the hosting and distribution of child sexual abuse material and of material and images that don't have the explicit consent of the people depicted in them.

I think you'd agree—or let me know if you do—that people have a right to own their own images and content that include them, and also the right to withdraw that if they so choose. This is the thing that I think all of us are grappling with—your very strong point about the Criminal Code already being in place and the laws and the regulations that already exist to provide these protections for children and for others who do not give their consent.

What do you make of what the actual problem is, then? What is the enforcement issue, the lack of enforcement and the lack of application of the existing law?

12:30 p.m.

Juris Doctor Candidate and Advocate and Cybersecurity Researcher, As an Individual

Melissa Lukings

I think the current issue is that perhaps the penalties that currently exist in PIPEDA are not strong enough to deter corporations. I'm not saying to put in new regulations—I'm not saying that—but when you're going to do the digital charter implementation act and you're discussing things like Bill C-10 and Bill C-11, it's important to remember that.

I think there is room for improvement. Because we've found that financial penalties don't really seem to impact companies that make a lot of money, fines could instead be based on percentages. The key here is that we need to not have increased regulation. If what we're trying to do is in fact what we say we're trying to do, which is to reduce human trafficking and harm to young people, additional regulations are not going to help that.

Did I answer your question?

12:30 p.m.

Conservative

Shannon Stubbs Conservative Lakeland, AB

Yes.

On April 19 you mentioned a couple of possibilities related to the digital charter implementation act. You touched on the possibility of fines for companies that host and distribute already illegal content. The Minister of Heritage was just here, as you know, so I just wonder if there is.... I understand that you got cut off in your testimony last time, so I just want to see if there are any other details or recommendations you wanted to add in terms of that work.

12:30 p.m.

Juris Doctor Candidate and Advocate and Cybersecurity Researcher, As an Individual

Melissa Lukings

In terms of the digital charter implementation act?

12:30 p.m.

Conservative

Shannon Stubbs Conservative Lakeland, AB

Yes.

12:30 p.m.

Juris Doctor Candidate and Advocate and Cybersecurity Researcher, As an Individual

Melissa Lukings

For corporations the question here is, how much responsibility do they have to have in order to cover their own selves from liability for negligence? That needs to be specified. It needs to be put in words.

Other than that, we really need to work on applying the laws that we have, so if there's something standing in the way of that and that can be remedied through the new digital charter implementation act, that should be discussed, absolutely. That is my recommendation.

12:30 p.m.

Conservative

Shannon Stubbs Conservative Lakeland, AB

Thank you.

I wonder if, from your work experience and your lived experience, you might want to expand on the importance of verification and consent. If platforms ever do that without your consent or your agreement, what are the commercial consequences, or the personal consequences in the case of adults who are choosing freely to engage in this work?

12:30 p.m.

Juris Doctor Candidate and Advocate and Cybersecurity Researcher, As an Individual

Melissa Lukings

We're talking about what are the consequences if someone consensually uploads their own material?

12:30 p.m.

Conservative

Shannon Stubbs Conservative Lakeland, AB

If an online platform were to host your material without an agreement with you or—

12:30 p.m.

Juris Doctor Candidate and Advocate and Cybersecurity Researcher, As an Individual

Melissa Lukings

That's intellectual property. That's a copyright issue right there. As a photographer, when you take photos, you have a model release form. These are all contractual issues that would arise. If someone doesn't have your permission to use the material, then that is a digital copyright infringement. That's an artistic thing. It's exactly the same as if someone were to host any artistic content anywhere without the permission of the artist. It's very similar to that.

Again, we have the Copyright Act for that.

12:35 p.m.

Conservative

Shannon Stubbs Conservative Lakeland, AB

I think this is probably what's mind-boggling to many of us on this committee and probably many Canadians listening. A colleague said to me recently that, somehow, organizations like ag societies and school fundraisers and Legions are put through mountains of paperwork and administration to, say, play certain songs or use certain visual material. Then there are also online sites, say, that sell cannabis or alcohol, or host gambling, and in those two cases the country seems fairly effective at having a set of laws and bylaws and policies and regulations for these organizations [Technical difficulty—Editor] seem to manage to enforce and crack down on all of that being done illegally.

12:35 p.m.

Juris Doctor Candidate and Advocate and Cybersecurity Researcher, As an Individual

Melissa Lukings

Yes. It's magic.

12:35 p.m.

Conservative

Shannon Stubbs Conservative Lakeland, AB

I would just give you the opportunity to expand on any other specific recommendations in terms of both the enforcement and protections to combat the proliferation of child sexual abuse material and other illegal content, while also maintaining free expression, privacy and the right of individuals to have ownership and choice over their own images.