Thank you.
I'll just introduce myself briefly to the committee. My name is Michael Bowe. I'm a partner in Manhattan at the law firm of Brown Rudnick.
We have been investigating Pornhub and MindGeek, its parent, and its other sites for just about a year. Included in that investigation are hundreds of accounts similar to Serena's: children who had material of their exploitation posted on Pornhub, adult women who were raped and whose rapes were videotaped and put on Pornhub, trafficked women whose videos were put on Pornhub, and all sorts of other non-consensual content that has been put on Pornhub.
In the short time I have, I want to address four topics that hopefully will serve as somewhat of a road map for questions and follow-up: what is it that we're really here about; how did we get here; MindGeek's knowing decision to commercialize this type of conduct; and where do we go from here?
First, what are we here about? It's really a question of what we're not here about. In a second, I'll explain why I need to raise this right up front. This is about rape, not porn. It's about trafficking, not consensual adult performance or entertainment. This is not about policing consensual adult activity. It's not about religion. I think, even in these days, everybody can agree that no industry should be commercializing and monetizing rape, child abuse and trafficked content. I think we all expect that any legitimate business or industry wouldn't do so and would do whatever it could to make sure that type of content doesn't pollute its product.
Why am I raising this? I'm raising this because, for the last year, when public scrutiny started to be focused on MindGeek, a Canadian company, about the fact that it knowingly commercialized and monetized this type of content, instead of acknowledging the problem and aggressively dealing with it, what it has aggressively done is conduct a gaslighting campaign in the media and social media to discredit victims and deflect from the issue and blame it on other things. I'll talk about that in a minute.
This is a real problem. It's real in the sense that it happens; it's not isolated and it's awful. It's significant; it is not one or two people here and there or certain things that slipped through the cracks. As I'll explain in a minute, this type of content is part of the business model, and not just for MindGeek, which is of particular importance to this committee because it's a Canadian company, but for its competitors and in the industry.
To drive home how real it is, let me give you just a few examples of other victims we've talked to and verified.
A girl was raped at 15, and a video was posted on Pornhub and distributed through a community. Pornhub refused to remove the video for three weeks, then said it had been removed when in fact it wasn't removed for another two months, with several hundred thousand additional views, downloads and distribution in that community.
A child younger than 10 was sold into trafficking and was the subject of child pornography for almost 10 years. Those videos were distributed on various MindGeek platforms, where they remained until at least late last year.
A 15-year-old was secretly filmed via a computer hack and then extorted to do other videos. Those videos were posted on Pornhub with her personal information, distributed widely, including to her community and to her family, and subjected her to long-term abuse and stalking. When she raised the issue with Pornhub, it refused to search for the videos or take any other proactive steps to prevent their distribution. The trauma led her to consider suicide.
A woman was raped, the rape was videotaped, and the video was distributed on Pornhub, including through her community.
A 17-year-old was secretly recorded by an underage boyfriend, and the video was posted to Pornhub and distributed throughout her school community and to her family, subjecting her to harassment and extortion.
A woman was drugged and raped after meeting someone on a date. The rape was videotaped and posted on Pornhub. We believe it was sold on Pornhub by the person who posted it.
A 14-year-old was secretly recorded by her boyfriend, who posted the video to Pornhub and distributed it, again, through her school and community.
Child pornography of one individual posted on Pornhub had hundreds of thousands of views and an unknown number of downloads. When confronted, Pornhub failed to report it to the authorities. That's something I'll talk about in a second.
A 16-year-old was coerced into a sexual act that was videotaped and posted on Pornhub without her knowledge or consent.
A 16-year-old girl was trafficked by two American men who filmed the sexual acts as part of the trafficking. In fact, that was the very purpose for which she was offered. Those acts were posted to Pornhub. This individual is aware of other women in that trafficking ring who were sold for the same purpose.
An underage girl was trafficked for years by a business colleague of her father's. Videos were monetized on Pornhub. She reported the incident, but the videos were not taken down for an extended period of time.
An underage girl attempted suicide multiple times and turned to drugs after videos were posted on Pornhub.
Those are just a few examples. We've found many, many examples. We've investigated hundreds. We've talked to several dozen victims whom we've been able to verify. We've talked to advocates, investigators, media people, industry people and whistle-blowers. These are not isolated incidents. It's a real problem.
How did we get here? Well, we got here like we've gotten to many places at this stage in our culture—because the Internet was a major disrupter in the pornography industry. Prior to Tube sites, the pornography industry had a policing mechanism. There were statutes. We have section 2257 in the States. It requires anyone who's going to produce pornographic material to have written consent that says they've verified the age and that the stuff is consensual. If you were going to distribute it, if you were going to sell it, if you were going to stream it on the Acme Hotel Company entertainment centre, if you were going to put it on a cable channel, then everything you were going to distribute had to have that disclosure on it saying that in fact those rules had been complied with. That system worked relatively well. It wasn't perfect, but it worked.
Enter the Tube site, where anyone could post anything at any time. Millions and millions of videos were posted in a given year. In our view, section 2257 applies to much of MindGeek's business model. It might not apply to all of it. It's pretty clear that MindGeek's and the industry's view is that it doesn't apply at all. As a result, there was no requirement on the posters, and there was no compliance on the part of the Tube sites.
Then you add in how the business model for Tube sites works and search engine optimization. The goal, of course, is to end up number one in Google searches so that if someone types “porn” with a particular topic into Google, it will pull up your site first. All of these sites—MindGeek and its competitors—were basically in an arms race to be number one.
I don't have anywhere near enough time, nor probably enough understanding, to fully explain all the elements of search engine optimization, but I can tell you certain simple truths. Content is king. Search terms are king. Long search terms are king. Descriptions are king. The more content you have, the more titles you have, the more tags you have—all of that is gold [Technical difficulty—Editor] optimization.
So [Technical difficulty—Editor] not by the [Technical difficulty—Editor], including by this Canadian company, which essentially became the Monsanto of porn, that it would just simply not put any limits on content that was coming on to the site. We've talked to whistle-blowers and industry insiders. As soon as you start to try to somehow police and filter the content on your site, you start losing content. You start delaying upload times. You start losing the search engine optimization race.
The fact of the matter is that they knew and decided not to do anything about this.
How do we know that they knew? The evidence is overwhelming. First of all, before Tube sites, it was common knowledge in the industry that absent policing, non-consensual content—children, women being trafficked and rape videos, which are the metaphor of a snuff film—would find their way into commerce. That's why we had statutes, studies and congressional hearings on this. It was common knowledge. You couldn't be in this industry and not know that if you took those away and just simply distributed anything, you would end up with this content.
Then you have the fact that search engine optimization is at the core of their business. In fact, if you go to MindGeek's website, you would not know that it is the largest Internet pornography company in the world. You would think it is a tech company. That is how it describes itself. It describes itself as an expert in search engine optimization, which means knowing what's on its site, selling advertising to people who want access to those users, selling it smartly and profitably and selling the data back to those people from that product. Put simply, in terms of knowledge, a search engine optimization company like MindGeek that is running this business model on its sites knows as much about what's on that site as NASA knows about what's going on in the space capsule. That is to say, it knows everything that's going on. It does that on a daily basis. It optimizes that on a real-time basis.
At the centre of all this is an algorithm. If you go to the site and you're drawn to that site by a particular search, the algorithm then figures out what else to send you to. It needs to know exactly what is on its site in order to know what it is sending people to. People who searched for child pornography, or for titles that we know are child pornography, would pull up results, and MindGeek itself—its algorithm—would begin directing them to more and more of that content. It knew what was on its site like NASA knows what's in its space capsule.
Moderators purportedly reviewed all the uploads. According to MindGeek's public statements and pronouncements, it reviews all the content that is uploaded to its site, which is an admission that it reviewed all the child pornography that's found on that site.
The people it externally calls moderators, it doesn't call them that internally. It calls them “formatters”. That's important because it shows you where the emphasis is. It's not really a moderator screening for content. It's a formatter making sure that content is in the right format to maximize search engine optimization. How so? Is the title right? Are the tags right? Is the video the right length?
Whatever you call it, they reviewed it. It's on their site. They knew it was there and they chose to let it be there.
Their treatment of complaints, comments and red flags.... You've heard Serena's story. If you've read accounts in the press—and certainly according to the people we've seen who were victims, good Samaritans, appalled users—the company has essentially been stonewalling over the years whenever someone raised a complaint. To say it was non-responsive does not accurately characterize it. It was hostile. It was discouraging. It was designed to make people go away.
Again, a search engine optimization company understands and is using all of this content to maximize the value of its content and monetization.
Then there are the comment sections of many of these videos, where people explicitly say that this is obviously rape, where you have a woman who is clearly passed out drunk and being raped—where the person videotaping is opening her eye and poking her in the eye—where you have people saying that this person clearly can't even be 12 years old. This is all content that MindGeek is scanning and is aware of on its site, yet those videos remained up for years, and they weren't the only videos.
The treatment of illegal content, when they were called out and when they were forced to do something.... You would think that the entire post would be deleted, that the user's account would be deleted, that they would look at the user's other accounts for similar content, that they would ban that content. But in fact, the only thing that would happen was that the video would be disabled. The link is still there; the page is still there; the search terms are still there; the tags are still there. They're there because now they can still use them in attempting to maximize their search engine optimization.
In fact, last week I typed in the title of a notorious example of a child rape, a video that was taken down last year around this time. Even though MindGeek had taken down 10 million of its videos and that video had been taken down in the spring under public scrutiny, lo and behold, that exact search on Google took me right back to Pornhub. This shows you how it works and why it was left up there. All of that was left up there. The user might not get the video, because it was disabled, but the algorithm would then steer them to other content like it—content that people who had clicked on that video had also watched.
Oftentimes, when public scrutiny was put on something, or when NCMEC, the U.S. authority on this, would direct them to take a video down, they would post a notice saying, “Taken down at the direction of NCMEC”, which I think they're required to do. But oftentimes, when they were forced to take something down, they would instead say, “Taken down due to a copyright violation”, even though they knew that wasn't what it was. We also have examples of cases in which, when public scrutiny was drawn to non-consensual content based on its comments and tags, they went in and removed not the video but the comments and tags.
The other evidence of their knowledge and intent, to a trial lawyer like myself, is what they did over the course of the last year when all of this really finally got the public scrutiny it required. As someone who advises companies that sometimes end up in a jam because someone or something or the company did something they shouldn't have done, I would say we all know what the right formula is: You acknowledge the problem; you indicate that you are going to fix the problem; you hire whoever it is from the outside and give them whatever resources they need to do that; and then you go ahead and do it. That's what real companies do, what responsible companies do—certainly companies that are running businesses in industries that are as lucrative as this.
But that's not what happened. The reason I started out my presentation with something that you might have thought was obvious—by saying what we're here about and what we're not here about—is that for a year, in response to this, despite the fact that nobody knew what was on Pornhub's site better than Pornhub and MindGeek, MindGeek has run a gaslighting campaign that has denied this was a problem, denied its extent, discredited victims, discredited advocates, and essentially attempted to silence everyone and deflect. They say to this day.... Not just MindGeek, but its agents, its allies, its industry networks are running a vicious social media astroturf campaign attempting to disparage anyone who pops up to speak about what is really happening, all the while saying not only that this stuff isn't true, but that the people who are saying it are intentionally misleading, that they're lying. But they're not lying.
They have accused people of raising these issues for ulterior motives, because they have a problem with porn or consensual conduct or they are some sort of religious zealot. The fact is that it's not about any of that. That's just a way to distract people from what the real problem is.
Of course, it was only when the New York Times exposed the problem, after looking at it and finding what everyone else finds when they look at it, that MindGeek, while still claiming that it takes all of this very seriously and always has, took down 10 million videos, because it obviously had no idea whether those videos were consensual or not. Visa and Mastercard had also been told about this problem, but they too ignored it until the New York Times wrote its piece.
The astroturf campaign that has been run on social media has ended up doxing people. People have been hacked. We were representing a victim in Montreal who felt threatened, who feared for her safety, who had her tires slashed and who then disappeared. I don't know where she is. We have investigators trying to find her. We're talking to law enforcement. I got a text message from somebody who claimed to be her roommate, who said she'd had a car accident and was in a coma. That wasn't true. I don't know what happened to her.
I have other examples like that. That's what's going on behind the scenes. Part of what we have been investigating this year is who has been behind it. I'm not going to reveal that now, but we will soon. It's a very dangerous, reckless campaign that's being conducted to attempt to defend the indefensible.
What are the solutions? Real quick, one, we have to do our job and defend the victims who have been victimized and who continue to be victimized by people spreading lies about them and who, in certain instances, have been subjected to much worse conduct. We're going to do that.
What prevents it from continuing? MindGeek has taken down 10 million videos, but it has competitors that have not gotten any scrutiny. It is the flagship. It is the metaphor for the whole industry. It is a big problem. However, the problem is much bigger.
It seems to me there are two things. One, everyone agreed many years ago, before the Internet disrupted so much of our lives in good ways and bad, that, with respect to pornography, it was reasonable to have certain requirements for people who were going to produce, distribute or transfer content that required them to ensure that it was consensual. Back then, that system worked pretty well, because the industry was, compared to what it is on the Internet, somewhat finite and smaller.
It worked. There were disclosures. People had to verify age and consent. People had to keep paperwork. Also, if you were going to distribute the content, you had to make sure they had that paperwork. That made sense then, and it makes sense now. There is no way we are going to stop this, or have any effective way to limit it, unless we restore some of those mechanisms. I don't think it's very hard, and I don't think it's unfair, to require an industry that's making billions of dollars a year to have some basic compliance and moderating requirements.
There are other things I think we definitely need to do. Canada, the U.S. and most countries have an equivalent of NCMEC, to which child pornography is reported and which can then direct that videos be taken down and notify law enforcement. A few things are obvious to me. The scope of this problem in the Internet age requires that those functions be dramatically developed and built up and that they become much more robust.
Two, I think there needs to be more transparency. [Technical difficulty—Editor] report [Technical difficulty—Editor] can look at with significantly more transparency, because obviously that will make a big difference. It will help prevent companies from denying problems simply because they know what's going on and we don't.
Most of all, this industry has to begin acting like a real industry, like a real business industry that actually cares about what it's peddling, as opposed to some chemical company from the seventies that didn't care but was making money and was poisoning people. There's a reason MindGeek is called “the Monsanto of pornography”. What everybody needs to do is to make that an impossible position to maintain in this industry.
Thank you.