I grew up in a small town in the mountains, and I didn't have Wi-Fi or really even running electricity up there. I never had an iPod or a phone or access to the Internet before I moved to the city. The school there was really small, too. There was only one school in the entire town, pre-K to grade 8. That was the school I grew up at. Then, when I moved to Bakersfield, there were 3,500 kids at one school and it was only two grades. I went from having five kids in my class all day to having 30 kids in a class, eight different classes in a day. It was all super new to me.
I'd never had a crush or a boyfriend or a first kiss or anything like that before, so I was picked on quite a bit for the first couple of weeks that I was attending school. They would make fun of me for not being up to date with everything. Being from the mountains, I didn't know the slang and I didn't know what was popular. And so, when a guy finally did take notice of me and was interested in me—or I thought he was interested in me—we started my first relationship. After a while of being in a relationship with him, his friends would come up to us at school during the lunch break and ask us a bunch of questions and try to pressure me into doing different things like kissing him—when I'd never had my first kiss before—and just saying all sorts of things.
One night—this is during the last semester of my grade 7 year—the boy I was dating at the time asked me to send him a video of myself. I didn't really understand what he meant at first. He had sent me a video from Pornhub of a girl undressing herself and just basically showing herself off to the camera. He asked me to do that and I told him I wasn't really comfortable, so he continued to ask me every night after we got back from school. I had gotten my first iPod at this point and I'd gotten a messenger app on it called Kik to talk to people at school.
He would message me on that app every night after school, asking me to send the video, and I always told him no, I wasn't comfortable doing that, I didn't even know what to do. And he's like, “It's perfectly fine, you know. Everybody does it. Everybody our age is doing that. If we're really in a relationship, if you truly loved me, then you would send me something like that.” I still, for a while, told him no. I wasn't really comfortable doing that. After a couple of weeks of it, he was like, “Fine then. You know what? This isn't even a real relationship. I don't know why I continue to bother you. If you're not even willing to send me something that I'm going to send you, then it will be over, whatever, unless you are going to send it and then I'll send you one, too.”
And so I took a quick little video, like a minute long, and I sent it to him, and for the first couple of days afterwards I didn't notice any difference. But then his friend group started coming up to us during lunch and making little comments about my body and how I was a freak and about how they wish their girlfriends would do stuff like I do. And so, at that point, I was getting upset, because I had a feeling that he had shown it to them, when he had told me that he would delete it right afterwards.
After that, I started noticing even more kids at school would look at me or make little comments to me. That was about a week and a half, two weeks, after I had first sent it. That was when I found out that it had been sent around to most of the school. After that, summer break happened. I had broken up with him because I did find out he sent it to his friends and his friends sent it to their friends, who then sent it to their friends. And so, it went around the entire school and all the neighbouring schools.
During the summer break, before grade 8, we moved, so I thought things would be better. At that point, I didn't know that other people had seen it, or that it had been posted online. After about two weeks of being at the new school, somebody who had made an anonymous account sent me a link through Kik. It was the video I had sent to my ex-boyfriend. It had been posted on Pornhub with the caption “13-year-old brunette shows off for the camera”.
After that, I started ditching school a lot. I started getting really depressed. I started getting into drug use. I begged my mom to transfer schools. I told her that this school was way ahead of what we had been learning up in the mountains, so I wasn't up to date. I asked her if I could just do home schooling instead, so I could get caught up. She was super busy, and she had five other kids to take care of on her own as a single parent. So obviously, she said no.
I just made it through grade 8. Before all of this, I was always a straight-A student. I was always on the honour roll or principal's list. I always got the achievement after every quarter, and at the end of the year. Toward the last quarter of grade 7, and all of grade 8, I barely passed my classes. My grades started rapidly slipping. It was mostly because I was no longer regularly attending school. I would ditch school a lot. Even on the days when I did go to school, I would hide in a bathroom stall for most of the day, or attempt to leave if I could.
After that, I messaged Pornhub to get the video taken down. I pretended to be my mother. I didn't want to tell my mom, because she was a single mother of six kids. She was raised Catholic. She had very strict views on stuff like this. I knew she would be angry. I knew it would cause problems for her. I didn't want to tell her.
I tried to deal with it on my own by using the “Report a problem” form on the video. I flagged it. I said, “Hey, this is my daughter. She's only 14. This is child pornography. Please take this down.” They took a week or two to respond. Once they finally responded, it was like, “Yes, okay, we'll take it down”, and then they proceeded to wait another two weeks before they finally did take it down.
In doing my research, I was told there was a system in place whereby, when a video was labelled as child pornography on their site, it was flagged and tagged and could no longer be re-uploaded. But of course, that wasn't true, because a week after it was taken down, it was re-uploaded. All of the people my age—a couple of grades above me and even a couple of grades below me—had seen the video, even though when I transferred schools after grade 8, I transferred to a school all the way on the other side of town for high school. They had all seen the video as well. After that, I basically dropped out of public school.
Ever since, I've been.... The videos.... People find them and send them to me. They send them to me all the time, saying, “Oh my God, is this you?” People on the Internet, people I have never met in person, will find my accounts on social media and they will send it to me and say, “This is you, isn't it?” They then will try to ask me certain questions, or be really creepy toward me, or try to dox me or harass my family members. A lot of people in the grades above me, mostly guys, would try to harass me and blackmail me, saying that if I didn't do stuff with them, or if I didn't send more videos to them, they would send it to my family. They would send it to my grandma, to my mom, to all my sisters and my brother.
I just took myself off social media for a while. I stopped going to school. I got really depressed. I thought that once I stopped being in the public so much, once I stopped going to school, people would stop re-uploading it. But that didn't happen, because it had already been basically downloaded by people all across the world. It would always be uploaded, over and over and over again. No matter how many times I got it taken down, it would be right back up again.
That was the whole reason I ended up reaching out to Mike.
I'll just introduce myself briefly to the committee. My name is Michael Bowe. I'm a partner in Manhattan at the law firm of Brown Rudnick.
We have been investigating Pornhub and MindGeek, its parent, and its other sites for just about a year. That investigation includes hundreds of accounts similar to Serena's: of underage girls who had exploitative material posted on Pornhub, of adult women whose rapes were videotaped and put on Pornhub, of trafficked women who have had their videos put on Pornhub, and all sorts of other non-consensual content that has been put on Pornhub.
In the short time I have, I want to address four topics that hopefully will serve as somewhat of a road map for questions and follow-up: what is it that we're really here about; how did we get here; MindGeek's knowing decision to commercialize this type of conduct; and where do we go from here?
First, what are we here about? It's really a question of what we're not here about. In a second, I'll explain why I need to raise this right up front. This is about rape, not porn. It's about trafficking, not consensual adult performance or entertainment. This is not about policing consensual adult activity. It's not about religion. I think, even in these days, everybody can agree that no industry should be commercializing and monetizing rape, child abuse and trafficked content. I think we all expect that any legitimate business or industry wouldn't do so and would do whatever it could to make sure that type of content doesn't pollute its product.
Why am I raising this? I'm raising this because, for the last year, when public scrutiny started to be focused on MindGeek, a Canadian company, about the fact that it knowingly commercialized and monetized this type of content, instead of acknowledging the problem and aggressively dealing with it, what it has aggressively done is conduct a gaslighting campaign in the media and social media to discredit victims and deflect from the issue and blame it on other things. I'll talk about that in a minute.
This is a real problem. It's real in the sense that it happens; it's not isolated and it's awful. It's significant; it is not one or two people here and there or certain things that slipped through the cracks. As I'll explain in a minute, this type of content is part of the business model, and not just for MindGeek, which is of particular importance to this committee because it's a Canadian company, but for its competitors and in the industry.
To drive home how real it is, let me give you just a few examples of other victims we've talked to and verified.
A girl was raped at 15, and a video was posted on Pornhub and distributed through a community. Pornhub refused to remove the video for three weeks, then said it had been removed when in fact it wasn't removed for another two months, with several hundred thousand additional views, downloads and distribution in that community.
A child younger than 10 was sold into trafficking and was the subject of child pornography for almost 10 years. Those videos were distributed on various MindGeek platforms where they could remain at least until later last year.
A 15-year-old was secretly filmed via computer hack and then extorted to do other videos. Those videos were posted on Pornhub with her personal information, distributed widely, including to her community and to her family, and subjected her to long-term abuse and stalking. When she raised the issue at Pornhub, it refused to search for the videos or take any other proactive steps to prevent their distribution. The trauma led her to consider suicide.
A woman was raped on videotape and it was distributed on Pornhub, including through her community.
A 17-year-old was secretly recorded by an underage boyfriend, and it was posted to Pornhub and distributed throughout her school community and to her family, subjecting her to harassment and extortion.
A woman was drugged and raped after meeting someone on a date. The rape was videotaped and posted on Pornhub. We believe it was sold on Pornhub by the person who posted it.
A 14-year-old was secretly recorded by her boyfriend, who posted the video to Pornhub and distributed it, again, through her school and community.
Child pornography posted on Pornhub of an individual had hundreds of thousands of views and an unknown number of downloads. When confronted, Pornhub failed to report it to the authorities. That's something I'll talk about in a second.
A 16-year-old was coerced into a sexual act that was videotaped and posted on Pornhub without her knowledge or consent.
A 16-year-old girl was trafficked by two American men who filmed the sexual acts as part of the trafficking. In fact, that was what she was offered for. Those acts were posted to Pornhub. This individual is aware of other women in that trafficking ring who were sold for the same purpose.
An underage girl was trafficked for years by a business colleague of her father's. Videos were monetized on Pornhub. She reported the incident, but the videos were not taken down for an extended period of time.
An underage girl attempted suicide multiple times and turned to drugs after videos were posted on Pornhub.
Those are just a few examples. We've found many, many examples. We've investigated hundreds. We've talked to several dozen victims whom we've been able to verify. We've talked to advocates, investigators, media people, industry people and whistle-blowers. These are not isolated incidents. It's a real problem.
How did we get here? Well, we got here like we've gotten to many places at this stage in our culture—because the Internet was a major disrupter in the pornography industry. Prior to Tube sites, the pornography industry had a policing mechanism. There were statutes. We have section 2257 in the States. It requires anyone who's going to produce pornographic material to keep written records verifying the performers' ages and their consent. If you were going to distribute it, if you were going to sell it, if you were going to stream it on the Acme Hotel Company entertainment centre, if you were going to put it on a cable channel, then everything you were going to distribute had to have that disclosure on it saying that in fact those rules had been complied with. That system worked relatively well. It wasn't perfect, but it worked.
Enter the Tube site, where anyone could post anything at any time. Millions and millions of videos were posted in a given year. In our view, section 2257 applies to much of MindGeek's business model. It might not apply to all. It's pretty clear that MindGeek and the industry's view is that it doesn't apply at all. As a result, there was no requirement of the posters. There was no compliance on behalf of the Tube sites.
Then you add in how the business model for Tube sites works and search engine optimization. The goal, of course, is to end up number one in Google searches so that if someone types “porn” with a particular topic into Google, it will pull up your site first. All of these sites—MindGeek and its competitors—were basically in an arms race to be number one.
I don't have anywhere near enough time, nor probably enough understanding, to fully explain all the elements of search engine optimization, but I can tell you certain simple truths. Content is king. Search terms are king. Long search terms are king. Descriptions are king. The more content you have, the more titles you have, the more tags you have—all of that is gold [Technical difficulty—Editor] optimization.
So [Technical difficulty—Editor] not by the [Technical difficulty—Editor], including by this Canadian company, which essentially became the Monsanto of porn, that it would just simply not put any limits on content that was coming on to the site. We've talked to whistle-blowers and industry insiders. As soon as you start to try to somehow police and filter the content on your site, you start losing content. You start delaying upload times. You start losing the search engine optimization race.
The fact of the matter is that they knew and decided not to do anything about this.
How do we know that they knew? The evidence is overwhelming. First of all, before Tube sites, it was common knowledge in the industry that absent policing, non-consensual content—children, women being trafficked and rape videos, which are the equivalent of a snuff film—would find their way into commerce. That's why we had statutes, studies and congressional hearings on this. It was common knowledge. You couldn't be in this industry and not know that if you took those away and just simply distributed anything, you would end up with this content.
Then you have the fact that search engine optimization is at the core of their business. In fact, if you go to MindGeek's website, you would not know that it is the largest Internet pornography company in the world. You would think it is a tech company. That is how it describes itself. It describes itself as an expert in search engine optimization, which means knowing what's on its site, selling advertising to people who want access to those users, selling it smartly and profitably and selling the data back to those people from that product. Put simply, in terms of knowledge, a search engine optimization company like MindGeek that is running this business model on its sites knows as much about what's on that site as NASA knows about what's going on in the space capsule. That is to say, it knows everything that's going on. It does that on a daily basis. It optimizes that on a real-time basis.
At the centre of all this is an algorithm. If you go to the site and you're drawn to that site with a particular search, the algorithm then figures out what else to send you to. It needs to know exactly what is on its site to know what it is sending people to. People who searched for child pornography, or for titles that we know are child pornography, would pull up results, and MindGeek itself—its algorithm—would begin directing the user to more and more of that content. It knew what was on its site like NASA knows what's in its space capsule.
Moderators purportedly reviewed all the uploads. According to MindGeek's public statements and pronouncements, it reviews all the content that is uploaded to its site, which is an admission that it reviewed all the child pornography that's found on that site.
The people it externally calls moderators, it doesn't call them that internally. It calls them “formatters”. That's important because it shows you where the emphasis is. It's not really a moderator screening for content. It's a formatter making sure that content is in the right format to maximize search engine optimization. How so? Is the title right? Are the tags right? Is the video the right length?
Whatever you call it, they reviewed it. It's on their site. They knew it was there and they chose to let it be there.
Their treatment of complaints, comments and red flags.... You've heard Serena's story. If you've read accounts in the press—and certainly from people we've seen who were victims, good Samaritans, appalled users—they've essentially been stonewalling over the years when someone would raise a complaint. To say it was non-responsive does not accurately characterize it. It was hostile. It was discouraging. It was designed to make people go away.
Again, a search engine optimization company understands and is using all of this content to maximize the value of its content and monetization.
The comment sections of many of these videos, where people are explicitly saying that this is obviously rape, where you have a woman who is clearly passed out drunk—where the person videotaping is opening her eye and poking her in the eye—and being raped, where you have people saying that this person clearly can't even be 12 years old—this is all content that MindGeek is scanning and is aware of on its site, yet those videos remained for years, and they weren't the only videos.
The treatment of illegal content, when they were called out and when they were forced to do something.... You would think that the entire post would be deleted, that the user's account would be deleted, that they would look at the user's other accounts for similar content, that they would ban that content. But in fact, the only thing that would happen was that the video would be disabled. The link is still there; the page is still there; the search terms are still there; the tags are still there. They're there because now they can still use them in attempting to maximize their search engine optimization.
In fact, last week I typed in a title for a notorious example of a child rape that occurred, which was taken down last year around this time. Even though MindGeek had taken down 10 million of its videos and that video had been taken down in the spring under public scrutiny, lo and behold, Google took me right back to Pornhub, that exact search. This shows you how it works and why it was left up there. All of that was left up there. The user might not get the video, because it was disabled, but the algorithm would then steer them to other content like it, content that people who had clicked on that video had also watched.
Oftentimes, when people put some public scrutiny on things, or when NCMEC (the National Center for Missing & Exploited Children, the U.S. authority on this) would direct them to take it down, they would post something that would say, “Taken down at the direction of NCMEC”, which I think they're required to do. When they are forced to take it down, oftentimes instead they would say, “Taken down due to a copyright violation”, even though they knew that wasn't what it was. We also have examples of cases, when public scrutiny has been drawn to non-consensual content based on comments and tags, of them going in and not removing the video but removing the comments and tags.
The other evidence of their knowledge and intent, to a trial lawyer like myself, is what they did over the course of the last year when all of this really finally got the public scrutiny it required. As someone who advises companies that sometimes end up in a jam because someone or something or the company did something they shouldn't have done, I would say we all know what the right formula is: You acknowledge the problem; you indicate that you are going to fix the problem; you hire whoever it is from the outside and give them whatever resources they need to do that; and then you go ahead and do it. That's what real companies do, what responsible companies do—certainly companies that are running businesses in industries that are as lucrative as this.
But that's not what happened. The reason I started out my presentation with something that you might have thought was obvious—by saying what we're here about and what we're not here about—is that for a year, in response to this, despite the fact that nobody knew what was on Pornhub's site better than Pornhub and MindGeek, MindGeek has run a gaslighting campaign that has denied this was a problem, denied its extent, discredited victims, discredited advocates, and essentially attempted to silence everyone and deflect. They say to this day.... Not just MindGeek, but its agents, its allies, its industry networks are running a vicious social media astroturf campaign attempting to disparage anyone who pops up to speak about what is really happening, all the while saying not only that this stuff isn't true, but that the people who are saying it are intentionally misleading, that they're lying. But they're not lying.
They have accused people of raising these issues for ulterior motives, because they have a problem with porn or consensual conduct or they are some sort of religious zealot. The fact is that it's not about any of that. That's just a way to distract people from what the real problem is.
Of course, it was only when the New York Times exposed the problem after looking at it and found what everyone else finds when they look at it—and then Visa and Mastercard had been told about this problem but had also ignored it until the New York Times wrote its piece—that MindGeek, while still claiming that it takes all of this very seriously and always has, took down 10 million videos because it obviously had no idea whether those videos were consensual or not.
The astroturf campaign that has been run on social media has ended up doxing people. People have been hacked. We were representing a victim in Montreal who felt threatened, who felt for her safety, who had tires slashed and who then disappeared. I don't know where she is. We have investigators trying to find her. We're talking to law enforcement. I got a text message from somebody who claimed to be her roommate, who said she'd had a car accident and was in a coma. That wasn't true. I don't know what happened to her.
I have other examples like that. That's what's going on behind the scenes. Part of what we have been investigating for this year is who it has been. I'm not going to reveal that now, but we will soon. It's a very dangerous, reckless campaign that's being conducted to attempt to defend the indefensible.
What are the solutions? Real quick, one, we have to do our job and defend the victims who have been victimized and who continue to be victimized by people spreading lies about them and who, in certain instances, have been subjected to much worse conduct. We're going to do that.
What prevents it from continuing? MindGeek has taken down 10 million videos, but it has competitors that have not gotten any scrutiny. It is the flagship. It is the metaphor for the whole industry. It is a big problem. However, the problem is much bigger.
It seems to me there are two things. One, everyone agreed many years ago, before the Internet disrupted so much of our lives in good ways and bad, that, with respect to pornography, it was reasonable to have certain requirements for people who were going to produce, distribute or transfer content that required them to ensure that it was consensual. Back then, that system worked pretty well, because the industry was, compared to what it is on the Internet, somewhat finite and smaller.
It worked. There were disclosures. People had to make sure. People had to keep paperwork. Also, if you were going to distribute it, you had to make sure they had that paperwork. That made sense then, and that makes sense now. There is no way we are going to stop this or have any effective mechanisms to limit it unless we have some of those mechanisms. I don't think it's very hard, and I don't think it's unfair to require an industry that's making billions of dollars a year to have some basic compliance and moderating requirements.
There are other things I think we definitely need to do. Canada, the U.S. and most countries have the equivalent of NCMEC that child pornography is sent to, which then can make directions to take down videos and notify law enforcement. There are a few things that are obvious to me. The scope of this problem in the Internet age requires that those functions be dramatically developed and built up and that they become much more robust.
Two, I think there needs to be more transparency. [Technical difficulty—Editor] report [Technical difficulty—Editor] can look at with significantly more transparency, because obviously that will make a big difference. It will help prevent companies from denying problems simply because they know what's going on and we don't.
Most of all, this industry has to begin acting like a real industry, like a real business industry that actually cares about what it's peddling, as opposed to some chemical company from the seventies that didn't care but was making money and was poisoning people. There's a reason MindGeek is called “the Monsanto of pornography”. What everybody needs to do is to make that an impossible position to maintain in this industry.