View Shannon Stubbs Profile
CPC (AB)
2021-06-07 11:12
I think you have spoken about the concept of a 24-hour takedown rule, so that once a platform has been notified that material is there, it would be required to take it down. I think that's a good idea. Of course, the trouble is that when child sexual abuse material or non-consensual images have been up for even 24 hours, they can have hundreds or thousands of viewers—millions in the case of Pornhub and MindGeek. We've heard from victims that explicit images of them were online for three years before they found out. In the case of Serena Fleites, hers was shared and downloaded all over her school before she knew. Then she got into a never-ending back and forth trying to get the platforms to be accountable and to take down the material.
Can you explain or enlighten us about what prevention mechanisms might actually be in place?
View Steven Guilbeault Profile
Lib. (QC)
This is a very good question. My office and my department have spoken as well with victims and victims' organizations. What we want to do with this legislation is to really shift the challenge for victims of having to try to get these images taken down—if we're referring to images that we would find on Pornhub, for example. We're trying to shift the burden of doing this from the individual to the state. It would be up to the Government of Canada, through a regulator, to do that, as it is in other countries, such as Australia, with their e-safety commissioner.
That's the goal we're pursuing with the tabling of this legislation. You are correct; we are also working to ensure not only that the images are taken down but that they are removed from websites or associated websites to prevent, for example, the download of such images. They're not going to be downloaded and uploaded and downloaded and uploaded, as we've seen in many cases.
View Steven Guilbeault Profile
Lib. (QC)
Companies should abide by Canadian laws. Whether they're online companies or physical companies, there should be no distinction. As I said earlier, the challenge we face now is that the tools we have to deal with these online harms just aren't adapted to the virtual world.
View Jacques Gourde Profile
CPC (QC)
Minister, would it have been possible to include a provision in Bill C‑10 to regulate platforms like Pornhub so as to finally protect our children, who are going through unspeakable things right now?
View Steven Guilbeault Profile
Lib. (QC)
Thank you for the question.
I find your question very cynical, as your party consistently opposes the passage of Bill C‑10, which is not about content moderation, but rather about web giants contributing to our cultural sector's artists and musicians.
View Jacques Gourde Profile
CPC (QC)
Thank you, Mr. Chair.
We have had some very disturbing testimony about underage children being exploited by platforms, and we need to take action. You told us you would put in place a new provision, new legislation, which probably won't come into effect for a year, a year and a half. We need to move much, much faster than that. We live in a society where our children are not protected, currently, from web giants.
How are you going to speed up the process? Why couldn't C‑10 close the loophole for now?
View Steven Guilbeault Profile
Lib. (QC)
Once again, your party opposes the passage of Bill C‑10, which has nothing to do with content moderation, while the hate speech and online harm bill specifically addresses the issue of content moderation.
Yet you say you oppose content moderation. You and many of your colleagues say that the government wants to take away your freedom of expression. The exploitation of persons bill will ensure...
View Han Dong Profile
Lib. (ON)
Thank you very much, Chair.
I want to thank you, Minister Guilbeault, for coming to the committee today and talking about a very important topic.
First of all, I want to go back to your opening statement. You cited an increase in xenophobic and Islamophobic behaviour and speech online over recent months. As a member of the Asian-Canadian community, I have observed and witnessed first-hand some of these intolerable behaviours online.
I have to say that the pandemic has changed people's social behaviour. More and more, people are spending time on social media. Then we have some of these bad actors using various platforms, seeing them as tools of disguise and as protection, and also utilizing bots and trolls and saying all kinds of things they otherwise wouldn't say in public.
You mentioned that children in the country are being victimized, and the platforms are not doing anything. That's precisely what we are talking about today.
We know that social media companies, including the one we are doing a study on, have been acting unilaterally and opaquely. Sometimes they introduce half measures after public pressure, but they haven't been serious about consulting industry experts or listening to the recommendations of their audience and of victims' groups.
In your opinion, what can the giants do to respect Canadians' will and Canadian law in terms of protecting the general public? It's in their best interest as well, because that's their audience and their client base. A few bad actors are contaminating the online environment.
Can you talk a little about that?
View Steven Guilbeault Profile
Lib. (QC)
There are many elements in what you said.
First, I think one of the purposes of the legislation is to ensure more transparency on the part of the platforms in terms of their guidelines and practices regarding content moderation, because right now it's very uneven. Some companies have better content moderation practices than others, and some have very little. You're right—they are not transparent.
Some may have rejoiced in the decision of this platform or that platform to ban this user or another user, but under which criteria? Why them and not someone else? This is clearly something we want to tackle. Frankly, there is an issue where we see the very business model of some of the platforms being about creating controversy and nourishing hate speech and intolerance, because it creates more traffic on their platform. Therefore, they can sell more advertising and make more money.
As part of the legislation that will be tabled, this is also something that we as legislators will need to address.
View Arnold Viersen Profile
CPC (AB)
Thank you, Minister.
Have you watched any of the testimony that we heard from the victims before this committee?
View Arnold Viersen Profile
CPC (AB)
All right. Many of them talked about how non-consensual videos of them were put up and, overnight, had millions of views. How do you intend to combat that with a 24-hour takedown notice?
View Steven Guilbeault Profile
Lib. (QC)
Well, as stated in my mandate letter, once an illegal publication is flagged, companies will have 24 hours to take it down. Instead of the victims having to try to deal with these companies, it's going to be the Government of Canada that's going to work to ensure that they remove that. If they don't, then there will be consequences for these companies.
View Charles DeBarber Profile
2021-06-07 12:03
Hello. Good afternoon. My name is Charles DeBarber and I'm a senior privacy analyst with Phoenix Advocates and Consultants. My background is U.S. Army cyber-intelligence and cybersecurity.
I began my work with victims of non-consensual pornography, or NCP, in 2015, when I worked for the elite firm Fortalice. As the program manager for open source intelligence, I assisted victims of NCP through our reputation services. Since departing Fortalice in 2018, I have done freelance work on behalf of victims of revenge porn, extortion schemes and cyberstalking, and on purging content for victims of human trafficking. I've written bespoke information guides for clients to help protect their digital privacy and to reduce the chances of their being a target of successful doxing.
My background gives me deep insight into the sources of content on the Internet, and today I want to share with you guys some knowledge about the surface web, deep web and dark web. In addition, I'd like to share some research about the sources of adult NCP on these three layers.
As a disclaimer, I want to be clear that my data regarding NCP is limited in a few ways. First, my data is limited to the 90-plus cases that I've undertaken since 2019. You'll see these are sourced as “PAC Research 2016 to 2021”. I recognize there's a selection bias to that data due to it being from only our casework. Second, much of my information on NCP involving children is largely anecdotal, as I've never produced statistics on it. In addition, the bulk of my work has been with adult victims. Third, I am discussing the concepts of surface web, deep web and dark web and how they relate to the volumes and types of NCP often found on them. This is not to paint any of these layers as good or bad. The dark web has an especially heinous reputation, but remember that there are people who use the dark web to subvert censorship or express their free speech in countries where freedom of speech is very limited.
You'll see in the handout the beautiful iceberg graph that is commonly used to explain the three layers. You have surface web, deep web and dark web. We'll start with the surface web.
The surface web is basically the Internet content indexed by search engines themselves and things you can directly jump to from search engines. It's aggregated web content that can be found with web crawlers, also known as spider bots or spiders. Make note of that, because it is very important for one of the points I'll make later. The surface web is the minority of online content, around 4% to 5%.
What's the deep web? That's the majority of the web, more than 90% of it. It's Internet content that's not part of the surface web and is not indexed in search engines. It's mostly content that is not readily accessible through standard means, such as search engines. As I said, it's the majority of content on the Internet.
Then there's the dark web. It's part of the deep web, but what makes it different is that you have to use encryption software and special software to access it—things like Tor Browser or Freenet or Freegate. It's also used interchangeably with dark net. It can be called both.
NCP comes in many forms. Some of the key forms for adult victims include revenge porn, non-consensual surveillance, human trafficking and data or device breaches. We have the following statistics from our casework. The majority of adult NCP, 73.5% of our cases, was found on the surface web. We believe the reason for this is that adult NCP easily blends in with amateur pornography. The ease of use and popularity of video- and image-sharing sites on the surface web is the main cause of this.
On top of that, the deep web accounts for about 23.2%. These are often private forums for pirated content, BitTorrent sites, and VoIP and messaging apps like Discord communities. The more compartmented nature of the deep web leads to a lower volume of content that is also less viral.
The dark web accounts for little of our content. In our experience, what's there is material you would find only on the dark web because it is so highly illegal: things like hidden bathroom cam footage, extremely violent content, child pornography and bestiality. NCP blends in with amateur pornography and is readily available on the upper layers, so there's no reason to go to the dark web for it. Only a minority of Internet users have enough expertise and knowledge of the dark web to use it anyway, and its even more compartmentalized nature keeps people off it. The result is that the more extreme and illegal content is relegated to the dark web.
In our casework, only about 3.3% is dark web content.
There are a few observations I would like to share with the committee. I've removed over 100,000 pieces of NCP content in the last five years. My average client has between 400 and 1,200 pieces of content; that could be the same picture, video or handful of pictures, but shared on many different sites. Viral content can run to 6,000 pieces of content and above. Very rarely do I utilize the NCP removal processes created by search engines such as Google or Bing or by social media like Facebook, Twitter or Reddit.
I normally use the copyright removal process here in the United States, under the Digital Millennium Copyright Act. The NCP process is often more complicated and takes longer for victims, who have to follow it for every piece of content. Imagine: if you have 400 pieces of content out there, that might be 400 separate requests you have to submit. These companies, frankly, respect intellectual property more than victims, because the copyright process is so much easier.
The removal process is costly in both time and resources. I utilize automation, which is not cheap. For a client with more than 400 pieces of content, it would usually cost $2,000 for automated removal and $5,000 for bespoke removal services, and that just mitigates the problem. Victims using it manually require a certain level of understanding of information systems, search engines and web caching, and that is if the victim can find most of the content without using automated aggregators. My junior analysts, some of them with information systems and computer science backgrounds, take up to a month of hands-on work to learn how to effectively purge content. The average victim is expected to have this expertise if they cannot afford professional services. The tools for victims to effectively mitigate their digital footprint of content aren’t readily available.
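The batch step that this automation covers can be sketched in a few lines. This is a hypothetical illustration, not Mr. DeBarber's actual tooling: it groups infringing URLs by host, so that one takedown notice per host can list every URL on that host instead of 400 one-off requests.

```python
# Hypothetical sketch of takedown-notice batching: group every
# infringing URL by its host so one notice per host lists all of
# that host's URLs. Names and example URLs are invented.
from collections import defaultdict
from urllib.parse import urlparse

def group_notices(infringing_urls):
    """Group infringing URLs by host, one takedown notice per host."""
    by_host = defaultdict(list)
    for url in infringing_urls:
        by_host[urlparse(url).netloc].append(url)
    return dict(by_host)

urls = [
    "https://example-tube.test/v/123",
    "https://example-tube.test/v/456",
    "https://mirror.test/clip/9",
]
print({host: len(found) for host, found in group_notices(urls).items()})
# {'example-tube.test': 2, 'mirror.test': 1}
```

The real work, of course, is finding the URLs and following each host's notice procedure; the point of the sketch is only that per-URL manual filing scales linearly while batching by host does not.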
Great strides have been made to get Silicon Valley to recognize the issue, and I don’t wish to demean those efforts or that recognition. Laws in my home country are now in 48 states and two territories to protect victims of NCP. However, picking up the pieces after NCP floods surface web sites is still an uphill battle. We’ve worked tirelessly so clients can google their name without NCP coming up. One of our clients lives in fear of her 10-year-old using the computer and googling her name. Others have lost job opportunities, housing opportunities and relationships. Many of our clients have contemplated or attempted suicide.
Finally, video upload sites that allow pornography, such as Pornhub or Xvideos, have exacerbated the problem. This is one of the big points I want to make. Content goes viral a lot faster with these sites, and these sites use what is called search engine optimization to flood Google with their content. Even if the content is deleted within 72 hours, it often takes days, frankly, for a victim to even find out that they're a victim. Smaller video upload sites then aggregate this material from search engines and repost it, making this a feedback loop that keeps feeding the search engines and makes it a viral issue.
The issue has become so significant that when a victim's name appears in an aggregated video title, it then gets used as a search engine keyword by porn sites that don't even have their content. Their name just becomes a random keyword, and God forbid you have a unique name. Imagine googling your name and having hundreds of porn sites come up because your name is a keyword empowered by SEO techniques.
We need to find a balance between verification and privacy. That's very easy for me to say, but a reasonable age verification policy for these sites is required. I compliment Pornhub on adopting a verified content policy in late 2020. I'm very angry [Technical difficulty—Editor] and I badly want them held accountable for that, but I want to make sure verification is also not so cumbersome that sex workers who are free agents can't operate with reasonable privacy.
Search engines—and this is a key one, and I would recommend you put this forward, or at least encourage them to change their policies—shouldn't allow indexing from adult video image upload sites that do not come from verified accounts. This means that, with verified accounts, the spiders can be turned on so that they can feed into Google, Bing and so on. However, spiders should be turned off on any website where any Joe Schmo can come and upload content, whether it be videos or images. They should be turned off on that content until it is verified. That keeps it from hitting search engines in 72 hours.
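The "spiders off until verified" policy described above could, in principle, be implemented with standard crawler directives. The sketch below is a hypothetical illustration (the function name and verification flag are invented), but the `X-Robots-Tag` response header and its `noindex`/`noarchive` directives are real mechanisms that major search engines honour.

```python
# Hypothetical sketch of per-page crawler policy: serve search-engine
# directives based on whether the uploader's account is verified, so
# spiders skip unverified uploads and they never reach search results.

def robots_directives(uploader_verified: bool) -> dict:
    """Return response headers for an upload's public page."""
    if uploader_verified:
        # Verified account: allow indexing so legitimate content is findable.
        return {"X-Robots-Tag": "index, follow"}
    # Unverified account: tell crawlers not to index or cache the page.
    return {"X-Robots-Tag": "noindex, noarchive"}

print(robots_directives(False)["X-Robots-Tag"])  # noindex, noarchive
```

A site would attach this header when serving each video or image page; the same effect can be had with a robots meta tag in the page's HTML.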
Remember, with all NCP, you're really fighting time, and that keeps it from going viral a lot more quickly, quite frankly. It makes the clean-up process significantly better, and it can mitigate it. Furthermore, it would probably protect the intellectual property of other sex workers. As I said, Pornhub and other major tube sites have more or less put NCP into the express lane via SEO techniques.
Finally, the doxing of victims and sex workers is a very serious issue. Despite many of my clients being Jane Does, I can't get Google to delist web pages that post the real names of victims. I wish there was a policy that allowed the delisting of the real names of Jane Does, of sex workers, that exist on sites such as the defunct Porn Wikileaks, which were very dangerous for them and were made for doxing victims.
I'm very open to questions you may have and appreciate your welcoming me today. I'm honoured to be here.
Thank you.
View Marie-Hélène Gaudreau Profile
BQ (QC)
Sorry to interrupt you, but I'm running out of time and I have two more questions.
Why do the RCMP's responses make the process cumbersome?
You said that non‑consensual content can be removed. It's expensive and complicated, but it's possible.
As legislators, what do we need to fulfill our role?
View Charles DeBarber Profile
2021-06-07 12:48
The process of doing this is costly, and it's really just stacked against victims. On top of that, it's stacked against free agent sex workers who are trying to protect their intellectual property.
There's a great Vice article that talks about a lot of OnlyFans folks. They can't offer the same services studios can, so it pushes them towards a more exploitative studio structure.
We need to make those things more available. One thing we need to change, once again, is how search engines index and rank unverified content, specifically on upload sites. What I mean by an upload site is any site like Imgur, Pornhub or Xvideos, where I can go in, make an account and post anything I want. Those are not moderated the way—
View Marie-Hélène Gaudreau Profile
BQ (QC)
As you said, once the damage is done, the process of removing the content is extremely difficult. There are delays and re-uploads involved.
What do you think of the right to be forgotten that several countries use?
View Charles DeBarber Profile
2021-06-07 12:49
I might be a little biased there, because I'm an intelligence analyst by trade.
You're asking somebody who goes and subversively finds information about privacy. Honestly, I'm for it. I like the EU's stance on it, to be honest. I'm very biased on that question.
What I would like to see, especially, is that this content doesn't get SEO unless it's verified, because that keeps it from going viral to the point where it costs thousands of dollars to go out there and find the thousands of websites it's on and try to get rid of it. If I can kill it in the crib or at least get it to where.... Your average victim, from my calculation, at least for revenge porn, doesn't know for seven to 90 days. If unverified accounts can post anything they want, then it becomes part of that feedback loop, and that's a big deal. It's as easy as making them turn web spiders off that web page. That's something Pornhub can do. It's something that they really should just be—
View Marie-Hélène Gaudreau Profile
BQ (QC)
At the end of the day, the entire international community must be aware of this new way of operating online. People, both young and old, must be informed. They must be warned. Certain measures must be implemented, including the process for accessing service providers and the web.
View Charlie Angus Profile
NDP (ON)
Thank you.
Mr. DeBarber, I'd like to go back to the issue of how these images are promoted and exploited and can be found in search engines. One of our survivors said that she has tried again and again and again to deal with police, to deal with anyone, to get her thumbnails and all that information. Even though the video has been taken down, it's still out there. It's still available.
Are there not simple tools we can apply so that when something is taken down, it's actually removed, so that we have the right of survivors not to be harassed by what's still there?
View Charles DeBarber Profile
2021-06-07 12:57
Yes and no. It all depends on where it's hosted. It also depends on where you're getting it through. One, there's live content on other websites and other platforms, but then there's the stuff that's right in Google cache. Those are two different animals in terms of getting them purged. You actually have to purge both. Caching is more or less backing the information up. When you click on Google Images, for example, you're usually seeing the cache. When you get rid of the live content, you have to get rid of the cache too—fun fact.
Now, with some companies, like Google, lawyer Carrie Goldberg helped Google write its policy to remove NCP back in 2016, I believe. I'm glad the rest of the big tech giants, including social media like Reddit and Twitter, emulated that process. The copyright process is still easier, unfortunately. Once again, if that image is repeated 100 times, let's say, then often 100 different notices have to be sent out. You have to do it both in the search engine and on the hosting site, but here's the rub: you can get content un-cached and delisted on Google, but that doesn't get rid of the live content.
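The two-track purge Mr. DeBarber describes, live content versus cached copies, can be modelled as a simple status record. This is a hypothetical sketch, not any vendor's actual workflow: a piece of content counts as purged only when both the live page and the search engine's cached copy are gone.

```python
# Hypothetical sketch of tracking the two-step purge: removal is
# complete only when both the live page and the search cache are gone.
from dataclasses import dataclass

@dataclass
class ContentItem:
    url: str
    live_removed: bool = False   # taken down at the hosting site
    cache_removed: bool = False  # un-cached/delisted at the search engine

    def fully_purged(self) -> bool:
        # Delisting alone leaves the live page reachable; deleting the
        # live page alone leaves the cached copy in image/search results.
        return self.live_removed and self.cache_removed
```

In practice an analyst tracks hundreds of such records per client, which is why the purge has to cover both "animals" for every single URL.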
Here's one short answer: Give my contact information, please, and I'll help your client pro bono.
View Chris Warkentin Profile
CPC (AB)
We'll move to a vote on the amendment.
Madam Clerk, I'm wondering if you'll run through the roll call for the purposes of the vote on the amendment. This is Mr. Dong's amendment. Then we'll vote on the main motion.
(Amendment agreed to: yeas 10; nays 0 [See Minutes of Proceedings])
(Motion as amended agreed to: yeas 10; nays 0 [See Minutes of Proceedings])
The Chair: Members, that's very helpful. I'm glad we can do that.
Of course, next week our meetings are scheduled to be the review of the report on pandemic spending. I think Mrs. Shanahan may have some suggestions for meetings in the week that follows.
Mrs. Shanahan.
View Anthony Housefather Profile
Lib. (QC)
I wasn't arguing that; I was actually arguing the contrary. I was saying that beyond illegal content, social media providers will frequently say that certain things cannot be posted that are racist but that are not illegal and not hate speech. Their actual rules go beyond just legality. Isn't that correct?
View Steven Guilbeault Profile
Lib. (QC)
Bill C-10 is not about content moderation. The CRTC, in its last 50 years of existence, has never done content moderation, and Bill C-10 doesn't give the CRTC the ability to do content moderation.
View Rachael Harder Profile
CPC (AB)
There are two sections in this bill that were of significance: proposed subsection 2(2.1), which protects individuals; and proposed section 4.1, which protects their content.
Proposed subsection 2(2.1), on individuals, was kept in, but the section that protects their content, what they post online, was taken out. Therefore, they no longer have that protection. Why?
View Steven Guilbeault Profile
Lib. (QC)
You might have heard, like I did a few minutes ago, Justice Deputy Minister Drouin answer that question very clearly, specifying that the powers given to the CRTC are very narrow and targeted and don't have to do with content moderation.
View Melissa Lukings Profile
2021-04-19 12:09
The current rules under PIPEDA assign a fine to companies that violate the provisions that are related to privacy. All businesses have to follow these rules and have a specific format for how they collect, use, handle, disclose, access and allow people to access their own information as held by a company and as used by a company.
When we do the digital charter implementation act, it wouldn't be far-fetched at all to increase the fine for online platforms without banning them entirely or making it impossible for them to operate. It's a harm reduction idea. It's a safer idea than forcing people onto the dark web, where we literally have our hands tied. We can't intervene or help at all. It's recognizing that with the digital charter implementation act, we have the opportunity to look to the future and say, all right, as much as we might like to say that it is not okay to do this to people, by banning things and by prohibiting them, we're forcing them underground. How did Prohibition work out?
We have this opportunity now to actually talk about it. What do we expect from social media? What do we expect from other platforms without putting the criminalization...? For me, this isn't about criminalization. It's about the rules for companies. Without criminalizing the actual people who are in the content, we can put the onus on the company to do user verification.
Think about the same types of things your bank might use. You have a PIN. Some online platforms will require you to submit your driver's licence. If someone who is a user uploads content that has not been made consensually, that can be flagged and be sent immediately to the moderator. I actually also own a website and run a website, so I know how this works. That can be flagged and sent to the website owner. They can then go and look up the user. You have their driver's licence. You can track them. It's perfect. It works really well—way better than the dark web.
By increasing the amount of controls and security that the company has to do, without regulating the actual people who are involved but putting the onus on the company, it reduces the criminalization of sex workers. It helps us to locate and assist people who are being exploited or who are having images uploaded non-consensually. It gives them more power, because when you flag the video, it immediately comes down.
We can do that. We have the technology to do all these things. We can do it automatically. Automation is a real thing.
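The flag-and-verify flow Ms. Lukings outlines could be sketched roughly as follows. Every name here is hypothetical, purely to illustrate the mechanism she describes: uploads require an identity-verified account, and a flag hides the video immediately while surfacing the uploader's verified identity to the moderator.

```python
# Hypothetical sketch of Ms. Lukings' proposal: identity-verified
# uploads, immediate takedown on flag, and uploader traceability.
class Platform:
    def __init__(self):
        self.identities = {}  # user_id -> verified ID document reference
        self.videos = {}      # video_id -> {"user": user_id, "visible": bool}

    def register(self, user_id, id_document):
        """Verify a user at sign-up (e.g. against a driver's licence)."""
        self.identities[user_id] = id_document

    def upload(self, video_id, user_id):
        """Only verified accounts may upload."""
        if user_id not in self.identities:
            raise PermissionError("unverified accounts cannot upload")
        self.videos[video_id] = {"user": user_id, "visible": True}

    def flag(self, video_id):
        """Non-consensual flag: hide immediately, return uploader identity."""
        video = self.videos[video_id]
        video["visible"] = False
        return self.identities[video["user"]]
```

The design choice worth noting is that the onus sits on the company, not the people in the content: the flag itself triggers takedown, and accountability flows from the verification record, which is the "you have their driver's licence, you can track them" point above.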
View Colin Carrie Profile
CPC (ON)
2021-04-19 12:12
As one of my colleagues said, as technology improves, we're trying to keep up with it.
Ms. Melissa Lukings: Exactly.
Mr. Colin Carrie: It's almost impossible. I've done a lot of work on human trafficking. I think everybody is in agreement there. When you have a young person plead with this committee about a way in which we could work with regulators so that images could be taken down, anything you could send to the committee that would enlighten us would be greatly appreciated.
Talking about sex work is a whole other study.
View David Lametti Profile
Lib. (QC)
Thank you, Mr. Chair.
I'm accompanied today by François Daigle, the associate deputy minister of the Department of Justice. Thank you for the invitation to appear before you today.
I'd like to make some general comments on some of the issues raised during previous meetings of the committee's study.
I'd like to emphasize that the government is committed to keeping our children safe, including online, as Minister Blair just said. Canada's criminal laws in this area are among the most comprehensive in the world.
The Criminal Code prohibits all forms of making, distributing, transmitting, making available, accessing, selling, advertising, exporting and possessing child pornography, which the Criminal Code broadly defines as material involving the depiction of sexual exploitation of persons under the age of 18 years.
The Criminal Code also prohibits luring—that is, communicating with a young person, using a computer, including online, for the purpose of facilitating the commission of a sexual offence against that young person. It prohibits agreeing to or making arrangements with another person to commit a sexual offence against a child, and it prohibits providing sexually explicit material to a young person for the purpose of facilitating the commission of a sexual offence against that young person.
Furthermore, the Criminal Code also prohibits voyeurism and the non-consensual distribution of intimate images, which are particularly germane to both the online world and the discussion we are having today.
Offences of a general application may also apply to criminal conduct that takes place online or that is facilitated by the use of the Internet. For example, criminal harassment and human trafficking offences may apply, depending upon the facts of the case.
Courts are also authorized to order the removal of child sexual exploitation material and other criminal content, such as intimate images, voyeuristic material or hate propaganda, where it is being made available to the public from a server in Canada.
In addition to the Criminal Code, as Minister of Justice I'm responsible for An Act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service. This act doesn't have a short title, but law practitioners refer to it as the mandatory reporting act, or MRA.
Under the mandatory reporting act, Internet service providers in Canada have two main obligations. The first is to contact the Canadian Centre for Child Protection when they receive child pornography complaints from their subscribers. This centre is the non-governmental agency that operates Cybertip.ca, the national tipline for reporting the online sexual exploitation of children.
The second obligation of Internet service providers is to inform the provincial or territorial police when there are reasonable grounds to believe that their Internet services have been used to commit a child pornography offence.
While Canada's laws are comprehensive, it is my understanding that there has been some concern as to how they are being interpreted and implemented, especially in relation to the troubling media reports about MindGeek and its Pornhub site.
As Minister of Justice, it would not be appropriate for me to comment on ongoing or potential investigations or prosecutions. I would also note that responsibility for the administration of criminal justice, including the investigation and prosecution of sexual exploitation offences, falls largely to my provincial colleagues and counterparts.
However, as the Prime Minister stated during question period on February 3:
...cracking down on illegal online content is something we are taking very, very seriously. Whether it is hate speech, terrorism, child exploitation or any other illegal acts....
In fact, the government takes these measures so seriously that the Prime Minister has given four ministers the mandate to address different aspects of online harms. Minister Blair and I are two of these ministers. As he has mentioned, the Minister of Canadian Heritage is one of the lead [Technical difficulty—Editor] as well.
While the Internet has provided many benefits to Canada and the world, it has also provided criminals with a medium that extends their reach—and thus, their victim base—and a medium that elevates the level of complexity of investigations. One complicating factor is that telecommunications networks and services transcend international borders, while the enforcement authority of police, such as the RCMP, is generally limited to their domestic jurisdiction.
Further, under international law, court orders are generally enforceable only within the jurisdiction of a state. With limited exceptions, their enforcement requires the consent of the other state in which they are sought to be enforced.
Canada is obviously not the only country facing these challenges, which is why we continue to work with our international partners to facilitate international co-operation in the investigation and prosecution of these crimes, notably to strengthen bilateral co-operation and negotiation of new international mutual legal assistance treaties in criminal matters in order to address these issues.
Although mutual legal assistance treaties are a universally accepted method of requesting and obtaining international assistance in criminal matters, even in emergency situations, they weren't designed for the Internet age, where digital evidence is a common component of most criminal investigations and where timeliness is essential to the collection of this evidence because of its volatility.
Canada is actively working with its international partners to address these issues. For example, we are currently participating in the negotiation of a second protocol to the Council of Europe Convention on Cybercrime to enhance international co-operation on cross-border access to data.
Thank you.
View Martin Champoux Profile
BQ (QC)
View Martin Champoux Profile
2021-03-29 11:28
I will interpret that response as a no. So I have to conclude that you don't have any francophone moderators in Quebec. It was a simple question that you could have answered with yes or no, but you are telling me that you do not want to disclose this information. That's all right.
Mr. Chan, you remember the sad events in Christchurch. I was asking you whether you control the content that goes out on your platform, because we're discussing what information Facebook allows, and you have some control over what is broadcast. For 17 minutes, the Christchurch killer broadcast his actions live on the Facebook platform.
Do you think you could have stopped that broadcast at that time?
Kevin Chan
2021-03-29 11:29
We were able to detect it and remove it, ultimately, as you point out. Of course we regret the tragedy and we regret that we were not even faster. We have obviously learned a lot from that terrible incident, not just at Facebook. To be fair, we've worked across the sector to build systems and protocols—with governments as well—to ensure that the entire system actually works, not just on Facebook, but across companies, across platforms and with governments. We've built these protocols to move much faster should the regrettable and unfortunate thing happen again.
Kevin Chan
2021-03-29 12:13
There are two ways of enforcing our systems, to be honest. One is the automated system, which, as I think one of your colleagues mentioned, uses artificial intelligence. Some of the technology was developed in Canada: machine learning that goes and finds all these things.
In fact, I have some statistics here. In terms of hate speech, in the last quarter of 2020, our automated systems found over 97% of hate speech directed at groups automatically, before any human had seen them or reported them. That's where we are. Now, 97% is not 100%, so we still have a ways to go, but we're getting better every day. That's our posture. That's the way we do it right now.
The other piece, though, is that because speech is important from a contextual standpoint, we have to be careful in some of the grey zones to determine that the speech is in fact an attack on a community and not something else, for example, spreading awareness about anti-Asian racism. We need humans as well, so part of that 35,000-person team I referred to consists of people who will look at the context and say that this image, this video or this text was shared, but is the context of it to attack Asians, or is it to raise awareness about discrimination and racism? That context matters in terms of whether or not we would enforce and take it down.
It is really a parallel process that meets when we need to get more context. We have automated systems that go and find things automatically. We're constantly improving, but we're at about 97% of proactive identification and we need humans to verify some of the more challenging ones, where the speech is grey and we have to be sure of the context. Then, in the most complicated cases, they get escalated to people like me and Rachel, where we will look at specific pieces of content emanating from Canada, consult with experts and think through whether or not we're going to be drawing the line in the right place.
Lianna McDonald
2021-02-22 11:11
Good morning, Chairperson and distinguished members of the committee. Thank you for giving us this opportunity to present.
I am Lianna McDonald, executive director of the Canadian Centre for Child Protection, a charity dedicated to the personal safety of children. Joining me today is Lloyd Richardson, our director of technology.
By way of background, our agency operates Cybertip.ca, which is Canada’s tip line for reporting the online sexual exploitation of children. The tip line has been operating for over 18 years and currently receives, on average, 3,000 or more public reports per month.
Our agency has witnessed the many ways in which technology has been weaponized against children and how the proliferation of child sexual abuse material, otherwise known as CSAM, and non-consensual material fosters ongoing harm to children and youth. Over the last decade, there has been an explosion of digital media platforms hosting user-generated pornographic content. This, coupled with a complete absence of meaningful regulation, has created the perfect storm whereby transparency and accountability are notably absent. Children have been forced to pay a terrible price for this.
We know that every image or video of CSAM that is publicly available is a source of revictimization for the child in that image or video. For this reason, in 2017 we created Project Arachnid. Processing tens of thousands of images per second, this powerful tool detects known CSAM for the purpose of quickly identifying and triggering the removal of this illegal and harmful content. Project Arachnid has provided our agency with an important lens into how the absence of a regulatory framework fails children. To date, Arachnid has processed more than 126 billion images and has issued over 6.7 million takedown notices to providers around the globe. We keep records of all these notices we send, how long it takes for a platform to remove CSAM once advised of its existence, and data on the uploading of the same or similar images on platforms.
At this point, we would like to share what we have seen on MindGeek’s platforms. Arachnid has detected and confirmed instances of what we believe to be CSAM on their platform at least 193 times in the past three years. These sightings include 66 images of prepubescent CSAM involving very young children; 74 images of indicative CSAM, meaning that the child in the image appears pubescent and roughly between the ages of 11 and 14; and 53 images of post-pubescent CSAM, meaning that sexual maturation of the child may be complete and we have confirmation that the child in the image is under the age of 18.
We do not believe the above numbers are representative of the scope and scale of this problem. These numbers are limited to obvious CSAM of very young children and of identified teenagers. There is likely CSAM involving many other teens that we would not know about, because many victims and survivors are trying to deal with the removal issue on their own. We know this.
MindGeek testified that moderators manually review all content that is uploaded to their services. This is very difficult to take seriously. We know that CSAM has been published on their website in the past. We have some examples to share.
The following image was detected by Arachnid. This image is a still frame taken from a CSAM video of an identified sexual abuse survivor. The child was pubescent, between the ages of 11 and 13, at the time of the recording. The image shows an adult male sexually assaulting the child by inserting his penis in her mouth. He is holding the child’s hair and head with one hand and his penis with the other hand. Only his midsection is visible in the image, whereas the child’s face is completely visible. A removal request was generated by Project Arachnid. It took at least four days for that image to come down.
The next example was detected also by Project Arachnid. It is a CSAM image of two unidentified sexual abuse victims. The children pictured in the image are approximately 6 to 8 years of age. The boy is lying on his back with his legs spread. The girl is lying on top of him with her face between his legs. Her own legs are straddling his head. The girl has the boy’s penis in her mouth. Her face is completely visible. The image came down the same day we sent the notice requesting this removal.
We have other examples, but my time is limited.
While the spotlight is currently focused on MindGeek, we want to make it clear that this type of online harm is occurring daily across many mainstream and not-so-mainstream companies operating websites, social media and messaging services. Any of them could have been put under this microscope as MindGeek has been by this committee. It is clear that whatever companies claim they are doing to keep CSAM off their servers, it is not enough.
Let's not lose sight of the core problem that led to this moment. We've allowed digital spaces where children and adults intersect to operate with no oversight. To add insult to injury, we have also allowed individual companies to decide the scale and scope of their moderation practices. This has left many victims and survivors at the mercy of these companies to decide if they take action or not.
Our two-decades-long social experiment with an unregulated Internet has shown that tech companies are failing to prioritize the protection of children online. Not only has CSAM been allowed to fester online, but children have also been harmed by the ease with which they can access graphic and violent pornographic content. Through our collective inaction, we have facilitated the development of an online space that has virtually no rules, certainly no oversight, and that consistently prioritizes profits over the welfare and protection of children. We do not accept this standard in other forms of media, including television, radio and print. Equally, we should not accept it in the digital space.
This is a global issue. It needs a global coordinated response with strong clear laws that require tech companies to do this: implement tools to combat the relentless reuploading of illegal content; hire trained and effectively supervised staff to carry out moderation and content removal tasks at scale; keep detailed records of user reports and responses that can be audited; be accountable for moderation and removal decisions and the harm that flows to individuals when companies fail in this capacity; and finally, build in, by design, features that prioritize the best interests and rights of children.
In closing, Canada needs to assume a leadership role in cleaning up the nightmare that has resulted from an online world that is lacking any regulatory and legal oversight. It is clear that relying upon the voluntary actions of companies has failed society and children miserably. The time has come to impose some guardrails in this space and show the leadership that our children deserve.
I thank you for your time.
Daniel Bernhard
2021-02-22 11:19
Madam Chair, honourable members of the committee, thank you for inviting me to appear today.
My name is Daniel Bernhard, and I am the executive director of Friends of Canadian Broadcasting, an independent citizens' organization that promotes Canadian culture, values and sovereignty on air and online.
Last September, Friends released “Platform for Harm”, a comprehensive legal analysis showing that under long-standing Canadian common law, platforms like Pornhub and Facebook are already liable for the user-generated content they promote.
On February 5, Pornhub executives gave contemptuous and, frankly, contemptible testimony to this committee, attempting to explain away all the illegal content that they promoted to millions of Canadians and millions more around the world.
Amoral as the Pornhub executives appear to be, it would be a mistake, in my opinion, to treat their behaviour as a strictly moral failing. As Mr. Angus said that day, the activity you are studying is quite possibly criminal.
Pornhub does not dispute having disseminated vast amounts of child sexual abuse material, and Ms. McDonald just confirmed that fact. On February 5, the company's executives acknowledged that 80% of their content was unverified, some 10 million videos, and they acknowledged that they transmitted and recommended large amounts of illegal content to the public.
Of course, Pornhub's leaders tried to blame everybody but themselves. Their first defence is ignorance. They claim they can't remove illegal content from the platform because until a user flags it for them, they don't know it's there. In any case, they claim that responsibility lies with the person who uploaded the content and not with them. However, the law does not support this position. Yes, uploaders are liable, but so are platforms promoting illegal content if they know about it in advance and publish it anyway or if they are made aware of it post-publication and neglect to remove it.
This brings us to their second defence, incompetence. Given the high cost of human moderation, Pornhub employs software to find offending content, yet they hold themselves blameless when their software doesn't actually work. As Mark Zuckerberg has done so many times, Pornhub promised you that they'll do better. “Will do better” isn't a defence. It's a confession.
I wish Pornhub were an outlier, but it's not. In 2018, the U.S. National Center for Missing and Exploited Children received over 18 million referrals of child sexual abuse materials, according to the New York Times. Most of it was found on Facebook. There were more than 50,000 reports per day. That's just what they caught. The volume of user-uploaded, platform-promoted child sexual abuse material is now so vast that the FBI must prioritize cases involving infants and toddlers, and according to the New York Times, “are essentially not able to respond to reports of anybody older than that”.
These platforms also disseminate a great deal of illegal content that is not of a sexual nature, including incitement to violence, death threats, and the sale of drugs and illegal weapons. The Alliance to Counter Crime Online regularly discovers such content on Facebook, YouTube and Amazon. There is even an illegal market for human remains on Facebook.
The volume of content that these platforms handle does not excuse them from disseminating and recommending illegal material. If widespread distribution of illegal content is an unavoidable side effect of your business, then your business should not exist, period.
Can you imagine an airline being allowed to carry passengers when every other flight crashes? Imagine if they just said that flying is hard and kept going. Yet Pornhub and Facebook would have you believe just that: that operating illegally is fine because they can't operate otherwise. That's like saying, “Give me a break officer. Of course I couldn't drive straight. I had way too much to drink.”
The government promises new legislation to hold platforms liable in some way for the content that they promote, and this is a welcome development. But do we really need a new law to tell us that broadcasting child sexual abuse material is illegal? How would you react if CTV did that? Exactly.
In closing, our research is clear. In Canada, platforms are already liable for circulating illegal user-generated content. Why hasn't the Pornhub case led to charges? Perhaps you can invite RCMP Commissioner Lucki to answer that question. Ministers Blair and Lametti could also weigh in. I'd be curious to hear what they have to say.
Don't get me wrong. The work that you are doing to draw attention to Pornhub's atrocious behaviour is vital, but you should also be asking why this case is being tried at committee and not in court.
Here's the question: Does Pornhub's CEO belong in Hansard or in handcuffs? This is a basic question of law and order and of Canada's sovereignty over its media industries. It is an urgent question. Canadian children, young women and girls cannot wait for a new law and neither should we.
Thank you very much. I welcome your questions.
John F. Clark
2021-02-22 11:25
Good morning, Madam Chair Shanahan and honourable members of the committee.
My name is John Clark. I am the president and CEO of the U.S.-based National Center for Missing and Exploited Children, sometimes known as NCMEC.
I am honoured to be here today to provide the committee with NCMEC's perspective on the growing problem of child sexual exploitation online, the role of combatting the dangers children can encounter on the Internet, and NCMEC's experience with the website Pornhub.
Before I begin with my testimony, I'd like to clarify for the committee that NCMEC and Pornhub are not partners. We do not have a partnership with Pornhub. Pornhub has registered to voluntarily report instances of child sexual abuse material on its website to NCMEC. This does not create a partnership between NCMEC and Pornhub, as Pornhub recently claimed during some of their testimony.
NCMEC was created in 1984 by child advocates as a private, non-profit organization to help find missing children, reduce child sexual exploitation and prevent child victimization. Today I will focus on NCMEC's mission to reduce online child sexual exploitation.
NCMEC's core program to combat online child sexual exploitation is the CyberTipline. The CyberTipline is a tool for members of the public and electronic service providers, or ESPs, to report child sexual abuse material to NCMEC.
Since we created the CyberTipline over 23 years ago, the number of reports we receive has exploded. In 2019 we received 16.9 million reports to the CyberTipline. Last year we received over 21 million reports of international and domestic online child sexual abuse. We have received a total of over 84 million reports since the CyberTipline began.
A United States federal law requires a U.S.-based ESP to report apparent child sexual abuse material to NCMEC's CyberTipline. This law does not apply to ESPs that are based in other countries. However, several non-U.S. ESPs, including Pornhub, have chosen to voluntarily register with NCMEC and report child sexual abuse material to the CyberTipline.
The number of reports of child sexual exploitation received by NCMEC is heartbreaking and daunting. So, too, are the many new trends NCMEC has seen in recent years. These trends include a tremendous increase in sexual abuse videos reported to NCMEC; reports of increasingly graphic and violent sexual abuse images; and videos of infants and young children, including on-demand sexual abuse in a pay-per-view format and videos showing the rape of young children.
A broader range of online platforms are being used to access, store, trade and download child sexual abuse material, including chat, video and messaging apps, video- and photo-sharing platforms, social media and dating sites, gaming platforms and email systems.
NCMEC is fortunate to work with certain technology companies that employ significant time and financial resources on measures to combat online child sexual abuse on their platforms. These measures include large teams of well-trained human content moderators; sophisticated technology tools to detect abusive content, report it to NCMEC and prevent it from even being posted; engagement in voluntary initiatives to combat online child sexual exploitation offered by NCMEC and other ESPs; failproof and readily accessible ways for users to report content; and immediate removal of content reported as being child sexual abuse.
NCMEC applauds the companies that adopt these measures. Some companies, however, do not adopt child protection measures at all. Others adopt half-measures as PR strategies to try to show commitment to child protection while minimizing disruption to their operations.
Too many companies operate business models that are inherently dangerous. Many of these sites also fail to adopt basic safeguards, or do so only after too many children have been exploited and abused on their sites.
In March 2020, MindGeek voluntarily registered to report child sexual abuse material, or CSAM, on several of its websites to NCMEC's CyberTipline. These websites include Pornhub, as well as RedTube, Tube8 and YouPorn. Between April 2020 and December 2020, Pornhub submitted over 13,000 reports related to CSAM through NCMEC's CyberTipline; however, Pornhub recently informed NCMEC that 9,000 of these reports were duplicative. NCMEC has not been able to verify Pornhub's claim.
After MindGeek's testimony before this committee earlier this month, MindGeek signed agreements with NCMEC to access our hash-sharing databases. These arrangements would allow MindGeek to access hashes of CSAM and sexually exploitive content that have been tagged and shared by NCMEC with other non-profits and ESPs to detect and remove content. Pornhub has not taken steps yet to access these databases or use these hashes.
Over the past year NCMEC has been contacted by several survivors asking for our help in removing sexually abusive content of themselves as children that was on Pornhub. Several of these survivors told us they had contacted Pornhub asking them to remove the content, but the content still remained up on the Pornhub website. In several of these instances NCMEC was able to contact Pornhub directly, which then resulted in the content being removed from the website.
We often focus on the tremendous number of CyberTipline reports that NCMEC receives and the huge volume of child sexual abuse material contained in these reports. However, our focus should more appropriately be on the child victims and the impact the continuous distribution of these images has on their lives. This is the true social tragedy of online child sexual exploitation.
NCMEC commends the committee for listening to the voices of the survivors in approaching these issues relating to Pornhub. By working closely with the survivors, NCMEC has learned the trauma suffered by these child victims is unique. The continued sharing and recirculation of a child's sexually abusive images and videos inflicts significant revictimization on the child. When any website, whether it's Pornhub or another site, allows a child's sexually abusive video to be uploaded, tagged with a graphic description of their abuse and downloaded and shared, it causes devastating harm to the child. It is essential for these websites to have effective means to review content before it's posted, to remove content when it's reported as child sexual exploitation, to give the benefit of doubt to the child or the parent or lawyer when they report content as child sexual exploitation, and to block the recirculation of abusive content once it has been removed.
Child survivors and the children who have yet to be identified and recovered from their abuse depend on us to hold technology companies accountable for the content on their platforms.
I want to thank you for the opportunity to appear before this committee. This is an increasingly important topic. I look forward to answering the committee's questions regarding NCMEC's work on these issues.
Arnold Viersen
CPC (AB)
Mr. Richardson, I'll ask you this seeing as you're the tech guy here.
The big trouble we've been studying here at this committee is around the age and the consent of the folks who are depicted in these videos. We hear a lot about how long it took to take a video down and things like that, but certainly there would be methods of ensuring that these videos never show up in the first place.
I was wondering if you could comment on that. If you're bragging that you are the leading tech company in the world, surely there's technology to keep this stuff off the Internet to begin with.
Lloyd Richardson
2021-02-22 12:05
There is, but I would kind of invert that a little bit. It's not a technical issue.
Let's go back in time to the 1980s, before the popularized Internet, when we had pornography and we didn't see child sexual abuse material showing up in Playboy magazine. It's not necessarily a technical issue. If you are in fact moderating everything that comes up on your platform, this should never happen. We don't see child pornography show up on the CBC's services, because moderation happens there; they have control over the content. That's not to say you can't leverage technology, as we do in Project Arachnid, to do proactive detection of known child sexual abuse material. But let's not fixate on the new and fancy "I have an AI classifier that can automatically detect child pornography." That's great and all, but it's never going to detect everything, and it's not going to have the accuracy you get from actual human moderators looking at material. It's an addition to something that's already there, so it's important not to belabour the technological side of things.
Daniel Bernhard
2021-02-22 12:07
It was just to say that I agree. Platforms want to operate at a certain scale, which requires them not to validate any of the content that comes up, yet that seems to result in illegal outcomes. It's not really for us to say how they should deal with this, but simply that if illegal content is there, they should face the consequences.
To Mr. Richardson's point, I have one final issue. It's not just CBC, CTV, etc., who make sure that their content is lawful. They also have to make sure that the advertising they run is lawful, and that op-eds and other third-party contributed content are lawful. Otherwise, they are jointly liable. This is how the law works, and I see no reason why it shouldn't apply in the case of Pornhub, Facebook, Amazon or any other provider that is recommending and facilitating illegal behaviour through its service.
Han Dong
Lib. (ON)
Let's say going forward, if you find there's enough ground to start an investigation, would you be able to go back...?
MindGeek told us that they made some improvements on their approval process or screening process so everything is great now, but would you be able to, retroactively, take a look at their actions in the past?
Stephen White
2021-02-22 12:58
It would depend on the data retention policies that they have. How long they retain their data often differs across companies, platforms and Internet providers. That is always a concern for us when we're doing investigations: when we try to look backwards at a period in time, whether or not that data has been retained.
Witness-Témoin 1
2021-02-19 14:47
When I was 24, I met someone I thought was a really nice guy. I married him, and as soon as he thought I was stuck, he stopped being nice pretty quickly. In April 2020, I moved away from our home to be safe, and obviously, we're not together anymore.
During our relationship, I had let him take some pictures. I was uncomfortable at first, because I had never been in any picture like that, but I trusted him and I wanted to keep him happy. It wasn't until August of 2020 that I discovered those private photos had been uploaded to porn sites, including Pornhub.
I was upset about the photos, but it was about to get worse. Finding the photos led me to a video. I did not know the video existed. I found out about it by watching it on Pornhub. In the title of the video, it says I'm sleeping. The tags include “sleeping” and “sleeping pills”. Whether I was asleep or drugged is impossible to know after the fact, but what is clear in the video is that I am not conscious and there is nothing to suggest consent. The video is clearly homemade and was uploaded by an anonymous email address. This is the content that the Pornhub moderators supposedly viewed and decided belonged on their porn site. My video had been uploaded in August of 2017, so by the time I found it, it had been active on Pornhub for over three years, and I had no idea.
I didn’t try to get the video down right away because I showed it to the police the next morning, and they told me to leave everything until they were done with it. However, sometime between August 16 and 19, the Pornhub video became no longer playable. It said “technical difficulties”. About that same time, I noticed that Pornhub was pulling their tags that directly indicated non-consensual content. For example, if you searched “sleeping pills” in early September, it didn't return any results. This was, of course, not the case in mid-August, so my best theory is that the video disappeared as they tried to clean up those kinds of tags.
In all that time, the video did not get flagged or removed. The viewers, rather than being turned away by sexual assault videos, were actively searching out that content. The tags made this possible, and they knew what they were watching before they clicked. It is a profound betrayal to know that thousands of men saw your assault and not only did nothing to flag it but actively sought it out and enjoyed it.
On Pornhub, there is a comment section, so the night I found my video, I also got to read a man describe in graphic physical detail just how much he enjoyed himself watching it. On another site, thousands of men watched my video and instead of flagging it, they awarded it top-rated for a certain body part. This video is not a one-off that slipped through a filter. Sexual assault is not an anomaly on the porn sites; it is a genre. This leaves little incentive for these sites to moderate such content.
To give an idea of the scope of the spread, as of early January 2021—after the December purge, and after the RCMP had removed a bunch for me—googling the name of my Pornhub video still returned over 1,900 results. One cause of the spread is, of course, users downloading it and reuploading it. There are definitely some of these floating around, but the most significant way my video was spread was through links. MindGeek did this by putting links to my Pornhub video on their other sites as a cheap way of adding content to those sites. Many of the other third party sites also use this method, so they too linked to my video on Pornhub. Pornhub is the source of all 1,900 of those search results.
The upside with linking is that when the video is removed from Pornhub, it's not playable on these other sites either. The downside is that Pornhub creates a thumbnail image file for all the videos uploaded to its site, and this image can be downloaded even if the video is only a link. There are still quite a few of these thumbnails on porn sites and in search engine caches. The thumbnail is still a picture of me naked. I don't want it on the Internet. Also, when Pornhub deleted my video, they didn't delete any of the data surrounding it like the title and the username. That is also a problem.
I contacted Pornhub in January to get them to remove the data and the thumbnails associated with their site. At first they pretended not to know what I was talking about. I sent them all the information again. They sent me a link to Google and told me to go do it myself. After a month and a half and eight emails, Pornhub has removed some of the data and thumbnails associated with their site and de-indexed a few things on one search engine, but it's still not all gone. I think they're just ignoring me now.
I also asked them for help in removing the thumbnails and the content that spread from Pornhub to these other sites. They told me that they can't remove their content from the other sites it spreads to. However, they have an entire program where they proactively do exactly that for their exclusive model content. They advertise it. They monitor the Internet for where these videos spread, they take them down for them and they even pay them a bonus. All I'm asking is that they pretend to care as much about their non-consensual content as they do about their paid, exclusive content.
Nothing will ever be able to undo what has been done. At this point, I just want to be off the Internet.
Thanks to Pornhub, today is day 1,292 that I have been naked on these porn sites.
Laila Mickelwait
2021-02-19 15:48
Thank you.
For anyone who's viewing this online and also for the committee, I want to offer a content warning before I get started of graphic, very offensive language and descriptions of sexual violence. I don't do this to be sensational. I do it because I think it's important for the committee to have an accurate idea and understanding of the situation with the content on Pornhub without mincing words.
As I proceed, I want to ask the committee to keep in mind that CEO Feras Antoon said to this committee, “every single piece of content is viewed by our human moderators”—every single piece of content.
COO David Tassillo said, “There should be zero videos tabbed under either [child pornography or non-consensual acts] categories. Those categories are banned from being used on our site, as the keywords are.” They said, “child abuse material has no place on our platform. It makes us lose money.”
I believe it's important to elevate the voices of survivors, and I want to read some quotes and testimonies from survivors who have reached out to me over the past year.
Kate said, “I was 15 years old. My ex was 20. He was into homemade videos and stuff so he had videotaped us having sex. One day he said, 'Let me show you something'. He pulled up Pornhub on his phone and showed me that he had posted a video of us having sex. I tried to contact Pornhub and get them to take it down, but they never contacted me back or did anything about it. He also posted my 'sexy pics' on his account. Grown men and women were looking and watching me there. I'm disgusted.”
Beth said, “I was 16 and I was drunk once at a friend's party. I woke up. I was naked and pictures of me were on Pornhub, along with my name and my phone number. I had calls and texts to the point that I changed my number.”
Nicki said, “When I was 14 years old, I made the decision that changed my life. I was having a sexual FaceTime call. I showed him areas of my body that were private. I didn't know at the time but he was recording and he had uploaded it to Pornhub. The name of the video even had the words 'young teen' but that was not enough for Pornhub to analyze it and make sure it was consensual or legal. Years later my classmates found it on the website and told me about it. I was 16 when they found it. The first one had over one million views. We got the first one taken down, but the identical video was posted over and over again. I reported it to the police, and they opened an investigation. They told me they had contacted Pornhub to make sure it wouldn't be shown anymore, but the video was posted again. During these times of being posted multiple times, I was bullied by my entire school. Every boy and girl in my high school saw my body, and it changed my life.”
Sarah said, “I found out an explicit video of me was posted to Pornhub. I was underage. I did not send it to anyone to the best of my recollection, and it got hacked from my phone. I was horrified and I reported it and filed a complaint. Police took a statement. I'm waiting for the detective to contact me. Even if the video is taken down it could always come back. This could ruin my life and my future. I'm terrified and I'm traumatized.”
Anastasia said, “There's a video on their site that was taken of me without my knowledge while I was underage. It is still up on their site despite my reporting it numerous times, stating that I'm underage in the video and that it was taken and posted without my consent.”
Linda said, “I'm now 20 years old and I'm a sex and porn trafficking survivor. At the age of nine, my biological mother sold me in exchange for drugs and for money. This happened until I was rescued at the age of 17 and placed in a safe house. For eight years I was raped and beaten, and the video was taped by hundreds of men, women and even married couples. I never thought I would live to be 18 years old. I was hospitalized dozens of times and one time I was forced to drink ammonia until I passed out and was raped for hours after that, even though my mouth and my throat were burning. I was forced to have sex with other children, especially young girls. I still have nightmares and extreme PTSD from this, but it's not fair that my life is so hard now because I was forced into a life of pornography as a child. I've had to get police involved on multiple occasions to get these videos removed from RedTube, owned by MindGeek, and Pornhub, of me being raped as a minor. I don't understand why it's so difficult. Please stop allowing people to make money off the torture and the coercion of children. It's not fair.”
Keira said, “At the age of 15 I was coerced into being filmed during a sex act, and that video was uploaded without my consent to Pornhub. The uploader was also underage, and they had no way of confirming anyone's age or consent. I have been dealing with image issues, PTSD and sexual discomfort since the incident, into adulthood. This is my personal account, and I have heard similar stories from other women. I will never forgive Pornhub for allowing my abuse to be shared publicly and causing me to relive that pain years later.”
Amanda said, “Leaked nude photos from when I was underage were put online, allowed to be uploaded by Pornhub and men were allowed to vote on which child was the most attractive. Pornhub told me that there was no point in making a fuss since people had already screenshot the photos, so deleting the video is pointless.”
Tiana said, “When I was 14 years old, someone recorded me performing oral sex without my knowledge or consent. The video was used as blackmail and was shared on Pornhub. Police contacted Pornhub, and it took them a while to delete it. It ruined my life, and people still bring it up to this day.”
Caroline said, “I spent two months begging Pornhub to take down a video of me being orally raped at the age of 15. I was crying, screaming. I had a bloody nose. It was up for a year and a half before I knew about it.”
Beth said, “I was 10 the first time I was raped. My uncle saw those porn stories and used me to play out his fantasies. Two years later I found the videos of me on Pornhub.”
I could go on and on. My time is short. I have many accounts of children who personally reached out to me, whom I've talked to, who have had their abuse immortalized on Pornhub.
All of the following is a small sample of evidence that has been documented on Pornhub in 2020, before the mass deletion of 10 million videos from unverified and unknown users.
Videos on Pornhub are titled “Young Teen Gets Pounded”; “Old Man with Young Teen”; “Young Girl Tricked”; “A Club Where you can Play with Little Girls, and It's So Fun”; “Cute Amateur Teen Drunk and Stoned”; “First BBC on Drugs”; “Stolen Teens' Secret Peeing Scenes”, with video cameras inside girls' toilets videotaping them without their knowledge; “Amateur Sex Tape Stolen from Teen Girl's Computer”; “Daddy Fucks Young Teen Boy Virgin, First Time”; “Tika Virgin from High School Jakarta Grade Two”; “Jovencitas violadas”, meaning “raped young girls”, from an unknown user; “Drunk Teen Fucked by Black Stranger”; “Innocent Teenage Girls are Used and Exploited”; “Crying Teen”; “Passed Out Teen”; “Very Young South American” with the tags “teenager” and “young”, and a comment says, “This girls looks 13”; “Chinese Northeast Middle School”; “Junior High School Student”; “Anal Crying Teen”; “I'm 14”, with a video of a young boy masturbating; “Gay 14”, a video of a young boy masturbating; and “Pinay Junior High Student”.
I could go on and on. Again, suggested and promoted searches by Pornhub that were found on their site as of 2020 are search terms that Pornhub actually serves up to its consumers: “abused teen”, “crying teen”, “punished teen”, “anal crying teen”, “teen destroyed”, “young Black teen”, “young, tiny teens”, “young girl”, “tiny, young girls”, “sleeping teen”, “middle school sex”, “Snapchat teen”, “middle student”, “stolen teen sex tape”, “stolen teen homemade” and “very young teens”.
As for comments on the site, there are hundreds of documented comments, if not thousands of documented comments, where users are flagging these child sexual abuse material videos to Pornhub, and they're ignored. They're on the site for months and even years. Examples are, “Isn't this technically child porn?”, “She looks 13. That's illegal”, “Wow, she looks like she's 12”, “I'm not legal but I have a winning video”, “She looks nine. Trade CP?”, and “She looks like she's 12, like she hasn't even hit puberty.”
Again, David Tassillo told this committee that “Child abuse material has no place on our platform. It makes us lose money.” I would like to tell the committee that that is not true, because child sexual abuse material has made its way onto Pornhub in a significant way. Every single video of a child, or of an abused adult, found on Pornhub is heavily monetized through ads, premium memberships and data collection. In some cases it's being directly sold for Pornhub's profit: 35% to Pornhub and 65% to the person who uploaded the sex act, through the model hub program.
I want to point out to the committee that any minor used in a commercial sex act is a victim of sex trafficking according to international law as well as domestic law. I think it's very important for us to realize that.
I also want to make it clear that Pornhub added insult to injury by intentionally building a download button into their system, whereby every single video on Pornhub was made available for consumers to possess: it was transferred from MindGeek servers to individuals. One hundred and fifteen million users a day have the ability to commit the federal crime of downloading and possessing child sexual abuse material, because Pornhub built that feature into the design of their website.
Feras Antoon said to this committee that “the spread of unlawful content online and...the non-consensual sharing of intimate images...goes against everything we stand for at MindGeek and Pornhub.” He said, “this type of material has no place on our platforms and is contrary to our values and our business model.” He said, “When David and I joined MindGeek in 2008, our goal was to create the most inclusive and safe adult community on the Internet” and that it was designed to value privacy. He said, “We knew this could be possible only if safety and security were our top priority.”
Anne wrote me and said, “Revenge porn is a major issue. I was a victim of it two years ago when I wouldn't take back my ex-fiancé. A couple of weeks later I received a call saying that my private photos I had sent him were uploaded to Pornhub. It was such a hassle to get them down.”
We have scores of testimonies of victims who have experienced the same thing.
Jessica says, “Most of my videos were done by my ex. I was too high to consent. I was blacked out. He put them on Pornhub without my permission.”
The following is a small sample of content on Pornhub as of 2020. On September 24, you could search the initials “GDP”, for “girls do porn”, which is a known sex trafficking operation which Pornhub is well aware is for trafficking victims, and you could turn up 338 results for these sex trafficking victims on the site. Other videos were titled “Fucked Sister Hard in the Ass While She Was Drunk and Sleeping”; “Drunk Girl Gets Handcuffed and Abused Next to the Party”; “Fucked Sleeping Schoolgirl After a Drunk Party”; and “Tinder Girl Passed Out At My House, So I Stuck It in Her Ass”.
Tiziana Cantone was a victim who committed suicide. Her video was on the site as of 2020. Other titles were “Anal Sex With a Drunk Girl”; “Drunk Asian Girl Humped By My Friend”; “Hidden Camera: Girls in the Toilet At Prom”; and “CCTV in Changing Room: Full Naked Hockey Team”. Suggested search terms to users on the site include “real hidden camera”; “hidden camera”; “voyeur”; “spycam shower”; “stop fucking me”; and “rape” in Chinese.
When pressed on the allowance of these kinds of non-consensual and illegal videos on his site, David Tassillo said to this committee, “We are a start-up still.” He said that about the 10th most-trafficked website in the world, a site that makes hundreds of millions of dollars a year on this content.
In only a couple of minutes more, I want to finish. Feras Antoon told the committee that Pornhub was designed to celebrate freedom of expression. However, there are many instances of extreme racism on the site as of 2020, including “Black Slave Girl Brutalized”; “How to Treat Your Nigger”; “Real Drunk Stupid Chink Whore”; “Racist White Slut Sucks and Fucks Black Dick and Says Nigger”.
I also want to point out that VP Corey Urman has said in the media many times that they have a vast and extensive team of human moderators viewing each and every video before it is uploaded to the site. I want to tell the committee that I have evidence that, as of early 2020, Pornhub had under 10 moderators per eight-hour shift reviewing content on the site, in Cyprus. They had only 30 to 31 employees per day looking at content, and that's for all of MindGeek's tube sites. These constitute the world's largest and most popular tube sites, with millions of videos uploaded per year.
Lastly, David Tassillo said, “We digitally fingerprint any content removed from our site so that it cannot be re-uploaded.” He said this to the committee, but we have emails of Pornhub telling victims that they do not guarantee that their child abuse will not be reuploaded to the site, and they callously tell victims, “Please educate yourself on the limits of our software.”
On behalf of two million people who have signed the petition from 192 countries to hold Pornhub accountable and over 300 organizations around the world that are calling for accountability by Pornhub, I want to thank this committee for taking this issue seriously and for conducting this investigation.
Megan Walker
2021-02-19 16:03
Good afternoon.
I hope everybody is well today. This is a heavy subject and you've had heavy testimony.
I really want to honour the lives of the victims who have come forward today, Ms. Galy and the two guests that you've had, as well as Ms. Fleites, who appeared last week. It takes an incredible amount of courage for women to come forward. We always say the most important voice is the voice of victims and we must always listen to survivors.
The London Abused Women's Centre last year had 143 women report to us that technology was used in their assault and another 64 reported that pornography was prevalent in their relationship and oftentimes they were forced to play out the scenes in pornography.
One of the women we served, who was involved with Pornhub, wrote, “It was soul destroying to find videos of me on Pornhub. Discovering how readily available they were broke me. Being hit with the reality that anyone could see the darkest points of my life nearly killed me. I had to stop looking for more videos after I found four. I was suicidal and have deep-seated shame about those videos even though I was a child. It causes a fear I can’t put words to.”
One of the common themes we hear from victims of pornography who are not able to have their pictures or images removed is that they feel incredible shame and are oftentimes suicidal. As far as the shame is concerned, we want to make sure they understand any shame and blame they may feel belongs to the abuser and to Pornhub and MindGeek. They do not have a responsibility to feel that. With respect to the suicidal ideations, I want to say that as bad as this is—and I can't even go to a place where I understand that—I believe that with help and counselling there is hope.
We know that Pornhub has facilitated and distributed the uploading of videos of minors being sexually exploited and assaulted. We also know that non-consenting adults and trafficked women have been raped and tortured for the world to see. Pornhub has actively participated in the downloading of these videos, which is leaving a lifelong imprint of trauma in the lives of millions of women.
It took The New York Times' exposé for Pornhub to remove millions of videos, after an investigation showed that a large number of them featured underage girls and non-consenting and trafficked women and girls. Pornhub is complicit in the trafficking of women and girls. That it took The New York Times to expose this last year shows that Pornhub, even though its CEO and COO acknowledged they are parents and grandparents, really doesn't care about the lives of women and girls.
Many of the videos were posted on Pornhub's website under the headings of “torture porn”, “teen porn” and “fetish porn”, and all of those headings continue to remain in place today.
MindGeek's CEO testified, as you heard from Laila, that “every single piece of content is viewed by our human moderators.” This is an absolute joke. It's ridiculous and, frankly, impossible, given how many millions of videos are uploaded. Further, only a team of forensic pediatricians can determine a girl's age, not men and women hired at perhaps minimum wage to watch videos of rape scenes all day long and identify who is underage, who is consenting and who is not.
We know that the goal of MindGeek's CEO and COO is to make millions of dollars to support their lavish lifestyles through their premium subscriptions, the sale of ads, and the harvesting and sale of user data. None of that would be possible without the exploitation of children and women. In fact, we know from our experience that men will pay more to see children exploited.
I have reviewed sections 162 and 163 of the Criminal Code of Canada. I am not a lawyer, but taken literally, it appears that almost every single section in the code could apply to MindGeek, Pornhub and all of their affiliated businesses as well as their CEO and COO.
In addition to the potential crimes, MindGeek appears to be and likely is in violation of international laws on trafficking and child sexual exploitation, and also has not complied with the mandatory reporting requirements in Canada. Pornhub has facilitated and profited from the exploitation of girls and women.
The London Abused Women's Centre offers a number of recommendations.
The first is that robust funding be made available to support all victims. They are suicidal. This is heartbreaking, and they need to make sure they have access to service.
Given the testimony from at least one survivor who stated that she was sexually exploited as a child, and given the testimony of MindGeek's CEO and COO acknowledging that children and non-consenting women have been exploited on Pornhub, we recommend that the committee immediately send witness statements to the police for a criminal investigation.
Also, given the testimony of MindGeek's CEO and COO acknowledging their failure to self-regulate, it is recommended that Parliament legislate the end of self-regulation by MindGeek, its affiliated and subsidiary companies and the pornography industry.
We recommend that a third party not associated with MindGeek, its affiliated and subsidiary companies or the pornography industry be retained to verify age and consent.
We recommend that Parliament legislate that all credit card companies be prohibited from providing services to MindGeek, its affiliated and subsidiary companies and the pornography industry until the third party recommended in the clause I just read is established.
Finally, I would say that it is recommended that the Canada Revenue Agency conduct a forensic audit and criminal investigation on the finances and ownership of MindGeek and its affiliated and subsidiary companies in order to determine if they are in compliance with the relevant Canadian and international tax, disclosure and other laws and regulations.
It is the role of government to regulate all industries in order to protect its citizens. When an industry is predatory, especially the porn industry, it is incumbent on the government to regulate the production and consumption related to that industry.
I thank you for giving me this time today.
Shannon Stubbs
CPC (AB)
2021-02-19 16:17
Thank you, Laila.
To your point about the gap between what they claim related to child sexual assault material and non-consensual material, do you have any comments in terms of what employees at MindGeek actually are experiencing? If you've ever spoken to any, might they have given you any understanding about how their so-called moderators—I think they actually call them “content formatters”—work?
Laila Mickelwait
2021-02-19 16:18
Yes. A number of whistle-blowers from MindGeek have reached out. I have been in contact with them. Attorneys have also been in contact with them. They've also been put in touch with law enforcement.
These whistle-blowers who've come forward have revealed things about the way Pornhub has acted with reckless disregard for human safety. They have acted in what I would think is gross criminal negligence. Just the idea that the world's largest and most popular porn site, with seven million uploads per year, 13 million videos available on the site at any given moment and 11 million comments posted to the site per year, many of which indicate that this is child rape, sex trafficking and non-consensual, that they would think it was okay to have 10 individuals per shift—including bathroom breaks, cigarette breaks and lunch breaks—reviewing these millions of videos and guessing, using an archaic Excel spreadsheet....
I have been given internal documents from MindGeek. I know MindGeek's executives refused to hand over such documents to the committee when they were presenting before you and you asked for them. I do have some of those internal documents, including an archaic Excel spreadsheet showing that in 2016 they had under 100 flagged red words prohibited on the site. You can compare that with what they've done now. It's absolutely reckless.
That David Tassillo and Feras Antoon came before this committee and said they were leaders in child safety is incomprehensible.
Jag Sahota
CPC (AB)
Mr. Chair, I want to thank all the witnesses for coming here today, and for their time and evidence.
MindGeek executives continue to talk at length about their fingerprinting software for preventing reuploads of illegal content to the site. What is your knowledge about that software and its implementation?
Laila Mickelwait
2021-02-19 16:56
As I mentioned before, MindGeek itself has emailed victims acknowledging that the software does not work. MindGeek itself came before this committee and acknowledged that the fingerprinting software does not work in every case, because you can make small edits to these videos and then the hashing and fingerprinting no longer work.
It's extremely problematic when a company relies on and touts this kind of software as a solution while fully understanding that it doesn't work, and telling victims that it doesn't work. It speaks to the fact that we have to go to the front end. We have to go to the point of upload and have procedures and compliance in place to prevent these videos from getting on the site in the first place.
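The limitation described here can be illustrated in a few lines: a cryptographic (exact) hash changes completely after even a tiny edit to a file, so a blocklist of exact hashes cannot recognize an edited re-upload. The following is a minimal sketch in Python using made-up byte strings; it is not a description of any real moderation system.

```python
import hashlib

# Stand-in for a video file's bytes; any repeating pattern will do.
original = b"\x00\x01\x02" * 1000

# A "small edit": trim a few bytes from the start of the file.
trimmed = original[3:]

h1 = hashlib.sha256(original).hexdigest()
h2 = hashlib.sha256(trimmed).hexdigest()

# The edit, however small, produces an entirely different digest, so an
# exact-hash check no longer matches the known-bad entry.
print(h1 == h2)  # False
```

Perceptual fingerprinting systems attempt to close this gap by hashing visual features rather than raw bytes, but as the testimony notes, sufficiently aggressive edits can defeat those as well.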
Francis Fortin
2021-02-19 16:57
From a legal standpoint, you'll recall that a fast approach is available. The approach is set out in section 163.1 of the Criminal Code, which deals with child pornography. I think that you could consider an equivalent provision for the non-consensual distribution of intimate images. You could build on these powers, which allow for the quick removal of illegal content.
That said, you need funding to create teams dedicated to this issue. I know that it's hard to say that one case is more serious than another. However, child pornography cases would be prioritized by the police forces already handling them. What happens when images of an adult are involved? I think that you should also look at creating police units or hybrid units whose mandate would be to address this issue using the proper legal tools.
Regarding the digital footprint, certain technologies already make it possible to recognize what's happening in a video. We've moved beyond mathematical calculations and hashing to find the digital footprint. Software is now available that can help us do this quite well. In my opinion, everyone should check the illegal content database before allowing content to be distributed.
Megan Walker
2021-02-19 16:59
I just want to say really quickly that in the testimony of the MindGeek CEO and COO, I found that they really don't care one bit about interfering legally or about any software. They say in their testimony that for the last two years they've been building the tool called “SafeGuard” to help fight the distribution of non-consensual images. I don't believe that to be true. I would like to see evidence of that.
Also, I would say further that I really don't care if they have that software or not. The reality is that they should not be self-regulating, because they really don't care about anything except how much money they're going to bring in.
Feras Antoon
2021-02-05 13:02
Good afternoon. My name is Feras Antoon.
I'm the chief executive officer of Entreprise MindGeek Canada. With me are David Tassillo, chief operating officer, and Corey Urman, vice-president of product management, video-sharing platforms. We are grateful to the committee for the opportunity to speak with you today.
MindGeek is one of the largest, most well-known brands in the online adult entertainment space. Our flagship website, Pornhub, is among the top five most visited websites on the Internet. Over 12.5% of the adult Canadian population visit our website every day. As a leader in this industry, we share the committee's concern about the spread of unlawful content online and about the non-consensual sharing of intimate images. It goes against everything we stand for at MindGeek and Pornhub.
When David and I joined MindGeek in 2008, our goal was to create the most inclusive and safe adult community on the Internet. It was designed to celebrate freedom of expression, to value privacy and to empower adults from all walks of life. We knew this could be possible only if safety and security were our top priority. While we have remained steadfast in our commitment to protect our users and the public, we recognize that we could have done more in the past and we must do more in the future.
I want to be clear to every member of this honourable committee, and to the Canadian public, that even a single unlawful or non-consensual image on MindGeek's platforms is one too many, full stop. We are fathers and husbands. We have over 1,800 employees with families and loved ones. We are devastated by what the victims of these heinous acts have gone through. I want to emphasize that this type of material has no place on our platforms and is contrary to our values and our business model. We are sickened when anyone attempts to abuse our platforms to further their violence. Fortunately, the vast majority of attempts by criminals to use our platform for illicit material are stopped.
Before I speak about the steps we have taken to combat unlawful content on our platform, let me first tell you more about MindGeek and how we operate. MindGeek's flagship video-sharing platform is Pornhub. Created in 2007, Pornhub is a leading free, ad-supported, adult content hosting and streaming website, offering visitors the ability to view content uploaded by verified users, individual content creators and third party studios. Demand for MindGeek's content rivals that of some of the largest social media platforms. For example, in 2020, Pornhub averaged over 4 million unique user sessions per day in Canada alone. In 2020, over 30% of our Canadian visitors were women. Roughly 1.3 million Canadian women visit the site every day.
Running one of the world's most visited websites is a responsibility we do not take lightly. The spread of non-consensual and CSAM content is a massive challenge facing all social media platforms. The U.S.-based National Center for Missing and Exploited Children, also known as NCMEC, the industry standard for reporting CSAM, says it has received 16.9 million referrals from tech companies about possible child abuse, with well over 90% of those related to a single social media platform. MindGeek is a proud partner of NCMEC. We report every instance of CSAM when we are aware of it, so that this information can be disseminated to and investigated by authorities across the globe.
We share the objectives reflected in the 11 voluntary principles developed by governments, including Canada, to fight online sexual exploitation and abuse. We have been leading this fight by being more vigilant in our moderation than almost any other platform, both within and outside of the adult space.
Today, only professional studios and verified users and creators, whose personal identity and date of birth have been confirmed by MindGeek, may upload content. This means every piece of content on our websites can be traced back to its uploader, whose identity and location are known to us. We are the first and only major social media platform, adult or non-adult, to introduce this policy. We hope and expect that the entire social media industry will follow our lead.
We are also working to ensure that once content is removed, it can never make its way back to our platform or to any platform. The revictimization of individuals when their content is re-uploaded causes profound injury that we are working fiercely to prevent. We are attacking this problem in two ways. First, our people are trained to remove such material upon request. Second, we digitally fingerprint any content removed from our website so that it cannot be re-uploaded to our own platform.
For the last two years, we have been building a tool called “SafeGuard” to help fight the distribution of non-consensual intimate images. As I sit before you today, I am pleased to report that this month we will be implementing SafeGuard for all videos uploaded to Pornhub. We will offer SafeGuard for free to our non-adult peers, including Facebook, YouTube and Reddit. We are optimistic that all major social media platforms will implement SafeGuard and contribute to its fingerprint database. Such co-operation will be a major step to limit the spread of non-consensual material on the Internet.
Mr. Chair, thank you for the opportunity to discuss MindGeek's commitment to trust and safety, including our work to stamp out CSAM and non-consensual material on our platforms and on the Internet as a whole.
We look forward to answering the committee's questions.
Thank you.
David Tassillo
2021-02-05 13:13
Mrs. Stubbs, I would like to add to what Feras mentioned.
I'm not too sure where it says that in the terms of service, but I can guarantee you that every piece of content, before it's actually made available on the website, goes through several different filters, some of which my colleague made reference to.
Depending on whether it comes up as a photo or as a video, we go through different pieces of software that would compare it to known active cases of CSAM, so we'll actually do a hash check. We actually don't send the content itself over; they create a digital key per se that's compared to a known active database. After that, it's compared to the other piece of software that Feras mentioned, Vobile, which is a fingerprinting software by which anyone can have their content fingerprinted. Any time MindGeek would find the piece of infringing content, we'd add it to that database to prevent the re-upload.
Once it passes the software queue.... If anything fails at the software level, it automatically doesn't make it up to the site. Once that piece has gone through, we move over to the human moderation section. The human moderators will watch each one of the videos, and if they deem that the video passes, it will be—