Steven Guilbeault, Lib. (QC):
This is a very good question. My office and my department have spoken as well with victims and victims' organizations. What we want to do with this legislation is to really shift the challenge for victims of having to try to get these images taken down—if we're referring to images that we would find on Pornhub, for example. We're trying to shift the burden of doing this from the individual to the state. It would be up to the Government of Canada, through a regulator, to do that, as it is in other countries, such as Australia, with their e-safety commissioner.
That's the goal we're pursuing with the tabling of this legislation. You are correct; we are also working to ensure that not only are the images taken down but that they are removed from associated websites to prevent, for example, the further download of such images. They're not going to be downloaded and re-uploaded over and over, as we've seen in many cases.
Steven Guilbeault, Lib. (QC):
Companies should abide by Canadian laws. Whether they're online companies or physical companies, there should be no distinction. As I said earlier, the challenge we face now is that the tools we have to deal with these online harms just aren't adapted to the virtual world.
Steven Guilbeault, Lib. (QC):
Thank you for the question.
I find your question very cynical, as your party has consistently opposed the passage of Bill C‑10, which is not about content moderation but rather about having web giants contribute to the artists and musicians of our cultural sector.
Steven Guilbeault, Lib. (QC):
Once again, your party opposes the passage of Bill C‑10, which has nothing to do with content moderation, while the hate speech and online harm bill specifically addresses the issue of content moderation.
Yet you say you oppose content moderation. You and many of your colleagues say that the government wants to take away your freedom of expression. The exploitation of persons bill will ensure...
Steven Guilbeault, Lib. (QC):
There are many elements in what you said.
First, I think one of the purposes of the legislation is to ensure more transparency on the part of the platforms in terms of their guidelines and practices regarding content moderation, because right now it's very uneven. Some companies have better content moderation practices than others, and some have very little. You're right—they are not transparent.
Some may have rejoiced in the decision of this platform or that platform to ban this user or another user, but under which criteria? Why them and not someone else? This is clearly something we want to tackle. Frankly, there is an issue where we see the very business model of some of the platforms being about creating controversy and fuelling hate speech and intolerance, because it creates more traffic on their platform. Therefore, they can sell more advertising and make more money.
As part of the legislation that will be tabled, this is also something that we as legislators will need to address.
Steven Guilbeault, Lib. (QC):
Well, as stated in my mandate letter, once an illegal publication is flagged, companies will have 24 hours to take it down. Instead of the victims having to try to deal with these companies, it's going to be the Government of Canada that's going to work to ensure that they remove that. If they don't, then there will be consequences for these companies.
Charles DeBarber, 2021-06-07 12:03:
Hello. Good afternoon. My name is Charles DeBarber and I'm a senior privacy analyst with Phoenix Advocates and Consultants. My background is U.S. Army cyber-intelligence and cybersecurity.
I began my work with victims of non-consensual pornography, or NCP, in 2015, when I worked for the elite firm Fortalice. As the program manager for open source intelligence, I assisted victims of NCP through our reputation services. Since departing Fortalice in 2018, I have done freelance work on behalf of victims of revenge porn, extortion schemes and cyberstalking, and on purging content for victims of human trafficking. I've written bespoke information guides for clients to help protect their digital privacy and to reduce the chances of their being a target of successful doxing.
My background gives me deep insight into the sources of content on the Internet, and today I want to share with you guys some knowledge about the surface web, deep web and dark web. In addition, I'd like to share some research about the sources of adult NCP on these three layers.
As a disclaimer, I want to be clear that my data regarding NCP is limited in a few ways. First, my data is limited to the 90-plus cases that I've undertaken since 2019. You'll see these are sourced as “PAC Research 2016 to 2021”. I recognize there's a selection bias to that data due to it being from only our casework. Second, much of my information on NCP involving children is largely anecdotal, as I've never produced statistics on it. In addition, the bulk of my work has been with adult victims. Third, I am discussing the concepts of surface web, deep web and dark web and how they relate to the volumes and types of NCP often found on them. This is not to paint any of these layers as good or bad. The dark web has an especially heinous reputation, but remember that there are people who use the dark web to subvert censorship or express their free speech in countries where freedom of speech is very limited.
You'll see in the handout the beautiful iceberg graph that is commonly used to explain the three layers. You have surface web, deep web and dark web. We'll start with the surface web.
The surface web is basically the Internet content indexed by search engines themselves and things you can directly jump to from search engines. It's aggregated web content that can be found with web crawlers, also known as spider bots or spiders. Make note of that, because it is very important for one of the points I'll make later. The surface web is the minority of online content, around 4% to 5%.
What's the deep web? That's the majority of the web, more than 90% of it: Internet content that is not part of the surface web, is not indexed by search engines, and is therefore not readily accessible through standard means such as a search query.
Then there's the dark web. It's part of the deep web, but what makes it different is that you have to use encryption software and special software to access it—things like Tor Browser or Freenet or Freegate. It's also used interchangeably with dark net. It can be called both.
NCP comes in many forms. Some of the key forms for adult victims include revenge porn, non-consensual surveillance, human trafficking and data or device breaches. We have the following statistics from our casework. The majority of adult NCP, 73.5% of our cases, was found on the surface web. We believe the reason for this is that adult NCP easily blends in with amateur pornography; the ease of use and popularity of video- and image-sharing sites on the surface web is the main cause.
On top of that, the deep web accounts for about 23.2%. These are often private forums for pirated content, BitTorrent sites, and VoIP and messaging apps like Discord communities. The more compartmented nature of the deep web leads to a lower volume of content that is also less viral.
The dark web accounts for little of our content. In our experience, content there includes material that is highly illegal and is found only on the dark web for that reason: things like hidden bathroom cam footage, extremely violent content, child pornography and bestiality. Because NCP blends in with amateur pornography and is readily available on the upper layers, there's no reason to go to the dark web for it. Only a minority of Internet users have the expertise to use the dark web anyway, and its even more compartmentalized nature keeps people off it. The result is that the more extreme and illegal content is relegated to the dark web.
In our casework, only about 3.3% is dark web content.
There are a few observations I would like to share with the committee. I've removed over 100,000 pieces of NCP content in the last five years. My average client has between 400 and 1,200 pieces of content, which could be the same picture, video or handful of pictures shared across many different sites. Viral content can run to 6,000 pieces and above. Very rarely do I utilize the NCP removal processes created by search engines such as Google or Bing, or by social media like Facebook, Twitter or Reddit.
I normally use the copyright removal process here in the United States, known as the Digital Millennium Copyright Act. The NCP process is often more complicated and takes longer for victims, who have to follow it for every piece of content. Imagine, if you have 400 pieces of content out there, that might be 400 different applications you have to submit. These companies, frankly, respect intellectual property more than victims, because the copyright process is so much easier.
The removal process is costly in both time and resources. I utilize automation, which is not cheap. For a client with more than 400 pieces of content, it would usually cost $2,000 for automated removal and $5,000 for bespoke removal services, and that just mitigates the problem. Victims using it manually require a certain level of understanding of information systems, search engines and web caching, and that is if the victim can find most of the content without using automated aggregators. My junior analysts, some of them with information systems and computer science backgrounds, take up to a month of hands-on work to learn how to effectively purge content. The average victim is expected to have this expertise if they cannot afford professional services. The tools for victims to effectively mitigate their digital footprint of content aren’t readily available.
Great strides have been made to get Silicon Valley to recognize the issue, and I don’t wish to demean those efforts or that recognition. Laws in my home country are now in 48 states and two territories to protect victims of NCP. However, picking up the pieces after NCP floods surface web sites is still an uphill battle. We’ve worked tirelessly so clients can google their name without NCP coming up. One of our clients lives in fear of her 10-year-old using the computer and googling her name. Others have lost job opportunities, housing opportunities and relationships. Many of our clients have contemplated or attempted suicide.
Finally, video upload sites that allow pornography, such as Pornhub or Xvideos, have exacerbated the problem. This is one of the big points I want to make. Content goes viral a lot faster with these sites, and these sites use what is called search engine optimization to flood Google with their content. Even if the content is deleted within 72 hours, it often takes days, frankly, for a victim to even find out that they're a victim. Smaller video upload sites then aggregate this material from search engines and repost it, making this a feedback loop that keeps feeding the search engines and makes it a viral issue.
The issue has become so significant that when a victim's name is posted in the title of a video they appear in, that name is then used as a search engine keyword by porn sites that don't even host their content. Their name just becomes a random keyword, and God forbid you have a unique name. Imagine googling your name and having hundreds of porn sites come up because your name is a keyword empowered by SEO techniques.
We need to find a balance between verification and privacy. That's very easy for me to say, but a reasonable age verification policy on these sites is required. I compliment Pornhub on adopting a verified content policy in late 2020. I'm very angry [Technical difficulty—Editor] and I badly want them held accountable for that, but I also want to make sure it's not so cumbersome that sex workers who are free agents can't operate with reasonable privacy.
Search engines—and this is a key one, and I would recommend you put this forward, or at least encourage them to change their policies—shouldn't allow indexing from adult video image upload sites that do not come from verified accounts. This means that, with verified accounts, the spiders can be turned on so that they can feed into Google, Bing and so on. However, spiders should be turned off on any website where any Joe Schmo can come and upload content, whether it be videos or images. They should be turned off on that content until it is verified. That keeps it from hitting search engines in 72 hours.
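The "turning spiders off" Mr. DeBarber describes maps onto standard crawler-exclusion conventions. As a minimal sketch (gating it on account verification, and the idea of applying it per upload page, are illustrations of his proposal, not anything these sites actually do), a tube site could keep unverified uploads out of search indexes with the standard robots meta tag:

```html
<!-- Emitted in the <head> of every page serving content from an
     unverified account. Standard robots meta tag: asks crawlers not
     to index the page or follow its links. The site would stop
     emitting it once the uploader's account is verified. -->
<meta name="robots" content="noindex, nofollow">
```

A site can also block crawling of a whole directory via robots.txt, but the noindex tag is the more reliable way to keep a page out of results, since a URL disallowed in robots.txt can still be indexed from external links.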
Remember, with all NCP, you're really fighting time, and that keeps it from going viral a lot more quickly, quite frankly. It makes the clean-up process significantly better, and it can mitigate it. Furthermore, it would probably protect the intellectual property of other sex workers. As I said, Pornhub and other major tube sites have more or less put NCP into the express lane via SEO techniques.
Finally, the doxing of victims and sex workers is a very serious issue. Despite many of my clients being Jane Does, I can't get Google to delist web pages that post the real names of victims. I wish there was a policy that allowed the delisting of the real names of Jane Does, of sex workers, that exist on sites such as the defunct Porn Wikileaks, which were very dangerous for them and were made for doxing victims.
I'm very open to questions you may have and appreciate your welcoming me today. I'm honoured to be here.
Thank you.
Charles DeBarber, 2021-06-07 12:48:
The process of doing this is costly, and it's really just stacked against victims. On top of that, it's stacked against free agent sex workers who are trying to protect their intellectual property.
There's a great Vice article about a lot of OnlyFans creators. They can't access the same services studios can, which pushes them toward a more exploitative studio structure.
We need to make those things more available. One thing we need to change, once again, is SEO and search engine indexing of unverified content, specifically on upload sites. What I mean by an upload site is any site like Imgur, Pornhub or Xvideos, where I can go in, make an account and post anything I want. Those are not moderated the way—
Charles DeBarber, 2021-06-07 12:49:
I might be a little biased there, because I'm an intelligence analyst by trade.
You're asking somebody who goes and subversively finds information about privacy. Honestly, I'm for it. I like the EU's stance on it, to be honest. I'm very biased on that question.
What I would like to see, especially, is that this content doesn't get SEO unless it's verified, because that keeps it from going viral to the point where it costs thousands of dollars to go out and find the thousands of websites it's on and try to get rid of it. If I can kill it in the crib or at least get it to where.... Your average victim, by my calculation, at least for revenge porn, doesn't know for seven to 90 days. If unverified accounts can post anything they want, then it becomes part of that feedback loop, and that's a big deal. It's as easy as making them turn web spiders off for that web page. That's something Pornhub can do. It's something that they really should just be—
Charles DeBarber, 2021-06-07 12:57:
Yes and no. It all depends on where it's hosted. It also depends on where you're getting it through. One, there's live content on other websites and other platforms, but then there's the stuff that's right in Google cache. Those are two different animals in terms of getting them purged. You actually have to purge both. Caching is more or less backing the information up. When you click on Google Images, for example, you're usually seeing the cache. When you get rid of the live content, you have to get rid of the cache too—fun fact.
Now, with some companies, like Google, lawyer Carrie Goldberg helped Google write its policy to remove NCP back in 2016, I believe. I'm glad that the rest of the big tech giants, including social media like Reddit and Twitter, emulated that process. The copyright process is still easier, unfortunately. Once again, if that image is repeated 100 times, let's say, then often 100 different notices have to be sent out. You have to do it both in the search engine and on the hosting site, but here's the rub: you can get it un-cached and delisted on Google, but that doesn't get rid of the live content.
Here's one short answer: Give my contact information, please, and I'll help your client pro bono.
Steven Guilbeault, Lib. (QC):
Bill C-10 is not about content moderation. The CRTC, in its last 50 years of existence, has never done content moderation, and Bill C-10 doesn't give the CRTC the ability to do content moderation.