Charles DeBarber
2021-06-07 12:03
Hello. Good afternoon. My name is Charles DeBarber and I'm a senior privacy analyst with Phoenix Advocates and Consultants. My background is U.S. Army cyber-intelligence and cybersecurity.
I began my work with victims of non-consensual pornography, or NCP, in 2015, when I worked for the elite firm Fortalice. As the program manager for open source intelligence, I assisted victims of NCP through our reputation services. Since departing Fortalice in 2018, I have done freelance work on behalf of victims of revenge porn, extortion schemes and cyberstalking, and on purging content for victims of human trafficking. I've written bespoke information guides for clients to help protect their digital privacy and to reduce the chances of their being a target of successful doxing.
My background gives me deep insight into the sources of content on the Internet, and today I want to share with you guys some knowledge about the surface web, deep web and dark web. In addition, I'd like to share some research about the sources of adult NCP on these three layers.
As a disclaimer, I want to be clear that my data regarding NCP is limited in a few ways. First, my data is limited to the 90-plus cases that I've undertaken since 2019. You'll see these are sourced as “PAC Research 2016 to 2021”. I recognize there's a selection bias to that data due to it being from only our casework. Second, much of my information on NCP involving children is largely anecdotal, as I've never produced statistics on it. In addition, the bulk of my work has been with adult victims. Third, I am discussing the concepts of surface web, deep web and dark web and how they relate to the volumes and types of NCP often found on them. This is not to paint any of these layers as good or bad. The dark web has an especially heinous reputation, but remember that there are people who use the dark web to subvert censorship or express their free speech in countries where freedom of speech is very limited.
You'll see in the handout the beautiful iceberg graph that is commonly used to explain the three layers. You have surface web, deep web and dark web. We'll start with the surface web.
The surface web is basically the Internet content indexed by search engines and the things you can jump to directly from search results. It's aggregated web content that can be found with web crawlers, also known as spider bots or spiders. Make note of that, because it's very important for one of the points I'll make later. The surface web is the minority of online content, around 4% to 5%.
What's the deep web? That's the majority of the web, more than 90% of it. It's Internet content that isn't part of the surface web and isn't indexed by search engines, so it's not readily accessible through standard means such as a search query.
Then there's the dark web. It's part of the deep web, but what makes it different is that you have to use special encryption software to access it, things like Tor Browser, Freenet or Freegate. The term is used interchangeably with "dark net"; it can be called both.
NCP comes in many forms. Some of the key forms for adult victims include revenge porn, non-consensual surveillance, human trafficking and data or device breaches. We have the following statistics from our casework. The majority of adult NCP, 73.5% of our cases, was found on the surface web. We believe the reason is that adult NCP easily blends in with amateur pornography, and the ease of use and popularity of video- and image-sharing sites on the surface web is the main driver.
On top of that, the deep web accounts for about 23.2%. These are often private forums for pirated content, BitTorrent sites, and VoIP and messaging apps such as Discord communities. The more compartmentalized nature of the deep web leads to a lower volume of content that is also less viral.
The dark web accounts for very little of our content. In our experience, content there includes material we consider highly illegal, things you would find only on the dark web precisely because they are highly illegal: hidden bathroom cam footage, extremely violent content, child pornography and bestiality. NCP blends in with amateur pornography and is readily available on the upper layers, so there's no reason to go to the dark web for it. Only a minority of Internet users have enough expertise and knowledge of the dark web to use it anyway, and its even more compartmentalized nature keeps people off it. The result is that more extreme and illegal content is relegated to the dark web.
In our casework, only about 3.3% is dark web content.
There are a few observations I would like to share with the committee. I've removed over 100,000 pieces of NCP content in the last five years. My average client has between 400 and 1,200 pieces of content, and that could be the same picture, video or handful of pictures shared on many different sites. Viral content can run upwards of 6,000 pieces and above. Very rarely do I utilize the NCP removal processes created by search engines such as Google or Bing or social media like Facebook, Twitter or Reddit.
I normally use the copyright removal process here in the United States, under the Digital Millennium Copyright Act. The NCP process is often more complicated and takes longer for victims, who have to follow it for every piece of content. Imagine, if you have 400 pieces of content out there, that might be 400 separate applications you have to submit. These companies, frankly, respect intellectual property more than victims, because the copyright process is so much easier.
The removal process is costly in both time and resources. I utilize automation, which is not cheap. For a client with more than 400 pieces of content, it would usually cost $2,000 for automated removal and $5,000 for bespoke removal services, and that only mitigates the problem. Victims doing it manually need a certain level of understanding of information systems, search engines and web caching, and that is if they can even find most of the content without automated aggregators. My junior analysts, some of them with information systems and computer science backgrounds, take up to a month of hands-on work to learn how to effectively purge content. Yet the average victim is expected to have this expertise if they cannot afford professional services. The tools for victims to effectively mitigate their digital footprint of content aren't readily available.
Great strides have been made to get Silicon Valley to recognize the issue, and I don’t wish to demean those efforts or that recognition. Laws in my home country are now in 48 states and two territories to protect victims of NCP. However, picking up the pieces after NCP floods surface web sites is still an uphill battle. We’ve worked tirelessly so clients can google their name without NCP coming up. One of our clients lives in fear of her 10-year-old using the computer and googling her name. Others have lost job opportunities, housing opportunities and relationships. Many of our clients have contemplated or attempted suicide.
Finally, video upload sites that allow pornography, such as Pornhub or Xvideos, have exacerbated the problem. This is one of the big points I want to make. Content goes viral a lot faster with these sites, and these sites use what is called search engine optimization to flood Google with their content. Even if the content is deleted within 72 hours, it often takes days, frankly, for a victim to even find out that they're a victim. Smaller video upload sites then aggregate this material from search engines and repost it, making this a feedback loop that keeps feeding the search engines and makes it a viral issue.
The issue has become so significant that when a victim's name is posted in the title of a video they appear in, it gets aggregated and used as a search engine keyword by porn sites that don't even host their content. Their name just becomes a random keyword, and God forbid you have a unique name. Imagine googling your name and having hundreds of porn sites come up because your name is a keyword empowered by SEO techniques.
We need to find a balance between verification and privacy. That's very easy for me to say, but it is essential that sites have a reasonable policy for age verification. I compliment Pornhub on adopting a verified content policy in late 2020. I'm very angry [Technical difficulty—Editor] and I badly want them held accountable for that, but I want to make sure it's also not so cumbersome that sex workers who are free agents can't operate with reasonable privacy.
Search engines—and this is a key one, and I would recommend you put this forward, or at least encourage them to change their policies—shouldn't allow indexing from adult video and image upload sites unless the content comes from verified accounts. This means that, with verified accounts, the spiders can be turned on so the content can feed into Google, Bing and so on. However, spiders should be turned off on any website where any Joe Schmo can come and upload content, whether videos or images, until that content is verified. That keeps it from hitting search engines within 72 hours.
Remember, with all NCP you're really fighting time, and keeping unverified content out of the index keeps it from going viral a lot more quickly, quite frankly. It makes the clean-up process significantly easier, and it can mitigate the harm. Furthermore, it would probably protect the intellectual property of other sex workers. As I said, Pornhub and other major tube sites have more or less put NCP into the express lane via SEO techniques.
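[A minimal sketch of how a site could act on this recommendation, using the standard Robots Exclusion Protocol. The /unverified/ path, and the idea of segregating unverified uploads under it, are illustrative assumptions, not something described in the testimony:]

```text
# robots.txt at the site root: ask all crawlers to skip pages whose
# uploader has not yet been verified (hypothetical /unverified/ path).
User-agent: *
Disallow: /unverified/

# Per-page alternatives, applied until the uploader passes verification:
#   HTTP response header:      X-Robots-Tag: noindex
#   In the page's HTML <head>: <meta name="robots" content="noindex">
# Once the account is verified, the signal is removed and the spiders
# are "turned on" for that content.
```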
Finally, the doxing of victims and sex workers is a very serious issue. Despite many of my clients being Jane Does, I can't get Google to delist web pages that post the real names of victims. I wish there were a policy allowing the delisting of the real names of Jane Does and sex workers from sites such as the defunct Porn Wikileaks, which was very dangerous for them and was made for doxing victims.
I'm very open to questions you may have and appreciate your welcoming me today. I'm honoured to be here.
Thank you.
Melissa Lukings
2021-06-07 12:25
No worries.
The committee is dealing with a Canadian-controlled private corporation, a CCPC, which is a private commercial organization based and headquartered in Canada. It is a Canadian company. We know this, and that's fine. Commercial organizations in Canada are bound by the Personal Information Protection and Electronic Documents Act. PIPEDA outlines the rules and remedies, including the fines and other penalties, for corporations that fail to abide by the provisions specified in the act.
Beyond the corporate level, we also have the Criminal Code of Canada, which outlines the criminal offences and punishments for committing such offences. We have these. We need to apply them. Everyone is bound by the Criminal Code of Canada.
Why, then, do we need additional regulations? Why do we need more oversight when we have not yet tried to simply apply the law we already have? We have these laws. We can use them, so let's use them. That's what they're for. What's the point in even having these statutes if you're not going to apply them when they're needed? What are we doing here?
We're here because a portion of those involved have decided to conflate the issue of corporate negligence with highly sexualized and emotive criminal activity—read again, child rape porn testimony. It elicits an emotional response—the sympathetic nervous system and all of that. It doesn't matter. This is about a corporation and user-generated content. It does not matter what is depicted in the content as much as it matters that the content, whatever it may be, should not have gotten past the corporation's screening system before being made live on the site. When the issue was brought to its attention, the corporation responded inadequately at first, so we need corporate law. We need to look at liability and feasibility standards.
Why has this become a forum for grandstanding religious ideologies? I'm sure you've all heard about Exodus Cry in the news, if you've been following it. Exodus Cry is a fundamentalist Christian organization from the United States, founded on religious ideology. Why is it relevant to a question of corporate liability in Canada? It isn't. It doesn't make any sense.
Why are we arguing about exploitation? Why are we discussing mass censorship? Is that not a massive overreaction to a simple corporate negligence question? It seems glaringly obvious to me, so why are we not discussing reasonable options for encouraging corporations to better serve their users?
Also, I have some opinions about the genderedness of this. You can read about it in my notes.
When it comes down to it, you can't eliminate sex. We're humans, and there is always going to be a demand for sex. You can't eliminate sex work because the demand exists. You can't eliminate extramarital sex or porn or masturbation or demand for sexual services, but sexual assault is illegal, even when that person is your spouse. We need it to be that way. We want to protect people. If you're saying you can do certain things only within the context of marriage, you're setting yourself up for failure. It's true.
Yes, I said “masturbation” in a hearing. Oh my God.
You cannot eliminate base human desires, so you can't eliminate sex. That would be silly. It's okay to not like these things, and just because you don't like a thing or you feel that a thing is not for you, it doesn't mean it's inherently evil and should be eliminated. It doesn't work that way. It's not about and should not be about pornography or the actual content of online material here. This is about creating reasonable laws that work for Canada, Canadian corporations and everyone residing within Canada. We don't need new regulations; we don't need a new regulator, and we don't need online censorship. We need to use the tools we already have, which were designed for a reason. Why be redundant?
That is my diatribe.
Thank you for having me. I will take any questions you throw at me.
Shannon Stubbs
CPC (AB)
2021-06-07 12:34
Thank you.
I wonder if, from your work experience and your lived experience, you might want to expand on the importance of verification and consent. If platforms ever do that without your consent or your agreement, what are the commercial consequences, or the personal consequences in the case of adults who are choosing freely to engage in this work?
Melissa Lukings
2021-06-07 12:34
We're talking about what the consequences are if someone consensually uploads their own material?
Shannon Stubbs
CPC (AB)
2021-06-07 12:34
If an online platform were to host your material without an agreement with you or—
Melissa Lukings
2021-06-07 12:34
That's intellectual property. That's a copyright issue right there. As a photographer, when you take photos, you have a model release form. These are all contractual issues that would arise. If someone doesn't have your permission to use the material, then that is a digital copyright infringement. That's an artistic thing. It's exactly the same as if someone were to host any artistic content anywhere without the permission of the artist. It's very similar to that.
Again, we have the Copyright Act for that.
David Lametti
Lib. (QC)
Thank you very much, Mr. Chair. Good afternoon.
I wish to acknowledge that I'm speaking to you today from Ottawa on the traditional territory of the Algonquin people.
Thank you, Mr. Chair, for the invitation to appear before you to discuss the charter statement that was tabled for Bill C-10, as well as the explanatory document requested for the proposed amendments now before the committee.
As you can see, I'm appearing alongside Minister Guilbeault, who is the minister responsible for Bill C-10. I am accompanied by officials from my department.
I want to begin by discussing the duty I have under the law, as Minister of Justice, to prepare statements regarding the Canadian Charter of Rights and Freedoms for government bills introduced in the House of Commons.
I will discuss the purpose of charter statements and provide the context, including their history. I will explain what charter statements are meant to do and not do.
I will also gladly speak to the charter statement tabled in relation to Bill C-10, as well as the explanatory document provided to the committee concerning the potential effects of the proposed amendments on freedom of expression.
I should note at the outset that it is not my role as Minister of Justice and Attorney General to give legal advice to parliamentary committees. You have access to your own legal counsel and independent witnesses.
As you are aware, however, I do have obligations under the Department of Justice Act in terms of reviewing proposed government bills for inconsistency with the charter and preparing charter statements for government bills. This obligation was created by our government to be open and transparent with Canadians about the charter considerations of our legislation.
These two sets of obligations—examining bills and preparing charter statements—are both focused on the bill as tabled.
Section 4.2 of the Department of Justice Act requires the Minister of Justice to ensure that a charter statement is tabled in the House of Commons for every government bill. That obligation came into force in December 2019.
Examining bills for potential inconsistency with the charter, as set out in section 4.1, is one of my most important responsibilities. Rest assured that I also take very seriously the obligation to ensure charter statements are tabled in the House, as set out in section 4.2.
Now I will turn to the purpose of charter statements.
Charter statements are intended to inform parliamentary and public debate on a government bill. They foster transparency regarding the effects of a government bill on the fundamental values protected by the charter. They provide parliamentarians with additional information to further inform the important legislative debates they have on behalf of Canadians. Charter statements also provide Canadians with additional information to help them participate in these debates through their elected representatives.
The obligation to table charter statements is a testament to our government's commitment to respect and uphold the charter, as an integral part of the country's good governance.
We can never abdicate our responsibility as a government to ensure that our decisions—including those reflected in the reform of an act—respect our fundamental rights and freedoms. Section 4.2 of the Department of Justice Act strengthens the obligation this government and future governments have to respect this most basic of requirements.
I would like to take a few moments to explain the content of charter statements. In keeping with their purpose, charter statements are drafted at a high level. They set out in an accessible way the potential effects a bill may have on the rights and freedoms guaranteed by the charter. Charter statements also explain considerations that support the constitutionality of a bill.
In our discussion of the charter, it is also important to stress that, when Parliament legislates, it may have an effect on charter rights and freedoms. This may include limiting people's enjoyment or exercise of those rights when it is in the broader public interest to do so. This is entirely legitimate. The rights and freedoms guaranteed in the charter are not absolute, but rather subject to reasonable limits, as long as those limits can be demonstrably justified in a free and democratic society.
This means that, when identifying the potential effect of a bill that could limit a right or a freedom, it may also be necessary to consider whether the limit is reasonable and justified. A charter statement may therefore outline considerations relevant to the potential justifiability of a bill.
The fact that charter rights and freedoms can be limited, however, is not a licence to violate them. Rather, it is a reminder that any legislative limits to rights and freedoms must be carefully considered in the context of the shared values of Canada's unique, free and democratic society.
As parliamentarians, it is our responsibility to discuss and debate potential effects on charter guarantees. We exercise our judgment on behalf of Canadians as to whether proposed legislation strikes the right balance between rights and freedoms and the broader public interest. Charter statements are one more source of information to add to our deliberations.
I would also like to take a moment to explain what a charter statement is not.
A charter statement is not a legal opinion. It does not provide a comprehensive analysis of the constitutionality of a bill.
As I mentioned, a charter statement provides Parliament and the public with legal information relating to the possible effects of a bill on the rights guaranteed by the charter and to the considerations that support the consistency of the bill with the charter.
As we all know, bills often change when they are being considered by Parliament. A charter statement reflects the bill at the time it was introduced by the government in the House of Commons. Section 4.2 of the Department of Justice Act does not require that charter statements be updated as a bill progresses through Parliament.
Keeping that in mind, I will now turn to the proposed amendments to Bill C-10 in relation to social media, which are before the committee.
My fellow minister Mr. Guilbeault talked about the scope of the proposed amendments. He highlighted the key objectives underlying the amendments and discussed their intended effects on social media services and users.
In short, the proposed amendments are intended to empower the Canadian Radio-television and Telecommunications Commission to regulate a social media service in respect of programs uploaded by its unaffiliated users, strictly in relation to the following: payment of regulatory charges, such as to support the creation of Canadian programming; discoverability of Canadian creators; registration of the service; provision of information; and auditing of records.
In keeping with my obligations under the Department of Justice Act, I tabled a charter statement for Bill C-10 in the House of Commons on November 18, 2020. The charter statement for Bill C-10 identifies the rights and freedoms that may potentially be engaged by the bill, and relevant considerations that support the bill's consistency with the charter.
In considering the committee's recent discussions focusing on the impacts of the proposed amendments on social media, I understand there has been extensive debate on freedom of expression.
We have prepared and shared with you an explanatory document that examines the amendments, and discusses their potential effect on the right to freedom of expression in section 2(b) of the charter. I'm confident that these considerations support the charter consistency of the bill, and that they remain as outlined in the charter statement. It is our position that the bill, as tabled, and these proposed amendments are consistent with the charter.
As the charter statement indicates, the bill's regulatory requirements have the potential to engage freedom of expression under section 2(b) of the charter. The following considerations support the continued consistency of the proposed regulatory requirements with section 2(b).
By virtue of clause 1, which would remain in the bill, unaffiliated users of social media services would not be subject to broadcasting regulation in respect of the programs they post. What remains is an updating of the CRTC's regulatory powers, providing it with new powers applicable to online services. The bill maintains the CRTC's role and flexibility in determining what, if any, regulatory requirements to impose on broadcasting undertakings.
Regarding the proposal to give the CRTC new limited powers to regulate an online undertaking that provides the social media service in respect of programs posted by unaffiliated users, the relevant charter considerations include the CRTC's discretionary role and flexibility.
The proposed narrowing of the CRTC's discretionary powers to regulate a social media service in respect of programs posted by unaffiliated users, to only the discrete matters that I have mentioned, is an additional consideration. The CRTC is subject to the charter and must exercise any discretionary powers it has in a manner that is consistent with the charter.
The act states that it must be interpreted and applied in a manner consistent with freedom of expression. The CRTC's decisions on matters of law or jurisdiction are subject to review by the Federal Court of Appeal.
In my view, the relevant considerations that are set out in the charter statement remain valid. These considerations are not impacted by the proposed amendments.
Once again, thank you for the opportunity to address the committee today.
I am at your disposal to answer questions.
Rachael Harder
CPC (AB)
Thank you.
Minister, in the charter statement for Bill C-10, clause 3, proposed section 4.1 is cited as grounds for the bill being in compliance with the charter. We know that section was removed. Experts in the industry now say that the removal of section 4.1 takes away the safeguards that were imperative to protect user-generated content.
Do you agree with that?
David Lametti
Lib. (QC)
As I said in my opening remarks, I'm not going to give legal advice. That is not part of my role as Minister of Justice. I don't give legal advice to committees.
That being said, the Department of Justice has provided a further explanatory document that examines the amendments, and discusses their potential effect on the right to freedom of expression, section 2(b) of the charter.
As I said in my opening remarks, I'm confident that the conclusion of that explanatory document is that the bill remains consistent, and the original charter statement has not changed as a result.
Rachael Harder
CPC (AB)
Dr. Geist makes things very clear when he says, “There is simply no debating that” by removing section 4.1, “the bill now applies to user-generated content, since all audiovisual content is treated as a program under the act.”
Do you agree with that? Is that a correct statement?
David Lametti
Lib. (QC)
I believe you're quoting Professor Michael Geist. I will defer to my colleague Minister Guilbeault to answer that question.
Alain Rayes
CPC (QC)
Thank you, Mr. Chair.
Mr. Minister, thank you for finally agreeing to come and meet with us. I'm very pleased.
Let me first ask you a very simple question. Does section 2(b) of the charter protect users' freedom of expression and the content they put online, yes or no?
David Lametti
Lib. (QC)
Thank you for the question, Mr. Rayes.
As I said at the outset, I am not here to give legal opinions or advice. That is not my role today. I never do that in public. It's true that, generally speaking, section 2(b) protects freedom of expression, but I'm not going to go into the details hypothetically. That is not my role today.
Alain Rayes
CPC (QC)
Mr. Lametti, with all due respect, you are the Minister of Justice. The Canadian Charter of Rights and Freedoms is a public document. My question is simple: I would like to know whether the Canadian Charter of Rights and Freedoms protects only individuals or whether it also protects the content they post online.
In your opening remarks, you said that there may be some limits to rights and freedoms, but you didn't want to elaborate on that, and you're perfectly entitled to refuse to do so.
I'm not asking you to give us a legal opinion or to prove any of this. I just want to know whether or not the Canadian Charter of Rights and Freedoms protects both individuals and the content they put online.
Alain Rayes
CPC (QC)
That's fine. Thank you, Mr. Lametti.
The chair has made it very clear that you are under no obligation to answer our questions if you do not wish to do so.
My understanding is that you don't want to tell us whether section 2(b) of the Charter protects both individuals and the content they post online. I don't know whether that is true or not, but that is my understanding.
The statement that you submitted on November 18 explicitly included in its analysis the proposed section 4.1 of the Broadcasting Act. That section was removed on a Friday afternoon about three weeks ago. That is at the root of the conflict we find ourselves in. However, you, as Minister of Justice, do not want to give us a legal opinion or at least tell us, based on your expertise, what you think.
You said earlier that lawyers or experts could be consulted once the bill is passed. Experts have already come to speak with us. Yesterday, Le Devoir published an open letter supported by five experts, including several former senior CRTC officials. I am sure you have read it. If not, your advisors or political staff must have read it. Those senior executives explicitly said that this would be challenged. We already know that. We have heard concerns from university professors, experts and policy analysts. I think it is legitimate for members of Parliament, who have to make recommendations, to consider those concerns.
Originally, the bill proposed to add section 4.1 to the act to protect the content that users post online. Now that this section has been removed, how can we be sure that users' content will be protected?
As a member of the House of Commons, how can I make a decision on this issue if you, as Minister of Justice, cannot help me?