View Steven Guilbeault Profile
Lib. (QC)
Thank you, Mr. Chair.
Mr. Chair, members of the committee, good morning.
I would first like to acknowledge that I am joining you from Montreal, on the traditional territory of the Mohawk and other Haudenosaunee peoples.
Thank you for inviting me to speak to you today. With me, as you said, are Joëlle Montminy, senior assistant deputy minister, cultural affairs, and Pierre-Marc Perreault, acting director, digital citizen initiative.
Like you and many other Canadians, I am concerned by the disturbing rise and spread of hateful, violent and exploitive content online and on social media.
As a legislator and father of four children, I find some of the content on these platforms to be profoundly inhuman.
I am also deeply troubled by the consequences and the echoes of that content in the real world.
The overall benefits of the digital economy and social media are without question. In fact, I published a book, shortly before I took up politics, wherein I talked about the benefits of the digital economy, of artificial intelligence in particular, but also about some unintended negative consequences.
In Canada, more than 9 out of 10 adults use at least one online platform, and since the beginning of the pandemic, online platforms have played an even more important role in our lives.
We use social media platforms like Facebook, Twitter, Instagram and YouTube to stay connected to our families, friends and colleagues. We use them to work, to conduct business, to reach new markets and audiences, to make our voices and opinions heard, and to engage in necessary and vital democratic debate. However, we have also seen how social media can have negative and very harmful impacts.
On a daily basis, Internet users share damaging content, whether hate speech, child sexual exploitation material, terrorist propaganda, or words meant to incite violence.
This content has contributed to violent attacks such as the one on the Islamic Cultural Centre in Quebec City in 2017, and similar attacks in Christchurch, New Zealand, in 2019.
Canadians and people all over the world have watched these events and others unfold on the news with shock and fear. We all understand the connections between these events and hateful, harmful online discourse. We worry about our own safety and security online. We worry about what our children and our loved ones will be exposed to.
According to a recent poll by the Canadian Race Relations Foundation, an overwhelming 93% of Canadians believe that online hate and racism are a problem, and at least 60% believe that the government has an obligation to prevent the spread of hateful and racist content online.
In addition, the poll revealed that racialized groups in Canada are more than three times more likely to experience racism online than non-racialized Canadians.
Since the beginning of the COVID‑19 pandemic, we have seen a rise in anti-Asian hate speech on the Internet and a steady increase in anti-Semitic rhetoric, further fuelled by recent events.
A June 2020 study by the Institute for Strategic Dialogue found that Canadians use more than 6,600 online services, pages and accounts hosted on various social media platforms to convey ideologies tinged with white supremacism, misogyny or extremism. This type of content wreaks havoc and destroys lives. It is intimidating and undermines constructive exchange. In doing so, it prevents us from having a true democratic debate and undermines free speech.
The facts speak for themselves. We must act, and we must act now. We believe that every person has the right to express themselves and participate in Internet exchanges to the fullest extent possible, without fear and without intimidation or concern for their safety. We believe that the Internet should be an inclusive place where we can safely express ourselves.
Our government is therefore committed to taking concrete steps to address harmful content online, particularly content involving child sexual exploitation, terrorist propaganda, incitement to violence, hate speech, and the non-consensual sharing of intimate images.
In fact, this is one of the priorities outlined in the mandate letter given to me by Prime Minister Justin Trudeau. So we have begun the process to develop legislation that will address the concerns of Canadians.
Over the past few months my office and I have engaged with over 140 stakeholders from both civil society organizations and the digital technology sector regarding this issue. This has included seven round-table discussions. We also spoke with indigenous groups, racialized Canadians, elected provincial officials, municipal officials and our international partners to assess our options and begin to develop a proposed approach.
In addition, given the global nature of the problem, I have hosted a virtual meeting with my counterparts from Australia, Finland, France and Germany—who were part of the multi-stakeholder working group on diversity of content online—to discuss the importance of a healthy digital ecosystem and how to work collectively.
I am also working closely with my colleagues the Ministers of Justice; Public Safety; Women and Gender Equality; Diversity and Inclusion and Youth; as well as Innovation, Science and Industry to find the best possible solution.
Our collaborative work aims to ensure that Canada's approach is focused on protecting Canadians and continued respect for their rights, including freedom of opinion and expression under the Charter of Rights and Freedoms. The goal is to develop a proposal that establishes an appropriate balance between protecting speech and preventing harm.
Let me be clear. Our objective is not to reduce freedom of expression but to increase it for all users, and to ensure that no voices are being suppressed because of harmful content.
We want to build a society where radicalization, hatred, and violence have no place, where everyone is free to express themselves, where exchanges are not divisive, but an opportunity to connect, understand, and help each other. We are continuing our work and hope to act as quickly and effectively as possible. I sincerely hope that I can count on the committee's support and move forward to build a more transparent, accountable and equitable digital world.
I thank you for your attention and will be happy to answer any questions you may have.
View Charlie Angus Profile
NDP (ON)
Rose Kalemba contacted our committee and asked us to fight for her. At age 14, she was kidnapped, brutally tortured and sexually assaulted, and her videos were posted on Pornhub, downloaded and promoted.
In your view—and I just have to be blunt here because we've talked about some really difficult stuff at our committee so I hope you don't find me being too blunt—would you believe that the posting of those videos represents criminal acts?
View Steven Guilbeault Profile
Lib. (QC)
As you are well aware, they are criminal acts according to the Canadian Criminal Code, yes.
View Charlie Angus Profile
NDP (ON)
Good, because the Criminal Code has sections 162, 163 and 164, and yet those laws are not being applied.
I need to know why we need a regulator to oversee something that's already under the Criminal Code. The promotion of these videos, according to law, is a criminal act, so why don't we just apply the law?
View Steven Guilbeault Profile
Lib. (QC)
As I said earlier, the challenge that we in Canada, and countries all around the world, are facing is that the tools that we have to deal with these issues in the physical world just aren't adapted to the virtual world. This is why Australia created a new regulatory body to deal with that, and it is why a number of countries either have created or are in the process of creating new regulations, new regulators, or both, to deal with this. It's because the tools we have just aren't adaptable.
View Charlie Angus Profile
NDP (ON)
Are you saying we simply don't need to use the Criminal Code? What surprises me is that internal documents from the RCMP's December 12 briefing note on Pornhub pointed out that your office is going to be taking the lead.
According to those documents, they are not going after Pornhub, so did cabinet tell the RCMP to stand down while you developed this regulator? Why is it that the RCMP are under the impression that you're the lead on this, and that the Canadian laws that exist are not going to be applied?
View Steven Guilbeault Profile
Lib. (QC)
I respectfully disagree with the premise of your question. As I stated earlier, the legislation will address five categories of online harms, all of which are already criminal activities under the Canadian Criminal Code.
View Charlie Angus Profile
NDP (ON)
I get that. I guess my concern is that you haven't actually come up with legislation. You don't know when this regulator's going to appear, and the RCMP internal notes say your office is taking the lead.
We have survivors who suffered serious crimes and abuse. We have the Criminal Code. I'm wanting to know why your government is saying that it will be the regulator that handles that, as opposed to telling the RCMP and the justice minister to do their job.
View Steven Guilbeault Profile
Lib. (QC)
I think you're misunderstanding what we're trying to do.
There are many reasons we need to create a regulator. One—
View Charlie Angus Profile
NDP (ON)
I don't have a problem with the regulator. What I have a problem with is the fact that we actually have criminal laws in place, and it seems that the RCMP has decided that Pornhub doesn't have to actually follow the law—there's voluntary compliance; your Attorney General says he's not even sure if they're a Montreal company; you're telling us there's going to be some kind of regulator, but you don't have one....
I just have to be honest. Having the minister of culture and communications handle a file about horrific sexual assault videos to me is like asking the minister of transportation to look after human trafficking.
Why is it that the laws of the land are just not being applied? You can go and get a regulator, but why are the laws not being applied?
View Steven Guilbeault Profile
Lib. (QC)
Your analogy would be correct if I were the only one doing this. I'm not.
As I stated in my remarks initially, I am working with the Minister of Public Safety, with the Minister of Justice and with a number of other colleagues. This is a whole-of-government approach. It's not—
View Charlie Angus Profile
NDP (ON)
I know, and they say you're the lead on this. They defer to you.
View Charlie Angus Profile
NDP (ON)
We don't have a regulator. We don't have any action. Again, what do I tell the survivors who are being told, sorry, not much is going to happen but maybe a regulator, and maybe there will be a new CRTC for porn? How long are they going to have to wait before they actually see something?
View Steven Guilbeault Profile
Lib. (QC)
This was in my mandate letter when I was nominated as the Minister of Canadian Heritage. We started right away, despite the most severe pandemic we've seen in the last 100 years, doing public consultations, doing the work. Some people may like—
View Steven Guilbeault Profile
Lib. (QC)
I have not personally, but the department and people on my team have, so yes, we have, but it's not something that can be solved overnight. It's a complex issue. As we're seeing all around the world, countries are struggling with this.
Charles DeBarber
View Charles DeBarber Profile
2021-06-07 12:03
Hello. Good afternoon. My name is Charles DeBarber and I'm a senior privacy analyst with Phoenix Advocates and Consultants. My background is U.S. Army cyber-intelligence and cybersecurity.
I began my work with victims of non-consensual pornography, or NCP, in 2015, when I worked for the elite firm Fortalice. As the program manager for open source intelligence, I assisted victims of NCP through our reputation services. Since departing Fortalice in 2018, I have done freelance work on behalf of victims of revenge porn, extortion schemes and cyberstalking, and on purging content for victims of human trafficking. I've written bespoke information guides for clients to help protect their digital privacy and to reduce the chances of their being a target of successful doxing.
My background gives me deep insight into the sources of content on the Internet, and today I want to share with you guys some knowledge about the surface web, deep web and dark web. In addition, I'd like to share some research about the sources of adult NCP on these three layers.
As a disclaimer, I want to be clear that my data regarding NCP is limited in a few ways. First, my data is limited to the 90-plus cases that I've undertaken since 2019. You'll see these are sourced as “PAC Research 2016 to 2021”. I recognize there's a selection bias to that data due to it being from only our casework. Second, much of my information on NCP involving children is largely anecdotal, as I've never produced statistics on it. In addition, the bulk of my work has been with adult victims. Third, I am discussing the concepts of surface web, deep web and dark web and how they relate to the volumes and types of NCP often found on them. This is not to paint any of these layers as good or bad. The dark web has an especially heinous reputation, but remember that there are people who use the dark web to subvert censorship or express their free speech in countries where freedom of speech is very limited.
You'll see in the handout the beautiful iceberg graph that is commonly used to explain the three layers. You have surface web, deep web and dark web. We'll start with the surface web.
The surface web is basically the Internet content indexed by search engines themselves and things you can directly jump to from search engines. It's aggregated web content that can be found with web crawlers, also known as spider bots or spiders. Make note of that, because it is very important for one of the points I'll make later. The surface web is the minority of online content, around 4% to 5%.
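[Editor's note: the spider-bot indexing the witness describes can be sketched in outline with Python's standard library. The page below is an invented stand-in, not a real site; a real crawler would fetch pages over HTTP and follow the links it finds.]

```python
from html.parser import HTMLParser

class LinkSpider(HTMLParser):
    """Minimal sketch of a spider's core step: read a page,
    collect its links, and queue them as candidates for the index.
    A search engine repeats this across the whole surface web."""
    def __init__(self):
        super().__init__()
        self.found_links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":  # anchor tags carry the links a spider follows
            for name, value in attrs:
                if name == "href" and value:
                    self.found_links.append(value)

# Illustrative page content; a real spider would download this.
page = '<html><body><a href="/video/1">clip</a><a href="/video/2">clip</a></body></html>'
spider = LinkSpider()
spider.feed(page)
print(spider.found_links)
```

Content reachable this way is, by definition, on the surface web; anything a spider cannot reach stays in the deep web.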
What's the deep web? That's the majority of the web, more than 90% of it. It's Internet content that's not part of the surface web and is not indexed in search engines. It's mostly content that is not readily accessible through standard means, such as search engines. As I said, it's the majority of content on the Internet.
Then there's the dark web. It's part of the deep web, but what makes it different is that you have to use encryption software and special software to access it—things like Tor Browser or Freenet or Freegate. It's also used interchangeably with dark net. It can be called both.
NCP comes in many forms. Some of the key forms for adult victims include revenge porn, non-consensual surveillance, human trafficking and data or device breaches. We have the following statistics from our casework. The majority of adult NCP, 73.5% of our cases, was found on the surface web. We believe that the reason for this is that adult NCP easily blends in with amateur pornography. The ease of use and popularity of video- and image-sharing sites on the surface web is the main cause of this.
On top of that, the deep web accounts for about 23.2%. These are often private forums for pirated content, BitTorrent sites, and VoIP and messaging platforms like Discord communities. The more compartmentalized nature of the deep web leads to a lower volume of content that is also less viral.
The dark web accounts for little of our content. Content there, in our experience, is material you would find only on the dark web because it is highly illegal. This could be things like hidden bathroom cam footage, extremely violent content, child pornography and bestiality. NCP blends in with amateur pornography and is readily available on upper layers. There's no reason to go to the dark web for it. Only a minority of Internet users have enough expertise and knowledge of the dark web to use it anyway. The even more compartmentalized nature of the dark web just keeps people off it. This results in more extreme and illegal content being relegated to the dark web.
In our casework, only about 3.3% is dark web content.
There are a few observations I would like to share with the committee. I've removed over 100,000 pieces of NCP content in the last five years. My average client has between 400 and 1,200 pieces of content, and that could be the same picture, video or handful of pictures shared on many different sites. Viral content can run to 6,000 pieces of content or more. Very rarely do I utilize the NCP removal processes created by search engines such as Google or Bing or social media like Facebook, Twitter or Reddit.
I normally use the copyright removal process here in the United States, known as the Digital Millennium Copyright Act. The NCP process often is more complicated and takes longer for victims who have to follow it for every piece of content. Imagine, if you have 400 pieces of content out there, that might be 400 different applications you have to put out. These companies, frankly, respect intellectual property more than victims, because the copyright process is so much easier.
The removal process is costly in both time and resources. I utilize automation, which is not cheap. For a client with more than 400 pieces of content, it would usually cost $2,000 for automated removal and $5,000 for bespoke removal services, and that just mitigates the problem. Victims using it manually require a certain level of understanding of information systems, search engines and web caching, and that is if the victim can find most of the content without using automated aggregators. My junior analysts, some of them with information systems and computer science backgrounds, take up to a month of hands-on work to learn how to effectively purge content. The average victim is expected to have this expertise if they cannot afford professional services. The tools for victims to effectively mitigate their digital footprint of content aren’t readily available.
Great strides have been made to get Silicon Valley to recognize the issue, and I don’t wish to demean those efforts or that recognition. Laws in my home country are now in 48 states and two territories to protect victims of NCP. However, picking up the pieces after NCP floods surface web sites is still an uphill battle. We’ve worked tirelessly so clients can google their name without NCP coming up. One of our clients lives in fear of her 10-year-old using the computer and googling her name. Others have lost job opportunities, housing opportunities and relationships. Many of our clients have contemplated or attempted suicide.
Finally, video upload sites that allow pornography, such as Pornhub or Xvideos, have exacerbated the problem. This is one of the big points I want to make. Content goes viral a lot faster with these sites, and these sites use what is called search engine optimization to flood Google with their content. Even if the content is deleted within 72 hours, it often takes days, frankly, for a victim to even find out that they're a victim. Smaller video upload sites then aggregate this material from search engines and repost it, making this a feedback loop that keeps feeding the search engines and makes it a viral issue.
The issue has become so significant that when a victim's name is posted in the title of a video they appear in, that name gets picked up as a search engine keyword, even by porn sites that don't have their content. Their name just becomes a random keyword, and God forbid you have a unique name. Imagine googling your name and hundreds of porn sites coming up because your name is a keyword empowered by SEO techniques.
We need to find a balance between verification and privacy. That's very easy for me to say, but sites having a reasonable policy for age verification is required. I compliment Pornhub on adopting a verified content policy in late 2020. I'm very angry [Technical difficulty—Editor] and I badly want them held accountable for that, but I want to make sure it's also not so cumbersome that sex workers who are free agents can't operate without reasonable privacy.
Search engines—and this is a key one, and I would recommend you put this forward, or at least encourage them to change their policies—shouldn't allow indexing from adult video and image upload sites that does not come from verified accounts. This means that, with verified accounts, the spiders can be turned on so that the content can feed into Google, Bing and so on. However, spiders should be turned off on any website where any Joe Schmo can come and upload content, whether videos or images. They should be turned off on that content until it is verified. That keeps it from hitting search engines in 72 hours.
Remember, with all NCP, you're really fighting time, and that keeps it from going viral a lot more quickly, quite frankly. It makes the clean-up process significantly better, and it can mitigate it. Furthermore, it would probably protect the intellectual property of other sex workers. As I said, Pornhub and other major tube sites have more or less put NCP into the express lane via SEO techniques.
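[Editor's note: the "spiders off until verified" recommendation above maps onto an existing mechanism: crawlers honour `noindex` directives sent via the `X-Robots-Tag` response header. The function below is a hypothetical sketch of how a site could apply it, assuming the site can distinguish verified from unverified uploaders; it is not any platform's actual code.]

```python
def robots_header(uploader_verified: bool) -> dict:
    """Sketch: choose an X-Robots-Tag header so search-engine spiders
    skip pages whose uploader has not been verified, keeping that
    content out of Google and Bing until verification happens."""
    if uploader_verified:
        # Verified account: allow normal crawling and indexing.
        return {"X-Robots-Tag": "all"}
    # Unverified upload: tell crawlers not to index it or follow its links.
    return {"X-Robots-Tag": "noindex, nofollow"}

print(robots_header(False))
```

Served on every unverified upload page, a directive like this would deny the SEO pipeline its raw material during the critical first 72 hours.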
Finally, the doxing of victims and sex workers is a very serious issue. Despite many of my clients being Jane Does, I can't get Google to delist web pages that post the real names of victims. I wish there was a policy that allowed the delisting of the real names of Jane Does, of sex workers, that exist on sites such as the defunct Porn Wikileaks, which were very dangerous for them and were made for doxing victims.
I'm very open to questions you may have and appreciate your welcoming me today. I'm honoured to be here.
Thank you.
View Shannon Stubbs Profile
CPC (AB)
2021-06-07 12:35
I think this is probably what's mind-boggling to many of us on this committee and probably many Canadians listening. A colleague said to me recently that, somehow, organizations like ag societies and school fundraisers and Legions are put through mountains of paperwork and administration to, say, play certain songs or use certain visual material. Then there are also online sites, say, that sell cannabis or alcohol, or host gambling, and in those two cases the country seems fairly effective at having a set of laws and bylaws and policies and regulations for these organizations [Technical difficulty—Editor] seem to manage to enforce and crack down on all of that being done illegally.
View Shannon Stubbs Profile
CPC (AB)
2021-06-07 12:36
I would just give you the opportunity to expand on any other specific recommendations in terms of both the enforcement and protections to combat the proliferation of child sexual abuse material and other illegal content, while also maintaining free expression, privacy and the right of individuals to have ownership and choice over their own images.
Melissa Lukings
View Melissa Lukings Profile
2021-06-07 12:37
Thank you.
Privacy is very important, and it's also a safety issue in a lot of these situations. I can't provide any specific solutions. I'm not [Technical difficulty—Editor]. I definitely recommend asking Dr. Lashkari about that.
In terms of law, we need to remember the foundations of law, so what is the Privacy Act based on? What are the rights and freedoms that Canadians hold as important? Our rights to freedom of expression, freedom of association and all these things need to be considered when we're implementing new technology and new standards for technology.
As for specifics, that wouldn't be my area. I would be more like poking holes in why those things aren't private enough.
View Matthew Green Profile
NDP (ON)
We've heard lots of discussion around the prevalence of CERB fraud, and yet we hear Mr. Brouillard talk about 50,000 identities stored in the dark web. Have there been any early indications or cross-reference between information that was taken through these breaches and potential fraudulent applications for the CERB?
View Matthew Green Profile
NDP (ON)
Before that happens, Mr. Chair, can I just ask Mr. Jones if something like that would be in his purview before it's passed along?
Scott Jones
View Scott Jones Profile
2021-05-31 16:15
We're really talking about two different things, Mr. Chair. There's the number of data breaches that have happened. As we highlight in our national cyber threat assessment, the Privacy Commissioner of Canada reported that 28 million Canadians had their information taken last year. That information has then been reused to target the Government of Canada. By reusing passwords, for example, somebody was able to log in.
We're not talking about information that was taken from the government. It was taken from other data breaches, but people reuse things. Our security questions are the same: What's your favourite colour? What school did you go to? That's the information these criminals have stolen, and because passwords are horrible and we all have too many of them, we tend to reuse them. A lot of Canadians reuse them, and so those credentials were able to be reused. That's what credential stuffing is. Really, we're talking about information from other data breaches turned around and used against the Government of Canada. But Marc, maybe—
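[Editor's note: credential stuffing as Mr. Jones describes it—breached username/password pairs replayed against a different service—can be illustrated with a toy sketch. All accounts and passwords here are invented.]

```python
# Credentials leaked from one service (invented data).
breached_site = {
    "alice@example.com": "hunter2",
    "bob@example.com": "swordfish",
}

# A different service's credentials: Alice reused her password, Bob did not.
government_portal = {
    "alice@example.com": "hunter2",
    "bob@example.com": "unique-passphrase-42",
}

def stuffing_hits(leaked: dict, target: dict) -> list:
    """Return accounts whose leaked password also works on the target
    service—the accounts a credential-stuffing attack would compromise."""
    return [user for user, pw in leaked.items() if target.get(user) == pw]

print(stuffing_hits(breached_site, government_portal))
```

Only the reused credential succeeds, which is why a breach of one site endangers accounts everywhere its passwords were recycled.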
View Matthew Green Profile
NDP (ON)
I do say this respectfully, because it's not often that we have a member from the Communications Security Establishment before us. This is why I'm trying to get the most out of this intervention, because I don't know when you may be back.
Is there a scenario—this is for my own edification—where the information that might have been obtained through the CRA's vulnerabilities could then have been used to re-access fraudulent CERB applications? Maybe I'm oversimplifying it or conflating it.
I'd love to hear from you, Mr. Jones.
Scott Jones
View Scott Jones Profile
2021-05-31 16:16
I think that would be a pretty unlikely scenario, to be frank, because that wasn't what we saw happening here. We saw Canadians being impersonated in this activity where they were using their legitimate credentials, so essentially logging in as them. I think that's kind of my overall response to this, but Marc might be able to tell you more.
View Joël Lightbound Profile
Lib. (QC)
Have you seen an acceleration as a result of the pandemic? I think that this was noted in your report.
People are spending more time online, and there are more conspiracy theories, for example.
Has this affected radicalization and the rate?