Lloyd Richardson
2021-02-22 12:05
There is, but I would kind of invert that a little bit. It's not a technical issue.
Let's go back in time to the 1980s, before the popularized Internet, when we had pornography: we didn't see child sexual abuse material showing up in Playboy magazine. It's not necessarily a technical issue. If you are in fact moderating everything that comes up on your platform, this should never happen. We don't see child pornography show up on the CBC's services, because moderation happens there; they have control over the content.

That's not to say you can't leverage technology, as we do in Project Arachnid, to do proactive detection of known child sexual abuse material. But let's not look to the new and fancy: "Oh, I have an AI classifier that can automatically detect child pornography." That's great and all, but it's never going to detect everything, and it's not going to have the accuracy of actual human moderators looking at the material. It's an addition to something that's already there, so it's important not to belabour the technological side of things.
Daniel Bernhard
2021-02-22 12:07
It was just to say that I agree. Platforms want to operate at a scale that requires them not to validate any of the content that comes up, yet that seems to result in illegal outcomes. It's not really for us to say how they should deal with this, but simply that if illegal content is there, they should face the consequences.
To Mr. Richardson's point, I have one final issue. It's not just CBC, CTV, etc., who make sure that their content is lawful. They also have to make sure the advertising that they run is lawful and that op-eds and other third party contributed content are lawful. Otherwise they are jointly liable. This is how the law works, and I see no reason why it shouldn't apply in the case of Pornhub, Facebook, Amazon or any other provider that is recommending and facilitating illegal behaviour through its service.
Patricia Lattanzio
Lib. (QC)
Thank you, Madam Chair, and thank you to our guests for being here this morning.
My first question is for you, Mr. Clark.
We heard Mr. Antoon from MindGeek say during his testimony how proud he was of being a partner with NCMEC. He clearly said in his testimony that he reports every instance of CSAM when they are aware of it, so that the information could be disseminated or investigated by authorities across the globe.
When we asked him to give us a report of how many instances were reported just in 2019, they couldn't answer. Would you be in a position to provide us information for 2019 and also, if you can, going all the way back to 2008?
John F. Clark
2021-02-22 12:26
Sure. As was noted in my testimony, they are not a partner. In fact, we sent a letter to them soon after their testimony when we became aware that they were saying they were a partner, telling them that it was not true and that they should cease and desist from saying so. That's important to note.
In terms of the 2019 numbers, I'm not aware of any reporting that was happening in that particular calendar year. In 2020, we did begin to receive some of the reports. Again, this is on a voluntary basis. We did note that many of those reports were duplicative.
We have encouraged them, as we do all the ESPs, to adopt stricter content moderation. They should know beforehand what is being uploaded and whether that content meets the legal requirements for posting. If it does not, it should not go up, period. They should not have to go back and look for it, or have a victim call in and ask for it to be taken down.
That's something we encourage all of them to do.
Stephen White
2021-02-22 12:40
Thank you very much.
Good afternoon, Madam Chair and honourable members of the committee. Thank you very much for the opportunity to speak with you today on this pressing matter. My colleagues from the RCMP have been introduced.
I'd like to highlight that Chief Superintendent Marie-Claude Arsenault is with us. She oversees sensitive and specialized investigative services, which also includes the National Child Exploitation Crime Centre. Also with us is Mr. Paul Boudreau, executive director of technical operations for the RCMP. It's also a pleasure to have our colleague from the Department of Justice with us as well.
I'd like to describe for a couple of minutes a broader context of online child sexual exploitation and highlight the RCMP's steadfast efforts towards combatting this crime and bringing offenders to justice.
Online child sexual exploitation is one of the most egregious forms of gender-based violence and human rights violations in Canada. Not only are children, particularly girls, victimized through sexual abuse, but they are often revictimized throughout their lives, as photos, videos and stories of their abuse are shared repeatedly on the Internet amongst offenders.
In 2004 the Government of Canada announced the national strategy for the protection of children from sexual exploitation on the Internet, which brings together the RCMP, Public Safety Canada, the Department of Justice and the Canadian Centre for Child Protection (CCCP) to provide a comprehensive, coordinated approach to enhancing the protection of children from online child sexual exploitation. The Canadian Centre for Child Protection is a non-governmental organization that operates Cybertip.ca, Canada's tip line for reporting suspected online sexual exploitation of children.
The Criminal Code provides a comprehensive range of offences relating to online child sexual exploitation. Canadian police services, including the RCMP, are responsible for investigating these offences when there is a possible link to Canada. The Criminal Code also authorizes courts to order the removal of specific material, for example, a voyeuristic recording, an intimate image or child pornography, that is stored on and made available through a computer system in Canada.
The RCMP's National Child Exploitation Crime Centre is the national law enforcement arm of the national strategy and functions as a central point of contact for investigations related to online sexual exploitation of children in Canada and international investigations involving Canadian victims, offenders or Canadian companies hosting child sexual exploitation material.
The centre investigates online child sexual exploitation and provides a number of critical services to law enforcement agencies, including immediately responding to a child at risk; coordinating investigative files with police of jurisdiction across Canada and internationally; identifying and rescuing victims; conducting specialized investigations; gathering, analyzing and generating intelligence in support of operations; engaging in operational research; and developing and implementing technical solutions.
The centre has seen first-hand the dramatic increase in reports of online child sexual exploitation in recent years. In 2019 the centre received 102,927 requests for assistance, an increase of 68% since 2018 and an overall increase of 1,106% since 2014. The majority of the referrals the centre receives come from the National Center for Missing and Exploited Children in the United States. Every report is assessed and actioned where possible.
In addition to the high number of reports, cases of online child sexual exploitation have become more complex. Advances in technology such as encryption, the dark Web and tools to ensure anonymity have made it much easier for offenders to conduct their criminal activities away from law enforcement agencies. Investigations related to online platforms also raise a host of other Internet-related issues, including the failure of platforms to retain data, the amount and speed at which content can be posted and distributed, and the ability of users to download hosted content.
When content is successfully removed from one platform, it can easily be uploaded to the same platform or to other websites, perpetuating victimization and leading to a proliferation of content depicting sexually exploited children on multiple platforms. It is well known that offenders protect this type of content on personal devices or through cloud computing services.
Like many cybercrimes, online child sexual exploitation is often multi-jurisdictional or multinational, affecting victims across jurisdictions and creating additional complexities for law enforcement. No single government or organization can address this crime alone. The RCMP works diligently with its partners at the municipal, provincial and federal levels in Canada and internationally, as well as with non-governmental organizations, to strengthen efforts to rescue victims and bring offenders to justice. In fact, the RCMP is the current chair of the Virtual Global Taskforce, an international police alliance dedicated to the protection of children from online sexual exploitation and other transnational child sex offences. The Virtual Global Taskforce consists of law enforcement, NGOs and industry partners working collaboratively to find effective response strategies. Chief Superintendent Arsenault, who is with us today, is the current chair of this very important group.
The RCMP also seeks to work closely with the private sector as offenders regularly utilize platforms operated by Internet and/or communications service providers to carry out a range of Criminal Code offences relating to online child sexual exploitation.
The RCMP regularly engages private sector partners to discuss existing legislation, which includes an act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service, referred to as the mandatory reporting act, which came into force in 2011. The mandatory reporting act requires that Internet service providers report to the Canadian Centre for Child Protection any tips they receive regarding websites where child pornography may be publicly available. Under the mandatory reporting act, Internet service providers are also required to notify police and safeguard evidence if they believe that a child pornography offence has been committed using their Internet service. Since the mandatory reporting act came into force in 2011, the RCMP has seen a continual increase in reporting from industry partners.
Many online platforms pose jurisdictional challenges, as I outlined earlier. An online platform registered in Canada may maintain its servers abroad, which could limit the effect of a Canadian warrant. Further, when a global company registered abroad has a Canadian presence, it is likely to host its content abroad, making jurisdiction difficult to determine.
When an online platform permits its users, or itself, to download material to and upload material from their own computers, it becomes impossible to determine where the material may be stored or to prevent it from reappearing and being further disseminated.
New companies, platforms and applications will continue to emerge, and the services they provide to Canadians will continue to evolve. It is important that the Government of Canada, legislative authorities and law enforcement agencies keep pace and adapt accordingly to combat these crimes.
The illegal online content that many communications service—
Arnold Viersen
CPC (AB)
Thank you, Madam Chair.
I want to thank the witnesses for being here.
Section 163.1 of the Criminal Code makes it an offence to make available or distribute CSAM. MindGeek executives told us that none of this exists on their site. Have you found this to be true, Mr. White?
Stephen White
2021-02-22 12:49
Actually, we have had referrals from our partner in the United States that I mentioned, NCMEC, concerning disclosures by that company to us about that content.
Arnold Viersen
CPC (AB)
Okay.
We've heard multiple times at this committee that there are documented cases of videos of exploitation of minors being put up on MindGeek. The big question for us is to understand why the owners have not faced charges. Is there a problem with the laws as they currently stand?
Stephen White
2021-02-22 12:50
Obviously, there are a number of elements that determine when and what types of charges will be laid. These corporations are service providers and hosting platforms, and other individuals can upload content directly onto the platforms. There are also jurisdictional issues, but every case is different.
I would ask my colleague, Chief Superintendent Arsenault, if she can add to that.
Marie-Claude Arsenault
2021-02-22 12:51
I would add that every situation is assessed, and the evidence we have determines whether we have enough to proceed with charges. As was mentioned, with MindGeek or Pornhub, we've received, since June 2020, about [Technical difficulty—Editor] reports, which were—
Marie-Claude Arsenault
2021-02-22 12:52
We've received 120 reports, which have been triaged and prioritized. Some were referred to other law enforcement agencies in Canada, and others were deemed not to involve online sexual exploitation, for various reasons. That's it.
Arnold Viersen
CPC (AB)
The presence of the download button on the MindGeek-Pornhub site seems like a clear violation of the distribution part of the CSAM laws. Is that part of your case? Has that been flagged? How come we haven't seen any charges?
Marie-Claude Arsenault
2021-02-22 12:53
Well, again, we have to look at all the evidence we have, and my understanding is that this function has been taken out. Without the evidence, there's limited action we can take. We are assessing all the reports we're getting now and determining whether charges are likely.
Arnold Viersen
CPC (AB)
Okay.
Until recently, you had not had any reports of CSAM on MindGeek or Pornhub sites. Am I hearing you correctly?