View Ginette Petitpas Taylor Profile
Lib. (NB)
I call this meeting to order.
Good afternoon, everyone. It is 2:44 Atlantic Time.
Welcome to the third meeting of the Subcommittee on Private Members' Business. Pursuant to Standing Order 91.1(1), we are meeting to consider the items placed in the order of precedence of May 31, 2021, to determine whether they should be considered non-votable.
Since I believe we are all online, I don't have to read the instructions that would apply if anyone were in the room.
During this meeting, should you wish to get my attention, please signal me with your hand gesture or, at an appropriate time, call out my name.
Madame Normandin, before the meeting officially started, I asked all the members if there were any items they had any issues with, and I am going to ask the same question to you. Is there any item you would like to discuss?
View Shannon Stubbs Profile
CPC (AB)
2021-06-07 11:08
Thank you, Chair.
Minister, thanks for being here.
Just to start, do you think Bill C-10 is adequate to combat child sexual abuse material and rape and non-consensual material online?
View Steven Guilbeault Profile
Lib. (QC)
I was invited to talk about our upcoming legislation regarding online harms, which I'm happy to do. If this committee would like to invite me to talk about Bill C-10, I would be happy to appear at another time to do that.
View Shannon Stubbs Profile
CPC (AB)
2021-06-07 11:08
I'll take that as a “no” for Bill C-10.
Witnesses said previously that Canada's Criminal Code “child pornography” definition is among the world's broadest. It bans images, audio and written forms. Platforms are already liable for circulating illegal user-generated content. There are circumstances in which a company becomes liable for something that somebody else said or did if the company knew about it in advance and published it anyway, or if the company was notified about it after the fact and failed to take action. These situations are very well documented with MindGeek and Pornhub. It seems the real and disturbing issue is a lack of application of the law and its enforcement.
In January, you said that within a few weeks you were going to create a regulator to stop child sexual abuse material and sharing of non-consensual images online. I'm just wondering why there hasn't been any serious progress on that. I have a couple of questions about that for you from survivors. What's the delay?
View Steven Guilbeault Profile
Lib. (QC)
I respectfully disagree with the premise of the question. What we see here in Canada, and frankly, all around the world, is that the tools we have to deal with these harms in the physical world just aren't adapted to deal with them in the virtual world.
Let me give you an example. In 2019, the RCMP saw a 1,106% increase over 2014 in reports of online child sexual exploitation. This exploitation disproportionately impacts girls. In 2019, the RCMP found that girls made up 62% of identified Canadian victims depicted in online child sexual exploitation material.
I did say I was hoping to introduce this legislation in January. Unfortunately, the systemic obstruction by the Conservative Party regarding Bill C-10 has prevented me from doing so. However, I am still hoping to table this bill as soon as possible.
View Jacques Gourde Profile
CPC (QC)
Well, then, let's talk about something else, Minister. We're not talking about culture, we're talking about protecting our children.
When will your next bill be introduced?
View Steven Guilbeault Profile
Lib. (QC)
As quickly as possible. I can already tell you that your party will oppose that bill as well. Your party...
View Jacques Gourde Profile
CPC (QC)
That is speculation, Minister.
We want to protect our children. Table your bill as soon as possible, before an election is called. If there is an election this fall, absolutely nothing will happen for the next two years.
There are children in Canada who are thinking about suicide. They are not being protected right now, Minister. Why is this coming back into your court? It should have been the responsibility of the Department of Justice. You may not be in the best position to help our children right now.
View Steven Guilbeault Profile
Lib. (QC)
I want to start by saying that the Internet and the sexual exploitation of children on the Internet existed before 2015. Your party was in power for 10 years. On the one hand, you did nothing about this issue, despite the existence of this phenomenon.
On the other hand, the sooner your party stops its systematic obstruction of Bill C‑10, the sooner...
View Jacques Gourde Profile
CPC (QC)
Your arguments are being made from an electoral perspective, Minister. You don't want to help children. Right now they need help and we want to help them. You are not helping us.
You are already in an election campaign. You are making election-minded comments and it's really sad. I'm really disappointed in your attitude, because we are all elected to improve the lives of Canadians. Please stop your electioneering and tell us how you are going to help our children.
View Steven Guilbeault Profile
Lib. (QC)
We want to do several things. As stated in my mandate letter, the bill will make it possible to remove all illegal content within 24 hours, thereby forcing companies to do so. Companies currently aren't doing this. The bill will also help implement an effective and user‑friendly content moderation system. Platforms will be subject to greater transparency obligations with respect to reporting online harms, such as child sexual exploitation, to law enforcement.
View Jacques Gourde Profile
CPC (QC)
Rest assured, Minister Guilbeault, that we'll be there to help you. Don't speculate. This bill hasn't been tabled.
Thank you, Mr. Chair.
View Charlie Angus Profile
NDP (ON)
Thank you so much to the witnesses. It's wonderful to have Madam Lukings back.
This committee does not have a mandate to look into sex work. We are the privacy committee. There's the women's committee, the justice committee. There are many, many important issues. We've heard many important issues here.
Our focus started out from that article that Madam Shanahan called “sensationalist”. It was a New York Times article with Serena Fleites.
She came to our committee, and she stated that she tried time and time again, as a 13-year-old, to take it down. Pornhub's executives told us they had no record and they weren't sure of when she contacted them.
Mr. DeBarber, in your experience, is that a credible answer, that Pornhub wouldn't have known about this video or known about efforts to have it taken down? Is it the dark net inside corporate headquarters?
View Charles DeBarber Profile
2021-06-07 12:53
My honest answer is that I believe your victim, first off.
To share something just as seedy that happened, there is right now a criminal conviction for human sex trafficking surrounding the defunct site, GirlsDoPorn. It's infamous. There are a lot of great articles about it. I had clients who were even raped during the entire time. It's a horrifying situation.
They were a content partner for Pornhub. As early as 2016, at least from my records, they were already seeing statements from more than one Jane Doe about the process and what went down there, and they kept them as a partner, literally almost to the day of the civil judgment in 2019, where 40 Jane Does stood up.
I completely believe them.
View Charles DeBarber Profile
2021-06-07 12:53
Well, I'll put it this way. That is a lot of data to track, in fairness, if I'm looking at it from the cybersecurity point of view. I can't tell you if they had the data or not; it all depends on how much they archive.
To be plain, I fully believe your victim. This is a company that I strongly believe has some heavy liability out there and should face consequences for it.
View Charlie Angus Profile
NDP (ON)
Thank you, Mr. Chair.
I'm speaking to my motion today to invite Mr. Steven Guilbeault, the Minister of Heritage, to come to the ethics and privacy committee to testify on the plans that are being led through the heritage department to deal with the allegations of non-consensual sexual assault videos that exist on PornHub.
At the April 12 ethics meeting, we were informed by security minister Bill Blair that the government of Mr. Trudeau will “introduce legislation to create a new regulator that will ensure online platforms remove harmful content, including depictions of child sexual exploitation and intimate images that are shared without consent” and that “Public Safety Canada and other departments are working on this proposed legislation with Canadian Heritage, which leads this effort.”
We have had no indication of what this new regulator is and I think we need clarity.
I would just step back a minute and say that this all stems from the December 2020 reports that came out of the United States on horrific abuse of children and sexual assault victims on PornHub, a company that is based in Canada. We began our study at that time to see if our laws were insufficient or if there was a problem. We asked the RCMP to come. The RCMP have made it clear that they are not moving forward with allegations against PornHub. They've talked about their being a partner. They've talked about voluntary compliance.
I received the RCMP's internal briefing documents in response to the December 2020 article, and in that document, it talks about what next steps have to be done and it mentions the leadership of the heritage department. My office asked the RCMP to send us the blacked-out information to explain why the RCMP is deferring to Mr. Guilbeault's office. My staff was told that this would breach cabinet confidence.
What that tells me is that after the December 2020 article came out in The New York Times on PornHub, this issue was discussed at the cabinet of Prime Minister Justin Trudeau and a decision was made then to have Mr. Steven Guilbeault and the heritage department handle this file, rather than transferring it over to police, to the Attorney General or to public security.
I think this is really important. We cannot finish our PornHub study without knowing what exactly the government's plan is, because we have Bill C-10 right now that Mr. Guilbeault is in charge of, and I think the government shocked everybody when they decided to put user-generated content under Bill C-10. I've talked to many arts organizations that were shocked that Bill C-10 includes user-generated content. It is nothing that the artists' community wanted. They want Facebook and Google to pay their share. Where is this user-generated content coming from? Is this to address the allegations the survivors brought to us on PornHub?
If that is the case, Mr. Guilbeault needs to explain that, because I don't think you could disrespect survivors in any more of an egregious fashion than to suggest that sexual assault videos or videos of the torture of children that were brought forward to our committee are somehow considered user-generated content in Canada. What does that say to survivors? What does that say to the women of the global south who I have been meeting with, who are speaking from Nigeria, Colombia, Spain and France, talking about the sexual assault videos from their countries that are being posted on a Canadian site?
Are the Liberals telling us that they consider sexual assault and criminal acts mere content that can be handled by a regulator? Are they going to hand it off to the CRTC under Bill C-10, or are they going to create a new pornography regulator? I would like to know what that pornography regulator would be, because, again, I had excellent meetings following the debacle of our meetings with the sex workers, and Ms. Lukings provided really interesting analysis of how what we want to do is to make sure we hold corporations accountable for what's online, but we don't want to push stuff to the dark net.
If the Liberals have this idea that Mr. Guilbeault could set up some kind of regulator to tell us—I don't know—Canadian content in porn, good porn, bad porn.... Do we need a regulator or do we simply need the Liberal government to apply the laws?
We can look at the laws we have in Canada. In section 162 of the Criminal Code, it is a crime to film the private acts of individuals or people without their consent. It is a crime to circulate, to sell, to advertise or to make available the recording. We have a law. In section 163, sexual videos of crime, cruelty and violence are classified as criminal in behaviour. We heard from the survivors of non-consensual sexual assault videos that their videos were videos of crime, cruelty and violence. Section 164 gives the authorities, which would be the RCMP, the power to issue warrants to seize the recordings of voyeuristic videos of crimes as well as child pornography.
We have mandatory reporting laws. We have learned that Pornhub has not followed through on them. Pornhub has not respected the laws we have in this country.
The Attorney General doesn't seem to even think it applies, because he's not sure if this Montreal-based company is a Canadian company. If the Attorney General, who lives in Montreal, isn't sure that Pornhub is a Canadian company, even though their address is on Décarie Boulevard and everybody in Montreal who goes to work passes their office in the morning, then how are we expected to believe that the CRTC or some kind of regulator will handle this?
I think Mr. Guilbeault needs to come and explain this to us. What is the government's plan for dealing with the issues of sexual violence on Pornhub that have come to our committee? Are we going to ignore Canadian law or are we going to establish the CRTC to do this? Is this going to be Bill C-10 or...? Mr. Blair suggested that they're going to create a new regulator.
I think Mr. Guilbeault needs to come and inform us so that we can actually finish a report on what Parliament needs to do to address these disturbing allegations of brutality and non-consensual sexual assault of women, not just from Canada but from around the world. We need to be able to respond to those survivors and to the Canadian people that we've done our job. We cannot do that job without Mr. Guilbeault coming and explaining why he is the lead person appointed by the Trudeau government to address these very serious allegations.
I'd like to bring that motion forward for a vote.
View Chris Warkentin Profile
CPC (AB)
We'll move to a vote on the amendment.
Madam Clerk, I'm wondering if you'll run through the roll call for the purposes of the vote on the amendment. This is Mr. Dong's amendment. Then we'll vote on the main motion.
(Amendment agreed to: yeas 10; nays 0 [See Minutes of Proceedings])
(Motion as amended agreed to: yeas 10; nays 0 [See Minutes of Proceedings])
The Chair: Members, that's very helpful. I'm glad we can do that.
Of course, next week our meetings are scheduled to be the review of the report on pandemic spending. I think Mrs. Shanahan may have some suggestions for meetings in the week that follows.
Mrs. Shanahan.
Jennifer Clamen
View Jennifer Clamen Profile
Jennifer Clamen
2021-04-19 11:17
Sure.
Hopefully you can hear me say that the irony of you not being able to hear me while I'm talking about sex workers not being heard is not lost. Hopefully that will give all of you a little smile for the day. Maybe you can't hear that.
Madame Gaudreau, you cannot hear me say it still? You can? You thought my joke was funny, fantastic.
I'll just continue on some of the principles that underscored our recommendations.
The last few concern how singling out sex workers, and activities related to sex work, for additional prohibition or repression is virtually always harmful for people working in the sex industry. Some of the briefs you received, particularly the one from West Coast LEAF and other partners, really outlined the evidence around that.
Any legislation or policy or repressive measure that you're thinking of right now should really maximize the autonomy of sex workers to be able to work as safely as possible in keeping with sex workers' human rights to safe working conditions, liberty, privacy, non-discrimination and dignity.
I'll finish up now, but there's been a lot of discussion around youth in this committee as well. There's been a lot of conflation of issues: youth, exploitation, the sex industry, human trafficking; these words are being bandied about very carelessly. In our recommendations for law reform, we also took the time, with the hundreds and hundreds of sex workers, to talk about recommendations for youth. I wanted to share those with you too, because our recommendations really stem from recognizing the agency of people, and that includes people under 18. By agency I mean the capacity to think and the capacity to make decisions in whatever set of conditions a person is living in.
The entire focus on child exploitation and human trafficking in this committee has been completely overblown. That is not to suggest these things don't exist in real space and time, but they have been overblown with respect to the conversations in this committee. The framing of all content online as youth exploitation really undermines sex workers' ability to keep safe and makes it harder to address violence in the industry. When we see everything as violence in the industry, it's hard to understand when sex workers are actually experiencing violence.
The alliance's groups recommend the following principles with respect to understanding youth in any regulations that involve youth: a harm reduction approach that requires authorities to use the least intrusive approach towards youth, with the emphasis on preserving community; and a recognition that repression, apprehension, detention and rehabilitation are often experienced as antagonistic and traumatic, and often push youth away from supports rather than towards them.
The alliance's member groups also recommend a reliance on existing laws rather than the creation of new laws, additional regulations and law enforcement measures that move people away from supports rather than towards supports.
I'll conclude by saying that last week we heard Bill Blair say what all sex workers were fearing as a result of this committee. We were assured it had nothing to do with this committee, but we heard Bill Blair say that he was thinking of creating a new regulatory body for online content. We can't stress enough that more regulation is not the answer and that it will actually just harm sex workers, and harm the industry in general with respect to sex workers' rights.
There's also the ongoing parallel work of Bill S-203 submitted to the Senate by Julie Miville-Dechêne.
On top of this there's the continual refusal of Parliament to decriminalize sex work, despite the evidence that regulation and criminalization harms sex workers.
Targeting Internet sex work during a pandemic is such an aggressive and violent move on your part and on the part of everybody who's considering regulation right now. The Internet has been a safe haven for so many workers who are unable to face the conditions of COVID like so many sex workers. Some sex workers, but not all, have moved online and have been able to support themselves this way, so it is important, now more than ever, to protect these spaces and to ensure that sex workers can continue to work without violence and exploitation.
If you want to know how to protect people on platforms like Pornhub, create a committee, sit down with people who actually post their content on Pornhub, sit down with sex workers and talk to us.
Thank you.
View Arnold Viersen Profile
CPC (AB)
All right.
Over 200,000 people had watched the video of her being assaulted while she lay drugged and unconscious. On that day in August, mortified, dizzied by her discovery of the betrayal, Legarde prepared to tie a noose.
“I was standing in my garage, under a beam, holding onto a rope”, she recalled, but finally, she changed her mind. She said, “I said to myself, 'If this is your situation, he'll do it to someone else tomorrow.'” Legarde resolved her own story and fought back, so now it doesn't have to happen to other girls.
We've heard several stories like this from people who have come to this committee.
Nicholas Kristof points out that this isn't about pornography. This is about rape and sexual abuse. He's also heard from a Canadian student who said, “I have no problem with consensual adults making porn.” Her concern is that many people in the pornographic videos weren't consenting adults, like her. Kristof writes that after she turned 14, a man enticed her to engage in sexual play over Skype. He secretly recorded her. A clip, along with her full name, ended up on XVideos, the world's most-visited pornography site. Google searches helped direct people to this illegal footage of child sexual abuse. This Canadian student shared with Kristof how she begged XVideos to remove the clip. Instead, XVideos hosted two more copies so that hundreds of thousands of people could leer at her at the most mortifying moment in her life.
I also want to highlight another study that came out at the beginning of the month and that may be important to this committee's work. The study, published in The British Journal of Criminology, looked at the ways in which mainstream pornography positions sexual violence as a normative sexual script. By analyzing videos and titles found on the landing pages of the three most popular pornography sites in the United Kingdom—XVideos, Pornhub and XHamster—the study drew the largest research sample of online pornographic content to date, over 130,000 titles. It is unique in its focus on the content immediately advertised to the new user. The academics found that one in eight titles shown to a first-time user on the main page of these porn sites depicted sexual violence or non-consensual content.
Mr. Chair, we have heard from people from across the spectrum about how they have been targeted and exploited by companies such as Pornhub, and that is what this study is all about.
Kate was 15 years old. Her ex was 20. He was into making homemade videos and stuff, and he videotaped her. One day he said, “Let me show you something.” She tried to get the content taken off Pornhub. It took her years to get rid of that content.
Rosa was 16. She was drunk at a friend's party. She woke up and there were naked pictures of her on Pornhub, with her name and her phone number. She had endless calls and texts. She had to change her number.
Nicole was 14. She made a decision that changed her life. She was having a sexual FaceTime with someone she didn't know. “I didn't know anything about him, his name or his age or anything”, she said, “but I showed him areas of my body that were private. I didn't know it at the time but he was recording it and uploaded it to Pornhub. The name of the video was even 'Young Teen', but that wasn't enough for Pornhub to analyze it and take it down. No, years later, classmates of mine found out about me and the pornography that was shot of me as a child. I've had the police involved on multiple occasions and cannot get the videos taken down.”
This is a video of Rosella, who was raped when she was 14, yet the video is still up on Pornhub.
Kyra, at the age of 15, was coerced into making a film of a sexual act. The video had been uploaded, without her consent, to Pornhub. The uploader was also underage. No one confirmed anyone's age or consent. “I've been dealing with image issues, PTSD, sexual discomfort since the incident and into adulthood. This is my personal account, and I have heard similar stories from other women. I will never forgive Pornhub for allowing my abuse to be shared publicly. It caused me to relive my pain, year over year over year.”
View Bill Blair Profile
Lib. (ON)
Thank you very much, Mr. Chair, and good morning to members of this committee.
I'd like to begin by thanking all of you for the invitation to join you this morning for this very important and very timely study on a very significant issue.
As I think everyone recognizes, the sexual abuse and exploitation of a child—any child—in any context, on any platform and in any place is intolerable and unacceptable. It is the most heinous of crimes and deserves society's strongest condemnation and our effective response.
Recording the sexual abuse of a child can have significant lifelong impacts on both the victims and the survivors of this crime. Sadly, as some of these victims grow older, many come to realize that their images continue to be circulated on the Internet, and they are revictimized over and over again as this material is shared.
I'd like to take this opportunity to recognize the remarkable courage and resilience of survivors in coming forward and speaking out. I've had an opportunity to meet with [Technical difficulty—Editor], and I think I share this committee's dismay at reports that abhorrent material of this kind has been found on these platforms. It is unacceptable that the victims have encountered difficulties in getting companies to remove this illegal content.
Their stories and experiences remind all of us of the important work that we must do and are doing to protect children and youth. The Government of Canada plays a leading role in these efforts to combat online child sexual exploitation and, Mr. Chair, we are taking action to increase awareness and to reduce the stigma of reporting. This is important, because we know that the number of reported cases is just the tip of the iceberg when it comes to the true scale of this most heinous of crimes.
Internet companies must also do more to protect children, and we are taking steps to hold them to account for their role in this. We are also taking action to bring more perpetrators to justice by supporting efforts to detect, investigate and prosecute these cases. I have asked the RCMP commissioner to continue to work with her provincial and territorial counterparts to address this crime and to ensure that prosecutions are done when deemed appropriate by evidence and by law enforcement.
Canada's national strategy on this issue is led by Public Safety Canada, which works in partnership with the RCMP, the Department of Justice and the Canadian Centre for Child Protection or, as it's very often known, C3P. We are backing this national strategy with ongoing annual funding of more than $18 million. That includes support for Cybertip.ca, a national tip line operated by C3P. It also includes $5.8 million in ongoing funding announced in 2018 to increase the investigative capacity of the RCMP's National Child Exploitation Crime Centre.
On top of this, in budget 2019, we invested $22.2 million over three years in additional funding to better protect children from this horrendous crime. Of that amount, $15 million is specifically aimed at enhancing the capacity of Internet child exploitation units in municipal and provincial police services right across Canada. These specialized units are dedicated to investigating cases of online child sexual exploitation. Investments in budget 2019 are also helping to increase public awareness of this crime, reduce the stigma associated with reporting and work with the digital industry to find new ways to combat sexual exploitation of children online.
At the same time, it's important to acknowledge the complexities and jurisdictional challenges involved in what is often a borderless crime. Perpetrators and victims can be located anywhere in the world, and images of child sexual abuse and exploitation can be shared on platforms that may be headquartered in one country but legally registered in another, with servers in yet a third and different country.
This challenges both the authority of Canadian law enforcement agencies to investigate and the application of Canadian laws, but I am confident that law enforcement continues to do everything possible to investigate these horrendous crimes and prosecute those responsible. International co-operation is key in this regard. I want to assure you that the RCMP and the Department of Justice work very closely with international partners on investigations and prosecutions.
We also work closely with our international allies and partners to find solutions to better protect children and youth. Last year, for example, Canada and its Five Eyes partners launched the “Voluntary Principles to Counter Online Child Sexual Exploitation and Abuse”. These principles are a guide for industry on how to counter this scourge on their platforms.
We recognize also that there is much more work to do, and that's why we will introduce legislation to create a new regulator that will ensure online platforms remove harmful content, including depictions of child sexual exploitation and intimate images that are shared without consent. Public Safety Canada and other departments are working on this proposed legislation with Canadian Heritage, which leads this effort.
We will continue to do everything we can to protect Canadian children and support Canadian survivors of this terrible crime, and we will continue to work with domestic and international partners to investigate cases in which evidence exists and bring the perpetrators to justice.
Thank you very much, Mr. Chair.
View David Lametti Profile
Lib. (QC)
Thank you, Mr. Chair.
I'm accompanied today by François Daigle, the associate deputy minister of the Department of Justice. Thank you for the invitation to appear before you today.
I'd like to make some general comments on some of the issues raised during previous meetings of the committee's study.
I'd like to emphasize that the government is committed to keeping our children safe, including online, as Minister Blair just said. Canada's criminal legislation in this area is among the most comprehensive in the world.
The Criminal Code prohibits all forms of making, distributing, transmitting, making available, accessing, selling, advertising, exporting and possessing child pornography, which the Criminal Code broadly defines as material involving the depiction of sexual exploitation of persons under the age of 18 years.
The Criminal Code also prohibits luring—that is, communicating with a young person, using a computer, including online, for the purpose of facilitating the commission of a sexual offence against that young person. It prohibits agreeing to or making arrangements with another person to commit a sexual offence against a child, and it prohibits providing sexually explicit material to a young person for the purpose of facilitating the commission of a sexual offence against that young person.
Furthermore, the Criminal Code also prohibits voyeurism and the non-consensual distribution of intimate images, which are particularly germane to both the online world and the discussion we are having today.
Offences of a general application may also apply to criminal conduct that takes place online or that is facilitated by the use of the Internet. For example, criminal harassment and human trafficking offences may apply, depending upon the facts of the case.
Courts are also authorized to order the removal of child sexual exploitation material and other criminal content, such as intimate images, voyeuristic material or hate propaganda, where it is being made available to the public from a server in Canada.
In addition to the Criminal Code, as Minister of Justice, I'm responsible for the Act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service. This act doesn't have a short title, but law practitioners refer to it as the mandatory reporting act.
In English, it's the mandatory reporting act, or MRA.
Under the mandatory reporting act, Internet service providers in Canada have two main obligations. The first is to contact the Canadian Centre for Child Protection when they receive child pornography complaints from their subscribers. This centre is the non-governmental agency that operates Cybertip.ca, the national tipline for reporting the online sexual exploitation of children.
The second obligation of Internet service providers is to inform the provincial or territorial police when there are reasonable grounds to believe that their Internet services have been used to commit a child pornography offence.
While Canada's laws are comprehensive, it is my understanding that there has been some concern as to how they are being interpreted and implemented, especially in relation to the troubling media reports about MindGeek and its Pornhub site.
Since I am the Minister of Justice, it would not be appropriate for me to comment on ongoing or potential investigations or prosecutions, but I would note that responsibility for the administration of criminal justice, including the investigation and prosecution of such crimes as sexual exploitation offences, falls largely to my provincial colleagues and counterparts.
However, as the Prime Minister stated during question period on February 3:
...cracking down on illegal online content is something we are taking very, very seriously. Whether it is hate speech, terrorism, child exploitation or any other illegal acts....
In fact, the government takes these measures so seriously that the Prime Minister has given four ministers the mandate to address different aspects of online harms. Minister Blair and I are two of these ministers. As he has mentioned, the Minister of Canadian Heritage is one of the lead [Technical difficulty—Editor] as well.
While the Internet has provided many benefits to Canada and the world, it has also provided criminals with a medium that extends their reach—and thus, their victim base—and a medium that elevates the level of complexity of investigations. One complicating factor is that telecommunications networks and services transcend international borders, while the enforcement authority of police, such as the RCMP, is generally limited to their domestic jurisdiction.
Further, under international law, court orders are generally enforceable only within the jurisdiction of a state. With limited exceptions, their enforcement requires the consent of the other state in which they are sought to be enforced.
Canada is obviously not the only country facing these challenges, which is why we continue to work with our international partners to facilitate international co-operation in the investigation and prosecution of these crimes, notably to strengthen bilateral co-operation and negotiation of new international mutual legal assistance treaties in criminal matters in order to address these issues.
Although mutual legal assistance treaties are a universally accepted method of requesting and obtaining international assistance in criminal matters, even in emergency situations, they weren't designed for the Internet age, where digital evidence is a common component of most criminal investigations and where timeliness is essential to the collection of this evidence because of its volatility.
Canada is actively working with its international partners to address these issues. For example, we are currently participating in the negotiation of a second protocol to the Council of Europe Convention on Cybercrime to enhance international co-operation on cross-border access to data.
Thank you.
View Patricia Lattanzio Profile
Lib. (QC)
Thank you, Mr. Chairman.
Thank you, everyone, for being present this morning, both the ministers and Ms. Lucki. Thank you for participating and helping this committee move forward with this very important study.
My first question will be for Mr. Blair.
Mr. Blair, just last month, Public Safety Canada launched a national awareness campaign targeting children, parents and caregivers in order to raise awareness of online child sexual exploitation and abuse: what this heinous crime is, how to report it and how to reduce the stigma associated with reporting. Why are awareness and stigma reduction practices so important?
View Bill Blair Profile
Lib. (ON)
Thank you very much, Madam Lattanzio.
It's a very important question. We understand that public education for children and their parents, and awareness of the issue of child sexual exploitation on the Internet, are absolutely critical in giving families and young people the tools they need. We are also working hard to remove the stigma, because we know that many people have been deeply traumatized by this most heinous of crimes, and we want to empower people to come forward and take action to protect themselves.
At the same time, we also recognize the importance of strong support for criminal investigations. I want to acknowledge that the RCMP runs the National Child Exploitation Crime Centre, but we work very closely with the Canadian Centre for Child Protection, which undertakes, on our behalf and with our funding, support for victim identification and victim support strategies to provide assistance to survivors and tailored resources for victims and their families.
We know that victimization through the online sexual exploitation of children can have lifelong consequences. It's critically important that we raise public awareness of the issue. We know that during the pandemic a lot of kids are spending a lot more time online, and we want to make sure they can do it safely. That can be done through public education and working with their families. At the same time, we also recognize that predators are out there, and we need to make sure we have the tools and the resources necessary to apprehend, deter and prosecute those individuals.
View Shannon Stubbs Profile
CPC (AB)
2021-04-12 12:15
Thanks, Chair.
I have some questions for you, Commissioner Lucki. I have been looking at the website for the various child sexual exploitation units in the RCMP. I would also just note the recent reporting by the director of Cybertip, who says that in 2020 his [Technical difficulty—Editor] spike over April, May and June [Technical difficulty—Editor] youth who had been sexually exploited and reports of people trying to sexually abuse children.
I wonder if you could confirm that cases of child sexual exploitation online have increased during the past year. In that context, could you also shed some light on exactly what the support and resources were that the public safety minister says he offered when he reached out to the RCMP after members of Parliament and victims spoke out on this travesty last year?
Brenda Lucki
2021-04-12 12:16
Thank you so much for that question.
COVID-19 has especially heightened risks to children, as offenders have taken advantage of the fact that children are spending more time online and are often unsupervised. Since the onset of the pandemic, the centre has seen increased activity related to online child sexual exploitation. From March to May 2020, the centre recorded an approximately 36% increase in reports of suspected online child exploitation, attributed in part to the increase in viral media and a tangible increase in self-exploitation cases.
We also anticipate more reporting of child exploitation offences, both online and off-line, as pandemic-related restrictions are slowly lifted and children once again have access to trusted adults: their teachers, caregivers and community support services. That access was largely limited at the onset of the pandemic, likely preventing children from reporting abuse to trusted adults outside their homes, which is such a crucial avenue for disclosure.
In terms of your question with regard to Minister Blair reaching out to the RCMP, whenever an issue like this arises, such as the increase in child exploitation, we have a conversation about the things we can do to prevent these crimes. Obviously, we're looking at legislation, and we're looking at the mandatory reporting act. We spoke about resources. We spoke about technology. We've talked about things within the acts, how those could improve law enforcement and how we could better reach out to law enforcement.
View Han Dong Profile
Lib. (ON)
Thank you very much, Chair.
Commissioner Lucki, thank you very much for coming to the committee and answering some questions.
First things first: when the story broke, it raised a lot of concern among the public. I've been getting a lot of questions from my constituents in Don Valley North who are asking what they should do if they ever encounter a situation like that or any evidence of a child being exploited. I'm hoping you can very quickly tell us a bit more about the National Child Exploitation Crime Centre.
Does it provide 24-hour service? What kind of service is there? What can any member of the public do if they sense that there's a crime taking place?
Brenda Lucki
2021-04-12 12:23
Obviously, when people are privy to that information, they need to go to their police of jurisdiction first and foremost. We work with the National Child Exploitation Crime Centre and have tried to connect that centre with industry because having [Technical difficulty—Editor] increases the education from industry. The centre is mandated to address online child sexual exploitation and is available 24-7. It works to try to get voluntary compliance from industry, but it also provides that service.
As soon as they find things that come to their attention, they quickly bring that to law enforcement and then we help law enforcement. If it's not within our RCMP jurisdiction, we make sure that it is brought to the police of jurisdiction and assist in any way we can.
View Han Dong Profile
Lib. (ON)
You mentioned that industry has to voluntarily report this. However, as you heard earlier, there have been a lot of questions about whether the law goes far enough in imposing those requirements on industry and its responsibility to protect kids. Going forward, my suggestion is that, as we complete this study, if you have any suggestions on how to strengthen the legislation in that respect, we'd be happy to hear them.
My other question is this: it's been reported that since the middle of last year, the RCMP has received hundreds of reports related to Pornhub through the National Center for Missing and Exploited Children. Obviously I'm not looking for specifics of an investigation, if there is one, but what is the process when the RCMP receives these reports?
Brenda Lucki
2021-04-12 12:25
Obviously the [Technical difficulty—Editor] work through the RCMP National Child Exploitation Crime Centre. When they receive those reports, as we also do internationally through the national crime centre in the United States, we take those and ensure that they are referred to a Canadian law enforcement agency. Often we'll put together an investigative package if we can and bring that to our agencies. Once they get that, then they can initiate an investigation.
View Han Dong Profile
Lib. (ON)
We heard from a lot of members. There are obviously questions about jurisdictional responsibilities. Is this a Canadian company subject to Canadian law?
In your opinion, is there anything we can do to make it clearer so that these investigations can take place? Obviously, we've heard there's a lot of evidence out there, but an investigation hasn't been launched.
Brenda Lucki
2021-04-12 12:27
It's such a complicated issue because, as you know, the application of domestic criminal laws and territorial limits [Technical difficulty—Editor] jurisdiction has been a challenge given the global nature of the Internet, which is not bound by traditional borders. The international conflict of laws is such a complex matter. It's very difficult for the RCMP to monitor and ascertain compliance with the mandatory reporting act, particularly in cases where companies have a complex international structure and data is stored in multiple jurisdictions. Those services flow through the Internet and transcend international borders.
However, that's where having strong partnerships internationally, including with the Virtual Global Taskforce, allows us to exchange intelligence and data. That is where we can maximize our reach and get rid of some of those borders, so to speak, while making sure everything falls within our protocols.
View Arnold Viersen Profile
CPC (AB)
Okay. If you're sitting in an office building in Canada and you come across child sexual exploitation, do you not have a duty to report it, regardless of where it originated?
Brenda Lucki
2021-04-12 12:37
Yes. Any citizen, not just employees but any person in Canada, any person anywhere, should be reporting it. Absolutely.
View Bill Blair Profile
Lib. (ON)
Thank you very much, Mr. Chair. I'll accept your remarks with respect to charm, but I'm afraid, with respect to looks, it's contrary to the evidence before us.
I'd like to thank the committee for the invitation today, and I'm pleased to present the 2021 supplementary estimates (C) and the 2021-22 main estimates for the public safety portfolio.
I'm very ably joined today by a number of my colleagues. Respectfully, in the interest of time, I will not introduce them, but I'd like to take the opportunity to acknowledge that, during these incredibly difficult and challenging times over the past year, they've all stepped up to the plate. They've been working diligently to keep our borders, communities and correctional institutions safe as well as to protect our national security.
Today, Mr. Chair, I believe these estimates reflect that work.
I'll go through the supplementary estimates (C) for 2021 in order to present these items chronologically. The approval of these estimates will result in funding approvals of $11.1 billion for the public safety portfolio, and that represents an increase of 3.3% over total authorities provided to date. I will briefly share some of the highlights here as they relate to how we manage our critical services during the pandemic.
The first is $135.8 million for the Correctional Service of Canada for critical operating requirements related to COVID-19.
The second is $35 million for Public Safety Canada, to support the urgent relief efforts of the Canadian Red Cross during the pandemic. Mr. Chair, as you know, the many volunteers and staff of the Canadian Red Cross have been there to support Canadians from the outset of this pandemic, including at long-term care homes right across the country.
I would ask this committee to join me in thanking them for all their service and for providing help where it was needed most. I’ll also note that this funding is in addition to the $35 million of vote 5 funding to Public Safety from Health Canada to support rapid response capacity testing being deployed to fill gaps in surge and targeted activities, including remote and isolated communities.
Included in these supplementary estimates is funding to enhance the integrity of our borders and asylum system while also modernizing the agency’s security screening system. This funding will ensure that security screening results are made available at the earliest opportunity under a reformed system.
I'd like to take this opportunity to highlight that CBSA employees have done a remarkable job in keeping our borders safe in response to COVID-19. I'd like to take the opportunity as well to thank them for their continued hard work in keeping Canadians safe.
We're also working through these supplementary estimates to increase funding to end violence against indigenous women and girls and to provide essential mental health services.
For the RCMP, we are investing significant funds through both the supplementary and main estimates to support improvements to federal policing investigative capacity by bolstering it with additional policing professionals, investigators and scientists. These funds will support federal policing initiatives, including responses to money laundering, cybercrime such as child sexual exploitation, and national security threats such as terrorism and foreign-influenced hostile activities.
Mr. Chair, if I may, I'll turn to the 2021-22 main estimates. The public safety portfolio, as a whole, is requesting a total of approximately $10 billion for this fiscal year. As I’ve previously noted, the portfolio funding has remained stable over the last few years. I will endeavour to break down the numbers by organization.
Public Safety Canada is seeking a total of $1.1 billion in the main estimates. This represents an increase of $329.9 million, or 45.5%, over the previous year. The bulk of this increase is due to the grants and contributions regarding the disaster financial assistance arrangements program, or DFAA. It’s an increase in funding based upon forecasts from provinces and territories for expected disbursements under the DFAA for this fiscal year. This represents a critical part of my portfolio as Minister of Public Safety and Emergency Preparedness.
In these main estimates, increases also include $15 million for incremental funding to take action against gun and gang violence. As this committee knows, I introduced Bill C-21 in the House not very long ago, a bill designed to protect Canadians from firearm violence and to fulfill our promise of strengthening gun control.
Mr. Chair, I know that this committee will have the chance to review that legislation at some future date, and I look forward to discussing it with them at that time.
I want to focus on a number of ongoing issues and our responses to them, starting with the Correctional Service of Canada, which is seeking $2.8 billion this fiscal year, an increase of $239.8 million or 9.4% over the previous year. This net increase is primarily due to increased operating funding, including funding for transforming federal corrections as a result of the passage of the former Bill C-83, which introduced the new structured intervention unit model.
That bill represents a major change in the way our correctional institutions operate, and recent reports have been clear that more work must be done. Funding is just one part of the solution. With the creation of data teams, efforts to replicate best practices nationally and enhanced support from independent, external decision-makers, I am confident we will deliver on this transformational promise.
I want to again acknowledge the troubling findings that were made in the Bastarache report, which I know this committee has examined and reviewed with concern. We are seeking funds to establish the independent centre for harassment resolution. This will be responsible for implementing the full resolution process, including conflict management, investigations and decision-making.
Mr. Chair, we know more work needs to be done. I'd like to conclude by noting the importance of our oversight agencies. You will see in the main estimates that we are seeking to increase funding for the Office of the Correctional Investigator, the CRCC and the ERC, the latter by close to 100%.
With that, Mr. Chair, I thank you and the members of the committee for your patience as I delivered my opening remarks. I'm happy to answer questions that members may have about these estimates and the collective work of our portfolio.
Alex Kamarotos
2021-03-11 15:44
Good afternoon.
Let me first of all thank you warmly for the invitation to Defence for Children International. I'll start with a few words about the organization. I think we are the only non-Canadians here.
Defence for Children International is a leading child rights-focused and membership-based grassroots movement and is currently composed of 35 national sections across five continents. It was created in 1979, the International Year of the Child, in Geneva, Switzerland.
The UN High Commissioner for Human Rights, Michelle Bachelet Jeria, reported the following at the current session of the UN Human Rights Council here in Geneva:
Much of the negative impact of the COVID-19 pandemic has been exacerbated by a failure to address previously existing structural causes of inequality, social exclusion and deprivation, and the inability of many countries, rich and poor alike, to meet the basic needs of a sizeable proportion of their populations.
This is equally applicable to children and the rights of the child, in particular during this pandemic. DCI has been able to count on some very relevant experience from other health emergencies, such as the 2015 Ebola emergency in west Africa, where DCI-Sierra Leone and DCI-Liberia were particularly involved. In February 2020, the international secretariat and the entire movement mobilized in the face of this pandemic. We very quickly raised alerts regarding the risk of violations exacerbated by the pandemic, or even created by mitigation measures taken by states.
In my intervention, to complement your earlier hearings, I want to touch upon two issues related to children. The first concerns the impact of the pandemic on violence against children, including gender-based violence. The second is the impact on access to justice, in particular for children deprived of liberty. That touches upon the issue we just heard about.
UNICEF reports that violence prevention and response services have been disrupted in 104 countries during the COVID pandemic. I believe we still see only the tip of the iceberg regarding the impact of the COVID pandemic on violence against children, but it seems to be already well documented that COVID-19 and some of the mitigation measures taken by governments have increased the exposure of children to different forms of violence, exacerbating such human rights violations as stigmatization, discrimination and xenophobia; child labour and unpaid work; child pregnancy; and harmful acts that include child marriage and female genital mutilation, as well as online abuse, bullying and exploitation. As the UN Special Representative of the Secretary-General on Violence Against Children emphasized in her report to the UN Human Rights Council earlier this week, “What began as a health crisis risks evolving into a broader child-rights crisis.”
I also want to share our experience and results in the area of justice for children, in particular children deprived of liberty. DCI helped found, and currently co-chairs together with Human Rights Watch, a wide civil society coalition on children deprived of liberty. The NGO Panel for the Global Study on Children Deprived of Liberty is composed of 170 civil society organizations worldwide. The UN High Commissioner for Human Rights, Michelle Bachelet Jeria, has urged authorities since the beginning of the pandemic to look at releasing detainees, in particular low-risk child offenders. UNICEF data indicate that at least 31 countries have released children from detention because of concerns about the spread of COVID-19. This is certainly insufficient, and even lower than the number of adult detainees released.
Honourable members, I cannot finish this very short and certainly incomplete presentation without speaking about the impact of COVID-19 measures on the mental health of children and the importance of ensuring the meaningful participation of children in mitigation measures that concern them. Last year DCI organized child- and youth-led online debates on the impacts of COVID-19. We had very, very concrete results.
We also participated, together with a great number of other civil society organizations, in #CovidUnder19, an initiative to meaningfully involve children in responses to the pandemic, with participation from more than 26,000 children from 137 countries.
I want to quote from two of the children who participated in the initiative. The first one comes from a Bolivian girl: “I think the government should understand that children are not dumb and easily manipulated. Children should feel that trust and not feel like they have to remain silent. This would increase their confidence and [motivate them] to report injustice.”
Last but not least, a 16-year-old Canadian girl said, “Even though there is a pandemic going on, there are people out there who experience abuse daily. The awareness, even in Canada, on how to access the resources is not explained in the best way. Finding that information should be basic knowledge for any human being.”
I thank you.
Lianna McDonald
2021-02-22 11:11
Good morning, Chairperson and distinguished members of the committee. Thank you for giving us this opportunity to present.
I am Lianna McDonald, executive director of the Canadian Centre for Child Protection, a charity dedicated to the personal safety of children. Joining me today is Lloyd Richardson, our director of technology.
By way of background, our agency operates Cybertip.ca, which is Canada’s tip line for reporting the online sexual exploitation of children. The tip line has been operating for over 18 years and currently receives, on average, 3,000 or more public reports per month.
Our agency has witnessed the many ways in which technology has been weaponized against children and how the proliferation of child sexual abuse material, otherwise known as CSAM, and non-consensual material fosters ongoing harm to children and youth. Over the last decade, there has been an explosion of digital media platforms hosting user-generated pornographic content. This, coupled with a complete absence of meaningful regulation, has created the perfect storm whereby transparency and accountability are notably absent. Children have been forced to pay a terrible price for this.
We know that every image or video of CSAM that is publicly available is a source of revictimization for the child in that image or video. For this reason, in 2017 we created Project Arachnid. Processing tens of thousands of images per second, this powerful tool detects known CSAM for the purpose of quickly identifying and triggering the removal of this illegal and harmful content. Project Arachnid has provided our agency with an important lens into how the absence of a regulatory framework fails children. To date, Arachnid has processed more than 126 billion images and has issued over 6.7 million takedown notices to providers around the globe. We keep records of all these notices we send, how long it takes for a platform to remove CSAM once advised of its existence, and data on the uploading of the same or similar images on platforms.
At this point, we would like to share what we have seen on MindGeek’s platforms. Arachnid has detected and confirmed instances of what we believe to be CSAM on their platform at least 193 times in the past three years. These sightings include 66 images of prepubescent CSAM involving very young children; 74 images of indicative CSAM, meaning that the child in the image appears pubescent and roughly between the ages of 11 and 14; and 53 images of post-pubescent CSAM, meaning that sexual maturation of the child may be complete and we have confirmation that the child in the image is under the age of 18.
We do not believe the above numbers are representative of the scope and scale of this problem. These numbers are limited to obvious CSAM of very young children and of identified teenagers. There is likely CSAM involving many other teens that we would not know about, because many victims and survivors are trying to deal with the removal issue on their own. We know this.
MindGeek testified that moderators manually review all content that is uploaded to their services. This is very difficult to take seriously. We know that CSAM has been published on their website in the past. We have some examples to share.
The following image was detected by Arachnid. This image is a still frame taken from a CSAM video of an identified sexual abuse survivor. The child was pubescent, between the ages of 11 and 13, at the time of the recording. The image shows an adult male sexually assaulting the child by inserting his penis in her mouth. He is holding the child’s hair and head with one hand and his penis with the other hand. Only his midsection is visible in the image, whereas the child’s face is completely visible. A removal request was generated by Project Arachnid. It took at least four days for that image to come down.
The next example was detected also by Project Arachnid. It is a CSAM image of two unidentified sexual abuse victims. The children pictured in the image are approximately 6 to 8 years of age. The boy is lying on his back with his legs spread. The girl is lying on top of him with her face between his legs. Her own legs are straddling his head. The girl has the boy’s penis in her mouth. Her face is completely visible. The image came down the same day we sent the notice requesting this removal.
We have other examples, but my time is limited.
While the spotlight is currently focused on MindGeek, we want to make it clear that this type of online harm is occurring daily across many mainstream and not-so-mainstream companies operating websites, social media and messaging services. Any of them could have been put under this microscope as MindGeek has been by this committee. It is clear that whatever companies claim they are doing to keep CSAM off their servers, it is not enough.
Let's not lose sight of the core problem that led to this moment. We've allowed digital spaces where children and adults intersect to operate with no oversight. To add insult to injury, we have also allowed individual companies to decide the scale and scope of their moderation practices. This has left many victims and survivors at the mercy of these companies to decide if they take action or not.
Our two-decades-long social experiment with an unregulated Internet has shown that tech companies are failing to prioritize the protection of children online. Not only has CSAM been allowed to fester online, but children have also been harmed by the ease with which they can access graphic and violent pornographic content. Through our collective inaction we have facilitated the development of an online space that has virtually no rules and certainly no oversight, and that consistently prioritizes profits over the welfare and protection of children. We do not accept this standard in other forms of media, including television, radio and print. Equally, we should not accept it in the digital space.
This is a global issue. It needs a global coordinated response with strong clear laws that require tech companies to do this: implement tools to combat the relentless reuploading of illegal content; hire trained and effectively supervised staff to carry out moderation and content removal tasks at scale; keep detailed records of user reports and responses that can be audited; be accountable for moderation and removal decisions and the harm that flows to individuals when companies fail in this capacity; and finally, build in, by design, features that prioritize the best interests and rights of children.
In closing, Canada needs to assume a leadership role in cleaning up the nightmare that has resulted from an online world that is lacking any regulatory and legal oversight. It is clear that relying upon the voluntary actions of companies has failed society and children miserably. The time has come to impose some guardrails in this space and show the leadership that our children deserve.
I thank you for your time.
Daniel Bernhard
2021-02-22 11:19
Madam Chair, honourable members of the committee, thank you for inviting me to appear today.
My name is Daniel Bernhard, and I am the executive director of Friends of Canadian Broadcasting, an independent citizens' organization that promotes Canadian culture, values and sovereignty on air and online.
Last September, Friends released “Platform for Harm”, a comprehensive legal analysis showing that under long-standing Canadian common law, platforms like Pornhub and Facebook are already liable for the user-generated content they promote.
On February 5, Pornhub executives gave contemptuous and, frankly, contemptible, testimony to this committee, attempting to explain away all the illegal content that they promoted to millions of Canadians and millions more around the world.
Amoral as the Pornhub executives appear to be, it would be a mistake, in my opinion, to treat their behaviour as a strictly moral failing. As Mr. Angus said on that day, the activity that you are studying is quite possibly criminal.
Pornhub does not dispute having disseminated vast amounts of child sexual abuse material, and Ms. McDonald just confirmed that fact. On February 5, the company's executives acknowledged that 80% of their content was unverified, some 10 million videos, and they acknowledged that they transmitted and recommended large amounts of illegal content to the public.
Of course, Pornhub's leaders tried to blame everybody but themselves. Their first defence is ignorance. They claim they can't remove illegal content from the platform because until a user flags it for them, they don't know it's there. In any case, they claim that responsibility lies with the person who uploaded the content and not with them. However, the law does not support this position. Yes, uploaders are liable, but so are platforms promoting illegal content if they know about it in advance and publish it anyway or if they are made aware of it post-publication and neglect to remove it.
This brings us to their second defence, incompetence. Given the high cost of human moderation, Pornhub employs software to find offending content, yet they hold themselves blameless when their software doesn't actually work. As Mark Zuckerberg has done so many times, Pornhub promised you that they'll do better. “Will do better” isn't a defence. It's a confession.
I wish Pornhub were an outlier, but it's not. In 2018, the U.S. National Center for Missing and Exploited Children received over 18 million referrals of child sexual abuse materials, according to the New York Times. Most of it was found on Facebook. There were more than 50,000 reports per day. That's just what they caught. The volume of user-uploaded, platform-promoted child sexual abuse material is now so vast that the FBI must prioritize cases involving infants and toddlers, and according to the New York Times, “are essentially not able to respond to reports of anybody older than that”.
These platforms also disseminate a great deal of illegal content that is not of a sexual nature, including incitement to violence, death threats, and the sale of drugs and illegal weapons. The Alliance to Counter Crime Online regularly discovers such content on Facebook, YouTube and Amazon. There is even an illegal market for human remains on Facebook.
The volume of content that these platforms handle does not excuse them from disseminating and recommending illegal material. If widespread distribution of illegal content is an unavoidable side effect of your business, then your business should not exist, period.
Can you imagine an airline being allowed to carry passengers when every other flight crashes? Imagine if they just said that flying is hard and kept going. Yet Pornhub and Facebook would have you believe just that: that operating illegally is fine because they can't operate otherwise. That's like saying, “Give me a break officer. Of course I couldn't drive straight. I had way too much to drink.”
The government promises new legislation to hold platforms liable in some way for the content that they promote, and this is a welcome development. But do we really need a new law to tell us that broadcasting child sexual assault material is illegal? How would you react if CTV did it? Exactly.
In closing, our research is clear. In Canada, platforms are already liable for circulating illegal user-generated content. Why hasn't the Pornhub case led to charges? Perhaps you can invite RCMP Commissioner Lucki to answer that question. Ministers Blair and Lametti could also weigh in. I'd be curious to hear what they have to say.
Don't get me wrong. The work that you are doing to draw attention to Pornhub's atrocious behaviour is vital, but you should also be asking why this case is being tried at committee and not in court.
Here's the question: Does Pornhub's CEO belong in Hansard or in handcuffs? This is a basic question of law and order and of Canada's sovereignty over its media industries. It is an urgent question. Canadian children, young women and girls cannot wait for a new law and neither should we.
Thank you very much. I welcome your questions.
John F. Clark
2021-02-22 11:25
Good morning, Madam Chair Shanahan and honourable members of the committee.
My name is John Clark. I am the president and CEO of the U.S.-based National Center for Missing and Exploited Children, sometimes known as NCMEC.
I am honoured to be here today to provide the committee with NCMEC's perspective on the growing problem of online child sexual exploitation, our role in combatting the dangers children can encounter on the Internet, and NCMEC's experience with the website Pornhub.
Before I begin with my testimony, I'd like to clarify for the committee that NCMEC and Pornhub are not partners. We do not have a partnership with Pornhub. Pornhub has registered to voluntarily report instances of child sexual abuse material on its website to NCMEC. This does not create a partnership between NCMEC and Pornhub, as Pornhub recently claimed during some of their testimony.
NCMEC was created in 1984 by child advocates as a private, non-profit organization to help find missing children, reduce child sexual exploitation and prevent child victimization. Today I will focus on NCMEC's mission to reduce online child sexual exploitation.
NCMEC's core program to combat online child sexual exploitation is the CyberTipline. The CyberTipline is a tool for members of the public and electronic service providers, or ESPs, to report child sexual abuse material to NCMEC.
Since we created the CyberTipline over 23 years ago, the number of reports we receive has exploded. In 2019 we received 16.9 million reports to the CyberTipline. Last year we received over 21 million reports of international and domestic online child sexual abuse. We have received a total of over 84 million reports since the CyberTipline began.
A United States federal law requires a U.S.-based ESP to report apparent child sexual abuse material to NCMEC's CyberTipline. This law does not apply to ESPs that are based in other countries. However, several non-U.S. ESPs, including Pornhub, have chosen to voluntarily register with NCMEC and report child sexual abuse material to the CyberTipline.
The number of reports of child sexual exploitation received by NCMEC is heartbreaking and daunting. So, too, are the many new trends NCMEC has seen in recent years. These trends include the following: a tremendous increase in sexual abuse videos reported to NCMEC, reports of increasingly graphic and violent sexual abuse images, and videos of infants and young children. These include on-demand sexual abuse in a pay-per-view format, and videos showing the rape of young children.
A broader range of online platforms are being used to access, store, trade and download child sexual abuse material, including chats, videos and messaging apps, video- and photo-sharing platforms, social media and dating sites, gaming platforms and email systems.
NCMEC is fortunate to work with certain technology companies that employ significant time and financial resources on measures to combat online child sexual abuse on their platforms. These measures include large teams of well-trained human content moderators; sophisticated technology tools to detect abusive content, report it to NCMEC and prevent it from even being posted; engagement in voluntary initiatives to combat online child sexual exploitation offered by NCMEC and other ESPs; failproof and readily accessible ways for users to report content; and immediate removal of content reported as being child sexual abuse.
NCMEC applauds the companies that adopt these measures. Some companies, however, do not adopt child protection measures at all. Others adopt half-measures as PR strategies to try to show commitment to child protection while minimizing disruption to their operations.
Too many companies operate business models that are inherently dangerous. Many of these sites also fail to adopt basic safeguards, or do so only after too many children have been exploited and abused on their sites.
In March 2020, MindGeek voluntarily registered to report child sexual abuse material, or CSAM, on several of its websites to NCMEC's CyberTipline. These websites include Pornhub, as well as RedTube, Tube8 and YouPorn. Between April 2020 and December 2020, Pornhub submitted over 13,000 reports related to CSAM through NCMEC's CyberTipline; however, Pornhub recently informed NCMEC that 9,000 of these reports were duplicative. NCMEC has not been able to verify Pornhub's claim.
After MindGeek's testimony before this committee earlier this month, MindGeek signed agreements with NCMEC to access our hash-sharing databases. These arrangements would allow MindGeek to access hashes of CSAM and sexually exploitive content that have been tagged and shared by NCMEC with other non-profits and ESPs to detect and remove content. Pornhub has not yet taken steps to access these databases or use these hashes.
Over the past year NCMEC has been contacted by several survivors asking for our help in removing sexually abusive content of themselves as children that was on Pornhub. Several of these survivors told us they had contacted Pornhub asking them to remove the content, but the content still remained up on the Pornhub website. In several of these instances NCMEC was able to contact Pornhub directly, which then resulted in the content being removed from the website.
We often focus on the tremendous number of CyberTipline reports that NCMEC receives and the huge volume of child sexual abuse material contained in these reports. However, our focus should more appropriately be on the child victims and the impact the continuous distribution of these images has on their lives. This is the true social tragedy of online child sexual exploitation.
NCMEC commends the committee for listening to the voices of the survivors in approaching these issues relating to Pornhub. By working closely with the survivors, NCMEC has learned the trauma suffered by these child victims is unique. The continued sharing and recirculation of a child's sexually abusive images and videos inflicts significant revictimization on the child. When any website, whether it's Pornhub or another site, allows a child's sexually abusive video to be uploaded, tagged with a graphic description of their abuse and downloaded and shared, it causes devastating harm to the child. It is essential for these websites to have effective means to review content before it's posted, to remove content when it's reported as child sexual exploitation, to give the benefit of doubt to the child or the parent or lawyer when they report content as child sexual exploitation, and to block the recirculation of abusive content once it has been removed.
Child survivors and the children who have yet to be identified and recovered from their abuse depend on us to hold technology companies accountable for the content on their platforms.
I want to thank you for the opportunity to appear before this committee. This is an increasingly important topic. I look forward to answering the committee's questions regarding NCMEC's work on these issues.
View Shannon Stubbs Profile
CPC (AB)
2021-02-22 11:34
Thank you, Madam Chair.
Once again, as every day on this committee, I am shocked and sick to my stomach and haunted by the amount of time this has all gone on. I thank you all for your work and your efforts and your expertise. I can't even imagine the level and the years of frustration you must have experienced. Thanks for being here today.
I hope that at the end of all of this there's actually content to combat this scourge, rather than what happens sometimes, where reports are written and then nothing occurs.
Again, I hardly even know where to start.
Ms. McDonald, in your 2020 report you mentioned that various platforms, including Pornhub, host child sexual assault material, which I think is now blowing many people's minds. Could you tell us whether or not there are features that actually allow the reporting specifically of child sexual abuse material on those platforms you reviewed?
Lianna McDonald
2021-02-22 11:36
Thank you very much. I'm going to turn this over to Lloyd Richardson, our director of technology, in one second.
The point I want to make before he gives those concrete examples is that we took on that review when we were examining the voluntary principles to address this, to which the Five Eyes countries are now signatories. Our agency wanted to find out how easy it was for a user or a victim to report CSAM on very well-known platforms. We were absolutely shocked at how difficult it often was even to find the term CSAM.
We noticed a number of tactics that were used to actually discourage, if you can imagine, the reporting of CSAM. We can only surmise it's because many of those companies didn't necessarily want the numbers, didn't want to show how much of this was on their platforms, because of the volume of it coming in.
Before Lloyd speaks to the examples, I also want to note the number of survivors who, as our colleague and friend John Clark mentioned, are coming into organizations such as ours right now. We have a tsunami of these victims who either want to get their illegal material down or are having a difficult time reporting. The review was intended to shed light on a number of platforms and the inability of people to effectively and easily report.
Lloyd.
Lloyd Richardson
2021-02-22 11:37
It's important to note, when looking at these different platforms, that this is only one dynamic of the way these companies operate in this space: the ability of people to report, "Hey, this is my material. Please remove it." We need to know that many people aren't even necessarily doing that.
When we look at reports that we send in, typically industry will use the term “trusted flagger program” and what have you. Essentially, that just means they pay more attention to child protection charities when they send a notice in. When a member of the public does it, it generally has a much lower priority. This is typical across most tech companies, including MindGeek.
Another piece that's a bit of an issue is that actually removing something is not a one-click option. These companies allow, or historically have allowed, this material to be uploaded with no sort of contact information, and away you go. The process you need to go through to actually get something removed, though, is quite heavy. In some cases you need to provide identification. If your material is up there, would you really want to provide your email address or contact information to a company such as MindGeek?
Certainly, some of these things have changed. MindGeek fared well compared with some of the big tech companies, but that certainly doesn't mean it's doing very well in this space.
Lianna McDonald
2021-02-22 11:39
There's one last point I want to make. When we look at all the reports, and even the information and data that organizations such as ours or NCMEC have, it's really important to remember that we only come across what we know about.
What we know—and we all understand this—is that for a young adolescent girl who has had a sexual image of her land on such sites as this, the fear and humiliation in coming forward to organizations and even approaching organizations for help are incredibly difficult. What we know for sure is that our numbers for this type of victimization vastly underestimate it.
View Charlie Angus Profile
NDP (ON)
Thank you, Madam Chair, and thank you to each of our witnesses. If I cut you off, it's not that I'm being rude, but I only have six minutes and we have so much to get to.
In terms of legal obligations in Canada, in 2011 Parliament adopted an act regarding the mandatory reporting of Internet child pornography. There are two provisions in it. One is that if an online provider finds issues of child pornography, they have a legal obligation to report to police. As well, they have a legal obligation to report to the Canadian Centre for Child Protection. Madam McDonald, that is you.
We asked Pornhub about their compliance with Canadian law on this matter. Have you found that they are complying with Canadian law in reporting these multiple incidents that we've had to deal with?
Lloyd Richardson
2021-02-22 11:57
There are two pieces there.
On the law enforcement side, which we couldn't really comment on, the legislation basically states that if an entity has possession of that material on its system, there are preservation requirements placed upon it, and it needs to report to Canadian law enforcement. That could be any police officer in Canada.
The second piece is where an entity does not necessarily have possession of the CSAM. It could be an Internet service provider of some sort that has become aware of child sexual abuse material on a different service. They can report that to a designated reporting entity, which is us, the Canadian Centre for Child Protection.
Speaking on this side of things, very recently MindGeek has reached out to us to attempt to report through that means. I can't necessarily speak—
View Charlie Angus Profile
NDP (ON)
Okay.
I find it interesting, because when I read the law it says “on a service”. They provide a service. If they're made aware of it “on a service”, my understanding is that they're obligated to report in Canada.
When we asked them this, they said that they reported to NCMEC, Mr. Clark, which might be great, but to me, it's still avoiding the issue of Canadian law.
How long have they been reporting to NCMEC on allegations that are brought forward?
John F. Clark
2021-02-22 11:58
Their reporting to us has just been in recent times, within the last few weeks, probably.
John F. Clark
2021-02-22 11:58
Yes, it's been the last few weeks.
As was noted, I think, in my testimony, while they provided, I believe, around 13,000 reports, a lot of those were duplicative. We've also noted in some instances a very strong reluctance on their part to take down material reported by people who have been victimized. However, when we call them, they take it down.
View Charlie Angus Profile
NDP (ON)
Thank you.
I'll now go to Mr. Bernhard on the issue of Canadian law.
We have spent a lot of time at our parliamentary committee on issues of compliance by the tech giants, and Pornhub-MindGeek is a tech giant. It seems to me in my reading of Canadian law that we have very strong laws to protect against non-consensual exploitation of images. We have strong child pornography laws.
I believe there's only been one online investigation, and it wasn't against any company nearly as big as Pornhub. Safe harbour provisions have protected the tech giants, because they don't know what's on their servers. However, Pornhub officials told us that they viewed every single image. If they viewed every single image, that means they would endorse that image as being okay.
Would you think there would be an issue of responsibility there, and as well that the tags that identify “knocked-out teen” or “raped teen” would be the promotion of acts that we would consider illegal? They seem to be not sure whether “teen” is legal or illegal; however, I think under Canadian law.... Do you believe they would be protected under safe harbour provisions?
Daniel Bernhard
2021-02-22 12:02
My understanding is that they would not. Even in the United States there is an exception to section 230 for child sexual abuse material, so even under American safe harbour laws they would not be protected. Ultimately, Mr. Angus, I would hope it would be a judge who decides this, and the lack of charges here is particularly concerning to me. It appears to be a very clear violation of the law, but that's not something I am able to pronounce upon, so the police, prosecutors and judges really need to get involved here.
View Arnold Viersen Profile
CPC (AB)
Thank you, Madam Chair.
Ms. McDonald, your organization has done fantastic work. I had the opportunity to visit your facility a number of years back and I got to see Project Arachnid in action even though there's not much to see when you're at the facility. It's a bunch of lights blinking on computers.
Nonetheless, recently the Canadian Centre for Child Protection took steps to distance itself from an international group that has received donations from MindGeek. Can you explain why you took this step, and would you consider MindGeek a partner in any way?
Lianna McDonald
2021-02-22 12:03
Yes. What ended up happening, which we didn't realize at the time, was that the international association you're talking about, INHOPE, had taken a donation from a company called MindGeek. At the time, we did not understand the business structure and what it all meant. As soon as our organization became aware that this company owned a number of adult pornography sites, we immediately made the decision to withdraw our membership. It was a very important decision for our organization, because we deal with survivors and victims, and many of them are teens. Many of them had also come to us talking about their victimization on these types of sites.
Also, I just have to raise something, even though it's not the subject of the conversation we're having right now: we have a huge issue with the lack of age verification. We have Canadians coming to us and telling us that their 13-year-old or 12-year-old son was able to go straight onto a really graphic website called Pornhub. From our organization's standpoint, we could not continue with that, so we did make that difficult decision. We work with hotlines around the world in other capacities and will continue to do so, because it's in the best interest of children. That was why we made the decision.
View Arnold Viersen Profile
CPC (AB)
Mr. Richardson, I'll ask you this seeing as you're the tech guy here.
The big trouble we've been studying here at this committee is around the age and the consent of the folks who are depicted in these videos. We hear a lot about how long it took to take videos down and things like that, but certainly there would be methods of ensuring that these videos never show up in the first place.
I was wondering if you could comment on that. If you're bragging that you are the leading tech company in the world, surely there's technology to keep this stuff off the Internet to begin with.
Lloyd Richardson
2021-02-22 12:05
There is, but I would kind of invert that a little bit. It's not a technical issue.
Let's go back in time to the 1980s, before the Internet was popularized: we had pornography then, but we didn't see child sexual abuse material showing up in Playboy magazine. It's not necessarily a technical issue. If you're in fact moderating everything that goes up on your platform, this should never happen. We don't see child pornography show up on the CBC's services, because moderation happens there and there is control over the content. That's not to say you can't leverage technology, as we do in Project Arachnid, to proactively detect known child sexual abuse material. But let's not fixate on the new and fancy claim of "I have an AI classifier that can automatically detect child pornography." That's great and all, but it's never going to detect everything, and it's not going to have the accuracy of actual human moderators looking at the material. It's an addition to something that's already there, so it's important not to belabour the technological side of things.
Daniel Bernhard
2021-02-22 12:07
It was just to say that I agree. Platforms want to operate at a certain scale, which requires them not to validate any of the content that comes in, yet that seems to result in illegal outcomes. It's not really for us to say how they should deal with this; simply, if the illegal content is there, they should face the consequences.
To Mr. Richardson's point, I have one final note. It's not just CBC, CTV and the others that make sure their content is lawful. They also have to make sure that the advertising they run is lawful and that op-eds and other third-party contributed content are lawful. Otherwise they are jointly liable. This is how the law works, and I see no reason why it shouldn't apply in the case of Pornhub, Facebook, Amazon or any other provider that recommends and facilitates illegal behaviour through its service.
View Patricia Lattanzio Profile
Lib. (QC)
Thank you, Madam Chair, and thank you to our guests for being here this morning.
My first question is for you, Mr. Clark.
We heard Mr. Antoon from MindGeek say during his testimony how proud he was of being a partner with NCMEC. He clearly said in his testimony that they report every instance of CSAM they are aware of, so that the information can be disseminated or investigated by authorities across the globe.
When we asked him to give us a report of how many instances were reported just in 2019, they couldn't answer. Would you be in a position to provide us information for 2019 and also, if you can, going all the way back to 2008?
John F. Clark
2021-02-22 12:26
Sure. As was noted in my testimony, they are not a partner. In fact, we sent a letter to them soon after their testimony when we became aware that they were saying they were a partner, telling them that it was not true and that they should cease and desist from saying so. That's important to note.
In terms of the 2019 numbers, I'm not aware of any reporting that was happening in that particular calendar year. In 2020, we did begin to receive some of the reports. Again, this is on a voluntary basis. We did note that many of those reports were duplicative.
We have encouraged them, as we do all the ESPs, to begin stricter content moderation. They should actually know beforehand what is being uploaded and whether that content meets the legal requirements for them to post it. If it does not, it should not go up, period. They should not have to go back and look for it, or have a victim call in and ask for it to be taken down.
That's something we encourage all of them to do.
Stephen White
2021-02-22 12:40
Thank you very much.
Good afternoon, Madam Chair and honourable members of the committee. Thank you very much for the opportunity to speak with you today on this pressing matter. My colleagues from the RCMP have been introduced.
I'd like to highlight that Chief Superintendent Marie-Claude Arsenault is with us. She oversees sensitive and specialized investigative services, which also includes the National Child Exploitation Crime Centre. Also with us is Mr. Paul Boudreau, executive director of technical operations for the RCMP. It's also a pleasure to have our colleague from the Department of Justice with us as well.
I'd like to describe for a couple of minutes a broader context of online child sexual exploitation and highlight the RCMP's steadfast efforts towards combatting this crime and bringing offenders to justice.
Online child sexual exploitation is one of the most egregious forms of gender-based violence and human rights violations in Canada. Not only are children, particularly girls, victimized through sexual abuse, but often they are revictimized through their lives, as photos, videos and/or stories of their abuse are shared repeatedly on the Internet amongst offenders.
In 2004 the Government of Canada announced the national strategy for the protection of children from sexual exploitation on the Internet, which brings together the RCMP, Public Safety Canada, the Department of Justice and the Canadian Centre for Child Protection, CCCP, to provide a comprehensive, coordinated approach to enhancing the protection of children from online child sexual exploitation. The Canadian Centre for Child Protection is a non-governmental organization that operates Cybertip.ca, Canada's tip line to report suspected online sexual exploitation of children.
The Criminal Code provides a comprehensive range of offences relating to online child sexual exploitation. Canadian police services, including the RCMP, are responsible for investigating these offences when there is a possible link to Canada. The Criminal Code also authorizes courts to order the removal of specific material, for example, a voyeuristic recording, an intimate image and child pornography that are stored on and made available through a computer system in Canada.
The RCMP's National Child Exploitation Crime Centre is the national law enforcement arm of the national strategy and functions as a central point of contact for investigations related to online sexual exploitation of children in Canada and international investigations involving Canadian victims, offenders or Canadian companies hosting child sexual exploitation material.
The centre investigates online child sexual exploitation and provides a number of critical services to law enforcement agencies, including immediately responding to a child at risk; coordinating investigative files with police of jurisdiction across Canada and internationally; identifying and rescuing victims; conducting specialized investigations; gathering, analyzing and generating intelligence in support of operations; engaging in operational research; and developing and implementing technical solutions.
The centre has seen first-hand the dramatic increase in reports of online child sexual exploitation in recent years. In 2019 the centre received 102,927 requests for assistance, an increase of 68% since 2018 and an overall increase of 1,106% since 2014. The majority of the referrals the centre receives come from the National Center for Missing and Exploited Children in the United States. Every report is assessed and actioned where possible.
In addition to the high number of reports, cases of online child sexual exploitation have become more complex. Advances in technology such as encryption, the dark Web and tools to ensure anonymity have made it much easier for offenders to conduct their criminal activities away from law enforcement agencies. Investigations related to online platforms also raise a host of other Internet-related issues, including the failure of platforms to retain data, the amount and speed at which content can be posted and distributed, and the ability of users to download hosted content.
When content is successfully removed from one platform, it can easily be uploaded to the same platform or to other websites, perpetuating victimization and leading to a proliferation of content depicting sexually exploited children on multiple platforms. It is well known that offenders protect this type of content on personal devices or through cloud computing services.
Like many cybercrimes, online child sexual exploitation is often multi-jurisdictional or multinational, affecting victims across jurisdictions and creating additional complexities for law enforcement. No single government or organization can address this crime alone. The RCMP works diligently with its partners at the municipal, provincial and federal levels in Canada and internationally, as well as with non-governmental organizations, to strengthen efforts to rescue victims and bring offenders to justice. In fact, the RCMP is the current chair of the Virtual Global Taskforce, an international police alliance dedicated to the protection of children from online sexual exploitation and other transnational child sex offences. The Virtual Global Taskforce consists of law enforcement, NGOs and industry partners working collaboratively to find effective response strategies. Chief Superintendent Arsenault, who is with us today, is the current chair of this very important group.
The RCMP also seeks to work closely with the private sector as offenders regularly utilize platforms operated by Internet and/or communications service providers to carry out a range of Criminal Code offences relating to online child sexual exploitation.
The RCMP regularly engages private sector partners to discuss existing legislation, including An Act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service, referred to as the mandatory reporting act, which came into force in 2011. The act requires that Internet service providers report to the Canadian Centre for Child Protection any tips they receive regarding websites where child pornography may be publicly available. It also requires them to notify police and safeguard evidence if they believe that a child pornography offence has been committed using their Internet service. Since the act came into force, the RCMP has seen a continual increase in reporting from industry partners.
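The two duties described in that paragraph can be sketched as simple decision logic. This is an editorial illustration only; the function name and inputs are invented, and the statute itself is considerably more detailed:

```python
# Illustrative sketch of the two duties under the mandatory reporting act
# as described in the testimony. All names here are invented for
# illustration; this is not a statement of the statute's actual text.

def handle_tip(tip_about_public_csam: bool,
               believe_offence_on_service: bool) -> list[str]:
    """Return the actions an Internet service provider must take."""
    actions = []
    if tip_about_public_csam:
        # Duty 1: forward tips about websites where child pornography may
        # be publicly available to the designated organization (the
        # Canadian Centre for Child Protection).
        actions.append("report tip to Canadian Centre for Child Protection")
    if believe_offence_on_service:
        # Duty 2: notify police and preserve the related evidence.
        actions.append("notify police")
        actions.append("safeguard evidence")
    return actions
```

For example, a provider that both receives a tip and believes an offence was committed on its service would, under this sketch, owe all three actions.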
Many online platforms pose jurisdictional challenges, as I outlined earlier. An online platform that is registered in Canada may maintain its servers abroad, which could limit the effect of a Canadian warrant. Further, when a global company registered abroad has a Canadian presence, it is likely to host its content abroad, making jurisdiction difficult to determine.
When an online platform permits its users, or the platform itself, to download material to and upload material from their own computers, it becomes impossible to determine where this material may be stored or to prevent it from reappearing and being further disseminated.
New companies, platforms and applications will continue to emerge, and the services they provide to Canadians will continue to evolve. It is important that the Government of Canada, legislative authorities and law enforcement agencies keep pace and adapt accordingly to combat these crimes.
The illegal online content that many communications service—
Arnold Viersen, CPC (AB)
Thank you, Madam Chair.
I want to thank the witnesses for being here.
Section 163.1 of the Criminal Code makes it an offence to make available or distribute CSAM. MindGeek executives told us that none of this exists on their site. Have you found this to be true, Mr. White?
Stephen White, 2021-02-22 12:49
Actually, we have had referrals from our partner in the United States that I mentioned, NCMEC, with regard to disclosures by that company to us regarding that content.
Arnold Viersen, CPC (AB)
Okay.
We've heard multiple times at this committee that there are documented cases of videos of exploitation of minors being put up on MindGeek. The big question for us is to understand why the owners have not faced charges. Is there a problem with the laws as they currently stand?
Stephen White, 2021-02-22 12:50
Obviously, there are a number of elements to when and what types of charges will be laid. When we're talking about these corporations, which are service providers and hosting platforms, other individuals have the ability to automatically load the content onto the platforms. There are jurisdiction issues, but every case is different.
I would ask my colleague, Chief Superintendent Arsenault, if she can add to that.
Marie-Claude Arsenault, 2021-02-22 12:51
I would add that every situation is assessed, and in looking at the evidence we have, that would determine if we have enough to proceed with charges. As was mentioned, with MindGeek or Pornhub, we've received, since June 2020, about [Technical difficulty—Editor] reports, which were—
Marie-Claude Arsenault, 2021-02-22 12:52
We've received 120 reports, which have been triaged and prioritized. Some were referred to other law enforcement in Canada and others were deemed to be not online sexual exploitation, for various reasons. That's it.
Arnold Viersen, CPC (AB)
The presence of the download button on the MindGeek-Pornhub site seems like a clear violation of the distribution part of the CSAM laws. Is that part of your case? Has that been flagged? How come we haven't seen any charges?
Marie-Claude Arsenault, 2021-02-22 12:53
Well, again, we have to look at all the evidence we have, and my understanding is that this function has been taken out. Without the evidence, there's limited action that we can take. We are assessing all the reports that we're getting now and determining if charges are likely to happen.
Arnold Viersen, CPC (AB)
Okay.
Up until recently, you have not had any reports of CSAM on MindGeek or Pornhub sites. Am I hearing you correctly?
Han Dong, Lib. (ON)
It's not working?
The Chair: Okay, you're back on. Great.
Mr. Han Dong: I hope I get an extra minute because of technical difficulties.
I want to thank all the witnesses for coming forward.
Mr. White, first things first, I remember in 2019 that the government announced an expansion of the national strategy for protecting children from sexual exploitation on the Internet. I think that the total was over $22 million. Specifically, $15.25 million was to enhance the capacity of the Internet child exploitation unit, which is the ICE unit.
Can you tell us whether that capacity of RCMP's ICE unit across the country has been enhanced as a result of this additional investment?
Stephen White, 2021-02-22 12:56
Thank you for that question.
We did receive additional funding. We have implemented new resources, both in the national crime centre here in Ottawa which I mentioned, the National Child Exploitation Crime Centre, as well as in ICE units across the country. A lot of those units across the country are integrated units. They're made up of RCMP and other local police services as well.
Han Dong, Lib. (ON)
Along the same line as my Conservative colleague's question, we heard testimony from witnesses saying that subsidiary companies of MindGeek sometimes shared content, or moved some non-consensual or child pornography content to other platforms to add content to them.
In itself, isn't that a violation of the Criminal Code? Have you done any investigation on their intentionally adding content to subsidiary companies to make money? Have you done any investigations on that?
Stephen White, 2021-02-22 12:58
No, we're not aware.... We haven't been informed of their adding their own content, child sexual exploitation content, if that's what you're referring to.
Han Dong, Lib. (ON)
This is a very interesting point you brought up. I didn't get a chance to ask this in the previous session.
Why is Pornhub reporting to NCMEC as opposed to reporting to the authority directly? Isn't it very strange that they report to a not-for-profit organization and not to the police?
Stephen White, 2021-02-22 13:00
In the United States, one of the reporting entities is NCMEC. That's why a lot of the disclosures we get, not just from Pornhub but other Internet service providers and hosting platforms.... If they have a presence in the United States, they are able to disclose to the American entity as well.
Marie-Hélène Gaudreau, BQ (QC)
Thank you, Madam Chair.
Mr. White, my colleague was talking about an increase in investments of more than $22 million, which allowed you to increase your staff to work on the problem.
Even if money is spent today, even if people can no longer download the content that has been removed, we would like to know what happened in the previous months. So I have three questions for you.
Does the funding allocated always correspond to the number of complaints you receive, or do you now monitor preventively?
How is it that it was through the media that we learned that MindGeek, which owns Pornhub, had broken the law?
How do you intervene preventively, and how do you handle complaints?
Stephen White, 2021-02-22 13:03
Thank you very much for the question.
We can always do more in terms of prevention. This has always been our priority. The more we do, the better.
Ms. Arsenault talked a little bit about the volume of complaints we receive and the number of investigators it would take to deal with all of them. I'll ask her to elaborate on that.
Marie-Hélène Gaudreau, BQ (QC)
Forgive me for interrupting, but I don't have much time.
Will the investment of more than $22 million allow you to be more vigilant in terms of prevention? We can see that laws are very different from one country to another and that these companies have found a business model that will allow them to continue to operate.
Other than this investment and the means to keep a watchful eye, what would you need to intervene upstream of problems that could lead to thousands of complaints?
Marie-Claude Arsenault, 2021-02-22 13:06
We always need more resources, but the resources we invest in this area are more for the proactive aspect. We need to collect data and prioritize cases to be able to stop abuse. Of course, building partnerships is really important to do these investigations.
Charlie Angus, NDP (ON)
Okay.
In 2011, the Canadian Parliament passed a law that if an Internet content hosting service provider came across issues of child abuse online, they had a legal obligation to report to the police. That was in 2011.
Stephen White, 2021-02-22 13:11
I'll ask Madam Arsenault to confirm that, but it's my understanding we only began receiving complaints in 2020.
Cathay Wagantall, CPC (SK)
Thank you very much, Chair. It's a real privilege to be part of this conversation today on behalf of those who are being victimized at unbelievable levels. It's frightening, and we have to do something.
I noticed here a comment that we've investigated only around 120 reports, 25 of which qualified to go to our police force. That means that 90, it was indicated, did not meet the Criminal Code definition. I have to ask why. What should we be doing to improve this Criminal Code definition so that these circumstances aren't taking place? I can't imagine that these other cases shouldn't qualify to be investigated.
Stephen White, 2021-02-22 13:18
Sure.
Every one of these cases that come into the National Child Exploitation Crime Centre is fully analyzed. We have some very good and passionate people in that unit doing this work, and they do a good, thorough analysis—
Cathay Wagantall, CPC (SK)
Excuse me, Mr. White. I'm not questioning their analysis. They're doing their analysis and deeming 90 of them as not meeting Criminal Code definitions.
Mr. Wong, what's the problem here? What do we have to do to increase that and improve that area?
Marie-Claude Arsenault, 2021-02-22 13:18
Maybe, before you answer, I could just clarify that it did not meet the definition of child pornography in the Criminal Code. The vast majority were cases involving age-difficult media, meaning we cannot definitively ascertain whether the individual is under the age of 18.
Normand Wong, 2021-02-22 13:19
I will just add to what Marie-Claude said.
The definition of child pornography in the Criminal Code is among the broadest in the world. We protect children under 18. The problem, as Marie-Claude mentioned, is the age-difficult media. When there are secondary sexual characteristics, unless you're dealing with an identifiable person, it's very difficult for anyone to tell whether that person is above 18 or below 18, so a lot of that material is not captured. That's probably what Marie-Claude is talking about.
Charlie Angus, NDP (ON)
Thank you for that, Madam Chair.
I want to go back to this issue of the fact that Parliament signed a law into place in 2011 on mandatory reporting for service providers. We understand that last year, in 2020, the RCMP received their first report. That's almost 10 years of no reports.
If, in that time, case X tried to come forward, case Y came forward and case Z came forward with issues of non-consensual or child abuse on that platform and nothing was done, the fact that they're reporting now to NCMEC, is that okay for the RCMP? Do you just say, “Well, that was then, this is now, and they're now complying with NCMEC” or do they have legal obligations that they failed to fulfill under the laws of Canada?
Stephen White, 2021-02-22 13:32
When I referred earlier to the 120 reports that we received from NCMEC, that was directly related to Pornhub, to my knowledge. We have been receiving reports over the years since the mandatory reporting act—