ETHI Committee Meeting


Standing Committee on Access to Information, Privacy and Ethics


NUMBER 021 ● 2nd SESSION ● 43rd PARLIAMENT

EVIDENCE

Monday, February 22, 2021

[Recorded by Electronic Apparatus]

(1105)

[English]

     I call this meeting to order. I think members are aware by now that I am chairing the meeting today due to the unavoidable absence of our regular chair, Mr. Warkentin. We certainly wish his family well during a difficult time.
    This is meeting number 21 of the House of Commons Standing Committee on Access to Information, Privacy and Ethics. We are resuming our study on the protection of privacy and reputation on online video platforms such as Pornhub. I would like to remind you that today's meeting is webcast and will be made available via the House of Commons website.
    Today's meeting is taking place in a hybrid format pursuant to the House order of January 25, 2021. Therefore, members may attend in person in the room or remotely using the Zoom application.
    I believe, Madam Clerk, that the witnesses have been briefed on the usual procedures for the hybrid format. I need only remind all present that members and witnesses may speak in the official language of their choice. Please use the raise hand feature should you wish to speak or alert the chair, and this is a reminder that all comments by members and witnesses should be addressed through the chair.
    I have a point of order.
    Go ahead, Mr. Angus.
    Madam Shanahan, I want to welcome you to your first outing as chair, and I know it's going to be very successful. Certainly we all express our concerns for Mr. Warkentin and his family.
    I am sorry to interrupt. I just want a point of clarification, because I know there will be committee business. In case I have to step out, I'm asking for a minute of clarification.
     At the meeting on January 29, we passed a motion that we were going to call Mr. Victor Li, Madam Marquez and Guy Spencer Elms and that we would be issuing summonses. I know we have issued the legal summonses to Mr. Victor Li and Madam Marquez, but I didn't hear the status of Mr. Guy Spencer Elms, who is a key director of many of the Kielburgers' financial operations in Kenya. Given the really disturbing allegations that have come out from both CBC's Fifth Estate and Bloomberg, I think his testimony will help clear the air for a lot of people, particularly around the allegations of children being beaten in the schools in Kenya, which I think we all find pretty shocking and surprising.
    Could the chair tell us if Mr. Spencer Elms has agreed to come before our committee? Would that be a yes or a no?
    I believe, Mr. Angus, that the clerk has been working in that regard and can provide us with an update. The clerk can tell us what she can now, but we will be addressing this in committee business in camera at 1:30.
    Thank you, Madam Chair.
    I have been unable to reach Mr. Spencer Elms at this point.
    Mr. Spencer Elms, who runs a major law firm in Kenya, has not responded. You've not been able to contact him at all?
    No, sir.
    Okay. Thank you for that.
    Thank you, Mr. Angus.
    I recognize Mr. Fergus.
    Thank you, Madam Chair.
    Perhaps it was just me, because I wasn't here during the sound check. I see that Mr. Angus is participating in person in the committee room along with our clerk.
    Madam Chair, could I ask the clerk which other members might be appearing in person today?
(1110)
    Go ahead, Madam Clerk.
    Mr. Viersen is here as well.
    Welcome, Mr. Viersen.
    Thank you. I appreciate clarification on that. With the hybrid format, it's not always easy to see who is in person in the room and who is on screen.
    I would like to proceed now with welcoming our witnesses for today for this very important study. As the witnesses know, they have time for presentations.
    From the Canadian Centre for Child Protection, we will hear from Lianna McDonald, executive director, and Lloyd Richardson, director, information technology. We also have with us, from the Friends of Canadian Broadcasting, Daniel Bernhard, executive director. From the National Center for Missing and Exploited Children, we have Mr. Clark, president and chief executive officer.
    I believe that each of you has a presentation.
    Ms. McDonald, the floor is yours.
    Good morning, Chairperson and distinguished members of the committee. Thank you for giving us this opportunity to present.
     I am Lianna McDonald, executive director of the Canadian Centre for Child Protection, a charity dedicated to the personal safety of children. Joining me today is Lloyd Richardson, our director of technology.
    By way of background, our agency operates Cybertip.ca, which is Canada’s tip line for reporting the online sexual exploitation of children. The tip line has been operating for over 18 years and currently receives, on average, 3,000 or more public reports per month.
     Our agency has witnessed the many ways in which technology has been weaponized against children and how the proliferation of child sexual abuse material, otherwise known as CSAM, and non-consensual material fosters ongoing harm to children and youth. Over the last decade, there has been an explosion of digital media platforms hosting user-generated pornographic content. This, coupled with a complete absence of meaningful regulation, has created the perfect storm whereby transparency and accountability are notably absent. Children have been forced to pay a terrible price for this.
    We know that every image or video of CSAM that is publicly available is a source of revictimization for the child in that image or video. For this reason, in 2017 we created Project Arachnid. Processing tens of thousands of images per second, this powerful tool detects known CSAM for the purpose of quickly identifying and triggering the removal of this illegal and harmful content. Project Arachnid has provided our agency with an important lens into how the absence of a regulatory framework fails children. To date, Arachnid has processed more than 126 billion images and has issued over 6.7 million takedown notices to providers around the globe. We keep records of all these notices we send, how long it takes for a platform to remove CSAM once advised of its existence, and data on the uploading of the same or similar images on platforms.
    At this point, we would like to share what we have seen on MindGeek’s platforms. Arachnid has detected and confirmed instances of what we believe to be CSAM on their platform at least 193 times in the past three years. These sightings include 66 images of prepubescent CSAM involving very young children; 74 images of indicative CSAM, meaning that the child in the image appears pubescent and roughly between the ages of 11 and 14; and 53 images of post-pubescent CSAM, meaning that sexual maturation of the child may be complete and we have confirmation that the child in the image is under the age of 18.
    We do not believe the above numbers are representative of the scope and scale of this problem. These numbers are limited to obvious CSAM of very young children and of identified teenagers. There is likely CSAM involving many other teens that we would not know about, because many victims and survivors are trying to deal with the removal issue on their own. We know this.
     MindGeek testified that moderators manually review all content that is uploaded to their services. This is very difficult to take seriously. We know that CSAM has been published on their website in the past. We have some examples to share.
     The following image was detected by Arachnid. This image is a still frame taken from a CSAM video of an identified sexual abuse survivor. The child was pubescent, between the ages of 11 and 13, at the time of the recording. The image shows an adult male sexually assaulting the child by inserting his penis in her mouth. He is holding the child’s hair and head with one hand and his penis with the other hand. Only his midsection is visible in the image, whereas the child’s face is completely visible. A removal request was generated by Project Arachnid. It took at least four days for that image to come down.
    The next example was detected also by Project Arachnid. It is a CSAM image of two unidentified sexual abuse victims. The children pictured in the image are approximately 6 to 8 years of age. The boy is lying on his back with his legs spread. The girl is lying on top of him with her face between his legs. Her own legs are straddling his head. The girl has the boy’s penis in her mouth. Her face is completely visible. The image came down the same day we sent the notice requesting this removal.
    We have other examples, but my time is limited.
(1115)
     While the spotlight is currently focused on MindGeek, we want to make it clear that this type of online harm is occurring daily across many mainstream and not-so-mainstream companies operating websites, social media and messaging services. Any of them could have been put under this microscope as MindGeek has been by this committee. It is clear that whatever companies claim they are doing to keep CSAM off their servers, it is not enough.
    Let's not lose sight of the core problem that led to this moment. We've allowed digital spaces where children and adults intersect to operate with no oversight. To add insult to injury, we have also allowed individual companies to decide the scale and scope of their moderation practices. This has left many victims and survivors at the mercy of these companies to decide if they take action or not.
    Our two-decades-long social experiment with an unregulated Internet has shown that tech companies are failing to prioritize the protection of children online. Not only has CSAM been allowed to fester online, but children have also been harmed by the ease with which they can access graphic and violent pornographic content. Through our collective inaction we have facilitated the development of an online space that has virtually no rules, certainly no oversight, and that consistently prioritizes profits over the welfare and protection of children. We do not accept this standard in other forms of media, including television, radio and print. Equally, we should not accept it in the digital space.
    This is a global issue. It needs a global coordinated response with strong clear laws that require tech companies to do this: implement tools to combat the relentless reuploading of illegal content; hire trained and effectively supervised staff to carry out moderation and content removal tasks at scale; keep detailed records of user reports and responses that can be audited; be accountable for moderation and removal decisions and the harm that flows to individuals when companies fail in this capacity; and finally, build in, by design, features that prioritize the best interests and rights of children.
    In closing, Canada needs to assume a leadership role in cleaning up the nightmare that has resulted from an online world that is lacking any regulatory and legal oversight. It is clear that relying upon the voluntary actions of companies has failed society and children miserably. The time has come to impose some guardrails in this space and show the leadership that our children deserve.
    I thank you for your time.
     Thank you very much, Ms. McDonald.
    Mr. Bernhard, you may make your presentation.

[Translation]

    Madam Chair, honourable members of the committee, thank you for inviting me to appear today.
    My name is Daniel Bernhard, and I am the executive director of Friends of Canadian Broadcasting, an independent citizens' organization that promotes Canadian culture, values and sovereignty on air and online.
(1120)

[English]

    Last September, Friends released “Platform for Harm”, a comprehensive legal analysis showing that under long-standing Canadian common law, platforms like Pornhub and Facebook are already liable for the user-generated content they promote.
    On February 5, Pornhub executives gave contemptuous and, frankly, contemptible testimony to this committee, attempting to explain away all the illegal content that they promoted to millions of Canadians and millions more around the world.
    Amoral as the Pornhub executives appear to be, it would be a mistake, in my opinion, to treat their behaviour as a strictly moral failing. As Mr. Angus said on that day, the activity that you are studying is quite possibly criminal.
    Pornhub does not dispute having disseminated vast amounts of child sexual abuse material, and Ms. McDonald just confirmed that fact. On February 5, the company's executives acknowledged that 80% of their content was unverified, some 10 million videos, and they acknowledged that they transmitted and recommended large amounts of illegal content to the public.
    Of course, Pornhub's leaders tried to blame everybody but themselves. Their first defence is ignorance. They claim they can't remove illegal content from the platform because until a user flags it for them, they don't know it's there. In any case, they claim that responsibility lies with the person who uploaded the content and not with them. However, the law does not support this position. Yes, uploaders are liable, but so are platforms promoting illegal content if they know about it in advance and publish it anyway or if they are made aware of it post-publication and neglect to remove it.
    This brings us to their second defence, incompetence. Given the high cost of human moderation, Pornhub employs software to find offending content, yet they hold themselves blameless when their software doesn't actually work. As Mark Zuckerberg has done so many times, Pornhub promised you that they'll do better. “Will do better” isn't a defence. It's a confession.
    I wish Pornhub were an outlier, but it's not. In 2018, the U.S. National Center for Missing and Exploited Children received over 18 million referrals of child sexual abuse materials, according to the New York Times. Most of it was found on Facebook. There were more than 50,000 reports per day. That's just what they caught. The volume of user-uploaded, platform-promoted child sexual abuse material is now so vast that the FBI must prioritize cases involving infants and toddlers, and according to the New York Times, “are essentially not able to respond to reports of anybody older than that”.

[Translation]

    These platforms also disseminate a great deal of illegal content that is not sexual in nature, including incitement to violence, death threats, and the sale of drugs and illegal weapons. The Alliance to Counter Crime Online regularly discovers such content on Facebook, YouTube and Amazon. There is even an illegal market in human remains on Facebook.
    The volume of content that these platforms handle does not excuse them from disseminating and recommending illegal material. If widespread distribution of illegal content is an unavoidable side effect of your business, then your business should not exist, period.

[English]

    Can you imagine an airline being allowed to carry passengers when every other flight crashes? Imagine if they just said that flying is hard and kept going. Yet Pornhub and Facebook would have you believe just that: that operating illegally is fine because they can't operate otherwise. That's like saying, “Give me a break, officer. Of course I couldn't drive straight. I had way too much to drink.”
    The government promises new legislation to hold platforms liable in some way for the content that they promote, and this is a welcome development. But do we really need a new law to tell us that broadcasting child sexual assault material is illegal? How would you react if CTV did it? Exactly.
    In closing, our research is clear. In Canada, platforms are already liable for circulating illegal user-generated content. Why hasn't the Pornhub case led to charges? Perhaps you can invite RCMP Commissioner Lucki to answer that question. Ministers Blair and Lametti could also weigh in. I'd be curious to hear what they have to say.
    Don't get me wrong. The work that you are doing to draw attention to Pornhub's atrocious behaviour is vital, but you should also be asking why this case is being tried at committee and not in court.
    Here's the question: Does Pornhub's CEO belong in Hansard or in handcuffs? This is a basic question of law and order and of Canada's sovereignty over its media industries. It is an urgent question. Canadian children, young women and girls cannot wait for a new law and neither should we.
    Thank you very much. I welcome your questions.
(1125)
    Thank you very much, Mr. Bernhard.
    Mr. Clark, you may begin your presentation.
    Good morning, Madam Chair Shanahan and honourable members of the committee.
    My name is John Clark. I am the president and CEO of the U.S.-based National Center for Missing and Exploited Children, sometimes known as NCMEC.
    I am honoured to be here today to provide the committee with NCMEC's perspective on the growing problem of child sexual exploitation online, NCMEC's role in combatting the dangers children can encounter on the Internet, and NCMEC's experience with the website Pornhub.
    Before I begin with my testimony, I'd like to clarify for the committee that NCMEC and Pornhub are not partners. We do not have a partnership with Pornhub. Pornhub has registered to voluntarily report instances of child sexual abuse material on its website to NCMEC. This does not create a partnership between NCMEC and Pornhub, as Pornhub recently claimed during some of their testimony.
    NCMEC was created in 1984 by child advocates as a private, non-profit organization to help find missing children, reduce child sexual exploitation and prevent child victimization. Today I will focus on NCMEC's mission to reduce online child sexual exploitation.
    NCMEC's core program to combat online child sexual exploitation is the CyberTipline. The CyberTipline is a tool for members of the public and electronic service providers, or ESPs, to report child sexual abuse material to NCMEC.
    Since we created the CyberTipline over 23 years ago, the number of reports we receive has exploded. In 2019 we received 16.9 million reports to the CyberTipline. Last year we received over 21 million reports of international and domestic online child sexual abuse. We have received a total of over 84 million reports since the CyberTipline began.
    A United States federal law requires a U.S.-based ESP to report apparent child sexual abuse material to NCMEC's CyberTipline. This law does not apply to ESPs that are based in other countries. However, several non-U.S. ESPs, including Pornhub, have chosen to voluntarily register with NCMEC and report child sexual abuse material to the CyberTipline.
    The number of reports of child sexual exploitation received by NCMEC is heartbreaking and daunting. So, too, are the many new trends NCMEC has seen in recent years. These trends include the following: a tremendous increase in sexual abuse videos reported to NCMEC, reports of increasingly graphic and violent sexual abuse images, and videos of infants and young children. These include on-demand sexual abuse in a pay-per-view format, and videos showing the rape of young children.
    A broader range of online platforms are being used to access, store, trade and download child sexual abuse material, including chats, videos and messaging apps, video- and photo-sharing platforms, social media and dating sites, gaming platforms and email systems.
    NCMEC is fortunate to work with certain technology companies that employ significant time and financial resources on measures to combat online child sexual abuse on their platforms. These measures include large teams of well-trained human content moderators; sophisticated technology tools to detect abusive content, report it to NCMEC and prevent it from even being posted; engagement in voluntary initiatives to combat online child sexual exploitation offered by NCMEC and other ESPs; failproof and readily accessible ways for users to report content; and immediate removal of content reported as being child sexual abuse.
    NCMEC applauds the companies that adopt these measures. Some companies, however, do not adopt child protection measures at all. Others adopt half-measures as PR strategies to try to show commitment to child protection while minimizing disruption to their operations.
    Too many companies operate business models that are inherently dangerous. Many of these sites also fail to adopt basic safeguards, or do so only after too many children have been exploited and abused on their sites.
(1130)
     In March 2020, MindGeek voluntarily registered to report child sexual abuse material, or CSAM, on several of its websites to NCMEC's CyberTipline. These websites include Pornhub, as well as RedTube, Tube8 and YouPorn. Between April 2020 and December 2020, Pornhub submitted over 13,000 reports related to CSAM through NCMEC's CyberTipline; however, Pornhub recently informed NCMEC that 9,000 of these reports were duplicative. NCMEC has not been able to verify Pornhub's claim.
    After MindGeek's testimony before this committee earlier this month, MindGeek signed agreements with NCMEC to access our hash-sharing databases. These arrangements would allow MindGeek to access hashes of CSAM and sexually exploitive content that have been tagged and shared by NCMEC with other non-profits and ESPs to detect and remove content. Pornhub has not yet taken steps to access these databases or use these hashes.
    Over the past year NCMEC has been contacted by several survivors asking for our help in removing sexually abusive content of themselves as children that was on Pornhub. Several of these survivors told us they had contacted Pornhub asking them to remove the content, but the content still remained up on the Pornhub website. In several of these instances NCMEC was able to contact Pornhub directly, which then resulted in the content being removed from the website.
    We often focus on the tremendous number of CyberTipline reports that NCMEC receives and the huge volume of child sexual abuse material contained in these reports. However, our focus should more appropriately be on the child victims and the impact the continuous distribution of these images has on their lives. This is the true social tragedy of online child sexual exploitation.
    NCMEC commends the committee for listening to the voices of the survivors in approaching these issues relating to Pornhub. By working closely with the survivors, NCMEC has learned the trauma suffered by these child victims is unique. The continued sharing and recirculation of a child's sexually abusive images and videos inflicts significant revictimization on the child. When any website, whether it's Pornhub or another site, allows a child's sexually abusive video to be uploaded, tagged with a graphic description of their abuse and downloaded and shared, it causes devastating harm to the child. It is essential for these websites to have effective means to review content before it's posted, to remove content when it's reported as child sexual exploitation, to give the benefit of doubt to the child or the parent or lawyer when they report content as child sexual exploitation, and to block the recirculation of abusive content once it has been removed.
    Child survivors and the children who have yet to be identified and recovered from their abuse depend on us to hold technology companies accountable for the content on their platforms.
    I want to thank you for the opportunity to appear before this committee. This is an increasingly important topic. I look forward to answering the committee's questions regarding NCMEC's work on these issues.
    Thank you very much, Mr. Clark.
    We will now turn to our first round of questions.
    Ms. Stubbs, you have six minutes.
    Thank you, Madam Chair.
    Once again, as every day on this committee, I am shocked and sick to my stomach and haunted by the amount of time this has all gone on. I thank you all for your work and your efforts and your expertise. I can't even imagine the level and the years of frustration you must have experienced. Thanks for being here today.
    I hope that at the end of all of this there's actually content to combat this scourge, rather than what happens sometimes, where reports are written and then nothing occurs.
    Again, I hardly even know where to start.
    Ms. McDonald, you mentioned in your 2020 report on the various platforms, including Pornhub, that they include child sexual assault material, which I think is blowing many people's minds right now. Could you tell us whether or not there are features that actually allow the reporting specifically of child sexual abuse material on those platforms you reviewed?
(1135)
     Thank you very much. I'm going to turn this over to Lloyd Richardson, our director of technology, in one second.
    The point I want to make before he gives those concrete examples is that we took that review on when we were examining the voluntary principles to address this, to which the Five Eyes countries are now signatories. Our agency wanted to find out how easy it was for a user or a victim to report CSAM on very well-known platforms. We were absolutely shocked at how difficult it was, often, even to find the term CSAM.
    We noticed a number of tactics that were used to actually discourage, if you can imagine, the reporting of CSAM. We can only surmise it's because many of those companies didn't necessarily want the numbers, didn't want to show how much of this was on their platforms, because of the volume of it coming in.
    Before Lloyd speaks to the examples, I also want to note the number of survivors who, as our colleague and friend John Clark mentioned, are coming into organizations such as ours right now. We have a tsunami of these victims who either want to get their illegal material down or are having a difficult time reporting. The review was intended to shed light on a number of platforms and the inability of people to effectively and easily report.
    Lloyd.
     It's important to note, when looking at these different platforms, that this is only one dynamic of the way these companies operate in the space: the ability of people to report, “Hey, this is my material. Please remove it.” We should note that many people aren't even necessarily doing that.
    When we look at reports that we send in, typically industry will use the term “trusted flagger program” and what have you. Essentially, that just means they pay more attention to child protection charities when they send a notice in. When a member of the public does it, it generally has a much lower priority. This is typical across most tech companies, including MindGeek.
    Another piece that's a bit of an issue is that actually removing something is not a one-click option. These companies allow for the upload of this material—or historically have—with no sort of contact information required, and away you go. The process you need to go through to actually get something removed, by contrast, is quite heavy. In some cases you need to provide identification. If you have your material up there, would you really want to provide your email address or contact information to a company such as MindGeek?
    Certainly, some of these things have changed. MindGeek fared well compared with some of the big tech companies, but that certainly doesn't mean it's doing very well in this space.
    There's one last point I want to make. It's really important also, when we look at all the reports, and even the information and the data that organizations such as ours or NCMEC have, that we only come across what we know.
    What we know—and we all understand this—is that for a young adolescent girl who has had a sexual image of her land on such sites as this, the fear and humiliation in coming forward to organizations and even approaching organizations for help are incredibly difficult. What we know for sure is that our numbers for this type of victimization vastly underestimate it.
    This is very disturbing.
     I think I only have a minute left, so if we run out of time, I hope we'll get at this for all of our witnesses.
    I'm hoping you can help us understand better one matter. What I understand from testimony is that when these websites have to take down child sexual abuse material, they'll put up a notice that says, “Removed because of copyright”, instead of something such as “Taken down because of a report to NCMEC”. Could you comment on that?
    Then frankly, to Mr. Bernhard's point, which I was “out loud on mute” supporting, what boggles my mind is that at least under Canadian law—and I'm glad that these things are being reported to NCMEC—it seems to me very clear that they have a responsibility to be reporting child sexual abuse material to the police.
    I wonder whether any or all of you have comments on those two points.
(1140)
    Keep to very short answers, please.
    Oh, shoot. They can get at it later, too, further in the hour. I think these are the crucial questions for our committee.
     I'll jump in really quickly.
    At the National Center, we work with a lot of technology companies. Of course, we have encouraged their reporting, but of the thousands of Internet service providers, we only have about 170 that are actively reporting. Of those 170, there are only about 20, maybe fewer than 20, that are actually significantly reporting.
    Of course, we'd like to get that part of the whole ecosystem working well first and then obviously report it to the police, because as has been noted, many, many of these instances are criminal activity. Make no mistake about it. It's criminal activity. Not to mention—
    Thank you very much, Mr. Clark. You can no doubt continue in a subsequent question.
    Mr. Sorbara, please, for six minutes.
    Thank you, Chair, and thank you to everybody for their testimony this morning. It was very enlightening. Thank you for all the work you do in this very important area in helping kids in very bad situations.
    First, I just want to go over two numbers that were reported by the Canadian Centre for Child Protection. I can go back to the blues, but I just want to get this out there again. You used the number in “billions” of images. Is that correct?
    Yes. That's images scanned. If we're talking about needles in a haystack, that's the whole haystack in terms of images that we've detected. We've sent 6.7 million notices to providers on images. The 126 billion is not all child sexual abuse material. That's just the swath of material that we've scanned.
    Okay. Thank you.
    Mr. Bernhard, I skimmed the report, “Platform for harm: Internet intermediary liability in Canadian law”. I also saw that you had produced an opinion piece in the Toronto Star on Thursday, December 10. Thank you for all the work you're doing in holding to account providers of these images when they know they should not be up there, if I can just put it in very plain language.
    I wish to ask a question, and I believe this is under your domain. In a September 2020 report, Friends concludes that existing Canadian laws should be sufficient to hold platforms such as Pornhub accountable for illegal content that appears on their platform, despite the fact that the content is user generated and is not created or uploaded by Pornhub.
    First, can you explain your position in more detail? Second, do you think MindGeek's algorithms provide the company sufficient knowledge of non-consensual content to give it “knowing involvement” in their publication and dissemination?
    Thank you, Mr. Sorbara. You've touched on a key point, which is the difference between the law in Canada and the law in the United States.
    In the United States, there has been a lot of talk about the Communications Decency Act and section 230 of that act, which holds platforms not to be liable for user-generated content that they are distributing.
    Can you hear me still?
    Yes.
    Sorry. No. Now you're frozen.
    Chair, can I move on, then?
    To the National Center for Missing and Exploited Children, welcome. Mr. Clark, thank you for availing yourself.
    I'm very curious. What does a partnership with NCMEC entail?
     That's a good question.
    We work with some of the Internet service providers to, first and foremost, make sure they have good content moderation: that there are actual human beings looking for illegal content, CSAM, and seeking to take it down immediately when it is first discovered. We work, and try to work, closely with those companies that are willing and like-minded with us in taking down things that are apparently criminal activity and getting them off the sites, period. That's one of the things we look for when we're talking about, in air quotes, "partnership": making sure that we continue with more of a collaboration model, working with the Internet service providers.
(1145)
    We'll get Mr. Bernhard back on in my last minute or two of time, but staying with you, Mr. Clark, you mention Internet service providers. How many adult platform providers would you have a partnership with currently?
    We don't technically, as I said, have a partnership with Pornhub, although they have begun to voluntarily report, but I believe they are the only one. The others that we are working closely with are just large tech companies generally.
    That is a very important distinction: having a partnership with an Internet service provider that allows entities to put content onto its platform, versus having a partnership with what would be called an adult platform. Understood?
    Understood.
    Okay.
    Mr. Bernhard, welcome back. Perhaps you could continue with your answer.
    I'm sorry. The Internet providers are conspiring against me here.
    We're talking about a difference between American law, which holds companies that deal in user-generated content indemnified from that content, and Canadian law, which does not.
    In Canada, as our report documents, a company becomes liable for something that somebody else said or did under two circumstances: first, if they know about it in advance and publish it anyway, and second, if they are notified about it after the fact and fail to take action.
    That's the first thing. In the case of Pornhub, both appear to be true. They are notified and take a long time to remove content. Also, to address your point about algorithms and recommendation of content, we believe they have a pretty sophisticated understanding of what this content is. If a relatively small not-for-profit organization in Manitoba is able to deploy technology that can find this material in large numbers, surely a company the size of MindGeek can do the same thing.
    There is a difference between hosting content and actively recommending it to people. In that sense, the platforms are arguably more liable and more responsible for the offending content than the users themselves.
    Chair, please interrupt me when I'm beyond my time.
    Daniel, I would agree with you, because the algorithms, as we all know with regard especially to social media platforms, are very powerful. I think I have 4,000 friends on Facebook, but I only see content from 25 of them on a daily basis, when I check. We know that the AI technology that is being used for what's being recommended that people should see and be viewing is very powerful. I'm sure that this adult platform we're talking about utilizes that technology.
    Yes, I think you're entirely right.
    I'm glad you mentioned Facebook, because to some extent I think the sexual nature, the shocking nature, of this illegal activity can cause us to focus too narrowly. The general question is whether these platforms are responsible for illegal activity that they promote, period. The child sexual abuse material is part of it. It's terrible. I have a 10-week-old daughter. This means a lot to me.
    We know, however, that there is other illegal activity, including incitements to violence, the sale of drugs and arms, and so—
     I'm sorry, but I will have to stop you there, Mr. Bernhard.
    Thank you.

[Translation]

    Ms. Gaudreau, you now have the floor for six minutes.
    Thank you, Madam Chair, and congratulations.
    Allow me to express my sincere thanks to our esteemed witnesses. You are again providing us with information that will allow us not only to have what it takes to act, but also to determine that it is urgent to act.
    Some of the discussions I had with my kids this weekend made me wonder about the impact that this viral world of adult entertainment consumption could have on our youth. I feel that we need to take a more global view. You tell us, for example, that the legislation in a given country is different from ours and that, depending on the business model in place, people manage to slip through the cracks in every imaginable way. Why not take the bull by the horns and decide to do something upstream quickly, that is, even before allocating space to distributors? We could force them to obtain a licence, or we could force them to demonstrate that everything is done in a legal and preventive way.
    Beyond that, another aspect must be taken into account. When we talk about legislation, we're also talking about intervention structures with our country's police forces. When it comes to international cases, however, there is a complete loss of control. I would like you to tell us how we could act. Right now, we have a lot of data and evidence and reports that clearly show us that there is an urgent need for action.
     Could you tell us in a few words how we could approach this effectively, when the issues are international?
     If, tomorrow morning, we decided to legislate urgently, the fact remains that, in other countries, the legislation would be different.
    I'll use the time I have left to give each witness a few minutes to speak. We could start with Mr. Bernhard.
(1150)
    Thank you for the question, Ms. Gaudreau.

[English]

     I think the answer is that the incidents are international, but they are also very local. Pornhub, MindGeek, is arguably Canada's largest Internet media company. Mr. Antoon lives in Canada, has property in Canada and is fully subject to the jurisdiction of the RCMP. While the crimes may be perpetrated elsewhere, they are also happening in Canada.
    In cases in which the company is resident outside of Canada, there is a lever that the government can pull and that is the money. All of the money that comes from advertisers to those platforms can also be stopped. Just as it's illegal to buy drugs with a credit card issued by a Canadian bank or to engage in some gambling activity, in the extreme case in which a foreign company does not comply, we could also cut the money.
    In this case, we're talking about hundreds of millions of dollars. If that's not an incentive, I don't know what is.
    May I just add something, please?
    From our organization's perspective, we've been screaming from the rooftops that we are long overdue for regulation.
    I think it's incumbent upon us all to ask ourselves how we got here. We as an organization also look at the unregulated nature of the adult pornography issue. There definitely needs to be more than a conversation about how we are going to take the keys back from industry.
    Internet freedoms don't mean freedom from accountability and responsibility for what users post on your services. To start with the Five Eyes, we have definitely been looking at ways countries can become unified around global standards for what we need to do here.
    On behalf of children, and they are citizens as well, we can certainly say that we've never needed government more than we do now to step in and intervene.
    I would just say from the U.S.-based look at this that the marketplace for CSAM is a global one, and it requires a global effort. As you might naturally believe and know, there are differences in laws around the globe that affect privacy issues and what Internet service providers can do on their platforms.
    As has been reported over and over again, however, whenever it involves children or criminal activity, governments must step in and take a very strong stance at regulating what is going on when you're seeing child abuse, child rape, victimization over and over again. We always urge countries near and far to really take a strong look at the legal measures that can be enacted to fight this.
(1155)

[Translation]

    Mr. Bernhard, did you want to add any comments?
    Thank you, Ms. Gaudreau.

[English]

    I just wish to comment on something Ms. McDonald said, to disagree in hopes of agreeing with her. It is that I don't believe that the Internet is in fact unregulated. I cannot commit fraud legally on the Internet. I cannot steal legally on the Internet. I cannot sell you heroin legally on the Internet. Likewise, I cannot traffic child pornography legally on the Internet, especially in Canada where the law is clear that publishers are jointly responsible under the conditions I have outlined.
    The issue, then, is not that it is unregulated; the issue is that the law is not being applied. That is even more concerning because we have the rules. The question of enforcement, then, is very important. That's what I would like to pose.
     If I could just add a—

[Translation]

    Thank you very much.
    I must give the floor to the next speaker.

[English]

    Mr. Angus, you have six minutes.
    Thank you, Madam Chair, and thank you to each of our witnesses. If I cut you off, it's not that I'm being rude, but I only have six minutes and we have so much to get to.
    In terms of legal obligations in Canada, in 2011 Parliament adopted an act regarding the mandatory reporting of Internet child pornography. There are two provisions in it. One is that if an online provider finds issues of child pornography, they have a legal obligation to report to police. As well, they have a legal obligation to report to the Canadian Centre for Child Protection. Madam McDonald, that is you.
    We asked Pornhub about their compliance with Canadian law on this matter. Have you found that they are complying with Canadian law in reporting these multiple incidents that we've had to deal with?
    Lloyd?
     There are two pieces there.
    On the one side, concerning law enforcement, we couldn't really comment. The legislation states that, basically, if an entity has possession of that material on their system, there are preservation requirements placed upon them, and they need to report to Canadian law enforcement. That could be any police officer in Canada.
    The second piece is where an entity does not necessarily have possession of the CSAM. It could be an Internet service provider of some sort that has become aware of child sexual abuse material on a different service. They can report that to a designated reporting entity, which is us, the Canadian Centre for Child Protection.
    Speaking on this side of things, very recently MindGeek has reached out to us to attempt to report through that means. I can't necessarily speak—
    I'm sorry. When you say “very recently”, what do you mean?
    I mean in the last few months.
    Could it be since the New York Times articles came out?
    It could be pretty close to that time.
    Okay.
    I find it interesting, because when I read the law it says “on a service”. They provide a service. If they're made aware of it “on a service”, my understanding is that they're obligated to report in Canada.
    When we asked them this, they said that they reported to NCMEC, Mr. Clark, which might be great, but to me, it's still avoiding the issue of Canadian law.
    How long have they been reporting to NCMEC on allegations that are brought forward?
    Their reporting to us has just been in recent times, within the last few weeks, probably.
    The last few weeks?
     Yes, it's been the last few weeks.
    As was noted, I think, in my testimony, while they provided, I believe, around 13,000 reports, a lot of those were duplicative. We've also noted in some instances a very strong reluctance on their part to take down material that is called in from people who have been victimized. However, when we call them, they take it down.
    Okay.
    There is still a lot of work to be done there.
    My concern is with the preservation requirements under Canadian law, and I imagine it is the same in the United States, once an issue has been raised and flagged, and with the obligation to report.
    When Serena Fleites, the young woman who I think has really blown the doors off this whole case, spoke to us, that was a groundbreaking moment in changing the discussion. When we asked Pornhub about the efforts she took to get her images down, they said they had no record of her. I found that quite shocking.
    Under American law, because Ms. Fleites is an American citizen, would there be preservation requirements such that Pornhub-MindGeek would be accountable for the images they had of her abuse, so that they would at least have a record of it?
(1200)
    It would seem likely that they should have a record of it.
     Since NCMEC is a non-investigatory agency, we don't take further steps to investigate that relationship about what they are saying they did or didn't do when somebody reported to them about CSAM material or asked them to take something down. We do engage and provide reports of apparent CSAM material directly to law enforcement, as a clearing house. That occurs at the national centre at NCMEC on a daily, regular basis.
    Thank you.
    I'll now go to Mr. Bernhard on the issue of Canadian law.
    We have spent a lot of time at our parliamentary committee on issues of compliance by the tech giants, and Pornhub-MindGeek is a tech giant. It seems to me in my reading of Canadian law that we have very strong laws to protect against non-consensual exploitation of images. We have strong child pornography laws.
    I believe there's only been one online investigation, and it wasn't against any company nearly as big as Pornhub. Safe harbour provisions have protected the tech giants, because they don't know what's on their servers. However, Pornhub officials told us that they viewed every single image. If they viewed every single image, that means they would endorse that image as being okay.
     Would you think there would be an issue of responsibility there, and as well that the tags that identify “knocked-out teen” or “raped teen” would be the promotion of acts that we would consider illegal? They seem to be not sure whether “teen” is legal or illegal; however, I think under Canadian law.... Do you believe they would be protected under safe harbour provisions?
     My understanding is that they would not. My understanding is that even in the United States, there is an exception to section 230 for child sexual abuse material, so even under the safe harbour laws of the United States, they would not be protected. Ultimately, Mr. Angus, I would hope it would be a judge who would decide this, and the lack of charges here is particularly concerning to me. It appears to be in clear violation of the law, but that's not something that I am able to pronounce upon, and so, the police, law enforcement, prosecutors and judges really need to get involved here, because it does appear to be a very clear violation of the law.
    Thank you.
    Thank you very much.
    Thank you, Mr. Angus.
    We now turn to Mr. Viersen for five minutes.
    Thank you, Madam Chair.
    Ms. McDonald, your organization has done fantastic work. I had the opportunity to visit your facility a number of years back and I got to see Project Arachnid in action even though there's not much to see when you're at the facility. It's a bunch of lights blinking on computers.
    Nonetheless, recently the Canadian Centre for Child Protection took steps to distance itself from an international group that has received donations from MindGeek. Can you explain why you took this step, and would you consider MindGeek a partner in any way?
    Yes. What ended up happening is this: the international association you're talking about is INHOPE, and that organization had taken a donation from a company called MindGeek. At the time, we did not understand the business structure and what it all meant. As soon as our organization became aware that this company owned a number of adult pornography sites, we immediately made the decision to withdraw our membership. It was a very important decision for our organization, because we deal with survivors and victims, and many of them are teens. Many of them had also come to us talking about their victimization on these types of sites.
    Also, I just have to raise, even though it's not the subject of the conversation we're having right now, that we have a huge issue with the lack of age verification. We have Canadians coming in telling us that their 13-year-old or 12-year-old son was able to go straight onto a really graphic website called Pornhub. From our organization's standpoint, we could not continue with that, so we did make that difficult decision. We work with hotlines around the world in other capacities, and we'll continue to do so because it's in the best interests of children. That was why we made the decision.
(1205)
    Mr. Richardson, I'll ask you this seeing as you're the tech guy here.
    The big trouble we've been studying here at this committee is around this, the age and the consent of folks who are depicted in these videos. We hear a lot about how long it took to take the video down and things like that, but certainly there would be methods of ensuring that these videos never show up in the first place.
    I was wondering if you could comment on that. If you're bragging that you are the leading tech company in the world, surely there's technology to keep this stuff off the Internet to begin with.
    There is, but I would kind of invert that a little bit. It's not a technical issue.
     Let's go back in time to the 1980s, before the Internet became popular. We had pornography then, and we didn't see child sexual abuse material showing up in Playboy magazine. It's not necessarily a technical issue: if you are in fact moderating everything that goes up on your platform, this should never happen. We don't see child pornography show up on the CBC's services, because moderation happens there; there is control over the content. That's not to say you can't leverage technology, as we do in Project Arachnid, to proactively detect known child sexual abuse material. But let's not fixate on the new and fancy, the "I have an AI classifier that can automatically detect child pornography." That's great and all, but it's never going to detect everything, and it's not going to have the accuracy of actual human moderators looking at material. It's an addition to something that's already there, so it's important not to belabour the technological side of things.
    All right.
    Mr. Bernhard, you looked like you wanted to jump in on that one as well.
    It was just to say that I agree. Platforms want to operate at a certain scale, which requires them not to validate any of the content that comes up, yet that seems to result in illegal outcomes. It's not really for us to say how they should deal with this, but simply that if illegal content is there, they should face the consequences.
    To Mr. Richardson's point, I have one final issue. It's not just CBC, CTV, etc., who make sure that their content is lawful. They also have to make sure the advertising that they run is lawful and that op-eds and other third party contributed content are lawful. Otherwise they are jointly liable. This is how the law works, and I see no reason why it shouldn't apply in the case of Pornhub, Facebook, Amazon or any other provider that is recommending and facilitating illegal behaviour through its service.
     Mr. Clark, when the executives of Pornhub were here—
    Mr. Viersen, you have just a few seconds.
    I'll make it a yes or no answer.
    They said that in 2019 they had never reported anything to NCMEC. Can you confirm that as well?
    I missed part of your question. You said that was Pornhub?
    Yes.
    That is correct; they did not.
    Thank you very much.
    Next up is Mr. Fergus for five minutes.

[Translation]

    Thank you very much, Madam Chair.
    I thank the witnesses for appearing before us to talk about this topic. I find this testimony terribly hard to hear. Earlier this spring, I attended an international conference involving three countries, Germany, Canada and the United States. We were Zoom-bombed.
    We were shown material similar to what you describe, where we saw children, very young people. I have to tell you that it traumatized me at the time. Listening to you and listening to the testimony of the victims over the last few weeks, I have to say I find all these things abominable.
    Mr. Bernhard, in your opinion, which country has the best balance in terms of legislation? Which country has laws that are tough enough to fight the distribution of this content, and laws that have enough teeth to sue companies to remove this content from their sites?
(1210)

[English]

    I think Canada already has a pretty strong legal regime coming from hundreds of years of common law. The original case we talked about in our report is over 100 years old; it comes from Scotland and involves a defamatory notice on a message board.

[Translation]

    Absolutely, but I am asking you which country has the best balance. You did say that we don't follow up with our police forces with regard to this type of content, correct?

[English]

    Yes, I understand.
    I think the closest comparison I would give is Germany, which has a strong law, in this case talking about illegal hate speech, but again, illegal content. Platforms that are caught facilitating the transmission of this content can face fines of up to 50 million euros per infraction. That is an example of getting serious. The United Kingdom is also talking about personal liability for online harms, not just for companies but for their individual executives.
    Our Nanos polling from the fall shows that Canadians also overwhelmingly support such a move in Canada.

[Translation]

    Ms. McDonald and Mr. Richardson, do you have any comments to add?

[English]

    I'll start, and then I'm sure Lloyd will jump in.
    It's very difficult to answer that question. The reason it is difficult is that we see so many problems right through the continuum of where action would be taken. Whether it's on the enforcement side, wherever it is, we are quite overwhelmed.
    I think part of the challenge is the way in which the Internet was structured. The common themes are the cloak of anonymity, the vast ability to transmit material, and issues such as end-to-end encryption and other measures that protect the privacy rights of adults at the expense of the safety and well-being of children.

[Translation]

    I see.

[English]

    Lloyd, do you have a country that you'd like to mention?

[Translation]

    Can you give a short answer, Mr. Richardson? I only have 45 seconds left, and I'd like Mr. Clark to answer that as well.

[English]

     I would quickly say that it is a very dynamic issue. It's not necessarily one thing. There are examples of countries in the world that prohibit most child and adult content altogether. That would be one—
     Could I stop you there, Mr. Richardson?
    Are we having trouble with interpretation?

[Translation]

     Until Mr. Richardson solves his problem, I could turn to Mr. Clark.
    Mr. Clark, you have the floor.

[English]

    We send our CyberTipline reports to well over 100 countries around the globe. The ones that seem to work best are the ones that have a strong law enforcement involvement, judicial process, laws in place—
    Which ones would they be? Very quickly, so we can get this on the record, sir.
    We're talking primarily about the Five Eyes as good examples of strong laws—not perfect, but very good laws.
    Is that also on the enforcement side?
    On the enforcement side, I would say that's true as well. We send our CyberTipline reports to law enforcement organizations in those countries.
    Thank you.
    Thank you very much, Mr. Fergus.

[Translation]

    Ms. Gaudreau, you have the floor for two and a half minutes.
    Thank you, Madam Chair.
    We've just learned—and we expected this—that fraudsters have gotten their hands on the personal data of three out of four Canadians.
    I think it is urgent, obviously, to include control measures. Earlier, we were talking about the link with police forces. We were wondering how we can work more closely with them.
    I realize that the work to be done in the field is colossal. It saddens me to see that not only are there millions of individuals whose privacy is completely ruined, but also that fraud exists in Quebec and in Canada.
    Mr. Bernhard, am I wrong to say that there is a structure missing that would prevent some people from slipping through the cracks?
     Do you agree that tightening up control regulations through the Ethics Commissioner, for example, is urgent and necessary, so that our regulation can one day align with a model like the one in Germany?
(1215)

[English]

    I think what you're pointing out is that Canada is so far behind on these issues of enforcing the law when it comes to anything that happens digitally.
    You're right to connect the fraud, the child sexual abuse material, the privacy violations, all of this other illegal activity. It's the same question: Are we going to enforce the law when it happens digitally or not? I really hope that you will invite Commissioner Lucki.
    I'll give you one last point here: the Christchurch shooting that happened in March 2019. I spoke with a fellow in Vancouver, Chris Trottier, who opened his phone and saw this recommended to him. He didn't ask to see it; it was pushed to him.
    Is that a criminal offence? We need more case law, and to do that, we need more trials. I really wonder why the RCMP does not consider those types of actions to constitute “promotion”. It seems to constitute promotion to me.

[Translation]

    Mr. Richardson, we've lost control when it comes to privacy issues.
     As an expert, do you agree that this problem needs to be addressed and that more protections need to be built into legislation?
    Ms. Gaudreau, I'm sorry, but your time is up. Perhaps Mr. Richardson can answer the question later.
     Mr. Angus, you have the floor for two and a half minutes.

[English]

    Thank you, Madam Chair.
    Stepping back from this a bit—because for many of us this has been a pretty shocking study and many of us, I think, are feeling in our guts that something is fundamentally wrong—I want to just articulate that pornography is legal in Canada. Citizens have the right to watch weird things. People have the right to promote and show their consensual bedroom antics, if that's what they like to do. Whether people like it or not, that is their right.
    The question is whether or not Pornhub-MindGeek has abused or failed to live up to their legal obligations. That, to me, is the fundamental question on non-consensual images, on issues of rape, on issues particularly of child abuse.
    When I read Canada's legislation, going back to the 2011 mandatory reporting legislation, a provider is obligated to reach out to the police if there are issues raised, but also within Canada, the Canadian Centre for Child Protection.
    I want to go back to the Canadian Centre for Child Protection for a second.
    You said that they began reporting somewhere around December, somewhere around the time that the New York Times article blew the doors off everything. Is that correct?
     Yes, I can follow up with the exact date for you. They reached out to us directly to see how they could report it to us. I could get that.
    Would that mean that for basically 10 years of Canadian law, while we had very strong laws on the books, they were not reporting to you?
    That's correct.
    That's correct.
    Mr. Clark, they have different legal obligations in the United States, but they've reached out to you to be a partner. They're calling themselves a partner. They have voluntarily come forward on issues for which I think any good corporate citizen would say, “We don't want this on our site.” They told us they didn't want bad things on their site.
    When did they reach out to you? Would you say it's been a matter of months?
    I believe it was in the latter part of 2020. Earlier I said it was more recent, but I think it was in the latter part of 2020.
    Yes, it was in the latter part of 2020, when the New York Times broke the story, possibly.
    That was most likely when.
    It was most likely then.
    Yes, and—
    We've had a number of survivors reach out to us from the United States, and for 10 years before that, they were not reaching out to you to let you become aware or to let you work with them on helping these survivors, these victims.
    That would be correct.
    Thank you very much.

[Translation]

    Thank you very much.
    Mr. Gourde, you have the floor for five minutes.
(1220)
    Thank you, Madam Chair.
    Witnesses speak about concerted action at the international level. We all understand that, in cases where virtual activities are involved, there are no borders.
    My concern is that while we work to put in place strong regulations in Canada, companies like Pornhub could move their headquarters to countries where they would be safe from lawsuits or any legal action. It's shocking how quickly these companies can move, and it leaves us perplexed as to what action we should take.
    Could the witnesses give us an idea of countries we could work with internationally? Which countries would be most sensitive to the problem?
    Could we talk about crimes against our children? Could these criminals be accused of crimes against humanity?

[English]

    We are indeed talking about criminal activity, and to your question about which countries we should work with, again, it's very, very difficult. There are variances in all different countries we work with, but make no mistake that we do believe in strong enforcement, strong prosecution, a strong judicial system. The legality surrounding this.... I mean, the information superhighway was never meant to be unpoliced. If a crime happens on the street and is punishable under law, it should be the same if a crime is happening on the Internet. It should be punishable to the full extent of the law.
     I would add in here, to echo John's comments, that it is very challenging. I would say, though, that Australia has done some very impressive work in this space. Also, as was mentioned by Daniel, the U.K. government has really taken a leadership role with its “Online Harms White Paper” and looking towards a different type of schema to look at this. I do want to make just one point that really has not been discussed here at all, and that goes back to the issue of accountability and oversight.
    Again, we are still relying on systems under which it's up to the companies to come forward and to report, so we don't know the scale of the problem. We don't know. There's no oversight to know if they're in fact reporting what they ought to be reporting, and it puts the users or survivors and victims in an unfair situation when they're dependent on these companies to do the right thing. While we looked at what is available to us, we also have to raise the important question about accountability and what oversight is tied to what these companies are or are not doing.

[Translation]

    These platforms and companies are owned by unscrupulous individuals. Should we instead sue the owners of these platforms?
    In my opinion, it would hurt them a lot more and would solve the problem faster.
    My question is for all the witnesses.

[English]

     You need more government certification to sell a toaster oven than you do to run Facebook or Pornhub. This idea of permissionless entry can be very difficult. Ultimately, I think, the question is can they operate legally. If they can't, we should come down on them.
     You asked about international measures. As I said earlier, the companies may be based elsewhere, but there's a lot of money made here. Just as we've done with other transnational criminal incidents where the criminals themselves are not available, the money is, and if we target the money, I think we can make good progress.
    That is something Canada can do without waiting for international participation. As much as it would be great to work with other countries around the world, this could be one of those rare instances where we might want to lead globally, and I seriously encourage you to do it. We can start by calling the cops.
(1225)

[Translation]

    Thank you, Mr. Bernhard.
    Mr. Gourde, there are only a few seconds left, so we'll move on to Ms. Lattanzio.
     Ms. Lattanzio, you have the floor for five minutes.

[English]

    Thank you, Madam Chair, and thank you to our guests for being here this morning.
    My first question is for you, Mr. Clark.
    We heard Mr. Antoon from MindGeek say during his testimony how proud he was of being a partner with NCMEC. He clearly said in his testimony that he reports every instance of CSAM when they are aware of it, so that the information could be disseminated or investigated by authorities across the globe.
     When we asked him to give us a report of how many instances were reported just in 2019, they couldn't answer. Would you be in a position to provide us information for 2019 and also, if you can, going all the way back to 2008?
    Sure. As was noted in my testimony, they are not a partner. In fact, we sent a letter to them soon after their testimony when we became aware that they were saying they were a partner, telling them that it was not true and that they should cease and desist from saying so. That's important to note.
    In terms of the 2019 numbers, I'm not aware of any reporting that was happening in that particular calendar year. In 2020, we did begin to receive some of the reports. Again, this is on a voluntary basis. We did note that many of those reports were duplicative.
     We have encouraged them, as we do all the ESPs, to begin a stricter content moderation. They should begin to actually know beforehand what is being uploaded and whether that content passes any of the legal requirements that they would have to post it. If it does not, it should not go up, period. They should not have to go back and look for it or have a victim call in and ask for it to be taken down.
     That's something we encourage all of them to do.
    Okay.
    My next question is for you, Mr. Bernhard.
     In what you were stating this morning, I hear very clearly from you that we don't need to have more laws. You want the enforcement part to be acted on ASAP.
     On the laws that we have presently, do you see or foresee any loopholes in the existing laws? Can we make sure that when these companies or individuals are indeed tried they don't get away?
    Thank you for your question.
    I think the biggest problem—we mention this in our report—happens when an offence is not criminal but civil and the current system leaves it to the individual victim to take up their case with a platform. This happens often in cases of libel and defamation. It's just impossible to expect that one person will have the emotional and financial resources to see that through.
    The one area where we do think there could be improvement is for the government to empower someone, like the Centre for Child Protection, to use government resources to help escalate cases so that individual plaintiffs and individual complainants can have the force of government behind them to make sure that their complaints are seen through. That's one area of enforcement where we think there could definitely be an improvement.
     As for the loopholes, we need to find out, and until a judge points at one, we won't know, so let's get on with it. That's our lesson here, our message. If there are loopholes, we'll identify them and Parliament can act to fix them.
    You're waiting for cases to come before the court so that we can establish some sort of jurisprudence on the issue. Is that what you're saying?
    Yes.
    Thank you.
    The CRTC explains that it does not regulate Internet content, because consumers can already control access to unsuitable material on the Internet using filtering software, and any potential illegal content on the Internet can be addressed with civil action, existing hate crime legislation and the courts.
    How do you respond to the position taken by the CRTC?
(1230)
    The CRTC has exempted itself from the obligation to regulate online broadcasting. That's not actually a problem with the Broadcasting Act. The CRTC has created an order to let itself off the hook with this. We can litigate that on another occasion. I think the major implication for that is with Netflix and companies like that.
    That does not excuse these platforms from civil or criminal law. As I said earlier, these platforms will hide behind their scale. They will say, “Look how many videos there are. How could we possibly find all the illegal content inside?” My answer, and I hope your answer, should be, “I don't care. If you can't fly the plane safely, then you can't sell tickets to the public. If you cannot operate this service legally, then you can't operate it.” It's really that simple.
    The CRTC is disappointing in this case. Fortunately there are other avenues.
    Thank you very much.
    That's your time, Ms. Lattanzio.
    I will now thank all the witnesses for appearing before us and suspend the meeting for three minutes when we will come back on again.
    Thank you very much.
(1230)

(1235)
    Colleagues, we are resuming our meeting. I would propose that after this panel we go in camera to discuss some developments with our upcoming witnesses. The clerk will send out new codes for the in camera portion. We will suspend at about 1:30 to go in camera, if members agree.
    We're all in agreement. Thank you very much.
    I would like to introduce our witnesses. From the Royal Canadian Mounted Police, we have Mr. Stephen White, deputy commissioner, specialized policing services; Marie-Claude Arsenault, chief superintendent; and Paul Boudreau, executive director, technical operations, specialized policing services. I believe we also have Normand Wong from the Department of Justice.
    Will both the RCMP and Department of Justice be presenting?
     Is there a presentation from the RCMP, Madam Clerk?
    Will the Department of Justice be making one as well?
     No, the Department of Justice will not be doing an opening statement.
    Thank you very much.
    We will now begin with the presentation from the RCMP.
     Mr. White.
(1240)
     Thank you very much.
    Good afternoon, Madam Chair and honourable members of the committee. Thank you very much for the opportunity to speak with you today on this pressing matter. My colleagues from the RCMP have been introduced.
     I'd like to highlight that Chief Superintendent Marie-Claude Arsenault is with us. She oversees sensitive and specialized investigative services, which also includes the National Child Exploitation Crime Centre. Also with us is Mr. Paul Boudreau, executive director of technical operations for the RCMP. It's also a pleasure to have our colleague from the Department of Justice with us as well.
    I'd like to describe for a couple of minutes a broader context of online child sexual exploitation and highlight the RCMP's steadfast efforts towards combatting this crime and bringing offenders to justice.
    Online child sexual exploitation is one of the most egregious forms of gender-based violence and human rights violations in Canada. Not only are children, particularly girls, victimized through sexual abuse, but often they are revictimized through their lives, as photos, videos and/or stories of their abuse are shared repeatedly on the Internet amongst offenders.
    In 2004 the Government of Canada announced the national strategy for the protection of children from sexual exploitation on the Internet, which brings together the RCMP, Public Safety Canada, the Department of Justice and the Canadian Centre for Child Protection, CCCP, to provide a comprehensive, coordinated approach to enhancing the protection of children from online child sexual exploitation. The Canadian Centre for Child Protection is a non-governmental organization that operates Cybertip.ca, Canada's tip line to report suspected online sexual exploitation of children.
    The Criminal Code provides a comprehensive range of offences relating to online child sexual exploitation. Canadian police services, including the RCMP, are responsible for investigating these offences when there is a possible link to Canada. The Criminal Code also authorizes courts to order the removal of specific material, for example, a voyeuristic recording, an intimate image and child pornography that are stored on and made available through a computer system in Canada.
    The RCMP's National Child Exploitation Crime Centre is the national law enforcement arm of the national strategy and functions as a central point of contact for investigations related to online sexual exploitation of children in Canada and international investigations involving Canadian victims, offenders or Canadian companies hosting child sexual exploitation material.
    The centre investigates online child sexual exploitation and provides a number of critical services to law enforcement agencies, including immediately responding to a child at risk; coordinating investigative files with police of jurisdiction across Canada and internationally; identifying and rescuing victims; conducting specialized investigations; gathering, analyzing and generating intelligence in support of operations; engaging in operational research; and developing and implementing technical solutions.
    The centre has seen first-hand the dramatic increase in reports of online child sexual exploitation in recent years. In 2019 the centre received 102,927 requests for assistance, an increase of 68% since 2018 and an overall increase of 1,106% since 2014. The majority of the referrals the centre receives come from the National Center for Missing and Exploited Children in the United States. Every report is assessed and actioned where possible.

[Translation]

    In addition to the high number of reports, cases of online child sexual exploitation have become more complex. Advances in technology such as encryption, the dark Web and tools to ensure anonymity have made it much easier for offenders to conduct their criminal activities away from law enforcement agencies. Investigations related to online platforms also raise a host of other Internet-related issues, including the failure of platforms to retain data, the amount and speed at which content can be posted and distributed, and the ability of users to download hosted content.
    When content is successfully removed from one platform, it can easily be uploaded to the same platform or to other websites, perpetuating victimization and leading to a proliferation of content depicting sexually exploited children on multiple platforms. It is well known that offenders protect this type of content on personal devices or through cloud computing services.
(1245)

[English]

     Like many cybercrimes, online child sexual exploitation is often multi-jurisdictional or multinational, affecting victims across jurisdictions and creating additional complexities for law enforcement. No single government or organization can address this crime alone. The RCMP works diligently with its partners at the municipal, provincial and federal levels in Canada and internationally, as well as with non-governmental organizations, to strengthen efforts to rescue victims and bring offenders to justice. In fact, the RCMP is the current chair of the Virtual Global Taskforce, an international police alliance dedicated to the protection of children from online sexual exploitation and other transnational child sex offences. The Virtual Global Taskforce consists of law enforcement, NGOs and industry partners working collaboratively to find effective response strategies. Chief Superintendent Arsenault, who is with us today, is the current chair of this very important group.
    The RCMP also seeks to work closely with the private sector as offenders regularly utilize platforms operated by Internet and/or communications service providers to carry out a range of Criminal Code offences relating to online child sexual exploitation.
    The RCMP regularly engages private sector partners to discuss existing legislation, which includes an act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service, referred to as the mandatory reporting act, which came into force in 2011. The mandatory reporting act requires that Internet service providers report to the Canadian Centre for Child Protection any tips they receive regarding websites where child pornography may be publicly available. Under the mandatory reporting act, Internet service providers are also required to notify police and safeguard evidence if they believe that a child pornography offence has been committed using their Internet service. Since the mandatory reporting act came into force in 2011, the RCMP has seen a continual increase in reporting from industry partners.
    Many online platforms pose jurisdictional challenges, as I outlined earlier. An online platform that is registered in Canada may maintain its servers abroad, which could limit the effect of a Canadian warrant. Further, when a global company registered abroad has a Canadian presence, it is likely to host its content abroad, making jurisdiction difficult to determine.
    When an online platform permits its users, and/or itself, to personally download material to and upload material from their own computers, it becomes impossible to determine where this material may be stored or to prevent it from reappearing and being further disseminated.

[Translation]

    New companies, platforms and applications will continue to emerge, and the services they provide to Canadians will continue to evolve. It is important that the Government of Canada, legislative authorities and law enforcement agencies keep pace and adapt accordingly to combat these crimes.

[English]

    The illegal online content that many communications service—
    I'm sorry, Mr. White. Are you wrapping up? I think we need to move to questions.
     These will be my last couple of sentences.
    The illegal online content that many communications service providers operating in Canada have to deal with goes beyond online child sexual exploitation. As the committee is aware, the parameters are wide and include rape, various forms of violence and non-consensual distribution of intimate images, such as revenge porn. Sadly, many victims who have been victimized through sites such as these may not involve law enforcement due to the fear of repercussions from their families, friends and employers or because they feel a sense of shame. These offences have damaging and long-term impacts on our citizens and on our societies. We must strive to do better.
    Thank you for inviting us here today. We look forward to answering your questions.
    Thank you very much, Mr. White.
    I now go to Mr. Viersen for six minutes.
    Thank you, Madam Chair.
    I want to thank the witnesses for being here.
    Section 163.1 of the Criminal Code makes it an offence to make available or distribute CSAM. MindGeek executives told us that none of this exists on their site. Have you found this to be true, Mr. White?
    Actually, we have had referrals from our partner in the United States that I mentioned, NCMEC, with regard to disclosures by that company to us regarding that content.
(1250)
    Okay.
    We've heard multiple times at this committee that there are documented cases of videos of exploitation of minors being put up on MindGeek. The big question for us is to understand why the owners have not faced charges. Is there a problem with the laws as they currently stand?
     Obviously, there are a number of elements to when and what types of charges will be laid. When we're talking about these corporations, which are service providers and hosting platforms, other individuals have the ability to automatically load the content onto the platforms. There are jurisdiction issues, but every case is different.
     I would ask my colleague, Chief Superintendent Arsenault, if she can add to that.
    Ms. Arsenault.
     I would add that every situation is assessed, and in looking at the evidence we have, that would determine if we have enough to proceed with charges. As was mentioned, with MindGeek or Pornhub, we've received, since June 2020, about [Technical difficulty—Editor] reports, which were—
    Could you repeat that number again?

[Translation]

    Ms. Arsenault, there was an interruption in the audio.

[English]

    One-two-zero: Did you hear that?
    Yes. Very good.
    Please continue.
    We've received 120 reports, which have been triaged and prioritized. Some were referred to other law enforcement in Canada and others were deemed to be not online sexual exploitation, for various reasons. That's it.
    The presence of the download button on the MindGeek-Pornhub site seems like a clear violation of the distribution part of the CSAM laws. Is that part of your case? Has that been flagged? How come we haven't seen any charges?
    Well, again, we have to look at all the evidence we have, and my understanding is that this function has been taken out. Without the evidence, there's limited action that we can take. We are assessing all the reports that we're getting now and determining if charges are likely to happen.
    Okay.
    Up until recently, you have not had any reports of CSAM on MindGeek or Pornhub sites. Am I hearing you correctly?
    June 2020.
    Okay.
    The government has indicated that it will table legislation that will require companies like MindGeek to remove illegal content from websites within 24 hours. Do you think that's enough? We know that with some of the cases we've seen here, some of these videos have millions of views within 24 hours. Is 24 hours a short enough period of time?
    Who is the question directed to?
    You, please, Ms. Arsenault.
    Well, any action that will improve on the reporting, and any tools that will assist law enforcement in getting the information as fast as we can as we're trying to identify and rescue the victim in a timely manner....
    In other areas of sexual assault, like bail hearings and things like that, there's a reverse onus that is placed on the defendant. Would there be any area in this area of the law where a reverse onus would be helpful? I'm just thinking of taking a screenshot and saying, “This looks bad, so prove that it's not the case.” Is that an opportunity in the law?
(1255)
     There's always a possibility for it. In terms of this type of content, obviously we'd have to look very closely at that with our Justice officials to see if that type of provision would apply.
    I'm not sure if Normand, our colleague from Justice, can add anything to that.
    I'm sorry. He won't have time to reply, but perhaps in the next round, further along.
    We'll go to Mr. Dong for six minutes.
    Mr. Dong, we're not hearing you. This is unfortunate.
    It's not working?
    The Chair: Okay, you're back on. Great.
    Mr. Han Dong: I hope I get an extra minute because of technical difficulties.
    I want to thank all the witnesses for coming forward.
    Mr. White, first things first, I remember in 2019 that the government announced an expansion of the national strategy for protecting children from sexual exploitation on the Internet. I think that the total was over $22 million. Specifically, $15.25 million was to enhance the capacity of the Internet child exploitation unit, which is the ICE unit.
    Can you tell us whether that capacity of RCMP's ICE unit across the country has been enhanced as a result of this additional investment?
    Thank you for that question.
    We did receive additional funding. We have implemented new resources, both in the national crime centre here in Ottawa which I mentioned, the National Child Exploitation Crime Centre, as well as in ICE units across the country. A lot of those units across the country are integrated units. They're made up of RCMP and other local police services as well.
    Along the same line as my Conservative colleague's question, we heard testimony from witnesses saying that subsidiary companies of MindGeek sometimes shared content, or moved some non-consensual or child pornography content to other platforms to add content to them.
    In itself, isn't that a violation of the Criminal Code? Have you done any investigation on their intentionally adding content to subsidiary companies to make money? Have you done any investigations on that?
    No, we're not aware.... We haven't been informed of their adding their own content, child sexual exploitation content, if that's what you're referring to.
    Let's say going forward, if you find there's enough ground to start an investigation, would you be able to go back...?
    MindGeek told us that they made some improvements on their approval process or screening process so everything is great now, but would you be able to, retroactively, take a look at their actions in the past?
    It would depend on the preservation of data retention policies that they have. How long they retain their data is often different across different companies or platforms, Internet providers. That is always a concern for us when we're doing investigations, trying to look backwards for a period in time, whether or not that data has been retained.
    In your view, or based on your knowledge of Pornhub, do you think it has met its obligation to report?
    I can't say, but under the mandatory reporting act, there is a very strict requirement there.
    My understanding is that we have received reports from NCMEC in the United States from Pornhub. Now, I am not in a position to say that what we received is all that there is to receive, but I can confirm that I've been informed we have received some disclosures.
    This is a very interesting point you brought up. I didn't get a chance to ask this in the previous session.
    Why is Pornhub reporting to NCMEC as opposed to reporting to the authority directly? Isn't it very strange that they report to a not-for-profit organization and not to the police?
(1300)
    In the United States, one of the reporting entities is NCMEC. That's why a lot of the disclosures we get, not just from Pornhub but other Internet service providers and hosting platforms.... If they have a presence in the United States, they are able to disclose to the American entity as well.
     We know its headquarters are in Montreal, but we also heard through testimony that it has subsidiaries all over the world. That's basically the business model. Would that itself pose any challenge to your ability or your authority to look into their operations?
    Yes, it does, without a doubt. It's always a challenge for us when we're talking about large international entities such as this one, and where they're housed could be different from where they're incorporated. Where they're incorporated and housed could be different from where they maintain servers with their data. They may also have data that exists in cloud storage, which is evolving as well. All those add, without a doubt, certain levels of complexity for investigating.
     I just want to confirm. You have not started any investigation against MindGeek or Pornhub. Is that correct?
    No, we have not—not to my knowledge.
    Can you confirm this and let the committee know afterwards?
    We'll look at that, yes.
    Thank you.
    To your previous point, is there anything that the committee can recommend to the government to make it flexible and easier for you to look at the wider scope of your practice around the world?
    You have 20 seconds.
    If you could provide some recommendations afterwards to the committee, that would be appreciated.
    Those would be recommendations with regard to what? I'm sorry. I missed the question.
    To enable you to look at the wider operation of MindGeek, since it's a transnational company with various financial tentacles around the world, is there anything the committee can do to help you to effectively investigate, if there is an investigation?
    Thank you very much.

[Translation]

    Ms. Gaudreau, you have the floor for six minutes.
    Thank you, Madam Chair.
     Mr. White, my colleague was talking about an increase in investments of more than $22 million, which allowed you to increase your staff to work on the problem.
    Even if money is spent today, even if people can no longer download the content that has been removed, we would like to know what happened in the previous months. So I have three questions for you.
    Does the amount of money collected always correspond to the number of complaints you receive, or do you now monitor preventively?
     How is it that it was through the media that we learned that MindGeek, which owns Pornhub, had broken the law?
     How do you intervene preventively, and how do you handle complaints?
    Thank you very much for the question.
    We can always do more in terms of prevention. This has always been our priority. The more we do, the better.
    Ms. Arsenault talked a little bit about the volume of complaints we receive and the number of investigators it would take to deal with all of them. I'll ask her to elaborate on that.
    Ms. Arsenault, if it wasn't for what we've been going through in the last few years, there wouldn't have been so many complaints, because there would have been no possibility of downloading or uploading.
    What's your opinion about this?
    We often react because of the volume of complaints. We also have a proactive model, which allows us to identify victims and conduct more specialized or complex investigations. This allows us to get these people out of the abusive situations they are in.
(1305)
    Forgive me for interrupting, but I don't have much time.
    Will the investment of more than $22 million allow you to be more vigilant in terms of prevention? We can see that laws are very different from one country to another and that these companies have found a business model that will allow them to continue to operate.
    Other than this investment and the means to keep a watchful eye, what would you need to intervene upstream of problems that could lead to thousands of complaints?
    We always need more resources, but the resources we invest in this area are more for the proactive aspect. We need to collect data and prioritize cases to be able to stop abuse. Of course, building partnerships is really important to do these investigations.
    You said earlier that you were facing challenges related to jurisdictions. Since you have to deal with the various provinces, things will vary from place to place in Canada.
    Could you tell us more about the partnerships used to enforce the law as it stands?
    My question is for Ms. Arsenault or Mr. White.
    We work closely with our partners in the provinces and municipalities. There are teams in every province and in almost every major city, for example in Toronto, Ontario, where the provincial police have resources to conduct these types of investigations.
    You said you received 120 reports. Do they involve child pornography or are there also cases involving adults who have been unable to have content removed on these platforms?
     I would ask Ms. Arsenault to give you these details.
    These are 120 reports regarding Pornhub that we received from NCMEC, the National Centre for Missing and Exploited Children. Of these, 25 reports that involved child sexual exploitation were forwarded to police agencies in various locations across the country. Of the remaining reports, 93 were determined to fall outside of the Criminal Code definition of the alleged offence. The remaining reports are currently being evaluated.
    Other than financial resources, what changes in legislation would you like to see adopted on an urgent basis so that we have the tools to fight or stop this scourge?

[English]

     There are probably quite a number of things that we could be looking at.
     I've already touched on a number of them with regard to making sure we get access to the data and to any information that we would be able to have access to up front and more urgently, as well as more basic subscriber information, and also ensuring that all the entities that are required under the mandatory reporting act to report are reporting. We obviously need some level of compliance around that.
    Encryption is becoming a challenge for law enforcement. With encryption, people are able to become a lot more anonymous and hide their identities on the Internet.
     There are actually quite a number of elements that we are looking at and having discussions on.
    Thank you very much.
    We'll now go to Mr. Angus for six minutes.
    Thank you so much.
    Mr. White, you talk about your private sector partners that you work with. Is Pornhub-MindGeek one of your partners?
(1310)
    To my knowledge, we are not a partner with them. When I refer to partners, I'm referring to—
    Okay. Who are your private sector partners if the biggest porn company in Canada isn't one of them? Who are your private sector partners that you work with?
    We are working with some of the bigger companies, Google, Facebook. When I mentioned the Virtual Global Taskforce earlier—
    But you're not working with Pornhub-MindGeek?
    Not to my knowledge, but I will ask Madam Arsenault—
    Have you watched any of this testimony of what we're studying here? We're studying Pornhub-MindGeek. They aren't one of your partners.
    You get your information from NCMEC. Is that correct?
    NCMEC, yes.
    Okay.
    In 2011, the Canadian Parliament passed a law that if an Internet content hosting service provider came across issues of child abuse online, they had a legal obligation to report to the police. That was in 2011.
     That is correct.
    How many cases have been reported to you by Pornhub-MindGeek since 2011?
    I'll ask Madam Arsenault to confirm that, but it's my understanding we only began receiving complaints in 2020.
    From NCMEC....
    We received 120 reports through NCMEC.
    Here's the thing. You come here and you talk about how you could get more money to do a better job. We've had a law on the books for 10 years, and yet since 2020 you're getting reports from an American agency, when the law says that if there are allegations, it has to go to the police. You have no record of any of that? It has to go to the Canadian child centre. Thank God the Americans are sending you some information.
     I ask that because we have had multiple witnesses come forward, and we're not just talking about child pornography or encryption on the dark net. We're talking about witnesses who say that modern sexual assault is being tied to what is happening online, that it is being promoted, and that it is being promoted in a Canadian company. Just to clarify, are you saying that you have absolutely no cases outstanding against Pornhub-MindGeek for all the cases that have been brought forward?
    That is my understanding.
    Okay.
    Then you said you have a problem with jurisdiction. How could you have a problem with jurisdiction when they are based in Montreal?
    Part of the company may be based in Montreal, but it may be incorporated elsewhere and their servers could be located elsewhere as well.
    I find that interesting, because when I read the law, it doesn't mention their servers. It mentions their service. Pornhub-MindGeek is an Internet service. As an Internet service, it promotes content, adult content, sexual content. The law doesn't say that if Pornhub-MindGeek has their servers in Cyprus, you can't touch them. They are providing a service; they have a legal obligation.
    Have you had any legal opinion from your people about the difference between the service they provided and your inability to check their servers, or have you just not tried to check their servers because you haven't followed through with any cases?
    No. The cases we've followed through on are the ones that were referred to us. They have a presence in the United States. They report it through that entity in the United States and pass it on to us.
    Okay, thank you.
    The 120 reports that were mentioned are and have been looked at.
    Okay. Again, I'm looking at Canadian law, and it doesn't say that a Canadian entity should be referred to the United States so they could then refer to the RCMP. They said they have an obligation to report to the police, which is in Canada.
    I want to read an email I received from one of the survivors. Again, we're not talking about one or two cases here. We've come across many. This survivor wrote to me on Friday afternoon. She said she was glad that we would be talking to the RCMP on Monday. She said, “I hope they can answer why they don't do anything. I emailed them and asked them to investigate Pornhub's part in my video, because I think it was illegal. They didn't even answer.”
     Here's the point. She said she was scared to try again. She was worried that if she pushed the issue, they'd just get mad and stop working on these cases. The issue is survivors having to ask you to do the job you're supposed to do, and you're telling us that you haven't initiated any cases. It sounds as if you're not even going to get mad at this poor survivor; you just haven't done it.
    Can you explain to us, after all the testimony we've heard, why you are still talking about dealing with the dark net, needing more resources, working with the United States, and you are not addressing the issues—the credible issues—of sexual abuse and non-consensual acts that have happened on this service? What are you giving us here?
(1315)
    The reports that are referred to us are assessed, regardless of which entity refers them, and if it is deemed to be content that needs to be investigated further, packages are put together and sent out to police services across the country—
    Okay, but the survivor who wrote to me said she asked them to investigate Pornhub's involvement in her video because she thought it was illegal and the RCMP didn't even answer and she was scared to try again.
    Mr. Angus, I see that Mr. Wong has his hand up. If the committee agrees, we'll give Mr. Wong 15 seconds.
    Certainly.
    Thank you, Madam Chair.
    Just quickly, because this issue has come up a number of times in relation to jurisdiction, it is a complicating factor in all these cases. As Commissioner White said, there is the issue of incorporation in Luxembourg, headquarters in Montreal, its servers located around the world, but mainly in the United States.
    The operation of the mandatory reporting act—and Mr. Angus is correct in relation to the definition of Internet service. Pornhub meets that definition, but the obligation under section 3 of the act is when they find they have reasonable grounds to believe they've found child pornography on their servers.... The operation of the mandatory reporting act—
     I'll have to stop you there, Mr. Wong. If you have further information to give us, please provide it to the committee in writing.
    I will now go to Mrs. Wagantall for, I believe, five minutes.
     Thank you very much, Chair. It's a real privilege to be part of this conversation today on behalf of those who are being victimized at unbelievable levels. It's frightening, and we have to do something.
    I noticed here a comment that we've investigated only around 120 reports, 25 of which qualified to go to our police force. That means that 90, it was indicated, did not meet the Criminal Code definition. I have to ask why. What should we be doing to improve this Criminal Code definition so that these circumstances aren't taking place? I can't imagine that these other cases shouldn't qualify to be investigated.
    I'll start, and then I'll turn it over to Madam Arsenault.
    Every one of these cases—
    I would like to ask Mr. Wong as well, please.
    Sure.
    Every one of these cases that come into the National Child Exploitation Crime Centre is fully analyzed. We have some very good and passionate people in that unit doing this work, and they do a good, thorough analysis—
    Excuse me, Mr. White. I'm not questioning their analysis. They're doing their analysis and deeming 90 of them as not meeting Criminal Code definitions.
    Mr. Wong, what's the problem here? What do we have to do to increase that and improve that area?
    Maybe, before you answer, I could just clarify that it did not meet the definition of child pornography in the Criminal Code. The vast majority were cases of age-difficult media, meaning we cannot definitively ascertain whether the individual is under the age of 18.
    Mr. Wong, how do we improve this?
    I will just add to what Marie-Claude said.
    The definition of child pornography in the Criminal Code is among the broadest in the world. We protect children under 18. The problem, as Marie-Claude mentioned, is the age-difficult media. When there are secondary sexual characteristics, unless you're dealing with an identifiable person, it's very difficult for anyone to tell whether that person is above 18 or below 18, so a lot of that material is not captured. That's probably what Marie-Claude is talking about.
    That's very disconcerting.
    There has been a lot of talk about the issue around jurisdiction. Madam Arsenault, you're chairing the Virtual Global Taskforce. Is that correct?
(1320)
    Thank you for your work.
    However, I can't help but think there has to be a way with MindGeek to work together internationally—because this is an international issue—to provide that capability to have jurisdiction absolutely wherever you need it for something like this.
    Why is that not something that is being worked on internationally, or is it? What needs to be done in terms of Canada's responsibility to enable us to get over that hurdle? Clearly, it's a method of avoidance.
    Within the Virtual Global Taskforce, pretty well all the countries, or many of them, have similar challenges with jurisdictional issues. We are working on identifying all these law enforcement challenges. As part of the VGT, we have industry and NGOs that are our partners. We also work with other NGOs that have some influence internationally to advocate for some of the challenges.
    Perhaps Mr. Wong could speak from the international side on legislative groups that are also looking at these issues.
    Thank you.
    I'd like to expand a bit, however, on your comment that you're working on it internationally. I know of NGOs that have reported scenarios like this to me and said that, as a police force, it's very difficult to function in this environment because you don't have the jurisdictional support you need. It has supposedly been worked on for a very long time, yet we have a situation here where we've had only 120 reports since 2020 and that type of thing.
    What has been accomplished, or what is being done that it's taking so long to get any kind of co-operation internationally to deal with this horrific situation?
     The co-operation is there amongst all international partners when it comes to the exchange of information, the exchange of intelligence, sharing our best practices, and so on. On the legislative side, our group of law enforcement does not have control in terms of changing the laws—
    Right.
    Mr. Wong—
    I'm sorry, we have to stop you there.
    If Mr. Wong could provide an answer in writing, I would appreciate that a great deal.
    Yes. That's a very good idea, Mrs. Wagantall.
    I now move to Madam Lattanzio for five minutes.
    Thank you, Madam Chair.
    Thank you for being with us today to provide us with essential information that will help us along, I hope, in completing this study. My questions will be for Mr. Wong.
    There are specifically three areas that I'd like to hear you on. I understand that there are limitations with regard to the application of the law right now, and changes or amendments to it might be necessary so that we can address this issue. The first one is really the definition of child pornography, because I think that's what is posing a problem.
    Number two is the question of jurisdiction. I heard you say that it turns on the question of having the material on the server. What about the question of making the material available and distributing it in different countries? Would the fact that the material appears in a specific country make its distribution and availability there a question of jurisdiction, so that the country where it appears would have legal jurisdiction to try the case?
    Number three is the question of onus of responsibility. We heard that there could be a defence in terms of ignorance, i.e., they didn't know until someone flagged it, or the responsibility has shifted over to the person who uploads. Wouldn't legislative changes in terms of shifting the onus of responsibility onto those who make this kind of material readily available, other than the victims themselves, be...? We heard the testimony of Ms. Fleites. It was heartbreaking to hear that she tried and tried and tried, with proof of identity and with a licence, and yet again the material was taken down very temporarily, only to reappear again and again and again—a repeated assault.
    I'd like to hear from you on these three points. Thank you.
(1325)
     There's a lot to unpack there.
     The definition of child pornography in the Criminal Code is among the world's broadest. It's not only images whose distribution we protect against or criminalize; it also covers audio pornography and two forms of written pornography.
    I am not sure it's the problem of the law. The problem often is the application of the law, and how that works when the rubber hits the road. We heard Inspector White talk about the circumstances in which these things come up, and Marie-Claude was talking about how much evidence and proof there is in being able to follow up on an investigation.
    In relation to the jurisdiction—and that's the more difficult part—Marie-Claude was talking about what we're doing internationally. Canada is involved in the negotiation of a second additional protocol to the Budapest Convention, and that's the only international convention that covers cybercrime.
    In that convention there are specific provisions or articles on child pornography. There is the ability of the international community to deal with this, but that second additional protocol has to do with transborder access to data, because it's almost a universal problem among all countries trying to combat crime in this sphere.
    Other work is ongoing at the UN right now with the negotiation of a new cybercrime treaty, and also within the Five Eyes. Ms. McDonald, in the previous panel, mentioned the voluntary principles they're working on. Our largest partner, the United States, also enacted the U.S. CLOUD Act, which is another method of addressing the issue of transborder access to data. Canada is involved in all those aspects.
    In terms of shifting the onus, there is a difficulty, and Mr. Angus was highlighting some of the issues raised by the person who wrote in about the difficulty of getting the material taken down off these sites. There is a lag in that. The problem with some of this material, like revenge porn, is that someone has to be affected. It's very difficult to police a lot of the companies, because without a complaint, there's no way of distinguishing revenge porn from something that is otherwise completely legal. There's always going to be a bit of a lag time. I think it was mentioned that Minister Guilbeault and Canadian Heritage are looking at the 24-hour takedown as part of the work on online harms.
    I'll have to stop you there, Mr. Wong.

[Translation]

    We will conclude this round of questions with interventions from Ms. Gaudreau and Mr. Angus.
    We will then continue in camera.

[English]

    We're going to Ms. Gaudreau for two and a half minutes.

[Translation]

    Mr. White, you talked about the handling of complaints recorded in the reports.
    Since I'm not familiar with this area, can you tell me what you need to have in hand for the complaint to be investigated? Can you give me a brief explanation of how the process works?
    I would ask Ms. Arsenault to review the steps that follow the receipt of a complaint.
(1330)
    Ms. Arsenault, you have the floor.
    At the triage stage, if we determine that it is a case that meets the definition of child pornography, in that a child is involved, we can treat it as a priority. Another determining factor is the complexity of the investigations. In many cases, the complaint also involves serious acts of violence.
    In terms of the data we've been talking about, there are companies that don't keep information for long. So we have to decide on our priorities to get the information we need to support the evidence. There are a number of things we need to consider to help us prioritize cases for investigation.
    In terms of prioritizing—
    I unfortunately have to interrupt you, Ms. Gaudreau.

[English]

     Mr. Angus, you have two and a half minutes. Make it short and snappy, please.
    Thank you for that, Madam Chair.
    I want to go back to this issue of the fact that Parliament signed a law into place in 2011 on mandatory reporting for service providers. We understand that last year, in 2020, the RCMP received their first report. That's almost 10 years of no reports.
    If, in that time, case X tried to come forward, case Y came forward and case Z came forward with issues of non-consensual or child abuse on that platform and nothing was done, the fact that they're reporting now to NCMEC, is that okay for the RCMP? Do you just say, “Well, that was then, this is now, and they're now complying with NCMEC” or do they have legal obligations that they failed to fulfill under the laws of Canada?
    When I referred earlier to the 120 reports that we received from NCMEC, that was directly related to Pornhub, to my knowledge. We have been receiving reports over the years since the mandatory reporting act
    I'm sorry. You've been receiving reports from Pornhub-MindGeek?
    [Inaudible—Editor] from Pornhub-MindGeek.
    Because I thought I asked earlier and you hadn't.
    We have. The ones we received from Pornhub-MindGeek are the ones that have been transmitted through the NCMEC, but with regard to—
    Since 2020.
    Yes, but with regard to reports since the mandatory reporting act has come into play from other entities—
    I'm not interested in other entities. We're studying Pornhub-MindGeek.
    So you haven't had any. Is that like a 10-year lag? You guys are just saying, “Okay, well, now they're complying and giving it to the Americans, and the Americans are giving it to us.” Is that okay?
    To my knowledge, we would have to confirm if we have received over that period of time any others.
    Thank you.
    Are you done, Mr. Angus?
    Thank you very much.
    Colleagues, we will terminate the public part of the meeting. We'll take a 10-minute break. You have been sent the codes for the in camera portion of our committee business meeting. We have 10 minutes, until 1:43 p.m., to come back in and recommence.
    Thank you very much to all the witnesses.
    [Proceedings continue in camera]