Notices of Meeting include information about the subject matter to be examined by the committee and date, time and place of the meeting, as well as a list of any witnesses scheduled to appear. The Evidence is the edited and revised transcript of what is said before a committee. The Minutes of Proceedings are the official record of the business conducted by the committee at a sitting.
This is the 19th meeting of the House of Commons Standing Committee on Access to Information, Privacy and Ethics.
We are resuming our study today on the protection of privacy and reputation on online video platforms such as Pornhub.
I'd like to remind you that our meeting today is televised.
Today we have three witnesses at our committee. From MindGeek, we have Feras Antoon, chief executive officer; David Tassillo, chief operating officer; and Corey Urman, vice-president, video-sharing platform.
I'd like to remind the witnesses today that any witness before a parliamentary committee has a duty to tell the whole truth, and any failure to do so might result in a finding of contempt of Parliament. I'll remind all members of that.
Gentlemen, we'll turn to you for your opening statements. I don't know what you've arranged, in terms of who will go first, but we'll turn it over to you. Unmute yourself when you're prepared to speak. Then, we'll have some questions for you once we've heard your opening statements.
I'm the chief executive officer of Entreprise MindGeek Canada. With me are David Tassillo, chief operations officer, and Corey Urman, vice-president of product management, video-sharing platforms. We are grateful to the committee for the opportunity to speak with you today.
MindGeek is one of the largest, most well-known brands in the online adult entertainment space. Our flagship website, Pornhub, is among the top five most visited websites on the Internet. Over 12.5% of the adult Canadian population visit our website every day. As a leader in this industry, we share the committee's concern about the spread of unlawful content online and about the non-consensual sharing of intimate images. It goes against everything we stand for at MindGeek and Pornhub.
When David and I joined MindGeek in 2008, our goal was to create the most inclusive and safe adult community on the Internet. It was designed to celebrate freedom of expression, to value privacy and to empower adults from all walks of life. We knew this could be possible only if safety and security were our top priority. While we have remained steadfast in our commitment to protect our users and the public, we recognize that we could have done more in the past and we must do more in the future.
I want to be clear to every member of this honourable committee, and to the Canadian public, that even a single unlawful or non-consensual image on MindGeek's platforms is one too many, full stop. We are fathers and husbands. We have over 1,800 employees with families and loved ones. We are devastated by what the victims of these heinous acts have gone through. I want to emphasize that this type of material has no place on our platforms and is contrary to our values and our business model. We are sickened when anyone attempts to abuse our platforms to further their violence. Fortunately, the vast majority of attempts by criminals to use our platform for illicit material are stopped.
Before I speak about the steps we have taken to combat unlawful content on our platform, let me first tell you more about MindGeek and how we operate. MindGeek's flagship video-sharing platform is Pornhub. Created in 2007, Pornhub is a leading free, ad-supported, adult content hosting and streaming website, offering visitors the ability to view content uploaded by verified users, individual content creators and third party studios. Demand for MindGeek's content rivals that of some of the largest social media platforms. For example, in 2020, Pornhub averaged over 4 million unique user sessions per day in Canada alone. In 2020, over 30% of our Canadian visitors were women. Roughly 1.3 million Canadian women visit the site every day.
Running one of the world's most visited websites is a responsibility we do not take lightly. The spread of non-consensual and CSAM content is a massive challenge facing all social media platforms. The U.S.-based National Center for Missing and Exploited Children, also known as NCMEC, the industry standard for reporting CSAM, says it has received 16.9 million referrals from tech companies about possible child abuse, with well over 90% of those related to a single social media platform. MindGeek is a proud partner of NCMEC. We report every instance of CSAM when we are aware of it, so that this information can be disseminated to and investigated by authorities across the globe.
We share the objectives reflected in the 11 voluntary principles developed by governments, including Canada, to fight online sexual exploitation and abuse. We have been leading this fight by being more vigilant in our moderation than almost any other platform, both within and outside of the adult space.
Today, only professional studios and verified users and creators, whose personal identity and date of birth have been confirmed by MindGeek, may upload content. This means every piece of content on our websites can be traced back to its uploader, whose identity and location are known to us. We are the first and only major social media platform, adult or non-adult, to introduce this policy. We hope and expect that the entire social media industry will follow our lead.
We are also working to ensure that once content is removed, it can never make its way back to our platform or to any platform. The revictimization of individuals when their content is re-uploaded causes profound injury that we are working fiercely to prevent. We are attacking this problem in two ways. First, our people are trained to remove such material upon request. Second, we digitally fingerprint any content removed from our website so that it cannot be re-uploaded to our own platform.
For the last two years, we have been building a tool called “SafeGuard” to help fight the distribution of non-consensual intimate images. As I sit before you today, I am pleased to report that this month we will be implementing SafeGuard for all videos uploaded to Pornhub. We will offer SafeGuard for free to our non-adult peers, including Facebook, YouTube and Reddit. We are optimistic that all major social media platforms will implement SafeGuard and contribute to its fingerprint database. Such co-operation will be a major step to limit the spread of non-consensual material on the Internet.
Mr. Chair, thank you for the opportunity to discuss MindGeek's commitment to trust and safety, including our work to stamp out CSAM and non-consensual material on our platforms and on the Internet as a whole.
We look forward to answering the committee's questions.
This is a very important question, and I thank you very much for actually starting with it, because that's the core of this meeting. Child sexual abuse material has no place on our platform. It makes us lose money. I will walk you through two points to explain this exactly.
When you see this kind of material on our website, it completely ruins the brand that we have been trying to build for over a decade. The Pornhub brand, which is known worldwide, has the trust of its users. When the four million Canadians who come daily to Pornhub see this disgusting kind of material, they lose trust and faith in us—
I would agree. I think that's exactly why it's concerning that there is public knowledge of at least 100 such videos. Even just on Monday, this committee heard from a witness that she had tried to get explicit videos of herself, posted on Pornhub without her consent when she was 13 years old, taken down.
We are an ad-supported platform. That's how we make our revenues. That's how Pornhub makes its revenues.
Now, MindGeek has other products that are membership-based. We have products where you buy a membership, like Netflix, with content that has section 2257 IDs and the consent of all the actors—like Netflix, basically.
The ad-free model is a video-sharing platform. Our rules are very similar to adult and non-adult.... Facebook, YouTube and TikTok have very similar rules to ours. They also have pornographic material. They also report, like us. It is a big issue in the video-sharing platform community today, not only adult.... We recognize that.
MindGeek is headquartered in Luxembourg. MindGeek Europe comprises four offices: Luxembourg, the U.K., Cyprus and Romania. We have 800 people in Europe. MindGeek Europe owns all the IP, trademarks and copyrights of all our products and platforms. Pornhub, for example, is owned by MindGeek Europe.
The Canadian subsidiary has 1,000 employees based in Montreal. The Canadian entity is a service entity that supplies services to all the European entities, for example Pornhub. The services provided on the platform are from Montreal. Those services include management, customer care and engineering. The Montreal office, which has 1,000 employees, has around 400 engineers.
First, every single piece of content is viewed by our human moderators. Second, it goes through software that we have licensed from YouTube, like CSAI Match, and from Microsoft, like PhotoDNA for pictures. It also goes through a software called Vobile.
Mrs. Stubbs, I would like to add to what Feras mentioned.
I'm not too sure where it says that in the terms of service, but I can guarantee you that every piece of content, before it's actually made available on the website, goes through several different filters, some of which my colleague made reference to.
Depending on whether it comes up as a photo or as a video, we go through different pieces of software that compare it to known active cases of CSAM, so we'll actually do a hash check. We don't send the content itself over; they create a digital key, so to speak, that's compared to a known active database. After that, it's compared through the other piece of software that Feras mentioned, Vobile, which is fingerprinting software by which anyone can have their content fingerprinted. Any time MindGeek finds a piece of infringing content, we add it to that database to prevent the re-upload.
Once it passes the software queue.... If anything fails at the software level, it automatically doesn't make it up to the site. Once that piece has gone through, we move over to the human moderation section. The human moderators will watch each one of the videos, and if they deem that the video passes, it will be—
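The pipeline described in this answer, automated checks against a database of known unlawful content followed by a human moderation queue, can be illustrated with a minimal sketch. Every name here is a hypothetical stand-in, and real systems such as PhotoDNA or CSAI Match use perceptual hashes that survive re-encoding; an exact SHA-256 digest is shown only for simplicity.

```python
import hashlib

# Hypothetical database of digests of known unlawful content.
KNOWN_BAD_DIGESTS = {hashlib.sha256(b"illustrative known-bad bytes").hexdigest()}

# Uploads that clear the automated filter wait here for a moderator.
human_moderation_queue = []

def screen_upload(content: bytes) -> str:
    """Run an upload through the software queue, then hand it off to
    human moderation. Returns the upload's disposition."""
    digest = hashlib.sha256(content).hexdigest()
    if digest in KNOWN_BAD_DIGESTS:
        return "blocked"           # fails at the software level: never published
    human_moderation_queue.append(digest)
    return "pending_human_review"  # a moderator must still watch it
```

In this sketch, a match against the database stops the upload before it reaches the site at all, while everything else still requires a human review before publication, which mirrors the two-stage process described in the testimony.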
In the case [Technical difficulty—Editor] an individual performer, they probably wouldn't need sound to establish that. We always instruct all of our agents to err on the side of caution. Basically, if you have any doubt at all, just don't let it up, rather than letting it up.
Even one video, as Feras mentioned, could create irreparable harm to us. The way we view it is that every piece of content that makes it up to the site that shouldn't be there.... We believe the vast majority of individuals, 99.9% and I don't know how many more nines, want nothing to do with this content. But after that—
I respectfully disagree with that. I think it's been a constant evolution. Some of the changes we've made were more publicly spoken about than others, but this has been a constant evolution in our company since the onset, since 2008. We had human moderation on our sites when it was a term that didn't even exist, before Facebook or any of the other main platforms in the world used it.
These were all things that we started. We weren't public about it, but these are things we did since the beginning. They've been core to the way we wanted the company to run.
We heard devastating testimony on Monday from a young woman who was victimized on your platform. Is it fair to say that Pornhub and MindGeek failed to take all the actions they could have taken a number of years ago to prevent that instance from happening?
The first time ever that we heard the name of Ms. Fleites was a couple of months before The New York Times article was released. The writer, Mr. Nicholas Kristof, reached out to our PR team around September. So—
No. For now, we only know her first and last name. We started an investigation, but we do not have enough information to see if she ever contacted us or not. I'm not saying that she's not telling the truth—not at all, and please do not misunderstand me. I'm just saying that with the first name and last name, it is impossible to know if she's contacted us—
Don't you think that's worse, that you have no idea if she contacted you? She said she did.
You, as a company that is making millions of dollars...and here is a woman who has been victimized on your platform, and you don't even know, sitting here today, that she contacted you many years ago, when she was 13, to have that content taken down. Don't you think that's even worse?
What do you say to young women like Serena who have been victimized through your site? You have an opportunity today. The public is watching. What do you say to these individuals who have been victimized on your site?
I am a father. I have a daughter. I have a wife. I have a mother. I'm heartbroken when I hear these stories. The things that they have suffered are unimaginable. We are aligned with everyone who wants to come up with new regulations. This is a heartbreaking story. Of course I feel sad. This is not what the company—
Let's talk about how many instances there were. Let's just pick a year. How many times in 2020 did individuals reach out to MindGeek and say, “I want content taken down because I did not consent to that content being put up”?
So you come here today.... Yes, it's a worthwhile apology, although when I ask you what you would say to victims, I would have expected an apology there too. But at a minimum, coming prepared today, you would have expected to receive that question.
So you don't know, sitting here today, how many times people reached out to you in 2017, 2018, 2019. You obviously don't have a record of Serena. Maybe you weren't keeping records five years ago. You don't have any clue, sitting here today, how many times people reached out in 2018, 2019—
To go back to the other comments that you made previously about our not knowing, it's not that we don't know who contacted us. From the limited information we were given—a first name and a last name were the only things that were given to us—we reached out to—
We did try to contact them multiple times, asking for additional information, pre- and post-interview. We didn't receive anything. With the limited information we did have, a first and a last name, we were unable to locate, in any of our forms or any of our emails, that we had received contact.
Once again, we are not insinuating that what they're saying is untrue. We are willing to look into it. We've also reached out to counsel now to get more information: what email it was sent to, whether there was a video—
You've been part of the company since 2008. Is it fair to say, though, that if the changes you made in December 2020 had been put into place years ago, this instance wouldn't have happened, GirlsDoPorn wouldn't have happened, and women, young women principally, wouldn't have been victimized on your site?
You made changes in December 2020, and I appreciate those changes, but had you put those changes in place earlier, we wouldn't have seen the victims we see today. Do you think that's fair?
I respectfully disagree with this notion. We've always been improving our procedures. Yes, our system is not perfect, like any other video-sharing platform, adult or non-adult. Just recently, two weeks ago, a publicly traded video-sharing platform got sued by somebody—
My last question is, why the changes now, in December 2020? You said one instance is too many. You have many women reaching out. It can't be just Serena. You have your 2020 transparency report coming. Why these changes only when MasterCard and Visa say they are going to stop participating in an arrangement with you?
We started contacting a very reputable law firm in New York at the beginning of 2020, so a year ago. We signed a contract with them in April 2020. They came up with recommendations to improve our procedures. This has been in the works since almost a year ago. Yes, it was implemented in December, but it's been in the works way prior to these incidents.
My first question relates to the 2018-2019 numbers. I would also like you to provide us with a copy of your terms of service before the revised version of December 8. Can you provide us with it shortly? We would like to see those terms and conditions.
You talked about privacy. Let's talk about the past. I understand that you are concerned about the content. Furthermore, there's the issue of uploading. Unless I'm mistaken, when an individual makes a request to remove content and does so within what I think is an unreasonable timeframe, that is, one week or two, that content may once again be found because another user—
Excuse me, Madame Gaudreau. I'm not sure if other people are experiencing the same thing, but there is overriding translation, as well as the floor channel, that seems to be going on the same channel. I'm not sure if there is a technical problem there. I'm wondering if our technical experts could just verify that.
Were other members experiencing the same difficulty hearing?
Mr. Tassillo, here's what I would like to understand. Even if you look after the content on the platform and you remove what was on the site, a user could have already downloaded it, become a member of the platform and repost the content.
You'll probably say that's what used to happen, but what about today?
Thank you for the question. I'll be replying in English, if that's okay. It's more familiar for me.
I'll take a couple of steps back. There are a couple of different ways to have content removed from the site. Some ways will have the content removed immediately. Some ways, it gets sent back to a review team. It depends on the path the end-user would take.
If you are an end-user to the site, just a visitor to the site, and you might be watching a piece of content and for whatever reason you feel the content shouldn't be there, you can actually flag the content. Once the content is flagged, it is sent to our team for a re-review process. This is obviously a re-review, as before the content made it out to the site in and of itself, it went through the process that I briefly described before.
If you so choose, you can also use a content removal form.
You were saying that you had very specific measures. Your technology cannot detect content that had to be removed either by your moderators who saw that it was child pornography, or as a result of a request from someone claiming that the content was non-compliant and had been made without their knowledge.
What technology do you use to ensure that content will not be reposted, content that should not even have appeared and been removed because it had been flagged?
Specifically for the re-uploading issue.... I'll box it in; I'm trying to be respectful of your time.
For the re-uploading process, when a piece of content is taken down by either one of those paths, we now automatically create a digital fingerprint of it, so when someone attempts to re-upload the content, it will get blocked at the upload phase.
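The takedown flow just described, where removal adds a digital fingerprint and any matching re-upload is blocked at the upload phase, could look like this in outline. The names are illustrative, and a production fingerprint (such as Vobile's, or the SafeGuard tool described elsewhere in this testimony) would be perceptual rather than an exact hash.

```python
import hashlib

# Fingerprints of content removed after a takedown.
removed_fingerprints = set()

def fingerprint(content: bytes) -> str:
    # Stand-in for a robust video fingerprint; exact hashing for brevity.
    return hashlib.sha256(content).hexdigest()

def take_down(content: bytes) -> None:
    """On removal, record the fingerprint so re-uploads can be blocked."""
    removed_fingerprints.add(fingerprint(content))

def allow_upload(content: bytes) -> bool:
    """Block at the upload phase anything matching removed content."""
    return fingerprint(content) not in removed_fingerprints
```

The design point is that the check happens before publication: once a clip has been taken down, the same bytes can never reach the site again, though an exact hash, unlike a perceptual one, would miss a re-encoded copy.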
Obviously, no software in the world is perfect, and that was something we outsourced—
The software was always available. In early 2020, I believe, we made the process automatic so that it was automatically added to the database, but the end-user always had the ability to use it. We decided it's one thing we shouldn't ask the end-user to use, so we added it automatically.
Even at that, we saw that the software wasn't performing to our standards, and that's why, about two years ago, we started creating SafeGuard, which we believe is much more effective in being able to—
I have a question for you, and I won't be able to sleep well tonight if I don't ask it.
You said that you are parents. So am I. We can talk about business, profitability and accessibility. We can also talk about the fact that this is entertainment for consenting adults, so there's no connection to minors. What we are seeing today is that you are making changes.
I would like to hear you talk about your conscience as parents. It's not like this is financial fraud, but these events have an impact on entire lives. I'd like to hear what you have to say on that. I'm not talking to businessmen, I'm talking to individuals who wanted to provide entertainment for consenting adults. It's a thin line and there were rules that you had to follow; clearly, you have failed to do so over the past few years.
How do you experience this personally?
I'd like to hear from each of you for the minute and a half that I have left.
At a personal level, I am obviously horrified by even one instance of it going wrong. It's why we put so much effort into constantly upgrading our systems. If you took a snapshot of the evolution of our policies at any given time, compared to 12 months later, it would be like, wow, we've come so far.
That's why we've worked so hard over the last year to sign up all these different organizations and trusted flaggers. We're the first adult company to break through that barrier and create partnerships with NCMEC and the Internet Watch Foundation. It's something we pride ourselves on.
Mr. Antoon, I want to follow up on my colleague, Mr. Erskine-Smith, to clarify how you deal with criminal content. You say you have no record of Serena Fleites, who was a 13-year-old girl trying to get content taken down from your site. She says she tried to pretend she was her mother because she didn't want her family to know, and that your organization insisted she provide photographs and evidence of who she was.
This is the thing. Either you're suggesting Ms. Fleites lied...and I would say that would be a really dangerous thing to do because, in 17 years of hearing testimony, I found her to be a very strong witness. I would assume it would be fairly straightforward in your records...because we're talking about criminal activity.
I'll ask you this. Under your legal obligations in Canada, a company that becomes aware of content that is hosted on its servers must report that to the police. Do you have a record of reporting anything from that time to the police about a 13-year-old girl who said her images were being used on your site? You would have a police record, wouldn't you?
I can take a step back and explain it. There's an insinuation that the information isn't available. The information.... We might have records of this. We have never said that she's lying. We just don't know. What Feras was trying to explain is that with the limited information, a first name and a last name, she might have—
No, because I don't have much time here. It's a question of whether or not you reported it to the police, because those are your legal obligations.
Let's talk about 14-year-old Rose Kalemba, who was kidnapped and tortured and raped. Her video was on your site, and she begged you to take it down. Do you have a record of Rose Kalemba begging you over a six-month period, trying to get the rape and torture video of her taken down off your site?
If she sent any email it will be.... Well, we do have an email retention time, but if there is a record.... We basically have knowledge of the video, when it was live and when it was disabled, so we could go back and reconstruct the time from when the video was up to when it was taken down. We keep all the take-down notices, so we could reconstruct that. That's what I was trying to reference—
Okay. My issue here is that we're not interested in adult pornography, in what adults do in a room. That's not the purview of the committee. The committee's job is whether or not big tech is respecting its legal obligations. Ms. Fleites has a statement that she begged you to take it down. There should be a record that this was reported to the police, because you're legally obligated.
Now, in the case of Rose Kalemba and her torture, the videos were listed as “teen crying and getting slapped around”, “teen getting destroyed” and “passed out teen”. Your moderators viewed this. You told us that every video was viewed, so you viewed this. Wouldn't you think that someone in your organization would have said that a video of the torture of a 14-year-old girl would be in contravention of subsection 163.1(3) of the child pornography law, which means that it's a 14-year crime? Why did it take six months to get it down, and do you have a record of having dealt with this woman? Why was it kept up there when you know this is a criminal activity?
If there was a time when the content was up there, it would be taken down as soon as we were made aware that it was against our terms of service. For this specific incident, I'd have to go back and verify the exact timeline. I don't have it.
My question is this. Ms. Fleites was 13 years old. She testified to our committee the same as you, which would be like being under oath, that she begged you over a six-month period and that you made her send photos and proof before you took the video down, and it was reloaded time and time again. Ms. Kalemba said she finally had to pretend she was a lawyer after the videos of her rape had been seen over 400,000 times.
If in those two cases that was the situation, how can you come here and tell us that “immediately, as soon as someone raises this, we take it down”? I'm going to ask you this question. Will you give to our committee the reports of how many incidents you actually report to the RCMP, which is your legal obligation in response to these crimes committed against children? If you're following the law, that should be a pretty straightforward thing to give us.
In any instance where we felt we had done wrong, we 100% would. As an individual I feel horrible for her, but as a corporation right now we are uncertain. This is what I was trying to get to in a previous question.
At this time I do not know. We don't know if the content was there. We don't know the timeline. We don't have access to the video because to this day we have not been given enough information to identify the video, the claims that were being made because she said that she sent emails. I am not saying that any of that is untrue, so please, I don't want to be misrepresented on that. However, at this point did she—
We purposely block all of those from our searches. We have a running list of banned words that we use, which basically stops those terms from being searched. Even if the term is not part of our banned list—and I know this was the subject of one of Nicholas Kristof's articles, the “14yo” and “14 yo”—we had actually blocked “14yo” from returning any results.
In one instance when it was “14 yo”—I can't remember which one it was—it did return results, but just because you're asking for the content, it doesn't actually mean that the content is there. It's a text-based search system, so it will see the word “14” and it will see the word “yo”, and it might return something based on 14, like “Dave's favourite videos, volume 14”, or on “29 yo”, because it's text-based.
The system is not designed to detect intent, but as soon as we do see these variations, what we do is add them to the database so they can't be searched for. On top of that—
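The behaviour described in the last two answers, a purely textual search that can match “14” and “yo” independently, alongside a manually grown banned-term list that stops certain queries outright, can be sketched as follows. The titles and the banned term are invented for illustration.

```python
# Illustrative banned-term list and catalogue titles; both invented.
BANNED_TERMS = {"14yo"}  # grown manually as variations are discovered

TITLES = [
    "Dave's favourite videos, volume 14",
    "interview, 29 yo",
]

def search(query: str) -> list:
    """Naive text search: banned queries return nothing; otherwise any
    title containing any query token matches. The system cannot infer
    intent, so '14 yo' can match titles that merely contain '14' or 'yo'."""
    if query.lower().strip() in BANNED_TERMS:
        return []  # banned term: no results at all
    tokens = query.lower().split()
    return [t for t in TITLES if any(tok in t.lower() for tok in tokens)]
```

In this sketch, “14yo” is on the blocklist and returns nothing, while the unlisted variant “14 yo” slips past the list and matches unrelated titles on its separate tokens, which is exactly the gap that adding each discovered variation to the database is meant to close.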
At this point they were manually done; they weren't automatically done, so they were done ad hoc. As we were made aware of them, we would send them off.
What we did in 2020 was to streamline the system to make sure that everything would go through, and it was all consolidated and everyone had a path. We built it into our content management system and then it went to—
Under this law, starting in 2011, you would have to report this to the Canadian Centre for Child Protection, not NCMEC and the RCMP. Failure to do this would result in jail time. Are you aware of that?
I, too, as a woman, as a mother of two young ladies, was, as they say in good old French, estomaquée (flabbergasted) to hear the testimony from the young woman to our committee this week.
I reached the following conclusion in my mind: it's unfair that the onus is on children to have to go to great lengths to be able to report to you what is going on. Even when they do, you—and I listened carefully—said you have a review team that goes through these tapes, and then you say that you have to re-review them again, Mr. Tassillo.
This is morally wrong. I'm not even looking at it in terms of legislation. This is morally wrong. Why is it that the onus is placed on children? It should be up to you to be able, as soon as you get a video, to decide and make a decision to not even post it. How do you go from just posting whatever video you get and then turning the onus on children to plead with you to be able to take it down? I'd like to hear you on that.
Maybe I can step back and explain the level of responsibilities enough to make it clear to the whole committee.
Level one responsibility is that when somebody uploads a video to our website, he or she agrees to comply with our terms of service, which clearly state that they must have consent, and that they have to be—
Ms. Lattanzio, I'm going to address you more as an individual than, as you guys are seeing us right now, as a business.
What we do is we take this extremely seriously. I know it just sounds like words, because this is the first time the committee and everyone else has heard us. Since the beginning, since the inception of the company, we've always worked on different ways to make this better. I agree with you, as a parent, and I agree with everyone on the panel. I remember—
I would never put the onus on the children. We are putting the onus on us. That's why we invest millions and millions of dollars every year to avoid this getting on to the site. If that video gets on the site, the way we view it is that for everyone who would view that video, I have now lost for the rest of my life as a customer. We try to take—
Mr. Tassillo, how much have you paid in legal settlements with regard to both children and adults for distributing their images to the public since 2008? Can you give us an amount of money that you've paid out in legal settlements for this kind of mishap?
We've created all of these different processes. We've integrated all of these different software tools. We have the human moderation team. I believe sometimes we're getting caught in some of the details. I understand the frustration you all have, but this is a problem that's bigger than just our site. This is a problem with how people on the Internet are misusing platforms. We're trying to create a safe environment for people to consume adult content, and we understand there are people out there who are trying to misuse these platforms.
Our standards are very clear. We will not allow anything to do with CSAM. We want nothing to do with anything non-consensual, and then we have a bunch of sub-standards that go in very granularly into a bunch of different niches. I don't have them all off the top of my head, but we have very specific standards that our human moderators follow.
I actually believe that if Pornhub were to shut off its lights tomorrow, which I understand people looking from the outside might think would be a much better solution, it would be a horrible situation for people around the world and the 10 million Canadians a day who come to our site. We are the only site that puts in place these different processes and software, puts in place the human moderation, and creates partnerships with over 40 trusted flagging organizations, including NCMEC and IWF, so that the content can be removed. Our being there or not will not change the demand for the product, but we are the safest place on the Internet right now to consume adult content. If people weren't coming to our site, they would go somewhere else with absolutely no regulations, and I believe that, both as Canadians and internationally, we would be in a worse position. As a parent, that would be my position.
Gentlemen, I would like to continue to hear what you have to say because, at the moment, the situation is being downplayed on the grounds that the platform is being managed properly and that it is also the fault of fraudsters, of users who are not conducting themselves properly. I don't want to hear about your advantages as managers. I can understand that your model is very lucrative. I want to hear about your conscience, about how you fall asleep at night thinking about all those parents and victims. I have not checked, but you are saying that, as we speak, it has all been apparently fixed. That remains to be confirmed.
How do you feel in this context where the consumption is a global phenomenon, as you say?
How do you live with the fact that lives are affected?
First, we are very sorry if this has caused any impact on victims.
We are very proud that we built a product that gets 170 million people visiting a day, four million of them Canadians, 30% of them women. Don't you believe that if those four million Canadians who come to our site every day saw something so heinous and criminal, they would be calling the police? Wouldn't the police lines and those of the RCMP be ringing non-stop? We created a very good product that I and our 1,800 employees, who have families and children, are proud of. It is not perfect. Yes, there is a tiny, minute concern—
Mr. Antoon, the only thing I wanted to know is whether you are fine, whether you have a clear conscience that you are doing everything according to the current rules, in 2021. As lawmakers, it is our job to close any loophole; any loss of control on the Internet is unacceptable.
So you are telling me that your conscience is clear.
I obviously feel awful, and we feel terrible, about any kind of illegal content that makes it to the site, especially when it involves children. That's something we would never want to happen and we've really worked very hard to prevent it as much as possible. As my colleagues have mentioned, we've taken a lot of steps that no adult company has taken. A lot of those steps, even the most recent ones, very few social media or video-sharing platforms in the world have done to try to avoid any of these kinds of situations from happening in the future, because it is awful and—
I'm very interested in the corporate structure of MindGeek because my understanding is that it's a Canadian company. You have 1,800 employees, 1,000 of whom I think are in the Montreal area. Who owns MindGeek?
At the beginning of 2010, MindGeek was called Manwin. It was owned by a German gentleman named Fabian Thylmann, who resided in Europe.
In 2013, he sold the company to a group of people, and I'll walk you through them. When he sold it in 2013, we became MindGeek. David and I are Canadians, residents of Montreal. We are minority shareholders of the company. The majority shareholder, owning over 50% of the company, is a European national residing outside Canada.
The structure of the company has been European for 10—
What I found concerning when I was reading your terms of service is that if people have complaints, for example, Serena Fleites, Rose Kalemba or anyone else, they have to go to the courts in Cyprus. You are here in Canada. I'm concerned that you would think you could avoid Canadian law here, especially if we're talking about child abuse or non-consensual acts.
Don't you think that by telling someone in your terms of service, such as a 13-year-old girl from California, that if they don't like it they should take their case up in Cyprus, you're putting up more barriers and putting the onus on the victims, the survivors? That to me doesn't look like a corporation that's trying to do some good fun stuff with adult entertainment and make sure they're going to protect the survivors. Do you think the Cyprus provision would stand up in a court of law?
On your specially trained experts, I think that's a really important question for us. Could you get us the training manual? Mr. Bowe said they were formatters, not moderators. I think it's really important for us to get a sense of how many you have and what training they have so that they can actually identify the horrific videos we've referenced, and whether or not these videos are consensual. Could you get us those training manuals?
There are a couple of things in there. I'll try to address them quickly to be sensitive to your time.
The content formatters are in a completely separate team. Those are not the individuals who actually do the screening of the content. That's a separate team that's actually located in Montreal. They work with different content providers to work on enhancing the videos and stuff like that. They have nothing to do with the compliance team.
No one at the company is actually allowed to work on the content until it passes through compliance. That was just a misunderstanding.
As for the manual, that is an internal document that I believe is best kept internal. It is a constantly evolving document. We'd like to stand behind what people see on the site, as those are our real words and those are our real actions. It's constantly evolving at every level.
They said they wanted to keep the documents internal, but can we have a discussion about our rights as parliamentarians, being that this is a Canadian company and a parliamentary investigation, and whether we can obtain those documents?
As the shadow minister for women and gender equality, I'm disgusted by what I have heard from both of you, Mr. Antoon and Mr. Tassillo, but I'm going to ask my question.
Please try to answer this. Social media platforms all utilize tags and hashtags to categorize content and help make it easier to find. MindGeek websites are no different. Why then is MindGeek blatantly ignoring tags that are used for rape and underage children, such as the tag “CP”? For example, one of the videos on your site entitled “Short video of my school ho, young dick, had my bush a trim” used the tags “middle schooler”, “young boy” and “boy”. These are the things you talked about today, how you have a process in place, and yet you're allowing this illegal content to be searchable on your site.
You, as the site operators, have responsibility to protect these vulnerable individuals from exploitation on your platform. You spoke a lot about the process, but it's failing. Tell me about it. How did this video make it to your website?
Thank you for your question. I understand your concerns.
For that specific video, I'd have to actually go back and take a look at it and view it. You bring up a point about tagging systems and how they work across the Internet regardless of the site. In that same negative database of terms that I made reference to, which is in the works, there are over a thousand terms now, and then another couple of thousand sit on top of that, which are [Inaudible--Editor] words that could have either positive or negative intentions, depending on the way they're used. None of those words are available to be used as tags or as categories, and are not allowed within the titles of videos.
A second point you brought up, which I understand could be confusing, or which could “give the wrong message” is probably a better way of saying it, is when you see certain terms such as “teen”. That term has created a lot of controversy on the site. When you're using the English language in its normal way, “teen” is used for someone 13 to 19 years old. That's the demographic that's put into your head.
In the adult world, when people say “teen”, they're actually referencing those who are 18 to 25 years old, or 18 to 27, something in that range. It's similar to how, when people are having a sports conversation, they will use the word “GOAT”, versus the traditional use of the word of “goat” when you're referencing the animal. To be more specific, we want to make sure that we don't allow these things to be misused, even when we have “teen” as a category, because it is an allowed category in the same way it's allowed on any of the other platforms outside of the Internet. Whether it be on television or Bell ExpressVu and all of these things, there is a category that's called “teen” because it's a well-known category within the space.
We actually label it “Teen (18+)” just to further drive home that if people are looking for this, we don't want it on any part of our site. If someone even does a search on our site looking for “14” or “15”, obviously no results are found.
We're actually launching a project that's coming out, I believe, this week. My apologies, it's already in place right now with the Lucy Faithfull Foundation out of the U.K., whereby we're not only not returning results but are putting up deterrent messages similar to what Google and others have instituted over the last couple of years. I don't have a timeline of when they did that, but it was a great thing we learned from our counterparts, who are combatting the same issues that we are.
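The banned-term screening described in this exchange, where prohibited words are blocked from tags, titles and search and prohibited queries return a deterrent message, can be sketched as a simple denylist filter. This is a minimal illustration under assumptions; the term list and message below are hypothetical examples, not MindGeek's actual data or code.

```python
# Minimal sketch of denylist-based query screening, as described in the testimony.
# The banned terms and deterrent message are hypothetical examples.

BANNED_TERMS = {"cp", "14", "15", "middle schooler"}  # illustrative sample only

DETERRENT_MESSAGE = (
    "No results found. Searching for this material is illegal. "
    "Support and reporting resources are available."
)

def screen_query(query):
    """Block the query if any banned term appears in it.

    Real systems need more careful matching (word boundaries, fuzzy spellings,
    multilingual variants); a plain substring check is only for illustration.
    """
    q = query.lower()
    if any(term in q for term in BANNED_TERMS):
        return False, DETERRENT_MESSAGE
    return True, ""

print(screen_query("teen 18+")[0])     # True: allowed
print(screen_query("14 year old")[0])  # False: blocked, deterrent shown
```

The same check would apply when a title or tag is submitted at upload time, which is consistent with the earlier statement that none of these words may be used as tags, categories or video titles.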
In the brief you submitted to the committee, you said that your business is similar to that of mainstream social media. You also highlighted that your subsidiary, Pornhub, is one of the world's most popular websites. You claim that a MindGeek employee visually inspects each and every piece of content before it is uploaded. You say MindGeek employs 1,800 people. According to your own report, 2.8 hours' worth of content is uploaded every minute to your site, which means over 160 hours are uploaded in one hour. Over the course of each of your 1,800 employees' standard 7.5-hour shifts, 1,260 hours of content are uploaded.
How is it possible, even if every employee was dedicated to content moderation, that they would be able to review 1,260 hours' worth of content?
I agree that in terms of pure linear math it seems an impossible task to do, or impossible to do efficiently. The way we do it, irrespective of the amount of content, is that the content will not go live unless a human moderator views it. I want to assure the panel of that.
However, the content comes in through different buckets. It comes in from content partners. These are studios that are usually in the U.S. They are producing content professionally, and they include 2257 documentation. I'm not sure if you're familiar with the law, but that's a law that stipulates that content produced in the U.S. has signed documentation, release forms from all of the individuals performing in it, and all of the appropriate IDs. When content comes in through that channel, it can be viewed a lot quicker, because we know that the appropriate documentation is available from the producer who is uploading the content. When we have content that's uploaded through the model program, a lot of the time it's solo material, so it can be flipped through a lot faster.
The compliance team is instructed, essentially, to spend as much time as needed to verify that a piece of content is okay. They are always instructed to err on the side of caution, and we tell them, “If you're at all worried, it doesn't really matter. Just don't ask questions and don't put it up.”
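The upload-volume arithmetic quoted in the question above, starting from the 2.8 hours-per-minute figure in MindGeek's own report and the questioner's assumed 7.5-hour shift, can be checked directly:

```python
# Throughput figures cited in the question; 2.8 hours/minute is from the report.
hours_per_minute = 2.8
hours_per_hour = hours_per_minute * 60          # 168.0 hours uploaded per hour
shift_hours = 7.5                               # standard shift length cited
hours_per_shift = hours_per_hour * shift_hours  # 1260.0 hours per shift

print(hours_per_hour, hours_per_shift)  # 168.0 1260.0
```

This confirms the 1,260 hours of uploads per 7.5-hour shift. Reviewing that volume in real time would take at least 168 moderators watching continuously, which is why the answer distinguishes pre-documented partner content, which can be reviewed faster, from ordinary user uploads.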
First of all, I want to thank all of the witnesses for appearing today and giving us some answers—under oath, if I may remind everyone of that. It's an important issue that we're talking about today.
On Monday we heard from a very brave young woman. As a parent, I can't imagine the amount of courage that is driving her to come here in public to shine a light on the operation of your company. Today we're here to find out why. What made her work up the courage to take on a giant like your company?
I would like to start by asking the three gentlemen in front of us today. You're all quite successful. How much do you make, each of you? How much do you make, Mr. Antoon, Mr. Tassillo and Mr. Urman?
That's okay. I'll ask the chair if we can talk about this later and if we can add this to the list of documents requested.
I also agree with my colleague, MP Erskine-Smith, who commented that he highly doubts that you don't know the number of complaints received in the early years. I respectfully ask the chair to add that to the documents requested as well, since we would like to see those numbers.
I don't have the number off the top of my head, but I would like to assure the committee that we have audited financials, consolidated for the entire group worldwide, done by a third party, and we have been doing this for 10 years.
If you could submit that information to the committee, that would be great.
Can you tell us how many videos and images relate to or are tagged as child pornography or have non-consensual acts in them? How many videos? I think that's relevant to what my colleague, MP Erskine-Smith, was asking. Do you have any numbers?
That's odd. In the articles we read, they said that if anyone goes on Google and types in those tags or relevant terms, they're provided a link back to your platform. People will click on that, and they'll see those videos. Is that true?
It's a bit of a technical explanation. That's why I'm going back to the site.
If someone actually does make a search on the site for “Dave”, as an example, and they keep searching the site for “Dave” over and over, then because of the integration done with Google Analytics and dynamic searches, Google indexes the fact that people are attempting to search for that, irrespective of whether any content about Dave is on the site. Because we're an authority in the adult space, if you follow almost any keyword with the word “adult” or “porn”, basically Google will index it.
Now, when you're searching for that word on Google, it might dynamically fill in your search query as if Pornhub has replied. It'll send you to Pornhub, but it doesn't mean that content is actually there. Now, if it's a banned word—
I am the father of four daughters and grandfather of three granddaughters. I am disappointed with the witnesses before us today. They seem to trivialize the situation and want to defend their business at all costs, and they are doing a very good job.
Here, all parties are unanimous. Faced with the magnitude of the problem, all parliamentarians in Canada are affected right now. I'm not sure whether the witnesses are aware that their site can cause collateral damage to young teenagers who are caught in a maze with no way out; they don't see the light at the end of the tunnel. This causes major problems for those kids. It leads to depression, runaways, and in some cases, suicide.
We may never be able to identify the triggers, but your site is probably a trigger for major societal problems. We, as lawmakers, won't be able to close our eyes to the collateral damage you cause for money, just for money. You have set up a site that provides mediocre safeguards, and I'm sure that you have spent more money on legal counsel than on protecting teenagers.
If you still have some ethics and honesty, I would ask you to provide the committee with your budgets for site security, the number of people working on security to protect people who make complaints, and your budgets for legal counsel.
Those working for your company are robots. They are robots who post and repost the content. They sometimes prevent certain content from being posted, but when that content makes money, the robots put it back into the system or accept it. This is inconceivable, it's just to make money. You're not protecting Canadians, our teens are getting into something they cannot get out of, and their lives are being affected. If you still have any ethics, set up a program to help them. When a teenager calls you to say that a video has been posted without her knowledge, that she doesn't consent and asks you to remove it, remove it.
What are you going to do to get rid of those videos?
Mr. Gourde, I genuinely as an individual, and as a parent and just as a person, understand your frustration. I genuinely do.
I'm going to try to address each piece of the question that you had.
We do have all the systems in place. Well, you will never have it all. It's always going to be an evolution. Right now, an end-user, if they do see something on the site—I want to reiterate—they can fill out the form and the content will be disabled. There is actually no human intervention. You could go right now to the site, fill out a content removal form, and the content will be removed immediately. I can't stop it; Feras can't stop it; nobody can stop it. It will happen on its own.
We are not making any attempt to make anything difficult for any end-user to take anything down. We understand the responsibility we have. We take it very seriously. We will continue to, and we will continue to add new features.
That's one of the reasons why we made this large step we did in December to change it to deter people further from misusing our platform. We made it so that if you're going to upload anything to the site, I need to know who that person is. We are now making it obligatory, for anyone who uploads to the site, that we have to have the government-issued ID of the individual uploading to the site, so that if someone does misuse the site and does use our platform to commit a crime, we are able to help law enforcement get to the bottom of it, irrespective of where they are in the world. We keep this information now. And even prior to this, we always worked with all law enforcement.
I know we keep going back to the testimony of Monday. We will continue to look into this investigation as more information is made available to us. We just cannot track it down right now. We're not saying it's not true. We just can't track it down right now.
As for the amount of money that we put into fighting these issues, the number is large. I think last year—I'm saying this as an estimate; I'm not 100% sure—it was roughly $10 million Canadian, and it continues to grow every year. We will continue to invest money into it. We're always looking for the best place to put the money.
We're working with a new provider that we found in the last three or four months that is able to work on even the comment engine to see if people are putting in negative comments, and to use that as a lead to potentially signal that there's something wrong with a piece of content. There have been instances in our past when even our human moderators—because we do go back and check the comments manually; we don't have an automated system—actually caught it on the evidence of a comment, someone saying something like, “This person looks young” or “That doesn't make sense.” We would review it and take it down if we felt that was the case.
It's easy for many of us to personalize today's committee meeting. Obviously many of us are parents. Mr. Gourde is a parent of four daughters, and I have two beautiful young daughters at home. It's very easy to personalize this and understand why we're here and why it's so important we're here. And that goes without saying.
And you folks, Mr. Antoon and Mr. Tassillo, your company has a responsibility. You don't generate the content, but your site—and if it wasn't your site, it would be someone else's site, because someone else would step into that business—has a responsibility to either greenlight or redlight the content.
I believe Mr. Tassillo mentioned that any content now that is generated must have a human touch to have it then placed on the site. Is that correct?
In your terms of service, which I read through, it says very clearly that it does not allow users to “post any Content that depicts any person under 18 years of age (or older in any other location in which 18 is not the minimum age of majority) whether real or simulated”.
Obviously, there are situations that have arisen and we know child sex exploitation is an issue across the world, whether it's in human trafficking or other shape or form. It's not just an issue with Pornhub or MindGeek or whatever entity of the complex structure you guys have in place, but it's an issue across the world.
But you folks have a very special responsibility to stop, as best you can, this content from being posted or, if it does get through whatever filters you have—and in my personal view the filters were not as robust as they should have been, if I can use that language, not as strong as they should have been—you then have a duty to remove that material. This is because when that material was posted—and you were talking about the U.S. material that's generated—you should have known; you should be able to verify, before the content is posted, who these individuals are, and if you can't verify it with a driver's licence or some other form of ID, you shouldn't post it.
In a purely linear world, yes, that is a fair statement. However, this is not an issue...and I commend you on your understanding of the way the Internet works, seeing that it is a global issue and not just an issue isolated to MindGeek. If we were to put those practices into place today, it wouldn't stop the problem; it would just move to other places. Everyone has to work together to get to a solution to this.
You've thrown out some numbers. You said that 14 million Canadians will visit the site today, and you have every single one of those IP addresses. You know that every platform, whichever one you want to name out there, knows its users better than we know ourselves. You have all that information out there. It's your duty and your responsibility as citizens, as business folks and as individuals who have a moral compass—because we all have a moral compass—to ensure your business practices do the right thing. That's where I'm coming in.
I understand your business. We get it. We get what's out there on the Internet in this digital world. We're here digitally. We get that, but you guys have failed, in my view, in terms of your filters and your ability to respond to concerns that have been put to you by minors, because in no way should that content be on the Internet. We don't want it on the Internet, and it shouldn't be on your site. You have 1,800 employees who work for you—1,000 in Montreal and 800 in Europe. In no way should that content get onto your site.
In terms of your structure, why is your headquarters in Luxembourg?
I don't know off the top of my head, because we restructure once in a while. We use third party companies that give us advice. The structure is not complex, to be honest. It's just that we have many products, and the structure is built on the advice of third party professionals.
I'm going to be honest with this; I'm really not the accountant behind it. What I can confirm right now is that this is a legacy structure from the previous owners, who were European, and when the new ownership came into play in 2013, it was just carried on and the individuals—Feras and I as minority shareholders and the majority shareholder being a non-Canadian resident—continued the structure moving forward. We bought the majority of the entities underneath.
I think my colleague Mr. Sorbara's question was with regard to the financial flow, and my earlier request.... I would like that to be added to the list of requests, because this is very important. We are talking about a company that's making hundreds of millions of dollars on the back of—
Gentlemen, I have a number of questions for you. I have taken a breath and I am fully aware, as a legislator, that we have a huge responsibility when it comes to protecting personal information.
Bill C-11, which seeks to protect digital privacy, will be before the House shortly. While we have good intentions, fraud is on the rise. In the Internet age, there is clearly a loss of control. This is our job. I'm going to stop lecturing and I'm going to ask you some questions.
Here's what I understand. An individual who wants to do business can use your system and share content using Viewshare mode. The more views there are, the more revenue they will get. So they can upload content to the site, but that content can be removed quickly; at least, that's the way it used to work. In the meantime, some individuals may have already downloaded the content and may be able to re-upload it under a false name. For some reason, your high-tech SafeGuard system cannot identify those individuals. If that content has been viewed a number of times, it could end up back on the platform, even after it has been removed for failing to comply with the conditions.
Am I to understand that uploading and downloading is a big problem with your model?
I'll try to answer as quickly as possible, because there were a couple of questions.
For the first part, you made reference to our Viewshare, which basically would allow people who had put content on the site to actually partake in it. A person whose identification we do not have or who is not part of the model program actually cannot partake in that. It's not like a random person can upload something to the site and get paid for it. The system doesn't work like that. You'd actually have to be part of the model program or part of the content partner program, in which case we'd either have signed contracts, the 2257, or have the identification of the individuals who actually uploaded it.
I wanted to ensure that at least that was understood. You can't just put it up and get a cheque randomly for anything you put up on the site. The system doesn't work that way. It would just be too open to fraud.
To your other concern, about the upload, removal and then re-upload: as we said before, we were dealing with a third party vendor for many years, and we are still dealing with that third party vendor. What we saw is that over a long period of time, with many variables in the video exchange—which could happen when you re-encode or basically reprocess the video—it could become harder to match. When we created SafeGuard, we had that in mind.
We actually do a frame-by-frame analysis. Basically, in one second of video you have 30 frames. We actually analyze the frames, so if we have to reconstitute the image.... Now, obviously there are algorithms and stuff like that to enhance it. We saw it as an issue, and that's why we started developing this two years ago. We're finally ready to get it out. We're using it on photos already. It will be made available for videos within this month. Then we're going to make it available to any other website on the Internet that wants to use it. We not only saw an issue with that, but there also wasn't a centralized—
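The frame-by-frame analysis described above is in the spirit of perceptual fingerprinting: hash each frame so that a re-encoded copy still matches. Here is a minimal sketch assuming a generic average-hash approach; the frame data and hash function are illustrative only, not MindGeek's actual SafeGuard implementation.

```python
# Illustrative sketch of frame-level fingerprint matching (NOT SafeGuard itself).

def average_hash(frame):
    """One bit per pixel: 1 where the pixel is brighter than the frame's mean.
    `frame` is a 2-D list of grayscale values, a stand-in for a video frame."""
    pixels = [p for row in frame for p in row]
    mean = sum(pixels) / len(pixels)
    return tuple(1 if p > mean else 0 for p in pixels)

def hamming(h1, h2):
    """Count of differing bits; a small distance means 'probably the same frame'."""
    return sum(a != b for a, b in zip(h1, h2))

# A tiny 4x4 "frame" and a re-encoded copy with slight brightness noise.
original  = [[10, 200, 30, 180], [220, 15, 190, 25],
             [12, 210, 28, 175], [205, 20, 185, 22]]
reencoded = [[12, 198, 33, 177], [218, 18, 188, 27],
             [14, 207, 30, 172], [202, 23, 183, 25]]

h1, h2 = average_hash(original), average_hash(reencoded)
print(hamming(h1, h2))  # 0: the re-encoded frame still matches
```

In a system like the one described, a banned video's per-frame hashes would be stored centrally; new uploads are hashed the same way, and a run of near-zero Hamming distances flags a re-upload even after re-encoding.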
Thank you so much, gentlemen. I'm very pleased to hear you talk about how one of these horrific child abuse videos is too many and how you share our concern, but in October 2020, Pornhub was asked by a respected Canadian journalist about the allegations of child pornography on your site, and Pornhub's position was that they were “conspiracy theories”. That was repeated again, I believe, in December.
It was when MasterCard and Visa threatened to pull out their support and you had to flush 80% of your content that we started to see these changes.
To say that this was conspiracy theory, I think, is a real disrespect for the families who have gone through this, because your link searches before the changes included “13-year-old”; “12-year-old”; multiple variations of “middle school”—and in Canada, middle school is grades 7 to 9—“assault”; “drugs”; “exploited black teen”; “drugged teen”; “runaway teen”; “homeless teen”; “abused teen”; “teen destroyed”; “teen manipulated”; “stolen teen sex tape”; and “crying teen”. Each of these videos would have been viewed by your team of experts and given the flag to go ahead.
I want to go back to the question I asked earlier on subsection 163.1(3) of the Criminal Code, which says that it is an offence to transmit, make available, distribute, advertise or sell child pornography in any of these forms, and it is a 14-year sentence. At any point when you were promoting these links of 12-year-olds and runaway teens, was there a conversation that you were actually breaking Canadian law?
We're just going off the textual nature of what you've given examples of. If there was anything with “12-year-old” in the title, that would be immediately prohibited out of the gate, irrespective of what was actually depicted in the content.
You told us that people view this and document it. There should be a system in place, because what is being promoted—or at least was, up until December—was criminal activity that you would be culpable for. Are you saying that there was—
I'm not saying that they don't exist, and I'd actually like to assure the committee that of all the videos that have been suspended in December, none of them have been deleted. Everything has been preserved. We still have all of that content available.
I hardly can even understand what is going on here. There are just a couple of issues that I guess I want to clarify.
You keep comparing yourself to other social media platforms and tech companies. I think the key difference is that every single one of those platforms explicitly banned the content that you profit from. There's that issue.
I also find it shocking that you would come to this public committee after it has been public that—and we as members of Parliament know without a shadow of a doubt—child sexual abuse material, non-consensual material and human trafficking material has been present on at least one of your at least 48 subsidiaries. How it could be that you've come to this committee and not actually know your terms of service and not be able to answer those questions is just mind-boggling to me.
I guess I have a few more questions about your moderation and your content. You've said that you have MindGeek moderators, which we actually understand are called “content formatters”, which turns one's stomach, doesn't it? You've said that those content formatters view and approve every single video and approve each piece of content. Do you agree that MindGeek content formatters viewed and approved every example of child abuse and non-consensual content?
That is correct. They allow adult content on all of those platforms. I'm not trying to at all paint a bad picture of those companies. I think they're both good companies, but I believe they face a lot of the challenges that we do.
In regard to the content formatters versus compliance, I believe I addressed this, but I'll revisit it. The content formatters team is a completely separate team that has nothing to do with the compliance agents. Content formatters are not allowed to interact with the content until it is actually screened by the compliance agents. Once stuff is allowed to go live, then the formatters are able to interact with the content.
They are two separate teams completely. It was a misunderstanding. I believe it was cited in an article somewhere. I read one of the articles; I just can't remember which one it was.
Can you explain why and how MindGeek verified, in the Modelhub program, multiple confirmed underage victims, including the 15-year-old from Florida who was abused in 58 videos, and why a trafficker from Tuscaloosa, Alabama, verified under the name BigTankDog, was abusing a 16-year-old in your verified program?
The IDs that were provided were, from what we could assess, legally issued identifications. At the time, we understood that they were 18 years old. I don't want to be on record saying the wrong thing, because those are very specific cases, but every time you're part of the model program and you have submitted videos that are to be monetized, you automatically have to give identification.
The primary performer is the one who provides the identification. If we believe the secondary performer looks underage, we would request the secondary performer's ID. The primary performer, or the account holder, provides their identification. We have the identification of these individuals. I believe—I'm saying “believe”, because I'm not 100% sure—that in at least one of those cases, we're working with authorities to provide the information requested.
The owner of the account also attests that everyone else performing or appearing in the piece of content is of age and has provided the necessary consent to allow for those videos to be uploaded. You can't just put something up.
It is possible that people committing crimes are able to circumvent our systems. It's similar to security on a home or security at a bank. You put multiple levels of security and deterrents in place, but if a criminal still wants to commit a crime, they will try to circumvent the systems. We are constantly adding new and better systems and trying to improve them, not only to deter but to stop.
I mentioned this previously. The content on our site—
No. This has been a constant evolution since at least 2008, I want to say, when we started doing moderation. We actually were doing moderation when everyone else, like Facebook and the others, were not, because they believed it could cause them to lose their DMCA “safe harbour”. Now that time has evolved....
We were still willing to do it, because it was so important to us as a business that we wouldn't allow CSAM on the site. We did it even though we thought we could lose our DMCA safe harbour.
I find it very interesting that you consider your financial information to be private and confidential, in view of the business that you're in.
Mr. Antoon, it has been reported that you are building a sprawling mansion in the north end of Montreal, that you and your wife own two other large properties in Quebec, including a compound in the Laurentians, and that you drive a Lamborghini Urus. Are these assets yours, or are they owned by MindGeek or one of the related companies?
I don't understand how what I own is relevant to this committee. Number two, I don't think it's illegal to own a house or a property. I don't own anything in the Laurentians. If you look at the media reports—
For the record, Chair, I know that the request has been made already, but if this committee can work toward obtaining the financials of MindGeek and its related entities, as well as tax information, I think that would be very useful.
I'd like to give the rest of my time to Mr. Erskine-Smith.
If I were the CEO of a company and I were very concerned about any single instance of harmful content on my platform, I might have taken this a little bit more seriously, because in that article it was reported that employees occasionally flag content so egregious that they recommend contacting the police. But two former formatters said that they were discouraged by managers from doing so, and one was told not to bother since uploaders are typically anonymous and unlikely to be identifiable.
That's a perfect example of journalists not understanding whom they're interviewing. Formatters do not report to the police. Formatters do not review the content for, as David just explained many times—
—and if those comments that were made.... I don't know who the employee is. I don't know who made the comment. If I had additional information, I would definitely look into it. If the committee has access to information, I'd ask you to please hand it over.
Mr. Tassillo, you can maybe help me because you seem to provide more detailed answers. Mastercard said in a statement that they found evidence of “unlawful content”—their words, not mine. What was that content?
The content in question was actually something they deemed could potentially be depicting something non-consensual. When we reviewed it, the content should not have made it onto the site. It actually got through via an edge case, making it past not only three of our software systems, but two other systems that we had running against the databases. Since then, we've plugged it.
We understand, and it goes back to what we said: Even one example is too many, and that's why we constantly work to identify these edge cases and plug them, so they can never happen again.
In that same Globe article, they identify a user who did upload questionable content. When it was flagged to Pornhub by The Globe and Mail, the content was taken down.
When a verified account holder uploads problematic and harmful content, what happens to that account, and how do Pornhub and MindGeek confirm that the individuals in that video, not just the account holder, are providing consent?
It goes back to the process of the upload that we were discussing before. When the compliance officers are running through the actual piece of material, they look for any sign of duress or anything that would insinuate that there is no consent. It would obviously be impossible to have “I consent to this video” before every single video when the content's being uploaded on the site, so we look for any signs that consent would not be available. If that were ever depicted, that's the type of content that wouldn't make it on the site. In the case of—
We do. I was making reference to when you're actually watching the content in and of itself. At the time of upload it's explicitly asked that the person doing the upload has the consent of everyone in the video, and that everyone in the video is of legal age, which is 18. There might be places in the world where it's not 18, but we use a governing law of 18 irrespective of where the content's uploaded from.
Every time there are these kinds of reports, we reach out to the journalists and ask for more information to help us identify and remove the content. That journalist could easily have flagged that content with just one click of a button. They did not flag it. They did not submit the information to us. Today, if I don't have this information, I cannot say if it's correct or not.
Once again I go back to what I told the committee before. This content was suspended. It hasn't been deleted. If there is anything, law enforcement is still able to ask for any piece of information on any one of those pieces of content.
It goes back to what we've been doing since the beginning as a company. We've always made every attempt to be the front-runners, the leaders in how social media platforms should work, and this was just one more step in our evolution. We're making it so that for everyone who uploads to our site, we basically have ID for them, so we know who they are and we can add further deterrents to their doing illegal activity. We hope people who are providing adult entertainment and any form of entertainment to people in the world will follow our lead.
It's just a matter of safety. Some of our employees at the company have used aliases or pseudonyms from time to time because of safety. We've seen a lot of threats and doxing on 4chan and other message boards. David and Feras, who have been using their real names, have actually seen quite a lot of attacks and threats against them and their families.
This was something that Feras and I blessed. We felt it was a prudent thing to do, because there are a lot of social media platforms on the Internet that don't have the responsibilities that we do and don't actually check for negative comments and things like that. There have been many articles, and you can google them—if you google my name, you'll find these articles—in which they threaten not only me and my family.... And there are a bunch of other heinous accusations, so we felt it was the best way to protect our employees from groups who don't believe that adult entertainment belongs in the world. We're very proud of our product and we're not trying to hide behind anything. That's why Feras and I use our personal names. But we would never oblige an employee who is proud to work at the company to have to endure any of those things.
I don't see the direct correlation between the two, but, as I mentioned many times, if we're given the additional information, we will definitely look into whether those facts are true. It is in no way, shape or form the way we want our site, or any of our sites, to operate. As I mentioned to the committee, we have multiple ways to have them removed. I have never heard of an instance. And once again, I'm not saying it didn't happen. It could have been one employee who did the wrong thing or acted in the wrong way. I understand that even though it's an employee, it's our responsibility. It's our company. But we would never allow someone to do that.
Could you explain to me the term “middle schooler”? I know what it means. However, there are reports that there are still videos up on your site that say “middle schooler”, that this is a middle schooler being depicted.
My colleague just asked you about a Globe and Mail article where formatters raised an issue of egregious abuse and were told to park it—that they weren't going anywhere. You don't remember the article. You didn't know what it was about. You said it was journalists making things up. And then I think your colleague asked our committee to maybe give you some more information. The issue we're talking about here is criminal behaviour, the Criminal Code. Your obligation is to protect people and there's your complete disregard for an accusation raised in one of Canada's most respected newspapers. You didn't even bother to read it.
When you come in here and tell us how much you care about the victims, it strikes me, Mr. Antoon, that you show a staggering level of recklessness, which has just been made apparent here. We've asked you about Ms. Fleites. She blew your business model apart. You've had three months to investigate this, and you showed up at our committee and said, sorry, we haven't found anything. You made her hand her pictures over to you. You made her give her ID. You dragged this child out and destroyed her life. And then you showed up at our committee, after you had to flush 80% of your content down the toilet because it was either non-consensual or possibly criminal, and you shrugged. And when you're asked about your own staff, whose job it is to protect people, you didn't bother to know what the allegations were. I think, sir, you are extremely negligent, and we're talking about possible criminal acts here.
Just to follow up on that point, there was a Vice investigation in relation to your state-of-the-art video fingerprinting that suggested that a video that was gently modified could be easily uploaded. Was that journalist making things up, or did you take action to rectify the situation?
It's a simple question: Will you provide it? The answer is no.
This is my last question. On a going-forward basis, you've actually taken a number of steps as of December, and I know it began prior to that, to upgrade your platform to protect individuals. You have kids, as you've said repeatedly today.
We will continue to invest in what we believe is the best way to invest our money: essentially, to make the platform as safe as humanly possible, a safe place for Canadians and people everywhere around the world to consume adult entertainment in a safe environment. If part of that would involve—
We've actually donated money to several organizations and will continue to do a mixed bucket of things between investing in software and investing in personnel to make the adult platform the safest it can be. I truly believe in my heart of hearts that we are the safest adult platform in the world right now, and if we weren't there, it would be a much worse place for people trying to consume this content.
Colleagues, I don't want to run out of time and be unable to discuss the documents that have been requested. A significant number were requested and I think it's important that we go through that list now. Of course, committee members do know that our ability to request documents is absolute and that we can abide by the protections that we believe are warranted. We do have the right to request documents. I'm just going to run through a number of different documents that have been requested.
Mr. Erskine-Smith, in his last round of questioning, asked for the independent review to be supplied to our committee. We ask that it be submitted to the committee.
Mr. Sorbara asked for the corporate structures of your business and the subsidiaries. If you could provide those documents to our committee, they would be very helpful.
Mr. Gourde asked for the budget that your company provides for protections, including the budgets for legal services and the budgets for staff who are dedicated to the area of protection of individuals who are being depicted on your platform. We would ask for those documents.
Mr. Dong asked for information with regard to the gross profits and audited financial statements of your company. We would ask for those.
Mr. Angus asked for the training manuals that are used by your monitors, so we would ask for that document as well.
Ms. Lattanzio asked for the amount in Canadian dollars in legal settlements that you have paid out to the victims, so we would ask for that information.
There were two additional documents requested, but I missed writing them down. Mr. Angus and Madame Gaudreau had asked for separate documents.
I will turn now to Mr. Angus to clarify what those initial documents were that he asked for.
Thank you. I'm still trying to come to terms with everything I heard. For some of those documents, I think I'm going to have to come back to you.
While we're on this, I would like to know, with this study or with another study, if we could give a clear message to witnesses who have information that could help us that they can write to us offering that testimony. If they opt to provide testimony, can we write back and say they can present it to us in documented form, and that it will be privileged and protect them from any legal threat from these witnesses or in another study? Therefore, if we receive these requests from people who want to provide information, we will contact them and say, yes, we will ask for your documentation and decide whether or not it's relevant. We will look at it, of course, and then make a decision. I think we should send that clear message that if people have information, they can contact our committee, and if it's germane to our study, we can use it.
Yes, I can confirm, Mr. Angus, that parliamentary protection will be provided to any documents that are supplied to our committee.
I guess as the chairman I will now indicate to anybody listening that if they would like to supply this committee with information with regard to the studies that we are currently undertaking, they have the right to supply us with that documentation, and that documentation will be treated under the same provisions as if they were to come as a witness to our committee. They will be provided with full parliamentary protection for any documentation that they supply to our committee for the studies that we're currently undertaking.
My request was to receive the conditions of use as they were before the last changes. Earlier, when I talked about our bill on the protection of privacy, which was much talked about during this meeting, we were not able to shed light on the process and the specific rules. This information would assist the committee in looking into the protection of privacy. You may be putting things in place that might help us pass legislation to protect the public.
Can you explain the process and tell us specifically how the information is being used?
Yes. I want to ensure that we're prescriptive and thorough in the requests that we make. I think what we definitely have to have in the documents supplied will be a comprehensive breakdown of all the revenue streams, the equity in the company and explicitly who owns it, what the value is, and if there is any debt leveraging, who owns that debt.
I think it would be beneficial to know how much revenue has been generated in the last five years and if there's enough money to make the case that they could be holding themselves to a higher standard, or ought to be. We'll also want to see any off-balance sheet entities for which fees and costs associated with revenue are deducted as costs for tax purposes, and, I think, probably tax payments, at least in Canada but ideally in all locations where there are subsidiaries of MindGeek.
I also requested during my questioning the SafeGuard software that Mr. Tassillo spoke to. I asked him to give us more details in writing, as well as the standards that were put in place both prior to and after December 2020.
I see that my colleague Nathan is not with us, but I remember distinctly that he asked for the transparency documents and reports during his first question period, as well as MindGeek's billings both here in Canada and internationally. I imagine that would include Luxembourg and Cyprus.
I'm wondering, Mr. Chair, about going forward. Are we planning to have more meetings on this in the near future? I'd be happy to meet during the break week if we need to. I know that we have the Kielburgers showing up, but—
As the old-guy conscience on the committee, I want to put it on record that we are looking at a specific issue, which is whether privacy rights were violated. It is not the role of our committee to investigate an adult entertainment industry. If they are complying with the law, we have nothing to say. Our focus is about whether or not failures to protect on issues of consent and non-consent took place.
It is important that we state that we will examine the documents we are requesting in camera. If we believe they are relevant, we will add them to our testimony. To reassure our witnesses about financial information and so on, this is not about naming and shaming. If the information is relevant, and if four different political parties that don't often agree on anything can all agree that it is relevant, then it will be made part of the study. If it's not relevant, the privacy information that is given to us is returned.
It is very important to say that this information will be examined in camera, and then it will be decided whether or not it belongs in the larger study.
Mr. Chair, would that be how we normally address these issues?
Before we sign off today, there's one thing we could ask for that is very important, because it is a very complex structure. It is the information regarding MG Billing. I'm trying to decipher everything with regard to that. If MG Billing could be added to the document list, it would be appreciated.
Gentlemen, thanks so much for joining us. We appreciate your testimony. We will obviously be interested in receiving these documents that have been requested, and we'll be happy to work with you or your legal counsel as you prepare these documents. Please submit them to the clerk. We will work with our law clerk to ensure that we undertake a review in the appropriate way for any documentation that does come to our committee.
Colleagues, in terms of the upcoming meetings, I ask that you submit names of suggested witnesses to the clerk over the next couple of days. We will be working through the next number of days to finalize the meetings for the next stage that we'll be undertaking in the same study. I'll be in contact with you individually and I hope we can come to a consensus on those witnesses.
If there is nothing more, I will adjourn the meeting.