David Tassillo
2021-02-05 13:15
In the case [Technical difficulty—Editor] an individual performer, they probably wouldn't need sound to establish that. We always instruct all of our agents to err on the side of caution. Basically, if you have any doubt at all, just don't let it up rather than letting it up.
Even one video, as Feras mentioned, could create irreparable harm to us. The way we view it is that every piece of content that makes it up to the site that shouldn't be there.... For every viewer who stumbles upon that content, we believe the vast majority of individuals want nothing to do with this content, 99.9% and I don't know how many more nines. But after that—
Shannon Stubbs
CPC (AB)
2021-02-05 13:15
Right, but the main question is then, why did MindGeek wait until December 2020, after global condemnation, after threats from payment processors, to take these actions?
David Tassillo
2021-02-05 13:16
I respectfully disagree with that. I think it's been a constant evolution. Some of the changes we've made were spoken about more publicly than others, but this has been a constant evolution in our company since the onset, since 2008. We had human moderation on our sites when that was a term that didn't even exist, when Facebook and the other main platforms in the world weren't using it.
These were all things that we started. We weren't public about it, but these are things we did since the beginning. They've been core to the way we wanted the company to run.
David Tassillo
2021-02-05 13:47
We've created all of these different processes. We've integrated all of these different software tools. We have the human moderation team. I believe sometimes we're getting caught up in some of the details. I understand the frustration you all have, but this is a problem that's bigger than just our site. This is a problem with how people misuse platforms on the Internet. We're trying to create a safe environment for people to consume adult content, and we understand there are people out there who are trying to misuse these platforms.
Charlie Angus
NDP (ON)
I just have a second or two left.
On your specially trained experts, I think that's a really important issue for us. Could you get us the training manual? Mr. Bowe said they were formatters, not moderators. I think it's really important for us to get a sense of how many you have and what training they have, so that they can actually identify the horrific videos we've referenced and determine whether or not these videos are consensual. Could you get us those training manuals?
David Tassillo
2021-02-05 13:55
There are a couple of things in there. I'll try to address them quickly to be sensitive to your time.
The content formatters are in a completely separate team. Those are not the individuals who actually do the screening of the content. That's a separate team that's actually located in Montreal. They work with different content providers to work on enhancing the videos and stuff like that. They have nothing to do with the compliance team.
No one at the company is actually allowed to work on the content until it passes through compliance. That was just a misunderstanding.
As for the manual, that is an internal document that I believe is best kept internal. It is a constantly evolving document. We'd like to stand behind what people see on the site, as those are our real words and those are our real actions. It's constantly evolving at every level.
Charlie Angus
NDP (ON)
I have a point of clarification, Mr. Chair.
They said they wanted to keep the documents internal, but can we have a discussion about our rights as parliamentarians, given that this is a Canadian company and a parliamentary investigation, and whether we can obtain those documents?
Chris Warkentin
CPC (AB)
Absolutely, we can have a discussion at the end of the meeting on the request for documentation. I'll be sure to leave time for that.
Jag Sahota
CPC (AB)
In the brief you submitted to the committee, you said that your business is similar to that of mainstream social media. You also highlighted that your subsidiary, Pornhub, is one of the world's most popular websites. You claim that a MindGeek employee visually inspects each and every piece of content before it is uploaded. You say MindGeek employs 1,800 people. According to your own report, 2.8 hours' worth of content is uploaded every minute to your site, which means over 160 hours are uploaded in one hour. Over the course of each of your 1,800 employees' standard 7.5-hour shifts, 1,260 hours of content are uploaded.
How is it possible, even if every employee was dedicated to content moderation, that they would be able to review 1,260 hours' worth of content?
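The arithmetic in the question can be checked directly from the figures it cites (2.8 hours of content per minute, a 7.5-hour shift); this is a sketch of that calculation only, not a claim about MindGeek's actual workflow:

```python
# Figures cited in the question above.
UPLOAD_HOURS_PER_MINUTE = 2.8   # hours of content uploaded each minute
SHIFT_HOURS = 7.5               # standard shift length in hours

# Content uploaded per clock hour: 2.8 * 60 = 168 (the "over 160 hours").
per_hour = UPLOAD_HOURS_PER_MINUTE * 60

# Content uploaded over one 7.5-hour shift: 168 * 7.5 = 1,260 hours.
per_shift = per_hour * SHIFT_HOURS

print(per_hour, per_shift)
```

Both figures quoted in the question follow from this: roughly 168 hours of content arrive per hour, or 1,260 hours over a 7.5-hour shift.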
David Tassillo
2021-02-05 14:01
I agree that in terms of pure linear math it seems an impossible task to do, or impossible to do efficiently. The way we do it, irrespective of the amount of content, is that the content will not go live unless a human moderator views it. I want to assure the panel of that.
However, the content comes in through different channels. It comes in from content partners. These are studios that are usually in the U.S. They produce content professionally, and they include 2257 documentation. I'm not sure if you're familiar with the law, but it stipulates that content produced in the U.S. must have signed documentation, release forms from all of the individuals performing in it, and all of the appropriate IDs. When content comes in through that channel, it can be reviewed a lot quicker, because we know the appropriate documentation is available from the producer who is uploading it. When content is uploaded through the model program, a lot of the time it's solo material, so it can be reviewed a lot faster.
The compliance team is instructed, essentially, to spend as much time as needed to verify that a piece of content is okay. They are always instructed to err on the side of caution, and we tell them, “If you're at all worried, it doesn't really matter. Just don't ask questions and don't put it up.”
Jacques Gourde
CPC (QC)
Thank you, Mr. Chair.
I am the father of four daughters and grandfather of three granddaughters. I am disappointed with the witnesses before us today. They seem to trivialize the situation and want to defend their business at all costs, and they are doing a very good job.
Here, all parties are unanimous. Faced with the magnitude of the problem, all parliamentarians in Canada are affected right now. I'm not sure whether the witnesses are aware that their site can cause collateral damage to young teenagers who are caught in a maze with no way out; they don't see the light at the end of the tunnel. This causes major problems for those kids. It leads to depression, runaways, and in some cases, suicide.
We may never be able to identify the triggers. Your site is probably a trigger for major societal problems. We, as lawmakers, won't be able to close our eyes to the collateral damage you cause for money, just for money. You have set up a site that provides mediocre safeguards, and I'm sure you have spent more money on legal counsel than on protecting teenagers.
If you still have some ethics and honesty, I would ask you to provide the committee with your budgets for site security, the number of people working on security to protect people who make complaints, and your budgets for legal counsel.
Those working for your company are robots. They are robots who post and repost the content. They sometimes prevent certain content from being posted, but when that content makes money, the robots put it back into the system or accept it. This is inconceivable; it's just to make money. You're not protecting Canadians. Our teens are getting into something they cannot get out of, and their lives are being affected. If you still have any ethics, set up a program to help them. When a teenager calls you to say that a video has been posted without her knowledge, that she doesn't consent to it and asks you to remove it, remove it.
What are you going to do to get rid of those videos?
David Tassillo
2021-02-05 14:12
Mr. Gourde, as an individual, as a parent and just as a person, I genuinely understand your frustration. I genuinely do.
I'm going to try to address each piece of the question that you had.
We do have all the systems in place. Well, you will never have it all; it's always going to be an evolution. Right now, if an end-user does see something on the site—I want to reiterate—they can fill out the form and the content will be disabled. There is actually no human intervention. You could go to the site right now, fill out a content removal form, and the content will be removed immediately. I can't stop it; Feras can't stop it; nobody can stop it. It will happen on its own.
We are not making any attempt to make anything difficult for any end-user to take anything down. We understand the responsibility we have. We take it very seriously. We will continue to, and we will continue to add new features.
That's one of the reasons we made this large step in December to further deter people from misusing our platform. We made it so that if you're going to upload anything to the site, we need to know who you are. We are now making it obligatory, for anyone who uploads to the site, that we have the government-issued ID of the individual uploading, so that if someone does misuse the site and uses our platform to commit a crime, we are able to help law enforcement get to the bottom of it, irrespective of where they are in the world. We keep this information now, and even prior to this, we always worked with law enforcement.
I know we keep going back to the testimony of Monday. We will continue to look into this investigation as more information is made available to us. We just cannot track it down right now. We're not saying it's not true. We just can't track it down right now.
As for the amount of money that we put into fighting these issues, the number is large. I think last year—I'm saying this as an estimate; I'm not 100% sure—it was roughly $10 million Canadian, and it continues to grow every year. We will continue to invest money into it. We're always looking for the best place to put the money.
We're working with a new provider that we found in the last three or four months that can monitor even the comment engine to see whether people are posting negative comments, and use those as a lead to flag that there is potentially something wrong with a piece of content. There have been instances in our past when even our human moderators—because we do go back and check the comments manually; we don't have an automated system—actually caught it on the evidence of a comment, someone saying something like, “This person looks young” or “That doesn't make sense.” We would review it and take it down if we felt that was the case.
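The idea described here—using viewer comments as leads for human review—can be sketched as a simple keyword heuristic. This is purely illustrative: the flag phrases, function name, and threshold are assumptions for the example, not MindGeek's actual system.

```python
# Illustrative sketch: screen viewer comments for phrases that suggest
# a video needs human review. Phrases and threshold are assumed, not real.
FLAG_PHRASES = ("looks young", "doesn't make sense", "not consensual")

def needs_human_review(comments, threshold=1):
    """Return True when at least `threshold` comments match a flag phrase,
    queuing the video for a manual moderation check."""
    hits = sum(
        1 for comment in comments
        if any(phrase in comment.lower() for phrase in FLAG_PHRASES)
    )
    return hits >= threshold

# One matching comment is enough to queue a manual review.
flagged = needs_human_review(["Great video", "This person looks young"])
```

In practice, a real system would combine many more signals (reports, uploader history, hash matching) rather than comment text alone; the point of the sketch is only that comments can serve as one trigger feeding a human review queue.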
So we are committed to this.
Shannon Stubbs
CPC (AB)
2021-02-05 14:30
Thanks, Chair.
I can hardly even understand what is going on here. There are just a couple of issues that I want to clarify.
You keep comparing yourself to other social media platforms and tech companies. I think the key difference is that every single one of those platforms explicitly bans the content that you profit from. There's that issue.
I also find it shocking that you would come to this public committee after it has been made public—and we as members of Parliament know without a shadow of a doubt—that child sexual abuse material, non-consensual material and human trafficking material has been present on at least one of your at least 48 subsidiaries. How you could come to this committee and not actually know your terms of reference, and not be able to answer those questions, is just mind-boggling to me.
I guess I have a few more questions about your moderation and your content. You've said that you have MindGeek moderators, which we actually understand are called “content formatters”, which turns one's stomach, doesn't it? You've said that those content formatters view and approve every single video and approve each piece of content. Do you agree that MindGeek content formatters viewed and approved every example of child abuse and non-consensual content?