Dave MacKenzie (CPC, ON), 2019-06-04 16:16
When we have that definition and somebody puts something on YouTube that may come from a movie or a television show, it seems to me as though, at times, those topics would be part of the broadcast. If those show up, what happens?
Colin McKay, 2019-06-04 16:16
If those show up and they are flagged for review—a user flags them or they're spotted by our systems—we have a team of 10,000 who review videos that have been flagged to see if they violate our policies.
If the context is that something is obviously a clip from a movie or a piece of fiction, or it's a presentation of an issue in a particular way, we have to carefully weigh whether or not this will be recognized by our users as a reflection of cultural or news content, as opposed to something that's explicitly designed to promote and incite hatred.
Dave MacKenzie (CPC, ON), 2019-06-04 16:17
A couple of weeks ago a group of youngsters attacked and beat a woman in a park. I believe only one was 13; I think the rest of them were young. It showed up on the news. Would that end up in a YouTube video?
Colin McKay, 2019-06-04 16:17
Speaking generally and not to that specific instance, if that video were uploaded to YouTube, it would violate our policies and would be taken down. If they tried to upload it again, we would have created a digital fingerprint to allow us to automatically pull it down.
The context of how a video like that is shown in news is a very difficult one. It's especially relevant not just to personal attacks, but also to terrorist attacks. In some ways, we end up having to evaluate what a news organization has determined is acceptable content. In reviewing it, we have to be very careful that it's clear to the viewer that this is part of a commentary either by a news organization or another organization that places that information in context.
Depending on the length and type of the video, it may still come down.
Christine Moore (NDP, QC)
How quickly are you able to remove a video that has already been removed once and then modified, for example, by speeding up the sound? They do that. I've often seen it with my daughter: people re-upload Paw Patrol episodes played slightly faster so that the system does not recognize them, and they are able to publish their video.
In terms of hate videos, are you able to quickly remove a video that has already been removed once and has been modified just to evade those controls?
Colin McKay, 2019-06-04 16:33
Yes, we are.
I recognize the example you described. I've seen that as well. That is one of the challenges, especially immediately after a crisis. We see content being uploaded, and the uploaders alter it a little bit to try to confuse our systems.
What we do, particularly in the case of hate content and violent content online, is to tighten the standards within which we identify videos so that we're taking them down even more quickly.
Even in the context of Paw Patrol, I think your daughter will likely find that if she goes back to the same channel two weeks later, they may not have the Paw Patrol content because it will have been recognized and taken down.
Christine Moore (NDP, QC)
Okay.
I would like to know a little bit more about the process of reviewing flagged videos, and who reviews them when it's not done by a computer.
Also, are the workers reviewing these videos provided with any services? Having to watch these kinds of things all the time causes a lot of distress. What services are you providing to these workers to make sure the constant exposure does not harm them?
Colin McKay, 2019-06-04 16:34
To begin with the process itself, as I mentioned, especially in the context of hate content, we are dealing with such a quantity that we rely on our machine learning and image classifiers to recognize content. If the content has been recognized before and we have a digital hash of it, we automatically take it down. If it needs to be reviewed, it is sent to this team of reviewers.
They are intensively trained. They are provided with local support, as well as support from our global teams, to make sure they are able to deal with the content they're looking at and have the supports they need. That way, as they review what can be horrific content day after day, they are in a work environment and a social environment where they don't face the sorts of pressures you're describing. We are very conscious that they have a very difficult job, not just because they're trying to balance rights against freedom of expression and against what society expects to find online, but also because they have the difficult job of reviewing material that others do not want to review.
For us, whether they're based in one office or another around the world, we are focused on giving them training and support so they can do their job effectively and have work-life balance.
Iqra Khalid (Lib., ON)
How long does it take you to remove something once it's reported or flagged to you? What's the specific timeline?
Colin McKay, 2019-06-04 16:37
It varies, depending on the context and the severity of the material.
We've already had examples in our conversation today about whether or not it's commentary or it's news reporting, or it's actual video of a violent attack. In the context of the Christchurch attack, we found that there were so many people uploading the videos so quickly that we had to accelerate our artificial intelligence review of the videos and make on-the-fly decisions about taking down video, based on its being substantially similar to previous uploads.
In that process, the manual review was shortened dramatically because we were facing such a quantity.... In a case where there's broader context to be considered, there's still a commitment to review it quickly, but we do need a process of deliberation.
Damian Collins, 2019-05-28 11:07
Thank you, Mr. Chairman.
I'm going to direct my first question to the Facebook representatives. I'm sure you're aware that one of the principal concerns of members of this committee has been that deceptive information, deliberately and maliciously spread through the tools created by social media companies, is a harm to democracy, and that this disinformation is used to undermine senior politicians and public figures, public institutions and the political process.
With that in mind, could Facebook explain why it has decided not to remove the video of Nancy Pelosi that presents a distorted impression of her to undermine her public reputation? The reason I think this is so important is that we're all aware that new technology is going to make the creation of these sorts of fake or manipulated films much easier. Perhaps you could explain why Facebook is not going to take this film down.
Neil Potts, 2019-05-28 11:08
Thank you, Mr. Collins.
I'm happy to explain our approach to misinformation a bit more clearly for this committee.
First, I want to be clear that we are taking action against that video—
Damian Collins, 2019-05-28 11:08
I'm sorry, Mr. Potts, we haven't got much time. I'd like you to answer the question you've been asked, not give a statement about Facebook's policies on misinformation or what else you might have done. I want you to answer the question: why are you, unlike YouTube, not taking this film down?
Damian Collins, 2019-05-28 11:08
I know you're down ranking it. Why aren't you taking the film down?