Shannon Stubbs (CPC, AB), 2021-06-07 11:12:
I think you have spoken about the concept of a 24-hour takedown rule, so that once a platform has been notified that material is there, it would be required to remove it. I think that's a good idea. Of course, the trouble is that when child sexual abuse material or non-consensual images have been up for even 24 hours, they can have hundreds or thousands of viewers, or millions in the case of Pornhub and MindGeek. We've heard from victims that explicit images of them were online for three years before they found out. In the case of Serena Fleites, hers were shared and downloaded all over her school before she knew. Then she got into a never-ending back-and-forth trying to get the platforms to be accountable and to take down the materials.
Can you explain or enlighten us about what prevention mechanisms might actually be in place?
Steven Guilbeault (Lib., QC):
This is a very good question. My office and my department have also spoken with victims and victims' organizations. What we want to do with this legislation is to shift the burden of getting these images taken down, if we're referring to images that we would find on Pornhub, for example, from the individual to the state. It would be up to the Government of Canada, through a regulator, to do that, as it is in other countries, such as Australia, with their e-safety commissioner.
That's the goal we're pursuing with the tabling of this legislation. You are correct; we are also working to ensure that the images are not only taken down but removed from websites or associated websites, to prevent, for example, the download of such images. They should not be downloaded and uploaded, again and again, as we've seen in many cases.
Steven Guilbeault (Lib., QC):
Companies should abide by Canadian laws. Whether they're online companies or physical companies, there should be no distinction. As I said earlier, the challenge we face now is that the tools we have to deal with these online harms just aren't adapted to the virtual world.
Jacques Gourde (CPC, QC):
Minister, would it have been possible to include a provision in Bill C‑10 to regulate platforms like Pornhub so as to finally protect our children, who are going through unspeakable things right now?
Steven Guilbeault (Lib., QC):
Thank you for the question.
I find your question very cynical, as your party consistently opposes the passage of Bill C‑10, which is not about content moderation, but rather about web giants contributing to our cultural sector's artists and musicians.
Jacques Gourde (CPC, QC):
Thank you, Mr. Chair.
We have heard some very disturbing testimony about underage children being exploited by these platforms, and we need to take action. You told us you would put in place new legislation, which probably won't come into effect for a year or a year and a half. We need to move much, much faster than that. We live in a society where our children are currently not protected from the web giants.
How are you going to speed up the process? Why couldn't C‑10 close the loophole for now?
Steven Guilbeault (Lib., QC):
Once again, your party opposes the passage of Bill C‑10, which has nothing to do with content moderation, while the hate speech and online harm bill specifically addresses the issue of content moderation.
Yet you say you oppose content moderation. You and many of your colleagues say that the government wants to take away your freedom of expression. The exploitation of persons bill will ensure...
Han Dong (Lib., ON):
Thank you very much, Chair.
I want to thank you, Minister Guilbeault, for coming to the committee today and talking about a very important topic.
First of all, I want to go back to your opening statement. You cited an increase in xenophobic and Islamophobic behaviour and speech online over recent months. As a member of the Asian-Canadian community, I have observed and witnessed first-hand some of these intolerable behaviours online.
I have to say that the pandemic has changed people's social behaviour. More and more, people are spending time on social media. Then we have some of these bad actors using various platforms as a disguise and a shield, utilizing bots and trolls, and saying all kinds of things they otherwise wouldn't say in public.
You mentioned that children in the country are being victimized, and the platforms are not doing anything. That's precisely what we are talking about today.
We know that social media companies, including the one we are doing a study on, have been acting unilaterally and opaquely. Sometimes they introduce half measures after public pressure, but they haven't been serious about consulting industry experts or listening to the recommendations of victims' groups.
In your opinion, what can the web giants do to respect Canadians' will and Canadian law in terms of protecting the general public? It's in their best interest as well, because that's their audience and their client base. A small number of bad actors are contaminating the online environment.
Can you talk a little about that?
Steven Guilbeault (Lib., QC):
There are many elements in what you said.
First, I think one of the purposes of the legislation is to ensure more transparency on the part of the platforms in terms of their guidelines and practices regarding content moderation, because right now it's very uneven. Some companies have better content moderation practices than others, and some have very little. You're right—they are not transparent.
Some may have rejoiced in the decision of this platform or that platform to ban this user or that user, but under which criteria? Why them and not someone else? This is clearly something we want to tackle. Frankly, there is an issue where the very business model of some of the platforms is about creating controversy and fuelling hate speech and intolerance, because it creates more traffic on their platform. Therefore, they can sell more advertising and make more money.
As part of the legislation that will be tabled, this is also something that we as legislators will need to address.