Shimon Koffler Fogel
2021-06-16 17:03
With your permission, Mr. Chair, I'll quickly begin and make the following observation.
I think the pace of change in the landscape against which we're looking at these issues is breathtaking. The idea that it behooves us to review those instruments, policies, regulations and legislation that are currently in place on a regular basis is one that I think is self-evident.
We never would have thought, even two years.... I mean, smartphones only came into existence at the end of 2012. It's really only now that we're beginning to appreciate the power of social media as a vehicle either for good or, in this context, something very, very not good. So I think that it does behoove us to look at old legislation, old regulations and old approaches, and test them against the reality of today.
I'll also point out that, for example, in a concrete way, we're always trying to balance—and I know your committee is struggling with balancing—the issue of free speech with freedom from threat. Some of you will recall that there was a contentious debate about section 13. It was ultimately eliminated by the government of the day, because it is a two-edged sword. On the one hand it enshrines the notion we all believe in, which is freedom of expression. On the other hand, it's also been used as a way to insulate groups that are trying to foment hate with protection from the very thing we're trying to prevent.
It's adding work to your plate, but I think it behooves you to routinely build into legislation and recommendations a need for periodic review that would test the reality against what you are trying to achieve.
Shimon Koffler Fogel
2021-06-16 17:17
John, you'd like to keep me on mute.
Voices: Oh, oh!
The Chair: I know that's impossible, Shimon.
Mr. Shimon Koffler Fogel: Maybe Sameha has something else to offer.
I think there is no question that social media has changed everything. It has allowed for not just the flourishing but the explosion of hate that is insulated, protected, anonymous and enables people to act out their most vile thoughts. We have to come up with remedies that are calibrated to align with the potency that social media represents.
I don't think it's unique to Canada, but here's the thing: We have to be mindful. That's why I was so happy, Mr. Chair, that you were focusing more broadly and moving away from some terms, because what pose as specific threats to Jews may not pose the same threat to Muslims and may not pose the same threat to women or to indigenous people. We have to have instruments that are sufficiently malleable or flexible that they can address and include the whole range of threats that are out there and that are expressed on a common platform like social media.
Tako Van Popta
CPC (BC)
Thank you.
Two minutes is too short for this, but thank you to all three witnesses. Mr. Fogel from the Centre for Israel and Jewish Affairs, and Mr. Farooq and Ms. Omer from the National Council of Canadian Muslims, thanks for being with us and helping us through this very difficult conversation.
I'm going to ask a question that a couple of people attempted to ask and ran out of time, which is about balancing civil liberties and keeping Canadians safe, particularly when it comes to the Internet.
Mr. Fogel, I think it was you who said that we need new tools when it comes to regulating the Internet. I don't know if you were talking about criminal laws or civil remedies. Perhaps you could expand on that. What would civil remedies look like as far as that goes?
Shimon Koffler Fogel
2021-06-16 17:38
Thank you for the question.
I'll try to be really brief over here. It's a challenge for me.
I think one of the takeaways of this whole discussion is that to really address this effectively you need a whole-of-government approach. You have sister committees in Parliament that are looking at some of these questions. Online hate is something that the anti-racism secretariat has been focusing on a lot and providing some resources for stakeholders, such as the NCCM and us, to be able to explore remedies. Social media platforms have been brought in and not quite coerced, but encouraged, to take some ownership and to provide some of the solutions.
I don't know what all of the instruments will be. I know that for them to be effective it requires the buy-in from all of the stakeholders. That means government, communities and social service providers.
We have to distinguish between two groups. There are the vast bulk of Canadians who may be ignorant and insensitive to the impact of social media posts. They need to be educated. Then there are the marginal ones who have to be chased into the corner or prosecuted or somehow defanged, so that they don't constitute an ongoing threat.
Shimon Koffler Fogel
2021-06-16 17:40
I'm not by any means an expert in this area, but I do know this: Social media companies have the most sophisticated algorithms that exist. They do have the capacity to track, to monitor, to isolate and to pull out words, phrases and context. It's scary how much they're able to do. If there's the will to do it, there's the technological capacity to do it. It seems to me that the first order of business is to try to weed out all of those toxic sites, those conversations, those chat rooms and so forth, so that the individual has far-reduced options in terms of gravitating towards things that are toxic and hateful.
Kamal Khera
Lib. (ON)
Thank you, Mr. Chair.
Thank you to both of our witnesses for being here, but more importantly, for all the work you do.
Dr. Perry, I want to start off with you, and I want to talk about online hate.
I know you've teamed up with Facebook Canada to address instances of online hate. It is a topic that we've certainly discussed in committee. You have declared that online platforms have been a gift to alt-right groups known for spreading conspiracy theories via video clips.
Could you maybe expand a little bit on your findings and efforts in this area? How do we address promoting hatred on mainstream channels, as well as on underground networks, such as Parler and Gab?
Barbara Perry
2021-05-31 15:57
These are all very good questions. They're not easy questions by any stretch.
One of the most disturbing things we found in this round of work—the Institute for Strategic Dialogue is doing much of our online analysis—is that in two successive years, Canadian posters were among the most active within the far-right ecosystem, if you will.
Just quantitatively, that's problematic. We tend to think we are immune to those kinds of narratives, but there you are. In particular in the first round—that would have been the 2019 report that we did with ISD—we actually found that they were, in fact, second and third in two of the most extreme platforms, Fascist Forge and Iron March. These are the ones that are most likely to promote violence, and mass violence in particular.
Again, quantitatively, that is the problem, but it's also a problem qualitatively, given the breadth of the speech, the viciousness of the speech as it's directed towards particular individuals or particular communities, whether it's emails or posts directed towards an individual or it's those who vilify particular groups. It's rampant online, obviously.
I think we have to consider the impacts of this on a sense of community, a sense of belonging and a sense of security, as well. It is something that absolutely silences communities. It makes them less willing to engage online, which has become the way we communicate—especially now, with COVID.
How do we confront it and how do we regulate it? It's such a challenge. We've been exploring it globally over the last five or six years. We've been trying to constrain the most heinous sorts of speech.
When I'm talking about hate speech here, I'm talking about dangerous speech, speech that promotes violence, that explicitly promotes vilification and that directs hatred towards particular groups. Warman v. Kouba identified these sorts of elements of speech as the hallmarks of hate.
I think we need to put much more pressure on social media giants to enforce their community standards. Most of them are at least as strong as our own federal definitions. We need to encourage the actual use of those. I hear so many...from the research but also from the people I work with. They are identifying speech that seems to cross those boundaries, which.... There's no response to the complaints, so I think we need to hold their feet to the fire.
In terms of the alternative platforms, that's where the real challenge lies because access to the darkest spaces is more difficult for researchers, for police, for journalists and for anyone who wants to know what's happening there. There are challenges there because they're specifically set up to avoid any sort of community standards. Most of us are at a loss as to how to respond to those. Again, perhaps we put pressure on the domains to not host them, as happened with Parler. I think it was after the January 6 events.
I think that is a new challenge presenting itself.
Kristina Michaud
BQ (QC)
Thank you very much.
We know that extremist groups rely heavily on social networks and platforms, such as Facebook, Twitter and other platforms that have even been banned, to recruit people and to misinform and radicalize them. Some people believe that shutting down certain platforms would not be beneficial because it would send people to private networks on the Internet.
Even if it's not on these private networks and it's on the platforms that we know and access every day, how can the government and the RCMP intervene to detect this kind of violent extremism, whether it's violent speech or video sharing?
Should there be collaboration with the private companies that own these platforms, or could the government and RCMP intervene directly?
Michael Duheme
2021-05-12 17:20
I'll talk about what the RCMP can do with respect to websites.
The majority of the investigations we conduct into hateful comments spread on social networks are triggered when we receive reports from people who have observed this on a site and report it to us. In most cases, we trigger an investigation.
Of course, if the social networks remove the information without notifying us, we don't have access to that information. It's no different than when someone calls the police to make a report and the police initiate an investigation, except that it happens on social networks.
If the platforms remove this information without notifying us, we can no longer take informed action on the complaint.
Members of Parliament often receive derogatory or hateful messages on social media. In these cases as well, the RCMP initiates an investigation and we follow through. Sometimes that's a challenge because people can use all sorts of mechanisms on social media to avoid being found.
I won't hide from you that this is one of our concerns, and it's not just about social networks. When you implement a new law or a new process, people always find ways around that through other mechanisms.
You've all heard of the dark Web. There are probably already many IMVE groups on the dark Web.
Darren Fisher
Lib. (NS)
Thank you very much for that.
Someone claimed in testimony that two neo-Nazi groups no longer have an online presence. Mr. Harris touched on this as well.
Maybe you could reinforce whether there is a concern that these organizations could morph into something else or go deeper underground, because they don't give up on this level of hate that they have within themselves.
Dominic Rochon
2021-05-12 17:37
I'll take a stab at answering that.
I think my colleague from CSIS, Mr. Hahlweg, certainly did a good job of describing the dangers that of course will continue to happen once you start listing these entities. By default, listing them does enable social media platforms to remove these entities. What I mean by this is that they might have a social media presence in order to try to raise funds for their cause, for example. With their being listed, it allows social media platforms to say, “No, we're not going to be selling T-shirts to promote your particular ideology.” As such, they start removing that particular presence.
It doesn't mean that you're eradicating their presence in terms of their ability to propagate. I think it was my colleague from the RCMP, Monsieur Duheme, who mentioned that inevitably what they will do is revert to going to the dark web, or they will revert to going to encrypted channels or hidden channels to be able to continue to spread their rhetoric, but with that tool of a Criminal Code listing, at least they're not going to be able to do it as overtly.
As I said, though, Criminal Code listing is but one tool. It does help with certain aspects, but it does then push us further downstream to have to try to cope with some of the challenges of the spreading of their rhetoric in other avenues.