Mustafa Farooq
2021-06-16 16:26
Thirdly, I think we need to see robust online hate regulation that is balanced and that ensures the protection of civil liberties through consultation with the best experts in Canada and internationally.
Lastly, we'd like to see a review on how national security agencies have been dealing with neo-Nazi and white supremacist groups.
I also note that we will be providing a brief and follow-up to expand further on the recommendations.
Thank you.
Shimon Koffler Fogel
2021-06-16 16:27
Thank you, Mr. Chair, along with the members of the committee, for inviting our participation in this important discussion. My name is Shimon Fogel. I'm the president and CEO of the Centre for Israel and Jewish Affairs, the advocacy agent of the Jewish federations across Canada. We're a national non-partisan, non-profit organization representing more than 150,000 Jewish Canadians affiliated through Jewish federations from coast to coast. Our mission is to preserve and protect the quality of Jewish life in Canada through advocacy.
For Canada's Jewish community, the conversation about ideologically motivated violent extremism is inextricably linked with anti-Semitism. As I speak, Jewish Canadians are facing a dangerous rise in anti-Semitism across the country and, indeed, around the world. The UJA Federation of Greater Toronto, an organization that closely monitors the security situation of the Jewish community in the GTA, reported a fivefold spike in anti-Semitic incidents last month compared to previous months this year. In May, individuals who attended a peaceful pro-Israel rally in Montreal were pelted with rocks. Police seized weapons and made 15 arrests, including for armed assault. In April in Victoria, the words “Kill the Jews” and “Gas the Jews” were spray painted on a Jewish community institution. We also observed swastikas and Nazi symbols on banners at anti-Israel rallies in multiple cities. Jewish businesses were targeted across Canada by vandals and boycott campaigns.
In Canada, no one should ever feel that they're at risk in their own neighbourhood. No one should feel the need to hide their identity. No Canadian should be made to feel they do not belong, yet we have community members who are thinking twice before wearing a kippah or a Star of David necklace in public. This isn't the Canada we know or want.
In 2019, the most recent year for which Statistics Canada data are available, Jews were the most targeted religious group for police-reported hate crimes, and the second-most-targeted group overall. On average, an anti-Semitic incident happens nearly every day of the year. Though they comprise less than 1% of the Canadian population, Jewish Canadians accounted for 16% of all victims of hate crimes in 2019, a trend repeated year after year. This should be of grave concern to all Canadians.
Anti-Semitic incidents are also occurring online, in troubling numbers, where anti-Semitism and ideological extremism percolate and pose a threat to the well-being of all Canadians. As social media has become central to our daily lives, racist, xenophobic, misogynistic, anti-authoritarian and other hate-filled groups are exploiting platforms such as YouTube, Facebook, TikTok, Twitter and Instagram to spread their toxic ideals, often targeting our children and young adults. These vile groups are also active on Parler, 8chan and in other dark corners of the Internet, where they promote their hatred, radicalize and recruit Canadian youth.
We know from experience that this toxicity spread online can and too often does have real-world consequences. Online activities spurred murders of Jews in Pittsburgh and Muslims in Christchurch. The Pittsburgh shooter reportedly posted more than 700 anti-Semitic messages in hate-filled online communities over nine months prior to the attack. The Christchurch shooter's livestreaming of the killings was a means of promoting and inciting more such heinous acts.
While we welcome the addition of the Proud Boys to the list of terrorist entities, we believe more needs to be done. For some time, we have strongly encouraged the Government of Canada to list both the Iranian Revolutionary Guard Corps, in its entirety, and Samidoun, a PFLP-affiliated organization that operates right here in Canada.
However, we must disabuse ourselves of the idea that radicalization happens only with the support of an organized group. The proliferation of online content has empowered the so-called lone wolf. Radicalization can manifest remotely, circulating in chats and forums without the direct support or coordination of an organized group. This new threat also makes it even more difficult for police and security services to track suspicious activity. From what we understand of the horrific tragedy in London, the murderer acted independently and may have been radicalized as a lone wolf. The same is true of the 2018 Toronto van attack.
Anti-Semitism is not associated solely with ideologically motivated violent extremists. While Jew hatred is central to many xenophobic belief systems such as neo-Nazism and white supremacy, anti-Semitism is also a key component in both religiously motivated violent extremism and in politically motivated violent extremism. Anti-Semitism is a hatred that does not live in a single category. It finds purchase in all three.
What most people may not appreciate is that anti-Semitism is a threat not only to Jews, but also to all Canadians and to our way of life. Combatting anti-Semitism benefits all of us, and we need to call it out whenever and wherever we see it, because what starts with Jews never ends with Jews.
Jewish Canadians value our just, liberal democratic society. There has been a lot of discussion about the role of law enforcement. From our perspective, we believe a well-educated and well-resourced police force is an essential component in fighting hate crime.
Let me conclude, therefore, by providing five recommendations for the committee's consideration.
First, we recommend that law enforcement be given the tools they need to combat hate and radicalization, including bolstering existing police hate crime and community liaison units, and providing funding to establish new units where they do not yet exist. This includes increasing resources for security services to monitor, track and protect Canadians from online radicalization.
Second, we recommend increasing resources for law enforcement, Crown attorneys, judges and others to ensure they receive sufficient training on the importance of combatting online hate.
Third, we also recommend strengthening legislation to combat online hate, including developing a multipronged approach to raise awareness of online hate, adopting civil remedies to combat online hate, and establishing requirements for online platforms and Internet service providers for monitoring and addressing online hate on their own platform.
Fourth, we believe that funding for the security infrastructure program, SIP, should be increased. This program allows at-risk private not-for-profit organizations, such as places of worship and educational institutions, to enhance their security. To quickly illustrate the value of the program, a security guard at Congregation Shaar Hashomayim in Montreal was able to thwart an arson attack on the synagogue because of the surveillance cameras funded in part by the program.
Finally, we recommend Canada establish a community institution security rebate. As one of the groups most targeted by hate-motivated crime, Jewish institutions spend millions of dollars every year on security personnel. We recommend that the federal government implement a security rebate for at-risk places of worship, schools and community centres.
In conclusion, Mr. Chair and committee members, even though the Jewish community is resilient, we too feel vulnerable at the moment and we are respectfully asking you to take action. What we have proposed will not only serve the Jewish community, but it will benefit all Canadians. History has taught us repeatedly that if left unchecked, the toxin of anti-Semitism can poison all of us.
Thanks for inviting me here today.
Shimon Koffler Fogel
2021-06-16 17:03
With your permission, Mr. Chair, I'll quickly begin and make the following observation.
I think the pace of change in the landscape or backdrop against which we're looking at these issues is breathtaking. The idea that it behooves us to review on a regular basis those instruments, policies, regulations and legislation that are currently in place is one that I think is self-evident.
We never would have thought, even two years.... I mean, smart phones only came into existence at the end of 2012. It's really only now that we're beginning to appreciate the power of social media as a vehicle either for good or, in this context, something very, very not good. So I think that it does behoove us to look at old legislation, old regulations and old approaches, and test them against the reality of today.
I'll also point out that, for example, in a concrete way, we're always trying to balance—and I know your committee is struggling with balancing—the issue of free speech with freedom from threat. Some of you will recall that there was a contentious debate about section 13. It was ultimately eliminated by the government of the day, because it is a two-edged sword. On the one hand it enshrines the notion we all believe in, which is freedom of expression. On the other hand, it's also been used as a way to insulate groups that are trying to foment hate with protection from the very thing we're trying to prevent.
It's adding work to your plate, but I think it behooves you to routinely build into legislation and recommendations a need for periodic review that would test the reality against what you are trying to achieve.
Mustafa Farooq
2021-06-16 17:06
I'll say briefly that I think Shimon is right, as I think he often is, on the critical tension here between the desire to protect folks and those critical constitutional values that we uphold and know must be upheld.
I think those are exactly the kinds of reasons that we had concerns around overly broad language vis-à-vis terrorist propaganda. We were pleased to see that the most recent iteration of legislation narrowed it down to a more focused “counselling” offence. We thought that was important.
From our perspective, we want to see the legislation applied equally, but that's not the same as seeing.... While white supremacist terrorist groups should be dealt with appropriately through the listing provisions that are there, we have to be careful about overexpanding our Criminal Code, especially around the terrorism sections. I think there are existing tools that need to be utilized, and if there are other ways of approaching white supremacist groups, such as the creation of new listing procedures, I think that could be done outside of the precise mechanics of terrorism legislation, which, of course, carries with it a whole regulatory and legislative set of considerations.
Steven Guilbeault
Lib. (QC)
Thank you, Mr. Chair.
Mr. Chair, members of the committee, good morning.
I would first like to acknowledge that I am joining you from Montreal, on the traditional territory of the Mohawk and other Haudenosaunee peoples.
Thank you for inviting me to speak to you today. With me, as you said, are Joëlle Montminy, senior assistant deputy minister, cultural affairs, and Pierre-Marc Perreault, acting director, digital citizen initiative.
Like you and many other Canadians, I am concerned by the disturbing rise and spread of hateful, violent and exploitive content online and on social media.
As a legislator and father of four children, I find some of the content on these platforms to be profoundly inhuman.
I am also deeply troubled by the consequences and the echoes of that content in the real world.
The overall benefits of the digital economy and social media are without question. In fact, I published a book, shortly before I took up politics, wherein I talked about the benefits of the digital economy, of artificial intelligence in particular, but also about some unintended negative consequences.
In Canada, more than 9 out of 10 adults use at least one online platform, and since the beginning of the pandemic, online platforms have played an even more important role in our lives.
We use social media platforms like Facebook, Twitter, Instagram and YouTube to stay connected to our families, friends and colleagues. We use them to work, to conduct business, to reach new markets and audiences, to make our voices and opinions heard, and to engage in necessary and vital democratic debate. However, we have also seen how social media can have negative and very harmful impacts.
On a daily basis, Internet users share damaging content, whether hate speech, material depicting the sexual exploitation of children, terrorist propaganda, or words meant to incite violence.
This content has led and contributed to violent outbursts such as the attack on the Islamic Cultural Centre in Quebec City in 2017, and similar attacks in Christchurch, New Zealand, in 2019.
Canadians and people all over the world have watched these events and others unfold on the news with shock and fear. We all understand the connections between these events and hateful, harmful online discourse. We worry about our own safety and security online. We worry about what our children and our loved ones will be exposed to.
According to a recent poll by the Canadian Race Relations Foundation, an overwhelming 93% of Canadians believe that online hate and racism are a problem, and at least 60% believe that the government has an obligation to prevent the spread of hateful and racist content online.
In addition, the poll revealed that racialized groups in Canada are more than three times more likely to experience racism online than non-racialized Canadians.
Since the beginning of the COVID‑19 pandemic, we have seen a rise in anti-Asian hate speech on the Internet and a steady increase in anti-Semitic rhetoric, further fuelled by recent events.
A June 2020 study by the Institute for Strategic Dialogue found that Canadians use more than 6,600 online services, pages and accounts hosted on various social media platforms to convey ideologies tinged with white supremacism, misogyny or extremism. This type of content wreaks havoc and destroys lives. It is intimidating and undermines constructive exchange. In doing so, it prevents us from having a true democratic debate and undermines free speech.
The facts speak for themselves. We must act, and we must act now. We believe that every person has the right to express themselves and participate in Internet exchanges to the fullest extent possible, without fear and without intimidation or concern for their safety. We believe that the Internet should be an inclusive place where we can safely express ourselves.
Our government is therefore committed to taking concrete steps to address harmful content online, particularly if the content advocates child sexual exploitation, terrorism, violence, hate speech, and non-consensual sharing of intimate images.
In fact, this is one of the priorities outlined in the mandate letter given to me by Prime Minister Justin Trudeau. So we have begun the process to develop legislation that will address the concerns of Canadians.
Over the past few months my office and I have engaged with over 140 stakeholders from both civil society organizations and the digital technology sector regarding this issue. This has included seven round-table discussions. We also spoke with indigenous groups, racialized Canadians, elected provincial officials, municipal officials and our international partners to assess our options and begin to develop a proposed approach.
In addition, given the global nature of the problem, I have hosted a virtual meeting with my counterparts from Australia, Finland, France and Germany—who were part of the multi-stakeholder working group on diversity of content online—to discuss the importance of a healthy digital ecosystem and how to work collectively.
I am also working closely with my colleagues the ministers of Justice; Public Safety; Women and Gender Equality; Diversity and Inclusion and Youth; and Innovation, Science and Industry to find the best possible solution.
Our collaborative work aims to ensure that Canada's approach is focused on protecting Canadians and continued respect for their rights, including freedom of opinion and expression under the Charter of Rights and Freedoms. The goal is to develop a proposal that establishes an appropriate balance between protecting speech and preventing harm.
Let me be clear. Our objective is not to reduce freedom of expression but to increase it for all users, and to ensure that no voices are being suppressed because of harmful content.
We want to build a society where radicalization, hatred, and violence have no place, where everyone is free to express themselves, where exchanges are not divisive, but an opportunity to connect, understand, and help each other. We are continuing our work and hope to act as quickly and effectively as possible. I sincerely hope that I can count on the committee's support and move forward to build a more transparent, accountable and equitable digital world.
I thank you for your attention and will be happy to answer any questions you may have.
Francesco Sorbara
Lib. (ON)
Thank you, Minister.
I have a follow-up question on what we are seeing in terms of some content that is being posted online and its negative impact on various communities.
With that, communities across Canada are extremely worried about the rise of Islamophobia, hate speech online, as you just mentioned, towards our indigenous communities, and other forms of prejudice that have only intensified during this pandemic. We've all seen that words can lead to violence.
As parliamentarians, we recognize that we all have a duty to lead by example; that is to say, to engage in respectful dialogues, to be open to debates of ideas and to hear the positions of Canadians in order to work for a society where everyone is free to flourish with dignity.
Minister, can you tell us more about what our government is doing to fight the promotion of hatred and violence online?
Thank you.
Steven Guilbeault
Lib. (QC)
This is really an important point. There are some people out there—a minority, clearly—who would advocate that we shouldn't intervene and that there should be no laws whatsoever regarding the Internet in any way. What happens on the Internet stays on the Internet. Well, it's clearly not the case.
In June 2020, the Institute for Strategic Dialogue published a report on right-wing extremism in Canada, as I said earlier, identifying more than 6,000 right-wing extremist channels, pages, groups and accounts. Since 2014, Canadians—inspired in whole or in part by extreme views they've gathered online—have killed 21 people in this country and wounded 41. This idea that this stays on the Internet is simply false.
Notwithstanding that, we haven't waited until the introduction of this legislation. For two years now, we have been funding an initiative called the digital citizenship initiative, whereby we're working with victims groups and with academics around the country to increase the level of online literacy for Canadians, to help them detect false news and to help them recognize hate speech and extremist groups online.
Kamal Khera
Lib. (ON)
Thank you, Mr. Chair.
Thank you to both of our witnesses for being here, but more importantly, for all the work you do.
Dr. Perry, I want to start off with you, and I want to talk about online hate.
I know you've teamed up with Facebook Canada to address instances of online hate. It is a topic that we've certainly discussed in committee. You have declared that online platforms have been a gift to alt-right groups known for spreading conspiracy theories via video clips.
Could you maybe expand a little bit on your findings and efforts in this area? How do we address promoting hatred on mainstream channels, as well as on underground networks, such as Parler and Gab?
Barbara Perry
2021-05-31 15:57
These are all very good questions. They're not easy questions by any stretch.
One of the most disturbing things we found in this round of work—the Institute for Strategic Dialogue is doing much of our online analysis—is that in two successive years, Canadian posters were among the most active within the far-right ecosystem, if you will.
Just quantitatively, that's problematic. We tend to think we are immune to those kinds of narratives, but there you are. In particular in the first round—that would have been the 2019 report that we did with ISD—we actually found that they were, in fact, second and third in two of the most extreme platforms, Fascist Forge and Iron March. These are the ones that are most likely to promote violence, and mass violence in particular.
Again, quantitatively, that is the problem, but it's also a problem qualitatively, given the breadth of the speech, the viciousness of the speech as it's directed towards particular individuals or particular communities, whether it's emails or posts directed towards an individual or it's those who vilify particular groups. It's rampant online, obviously.
I think we have to consider the impacts of this on a sense of community, a sense of belonging and a sense of security, as well. It is something that absolutely silences communities. It makes them less willing to engage online, which has become the way we communicate—especially now, with COVID.
How do we confront it and how do we regulate it? It's such a challenge. We've been exploring it globally over the last five or six years. We've been trying to constrain the most heinous sorts of speech.
When I'm talking about hate speech here, I'm talking about dangerous speech, speech that promotes violence, that explicitly promotes vilification and that directs hatred towards particular groups. Warman v. Kouba identified these sorts of elements of speech as the hallmarks of hate.
I think we need to put much more pressure on social media giants to enforce their community standards. Most of them are at least as strong as our own federal definitions. We need to encourage the actual use of those. I hear so many...from the research but also from the people I work with. They are identifying speech that seems to cross those boundaries, which.... There's no response to the complaints, so I think we need to hold their feet to the fire.
In terms of the alternative platforms, that's where the real challenge lies because access to the darkest spaces is more difficult for researchers, for police, for journalists and for anyone who wants to know what's happening there. There are challenges there because they're specifically set up to avoid any sort of community standards. Most of us are at a loss as to how to respond to those. Again, perhaps we put pressure on the domains to not host them, as happened with Parler. I think it was after the January 6 events.
I think that is a new challenge presenting itself.
Pam Damoff
Lib. (ON)
Thank you, Chair.
I'd like to thank all of our witnesses for being here today, especially on such short notice. Your testimony is very valuable to us.
My first question is for CSIS.
You mentioned online hatred and the prevalence of “echo chambers of hate”, whereby mobilization to violence can occur quite rapidly. The National Firearms Association is a group that shares offensive images online and has shared tweets that have been sympathetic to groups alleged to have IMVE affiliation. In one of them, the tweet said, “If the police will not protect you during a violent riot, you will have to protect yourself and others”.
I have personally been the subject of their comments. Recently, this committee voted to condemn remarks made by the group that discussed guillotining parliamentarians who support gun control, describing what is happening in Canada as “tyranny”.
My question for you is straightforward. We've seen far too many examples where language is later masked as jokes and then turned into real-world violence, either by those making the remarks or those following. I'm just wondering; what impact do these kinds of comments have on individuals who may be radicalized by them and should we be calling it out for what it is?
Timothy Hahlweg
2021-05-12 17:11
That's an excellent question, and I think it would be useful at this time to give a snapshot of how we investigate in this space, from a CSIS perspective. I think it will help articulate the space we hold vis-à-vis other people in this landscape.
The way that we look at it organizationally is really in three tiers.
We have the first tier, which is passive engagement. There are a lot of books out there, and there are videos and chat rooms. A lot of people are listening to some of this violent, abhorrent content, but these people are passive. They're not moving to violence at this stage.
When those individuals move to our second tier of threat actions, it is a more active engagement. This is where we're seeing people not just listening but putting some propaganda out there. They're adding content, communicating and letting their voices be known. A lot of this still falls in with freedom of speech, but some of it starts to bleed into what is the third tier. That's where the service gets involved.
The third tier sees these people mobilizing to violence or potentially mobilizing to violence. In the third tier, we're seeing a lot of increased operational security by these individuals. They're not staying in the open. They're going into more private chat rooms and more encrypted forums. We're seeing them go to a lot of alternative platforms. When we look at this third tier, from a service perspective it's really important that we look at what triggers the CSIS mandate. We have done a lot of work in this space over the last couple of years with our partners in the S and I community.
What do we require to actually investigate these threats? We need a willingness to kill or inspire others to kill; a threat of serious violence; an attempt to effect societal change, so not just a personal narrative but something bigger; and an ideological influence. Once we have those triggers, we're able to investigate these threats. We deconflict on a regular basis with our police colleagues, especially the RCMP, and then we decide who's best positioned to deal with them.
I hope that answers your question.
Pam Damoff
Lib. (ON)
It does, sort of.
I'm going to turn to the RCMP, in a similar vein. There has been rampant growth of this type of content online, and you remarked that you were gravely concerned with extremist views that are first fostered online and can lead to and have led to actual physical violence. Our colleague at CSIS listed a number of cases that did result in injury and death.
Who is being targeted? Do you see this being race- and gender-based hatred? Are you seeing it tied to these anti-mask rallies, where we're seeing neo-Nazi flags being flown?
Michael Duheme
2021-05-12 17:15
What we're seeing is that vulnerable groups, as I'll call them, are more targeted than the general population. It's important to note that we make a distinction between IMVE and hate-motivated crime. We're dealing with a lot of hate-motivated crime and with comments that are covered under the Criminal Code of Canada. There is a difference there. There are specific sections in the code to deal with hate-motivated crime. On the other side, as Tim mentioned earlier, with IMVE there's a deep-rooted ideology that's more complex than hatred alone.
I don't have any information to say there are links with the different flags being shown at protests. We take every complaint seriously and investigate every complaint that is reported to us.
Mark, I'm not quite sure if there is anything you wish to add.
Kristina Michaud
BQ (QC)
Thank you very much.
We know that extremist groups rely heavily on social networks and platforms, such as Facebook, Twitter and other platforms that have even been banned, to recruit people and to misinform and radicalize them. Some people believe that shutting down certain platforms would not be beneficial because it would send people to private networks on the Internet.
Even if it's not on these private networks and it's on the platforms that we know and access every day, how can the government and the RCMP intervene to detect this kind of violent extremism, whether it's violent speech or video sharing?
Should there be collaboration with the private companies that own these platforms, or could the government and RCMP intervene directly?
Michael Duheme
2021-05-12 17:20
I'll talk about what the RCMP can do with respect to websites.
The majority of the investigations we conduct into hateful comments spread on social networks are triggered when we receive reports from people who have observed this on a site and report it to us. In most cases, we trigger an investigation.
Of course, if the social networks remove the information without notifying us, we don't have access to that information. It's no different than when someone calls the police to make a report and the police initiate an investigation, except that it happens on social networks.
If the platforms remove this information without notifying us, we can no longer take informed action on the complaint.
Members of Parliament often receive derogatory or hateful messages on social media. In these cases as well, the RCMP initiates an investigation and we follow through. Sometimes that's a challenge because people can use all sorts of mechanisms on social media to avoid being found.
I won't hide from you that this is one of our concerns, and it's not just about social networks. When you implement a new law or a new process, people always find ways around that through other mechanisms.
You've all heard of the dark Web. There are probably already many IMVE groups on the dark Web.