JUST Committee Report

TAKING ACTION TO END ONLINE HATE

Chapter 1—Context of the Study

Hate speech is not only used to justify restrictions or attacks on the rights of protected groups on prohibited grounds … hate propaganda opposes the targeted group’s ability to find self-fulfillment by articulating their thoughts and ideas. It impacts on that group’s ability to respond to the substantive ideas under debate, thereby placing a serious barrier to their full participation in our democracy. Indeed, a particularly insidious aspect of hate speech is that it acts to cut off any path of reply by the group under attack. It does this not only by attempting to marginalize the group so that their reply will be ignored: it also forces the group to argue for their basic humanity or social standing, as a precondition to participating in the deliberative aspects of our democracy.

Saskatchewan (Human Rights Commission) v. Whatcott, [2013] 1 SCR 467

With the rise of hate crimes reported to the police and the use of online platforms to promote hatred, several groups have requested that this issue be studied by Parliament.[1] Recent events in Canada and abroad have shown that online hate can have serious consequences and often precedes acts of violence. It is imperative that all governments around the world effectively address both online and offline acts of hatred. Government responses must strike the right balance between protected rights and freedoms.

In March 2019, the House of Commons Standing Committee on Justice and Human Rights (the Committee) decided to undertake a study on online hate.[2] The Committee focused on a number of solutions, including, but not limited to, potential amendments to the Canadian Human Rights Act,[3] the Criminal Code,[4] or any other act of Parliament that could help stem the propagation of hateful acts and the incitement of hatred on online platforms.

In April and June 2019, the Committee held seven meetings to hear evidence from a wide variety of stakeholders.[5] This report presents the main concerns raised during this study and the Committee’s recommendations to address this important issue and prevent all forms of hatred motivated by race, national or ethnic origin, colour, religion, age, sex, sexual orientation, gender identity or expression, marital status, family status, genetic characteristics, and disability. None of the recommendations presented in this report derogates from an individual’s constitutional right to freedom of expression protected under section 2(b) of the Canadian Charter of Rights and Freedoms.

The Committee appreciates the expertise and time provided by all the witnesses who participated in this study.

Chapter 2—The Use of Online Platforms to Promote Hatred

The Internet and online platforms offer many opportunities and are beneficial to society in general. They offer new avenues for free expression and bring “tremendous benefits in promoting knowledge and in sharing and facilitating connections.”[6] The Internet has also “become an important part of helping LGBTQ2SI individuals find or construct their identities.”[7]

Despite all of these benefits and opportunities, there was consensus among witnesses that online platforms and the Internet are being used to spread hate[8] and to radicalize, recruit and incite people to hate.[9]

On social media and the internet, troubled people can find dark spaces to trade their prejudicial views and to embolden each other in hostile intentions. Finding ways to discourage, shut-down and prevent such spaces is a vital aspect of upholding human rights and of creating safe communities.[10]

As stated by Alex Neve from Amnesty International Canada, “[t]he rise of hate-based and hate-fuelled discrimination is on the rise everywhere, often made easier—or at least more obvious—by the new and accessible channels the online world offers.”[11]

Several factors contribute to the spread of online hate, such as the possibility of hiding “behind a veil of anonymity”,[12] easy “access to an audience”,[13] and easy access to hate content.[14] Witnesses also indicated that since online platforms provide a wide audience, hateful ideas that would be considered serious in the real world appear to be validated online and then become normalized.[15] According to the former President of the Centre culturel islamique de Québec, Mohamed Labidi, “[u]nfortunately, we're witnessing a form of impunity online.”[16]

While online hate may be trivialized by some people and not taken seriously enough by online platforms and Internet service providers, it still constitutes hate and has devastating consequences for its victims. Often, victims are subject “to humiliation and degradation, resulting in grave psychological and social consequences.”[17] Online hate “undermines the well-being and sense of security of victims” as well as their “sense of belonging.”[18] More generally, it increases discord in society and contributes to the marginalization of certain groups “by convincing listeners of the inferiority of the targeted group.”[19] As noted by Bradley Galloway from the Organization for the Prevention of Violence, “[t]he perpetuation of associated rhetoric can create an environment where discrimination, harassment and violence are viewed by individuals as not only a reasonable response or reaction but also as a necessary one.”[20] Online hate also contributes to the radicalization of people and “leads to the risk that sympathizers of hate speech will take action.”[21] As explained by Professor Jasmin Zine, “[o]nline hate propagation creates an ideological breeding ground to inspire terrorists.”[22]

Throughout the study, several witnesses pointed out the link between online hate and real-life violence, as revealed yet again by recent horrific hateful attacks on different groups.[23]

As you all well know, recent years have seen a proliferation of extreme forms of hatred in online fora that encourage violence and dehumanize those who are the targets of this hate. Recent high-profile violent attacks in Canada and abroad have emphasized the reality that these sentiments do not remain online, but have tragic offline consequences as well, and that they are in need of immediate and sustained attention.[24] Anglican Church of Canada
As these horrific attacks demonstrate, hate can be lethal, and online hate can foreshadow mass violence. There is no question that the Internet has become the newest frontier for inciting hate that manifests itself disturbingly offline.[25] Canadian Rabbinic Caucus
Often, the perpetrators of this violence have been radicalized by online influences, or they have discovered a like-minded online community and through it find validation for their specific personal bigotry and hatred.[26] Presbyterian Church in Canada

As rightfully pointed out by Shimon Koffler Fogel from the Centre for Israel and Jewish Affairs, “[w]e cannot afford to be complacent, given the link between online hate and real world violence.”[27] Similarly, it was noted that “[t]he audacity and frequency with which people now spew hate online shows us that we are failing in how our system currently combats online hate.”[28] Throughout the study, witnesses stressed that we must recognize the urgent need for governments, civil society, online platforms and Internet service providers to take the necessary measures to counter the incitement of hatred through online platforms.[29]

Chapter 3—Striking the Right Balance Between Different Rights and Freedoms Protected by the Charter

The issue of online hate brings “into sharp focus two crucially important human rights matters.”[30] Witnesses spoke about the importance of striking the right balance between the rights and freedoms protected by the Canadian Charter of Rights and Freedoms (the Charter):[31]

We are dealing with two competing imperatives. On the one hand is the desire to ensure that people can avail themselves of the freedom to express thoughts and ideas freely, without fear of persecution or prosecution however odious those ideas might be. On the other hand, unlike our American cousins, we recognize that there is a limit to freedom of expression. When it begins to encroach on the safety and security and well-being of others, that really constitutes a red line.[32] Centre for Israel and Jewish Affairs
Ensuring that there are meaningful protections against online hate and harassment, while also maintaining our commitment to the fundamental Canadian value of freedom of expression, is both difficult and of utmost importance.[33] Egale Canada Human Rights Trust
Any attempts to regulate online hate will inevitably bump against freedom of expression, because contrary to what some say, the precise contours of hate speech are not easily discerned.[34] Canadian Civil Liberties Association

The importance of freedom of expression and the need to protect that freedom was raised repeatedly during the study. Freedom of expression is a core Canadian value, crucial for different reasons including that it “provides the avenues for exposing and addressing injustice and for evolving our understanding about society and democracy and the environment in a way that makes for a better world.”[35] Freedom of expression must allow all citizens to “feel free to speak about all public policy issues as best they can.”[36] Although restrictions to free speech such as “libel, impersonation, threats and incitement to violence”[37] exist, some witnesses explained that limitations to freedom of expression must only be applied “when necessary for the respect of the rights of others”:[38] any exception to free speech “must be limited, well-defined and serve the public interest.”[39] Similar comments were made by Akaash Maharaj from the Mosaic Institute who stated that “any abridgment of freedom of expression must … be only the barest minimum necessary to protect and preserve the dignity and security of citizens.”[40] As noted during the study, the Supreme Court of Canada “has made clear that only a very narrow interpretation is appropriate, in recognition of the fact that a broad restriction on hateful content would unduly or unreasonably limit freedom of expression.”[41]

Because expression is sometimes used “to threaten the most marginalized members of our society”[42] and considering the grave consequences for its victims, including the infringement of their rights and freedoms,[43] several witnesses were of the view that reasonable restrictions to freedom of expression are needed.[44] Such restrictions are necessary “to protect Canadians from those who wilfully promote hate propaganda and seek to radicalize vulnerable individuals.”[45] Several witnesses also noted that “to stop the spread of hate speech”,[46] there is a need for criminal consequences.[47]

“Free speech is not an unbridled right.”[48] When there is a conflict between the rights and freedoms protected by the Charter, they must be balanced. As reiterated during this study, rights and freedoms are subject to “reasonable limits prescribed by law as can be demonstrably justified in a free and democratic society.”[49]

Some witnesses specified that restrictions to freedom of expression are not about policing “distasteful speech” and that they should focus on “combatting online hate.”[50] “It is vital to differentiate between the legitimate dissent that may include unpopular or controversial views, and speech acts that incite hatred and create poisoned and threatening environments.”[51]

Witnesses warned the Committee that we must stop people from trying to legitimize hate speech by using freedom of expression as a disguise.[52]

Chapter 4—Current Hate Crime Legislation

The Criminal Code contains four offences specifically related to hate crimes, as shown in Table 1 below.

Table 1—Offences Specific to Hate Crimes in the Criminal Code

Section 318: Advocating or promoting genocide.

  • Penalty: an indictable offence with a maximum term of imprisonment of five years.
  • The initiation of proceedings under this offence requires the consent of the Attorney General (section 318(2)).

Section 319(1): Inciting hatred against any identifiable group (see note a), by communicating (see note b) statements in any public place, where such incitement is likely to lead to a breach of the peace.

  • Penalty: either an indictable offence with a maximum term of imprisonment of two years, or an offence punishable on summary conviction with a fine of not more than $5,000, a maximum term of imprisonment of six months, or both.

Section 319(2): Wilfully promoting hatred against any identifiable group, by communicating statements, other than in private conversation.

  • Penalty: either an indictable offence with a maximum term of imprisonment of two years, or an offence punishable on summary conviction.
  • The initiation of proceedings under this offence requires the consent of the Attorney General (section 319(6)).

Section 430(4.1): Committing mischief motivated by bias, prejudice or hate based on colour, race, religion, national or ethnic origin, age, sex, sexual orientation, gender identity or expression or mental or physical disability in relation to, among others, a building used for religious worship or a building used by an identifiable group as an educational institution.

  • Penalty: either an indictable offence with a maximum term of imprisonment of 10 years, or an offence punishable on summary conviction with a maximum term of imprisonment of 18 months.

Notes:    a.     “Identifiable group” is defined by section 318(4) of the Criminal Code as “any section of the public distinguished by colour, race, religion, national or ethnic origin, age, sex, sexual orientation, gender identity or expression, or mental or physical disability.”

b.     “Communicating” is defined by section 319(7) of the Criminal Code as including “communicating by telephone, broadcasting or other audible or visible means.”

In addition to these specific offences, the Criminal Code recognizes that the commission of other offences of broader application (e.g., assault, mischief, uttering threats and harassment) can also be motivated by hate. In such cases, hate becomes an aggravating factor to be considered by the court at the time of sentencing. Pursuant to section 718.2(a)(i) of the Criminal Code, when imposing a sentence, the court shall consider evidence “that the offence was motivated by bias, prejudice or hate based on race, national or ethnic origin, language, colour, religion, sex, age, mental or physical disability, sexual orientation, or gender identity or expression, or on any other similar factor.”

Throughout this study, several witnesses expressed the need for more effective application of these Criminal Code provisions, stressing, for example, that online hatred must “not go unchallenged.”[53] A number of options were presented to the Committee to support a “more robust consistent use” of these provisions:[54]

  • That enhanced training and support be provided to police, Crown prosecutors and judges.[55]
  • That law enforcement agencies be provided with the necessary resources and tools to prevent and fight hate speech.[56] It was suggested, for example, that more law enforcement hate crime units be created in major cities.[57]
  • That direction be given to law enforcement, public prosecutors and attorneys general to be “much more aggressive and active in applying the provisions of the Criminal Code.”[58]
  • That the federal government put forth a national strategy “for more effective enforcement of existing laws regarding the public incitement of hatred, with particular attention given to the ways these attitudes are expressed online.”[59]
  • That law enforcement “make hate-motivated cyber-attacks or website-hacking a priority” and that Canada “work with the international community to bring the perpetrators of these incidents to justice, whether or not the perpetrators are physically located in Canada.”[60]
  • That directives be developed to guide attorneys general in the exercise of the consent required to initiate proceedings under sections 318 (i.e. advocating or promoting genocide), 319(2) (i.e. wilfully promoting hatred against an identifiable group by communicating statements, other than in private conversation) and 320 (warrant of seizure) of the Criminal Code, so that the provisions are applied more consistently.[61] It was raised that the requirement for the consent of the attorney general may place “undue limits on the prosecution of online hate crimes,”[62] and that removing such a requirement “could increase the ability of police to pursue, without delay, action to stop such crimes from happening.”[63] According to Richard Warman, however, removing this requirement “would inevitably result in a new wave of constitutional challenges”.[64]

Finally, some witnesses suggested repealing section 319(3)(b) of the Criminal Code, which exempts “a person who would otherwise be subject to an indictable offence, if their hate speech is 'based on a belief in a religious text'.”[65] According to the Canadian Secular Alliance, this exemption “is a clear violation of the principle of state neutrality in matters of religion.”[66]

Chapter 5—Available Data on Hate Crimes And Hate Incidents in Canada

5.1 Overview of Hate Crimes in Canada

Since 2009, police services across the country have reported between 1,167 and 2,073 hate crimes each year in Canada.[67] Police-reported hate crimes increased significantly in 2017, “up 47% over the previous year, largely the result of an increase in hate-related property crimes, such as graffiti and vandalism.”[68]

More precisely, police reported a total of 2,073 hate crimes in 2017, which represents 664 more hate crimes than in 2016. The overall increase was mainly due to a rise in hate crimes motivated by religion (83%) or race or ethnicity (32%).[69]

  • Hate motivated by religion: “Hate crimes against all religions saw increases [in 2017].”[70] However, hatred against the Muslim population registered the highest increase in 2017, from 139 to 349 crimes (151%).[71] Hate crimes against the Jewish population rose from 666 to 878 crimes in 2017, representing a 63% increase from 2016.
  • Hate motivated by race or ethnicity: The 32% growth of hate crime motivated by race or ethnicity in 2017 “was the result of 107 more hate crimes targeting the Black population (+50%) and 30 more [crimes] targeting the Arab and West Asian population.”[72]
  • Hate motivated by sexual orientation: Police-reported hate crimes targeting sexual orientation rose by 16% in 2017, from 176 crimes in 2016 to 204 in 2017. These crimes tend to be violent. As noted by Jennifer Klinck from Egale Canada Human Rights Trust, “online hate is of significant concern to the LGBTQ2SI community, because people are committing ever more acts of hate against us, and, all too often, those who hate us want to hurt and kill us.”[73]

Of all hate crimes reported by police in 2017, 43% were motivated by hatred of a race or ethnicity,[74] 41% were against a religion and 10% targeted sexual orientation.

5.2 Unreported Hate Crimes

Crimes reported by the police each year only include those that come to their attention and that are substantiated through a police investigation. Regarding hate crime, the Committee was told that Statistics Canada estimates that two out of three victims do not report to police.[75] As noted, for example, by Professor Jasmin Zine from Wilfrid Laurier University, not all people targeted by hate feel comfortable reporting to the police. Speaking about cyber-related hate crimes, she noted:

We know these cases are under-reported. Stats Canada revealed that between 2010 and 2017, police reported 374 cases of cyber-related hate crimes, but we know there are far more than that. There needs to be an empowering of the vulnerable and marginalized communities experiencing this to be able to bring their cases forward, to be heard and to have swift action based on that.[76]

Shalini Konanur from the South Asian Legal Clinic of Ontario indicated that “it can be reasonably difficult for racialized persons who have experienced being targeted by the police … to then have to seek assistance from members of that same force.”[77] Similarly, Ricki Justice from the Pride Centre of Edmonton noted that within the LGBTQ2S+ community “people are hesitant to report online hate because of a fear of police and their systemic mistreatment historically, so they don't come forward.”[78] It was also suggested that newcomers often do not feel comfortable reporting hate incidents to police.

[W]e find that many newcomers may not feel comfortable reporting any form of crime, let alone online hate, for various reasons. For example, they might feel that their engaging with enforcement in any way—even if it is a reportable crime—may jeopardize their citizenship application or PR status. They may not trust the police, or they may not understand what constitutes hate speech and not know that it is something that is reportable. They may believe that if hate speech is in a non-official language, it does not count as a crime in Canada and local law enforcement will not take it seriously. Some do not understand the process of reporting online hate and what happens afterwards. They may not believe that reporting it may make a difference, or they may feel that they are just causing problems by reporting a hate crime, especially if it is being perpetrated by a member of their own community.[79]

To facilitate the reporting of these crimes, Shahen Mirakian from the Armenian National Committee of Canada suggested that information be “provided to community organizations on how to report these sorts of incidents properly.”[80] It was also suggested that trusted third parties, such as grassroots organizations, could act as liaison officers between victims and the police to facilitate reporting.[81] Trust with community members could also be built through the establishment of hate crime units within police forces across the country.[82]

A number of witnesses noted in this regard that there is an urgent need for “more community research to understand the prevalence of unreported hate crimes, as well as to understand which community groups tend not to report hate crimes and the barriers to doing so.”[83]

Improving our understanding of online hate in general was highlighted as an important aspect of combatting online hate. It is important to adopt an intersectional approach to know, for example, the extent of online hate, where it occurs, as well as the impact it has on different groups.[84] As stated by the Canadian Women’s Foundation:

It is especially important that any data collection is disaggregated by gender, and includes targeted information on those at special risk, such as LGBTQ2S+ survivors, Indigenous women, Black women, disabled women, and young women.[85]

5.3 Limited Data on Online Hate

While there is limited data pertaining specifically to online hate in Canada, this is not a reason to be complacent. The portrait of online hate presented to the Committee during this study is disturbing. During the study, witnesses shared the results of surveys that show that online hate is a serious problem in Canada. Below are the main figures dealing specifically with online hate as provided by the witnesses:

  • “Between 2010 and 2017, there were 364 police-reported hate crimes that were also recorded by police as cybercrimes in Canada.  The most commonly targeted group[s] of hate cybercrimes were the Muslim population (17%), [persons targeted due to their] sexual orientation (15%), the Jewish population (14%), and the Black population (10%).”[86]
  • Cision Canada, a Toronto-based PR software and service provider, reported a “600% rise in intolerant hate speech in social media postings by Canadians”[87] between 2015 and 2016.
  • A national survey conducted by the Association for Canadian Studies found that “almost 60% of Canadians have seen some form of hate speech posted on social media.”[88]
  • The annual audit of anti-Semitic incidents in Canada “found that of the 2,042 recorded incidents in 2018—an increase of 16.5% over 2017—80% of those anti-Semitic incidents took place via online platforms.”[89]
  • Since 2008, websites of Armenian community organizations have been subjected to three separate incidents of cyber-attacks. The websites of Armenian-Canadian newspapers, churches and community organizations have been replaced with anti-Armenian propaganda, including, but not limited to, denials of the Armenian genocide.[90]

5.4 The Need to Enhance Tracking and Reporting of Online Hate

As noted above, information about online hate in Canada is very limited. At present, Statistics Canada does not systematically track online hate crimes reported by the police each year. Witnesses told the Committee that to better understand the issue and guide our interventions, it is imperative that we develop methods to track all incidents of hatred.[91] This would include tracking hate crimes as well as hate incidents. As explained by Cara Zwibel from the Canadian Civil Liberties Association, it is important to differentiate the two:

I also think we shouldn't conflate hate crimes and hate incidents. An incident might be someone shouting a racial slur to a stranger in a grocery store. That's something that we might want to know about, but it's not something that the criminal law should be dealing with.[92]

The Committee was informed that Statistics Canada has been consulting different groups, including police services and academics, to improve data on hate crimes and to consider whether there is a capacity to record not only hate crimes, but also hate incidents occurring online and offline.[93] Some witnesses suggested that Statistics Canada be mandated to collect and share these statistics.[94]

The Committee was also told repeatedly that partnerships are key for the collection of data on online hate, particularly “between the Government of Canada and technology companies.”[95] Other witnesses also mentioned that social media companies should make it easier to report hate content, notably with the use of a report button.[96]

During the study, it was suggested that police be charged with referring people who make complaints about hate incidents that fall “below the threshold of a crime” to the appropriate organizations so that they can count the incident and collect the information.[97]

Some witnesses raised the potential risks of leaving data collection to groups representing their own communities, as they may not be impartial in the way they present the data.[98] Another option discussed during the study was for Statistics Canada to create a “self-reporting online portal on their site.”[99] Such a portal would allow people who experience online hate that has not otherwise been reported to the police or to another organization to have the incident recorded.

5.5 Defining Hatred: A First Step to Tracking Online Hate

Defining hatred is an important first step for tracking online hate in a manner that can be understood and enforced uniformly.[100] From a legal standpoint, a clear definition of hate is also imperative as it would “draw the line between legal and illegal activity” and “[f]rom that point on, the law enforcement agencies will have a free hand to take action.”[101] A clear definition of all types of hatred would also help all stakeholders, including the police, Internet service providers and online platforms, prevent and counter hate. A clear understanding of what constitutes hatred may also facilitate reporting.

The definition of what constitutes online hate versus offensive materials needs to be clear. All community members, not just the legal community or subject experts, need to understand what is online hate and how hate can show up online, whether it be under the guise of educational material or news; how to make a report; and what happens after reporting a hate crime. If the community does not understand the definition and process, they will be reluctant to intervene or make a report.[102]

Recognizing the importance of defining hate, some witnesses recommended building upon the parameters defining hatred developed in two decisions of the Supreme Court of Canada, namely Saskatchewan (Human Rights Commission) v. Whatcott and R. v. Keegstra.[103] Others suggested using the definition of anti-Semitism developed by the International Holocaust Remembrance Alliance. On that point, Shimon Koffler Fogel from the Centre for Israel and Jewish Affairs noted the following:

The international community's experience in defining anti-Semitism is an important model. The International Holocaust Remembrance Alliance, or IHRA, working definition of anti-Semitism, which is the world's most widely accepted definition of Jew hatred, should be included in any strategy to tackle online hate. It's a practical tool that social media providers can use to enforce user policies prohibiting hateful content and that Canadian authorities can use to enforce relevant legal provisions.[104]

YWCA Canada specifically recommended to

integrate an intersectional gender equity lens and consider the gendered impacts of anti-Black racism, anti-Indigenous racism, anti-Semitism, Islamophobia and Xenophobia in any definition of “hate” and “online hate”.[105]

Chapter 6—Additional Tools To Counter Online Hate

6.1 Regulating Online Platforms

During the study, witnesses discussed the best approach to countering online hate: should online platforms continue to self-regulate or should the government establish a legal or regulatory framework to set rules?

At present, online platforms remain largely unregulated in Canada.[106] Although Facebook, Google and Twitter already have policies on hate speech or hateful conduct, they indicated that they recognize the need to update and improve their policies on a regular basis.[107] For example, the Committee was informed that Google recently updated its hate speech policy to explicitly prohibit YouTube videos that promote violence or hatred in order to justify discrimination, segregation or exclusion based on qualities such as age, gender, race, caste, religion, sexual orientation or veteran status. Google also decided to remove YouTube content denying that well-documented violent events, like the Holocaust, took place.[108] Moreover, several platforms have recently taken steps to improve their response to online hate, for example by updating the technology used to remove hateful content and increasing the number of employees dedicated to reviewing content.[109] In that regard, Kevin Chan informed the Committee that Facebook will create an external oversight board by the end of 2019 to help govern hate speech on its platform.[110]

Recent events have shown that online platforms are vulnerable to online hate and that inaction on their part can lead to very serious human rights violations. Based on the evidence heard, most witnesses, including representatives from online platforms, stressed the need for the government to set clear rules regarding hate speech, harassment and disinformation found online.

As noted by Kevin Chan from Facebook, since people use many different online platforms to communicate, the establishment of clear baseline standards applicable to all platforms would help to counter online hate.[111] According to some witnesses, online platforms should be encouraged to put in place robust governance, such as codes of conduct,[112] and should be required “to be more transparent about their content moderation, including their responses to harmful speech.”[113] It was suggested more specifically that social media companies “be required to publish regular transparency reports, providing anonymized information.”[114] They should also be “more forthcoming in revealing the factors and weightings they use to describe what posts are prioritized to their users, [and] they must give users greater and easier control to adjust those settings.”[115]

As online platforms appear at times unable to flag and remove hate content in a timely fashion, it was suggested that we should not rely solely on corporations to establish rules in this regard. Witnesses argued that government leadership is necessary to regulate social media companies and that the establishment of such a regulatory framework should be done in consultation with various stakeholders:

When I look at the history of social media companies trying to deal with these kinds of things, it's not very promising.… When it comes to these companies, it's a question mark really how much we want to rely on them to make the perfect rules. I think the government should have some involvement as well.[116] Ahmadiyya Muslim Jama'at
It seems that the mainstream social media platforms do have terms of use. They do have certain regulations and requirements, but it does seem very clear as well that they just cannot seem to keep up with what is going on as far as online hate is concerned. That being the case, and to echo what all of the witnesses have said, I think we need to see government take a leadership position. Of course, it will be in partnership with social media platforms, Internet service providers and other appropriate partners, but I definitely think that government will need to take the lead in this collaborative approach to actually being able to keep up with the monitoring and fighting of online hate.[117] Canadian Rabbinic Caucus
It is also increasingly clear that policy intervention by government is needed to mitigate the impact of the more egregious misuses of online social networks. Despite recent steps taken by Facebook and Twitter to remove certain accounts, government also has a role to play in regulating these online platforms.[118] Bahá'í Community of Canada

Furthermore, several witnesses believed that the development of a regulatory framework for online platforms should focus on human rights, notably to protect freedom of expression and to avoid censorship.[119]

As David Kaye, the UN special rapporteur on the freedom of expression, has urged, relying upon international human rights norms rather than the arbitrary judgements of commercial platforms is a better basis for the development of these standards. This includes delineating the rights and responsibilities of users, as well as safeguards to ensure that freedom of expression is not unduly curtailed.[120] Bahá'í Community of Canada
At this stage, it is evident that better regulation of online platforms is needed, but we cannot simply transpose old ideas onto this new forum. Requiring content monitoring by online platforms may be appropriate. However, there is a need to balance making platforms responsible for content from which they profit and the risk of incentivizing sweeping censorship.[121] Egale Canada Human Rights Trust

The question of whether it is appropriate to fine online platforms and Internet service providers when they fail to remove hate content was also discussed by witnesses.[122] Some witnesses believed that these companies need to be accountable for “ensuring they are not hosting hate websites and moderating their online social networking feeds” and that “fines should be imposed and criminal sanctions placed on violators.”[123] For example, it was suggested that failing to remove illegal content within a set period should result in severe penalties.[124] Others were of the view that this could lead to censorship and the removal of important online speech: “If there are really hefty fines and a need for fast action, that can be an incentive to just take things off, and it can lead to the removal of important speech, political speech.”[125]

Many witnesses suggested looking at what is done in other jurisdictions to address this issue:[126] “Their successes and setbacks should be considered in crafting a made-in-Canada approach.”[127]

On the topic of removal of hate content by online platforms, a witness noted that the establishment of “a squad to track down hate messages on the Internet”[128] should be a priority. It was also raised that in certain jurisdictions, such as France, hate content must be taken down in a short period of time after it has been reported and that this approach could be beneficial in Canada:

I would recommend that illegal content on these [online] platforms be removed as quickly as possible, within 24 hours. I know that other countries have those regulations and that platforms take measures to dissuade users from repeatedly uploading illegal content, so it's not just taking the content down; it's making sure that the content isn't put back up again.[129]

In this context, the Committee was told that there is a need to be careful “about who's making the determination of what is hate” and to ensure that the decision to remove content is not given “to private corporations that have a profit motive to potentially censor any unpopular views.”[130]

Also on the topic of removal of hate content, some witnesses discussed the concept of “trusted flaggers”, used in the European Union, where groups specialized in online hate work in collaboration with online platforms and Internet service providers to identify online hate quickly.[131] However, it was noted that this way of working is not the most transparent. As noted by David Matas from B’nai Brith Canada, it seems like “a good system but it can't replace legislation.… You can't just say you'll leave it for the service providers to do, with the help of the NGOs.”[132] Different forms of technology, such as artificial intelligence, could also be used to facilitate the removal of hate content from online platforms.[133]

Other options were also presented to oversee the enforcement of regulations applying to online platforms or online hate in general. For example, it was suggested that this oversight and enforcement role be given to:

  • the Canadian Radio-television and Telecommunications Commission;[134]
  • an independent body with a broad mandate to regulate online content;[135]
  • an independent regulator similar to the one currently being studied by the Government of the United Kingdom.[136]

Finally, it was suggested that the regulation of social media companies be the subject of a comprehensive parliamentary study.[137]

6.2 Using Human Rights Law to Counter Online Hate

During the study, the issue of whether human rights law is an appropriate legislative tool to combat online hate was raised on several occasions.

In Canada, the Canadian Human Rights Act[138] (CHRA), which generally prohibits discrimination against persons employed by the federal government and federally regulated bodies, as well as against persons receiving services from the government and these bodies, does not currently contain a provision regarding online discriminatory acts. However, the CHRA previously included such a provision.

In fact, prior to its repeal in 2013,[139] former section 13 of the CHRA made it, among other things, a discriminatory practice for a person or a group of persons acting in concert to communicate by means of the Internet “any matter that is likely to expose a person or persons to hatred or contempt by reason of the fact that that person or those persons are identifiable on the basis of a prohibited ground of discrimination.”[140]

The repeal of section 13 of the CHRA took place in a context in which some individuals and groups were concerned that its application could violate freedom of expression. As put forth by Shimon Koffler Fogel from the Centre for Israel and Jewish Affairs,

[t]he problem was that ironically, groups or individuals we should be concerned about were using section 13 as a way of pushing back against those who were raising legitimate free expression ideas or concerns about particular topics. It was chilling, or more precisely freezing, the ability of people to offer critical comment about things of public interest without fear of being brought before some judicial process to account for what they said, because others were claiming that was triggering hate against them.[141]

On the one hand, some witnesses were of the view that the current legal framework is sufficient, and that human rights law is not the right tool to address online hate. These witnesses argued that former section 13 of the CHRA or any similar provision should not be re-instated.[142]

We are very concerned about any attempt to reinstate a hate speech provision in the Canadian Human Rights Act. These provisions have been shown to be ineffective and often abused. They chill freedom of expression and are applied in a demonstrably unfair way.[143] Association for Reformed Political Action Canada
As for your direct question on the repealed section 13 of the Canadian Human Rights Act, we feel that the current law strikes a reasonable balance in terms of restrictions on free speech. We feel that existing provisions can be enforced more rigorously and more consistently across Canada and that the larger problem we face is not a lack of legislation addressing hatred in all of its forms but a lack of enforcement of existing provisions.[144] Canadian Secular Alliance
We supported the repeal, and continue to believe that asking human rights tribunals to play the role of censor does not fit well with the functions of tribunals.… A human rights commission or tribunal charged with prosecuting hate speech is put in a situation of conflict. In their core anti-discrimination work, they seek to protect minority groups, but in addressing hate speech complaints, they may often have to tell such groups that a very offensive expression simply doesn't rise to the level of hate speech for the purposes of the act. In our view, section 13 was not an efficient or effective way of dealing with online hate.[145] Canadian Civil Liberties Association

On the other hand, some witnesses expressed the view that the repeal of section 13 of the CHRA left a gap in the legal toolbox available to counter online hate.[146] As noted by Avi Benlolo from the Friends of Simon Wiesenthal Center for Holocaust Studies,

section 13 … allowed [us] to bring down several online hate sites simply by bringing attention to them with [Internet service providers]. Our ability to sanction hate sites became limited when section 13 was repealed in 2013. We lost an invaluable tool that provided a red line for the public.[147]

Based on her own experience, Elizabeth Moore, a former extremist, noted that section 13 acted as an effective deterrent against indulging in unrestrained hatred.[148]

Recognizing the importance of establishing a non-criminal remedy for online hate, some witnesses recommended re-instating section 13 with additional safeguards to address concerns related, for example, to freedom of expression and procedure.[149]

Some witnesses spoke in a more general manner, saying that, since 2013, circumstances have changed, and that there is a need for additional tools to counter online hate. According to these witnesses, different options must be explored before deciding on the most appropriate recourse.[150] A number of witnesses suggested that the government start with a comprehensive review of the CHRA in its entirety.[151]

The current gap in Canadian human rights law at the federal level enables the publishing of material on websites and social media that is prohibited from being published in physical form. For online hatred, the only remedy is a criminal complaint, which has a very high bar for conviction and can require special approval from a province's attorney general.[152] Morgane Oger Foundation
Another approach is to develop a new provision in the Canadian Human Rights Act on online hate. This requires addressing the clear deficiencies of section 13, which was an effective but flawed instrument. In line with recommendations offered by the Honourable Irwin Cotler, a restored section 13 would require significant safeguards to protect legitimate freedom of expression and prevent vexatious use of the section.… If misused, misconstrued or poorly constructed, any new legal provisions, including a renewed section 13, would risk undermining the overarching goal to protect Canadians and prevent hate propaganda from gaining sympathizers and adherents.[153] Centre for Israel and Jewish Affairs
The realities around online hate are different than they were even five or six years ago when section 13 was abolished, and certainly in the years before that when it was being used. I think it does merit considering whether there is a role for the commission to play here, with all of the provisos that I highlighted earlier. It would have to go forward with a clear recognition of the importance of both rights, and the kind of training, expertise and resourcing—drawing on international standards—that really help develop a sophisticated understanding of how those two rights have a profound interplay with each other.[154] Amnesty International Canada

6.3 A National Strategy to Counter Online Hate

Several organizations and witnesses called for the establishment of a national strategy to counter online hate.[155] This strategy must address the following four aspects: defining hate, tracking hate, preventing hate and intervening to stop hate, for example by regulating online platforms and enforcing the current Criminal Code provisions more rigorously. Such a strategy must also be based on partnerships and recognize that responsibility for combatting hatred “does not rest solely with the government”[156] and that “[c]orporations, including the large social media companies, must update their terms of use and their monitoring and reporting activities in order to better control the dissemination of hate through their networks and to remove hateful posts and users.”[157]

My hope is that this study will culminate in a unanimous call on the Government of Canada to establish a comprehensive strategy to counter online hate and provide the government with a proposed outline for that strategy. Today I'll share four elements that we believe are essential to include: defining online hate; tracking online hate; preventing online hate; and intervening against online hate.[158] Centre for Israel and Jewish Affairs
We believe that a national strategy to combat online hate is needed. One of the first steps is to ensure broad and inclusive engagement across Canada, including population groups that tend to be under-represented when doing consultations—including newcomers—in order to understand their experiences with online hate. We need to ensure that the process is as accessible and inclusive as possible to engage diverse groups and that there is a safe space for more vulnerable people and groups to express their experiences.[159] S.U.C.C.E.S.S.

6.4 Increasing Awareness and Digital Literacy

Many witnesses raised the issue of prevention as a critical part of combatting online hate. Repeatedly, the Committee was reminded that oftentimes “hate is born out of ignorance or misunderstanding, and it may be prevented through community engagement and outreach.”[160] Many options were presented throughout the study, including raising awareness regarding online hate,[161] promoting dialogue and engagement,[162] and increasing education on “responsible usage of social networking sites and websites.”[163]

Particularly, digital literacy for young people was highlighted by many witnesses as essential to combatting online hate. The Committee was told that youth must learn to deal with online hate and misinformation.[164] Unfortunately, it was noted that digital literacy is not currently offered consistently across the country and that it needs to be prioritized.[165] It was suggested that a national digital literacy strategy from kindergarten to post-secondary education be created.[166]

Overall, prevention efforts should be aimed at the general population as well as targeted groups and should focus on increasing both critical thinking skills and digital literacy, as illustrated by the following excerpts from the testimony:

In the current global environment, trust in traditional media and institutions has declined as online manipulation and misinformation have increased. A campaign to strengthen Internet literacy and critical online thinking with resources to support parents and educators would help mitigate these trends.[167] Centre for Israel and Jewish Affairs
A national strategy to address online hatred, then, must also equip families, community leaders and individual Canadians to challenge expressions of hatred, extremism and violence wherever they may encounter it.[168] Anglican Church of Canada
Youth needs to develop a strong moral framework on which to base decisions about their online activities, about which content they choose to consume and share, and about how they use their powers of expression when communicating with friends and strangers online. Any long-term solution to online hatred has to give due consideration to this generation that is coming of age in an information environment that is confusing, polarizing and indifferent to their moral and ethical development.[169] Bahá'í Community of Canada
We need to better educate community members on how to be allies and how to respond appropriately in this situation to ensure safety and promote reporting. Education is particularly important to engage newcomer youth. They have unique and complex experiences and pressures. They have challenges in navigating a new social reality and have limited trust in authority figures, as well as feelings of being powerless and hopeless. Dr. Ratna Ghosh, from McGill University, is currently doing important research about education as a form of soft power and a critical prevention tool in countering violent extremism, by supporting youth to develop values, skills, behaviours and norms that promote security and resilience.[170] S.U.C.C.E.S.S.
The federal government should fund school programs to build young Canadians’ abilities to resist polarisation and hatred, and to cultivate critical thinking and empathy.[171] Mosaic Institute

Finally, the National Council of Canadian Muslims suggested that a special grant program to develop digital literacy programming be created at the federal level to allow academics, entrepreneurs, anti-racism organizations and NGOs with expertise in digital literacy and online hate to create and teach online literacy courses. It was suggested that the funds could also be made available to academics for “conducting innovative research” on digital literacy.[172] According to the South Asian Legal Clinic of Ontario, instead of punitive sanctions such as imprisonment and fines, “a more meaningful remedy could lie in community-based programs that seek to address the motivators and thinking underlying hate crimes in a genuine attempt at anti-racism and anti-oppression education.”[173]

Chapter 7—Conclusions and Recommendations of the Committee

Given the increasing number of hate crimes being reported in Canada, the Committee felt compelled to study this important issue and participate in these dynamic discussions on how to better mitigate the incitement of hatred through online platforms. The following recommendations are meant to inform the Government of Canada and ultimately assist it in taking action to stem the spread of online hate and the incitement of hatred through racism, sexism, anti-Semitism, Islamophobia, homophobia, transphobia, or any form of bigotry based on all prohibited grounds of discrimination. They are informed by the positive exchanges the Committee had with a wide array of stakeholders and experts across Canada.

As clearly illustrated by the evidence, combatting online hate requires four separate but equally important actions. We need to

  • properly define online hate;
  • track online hate;
  • find a way to educate people on what constitutes online hate; and
  • have intervention mechanisms to combat online hate, which involves both working in consultation with Internet service providers and online platform companies and penalizing them where they do not cooperate with government requirements.

The Committee recognizes the work already underway to combat the spread of hate in Canada, whether by governments, civil society or online platforms, but agrees with the witnesses that we must do better and improve our collaborative efforts. We must also improve our response to all forms of hatred, including hatred against LGBTQ2 people or hatred based on gender, as well as our response to "systemic racism" and "religious discrimination" based on the terminology developed under the four pillars of the federal government's anti-racism strategy: anti-Black racism, anti-Indigenous racism, Islamophobia and anti-Semitism.

After careful consideration of the evidence, the Committee recommends:

Recommendation 1—Funding for Training on Online Hate

That the Government of Canada increase funding for law enforcement, Crown attorneys and judges to ensure that they receive sufficient training and orientation on the importance of, and the need for, combatting online hatred, including being sensitive to complainants.

Recommendation 2—Sharing Best Practices

That Justice Canada develop materials and best practices on collecting data and combatting online hate to be distributed to law enforcement agencies across Canada.

Recommendation 3—Addressing the Gap in Data Collection

That the Government of Canada adopt a two-pronged approach to address the gap in data collection, one that recognizes the fact that members of marginalized groups often feel more comfortable reporting hate incidents and hate crimes directly to civil society organizations that reflect their community rather than to law enforcement officials:

  • Firstly, resources need to be allocated to assist in the collection of data by both governmental institutions and civil society organizations. This will ensure that we have a more complete understanding of the extent of hatred in Canada, particularly hatred that is expressed online.
  • Secondly, in order to facilitate the reporting of hate crimes, it is paramount that agents of the state, including police forces, reflect the racial, religious, LGBTQ2 and general diversity of the populations they represent. Police forces, particularly their hate crime units, must work collaboratively alongside civil society organizations, including by utilizing the data collected by such organizations, to fully address hate-motivated incidents and crimes, including those occurring online.

Recommendation 4—Tracking Online Hate

That the Government of Canada implement the recommendations regarding the tracking of online hate formulated by the Standing Committee on Canadian Heritage in its report entitled “Taking Action Against Systemic Racism and Religious Discrimination Including Islamophobia”, dated February 2018:

  • Recommends that the Government of Canada establish uniform pan-Canadian guidelines and standards for the collection and handling of hate crime data and hate incident data; this would include efforts to standardize the definition and the interpretation, by law enforcement, of hate crimes (Recommendation 5).
  • Recommends that the Government of Canada create a national database to retain and analyze hate crime and hate incident data (Recommendation 6).
  • Recommends that the Government of Canada mandate relevant departments and encourage partners at the provincial and municipal levels and within civil society to create additional reporting options for victims of hate crimes and hate incidents, in addition to reporting to law enforcement (Recommendation 8).

Recommendation 5—Preventing Online Hate

That the Government of Canada work with provincial and territorial governments and with community organizations that combat hate to develop appropriate requirements for educating the population as to what constitutes hate on the Internet. Federal organizations such as the Canadian Race Relations Foundation and the Canadian Human Rights Commission should be utilized to provide models of best practices on combatting online hate.

Recommendation 6—Formulating a Definition of Hate

That the Government of Canada formulate a definition of what constitutes ‘hate’ or ‘hatred’ that is consistent with Supreme Court of Canada jurisprudence. It is critical that this definition acknowledge persons who are disproportionately targeted by hate speech, including but not limited to racial, Indigenous, ethnic, linguistic, sexual orientation, gender identity, and religious groups.

Recommendation 7—Providing a Civil Remedy

That the Government of Canada establish a working group composed of relevant stakeholders to create a civil remedy for those who assert that their human rights have been violated under the Canadian Human Rights Act, irrespective of whether that violation happens online, in person or in traditional print format. This remedy could take the form of reinstating the former section 13 of the Canadian Human Rights Act, or of implementing within that Act a provision analogous to the former section 13 that accounts for the prevalence of hatred on social media.

Recommendation 8—Establishing Requirements for Online Platforms and Internet Service Providers

That the Government of Canada establish requirements for online platforms and Internet service providers with regard to how they monitor and address incidents of hate speech, including the obligation to remove, in a timely manner, all posts that would constitute online hatred.

  • These requirements should set common standards for making reporting mechanisms on social media platforms more readily accessible and visible to users, by ensuring that these mechanisms are simple and transparent.
  • Online platforms must have a duty to report regularly to users on data regarding online hate incidents: how many incidents were reported, what actions were taken or content was removed, and how quickly the action was taken (an illustrative sketch of one possible reporting format follows this list). Failure to report properly on online hate must lead to significant monetary penalties for the online platform.
  • Furthermore, online platforms must make it simple for users to flag problematic content and must provide users with timely feedback on the action taken.
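
As a purely illustrative sketch, and not a format prescribed by the Committee, the short Python snippet below shows one minimal shape such regular reporting could take; the structure, field names and placeholder figures are assumptions made only for this example.

    # Illustrative only: a minimal record a platform might publish to meet the
    # regular-reporting duty described above. The dataclass, its field names and
    # the sample figures are assumptions for this sketch, not a prescribed format.
    from dataclasses import dataclass

    @dataclass
    class HateReportSummary:
        period: str                    # reporting period, e.g. "2019-Q2"
        incidents_reported: int        # how many incidents users reported
        posts_removed: int             # what content was removed
        other_actions_taken: int       # warnings, suspensions and similar measures
        median_hours_to_action: float  # how quickly action was taken

    # Example with invented placeholder values for one quarter.
    summary = HateReportSummary("2019-Q2", 12450, 8310, 1270, 6.5)
    print(summary)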

Recommendation 9—Authentication

That online platforms be encouraged to provide optional mechanisms to authenticate contributors and digitally sign content, to couple this with visual indicators signifying that a given user or piece of content is authenticated, and to provide users with options for filtering out non-signed or non-authenticated content.
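
By way of illustration only, the following minimal sketch shows how such optional authentication might work in practice: a contributor signs a post with a private key registered to their account, and the platform verifies the signature before showing an “authenticated” indicator or applying a user’s filter. The example assumes Python, an Ed25519 key pair and the third-party cryptography package; none of these choices is specified in the recommendation.

    # Illustrative only: verifying an optional digital signature on a post.
    # The key scheme (Ed25519), the "cryptography" package and the function
    # below are assumptions for this sketch, not anything the Committee prescribes.
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric import ed25519

    def is_authenticated(public_key_bytes: bytes, signature: bytes, post_body: bytes) -> bool:
        """Return True only if post_body was signed by the holder of the registered key."""
        try:
            public_key = ed25519.Ed25519PublicKey.from_public_bytes(public_key_bytes)
            public_key.verify(signature, post_body)  # raises InvalidSignature on mismatch
            return True
        except (InvalidSignature, ValueError):
            return False

    # A client could display a visual indicator when is_authenticated(...) is True
    # and offer users a filter that hides non-signed or non-authenticated content.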


[1]              Centre for Israel and Jewish Affairs, Press Release: CIJA Urges Action in Response to Spike in Antisemitic Hate Crimes, 29 November 2018; The Evangelical Fellowship of Canada, Calling Parliament to address online hate: Letter to the Minister of Justice, 4 February 2019.

[2]              House of Commons, Standing Committee on Justice and Human Rights (JUST), Minutes, 19 March 2019.

[3]              Canadian Human Rights Act, R.S.C., 1985, c. H-6.

[4]              Criminal Code, R.S.C., 1985, c. C-46.

[5]              A list of witnesses who appeared before the Committee is set out in Appendix A and a list of briefs submitted to the Committee, in Appendix B of this report.

[6]              JUST, Evidence, 1st Session, 42nd Parliament, 2 May 2019 (Queenie Choo, Chief Executive Officer, S.U.C.C.E.S.S.).

[7]              JUST, Evidence, 1st Session, 42nd Parliament, 16 May 2019 (Jennifer Klinck, Chair, Legal Issues Committee, Egale Canada Human Rights Trust).

[8]              JUST, Evidence, 1st Session, 42nd Parliament, 2 May 2019 (Queenie Choo, Chief Executive Officer, S.U.C.C.E.S.S.); JUST, Evidence, 1st Session, 42nd Parliament, 28 May 2019 (Elizabeth Moore, Educator and Advisory Board Member, Canadian Anti-Hate Network and Parents for Peace, As an Individual; Avi Benlolo, President and Chief Executive Officer, Friends of Simon Wiesenthal Center for Holocaust Studies; Bradley Galloway, Research and Intervention Specialist, Organization for the Prevention of Violence).

[9]              See, for example, JUST, Evidence, 1st Session, 42nd Parliament, 11 April 2019 (Shimon Koffler Fogel, President and Chief Executive Officer, Centre for Israel and Jewish Affairs; Imam Farhan Iqbal, Ahmadiyya Muslim Jama'at); JUST, Evidence, 1st Session, 42nd Parliament, 9 May 2019 (Jasmin Zine, Professor, Sociology and Muslim Studies Option, Wilfrid Laurier University, As an Individual).

[10]            JUST, Brief submitted by the Evangelical Lutheran Church in Canada, Online Hate, 3 May 2019.

[11]            JUST, Evidence, 1st Session, 42nd Parliament, 11 April 2019 (Alex Neve, Secretary General, Amnesty International Canada).

[12]            JUST, Evidence, 1st Session, 42nd Parliament, 2 May 2019 (Mukhbir Singh, President, World Sikh Organization of Canada).

[13]            JUST, Evidence, 1st Session, 42nd Parliament, 9 May 2019 (Mohamed Labidi, Former President, Centre culturel islamique de Québec).

[14]            JUST, Evidence, 1st Session, 42nd Parliament, 28 May 2019 (Faisal Khan Suri, President, Alberta Muslim Public Affairs Council).

[15]            JUST, Evidence, 1st Session, 42nd Parliament, 2 May 2019 (Rev Daniel Cho, Moderator, Presbyterian Church in Canada); JUST, Evidence, 1st Session, 42nd Parliament, 9 May 2019 (Mohamed Labidi, Former President, Centre culturel islamique de Québec).

[16]            JUST, Evidence, 1st Session, 42nd Parliament, 9 May 2019 (Mohamed Labidi, Former President, Centre culturel islamique de Québec).

[17]            JUST, Evidence, 1st Session, 42nd Parliament, 16 May 2019 (Jennifer Klinck, Chair, Legal Issues Committee, Egale Canada Human Rights Trust). See also, JUST, Evidence, 1st Session, 42nd Parliament, 16 May 2019 (Morgane Oger, Founder, Morgane Oger Foundation); Brief submitted by Jane Bailey and Valerie Steeves, Online Hate, 9 May 2019.

[18]            JUST, Evidence, 1st Session, 42nd Parliament, 9 May 2019 (Mohamed Labidi, Former President, Centre culturel islamique de Québec).

[19]            JUST, Evidence, 1st Session, 42nd Parliament, 16 May 2019 (Jennifer Klinck, Chair, Legal Issues Committee, Egale Canada Human Rights Trust). See also Brief submitted by Jane Bailey and Valerie Steeves, Online Hate, 9 May 2019.

[20]            JUST, Evidence, 1st Session, 42nd Parliament, 28 May 2019 (Bradley Galloway, Research and Intervention Specialist, Organization for the Prevention of Violence).

[21]            JUST, Evidence, 1st Session, 42nd Parliament, 9 May 2019 (Mohamed Labidi, Former President, Centre culturel islamique de Québec).

[22]            JUST, Evidence, 1st Session, 42nd Parliament, 9 May 2019 (Jasmin Zine, Professor, Sociology and Muslim Studies Option, Wilfrid Laurier University, As an Individual).

[23]            Several witnesses reminded the Committee that prior to committing these horrific hateful crimes, the killers were very active online. In the case of the mass murder of Jews in Pittsburgh, “the killer reportedly posted more than 700 anti-Semitic messages online over a span of nine months leading up to the attack.” In the mass murder of Muslims in Christchurch, the “shooter's decision to livestream his horrific crime was a clear attempt to provoke similar atrocities.” JUST, Evidence, 1st Session, 42nd Parliament, 11 April 2019 (Shimon Koffler Fogel, President and Chief Executive Officer, Centre for Israel and Jewish Affairs). See also, JUST, Evidence, 1st Session, 42nd Parliament, 11 April 2019 (Geoffrey Cameron, Director, Office of Public Affairs, Bahá'í Community of Canada); JUST, Evidence, 1st Session, 42nd Parliament, 28 May 2019 (Faisal Khan Suri, President, Alberta Muslim Public Affairs Council; Sinan Yasarlar, Public Relations Director, Windsor Islamic Association).

[24]            JUST, Evidence, 1st Session, 42nd Parliament, 11 April 2019 (Ryan Weston, Lead Animator, Public Witness for Social and Ecological Justice, Anglican Church of Canada).

[25]            JUST, Evidence, 1st Session, 42nd Parliament, 11 April 2019 (Rabbi Idan Scher, Canadian Rabbinic Caucus).

[26]            JUST, Evidence, 1st Session, 42nd Parliament, 2 May 2019 (Rev Daniel Cho, Moderator, Presbyterian Church in Canada).

[27]            JUST, Evidence, 1st Session, 42nd Parliament, 11 April 2019 (Shimon Koffler Fogel, President and Chief Executive Officer, Centre for Israel and Jewish Affairs). See also, JUST, Brief submitted by The United Church of Canada, Online Hate, 9 May 2019.

[28]            JUST, Evidence, 1st Session, 42nd Parliament, 28 May 2019 (Shalini Konanur, Executive Director and Lawyer, South Asian Legal Clinic of Ontario).

[29]            See, for example, JUST, Evidence, 1st Session, 42nd Parliament, 11 April 2019 (Ryan Weston, Lead Animator, Public Witness for Social and Ecological Justice, Anglican Church of Canada); JUST, Evidence, 1st Session, 42nd Parliament, 2 May 2019 (Brian Herman, Director, Government Relations, B'nai Brith Canada); JUST, Evidence, 1st Session, 42nd Parliament, 9 May 2019 (Andrew P.W. Bennett, Director, Cardus Religious Freedom Institute); JUST, Evidence, 1st Session, 42nd Parliament, 16 May 2019 (Jennifer Klinck, Chair, Legal Issues Committee, Egale Canada Human Rights Trust); JUST, Evidence, 1st Session, 42nd Parliament, 28 May 2019 (Shalini Konanur, Executive Director and Lawyer, South Asian Legal Clinic of Ontario; Avi Benlolo, President and Chief Executive Officer, Friends of Simon Wiesenthal Center for Holocaust Studies; Faisal Khan Suri, President, Alberta Muslim Public Affairs Council).

[30]            JUST, Evidence, 1st Session, 42nd Parliament, 11 April 2019 (Alex Neve, Secretary General, Amnesty International Canada).

[31]            See, for example, JUST, Evidence, 1st Session, 42nd Parliament, 11 April 2019 (Alex Neve, Secretary General, Amnesty International Canada; Shimon Koffler Fogel, President and Chief Executive Officer, Centre for Israel and Jewish Affairs); JUST, Evidence, 1st Session, 42nd Parliament, 2 May 2019 (Rev Daniel Cho, Moderator, Presbyterian Church in Canada); JUST, Evidence, 1st Session, 42nd Parliament, 9 May 2019 (Andrew P.W. Bennett, Director, Cardus Religious Freedom Institute); JUST, Evidence, 1st Session, 42nd Parliament, 16 May 2019 (Cara Zwibel, Director, Fundamental Freedoms Program, Canadian Civil Liberties Association); JUST, Evidence, 1st Session, 42nd Parliament, 4 June 2019 (John Robson, As an Individual).

[32]            JUST, Evidence, 1st Session, 42nd Parliament, 11 April 2019 (Shimon Koffler Fogel, President and Chief Executive Officer, Centre for Israel and Jewish Affairs).

[33]            JUST, Evidence, 1st Session, 42nd Parliament, 16 May 2019 (Jennifer Klinck, Chair, Legal Issues Committee, Egale Canada Human Rights Trust).

[34]            JUST, Evidence, 1st Session, 42nd Parliament, 16 May 2019 (Cara Zwibel, Director, Fundamental Freedoms Program, Canadian Civil Liberties Association).

[35]            JUST, Evidence, 1st Session, 42nd Parliament, 11 April 2019 (Alex Neve, Secretary General, Amnesty International Canada).

[36]            JUST, Evidence, 1st Session, 42nd Parliament, 11 April 2019 (André Schutten, Legal Counsel and Director of Law and Policy, Association for Reformed Political Action Canada).

[37]            JUST, Evidence, 1st Session, 42nd Parliament, 9 May 2019 (Leslie Rosenblood, Policy Advisor, Canadian Secular Alliance).

[38]            JUST, Evidence, 1st Session, 42nd Parliament, 11 April 2019 (Alex Neve, Secretary General, Amnesty International Canada).

[39]            JUST, Evidence, 1st Session, 42nd Parliament, 9 May 2019 (Leslie Rosenblood, Policy Advisor, Canadian Secular Alliance). See also JUST, Evidence, 1st Session, 42nd Parliament, 11 April 2019 (Alex Neve, Secretary General, Amnesty International Canada; André Schutten, Legal Counsel and Director of Law and Policy, Association for Reformed Political Action Canada).

[40]            JUST, Evidence, 1st Session, 42nd Parliament, 28 May 2019 (Akaash Maharaj, Chief Executive Officer, Mosaic Institute).

[41]            JUST, Evidence, 1st Session, 42nd Parliament, 16 May 2019 (Cara Zwibel, Director, Fundamental Freedoms Program, Canadian Civil Liberties Association).

[42]            JUST, Evidence, 1st Session, 42nd Parliament, 2 May 2019 (Mukhbir Singh, President, World Sikh Organization of Canada).

[43]            For example, Mr. Shahen Mirakian noted that hate propaganda infringes “the freedom of expression of the targeted group by delegitimizing or vilifying identifiable groups” and “makes it impossible for members of those groups to be heard or participate in civil society in a meaningful fashion.” JUST, Evidence, 1st Session, 42nd Parliament, 11 April 2019 (Shahen Mirakian, President, Armenian National Committee of Canada).

[44]            See, for example, JUST, Evidence, 1st Session, 42nd Parliament, 11 April 2019 (Shimon Koffler Fogel, President and Chief Executive Officer, Centre for Israel and Jewish Affairs); JUST, Evidence, 1st Session, 42nd Parliament, 2 May 2019 (Mukhbir Singh, President, World Sikh Organization of Canada); JUST, Evidence, 1st Session, 42nd Parliament, 9 May 2019 (Mohamed Labidi, Former President, Centre culturel islamique de Québec).

[45]            JUST, Evidence, 1st Session, 42nd Parliament, 11 April 2019 (Shimon Koffler Fogel, President and Chief Executive Officer, Centre for Israel and Jewish Affairs).

[46]            JUST, Evidence, 1st Session, 42nd Parliament, 9 May 2019 (Mohamed Labidi, Former President, Centre culturel islamique de Québec).

[47]            Ibid. See also JUST, Evidence, 1st Session, 42nd Parliament, 9 May 2019 (Andrew P.W. Bennett, Director, Cardus Religious Freedom Institute).

[48]            JUST, Evidence, 1st Session, 42nd Parliament, 9 May 2019 (Jasmin Zine, Professor, Sociology and Muslim Studies Option, Wilfrid Laurier University, As an Individual).

[49]            Canadian Charter of Rights and Freedoms, section 1. See, for example, JUST, Evidence, 1st Session, 42nd Parliament, 9 May 2019 (Andrew P.W. Bennett, Director, Cardus Religious Freedom Institute).

[50]            JUST, Evidence, 1st Session, 42nd Parliament, 11 April 2019 (Rabbi Idan Scher, Canadian Rabbinic Caucus).

[51]            JUST, Evidence, 1st Session, 42nd Parliament, 9 May 2019 (Jasmin Zine, Professor, Sociology and Muslim Studies Option, Wilfrid Laurier University, As an Individual).

[52]            See, for example, JUST, Evidence, 1st Session, 42nd Parliament, 11 April 2019 (Imam Farhan Iqbal, Ahmadiyya Muslim Jama'at); JUST, Evidence, 1st Session, 42nd Parliament, 9 May 2019 (Jasmin Zine, Professor, Sociology and Muslim Studies Option, Wilfrid Laurier University, As an Individual).

[53]            JUST, Evidence, 1st Session, 42nd Parliament, 11 April 2019 (Ryan Weston, Lead Animator, Public Witness for Social and Ecological Justice, Anglican Church of Canada).

[54]            JUST, Evidence, 1st Session, 42nd Parliament, 11 April 2019 (Shimon Koffler Fogel, President and Chief Executive Officer, Centre for Israel and Jewish Affairs).

[55]            See, for example, JUST, Evidence, 1st Session, 42nd Parliament, 11 April 2019 (Shimon Koffler Fogel, President and Chief Executive Officer, Centre for Israel and Jewish Affairs); JUST, Evidence, 1st Session, 42nd Parliament, 2 May 2019 (Mukhbir Singh, President, World Sikh Organization of Canada; Brian Herman, Director, Government Relations, B'nai Brith Canada). Mustafa Farooq noted that “training around how to interpret and how to lay charges under the Criminal Code and section 319 would be helpful.” JUST, Evidence, 1st Session, 42nd Parliament, 9 May 2019 (Mustafa Farooq, Executive Director, National Council of Canadian Muslims); JUST, Evidence, 1st Session, 42nd Parliament, 28 May 2019 (Elizabeth Moore, Educator and Advisory Board Member, Canadian Anti-Hate Network and Parents for Peace, As an Individual; Avi Benlolo, President and Chief Executive Officer, Friends of Simon Wiesenthal Center for Holocaust Studies); JUST, Brief submitted by Sarah Leamon Law Group, Consultation on Online Hate, 8 May 2019.

[56]            JUST, Evidence, 1st Session, 42nd Parliament, 9 May 2019 (Mukhbir Singh, President, World Sikh Organization of Canada); Brief submitted by the Organization for the Prevention of Violence, Responding to Hate Crimes and Incidents in Canada, May 2019.

[57]            JUST, Evidence, 1st Session, 42nd Parliament, 2 May 2019 (Brian Herman, Director, Government Relations, B'nai Brith Canada); Brief submitted by the Organization for the Prevention of Violence, Responding to Hate Crimes and Incidents in Canada, May 2019.

[58]            See, for example, JUST, Evidence, 1st Session, 42nd Parliament, 11 April 2019 (Shimon Koffler Fogel, President and Chief Executive Officer, Centre for Israel and Jewish Affairs). Shimon Koffler Fogel indicated, among other things, that “section 320.1 of the Criminal Code, which enables the courts to seize computer data believed on reasonable grounds to house hate propaganda, is a pragmatic tool that should be applied more often.” See also JUST, Evidence, 1st Session, 42nd Parliament, 2 May 2019 (Mukhbir Singh, President, World Sikh Organization of Canada); JUST, Evidence, 1st Session, 42nd Parliament, 28 May 2019 (Sinan Yasarlar, Public Relations Director, Windsor Islamic Association); JUST, Brief submitted by The Evangelical Fellowship of Canada, Online Hate, May 2019; Brief submitted by the Organization for the Prevention of Violence, Responding to Hate Crimes and Incidents in Canada, May 2019.

[59]            JUST, Evidence, 1st Session, 42nd Parliament, 11 April 2019 (Ryan Weston, Lead Animator, Public Witness for Social and Ecological Justice, Anglican Church of Canada). See also, JUST, Evidence, 1st Session, 42nd Parliament, 2 May 2019 (Brian Herman, Director, Government Relations, B'nai Brith Canada).

[60]            JUST, Evidence, 1st Session, 42nd Parliament, 11 April 2019 (Shahen Mirakian, President, Armenian National Committee of Canada).

[61]            JUST, Evidence, 1st Session, 42nd Parliament, 2 May 2019 (David Matas, Senior Legal Counsel, B'nai Brith Canada).

[62]            JUST, Evidence, 1st Session, 42nd Parliament, 28 May 2019 (Shalini Konanur, Executive Director and Lawyer, South Asian Legal Clinic of Ontario).

[63]            JUST, Evidence, 1st Session, 42nd Parliament, 11 April 2019 (André Schutten, Legal Counsel and Director of Law and Policy, Association for Reformed Political Action Canada). See also, JUST, Brief submitted by Sarah Leamon Law Group, Consultation on Online Hate, 8 May 2019.

[64]            JUST, Brief submitted by Richard Warman, Online Hate, May 2019.

[65]            JUST, Evidence, 1st Session, 42nd Parliament, 9 May 2019 (Leslie Rosenblood, Policy Advisor, Canadian Secular Alliance). See also JUST, Evidence, 1st Session, 42nd Parliament, 2 May 2019 (David Matas, Senior Legal Counsel, B'nai Brith Canada).

[66]            JUST, Evidence, 1st Session, 42nd Parliament, 9 May 2019 (Leslie Rosenblood, Policy Advisor, Canadian Secular Alliance).

[67]            Canadian Centre for Justice Statistics, Police-reported hate crime in Canada, 2017, No. 85-002-X, 30 April 2019. Crimes reported by the police each year include only those that come to the attention of police services and are substantiated through a police investigation. To put these statistics into perspective, hate crimes represent a tiny fraction of all police-reported crime in Canada each year. In 2017, they accounted for less than 0.1% of all police-reported crimes in the country.

[68]            Ibid., p. 5.

[69]            Respectively, increasing from 666 to 878 crimes and 460 to 842 crimes from 2016 to 2017. Ibid.

[70]            Ibid., p. 3.

[71]            Ibid.

[72]            Canadian Centre for Justice Statistics, Police-reported hate crime in Canada, 2017, No. 85-002-X, 30 April 2019, p. 11.

[73]            JUST, Evidence, 1st Session, 42nd Parliament, 16 May 2019 (Jennifer Klinck, Chair, Legal Issues Committee, Egale Canada Human Rights Trust).

[74]            As in 2016, hate crimes against Blacks were the most common type of crime motivated by racial or ethnic hatred in 2017.

[75]            JUST, Evidence, 1st Session, 42nd Parliament, 30 May 2019 (Kimberly Taplin, National Crime Prevention and Indigenous Policing Services, Royal Canadian Mounted Police).

[76]            JUST, Evidence, 1st Session, 42nd Parliament, 9 May 2019 (Jasmin Zine, Professor, Sociology and Muslim Studies Option, Wilfrid Laurier University, As an Individual).

[77]            JUST, Evidence, 1st Session, 42nd Parliament, 28 May 2019 (Shalini Konanur, Executive Director and Lawyer, South Asian Legal Clinic of Ontario).

[78]            JUST, Evidence, 1st Session, 42nd Parliament, 16 May 2019 (Ricki Justice, Acting Chair, Pride Centre of Edmonton).

[79]            JUST, Evidence, 1st Session, 42nd Parliament, 2 May 2019 (Queenie Choo, Chief Executive Officer, S.U.C.C.E.S.S.).

[80]            JUST, Evidence, 1st Session, 42nd Parliament, 11 April 2019 (Shahen Mirakian, President, Armenian National Committee of Canada). See also JUST, Evidence, 1st Session, 42nd Parliament, 2 May 2019 (Queenie Choo, Chief Executive Officer, S.U.C.C.E.S.S.).

[81]            JUST, Evidence, 1st Session, 42nd Parliament, 28 May 2019 (Dahabo Ahmed Omer, Board Member, Stakeholder Relations, Federation of Black Canadians).

[82]            JUST, Evidence, 1st Session, 42nd Parliament, 30 May 2019 (Naseem Mithoowani, Partner, Waldman & Associates, As an Individual).

[83]            JUST, Evidence, 1st Session, 42nd Parliament, 2 May 2019 (Queenie Choo, Chief Executive Officer, S.U.C.C.E.S.S.).

[84]            JUST, Brief submitted by Alyssa Blank, Combatting Online Hate: An Alternative Approach, May 2019; JUST, Brief submitted by the Organization for the Prevention of Violence, Responding to Hate Crimes and Incidents in Canada, May 2019.

[85]            JUST, Brief submitted by the Canadian Women’s Foundation, Online Hate, 10 May 2019.

[86]            Information provided to the Committee by Statistics Canada by email. Statistics Canada states specifically the following: “It is important to note that police-reported data on cyber-related hate crimes are an undercount due to the fact that not all police services have been able to provide Statistics Canada with information on those incidents that are cyber related.”

[87]            JUST, Evidence, 1st Session, 42nd Parliament, 11 April 2019 (Rabbi Idan Scher, Canadian Rabbinic Caucus).

[88]            JUST, Evidence, 1st Session, 42nd Parliament, 2 May 2019 (Queenie Choo, Chief Executive Officer, S.U.C.C.E.S.S.).

[89]            JUST, Evidence, 1st Session, 42nd Parliament, 2 May 2019 (Brian Herman, Director, Government Relations, B'nai Brith Canada).

[90]            JUST, Evidence, 1st Session, 42nd Parliament, 11 April 2019 (Shahen Mirakian, President, Armenian National Committee of Canada).

[91]            See, for example, JUST, Evidence, 1st Session, 42nd Parliament, 2 May 2019 (Mukhbir Singh, President, World Sikh Organization of Canada).

[92]            JUST, Evidence, 1st Session, 42nd Parliament, 16 May 2019 (Cara Zwibel, Director, Fundamental Freedoms Program, Canadian Civil Liberties Association).

[93]            JUST, Evidence, 1st Session, 42nd Parliament, 2 May 2019 (Brian Herman, Director, Government Relations, B'nai Brith Canada).

[94]            JUST, Evidence, 1st Session, 42nd Parliament, 28 May 2019 (Avi Benlolo, President and Chief Executive Officer, Friends of Simon Wiesenthal Center for Holocaust Studies).

[95]            JUST, Evidence, 1st Session, 42nd Parliament, 11 April 2019 (Shimon Koffler Fogel, President and Chief Executive Officer, Centre for Israel and Jewish Affairs).

[96]            JUST, Evidence, 1st Session, 42nd Parliament, 28 May 2019 (Lina Chaker, Spokesperson, Windsor Islamic Council).

[97]            JUST, Evidence, 1st Session, 42nd Parliament, 2 May 2019 (Brian Herman, Director, Government Relations, B'nai Brith Canada).

[98]            Ibid.

[99]            Ibid.

[100]          See, for example, JUST, Brief submitted by the National Council of Canadian Muslims, Brief on Online Hate: Legislative and Policy Approaches, 9 May 2019; JUST, Brief submitted by The Evangelical Fellowship of Canada, Online Hate, May 2019.

[101]          JUST, Evidence, 1st Session, 42nd Parliament, 9 May 2019 (Seifeddine Essid, Social Media Officer, Centre culturel islamique de Québec).

[102]          JUST, Evidence, 1st Session, 42nd Parliament, 2 May 2019 (Queenie Choo, Chief Executive Officer, S.U.C.C.E.S.S.).

[103]          JUST, Evidence, 1st Session, 42nd Parliament, 9 May 2019 (Jasmin Zine, Professor, Sociology and Muslim Studies Option, Wilfrid Laurier University, As an Individual; Bernie Farber, Chair, Canadian Anti-Hate Network). See also JUST, Evidence, 1st Session, 42nd Parliament, 28 May 2019 (Elizabeth Moore, Educator and Advisory Board Member, Canadian Anti-Hate Network and Parents for Peace, As an Individual). The Chief Commissioner of the Canadian Human Rights Commission also noted that the Committee could look into the “hallmarks of hate developed by the Canadian Human Rights Tribunal.” JUST, Evidence, 1st Session, 42nd Parliament, 30 May 2019 (Marie-Claude Landry, Chief Commissioner, Canadian Human Rights Commission).

[104]          JUST, Evidence, 1st Session, 42nd Parliament, 11 April 2019 (Shimon Koffler Fogel, President and Chief Executive Officer, Centre for Israel and Jewish Affairs). See also JUST, Evidence, 1st Session, 42nd Parliament, 11 April 2019 (Rabbi Idan Scher, Canadian Rabbinic Caucus).

[105]          Brief submitted by YWCA Canada, Addressing Online Hate: Applying Intersectional Gender Lens, 10 May 2019.

[106]          See, for example, JUST, Evidence, 1st Session, 42nd Parliament, 11 April 2019 (Shimon Koffler Fogel, President and Chief Executive Officer, Centre for Israel and Jewish Affairs); JUST, Evidence, 1st Session, 42nd Parliament, 2 May 2019 (David Matas, Senior Legal Counsel, B'nai Brith Canada).

[107]          JUST, Evidence, 1st Session, 42nd Parliament, 30 May 2019 (Michele Austin, Head, Government and Public Policy, Twitter Canada, Twitter Inc.); JUST, Evidence, 1st Session, 42nd Parliament, 4 June 2019 (Colin McKay, Head, Government Relations and Public Policy, Google Canada); JUST, Evidence, 1st Session, 42nd Parliament, 6 June 2019 (Kevin Chan, Global Policy Director, Facebook Inc.).

[108]          Information provided to the Committee by Google Canada by email.

[109]          See, for example, JUST, Evidence, 1st Session, 42nd Parliament, 11 April 2019 (Shimon Koffler Fogel, President and Chief Executive Officer, Centre for Israel and Jewish Affairs); JUST, Evidence, 1st Session, 42nd Parliament, 2 May 2019 (Mukhbir Singh, President, World Sikh Organization of Canada); JUST, Evidence, 1st Session, 42nd Parliament, 28 May 2019 (Avi Benlolo, President and Chief Executive Officer, Friends of Simon Wiesenthal Center for Holocaust Studies).

[110]          JUST, Evidence, 1st Session, 42nd Parliament, 6 June 2019 (Kevin Chan, Global Policy Director, Facebook Inc.).

[111]          JUST, Evidence, 1st Session, 42nd Parliament, 6 June 2019 (Kevin Chan, Global Policy Director, Facebook Inc.).

[112]          JUST, Evidence, 1st Session, 42nd Parliament, 9 May 2019 (Jasmin Zine, Professor, Sociology and Muslim Studies Option, Wilfrid Laurier University, As an Individual).

[113]          JUST, Evidence, 1st Session, 42nd Parliament, 2 May 2019 (Brian Herman, Director, Government Relations, B'nai Brith Canada). Ryan Weston noted specifically that “Corporations, including the large social media companies, must update their terms of use and their monitoring and reporting activities in order to better control the dissemination of hate through their networks and to remove hateful posts and users”. JUST, Evidence, 1st Session, 42nd Parliament, 11 April 2019 (Ryan Weston, Lead Animator, Public Witness for Social and Ecological Justice, Anglican Church of Canada). See also JUST, Evidence, 1st Session, 42nd Parliament, 28 May 2019 (Faisal Khan Suri, President, Alberta Muslim Public Affairs Council).

[114]          JUST, Evidence, 1st Session, 42nd Parliament, 28 May 2019 (Akaash Maharaj, Chief Executive Officer, Mosaic Institute).

[115]          Ibid.

[116]          JUST, Evidence, 1st Session, 42nd Parliament, 11 April 2019 (Imam Farhan Iqbal, Ahmadiyya Muslim Jama'at).

[117]          JUST, Evidence, 1st Session, 42nd Parliament, 11 April 2019 (Rabbi Idan Scher, Canadian Rabbinic Caucus). See also JUST, Evidence, 1st Session, 42nd Parliament, 11 April 2019 (Ryan Weston, Lead Animator, Public Witness for Social and Ecological Justice, Anglican Church of Canada).

[118]          JUST, Evidence, 1st Session, 42nd Parliament, 11 April 2019 (Geoffrey Cameron, Director, Office of Public Affairs, Bahá'í Community of Canada).

[119]          JUST, Evidence, 1st Session, 42nd Parliament, 11 April 2019 (Rabbi Idan Scher, Canadian Rabbinic Caucus); JUST, Evidence, 1st Session, 42nd Parliament, 9 May 2019 (Mustafa Farooq, Executive Director, National Council of Canadian Muslims).

[120]          JUST, Evidence, 1st Session, 42nd Parliament, 11 April 2019 (Geoffrey Cameron, Director, Office of Public Affairs, Bahá'í Community of Canada).

[121]          JUST, Evidence, 1st Session, 42nd Parliament, 16 May 2019 (Jennifer Klinck, Chair, Legal Issues Committee, Egale Canada Human Rights Trust).

[122]          See, for example, JUST, Evidence, 1st Session, 42nd Parliament, 2 May 2019 (Mukhbir Singh, President, World Sikh Organization of Canada).

[123]          JUST, Evidence, 1st Session, 42nd Parliament, 28 May 2019 (Avi Benlolo, President and Chief Executive Officer, Friends of Simon Wiesenthal Center for Holocaust Studies). See also, JUST, Evidence, 1st Session, 42nd Parliament, 28 May 2019 (Faisal Khan Suri, President, Alberta Muslim Public Affairs Council; Lina Chaker, Spokesperson, Windsor Islamic Council).

[124]          JUST, Evidence, 1st Session, 42nd Parliament, 28 May 2019 (Akaash Maharaj, Chief Executive Officer, Mosaic Institute).

[125]          JUST, Evidence, 1st Session, 42nd Parliament, 16 May 2019 (Jennifer Klinck, Chair, Legal Issues Committee, Egale Canada Human Rights Trust).

[126]          See, for example, JUST, Evidence, 1st Session, 42nd Parliament, 11 April 2019 (Richard Marceau, Vice‑President, External Affairs and General Counsel, Centre for Israel and Jewish Affairs); JUST, Evidence, 1st Session, 42nd Parliament, 9 May 2019 (Mustafa Farooq, Executive Director, National Council of Canadian Muslims); JUST, Brief submitted by Alyssa Blank, Combatting Online Hate: An Alternative Approach, May 2019; JUST, Brief submitted by Heidi Tworek, International Approaches to Regulating Hate Speech Online, 20 May 2019.

[127]          JUST, Evidence, 1st Session, 42nd Parliament, 11 April 2019 (Shimon Koffler Fogel, President and Chief Executive Officer, Centre for Israel and Jewish Affairs).

[128]          JUST, Evidence, 1st Session, 42nd Parliament, 9 May 2019 (Mohamed Labidi, Former President, Centre culturel islamique de Québec).

[129]          JUST, Evidence, 1st Session, 42nd Parliament, 16 May 2019 (Ricki Justice, Acting Chair, Pride Centre of Edmonton).

[130]          JUST, Evidence, 1st Session, 42nd Parliament, 16 May 2019 (Jennifer Klinck, Chair, Legal Issues Committee, Egale Canada Human Rights Trust).

[131]          JUST, Evidence, 1st Session, 42nd Parliament, 2 May 2019 (David Matas, Senior Legal Counsel, B'nai Brith Canada).

[132]          Ibid.

[133]          See, for example, JUST, Evidence, 1st Session, 42nd Parliament, 28 May 2019 (Faisal Khan Suri, President, Alberta Muslim Public Affairs Council; Lina Chaker, Spokesperson, Windsor Islamic Council).

[134]          JUST, Evidence, 1st Session, 42nd Parliament, 9 May 2019 (Mohamed Labidi, Former President, Centre culturel islamique de Québec).

[135]          JUST, Evidence, 1st Session, 42nd Parliament, 28 May 2019 (Avi Benlolo, President and Chief Executive Officer, Friends of Simon Wiesenthal Center for Holocaust Studies). See also, JUST, Evidence, 1st Session, 42nd Parliament, 28 May 2019 (Heidi Tworek, Assistant Professor, University of British Columbia); JUST, Brief submitted by the Iranian Canadian Congress, Online Hate, May 2019.

[136]          JUST, Evidence, 1st Session, 42nd Parliament, 2 May 2019 (Brian Herman, Director, Government Relations, B'nai Brith Canada). See United Kingdom, Government of the United Kingdom, Open consultation, Online Harms White Paper, last updated 30 April 2019.

[137]          JUST, Evidence, 1st Session, 42nd Parliament, 9 May 2019 (Mustafa Farooq, Executive Director, National Council of Canadian Muslims).

[138]          Canadian Human Rights Act, R.S.C., 1985, c. H-6.

[140]          Section 13 of the Canadian Human Rights Act is reproduced at Appendix C. The prohibited grounds of discrimination set out in section 3 of the CHRA are: race, national or ethnic origin, colour, religion, age, sex, sexual orientation, gender identity or expression, marital status, family status, genetic characteristics, disability and conviction for an offence for which a pardon has been granted or in respect of which a record suspension has been ordered.

[141]          JUST, Evidence, 1st Session, 42nd Parliament, 11 April 2019 (Shimon Koffler Fogel, President and Chief Executive Officer, Centre for Israel and Jewish Affairs).

[142]          See, for example, JUST, Evidence, 1st Session, 42nd Parliament, 11 April 2019 (André Schutten, Legal Counsel and Director of Law and Policy, Association for Reformed Political Action Canada); JUST, Evidence, 1st Session, 42nd Parliament, 9 May 2019 (Leslie Rosenblood, Policy Advisor, Canadian Secular Alliance); JUST, Evidence, 1st Session, 42nd Parliament, 16 May 2019 (Cara Zwibel, Director, Fundamental Freedoms Program, Canadian Civil Liberties Association); JUST, Evidence, 1st Session, 42nd Parliament, 4 June 2019 (Lindsay Shepherd, As an Individual; Mark Steyn, As an Individual).

[143]          JUST, Evidence, 1st Session, 42nd Parliament, 11 April 2019 (André Schutten, Legal Counsel and Director of Law and Policy, Association for Reformed Political Action Canada).

[144]          JUST, Evidence, 1st Session, 42nd Parliament, 9 May 2019 (Leslie Rosenblood, Policy Advisor, Canadian Secular Alliance).

[145]          JUST, Evidence, 1st Session, 42nd Parliament, 16 May 2019 (Cara Zwibel, Director, Fundamental Freedoms Program, Canadian Civil Liberties Association).

[146]          See, for example, JUST, Evidence, 1st Session, 42nd Parliament, 11 April 2019 (Shimon Koffler Fogel, President and Chief Executive Officer, Centre for Israel and Jewish Affairs); JUST, Evidence, 1st Session, 42nd Parliament, 9 May 2019 (Mustafa Farooq, Executive Director, National Council of Canadian Muslims); JUST, Evidence, 1st Session, 42nd Parliament, 16 May 2019 (Morgane Oger, Founder, Morgane Oger Foundation); JUST, Evidence, 1st Session, 42nd Parliament, 28 May 2019 (Avi Benlolo, President and Chief Executive Officer, Friends of Simon Wiesenthal Center for Holocaust Studies).

[147]          JUST, Evidence, 1st Session, 42nd Parliament, 28 May 2019 (Avi Benlolo, President and Chief Executive Officer, Friends of Simon Wiesenthal Center for Holocaust Studies).

[148]          JUST, Evidence, 1st Session, 42nd Parliament, 28 May 2019 (Elizabeth Moore, Educator and Advisory Board Member, Canadian Anti-Hate Network and Parents for Peace, As an Individual).

[149]          See, for example, JUST, Evidence, 1st Session, 42nd Parliament, 2 May 2019 (David Matas, Senior Legal Counsel, B'nai Brith Canada); JUST, Evidence, 1st Session, 42nd Parliament, 9 May 2019 (Bernie Farber, Chair, Canadian Anti-Hate Network); JUST, Evidence, 1st Session, 42nd Parliament, 16 May 2019 (Nancy Peckford, Senior Advisor, Equal Voice; Morgane Oger, Founder, Morgane Oger Foundation); JUST, Evidence, 1st Session, 42nd Parliament, 28 May 2019 (Shalini Konanur, Executive Director and Lawyer, South Asian Legal Clinic of Ontario); JUST, Evidence, 1st Session, 42nd Parliament, 28 May 2019 (Sinan Yasarlar, Public Relations Director, Windsor Islamic Association); JUST, Brief submitted by Women’s Legal Education and Action Fund, Online Hate, 10 May 2019. Also, in its brief, the British Columbia Teachers’ Federation proposes to include hate speech provisions in the Canadian Human Rights Act. JUST, Brief submitted by the British Columbia Teachers’ Federation, Study on Online Hate, May 2019; JUST, Evidence, 1st Session, 42nd Parliament, 30 May 2019 (Naseem Mithoowani, Partner, Waldman & Associates, As an Individual); JUST, Brief submitted by Jane Bailey and Valerie Steeves, Online Hate, 9 May 2019. The Canadian Women’s Foundation recommended that a human rights remedy similar to former section 13 be created. See Brief submitted by the Canadian Women’s Foundation, Online Hate, 10 May 2019.

[150]          See, for example, JUST, Evidence, 1st Session, 42nd Parliament, 16 May 2019 (Jennifer Klinck, Chair, Legal Issues Committee, Egale Canada Human Rights Trust); JUST, Evidence, 1st Session, 42nd Parliament, 11 April 2019 (Alex Neve, Secretary General, Amnesty International Canada); JUST, Evidence, 1st Session, 42nd Parliament, 9 May 2019 (Bernie Farber, Chair, Canadian Anti-Hate Network); JUST, Brief submitted by Sarah Leamon Law Group, Consultation on Online Hate, 8 May 2019.

[151]          See, for example, JUST, Evidence, 1st Session, 42nd Parliament, 9 May 2019 (Mustafa Farooq, Executive Director, National Council of Canadian Muslims); JUST, Evidence, 1st Session, 42nd Parliament, 28 May 2019 (Faisal Khan Suri, President, Alberta Muslim Public Affairs Council); JUST, Brief submitted by the National Council of Canadian Muslims, Brief on Online Hate: Legislative and Policy Approaches, 9 May 2019.

[152]          JUST, Evidence, 1st Session, 42nd Parliament, 16 May 2019 (Morgane Oger, Founder, Morgane Oger Foundation).

[153]          JUST, Evidence, 1st Session, 42nd Parliament, 11 April 2019 (Shimon Koffler Fogel, President and Chief Executive Officer, Centre for Israel and Jewish Affairs).

[154]          JUST, Evidence, 1st Session, 42nd Parliament, 11 April 2019 (Alex Neve, Secretary General, Amnesty International Canada).

[155]          See, for example, JUST, Evidence, 1st Session, 42nd Parliament, 11 April 2019 (Rabbi Idan Scher, Canadian Rabbinic Caucus; Shahen Mirakian, President, Armenian National Committee of Canada); JUST, Evidence, 1st Session, 42nd Parliament, 28 May 2019 (Shalini Konanur, Executive Director and Lawyer, South Asian Legal Clinic of Ontario); JUST, Evidence, 1st Session, 42nd Parliament, 30 May 2019 (Marie-Claude Landry, Chief Commissioner, Canadian Human Rights Commission); JUST, Brief submitted by Jane Bailey and Valerie Steeves, Online Hate, 9 May 2019.

[156]          For example, Herman mentioned that “[t]he public needs to understand the challenges and the role they play in countering online hate, including disinformation. We feel strongly that action cannot just be left to governments, platforms and content providers.” See JUST, Evidence, 1st Session, 42nd Parliament, 2 May 2019 (Brian Herman, Director, Government Relations, B'nai Brith Canada).

[157]          JUST, Evidence, 1st Session, 42nd Parliament, 11 April 2019 (Ryan Weston, Lead Animator, Public Witness for Social and Ecological Justice, Anglican Church of Canada).

[158]          JUST, Evidence, 1st Session, 42nd Parliament, 11 April 2019 (Shimon Koffler Fogel, President and Chief Executive Officer, Centre for Israel and Jewish Affairs). More information about the recommended national strategy can be found in the Policy Brief of the Centre for Israel and Jewish Affairs entitled Combating Online Hate, November 2018.

[159]          JUST, Evidence, 1st Session, 42nd Parliament, 2 May 2019 (Queenie Choo, Chief Executive Officer, S.U.C.C.E.S.S.).

[160]          JUST, Evidence, 1st Session, 42nd Parliament, 2 May 2019 (Mukhbir Singh, President, World Sikh Organization of Canada).

[161]          See, for example, JUST, Evidence, 1st Session, 42nd Parliament, 28 May 2019 (Sinan Yasarlar, Public Relations Director, Windsor Islamic Association); JUST, Brief submitted by The Evangelical Fellowship of Canada, Online Hate, May 2019; Brief submitted by the Canadian Women’s Foundation, Online Hate, 10 May 2019.

[162]          See, for example, JUST, Evidence, 1st Session, 42nd Parliament, 11 April 2019 (Imam Farhan Iqbal, Ahmadiyya Muslim Jama'at); JUST, Evidence, 1st Session, 42nd Parliament, 11 April 2019 (André Schutten, Legal Counsel and Director of Law and Policy, Association for Reformed Political Action Canada).

[163]          JUST, Evidence, 1st Session, 42nd Parliament, 28 May 2019 (Avi Benlolo, President and Chief Executive Officer, Friends of Simon Wiesenthal Center for Holocaust Studies).

[164]          JUST, Evidence, 1st Session, 42nd Parliament, 9 May 2019 (Mustafa Farooq, Executive Director, National Council of Canadian Muslims).

[165]          See, for example, JUST, Evidence, 1st Session, 42nd Parliament, 11 April 2019 (Alex Neve, Secretary General, Amnesty International Canada).

[166]          JUST, Evidence, 1st Session, 42nd Parliament, 28 May 2019 (Bradley Galloway, Research and Intervention Specialist, Organization for the Prevention of Violence).

[167]          JUST, Evidence, 1st Session, 42nd Parliament, 11 April 2019 (Shimon Koffler Fogel, President and Chief Executive Officer, Centre for Israel and Jewish Affairs). See also, JUST, Brief submitted by The Evangelical Fellowship of Canada, Online Hate, May 2019.

[168]          JUST, Evidence, 1st Session, 42nd Parliament, 11 April 2019 (Ryan Weston, Lead Animator, Public Witness for Social and Ecological Justice, Anglican Church of Canada).

[169]          JUST, Evidence, 1st Session, 42nd Parliament, 11 April 2019 (Geoffrey Cameron, Director, Office of Public Affairs, Bahá'í Community of Canada).

[170]          JUST, Evidence, 1st Session, 42nd Parliament, 2 May 2019 (Queenie Choo, Chief Executive Officer, S.U.C.C.E.S.S.).

[171]          JUST, Evidence, 1st Session, 42nd Parliament, 28 May 2019 (Akaash Maharaj, Chief Executive Officer, Mosaic Institute).

[172]          JUST, Evidence, 1st Session, 42nd Parliament, 9 May 2019 (Mustafa Farooq, Executive Director, National Council of Canadian Muslims). Similar comments were made by Lina Chaker. JUST, Evidence, 1st Session, 42nd Parliament, 28 May 2019 (Lina Chaker, Spokesperson, Windsor Islamic Council).

[173]          JUST, Evidence, 1st Session, 42nd Parliament, 28 May 2019 (Shalini Konanur, Executive Director and Lawyer, South Asian Legal Clinic of Ontario).