JUST Committee Meeting







House of Commons Emblem

Standing Committee on Justice and Human Rights


NUMBER 152 | 1st SESSION | 42nd PARLIAMENT

EVIDENCE

Thursday, May 30, 2019

[Recorded by Electronic Apparatus]

  (0850)  

[English]

     Welcome to this meeting of the Standing Committee on Justice and Human Rights, as we resume our study on online hate.
    We have quite an illustrious group of witnesses with us today.
    From the Canadian Human Rights Commission, we have Madam Marie-Claude Landry, who is the Chief Commissioner.

[Translation]

    I want to welcome you, Ms. Landry. It's a pleasure to have you here.
    We're also joined by Monette Maillet.

[English]

    She is the Deputy Executive Director and Senior General Counsel, Human Rights Promotion. Welcome.
    From the Department of Canadian Heritage, we have Ms. Lisa-Marie Inman, who is the Director General of Multiculturalism. Welcome, Ms. Inman.
    From the Department of Justice, which is here only to offer technical expertise on our questions, we have Mr. Glenn Gilmour, Counsel, Criminal Law Policy Section. Welcome, Mr. Gilmour.
    We also have Mr. Eric Nielsen, Counsel, Human Rights Law Section. Welcome, Mr. Nielsen.
    Mr. Gilmour will answer any questions about the Criminal Code provisions on online hate, and Mr. Nielsen will answer questions about the Canadian Human Rights Act.
    From the Royal Canadian Mounted Police, we have Ms. Kimberly Taplin, who is the National Crime Prevention and Indigenous Policing Services Superintendent. Welcome, Ms. Taplin. Thank you for coming.
    From the Saskatchewan Human Rights Commission, by video conference, we have Mr. David Arnot, who is the Chief Commissioner. Welcome, Mr. Arnot.
    Good morning.
    All of the different groups have eight minutes.
    Because you are on video conference, Mr. Arnot, and we don't want to lose you, we're going to start with you, sir.
    The floor is yours.
     Thank you very much for the invitation and the opportunity.
    There's been a proliferation of hate speech online, propaganda, radicalism and obscenity. In 2016, Cision documented a 600% increase in the amount of hate speech in social media postings between November 2015 and November 2016. In 2019, Léger Marketing indicated that 60% of Canadians report having seen hate speech on social media.
    These statistics should not come as a surprise to anyone. When the federal government repealed section 13 of the Canadian Human Rights Act in 2013, we lost the capacity to protect against this. For the past six years Canadian citizens have had little ability to protect themselves against online hate speech and discrimination.
    The fundamental problem is that Criminal Code provisions are often ineffective: prosecutions are few, and the burden of proving intent to promote hatred against a group beyond a reasonable doubt is almost impossible to meet. The 2008 Saskatchewan Provincial Court case of R. v. Ahenakew demonstrates that clearly.
    In the case of Saskatchewan Human Rights Commission v. Whatcott, the Supreme Court of Canada, in a unanimous decision, stated that an effective way to curb hate speech is not within the Criminal Code, but in a civil process through human rights commissions. The commission argued that the Criminal Code provisions regulate only the most extreme forms of hate speech, advocating genocide or inciting a breach of the peace. The Supreme Court specifically and narrowly defined hate speech to ensure that human rights legislation does not unreasonably infringe on freedom of expression. This is the most important contribution the Saskatchewan Human Rights Commission has made to Canadian jurisprudence. I put forward the idea that this case provides a blueprint for the work of this committee.
     Justice Rothstein made the following salient points for the court.
     The court described nine indicia of hate in paragraph 44, which are clear, concise and unambiguous. The argument for free speech is not a shield to be used to protect hate speech. The courts have consistently used the hate speech definition from the 1990 Taylor case in the Supreme Court of Canada. This analysis excludes expression that is merely offensive or very hurtful and obnoxious.
     The prohibition does not target expression that debates the merits of reducing the rights of vulnerable Canadian citizens. It restricts only expression that exposes those members to hatred. Ideas are not the target; rather, the mode of expressing the idea is the target.
     Ironically, hate speech arises in public debates and can be very restrictive and exclusionary. Legitimate debate in our democracy that is expressed in a civil manner encourages the exchange of opposing views. Hate speech is antithetical to that objective. It shuts down dialogue by making it difficult or impossible for members of a vulnerable group to respond, thereby stifling discourse. Hate speech that shuts down public debate cannot dodge prohibition on the basis that it promotes debate.
     Preventative measures in human rights legislation reasonably centre on the effects rather than the intent of the hatemonger. The evil of hate propaganda is beyond doubt. Hate expression causes real harm to real people. Hate speech demeans, denigrates and dehumanizes the citizens it targets. Through hate speech individuals are told they are entitled to less than other Canadians because of the characteristics they possess.
     With the advent of instant unfettered electronic communication, the opportunity for dissemination is nearly unlimited and largely uncontrolled. A realistic view of modern society must inform free speech, discourse, and the limits thereon.
    The Whatcott judgment was rendered in February 2013. Later that same year, section 13 of the Canadian Human Rights Act was officially repealed. The repeal was based on the argument that the section unduly fettered free speech, yet its opponents provided only anecdotal examples to justify their position. There is no empirical evidence that human rights legislation unduly fetters legal speech. Contrary to the arguments of the free speech advocates, Canada has no democratic tradition of unbridled free speech. Freedom of speech in Canada has always been freedom governed by limits recognized in law.
     Principles of freedom of speech were originally derived from common law principles reflected in the Constitution Act, 1867. Freedom of speech was expressly declared in the Canadian Bill of Rights, 1960. A Canadian citizen's right to freedom of expression was not given express constitutional protection until the enactment of the charter in 1982.
    Despite the charter protection of freedom of expression, there are numerous limits to free expression that are justifiable in a free and democratic society. Reasonable limits to expression protect against greater harms that flow from unfettered speech.
    Some of those limitations include defamation, libel, slander, perjury, child pornography, court ordered publication ban, limits on tobacco, alcohol and drug advertising, insider trading, fraud in the business sector, copyrights, trademarks, and hate speech. There are literally hundreds of legally justified limitations on freedom of expression in Canada.
    However, let's remain focused on hate speech. Here are the recommendations of the Saskatchewan Human Rights Commission to this committee:
    First, the Saskatchewan Human Rights Commission supports the reintroduction of prohibitions in the Canadian Human Rights Act against hateful expression, and the re-inclusion of telecommunications and the Internet in that act.
    The provision could be more effective if the Canadian Human Rights Commission is permitted to commence a complaint on its own initiative on behalf of an affected group, such as a class action type of model. The Saskatchewan Human Rights Commission has that ability. Proceeds of a successful complaint could be paid to a community organization that supports the targeted group and/or fights against hate speech.
    We must enact meaningful legislation that allows human rights commissions to do their job effectively and to hold those who spread online hate responsible for their actions.
    Second, create legislation that holds companies financially accountable for hosting, spreading or creating content that foments online hate. Germany passed the “Facebook act”, which requires social media networks with more than two million users to take down hateful content within 24 hours or face a very significant financial penalty.
    In the United Kingdom, the “Online Harms White Paper” has proposed establishing an independent regulator that would write a code of practice for social networks and Internet companies and have the ability to fine companies that don't enforce those rules. In Canada, we must follow suit.
    Recently, giant tech companies such as Microsoft, Twitter, Facebook and Google came together to condemn online hate and agreed to a nine-point plan on how to curb hate. That is a very good thing. However, we cannot rely on commercial entities to determine what type of behaviour and content is acceptable. That would be a fundamental abdication of the legislative responsibility of Parliament. Instead, we need to develop a “made in Canada for Canada” plan, a plan created by governments after thorough consultations with industry stakeholders, a plan that publicly sets out rules, that monitors platform compliance and that penalizes when necessary.
    Third, Canadian agencies must be given the means and mandate to monitor and investigate online hate, extremism and radicalized influences. In a time when hate and misinformation spread like wildfire online, data collection and intelligence gathering are paramount. That is why part of a “made in Canada for Canada” plan should include a partnership between federal security agencies, social media companies and Internet providers. We have arrived at a moment in our history in which words and well-intentioned platitudes no longer suffice.
    The digital revolution, which has transformed society for both good and ill, has begun to disrupt our democracy. Individuals and groups, foreign and domestic, are using online misinformation, hate and extremist recruitment to erode democratic discourse and to drive a wedge between Canadian citizens.
    We cannot let that happen. We need to take action. Our leaders must have the authority and the moral courage to do what is right. They must choose unity over division, understanding over ignorance, and respect over hate. They must make decisions that work towards the greater good, that respect the rule of law, reflect the charter, and in turn, make the difficult decisions that protect what it means to be a Canadian citizen.

  (0855)  

     This starts by enacting meaningful legislation that will allow governments, human rights commissions, industry, regulatory agencies and the public to effectively combat online hate and misinformation. That's where it starts, but that's not where it ends.
    Fourth, we must also invest in education so that youth of tomorrow no longer—

  (0900)  

    Mr. Arnot, I'm sorry, but you have exceeded your time limit by a little bit. Could I ask you to wrap up, please.
    I would say that Heritage Canada should pay attention to these issues and should work to ensure that digital literacy is available to all students in Canada from kindergarten to grade 12, on a coast-to-coast basis.
    I had other things to say, Chair; however, I'll close with that.
    Thank you.
    Thank you very much. That's very helpful and appreciated.

[Translation]

    We'll now move on to the Canadian Human Rights Commission.
    Ms. Landry, you have the floor.

[English]

    Good morning.
    My remarks will be offered in both official languages.
    Thank you for inviting the Canadian Human Rights Commission to participate in this discussion today on online hate. I am joined by my colleague Monette Maillet, Deputy Executive Director and Senior General Counsel.
    The proliferation of online hate is a clear and present danger. In recent years it has become painfully clear that allowing online hate to fester can result in horrific consequences. We are therefore encouraged that the justice committee is conducting this important study. We are pleased to see that you are hearing from several witnesses representing the people and communities most often targeted by hate.
    Hate speech, and particularly online hate, is both an urgent public safety issue and a fundamental human rights issue. Hate speech violates a person's most basic human rights and freedoms: the right to equality and the right to live free from discrimination.
    I will focus my remarks on three key points. First, online hate causes harm. Second, there is a gap in the law when it comes to protecting people from online hate. Third, a comprehensive strategy is needed.

[Translation]

    The Internet has given everyone the power to have their own platform and to be a broadcaster. People can be louder than ever before and influence more people than ever before. In many ways, this is a major step forward. However, the Internet has made it possible to amplify and spread hate speech.
    Far too often, people are victimized by online hate because of their race, religion, gender, sexual orientation or where they're from. Online hate has been found to cause fear and serious psychological harm. It shuts down debate and it promotes conflict, division and social tension. At its most serious, online hate incites violence, and too often, far too often, leads to tragic situations.
    If Canadians targeted by online hate are expected to live their lives in a toxic atmosphere, we're basically failing them. Canada has a responsibility under international and domestic laws to promote equality and to protect all Canadians from discrimination.

[English]

     This brings me to my second point. There is a gap in the law when it comes to protecting people from online hate. The now repealed section 13 of the Canadian Human Rights Act has given the commission an informed perspective on addressing online hate in Canada.
    As many of you may know, section 13 was originally written into the CHRA in the 1970s to prevent harm from prohibited hate messages, at that time anti-Semitic messages communicated by telephone. Following the attacks of September 11, section 13 was broadened to include messages communicated over the Internet. For many years, it was effective in shutting down a number of extreme neo-Nazi websites. However, this approach is not well suited to respond to today's rapidly evolving technology. As you know, section 13 was deemed to be a constitutionally sound provision.
    As well, the Supreme Court of Canada has confirmed that some limits to free speech are justifiable in a free and democratic society. We have noted that previous witnesses have spoken of the need for a definition of “hate”. To this end, we encourage this committee to look at the definitions put forward by the Supreme Court of Canada, as well as the hallmarks of hate developed by the Canadian Human Rights Tribunal.
    In the discussion around freedom of expression and hate speech, we must not forget the fundamental right to equality and to be free from discrimination. There is no hierarchy of rights, and rights sometimes compete. The commission believes there needs to be an appropriate balance. That is going to require meaningful participation and accountability of all involved parties.
    What we can say for certain is that something must be done quickly to address the proliferation of online hate. It threatens public safety, violates human rights and undermines democracy. As other witnesses have said, addressing online hate will require a proactive approach that involves tracking, intervention and prevention.
    This brings me to my third point. A comprehensive strategy is needed. It will take a concerted and coordinated long-term effort that is proactive, multipronged and multi-faceted. It will take innovative thinking, technical expertise, proper resourcing, coordination and co-operation.
    The strategy will need to bring together all levels of government, telecommunication and Internet providers, social media platforms, civil society, academia and, most importantly, victims of hate.
    These efforts must be led by the government. The government has a duty to meet its domestic and international human rights obligations. This includes protecting citizens from hateful speech.

  (0905)  

[Translation]

    In conclusion, the Canadian Human Rights Commission is committed to fighting against hate and to participating in a broader, coordinated solution.
    In response to evidence heard by the committee, the CHRC finds that a simple amendment to the Canadian Human Rights Act to include provisions similar to the former section 13 would be insufficient. In this modern era, that legal change alone could provide neither the scope nor the level of protection or remedies necessary to prevent online harassment or to effectively reduce hate propaganda.
    If the committee or the government explores possible amendments to the Canadian Human Rights Act or to other legislation as part of a broader response to hate propaganda issues, the CHRC would be happy to contribute its expertise.
    In the coming days, the CHRC will submit a number of documents, including a summary report of a recent jointly organized event to discuss online hate.
    Thank you. My colleague Monette Maillet and I would be pleased to answer your questions.
    Thank you, Ms. Landry.

[English]

    We're going to move to the Department of Canadian Heritage now.
    Ms. Inman.
     Thank you very much for inviting me to address the committee today. I'd like to speak to you about the work the department is undertaking related to racism and religious discrimination.

[Translation]

    Evidence is clear that racism and discrimination continue to exist in Canada. Addressing them is part of the federal government's responsibility to sustain a society that values all its members and treats them with dignity and respect.

[English]

    One way that is achieved is through Canada's multiculturalism policy, which was designed to create a climate in which the multicultural heritage of each of us is valued and to contribute to building a society where all can participate in the economic, social, cultural and political life of Canada.

[Translation]

    The multiculturalism program works toward these objectives by focusing its efforts on building an integrated and socially cohesive society; improving the responsiveness of federal institutions to the needs of a diverse population; and engaging in discussions on multiculturalism, inclusion and diversity at the domestic and international levels.

[English]

    There are four key activities that the multicultural program undertakes. First is grants and contributions via the community support, multiculturalism and anti-racism initiatives program. Second is public outreach and promotion through public events and key outreach initiatives such as Asian Heritage Month and Black History Month. Third is support of federal and public institutions to help them meet their obligations under the Canadian Multiculturalism Act. Fourth is international engagement through providing support for Canada's membership in the International Holocaust Remembrance Alliance and ensuring Canada meets its obligations as a signatory to the International Convention on the Elimination of All Forms of Racial Discrimination.
    In budget 2018, new funding in the amount of $23 million over two years was allocated to the program: $21 million to support events and projects that target racism and discrimination with a particular focus on indigenous peoples and racialized women and girls, and $2 million to support cross-country consultations on a new national anti-racism and anti-discrimination approach.

  (0910)  

[Translation]

    Budget 2018 also provided $9 million over three years to the Department of Canadian Heritage and $10 million over five years to the Public Health Agency of Canada to address the challenges faced by Black Canadians.

[English]

    In 2018, the Minister of Canadian Heritage and Multiculturalism was asked by the Prime Minister to develop a new federal anti-racism approach to combat racism and discrimination. In support of this mandate, we carried out engagement sessions from October 2018 to March 2019 to gather input from Canadians, including experts, faith and community leaders, and those with lived experiences of racism and discrimination.

[Translation]

    In total, 22 in-person sessions were held, involving over 600 participants from some 443 organizations. Over 1,000 online submissions were received.

[English]

    A further $45 million over three years was allocated in budget 2019 for the multiculturalism program to develop and implement a federal anti-racism strategy. In the budget announcement, the strategy was described as finding ways to counter racism in its various forms, with a strong focus on community-based projects. The announcement also highlighted an anti-racism secretariat that would work across government to identify opportunities, coordinate activities and engage with Canada's diverse communities.

[Translation]

    Increasingly intolerant and racist language—hate speech—is available online. It isn't just flourishing in private conversations on social media platforms such as Facebook or Instagram. It's also on the rise on more public sites such as YouTube, and in comments sections, web forums and blogs.

[English]

    Participants in our engagement sessions told us that online hate is an underlying factor that contributes to or causes racism. It is a serious phenomenon that exists in many forms and significantly impacts young people. People told us that social media can play a significant role both in spreading hate and also in combatting it.

[Translation]

    Canadian Heritage plays a vital role in the cultural, civic and economic life of all Canadians. We'll continue to use the levers available to us to work towards addressing hate online, together with our federal partners and with communities.

[English]

    Thank you.
    Thank you very much.
    We'll now go to the RCMP.
    Superintendent Taplin, the floor is yours.
    Good morning, Mr. Chair, members of the committee and ladies and gentlemen.
    Thank you for inviting me here to speak with you today.
    I am Superintendent Kim Taplin, and as you know, I am the officer in charge of National Crime Prevention and Indigenous Policing Services.
    The RCMP takes hate-motivated crimes and incidents very seriously and is committed to continuing to provide services that are focused on the safety of our communities.
    Canadians are increasingly active online, with some using multiple communication devices and a wide variety of tools, such as instant messaging and various social media applications, which provide enormous benefits for Canadian society, but also present unintended opportunities to spread hatred.
    A hate-motivated crime, whether online or not, is any criminal offence motivated by the offender's hate, bias or prejudice towards a group or individual, based on colour, race, religion, national or ethnic origin, age, sex, sexual orientation, gender identity or expression, or mental or physical disability. This would include a physical attack against someone because of their disability or sexual orientation, or hate-motivated vandalism, such as hate graffiti at a religious institution.
    A hate or bias incident may be motivated by the same factors as those of a hate-motivated crime, but does not reach the threshold of being a criminal offence. Such incidents may include name-calling or racial insults.
    As you've heard here today, both hate-motivated crimes and incidents, if not addressed, can be a warning sign and even a catalyst for more serious violence in communities. They also have negative impacts on communities' well-being and safety.
    The RCMP proactively works with communities to identify, prioritize and solve problems. This collaborative approach is based on the philosophy that prevention is a core responsibility of policing, where decisions are evidence-based and responses should be community-led, police-supported, sustainable and flexible.
    The RCMP has several consultative committees through which communities' interests become reflected in our work, such as the commissioner’s advisory committee on visible minorities, the commissioner's national indigenous advisory committee and the national youth advisory committee. The RCMP also participates in external committees, such as Public Safety Canada's cross-cultural round table on security, and Canada's anti-racism strategy, led by the Department of Canadian Heritage.
    Statistics Canada estimates that two out of three victims of hate-motivated crime do not report to police. The RCMP is focused on increasing the reporting by building trust with community members. The RCMP also has a national operational policy to assist investigators dealing with hate-motivated crimes and is committed to monitoring threats to public safety. This includes intelligence gathering and ongoing assessment, in collaboration with law enforcement partners, to determine the severity of the threat level posed by any particular actor or group.
    To properly investigate incidents of online hate, law enforcement must be able to work as effectively in the digital world as in the physical one. Rapid technological advancements continue to add to the complexity of police investigations, including those into online hate.
    It is important to note that investigating hate-motivated crimes falls under the mandate of the local police of jurisdiction. Furthermore, the RCMP has deemed it a priority to recruit qualified applicants from a wide range of backgrounds to better reflect the diverse population of Canada. The RCMP also ensures that all employment policies, practices and standards are fully inclusive and provide all Canadians with equal and fair opportunities within the spirit of employment equity policies and legislation.
    In support of our collective effort to counter hate-motivated crimes and incidents, I encourage all communities to become educated on, and speak out against, hate; to enhance situational awareness of related issues in their communities; to practise emergency procedures; to be vigilant; and to contribute to community resilience. The RCMP has been part of these efforts in many communities across Canada, and will continue to reach out with professionalism and compassion to enhance trust with the communities we serve.

  (0915)  

     I would be happy to respond to any of your questions.
    Thank you.
    Superintendent, thank you very much.
    Now we will move to questions.
    Mr. Cooper.
    Thank you, Mr. Chair, and thank you to the witnesses.
    I'll begin with Mr. Arnot.
    Mr. Arnot, you made reference to the Ahenakew case. That particular case was based on a complex set of facts, including whether the conversation was a private one, which it was ultimately determined not to be, and also, whether Mr. Ahenakew was prodded by a journalist, which went to the question of intent.
    In light of that, would you characterize the actions of Mr. Warman to have been ethical and appropriate?
    I don't know who you're referring to, sir. I can't hear you.
    I was speaking of Richard Warman.
    Just to clarify, I'm not trying to conflate the Ahenakew facts with the facts of Mr. Warman. I just noted that in Ahenakew there was an issue of intent and the question of whether someone was prodded, so the court went through and made findings of fact applying the law.
    I'm now shifting, in the context of prodding, to whether, in your opinion, the actions of Mr. Warman are ethical.
    Okay.
    I don't want to give an opinion on Mr. Warman and whether his actions are ethical, but my fundamental point is that the Criminal Code is a very high standard. Proving intent beyond a reasonable doubt is almost impossible. Look at the Supreme Court of Canada, which says the best place to deal with these issues is in the Canadian Human Rights Act or human rights commissions, because it's a civil process, which is much more amenable. It focuses on the effects of the hate speech rather than the intent.
    It's almost an impossible burden to prove hate speech cases or hate under the Criminal Code, and the focus of the committee should be on human rights legislation. That's really a fundamental point and I think that's the point of the Supreme Court of Canada.

  (0920)  

    Right. I thank you for that, and I take your point and appreciate that you might not want to give an opinion on Mr. Warman.
    I will read into the record that the Canadian Human Rights Tribunal did characterize his actions as “disappointing and disturbing”.
    Mr. Arnot, you just stated, as you did in your earlier testimony, and Madam Landry, you did as well, that when section 13 of the Canadian Human Rights Act was repealed, that took away an important tool. What about section 320.1 of the Criminal Code, which seems to be a section that could be used but isn't utilized?
    I'd be interested in comments from both of you on section 320.1.
    What I'll do to answer is just share our experience. Some 99% of the complaints that we received and that we referred to the tribunal under the Canadian Human Rights Act were extreme, hateful and based on neo-Nazi ideology.
    In the conversations we've had with police during the course of the time that we were responsible for section 13, many shared with me that it was too difficult to get a charge laid under the Criminal Code for hate. They were actually interested in how they could use section 13 of the Canadian Human Rights Act.
    I'm not an expert in criminal law. We haven't done research on how effective that is. I know there haven't been many convictions or charges laid under the Criminal Code, but that has been our experience with the Criminal Code.
    Mr. Arnot.
     I have experience in criminal courts, and I would say this: human rights transgressions fit much better under human rights legislation, with the proper tools for human rights commissions to deal with them, than under the Criminal Code. I agree with Ms. Maillet's observation.
     Really, I again point to the Supreme Court of Canada, which found very much the same thing: in effect, it is too difficult to deal with these transgressions under the Criminal Code. They should be funnelled to human rights commissions for determination through a civil process, perhaps using other methods such as a restorative justice approach that focuses on solutions rather than on a myopic prosecution model.
     You cannot prosecute your way into social change; that's the fundamental point. We need to use the human rights commissions in a much more effective way, which was exactly the point of Justice Rothstein and the unanimous judgment of the Supreme Court of Canada.
    Thank you very much.
    Thank you for those questions.
    Ms. Khalid.
    Thank you, witnesses, for coming in today and for your very important testimony.
    I'll start with Madam Landry.
    Can you explain to us why section 13 of the Canadian Human Rights Act was repealed to begin with?
    As I was appointed after the repeal of section 13, I'll ask Madam Maillet to answer that question.
    Please.
    That was a decision made by Parliament. We weren't involved in all of the discussions. Our information was that there had been one complaint that was filed with the commission. It was not of neo-Nazi ideology. It was not extreme. It was dismissed. However, it caused a reaction as it was a complaint against a magazine.
    There were issues with where the complaint could be filed: in provincial jurisdiction, because it was a print magazine, but also with us, because the magazine had an online presence, which meant we had jurisdiction as well.
    There were some issues there. If the committee is looking at any type of reinstatement, those issues of jurisdiction would need to be addressed, because respondents then have to defend themselves in several fora, which is not ideal.

  (0925)  

    Thank you.
    Madam Landry, in your remarks, you said there's no hierarchy of rights. When rights conflict with each other, how do you prioritize which rights to protect? That is something we hear a lot, the right to freedom of speech or freedom of expression being conflicted with other rights as identified in our charter. How do you balance that out?

[Translation]

    I'll start responding, and then I'll turn the floor over to my colleague.
    In two very clear decisions rendered in different years by different groups of justices, the Supreme Court of Canada established that certain limits were reasonable in a democratic society. While we must have freedom of expression in this country to maintain a democracy, we must also ensure that reasonable limits are set and that hatred and intolerance aren't allowed to spread at a staggering rate, as is currently the case. That's one reason why the Canadian Human Rights Commission is calling for immediate action. This phenomenon must be addressed quickly.
    The Supreme Court of Canada has made it very clear that there are acceptable limits for these types of issues.

[English]

    Do you have anything to add to that?
    No, I think she answered it well. Thanks.
    Thank you.
    My colleague Mr. Cooper spoke about the Criminal Code being a very effective tool to combat online hate. I've had personal experience with this, where an attorney general is required to sign off before charges can even be laid. I know you spoke a bit about the challenges in the Criminal Code, such as having to prove a case beyond a reasonable doubt.
    In regard to what was section 13 of the Canadian Human Rights Act, do you think there can be amendments made or issues that can be addressed in perhaps a newer version that balances both of those concerns?
     First of all, we have to assume that the material will be clearly hateful, extreme and harmful. Once we have that set of facts before us, how do we then deal with it?
    It's our opinion that we need a multipronged approach, that a provision in the Canadian Human Rights Act cannot stand alone. Clearly, we need agencies, regulatory agencies, police, social media platforms, Internet service providers, and so on, to play a role.
    The question is, do you become reactive, so after something happens, a complaint gets filed or a charge is laid? Section 13 of the Canadian Human Rights Act was very effective at shutting down websites. There could be some amendments around jurisdiction, perhaps providing the commission with a way to deal with things more quickly, but the issue with a complaint-based system is that it takes time.
    If we are limiting freedom of expression, we have to ensure that it's very narrowly limited. The issue becomes what happens to social media. Websites, you can shut down, and you can fine Internet service providers, but if we were to open the Canadian Human Rights Act to complaints based on Twitter, YouTube and Facebook, I can't imagine that we would be resourced to do any other work. That is something that the committee should consider.
    However, in terms of a proactive compliance model whereby you have standards, I'm sure the committee has heard of examples in Europe where that has happened, where they're held accountable. Internet service providers, Facebook, YouTube and Twitter are held accountable for letting hate fester online and potentially cause harm and lead to violence.

  (0930)  

    Thank you very much. We're over the six minutes.
    Ms. Ramsey.
    Thank you, Chair.
    To the witnesses, thank you so much for being here and providing your testimony today.
    My first question is for Ms. Inman from the Department of Canadian Heritage.
     I heard you speak about the anti-racism work, which is certainly commendable and incredibly important in the prevention of people getting to a point where they're spreading hateful messages or sharing those things. However, I wonder if there is anything in particular where you're dealing with online hate or talking about an education program for Canadians.
    That's something we've heard pretty consistently from people who have testified here: that we need, almost immediately, a full-blown effort to educate all Canadians, not just those in K to 12, which is fantastic, on what constitutes hate speech and what to do if they see it, and that this needs to happen very quickly, because Canadians are struggling. I hear all the struggles you're having in trying to address this, and it's moving very quickly.
    Have you been directed by the Minister of Canadian Heritage to start such a program, or are you looking at that behind the scenes? It would be helpful to know.
    As I mentioned, we did receive funding in budget 2019 to work on an anti-racism approach.
    Is it online? I'm sorry to interrupt you, but I'm thinking of online specifically.
    It didn't specify online, but that's certainly part of what we're looking at.
    In terms of our approach, the work we're doing is really heavily informed by the engagement sessions we held across the country. Online hate is something that was raised at those engagement sessions.
    Do you think it will be expanded to all groups that are essentially targeted by this? Certainly many groups outside of those who experience racism are also being targeted online. Is there any conversation about expanding that?
    It was racism and religious discrimination.
    I mean more specifically the online portion. Are you working on something at Canadian Heritage in that regard?
     We hear the urgency in many people who come and present to us, so I'd like to hear that there's something happening or that we're moving towards that.
    Yes, we are working on a number of different initiatives right now. Certainly online hate and the propagation of online hate is something that we're looking at.
    Okay.
    My colleagues in broadcasting and telecommunications also have work ongoing in terms of specifically digital literacy, that piece of it, and ensuring a robust information and news ecosystem.
     Anything you could share with the committee or submit to the committee after the fact would be helpful for our report.
    Excellent. I'd be happy to get you some more information on that.
    Thank you so much.
    My next question is for the Canadian Human Rights Commission and the Saskatchewan Human Rights Commission.
     We've heard from many organizations and groups that are attempting to collect their own data. It's very tough. These groups are on a shoestring budget. They don't have the ability to collect that data, and absent the data we don't have the ability to then put that towards the creation of the programs and things that you're discussing.
    I would like to know, from your respective organizations, whether you are working on any type of outreach program designed to assist those organizations in the collection of that data.
    We do partner with several organizations in terms of education function. We have not partnered with any organization in terms of collection of data.
    Okay.
    If we had unlimited resources—
    I know.
    —that might be something we would do, but unfortunately not.
    Thank you.
    Mr. Arnot, have you been able to have any type of initiative in Saskatchewan?
    The short answer is no.
    Thank you.

  (0935)  

    On the point of education, though, in Saskatchewan we have created a pedagogy that answers the questions, what does it mean to be a Canadian citizen and what are the rights of citizenship, and also, what are the responsibilities that go with those rights, and how do you build and maintain respect for every citizen. Why? It's because every human being, every citizen, deserves equal moral consideration.
    It is a large pedagogy. It's in all the schools in Saskatchewan, in grades K to 12, and it's available in French and English.
    What we're doing is hopefully creating a citizen who embodies five Es, a citizen who is enlightened, ethical, empowered, engaged and empathetic. It's this broad rubric that we really need to inculcate in the minds of students in Canada.
    Thank you so much.
    Thank you.
    Last, we talked a lot about the tools that are needed by the RCMP, by the human rights commissions, by everyone basically, but when it boils down to reporting, there's no standardization in the way social media platforms gather reports and allow people to report. Even figuring out how to report is very difficult for a lot of people, as is knowing what happens after you report.
    We've heard actually from several groups that going directly to the police is a challenge for some racialized communities and indigenous communities in Canada. There has been a suggestion of having an intermediary in that space so that they could go to report without feeling that they're directly engaging with the police services and the RCMP.
    Can any of you weigh in on what you think we should be doing to create reporting mechanisms that are transparent and understandable for Canadians?

[Translation]

    Before I give the floor to Ms. Maillet, I want to make one point.
    In my presentation, I spoke of the need for a broad, coordinated and multi-faceted approach. The current radical growth of online hate can't be addressed by a single organization acting alone. Instead, there must be a coordinated and collaborative approach. That's the only way to collect data that will help us analyze and address the phenomenon.
    Online hate is growing at a tremendous rate. It has exploded. It's difficult to gain the upper hand. Without a coordinated and proactive approach, we won't succeed.

[English]

    Yes.
    Thanks. We're past the six minutes now.
    I'm going to Mr. Ehsassi.
    Thank you, Mr. Chairman.
    Witnesses, thank you for your valuable testimony.
    I'll start off with Ms. Inman. I was looking over your testimony and you referenced engagement sessions. I understand that these went on between October 2018 and March 2019. Since those engagement sessions have wrapped up, are you putting together a summary of your findings that would be publicly available?
    Yes, we're continuing work on that right now and it's our intention to publish a report basically that wraps up not only the in-person engagement sessions, but also the online submissions that we received. For those who couldn't attend the sessions, there was the opportunity to go online and submit either a written submission or respond to one of two surveys that mirrored the in-person engagement we had.
    There is the intention to make that available once the data analysis and gathering is complete.
     Excellent. When can we expect to see that?
    I'm not sure. My best guess would be probably a little later in the summer.
    This summer?
    I don't think it will be six months from now. It should be in relatively short order.
    Thank you very much.
    Now, if I could go to you, Superintendent Taplin, in your testimony you provided some very disturbing information, I would say. You say that “two out of three victims of hate-motivated crime do not report to police”.
    Why is that? That's an incredibly high number. Is it, in your opinion, because the police do not have the resources? Is it because in many instances people know that going to the police isn't going to result in anything effective? In your opinion, why is this happening over and over again?
    Thank you for the question. I'm not in a position to give an opinion about that, but what I can say is that we are aware that it is important that communities understand the role the police can play and that they trust the police to investigate crimes.
     What we are doing is working with our communities to enhance our relationships with our community members, meeting with our community members and providing presentations on hate crime and other topics. That's twofold. One, it puts a face of the police to the community. Two, it provides a point of contact so that the community can actually reach out to the police. Enhancing our visibility in the communities is tremendously important.

  (0940)  

    Thank you.
    You also touched on the fact that the RCMP is monitoring threats. In a lot of the incidents we've been reading about, there essentially has been a trail of hate, if you will. All this evidence comes to light after a terrible incident happens. What do you do once you monitor a particular individual who is spewing hate? What do you do with that information?
    Right. Thank you very much for that question as well.
    That's not my area of expertise; however, what I can say is that the RCMP is committed to investigating all suspected or actual hate-motivated crimes and incidents. When we do receive information, we investigate all leads to the best of our abilities.
    But you don't know what you do with that information or with the investigation?
    I'm sorry. I don't understand your question.
    You say you investigate these and look into them, but is there anything actionable that happens after you've had an opportunity to focus on an individual who is spewing hate?
    We work within the Criminal Code. When sufficient evidence is presented, a full investigation is undertaken, and then obviously if there's sufficient evidence to support charges, charges will be pursued.
    Okay. Do you ever self-initiate, then, or do people actually have to bring disturbing comments to your attention?
    Again, there are two sorts of areas that I'd like to speak to. One is the federal lens, which is federal policing. The second one is with provinces and territories through our contract policing role. We receive information in two ways. One is that we do rely on the public to report suspected or actual hate-motivated crimes to the police. Two, we do monitor publicly available social media.
    Yes, but my question was, do you monitor it even if someone doesn't flag it to your attention?
    Absolutely.
    Okay. Thank you.
    Thank you very much.
    While we have the Department of Justice here, colleagues, I just want to ask this since we have our subject matter experts here.
    Mr. Gilmour, Mr. Cooper raised the matter of section 320.1. It's been raised frequently as to why it's so rarely used. As the subject matter expert from the Department of Justice, why do you believe section 320.1 is so rarely used in Canada?
    I'm afraid I'd have to speculate on why it's not being used. It may be, perhaps, as was mentioned, that lack of resources might be an issue.
     It has been mentioned that section 320.1.... Maybe I'll just set out the parameters of it. Section 320.1 is a specific provision in the Criminal Code that was created by the Anti-terrorism Act back in 2001. It allows a judge to order the deletion of hate propaganda that's made publicly available on a computer system that is within the jurisdiction of the court. There are safeguards built into that particular procedure, whereby the person who has put the material on the computer, for example, can come before the court and argue as to why it should not be deleted.
    To my knowledge, I'm not aware that this provision has ever been used. I can't speculate, really, as to why, other than maybe a lack of resources or perhaps the need for more education. It also has been mentioned on occasion in these hearings, I believe, that for this provision there's a requirement to obtain the consent of the appropriate attorney general as well.
    There has been discussion here about hate speech and the hate speech provisions in the Criminal Code. I thought I would just mention that what's probably most relevant in this context are the hate propaganda provisions in the Criminal Code. There are three of them: advocating or promoting genocide against an identifiable group; inciting hatred in a public place that is likely to lead to a breach of the peace where it's directed against an identifiable group; and, wilful promotion of hatred against an identifiable group.
    It has been mentioned that intention is needed for the hate speech crimes. In fact, intention is needed for two of the three hate speech crimes: advocating or promoting genocide and the wilful promotion of hatred. The one that requires inciting hatred in a public place likely to lead to a breach of the peace has a lesser mens rea component—probably recklessness—and that's because of the imminent danger to the public peace.

  (0945)  

    Am I correct that section 320.1 has no mens rea component?
    It's an in rem procedure. No one need be charged with the crime. There's also section 320, which was originally put in the Criminal Code when the offences were created back in 1970—a different time, of course—which allowed for a judge to order the seizure and forfeiture of hate propaganda kept on premises for distribution or sale. Section 320.1 was meant to update and modernize that particular procedure to take into account the Internet.
    Thank you so much to all of the different groups that presented today. Your testimony has been enormously helpful to us. It's really appreciated. Also, as Ms. Ramsey mentioned, if anybody has anything to follow up with, if you would, that would be very much appreciated. The Human Rights Commission mentioned that they were going to send us some documents.
     Thank you again for your testimony.
    We will ask the members of the next panel to come up. We're briefly going to suspend.

  (0945)  


  (0950)  

    We now have with us Mr. Anver Emon, Professor of Law and Canada Research Chair in Religion, Pluralism and the Rule of Law, from the University of Toronto. I don't know.... The video seems to have disappeared.
    We are joined here in Ottawa by Ms. Naseem Mithoowani, a Partner at Waldman & Associates in Toronto.
     Welcome.
    We're also joined by Ms. Heidi Tworek, Assistant Professor at the University of British Columbia, who is joining us by video conference from Washington.
     Welcome, Ms. Tworek.
    Because we now have two people on video conference and I don't want to lose anybody, we're going to start with the folks on video conference.
    Mr. Emon, are you able to hear me?
    Since I see you on the screen, Ms. Tworek, and I don't see Mr. Emon right now, perhaps we will start with you. You have eight minutes. We really appreciate you joining us.
     Thank you, Mr. Chair, and thank you to the committee for the invitation to appear before you today.
    It is frankly disturbing that we live in a world where online hate is rising, where what Whitney Phillips has called “the oxygen of amplification” has elevated extremist views and where in several cases online hate speech has directly led to offline violence, so I very much welcome the committee's careful consideration of how Canada can address these troubling developments.
    I've personally examined European and North American approaches to hate speech, extremism and disinformation. Today, I will briefly outline, first, some of the approaches other democracies are taking, which I discuss in more detail in the brief I've submitted, and second, how the German example in particular raises questions about reinstating section 13. Finally, I'll discuss some measures that could be taken to address a broader category of harmful speech, which is a non-legal category, and I can try to address some of the broader questions that have been raised.
    Let me first state the very sobering fact that hate speech is not a problem that can be solved. It will be a continual, evolving and ongoing threat. Still, levels of hate speech can ebb and flow. This depends upon the architecture of online ecosystems and the type of speech they promote, as well as the broader political, economic and cultural factors. This can facilitate more hate speech and hate-related crime, but it can also do the reverse.
    First, this is an international problem, as I've mentioned. Democracies around the world are trying to find ways to address this issue. Let me name a couple of the examples that we can discuss in questions.
     First, the U.K. has suggested an approach to regulate through a “duty of care” framework that requires social media companies to have a design that prevents online harms. France has suggested a regulation that would mandate transparency and “accountability by design” from the social media companies. Finally, Germany has taken a legal approach, creating a law that requires social media companies with more than two million unique users in Germany to address and enforce 22 statutes of speech law that already exist in Germany.
     There's a range of things, from the legal to the co-regulatory to self-regulatory and codes of conduct.
    In the case of what we're discussing today, the German Netzwerkdurchsetzungsgesetz, or NetzDG, is particularly instructive. Passed in 2017 and in force since 2018, this is technically a German mouthful word that is literally translated as “network enforcement law”, so it doesn't introduce new statutes of speech law. Rather, it requires social media companies to enforce law that already exists and to actually attend to complaints that are posted within 24 hours or face up to 50 million euros of fine per post.
    Let me then talk about some considerations this raised. First, this was not about introducing new law but enforcing existing law. It has been a major problem in the German case to get Facebook and company to comply. Second, it raises questions about how we get social media companies to actually comply with and enforce existing law. It also raised the question of the scale. To give you a sense, YouTube and Twitter, in a six-month period, were receiving more than 200,000 complaints, so there's a question of the scale of the enforceability and potential backlogs. There's also the question of whether things would be enforced nationally or globally. We've seen that mostly what falls under it is actually being taken down under a company's global terms of service.
    This law also only deals with pieces of content, so it doesn't deal with other ways in which hate can be propagated or funded online through ecosystems. Let me give a Canadian example here.
     Very recently, a member of the Canadian far right tried to use the GoFundMe platform to raise money for an appeal against a libel suit he had lost for defaming a Muslim Canadian. Ontario Superior Court Justice Jane Ferguson called the far-right man's words "hate speech at its worst", but only after complaints from a journalist and members of the public did the GoFundMe platform actually take down this man's appeal for funds, even though it violated their terms of service. This is just one illustration of how this is broader than actual pieces of content.

  (0955)  

    Finally, let me talk about the way in which we might address a broader category of harmful speech, which is a non-legal category of speech but speech that may undermine free, full and fair democratic discourse online. I've written a report with Chris Tenove and Fenwick McKelvey, two fellow academics, about how we can address this problem of harmful speech without infringing on our democratic right to free expression. Let me give three suggestions.
    First, we have suggested the creation of a social media council. This would mandate regular meetings of social media companies and civil society, particularly marginalized groups that are disproportionately affected by hate and harmful speech online. This social media council could be explicitly created through the framework of human rights. The idea is supported by, amongst others, the UN special rapporteur on freedom of expression and opinion. By linking to international human rights, this would also ensure that Canada doesn't inadvertently provide justifications for illiberal regimes to censor speech in ways that could deny basic human rights elsewhere around the world.
    Second, we should firmly consider what kinds of transparency we might mandate from social media and online companies. There's so much that we don't know about the way the algorithms work and whether they promote bias in various kinds of ways. We should contemplate whether to, along the lines of algorithmic impact assessments, require audits and transparency from the companies to understand if their algorithms are themselves facilitating discrimination or promoting hate speech.
    Third, we need to remember that civil society is an important part of this question. This is not something to solely be addressed by governments and platforms. Civil society plays a key role here. We often see that platforms only take down certain types of content after it has been flagged and raised by civil society organizations or journalists. We need to support those civil society organizations and journalists who are working on this, and also who are supporting those who are deeply affected by hate and harmful speech.
    Finally, we also need to support the research that thinks through the sort of positive element of this, that is to say, how do we encourage more constructive engagement online?
     As you can see from this short testimony, there's much to be done, on all sides.
    Thank you for inviting me to be part of this conversation.

  (1000)  

    Thank you very much, Professor Tworek.
    Now we will go to Professor Emon.
    Professor Emon, the floor is yours.
     I want to begin by thanking you for inviting me to address the committee today. I'm sorry I can't be there in person, but I'm here virtually, in the capacity of director of the Institute of Islamic Studies at the University of Toronto where I am also professor of law and history.
    At the Institute of Islamic Studies, I oversee a collaborative research project that we call the study of Islam and Muslims in Canada, or SIMiC for short. SIMiC is a collaborative project that partners with six Canadian universities and six community partner organizations. SIMiC blends research with a public responsibility for recalibrating the conversation on Islam and Muslims today.
    I do not need to tell you that the existence of Islamophobia in our country is real and extremely concerning; you know this. I'm here because there are things we can do. Drawing on the work of SIMiC, I can identify three specific things you may want to consider as part of a whole-of-government approach, particularly as they relate to Canada's Muslim community as a target of online hate.
    The first concerns a reliable data architecture that provides disaggregated data on those communities most targeted. One core feature of SIMiC is to identify gaps in Canadian data architecture to chart the demography of Canada's Muslim communities. Comprising a team of academic researchers, settlement agencies and community organizations, the big data group at the institute is interested in determining what sorts of measures might be put in place to gain a better understanding of who Canada's Muslim communities are as well as their values, their hopes and their aspirations in Canada for themselves and their families.
    This summer, one of our research fellows will examine the extent to which existing datasets across the country, including raw datasets from StatsCan research data centres, can tell us something about Canadian Muslims in terms of gender, ethnic or racial category, educational achievement, employment status, income levels and so on. We plan to launch the report in September 2019, and I will share that report with this committee if it so desires.
    One key issue concerns the fact that StatsCan asks about religious identity only decennially rather than quinquennially. This approach is fundamentally counterproductive given that the current state of online hate quite often targets groups based on their religious identity. If we are to combat hate that targets people because of their religion—and let's be clear that's exactly what is happening with regard to Muslim Canadians—then we cannot continue to embrace an outdated data architecture that leaves us blind to the terrain in which we must now do our work.
    The big data group at SIMiC exists in part to illustrate exactly why we need to rethink data architecture policies at a national scale, starting with a religious identity question in StatsCan's quinquennial census.
    My second suggestion for something you may want to consider comes from the work we are doing on global anti-terrorism programs. The institute is part of a consortium of universities around the world examining the extent to which government programs on countering violent extremism have a disproportionate effect on certain communities and, in doing so, ignore others that need to be part of any inquiry.
    While we're at the beginning stages of this work, our research has turned up a glaring issue in Canada that may fall within the ambit of this committee. In 1989, Canada was a founding member of the Financial Action Task Force, or FATF, which at the time was charged with combatting money laundering as part of the war on drugs.
    After 9/11, the FATF issued a new set of special recommendations to track and combat terrorism financing. FATF guidelines recommend that each state party adopt what it calls a risk-based assessment model, or RBA, to prioritize its targets and allocate its limited resources.
    In 2015 Finance Canada issued a self-assessment to FATF. In that self-assessment, Finance Canada outlined Canada's RBA in relation to anti-terrorism financing. It identified 10 groups that posed the greatest threat of terrorism financing in Canada. Eight of them are Muslim-identified groups; one is Tamil and the other is Sikh. In other words, as far as the Government of Canada is concerned, 100% of terrorism financing risk comes from racialized groups and 80% comes from Muslim-identified groups. Nowhere in the 2015 document is there reference to white supremacist groups, white extremist groups and so on, despite the fact that such groups are no less prone to violence, as we have sadly seen.
    What does this have to do with online hate? While you will no doubt hear many arguments about freedom of expression as you attempt to regulate online hate, you already have a mechanism in place to track the financial funding of such hate, namely, FATF special recommendation number 8, which identifies charities and other not-for-profit organizations as being vulnerable to terrorist financing.
    The aim here in my suggestion is to go after those philanthropic organizations that fund the cacophony of hate. The U.S. is already ahead of the game on this. Think tanks and sociologists have issued reports identifying the principal funders of hate.

  (1005)  

     While any given instance of online hate is relatively cheap, my suggestion is that you revisit Canada's RBA to use existing financial monitoring regimes to turn off the spigot of funding across the board.
    My third and final suggestion concerns not so much combatting online hate as promoting new storytelling opportunities to enhance and improve on gaps in Canada's cultural heritage. Alongside our big data group is a second group that is working to create an archive that documents the history of Muslims in Canada. It is an archive that will be created through collaboration among researchers at the university, community organizations and those individuals who hold records that capture this history.
    Our environmental scan of Canada's major archival institutions shows that there is little if any representation of the various communities, in particular racialized minorities and Muslims, that constitute the fabric of our national mosaic. Whereas other jurisdictions, such as the U.K. and the U.S., have a growing culture of community archiving projects, this phenomenon is mostly unsupported by the government in Canada.
    We are beginning to see some movement in this regard with respect to Canada's indigenous communities, thanks in part to the work of the TRC and new funding schemes allocated to preserving indigenous knowledge. The archive project we are creating is a joint project in which the University of Toronto will serve as a core institutional partner. We have the digitization technology to create an open-access digital archive. Robarts Library has a storage facility for any and all analog copies that we obtain. Thomas Fisher Rare Book Library will provide future researchers with a venue to access those hard copies.
    By the end of the summer, the institute will publicly launch its acquisition policy in consultation with our community partners. Moreover, colleagues have expressed an interest in tying their course work to the archive whereby students can help us identify records while they also achieve course credit. Such archives not only foster education and community but also create opportunities for people to tell new stories about themselves and their communities in an academically rigorous way, with thick description. In short, our archive not only promises more speech, but it will deliver better speech.
    While we have the infrastructure and overhead to make this possible, our greatest challenge, and the challenge to any such archival project, is to identify funding sources to support archival review processes which involve human capital. The Department of Canadian Heritage certainly offers some funding for such projects, but the envelopes are limited. Its mandate is not narrowly focused on groups targeted by hate. Moreover, many of its grants expressly disqualify university-affiliated projects like ours, despite the fact that universities are well positioned with infrastructure to carry out such projects.
     It has been our experience that the Social Sciences and Humanities Research Council does not fund such projects, in part because they do not fall within prevailing views of what counts as formal research.
    While we remain committed to this project, our environmental scan suggests that supporting digital archival projects in participation with targeted communities can create a counterbalance to the online hate that we see proliferating. Consequently, this committee may wish to recommend jump-starting the creation of participatory digital archives, with a specific focus on those minority groups subjected to online hate.
    Thank you very much, and I welcome your questions.

  (1010)  

    Thank you very much.
    We will now go to Ms. Mithoowani.
    The floor is yours.
     Thank you very much for the invitation to speak today.
    My name is Naseem Mithoowani. I am a lawyer practising in Toronto, Ontario.
    As some of you may know, I am also one of the individuals who initiated human rights complaints in 2008 against Maclean's magazine for having published a feature article entitled “The future belongs to Islam”, authored by Mark Steyn. Maclean's, at that time, was our only national news magazine in an era when social media hadn't yet taken off. The article, therefore, garnered a fair amount of attention.
    It described Muslims as being engaged in a nefarious plot to take over western democracy as we know it. It insinuated that all Muslims were guilty either by being directly involved in violence or by supporting the goal silently. Muslims living in the west were demonized as “hot for jihad” and as breeding like “mosquitoes” for the sole aim of supplanting the western populations where we lived but with whom we shared no allegiance. Muslims were portrayed as inherently violent and deceitful.
    The Muslim community felt the harm of these words in their bones. This was a call to action for the west to wake up to the threat of Muslims living among them. It was, in essence, asking Canadians to view their Muslim neighbours with suspicion.
     We also found 21 articles printed in the previous two years in Maclean's that contained the same anti-Muslim themes, referring to Muslims as “sheep-shaggers”, “global security threats”, “barbarians” and prone to frenzy. One memorable piece even suggested that the CBC comedy Little Mosque on the Prairie was part of a conspiracy to distract the watching public from the security threat that Muslims posed by instead promoting them and portraying them as good and friendly community members.
    We found exactly zero counter articles or critical analysis in response.
    We sought a meeting with Maclean's to propose that they consider running a counter piece to address the allegations made in Mr. Steyn's article. More and better speech, we reasoned, was a win for all parties involved. It was only when we were completely shut out by Maclean's that we filed human rights complaints. The very fact that we had done so, in and of itself regardless of the outcome, was seen as proof of abuse, justifying the repeal of section 13 by the Conservative government at the time.
    With the benefit of hindsight, I think there are very few people who would today believe that we had no reason to be alarmed over the content of the publication in question.
    Those who peddle the rhetoric of a Muslim takeover don't care that the claim contains no truth. Suspicion and fear of Muslims sells. The idea that Muslims are actively trying to subvert western democracy is a warning that people heed, sometimes with horrific consequences. In fact, we now know that the claims of western demographic decline and supposedly astronomical birth rates of Muslims are a staple in the modern white nationalist movement, usually framed in terms of an invasion, cultural replacement or white genocide.
    Indeed, shortly after our complaints were dismissed, the very article that we alleged was hateful was specifically quoted in the manifesto of a white supremacist, who then went on to kill 77 people in Norway in 2011. He justified his actions and his violence as a form of resistance against the inevitable Muslim takeover that Steyn and others were warning against.
    This idea of a Muslim takeover of the west has also played prominently in the motivations of the killing of Muslims in Quebec and New Zealand.
    Particularly after the deadly attack in New Zealand, even those who were most ardently in support of repealing section 13 following our complaints have paused to reconsider. Professor Richard Moon, for example, was a thought leader in the call for the repeal of section 13. He was commissioned by the government to write a report regarding section 13, and in that report he recommended repeal.

  (1015)  

     He has since had the opportunity to revisit the Maclean's complaints in a very recent blog. In it, Professor Moon expressly acknowledges that the outcry over our complaints was unwarranted. He states that in light of the rising tide of violence against Muslims, it is not surprising that Steyn's rhetoric has been cited by those who wish to cause Muslims harm.
    The truth, then, of the Maclean's complaints, and the controversy surrounding them, is that the Muslim community attempted to use section 13 to call out, 12 years ago, the very same hateful propaganda of a mass Muslim conspiracy theory that we are seeing as influencing mass murder today. The unfortunate lesson that I take out of my experience with the Maclean's case is that we, as a society, were not able to get ahead of the rhetoric at that time and call it out for what it is.
    Section 13 does not unduly restrict freedom of expression. It creates a tool to identify and address speech whose harm far outweighs any potential benefit. This is in line with our societal values. In Canada, as opposed to other jurisdictions, we simply do not recognize an unlimited right of free expression. Rather, we recognize that legitimate restrictions may be placed on all rights and freedoms, including freedom of expression, in a free and democratic society. Hate propaganda is a harm that needs to be confronted, since it shuts down dialogue by making it difficult or impossible for members of vulnerable groups to respond, thereby stifling discourse.
    Since the repeal of section 13, communities have been left open to attack. It is my first recommendation to this committee, therefore, that section 13 be reinstated.
    However, my experience with the Maclean's complaint leads me to believe that section 13 alone is insufficient. Section 13 requires individuals to do the heavy lifting of making and carrying complaints. In addition to the financial and time commitment involved, those who initiate complaints are vulnerable to personalized targeting. When we made our complaints, for example, as law students just beginning our careers, we were called "legal jihadists", "terrorists", "sock puppets", and accused of using the tools of western democracy to dismantle it.
    Instead of seeing Steyn's portrayal of Muslims in the west bent on subverting democracy as the dangerous trope that it is, our actions and complaints as Muslims were viewed through this very lens. We were accused of using democratic tools, including section 13, to subvert western values.
    Confronting hate speech is in everyone's interest. That burden should not be placed on the shoulders of a few. It aligns with a better democratic system by ensuring that all voices are included. We cannot afford to download the entirety of the financial and emotional burden of standing up to hate onto vulnerable groups.
    My second recommendation is therefore that the committee consider the creation of a body that could take in complaints and carry them forward. I want to be clear that such a body should not disallow individuals and communities from taking personal carriage of complaints where they elect to do so, but should be seen instead as an alternative and complementary channel.
    I wish to conclude my remarks by stressing that reinstating section 13 is a vital first step towards combatting online hatred, but it is insufficient in and of itself. We need more tools and partnership amongst all industries, communities and civil society in order to address the problem effectively. The technology and reach of the Internet make Canada a far different place from when we initiated complaints against the printed Maclean's magazine in 2008. We need creative solutions in response, which should include but not be limited to the reintroduction of section 13.
    I thank you for your attention, and I look forward to answering your questions in our next segment.
    Thank you.

  (1020)  

    Thank you very much.
    We will now move to questions.
    Mr. Cooper.
     Thank you, Mr. Chair.
    First of all, not to diminish the very good work of Brian Storseth, the former member of Parliament for Westlock—St. Paul, who did introduce Bill C-304 and was supported in the Senate by the late Senator Doug Finley, but it was the Liberal member of Parliament Keith Martin who first called for repeal of section 13 in 2008, and we thank him for that.
    Ms. Mithoowani, I would be interested in your thoughts on whether you consider BDS to constitute hate.
    My expertise, of course, is with section 13 of the human rights code federally. I fail to see how your question fits into my area of expertise.
    What I will note is that there is a myth that section 13 targets all types of hatred. In fact, section 13, the test and the legal analysis around section 13 out of sociological—
    Ms. Mithoowani, I'm just asking you whether you consider the BDS to constitute hate. I think it's relevant, because when we talk about hate, the definitional plurality is important, and I'm asking you that question directly.
    Perhaps the witness should be allowed to answer the question. She isn't really being allowed.
    And I'm asking her to be allowed.
    I'm going to allow her to answer. It is Mr. Cooper's time.
    Section 13 of the Canadian human rights code requires us to look at hallmarks of hatred. It targets the most extreme type of speech, that which demonizes and dehumanizes individuals by ascribing subhuman characteristics to them, calling them cancers, calling them mosquitoes and vermin. This is the type of speech that I think is properly dealt with by section 13 according to the sociological evidence and the—
    Okay. Thank you. I'm not sure I got an answer.
    I'm not sure that the answer is relevant.
    Could you just confirm that you attended a July 19, 2014, rally in front of the Royal Ontario Museum?
    I'm sorry, I—
    Were you at a July 19, 2014, rally in front of the Royal Ontario Museum?
    I might have been. My remarks today—
    At that rally—
    Please let the witness finish, Mr. Cooper. You can ask questions, but she has to be allowed to finish.
    What I think is happening is exactly what was happening at the time of the Maclean's complaint. Instead of addressing the substance of what I'm saying and the harm of the discussion, at that time and now the focus is put on the individual who is making the complaint and—
    Yes. Thank you.
    —that's exactly one of the issues that I find with section 13. Those who are standing up, not for themselves—because, of course, I was not named personally in Mark Steyn's article. I was standing up for a community, and that's what individuals are required to do. Disadvantaged, disenfranchised communities are required to shoulder the burden of combatting online hate and then confront personal attacks similar to what the Conservative government is doing today and also did at that time.
    I can confirm that you were at that rally in which pro-Hamas chants occurred, where there were such anti-Semitic slogans as "from the river to sea, Palestine will be free". I'm looking at an image in which there's a sign saying "stop the Palestinian holocaust now. Fascist Israelis will be brought to justice.” So, yes, you did attend that.
    Now, how much time do I have?
    You have a minute and 40 seconds.
    Mr. Chair, I have to intervene. There's a mischaracterization of someone who attended a public rally. A witness here at the committee right now is having to agree with those particular things that were just read, having to subscribe to them. I think that's a gross mischaracterization. Again, this is not what Ms. Mithoowani is here to testify about.
    This is not meant to be a personal court. I think that the continuing usage by Mr. Cooper in this way to personally go after witnesses is reprehensible. This is the second time, Mr. Chair.

  (1025)  

     I understand, but we have to also understand that it's not unparliamentary—
    I did not say it was—
    The Chair: I agree, but for the chair to intervene—
    Ms. Tracey Ramsey: —but I'd like to say—
     For the chair to intervene, it has to be something that is unparliamentary. I can only suggest again, as we did last time, that Mr. Cooper move back to the subject matter so that we can move on with the meeting.
    No. This is going to happen again, Mr. Chair. We've seen this way too often.
    Mr. Chair, I [Inaudible—Editor]
    Frankly, he's just read into the record that she agrees with the signs that he read—
    I'm going to ask [Inaudible—Editor]
    —and that's a mischaracterization of this particular individual, and that's unacceptable.
    If Mr. Cooper did not, I was going to let the witness again fully respond at the end of his time.
    I welcome her to do so.
    Okay. Please—
    Again, it's completely unfair to witnesses at this committee to have to personally defend themselves against these types of accusations.
    She was at the—
    That is not why they're here. She's not here to defend herself as an individual against accusations that she ascribes to the things that he's shared.
    I think—
    I think this is a distraction from the main issues that we have to discuss.
     I think that asking an individual who attends any rally to comment and be responsible for every other comment and publication or sign at that rally is, frankly, beyond reason, beyond rational debate and discussion. I think, given that we have a 600% increase in online hate, let's focus on that.
    I would submit, Mr. Chair, that BDS constitutes online hate.
    Thank you, Mr. Chair.
    Thank you very much, Mr. Cooper.
    [Inaudible—Editor] pro-life or attending United We Roll rallies.
    Ms. Tracey Ramsey: You can't ask for her personal—
    Some hon. members: Oh, oh!
    The Chair: Guys, order.
    On a point of order, Mr. Chair, throughout Mr. Cooper's time, which was his right to take, he received interruptions from nearly every single member of this committee except me and Mr. MacKenzie. Will we be afforded time that the Liberal members took and that the NDP took during that time?
    I've stopped—
    I have offered no comment out of turn in this meeting. However, if all members of the committee are going to be able to chime in and offer their commentary on the leader of the Conservative Party, Andrew Scheer, on—
    I'm talking about all [Inaudible—Editor]
    —people's views on abortion.... I don't know how—
    Who was doing such a thing?
    Ms. Khalid did.
    An hon. member: She just did.
    Mr. Michael Barrett: What's the order that's going to take place at this meeting? Are we going through people's time or is it a free-for-all? If it's a free-for-all, let's do it, Mr. Chair.
    Mr. Barrett, I am going to answer you.
    Number one, I stopped Mr. Cooper's time whenever I gave the floor to anybody else. Mr. Cooper's time was not taken away.
    Number two, as you saw, I as the chair did not rule Mr. Cooper out of order whatsoever during the course of that time. Yes, people jumped in, and I see in the House of Commons, in the same way as at committee, people jump in and heckle, and it's unfortunate. It should not happen. I don't agree with it, but there was nothing that took away Mr. Cooper's time.
    Again, I'm going to now move to the next questioner, and—
    Mr. Chair, I would respectfully ask for you to urge your Liberal colleagues to censor their interjections or please expect them from me for the duration of the meeting.
    Thank you, Mr. Barrett. Again, I—
    Ms. Ramsey.
    I'd like a point of clarification from the clerk, please, on if we are, as members of Parliament of this committee, entitled to weigh in with our opinion when we feel that we need to do so, respectfully through you, Mr. Chair. I would like you to confer with the clerk, please, and report back to the committee on whether or not we, as members of this committee, can intervene when we feel that it's important to do so.
    I'm going to briefly suspend.

  (1025)  


  (1025)  

    Order. I'm unsuspending.
     The answer is actually no. The only right of a member is to raise a point of order and to draw my attention to something that would be a violation of parliamentary rules. Again, disagreeing, even disagreeing vehemently, with the speaker is not a point of order unless they breach parliamentary rules. That's just the rules.
    Now I'm going to go to the next—

  (1030)  

    Mr. Chair, this is not a question—
    Do you have a point of order?
    Yes, it's a point of order.
    This is not a question of disagreeing. We're all obviously afforded the opportunity to disagree on any issue. This is a question of a member repeatedly, week after week, badgering witnesses. That is completely unacceptable.
     Every single Canadian should feel comfortable to appear before our committee. There is no room for actually attempting to bully or badger witnesses. This is the second time we've seen this, and this is unacceptable.
    Okay. I understand your point. It's a point of argument and, again, the proper procedure is then to amend the parliamentary rules to rule such type of badgering out of order. It is not my right as chairman, within the scope of how those questions were, to rule Mr. Cooper out of order.
     I take exception to being characterized as badgering anyone. I asked her a legitimate question on a definitional issue.
    Okay, I—
    [Inaudible—Editor]
    I also raised it—
    I understand, and I think we can again take these issues up in camera, if we want, following the meeting. I don't want to take further time away from the witnesses or the questioners.
    Now we're going to Mr. Virani.
    Mr. Chair, I think what we're trying to do is ascertain and come to some solutions on a very important issue. I think driving at the heart of the witness's substantive testimony is much more important than trying to ascribe whether an individual witness, in this context or any other, shares the opinions of anyone she may or may not have attended a rally with. Let's leave it at that.
    I want to say thank you to all three of you for being here.
    I want to say a specific welcome to Professor Emon, who is also a constituent and a member of the law faculty at my alma mater. I want to champion you and hold you up for the important work you have done on combatting Islamophobia, which has been a pressing matter, not just for the past two years in Parliament but going on for about two decades now, in the wake of 9/11.
    Let's get to the substance of the matter, section 13. We've heard a lot about section 13. I have limited time, probably about five minutes and 20 seconds left right now.
    Section 13 does not right now contain a definition of hate. It does not right now contain a threshold requirement. It also has a subsection (3), which exempts the service provider or the telecommunications network from any liability for the human rights violation.
    Do you have any comment on those three provisions? Does it need a threshold of what constitutes an organized campaign? Does it need a definition of hatred? Should some sort of liability, in the human rights parlance, attribute to the Internet service provider or the telecommunications provider or the social media company as such?
    That's open to all three of you.
    Ms. Mithoowani.
    With respect to whether we need a definition of hatred, that has already been analyzed by the Supreme Court of Canada in various decisions. I think taking that definition, it doesn't need to be....
    We already know that section 13 is constitutional in the way it's written now. I would hesitate to include a definition or a threshold, because we don't want to muddy the water. We have a good body of case law that clearly indicates there is a very high threshold for section 13. There is a test that tribunals use, which, as I mentioned, requires there to be hallmarks of hatred within a publication.
    Those hallmarks of hatred are specifically chosen, because through history we have found, through sociological evidence and also through looking at past examples of dehumanization of peoples, that this type of rhetoric, these hallmarks are used. That's what leads to the dehumanization of individuals. Dehumanization of individuals then leads to discrimination, violence and hatred against them.
    I believe we already have those tools, and we don't need to reintroduce them into the legislation. It's a constitutional provision, and the parameters of its use have been outlined by the Supreme Court of Canada.
    Okay, I'm going to pause you there.
    Professor Emon or Professor Tworek, do you have any views to add on those two issues?
    Ms. Tworek.
    Yes. I will make four points very quickly.
    The first, in terms of liability, is the question of what USMCA will allow. There is a question mark over whether the CDA, Communications Decency Act, section 230, is embedded within the USMCA, which could potentially make it hard for Canada to deal with liability. That's still to be determined a bit, but I want to put that out there.
    Second, we're now dealing with a different kind of Internet where we have both public and private. In terms of private groups, for example, we would need to say, think about what that message is that's forwarded to thousands of people. That's why I think it's very complicated to think about threshold.
    The third point is that threshold is complicated because within the Internet, as people at Facebook and other social media companies will say, there are questions of volume versus intensity. If you reach 20 people, but those 20 people go and do something, do you weigh that against something reaching 100,000 people who don't really do anything with it? That's a very complicated question that I think needs to be left to case law.
    Fourth, to re-emphasize what I said in my testimony, only a very, very narrow amount of hate speech is going to be dealt with through law. There are also broader categories of harmful speech. That's why I gave suggestions that were not necessarily specifically legal but rather whole-of-government approaches to try to deal with some of these issues.

  (1035)  

     Professor Emon, you could weigh in on this but also add into your response aspects of how we leverage those civil society groups in collecting the information, because some people have expressed concerns about bias. If people are more comfortable going to the black community with white supremacy complaints, granted, will you get an artificially inflated number? How would you respond to that concern?
    Let me go back and think about a whole-of-government approach. On the one hand, I would simply endorse many of the comments that Ms. Mithoowani has already remarked upon regarding section 13.
    I wanted to clarify my invocation of the Financial Action Task Force implicitly linking online hate to the promotion of terrorism. While that will strike some folks as a stretch, I do want to bring a critical race lens to this analysis. Thus far, as we've been talking about online hate, we're really mostly talking about white supremacists and white extremist hate promulgation against minorities, racialized or religious ones.
    In bringing a racial lens to this analysis, we have to ask ourselves whether or not we can also begin thinking about these online hate promoters as also promoting terrorism. That's why I bring up the special recommendations of the FATF. The FATF has a special category called designated non-financial businesses and professions in which there is no reference to social media organizations. I would simply suggest taking a look at that.
    In terms of focusing on civil society groups, it's not my experience thus far in working with a number of Muslim civil society groups that there has been an inflation of attacks. What we do have, rather, is a better appreciation of how those attacks are understood and felt within the context, within a very thick, enriched context.
    One of the limitations of law is that it has a tendency to flatten our experiences. Part of the challenge here and part of what we're trying to create at the institute in combatting Islamophobia is a thick narrative around what these attacks mean, how they're understood and how they resonate as hate.
    I don't think that you get an inflation by reference to civil society groups in these communities. I think what you have is a racialized and particularized framework that gives meaningfulness to these attacks of hate and therefore allows us to bring them within the legibility of any legal framework.
    Thank you very much.
    Ms. Ramsey.
    Thank you all so much for being here today and for all the work that you're doing to reduce and combat hate speech in your respective roles. I really do sincerely thank you, because without a kind of pan-Canadian, over-arching federal framework.... The work that you're doing is so important because it's informing us, but it's also helping Canadians to understand how to combat it and to identify it and what is available in the law.
    We really are struggling in this committee because this is such an incredibly large topic, and there are so many important areas. It's difficult to know what to start with and where to begin the work that's necessary.
    I thank you, Ms. Tworek, for some of the examples you shared with us in terms of what's happening in Germany, because the other piece of this is social media giants. We had Facebook here. We were attempting to bring Mark Zuckerberg here this week, and we couldn't even get him to come before a parliamentary committee, so how do we engage with these social media giants who don't view themselves as belonging to one country? They're global. They're the size of countries. It's a very significant challenge to try to talk about any forms of regulation when, quite frankly, they're even resisting appearing as witnesses.
     I want to ask you about how you think the social media giants such as Facebook could improve the way they handle hate speech on their platforms, given the volume you've mentioned that exists. I'd like you to speak to that. Then, I would ask our other two panellists, how are you informing or helping the conversation in your communities around reporting? How do you make some transparency around that?
    I'll start with Professor Tworek.

  (1040)  

     Thank you so much.
    I actually testified before the international grand committee and was there to hear those hearings, so I'm very much in tune with that. One part of the puzzle is that international coordination, of which Canada is a key part as a co-chair. That committee has done a really good job of bringing together MPs from 14 different countries that represent over 400 million people, and still Mark Zuckerberg and Sheryl Sandberg did not appear.
    Let me say four brief things. The first is that in the German case it was a big fine that really enabled the social media companies to come to the table and start enforcing German law. Beforehand they said that they couldn't comply, but when big fines were on the table, all of a sudden they actually could.
    The second part of this is that to handle the volume, they're simply going to need more content moderators. While some things are picked up by artificial intelligence, the reality is that most of this is done by humans. As a sidebar, I want to flag the pretty awful labour conditions under which these people operate, which we in Canada should be concerned about from a human rights perspective. This is very psychologically burdensome work, and we have some evidence from journalists and others about how difficult it is and how much PTSD content moderators experience. The companies are going to have to pony up a lot more money to address that.
    The third element of this is that we need to find out where the content moderators who work on Canada are located. We don't even know that kind of basic information. My guess is that none of them are in Canada. They don't have any contextual knowledge about Canada, for example, about what is language that denigrates indigenous people or other marginalized groups in the Canadian context. That's another pretty simple thing on which we could ask for clarification. We can try to provide more context.
    The fourth part of this, then, is the question of transparency: figuring out what we as Canadians need to know and whether that is subject to audit. I suggest there are also very basic questions about how much of the hate speech we see in transparency reports and on social media platforms is happening in Canada. The part of the German law that everybody, including Article 19 and other free speech organizations, praises is the transparency report that the NetzDG law mandates. That's something everybody agrees on, regardless of where they are on the political spectrum. I think that's certainly something Canada can take away, and I can provide very specific suggestions on what we could look for from those transparency reports. It would be much more meaningful than what is in the NetzDG ones or in the broad global ones the companies release.
    Thank you.
    Ms. Mithoowani.
    With respect to how we might encourage individual communities to identify and report hatred, I think the creation of specialized hate crime units within police forces is important. We know that in B.C., for example, there are specialized police officers who work with Crown attorneys and who have become the experts in that area. They do outreach and have built relationships of trust and transparency with communities. I think that's a model other provinces, and the federal government, can look to.
    Professor Emon, do you want to weigh in?
     Yes, please. I have a few remarks regarding section 13 and the combatting of hate. What we found in developing our archive project was that at the Canadian national level, there are very few places through which you can actually have an articulated construction of Muslims in Canada.
    Let me give you an example. Library and Archives Canada has this little archives concept to document the Canadian experience. It's a very vague and abstract documenting process that tends to water down the particularities of any community. Just the other day I visited LAC online and chose its “browse by topic” category. When you do so, there's no category for religion. There is an ethnocultural tab that will take you to a page with a lot of white ethnic groups and some Asian ethnic groups. The only religiously identified groups there are Jews and Mennonites. Muslims as a category do not feature in this search function. One can, of course, use key search terms to find anything, including something about Islam and Muslims, but the LAC website does not purposefully and proactively document Muslims in Canada in a way that can engage a broader viewing public.
     This is not just a federal matter. At the provincial level, the Multicultural History Society of Ontario's oral histories collection is also principally organized by ethnic groupings, though it does catalogue for two religious groups: Jews and Mennonites. If one were to look for Muslims in their photograph collection, for instance, one would have to enter the awkward phrase "Islamic Canadian", a phrase whose very formulation represents a fundamental ignorance about Islam and its adherents, who are called Muslim.
    It does seem to me that we have a fundamental religious illiteracy or an illiteracy in our society about certain groups. Therefore, characterizing something as hate speech against a group requires us to first understand the group on its terms, but we do not have even the data architecture to enable that.

  (1045)  

     Thank you very much.
    Mr. Fraser.
    Thank you, Chair.
    Thank you to the witnesses for being here today.
    Ms. Mithoowani, I would like to start with you. You talked about the importance of section 13 of the Canadian Human Rights Act. You'd like to see it reinstated, although, as you explained, you understand there are limitations to that: it perhaps puts too much of an onus or burden on the individual bringing forward a claim. We heard on an earlier day of testimony that, while that is true, section 13 has an ancillary benefit: in some circumstances it can exert a moderating influence on people's behaviour.
    Do you think that's true? Would that be another reason to reinstate it? It may at times be cumbersome to bring forward complaints, but it may also have the benefit of moderating the behaviour of people who would otherwise be free to propagate whatever misinformation and hatred they wanted.
    I do believe that. I also think human rights tribunals and commissions are well placed to deal with issues of hatred and discrimination. In Ontario, for example, our complaints were dismissed for lack of jurisdiction, because there's no provision within the Ontario code that addresses publications, as opposed to signs. The Ontario commission instead used its broader mandate to speak out against the article, calling it Islamophobic and pointing out the harm. Empowering commissions to take on that work and to call out hatred is, I agree, important to the targeted communities.
    Thank you very much.
    Professor Emon, you talked about trying to “turn off the spigot” of funding toward funders of hate. I guess in the world in which we live today, we have international actors and other countries—Russia and China have been cited as examples—perhaps using online misinformation and propaganda to try to divide people in western societies. Are you considering them to be promoting this sort of stuff and funding it, or are you talking about other organizations? If so, who are they?
    That's a great question. So far, most of the research on these organizations promoting hate has come out of a number of studies done in the United States. There's a 2011 report called “Fear, Inc.”, and Christopher Bail's book, Terrified, which talks about fringe groups that have become mainstream. It identifies a number of organizations, organized in the U.S. as 501(c)(3) charities, that philanthropically support and provide platforms for promulgators of hate. I would be delighted to provide a list of those organizations to this committee after this meeting. I could also provide a number of links to trackers of hate in the United States.
    What's been happening there, though, is that you do have a more concerted effort to track the funding of different hate speech.... The fact is that, as you and I both know, actual hate speech online is very cheap to put up, but there are costs, and those costs are diffused. The question, then, is who is funding it? We can find it, but we have to put our eyes and our attention to it. That's where I think with organizations like FINTRAC and others, given the whole-of-government approach you already have to track money laundering and anti-terrorism funding, you have the ability already embedded within Finance Canada to begin thinking about these [Inaudible—Editor].
    Just so I understand, is there an international...or at least amongst the western allies, for example, to deal with it in a collaborative fashion right now? Is that happening, or is it just individual countries doing it on their own?

  (1050)  

    To my knowledge, there is no organization internationally leading a campaign on online hate. The closest is the FATF, created in 1989. It was originally focused on the war on drugs and money laundering; since 9/11, it has also focused on anti-terrorism financing.
    It does seem to me that Canada, as a founding partner, could certainly take this issue up with the FATF and begin thinking about expanding the ambit of the RBA model, certainly within the government but also at the FATF, so that as it moves, as I would argue it should, to include social media corporations among the non-financial businesses it covers, they are also brought under its oversight policies.
    Thank you.
    Professor Tworek, you gave a list of recommendations for the committee to consider. We talk about all the negatives here, but I think your last point was about how we can pivot toward a more positive conversation and support constructive dialogue online. Can you give some examples of things that could be done to promote more constructive dialogue online? I'm assuming there could be some education for young people, teaching them in school about respectful ways to engage online. Do you agree with that? Also, what more could be done in this fashion?
     Yes. Let me give a couple of examples, because this obviously covers a broad range. Part of it has to do with the funding and architecture that Professor Emon already talked about, so I'll flag that as a continuing problem: the funding structures we have in place right now make it hard to do a lot of the work we're describing. Let me give two examples, though, of foundations and researchers working on these issues.
    One example is CIVIX, which many of you probably know from the mock voting it runs in schools. I've been speaking with them about how to create new materials that encourage students to engage in dialogue and to understand that democracy is about respectful disagreement. We don't all have to agree, but how do we actually engage with each other in a respectful manner without dehumanizing a particular group? That's one example of a foundation helping students learn how to disagree respectfully in different kinds of ways. Other foundations are thinking about that, too.
    A second example is a researcher at Simon Fraser University, Maite Taboada, who is using computational linguistics to analyze over 600,000 comments on Globe and Mail articles. She's using that to understand what types of comments lead to more constructive dialogue online. That's not to say that any type of speech is then necessarily removed, but that we actually gain a better understanding of what types of speech lead to constructive dialogue.
     We really need more funding to delve into that kind of research so that we can figure out how to encourage people to engage with each other in meaningful and respectful ways even if they disagree fundamentally on issues.
    Very good.
    Is that my time? Okay.
    Thanks very much.
    Thank you very much.
    I want to thank all the witnesses. You've been very helpful to us in the course of our study. I really appreciate it.
    Before we move out of the public meeting, we have the subcommittee's fourth report that the committee needs to approve.
    It basically says that the subcommittee is going to meet again today after this meeting.
    Is everybody okay with this report?
    Some hon. members: Agreed.
    The Chair: I'm not hearing any opposition. The subcommittee will meet afterwards.
    The meeting is adjourned.