
JUST Committee Meeting

Notices of Meeting include information about the subject matter to be examined by the committee and date, time and place of the meeting, as well as a list of any witnesses scheduled to appear. The Evidence is the edited and revised transcript of what is said before a committee. The Minutes of Proceedings are the official record of the business conducted by the committee at a sitting.




House of Commons Emblem

Standing Committee on Justice and Human Rights


NUMBER 143 | 1st SESSION | 42nd PARLIAMENT

EVIDENCE

Thursday, April 11, 2019

[Recorded by Electronic Apparatus]

  (0845)  

[English]

     Good morning, everyone. Welcome to the Standing Committee on Justice and Human Rights as we launch our study on the issue of online hate.
    This is a really important issue. With increasing numbers of hate crimes being reported in Canada, groups feeling vulnerable, and the growing presence of hate on the Internet, this subject is one the committee wants to tackle.

[Translation]

    I know that a number of groups across Canada have asked us to study this issue. Today we will hear from three, and a fourth may join us later on.

[English]

    Today, representing the Centre for Israel and Jewish Affairs, we have Mr. Shimon Koffler Fogel, President and Chief Executive Officer; and Mr. Richard Marceau, Vice-President, External Affairs and General Counsel. Welcome, gentlemen.
    From the Anglican Church of Canada, we're joined by Mr. Ryan Weston, the Lead Animator of Public Witness for Social and Ecological Justice. Welcome.
    From the Canadian Rabbinic Caucus, we're joined by Rabbi Idan Scher. Welcome.
    We might be joined by the Ahmadiyya Muslim Jama'at, represented by Imam Farhan Iqbal. If he arrives, he will follow the other members of the panel in testifying.
    We will start with CIJA. Mr. Fogel, the floor is yours.
     Thank you, Mr. Chair, and thank you to the members of the committee for inviting us to join this important conversation.
    I'm pleased to offer reflections on behalf of the Centre for Israel and Jewish Affairs, CIJA, the advocacy agent of the Jewish Federations of Canada, which represents more than 150,000 Jewish Canadians from coast to coast.
    CIJA is encouraged by the launch of this important study. Since the horrific mass shootings at the Tree of Life synagogue in Pittsburgh last October, the deadliest act of anti-Semitic violence in North American history, CIJA has mobilized thousands of Canadians to email the justice minister to call for a national strategy to counter online hate, beginning with this very study.
    We've done so with the endorsement of a diverse coalition of partners, including Muslim, Christian, Sikh and LGBTQ+ organizations. It's also worth noting that various groups we have worked with to mark genocide awareness month, including Ukrainian, Armenian, Roma, Rwandan and Yazidi representatives, have united to call for a national strategy on online hate, knowing that genocide begins with words.
    Increasingly, terrorist organizations and hate groups rely on online platforms to spread their toxic ideas, recruit followers and incite violence. This is a problem that spans the ideological spectrum. For example, Canadians who have been radicalized to join ISIS have often done so after extensively consuming jihadist content online.
    In the case of two recent acts of white supremacist terrorism, the mass murder of Jews in Pittsburgh and Muslims in Christchurch, the killers made extensive use of social media to promote their heinous ideology. The Pittsburgh shooter reportedly posted more than 700 anti-Semitic messages online over a span of nine months leading up to the attack; and the Christchurch shooter's decision to livestream his horrific crime was a clear attempt to provoke similar atrocities.
    We cannot afford to be complacent, given the link between online hate and real world violence. My hope is that this study will culminate in a unanimous call on the Government of Canada to establish a comprehensive strategy to counter online hate and provide the government with a proposed outline for that strategy.
    Today I'll share four elements that we believe are essential to include: defining online hate; tracking online hate; preventing online hate; and intervening against online hate.
    First, a national strategy should clearly define online hate and not assume that online platforms have the capacity to navigate these waters on their own. The focus should not be on the insensitive, inappropriate or even offensive content for which counterspeech is generally the best remedy. The explicit goal should be to counter those who glorify violence and deliberately, often systematically, demonize entire communities. While freedom of expression is a core democratic value, authorities must act in exceptional circumstances to protect Canadians from those who wilfully promote hate propaganda and seek to radicalize vulnerable individuals.
    The international community's experience in defining anti-Semitism is an important model. The International Holocaust Remembrance Alliance, or IHRA, working definition of anti-Semitism, which is the world's most widely accepted definition of Jew hatred, should be included in any strategy to tackle online hate. It's a practical tool that social media providers can use to enforce user policies prohibiting hateful content and that Canadian authorities can use to enforce relevant legal provisions.
    Second, a national strategy requires enhanced tracking and reporting of online hate, via strategic partnerships between the Government of Canada and technology companies. There are models worth reviewing in developing a Canadian approach, such as TAT, the “Tech Against Terrorism” initiative, a UN-mandated program that engages online companies to ensure their platforms are not exploited by extremists.
    Third, a national strategy must include prevention. In the current global environment, trust in traditional media and institutions has declined as online manipulation and misinformation have increased. A campaign to strengthen Internet literacy and critical online thinking with resources to support parents and educators would help mitigate these trends.
    Fourth, a national strategy must include a robust and coordinated approach to intervention and disruption. There's a debate over whether government regulation or industry self-regulation is the best approach. Canada can benefit from the experience of other democracies, especially in Europe, that are further down the road in developing policy responses to online hate.

  (0850)  

     Their successes and setbacks should be considered in crafting a made-in-Canada approach. The solutions, whether regulatory, legislative or industry-based, will flow from open and inclusive discussion with governments, service providers, consumers and groups like CIJA, who represent frequently targeted segments of society.
    The committee will likely discuss the former section 13 of the Canadian Human Rights Act. CIJA has long held the view, as we stated years ago during the debate over rescinding section 13, that removing this provision leaves a gap in the legal tool box. There are multiple legitimate ways to remedy this.
     One is enhanced training for police and crown prosecutors to ensure more robust, consistent use of Criminal Code hate speech provisions. Three recent criminal convictions in Ontario—one for advocating genocide and two for wilful promotion of hatred—demonstrate that these sections of the Criminal Code remain an effective vehicle for protecting Canadians. Similarly, section 320.1 of the Criminal Code, which enables the courts to seize computer data believed on reasonable grounds to house hate propaganda, is a pragmatic tool that should be applied more often.
    Another approach is to develop a new provision in the Canadian Human Rights Act on online hate. This requires addressing the clear deficiencies of section 13, which was an effective but flawed instrument. In line with recommendations offered by the Honourable Irwin Cotler, a restored section 13 would require significant safeguards to protect legitimate freedom of expression and prevent vexatious use of the section.
    I know there are strong feelings on both sides of the House towards section 13. This need not be a partisan issue. With the right balance, a reasonable consensus can be achieved. I emphasize the need for consensus because the use of legal measures to counter online hate will only be truly effective if those tools enjoy broad legitimacy among Canadians. If misused, misconstrued or poorly constructed, any new legal provisions, including a renewed section 13, would risk undermining the overarching goal to protect Canadians and prevent hate propaganda from gaining sympathizers and adherents.
     While Canada has not seen the same polarization that has marked American or European politics, we are not immune to it and the growth of anti-Semitism and other forms of hate in mainstream discourse that comes with it. History has repeatedly shown that Jews and other minorities are at grave risk in times of political upheaval and popular disillusionment with public institutions.
     With advances in technology, these processes now unfold with alarming speed and global reach. The tactics of terrorists and hate groups have evolved and so too must public policy. While there's no way to fully mitigate the threat of hate-motivated violence, a strong national strategy to combat online hate would make a meaningful difference.
    Mr. Chairman and members of the committee, I thank you for your time and would welcome any questions during the question-and-answer period.

  (0855)  

    Thank you very much.
    We will now move to the Anglican Church and Mr. Weston.
     Thank you, Mr. Chair and members of the committee, for the invitation this morning to speak to you about our concerns as the Anglican Church of Canada with regard to the proliferation of hate online and the very real impacts of this hatred for communities in this country and around the world. We're honoured to join with the voices of so many other diverse organizations in calling for stronger action on this issue across Canada.
    Our religious tradition teaches that every person is imbued with inherent dignity and also particularly calls us to embody special care and concern for those who find themselves uniquely vulnerable to harm or attack. Additionally, as Christians, our faith community has enjoyed a historically privileged position in this country, so we recognize that we have a particular responsibility to speak out for the protection of others. If we are to take these commitments seriously, we must raise our voices to oppose hatred in all its forms.
    As you all well know, recent years have seen a proliferation of extreme forms of hatred in online fora that encourage violence and dehumanize those who are the targets of this hate. Recent high-profile violent attacks in Canada and abroad have emphasized the reality that these sentiments do not remain online, but have tragic offline consequences as well, and that they are in need of immediate and sustained attention.
    We join with many of the other witnesses here today in calling for the federal government to develop a national strategy to combat online hatred. This government has the ability to impose reasonable regulation on social media corporations, Internet service providers and other relevant corporate actors to control the proliferation of violent hate speech in online spaces. The government must also develop a strategy for more effective enforcement of existing laws regarding the public incitement of hatred, with particular attention given to the ways these attitudes are expressed online so that these activities will not go unchallenged.
    In order to be effective, any national strategy must also work in partnership with other stakeholders, recognizing that responsibility for combatting hatred, both online and off, does not rest solely with the government. Corporations, including the large social media companies, must update their terms of use and their monitoring and reporting activities in order to better control the dissemination of hate through their networks and to remove hateful posts and users.
    Faith communities and civil society organizations must also affirm our commitment to combatting hate in the communities that we serve and to use our voice and influence to challenge such expressions wherever we encounter them. Recognizing that many of the hateful words and actions directed towards the communities impacted by hate speech are carried out in the name of religion, the active participation of faith communities and interfaith coalitions is essential to effectively combatting this reality.
    We commit to continuing our work with ecumenical and interfaith partners in addressing these attitudes within our own communities, and we draw strength from the leadership and witness of many Canadian faith groups who have been actively working to combat hate and oppression for many years, some of which are appearing before you today.
    It's also important to remember that hatred online is never completely detached from hatred offline: hatred that is being promoted among sympathetic networks or directed at individuals and communities in the streets of our country. Although online hatred presents new challenges in terms of the ready accessibility of such extreme views, the roots of these attitudes are based on arguments and myths with long and influential histories in Canada and around the world. We must confront these attitudes at every opportunity.
    A national strategy to address online hatred, then, must also equip families, community leaders and individual Canadians to challenge expressions of hatred, extremism and violence wherever they may encounter it. Education and awareness must be key parts of any strategy to address this issue, equipping people with the tools they need to dismantle these ideologies.
    While I've been speaking about online hate in fairly broad terms this morning, we must also name that there are specific communities being targeted by these sentiments and that any national strategy to combat this hatred will only be effective inasmuch as it recognizes the specific realities and repercussions of particular forms of hatred. Although an overarching strategy is certainly necessary in this work, we must also develop integrated strategies that address and debunk the myths underpinning anti-indigenous racism, anti-black racism, anti-Semitism, misogyny, Islamophobia, homophobia and transphobia, xenophobia and anti-immigrant sentiment, religious intolerance and other forms of hatred that have distinct impacts on the safety and security of identifiable groups of people in this country.
    The Anglican Church of Canada is increasingly attending to the importance of online presence as a positive means of communication and education, so supporting the development and implementation of a national strategy to combat hatred online is a natural step in this work for us.

  (0900)  

     We recognize that we have the ability to reach thousands of Canadians through our services and programs across the country, as well as with our online presence. We are committed to continuing to lift up a vision of this country and this world that truly welcomes and respects everyone by offering safe, supportive spaces for all Canadians and by challenging expressions of hatred directly.
    If we fail to take more concerted action in this country to combat all forms of hate—online and in person—then further high-profile acts of violence will embolden similar action by others. We must all work together to offer a positive, loving alternative to this hate, an alternative that affirms the inherent dignity of all persons in Canada and around the world.
    Such an alternative requires strong, strategic direction from government that supports efforts by all stakeholders to challenge these attitudes. We are ready to collaborate in this work together with our prayers, our pulpits and our presence online, but only by working together can we confront this important issue and make our world a little safer for so many.
    Thank you.
    Thank you very much.
    We will now move to the Canadian Rabbinic Caucus, Rabbi Scher.
     Thank you for having me here today. I represent the Canadian Rabbinic Caucus, a group of around 200 rabbis from across the country and from across the Jewish denominational spectrum.
    On October 27, 2018, 11 Jews were murdered at the Tree of Life synagogue in Pittsburgh, Pennsylvania. The murderer had been highly active in promoting anti-Semitism on social media. It's reported that he posted more than 700 anti-Semitic messages online in the nine months or so prior to the attack. Just two hours before the attack, the murderer foreshadowed his actions in his final disturbing online post.
    On Friday, March 15, 50 Muslims were murdered by a white nationalist terrorist at two mosques in Christchurch, New Zealand. These murders played out as a dystopian reality show delivered by some of America's biggest technology companies. YouTube, Facebook, Reddit and Twitter all had roles in publicizing the violence and, by extension, the hate-filled ideology behind it.
    The shooter also released a 74-page manifesto denouncing Muslims and immigrants, which spread widely online. He left behind a social media trail on Twitter and Facebook that amounted to footnotes to his manifesto, and over the two days before the shooting he posted about 60 of the same links across different platforms, nearly half of which were to YouTube videos that were still active many hours after the shooting.
    As these horrific attacks demonstrate, hate can be lethal, and online hate can foreshadow mass violence. There is no question that the Internet has become the newest frontier for inciting hate that manifests itself disturbingly offline.
    In 2017, the World Jewish Congress, representing Jewish communities in 100 countries, released a report indicating that 382,000 anti-Semitic posts were uploaded to social media in 2016. Stated differently, that's one anti-Semitic post every 83 seconds.
    Although information on online hate in Canada is limited, between 2015 and 2016, according to Cision Canada, a Toronto-based PR software and service provider, there was a 600% rise in intolerant hate speech in social media postings by Canadians. The architect of the study explains that while some of that intolerant or hateful speech was generated by bots, as determined by analyzing the high frequency of posts over a short time, the researchers noted that the bots' language was later mimicked by human users, and therefore it was just as destructive.
     These numbers are staggering.
    The Canadian government rightfully prides itself as a global thought and action leader in the area of protecting the rights, the safety and the quality of life of the people both within its borders and worldwide. We personally have felt this. Canadian law enforcement agencies have been exceptionally responsive in providing support to our institutions, particularly following the Pittsburgh attack.
     However, what is now needed is for federal policy-makers to prevent similar atrocities by launching a national strategy to combat online hate. The explosive growth of digital communications has coincided with rising alienation from traditional media and institutions. Extremists have taken advantage, preying on vulnerable disaffected individuals through the same digital tools and collaborative online culture that now shape so much of our world.
    There is, of course, no way to fully eliminate the threat of hate-motivated violence, but a strong national strategy to combat online hate can make a meaningful difference in protecting Canadians. The Centre for Israel and Jewish Affairs, CIJA, has set out a four-step policy recommendation towards fighting online hate.
    Step one of that recommendation is defining hate. One very important prong of this step is for the Canadian government to define what constitutes hate. This should begin with the adoption of the International Holocaust Remembrance Alliance—IHRA—definition of anti-Semitism. The IHRA definition is a practical tool that should be used by Canadian authorities in enforcing the law and as well by social media providers in implementing policies against hateful content.

  (0905)  

     The further steps of CIJA's recommendation include tracking hate, preventing and intervening to stop hate.
     On that last step, intervening to stop hate, I would like to make it very clear that we are not looking to police distasteful speech. Freedom of expression is of course a core Canadian value. We are focused on the glorification of violence and systematic propaganda affecting Jews and other communities.
     We are confident that an effective balance can be struck between protecting free speech and combatting online hate that demonizes entire communities and leads to violence and murder.
    This of course is a complex issue, but we are calling on the Government of Canada to take the lead in understanding it and developing the tools to counter it. We are calling on the Government of Canada to launch a national strategy to tackle online hate, working in partnership with social media platforms and Internet service providers, as well as other appropriate partners. This is a crucial step in making a difference that we so badly need.
    Thank you.
    Thank you very much, Rabbi.
    We will move to the Ahmadiyya Muslim Jama'at. We've been joined by Imam Farhan Iqbal.
    Imam Iqbal, the floor is yours.
     Thank you for inviting me.
    As-salaam alaikum.
    The peace and blessings of God be on all of you.
    On behalf of the Ahmadiyya Muslim Jama'at, I would like to start by recognizing this Committee on Justice and Human Rights. As an imam of the Ahmadiyya Muslim Jama'at Canada, I offer our heartfelt regards and gratitude for giving us the opportunity to share our thoughts on this important and pertinent matter.
    There is no doubt that over the last decade we have seen an exponential rise in hate crimes in our society. When we look at the statistics around terrorism, gang violence and gun violence, we see a general, albeit worrying, upward trend. Similarly—and sometimes overlooked—with the dawn of the Internet and social media, we have seen a stark rise in online hate speech and violence as well.
    To start off, we can quickly discuss what hate speech really is, as was mentioned earlier as well. Where do we draw the line between freedom of speech versus hate speech? Hate speech refers to abusive or threatening speech or writing that expresses prejudice against a particular group.
     To put into perspective how dangerous this is, according to a Maclean's article, online hate speech rose 600% in Canada. Some of the words they monitored to demonstrate this rise are #banmuslims and #whitepower. Most recently, we are all aware of the rise of Islamophobia, which hit home in Canada in Quebec City, where a surge in mosque arson and vandalism across the country culminated in the Quebec City mosque attack. As recently as last week, in London, U.K., a right-wing extremist, Steven Bishop, pled guilty to a plot to carry out a bomb attack at the Baitul Futuh mosque, one of Great Britain's largest mosques, which belongs to the Ahmadiyya Muslim Jama'at.
     Ahmadi Muslims believe that free speech is a sacred freedom, insofar as it provides an indispensable flow to a human's thoughts and is a force for good. Hate speech and ideology designed to cause harm and grief must not be allowed to use the disguise of freedom of speech.
    We all recognize the imminent and very real threat of hate speech. How can we solve it? Is there a way to solve this problem by putting together a 30-, 60- or 90-day plan? Most probably not, and that's because this effort is not about bending heads. It's about changing minds.
    To combat the rise of hate crimes online and in person, we need to respect our differences and to continue to stay true to our notion of acceptance. Ignorance breeds suspicion, fear and anger. Familiarity breeds understanding, compassion and love. When people come together and find out how similar we are, it is only then that we can truly have feelings of sympathy, understanding, compassion and love. This is the way to combat hate. We need to open our doors and our hearts. We need to interact with one another with unconditional love. We need to recognize the rights of one another.
    To truly combat the rise of hate crimes, we need to accept that this is not something that will be fixed overnight. Rather, it is something that will require a daily and regular struggle, which will help shape the way we think and interact with one another. Once we start to truly embody the profound saying of “love for all, hatred for none”, only then can we start to tackle the rise of hate crimes.
    Together, we are stronger than any one individual. Love is much stronger than fear. Fear demands that we think of ourselves as somehow separate from one another. By coming together, our common bond increases our capacity for love to be dominant and allows us to live together in peace and harmony.
    Thank you.

  (0910)  

     Thank you very much, Imam.
     I'd like to thank all of the members of the panel for their very important contributions.
    We are moving to questions, and we will start with Mr. Cooper.
    Thank you, Mr. Chairman.
     Thank you to the witnesses.
    I certainly agree that this is an important study and that all Canadians should be extremely concerned by the proliferation of hate that has increased worrisomely in the last number of years. It has hit home in communities across Canada, most significantly in the horrific mosque attack in Quebec City, and we saw it just eight hours away in Pittsburgh at the Tree of Life synagogue, which I know well, because it's a couple of blocks away from where my brother lives.
    It's important that we tackle this issue and while we do so, we of course have to be cognizant—as I think all of the witnesses have indicated—of fundamental freedoms, including freedom of speech, and how we can strike that balance.
    Perhaps I'll begin with Mr. Fogel. You did mention section 13, which I, with respect, believe was a flawed section for a number of reasons. You mentioned that, with the removal of what many believe to be a flawed section in the Canadian Human Rights Act, there is a gap.
    When you look at section 318 or section 319 of the Criminal Code, or I think you cited section 320.1, I'd just be interested in understanding where you see the gap. Perhaps you could elaborate on what you would recommend to close what you state is a gap.

  (0915)  

    From our perspective, we were pretty agnostic about whether section 13, at the time it was debated, was tweaked to be responsive to some of the deficiencies inherent in the section, or whether, if it were abandoned, the Criminal Code provisions were given a more robust capacity to compensate for the loss of section 13. Essentially, the problem is the one that you referenced and that I think would be familiar to everybody on the committee. We are dealing with two competing imperatives. On the one hand is the desire to ensure that people can avail themselves of the freedom to express thoughts and ideas freely, without fear of persecution or prosecution, however odious those ideas might be. On the other hand, unlike our American cousins, we recognize that there is a limit to freedom of expression. When it begins to encroach on the safety, security and well-being of others, that really constitutes a red line.
    The challenge with section 13 was that it was both a sword and a shield. It was providing some insulation for those who really did have malicious intent and it wasn't offering protection to those who were the targets of toxic or vitriolic hatred. We had hoped that government writ large—it's not just a federal issue but applies also at the provincial and municipal levels—would come forward and demonstrate a real political will to pick up the slack from the loss of section 13. In fact, I think Richard wrote to the attorneys general across the country, calling on them to adopt a more aggressive posture in terms of considering and acting upon potential hate crime activity, which would be able to compensate for the loss of section 13. Frankly, that hasn't been our experience. There has been some, but not enough. If we were to return to a section 13 kind of model, we would want to ensure that there were provisions with respect to evidence, and also with respect to the onus of ensuring that it was not a SLAPP or vexatious kind of lawsuit, that really would protect the ability of individuals and groups to articulate ideas even if they didn't meet with a uniformly positive response.
    There were a number of other things that I think the committee could look back to in the testimony given by a variety of different stakeholders and that could offer a solution. What is clear, however, is that government writ large has to be able to employ useful tools, tools that do make a distinction between freedom of expression and freedom from hate, as they proceed along this line.
    I know this is straying just a little bit from your question, but what we should note is increasingly the call from social media providers, the platforms that we are all talking about ubiquitously, who themselves are struggling to figure out what the lines are and where they should be intervening and what should be their response to things that appear online. I think there is a signal that they're looking for leadership from government to help provide them with the guidance necessary for them to put into place, whether it's human resources or algorithms or what have you.... The numbers are staggering. I have reviewed some of the stuff from Facebook alone in terms of the hundreds of millions of posts. It's hard to wrap your head around the idea of monitoring and responding to that volume of information.

  (0920)  

     On clear guidelines from government, I'll return to the issue of the IHRA definition of anti-Semitism. One of the benefits that it provides a government is that it offers a clear definition. You can then use that as a template for the algorithms you are going to put into place and the word searches you are going to use to trigger closer scrutiny and so forth.
    If we were to do this across the span of looking at those specific things to which different communities...whether it's LGBTQ or the Muslim community, the Bahá'í, and so forth, I think that would go a long way in being able to entrench the kind of distinctions on freedom from hate and freedom of expression that everybody around this table I suspect is rightly concerned about.
    Thank you very much.
    Thank you very much.
    Mr. Virani.
    I'm going to put the timer on, because I know there are a lot of strong advocates at the table.
    First of all, welcome. Thank you for being here. As-salaam alaikum. Shalom. It's really, really important.
    This is something that I take seriously, Canadians take seriously and my constituents in Parkdale-High Park take very seriously. I know that all of you are here with the best of intentions.
    We are seeing an unbelievable amount of hatred, and I'm glad you outlined many aspects of it. Whether it's anti-Semitism, Islamophobia, homophobia, transphobia, anti-indigenous sentiment, anti-black sentiment, incel movements, etc., these are a cause for huge concern right here in Canada. We've seen the attacks in Quebec, Pittsburgh and New Zealand.
    There was at one point, a tool—and I want to pick up on this, but I'm going to hold you to a bit of a time limit, Mr. Fogel—that was applied in a previous iteration of the Canadian Human Rights Act, section 13. It talked about targeted discrimination based on a prohibited ground towards an identifiable group that was spread by means of a telecommunication. It emphasized that this included the Internet. That provision was removed by the previous government in or around 2012-13.
    At the time, it had gone through some challenges. As early as 2006, previous iterations of the groups who are here, including B'nai Brith, the Canadian Jewish Congress, and the Friends of Simon Wiesenthal were defending that very provision. In the Whatcott decision by the Supreme Court of Canada, the analogue to that provision was upheld on the very basis that has been discussed.
    I want to know whether your perspective is that was an invalid provision—so perhaps over to CIJA—and if it wasn't invalid, if you can drive at the heart of what you think needs to be added to make it more robust.
    I apologize for my lengthy answers, but nobody gives me time to speak at home.
    Look, section 13 was critically important. It provided the protections that you just referenced, and I think it is clear that Canadians and groups within Canada need them.
    The problem was that ironically, groups or individuals we should be concerned about were using section 13 as a way of pushing back against those who were raising legitimate free expression ideas or concerns about particular topics. It was chilling, or more precisely freezing, the ability of people to offer critical comment about things of public interest without fear of being brought before some judicial process to account for what they said, because others were claiming that was triggering hate against them.
     That was the vulnerability of section 13. There were a whole range of ways to deal with it. You have them in front of you. You've clearly done the research and I would invite the committee to look carefully at those, because it either has to be brought back in a better construct.... Irwin Cotler's formulation—I won't go through it now, as some of you are familiar with it and you can easily access it—was probably the most compelling way to restructure section 13, or provide direction to law enforcement, the public prosecution process, the attorneys general to become much more aggressive and active in applying the provisions of the Criminal Code.

  (0925)  

     I am going to stop you there. I have two more areas that I want to canvass.
    We talked a bit about social media companies, and perhaps I will ask the Canadian Rabbinic Caucus to address this one.
    There are examples of different jurisdictions that regulate social media companies much more vigorously. Germany comes to mind because of its history with anti-Semitism and Nazism. Its regulations have compelled Facebook to be much more robust in the human resources it dedicates to this issue.
    What can we learn from examples like Germany?
     I would again point to the Centre for Israel and Jewish Affairs' policy recommendation statement and its four steps. They write extensively on Germany as well as on some of the projects the UN has been involved with. I believe Mr. Fogel mentioned those as well.
    I think there's a lot to be learned. Some of the European countries are definitely further along than Canada is in these areas, and there are certainly ways to create a made-for-Canada or made-in-Canada type of approach that also uses the lessons of other countries and the UN as well.
    The last point I want to raise is that we talked about and a number of you picked up on Mr. Fogel's theme, which is about preventing, and, in fact, intervening.
    There is a white supremacist strand to a lot of what we're seeing around the world. We have had people in this country as recently as this week questioning the presence of white supremacy.
    I want to know from various people at the table whether there is white supremacy in this country. Second, if there is white supremacy, what is the responsibility of elected legislators to denounce that very fact?
    Maybe Ahmadiyya Muslim Jama'at wants to address that. No?
    I will open it up to everyone.
    I think—and I don't think I will get in trouble for saying this—that, yes, there is white supremacy in this country. I think it's important to name that and to say that and to challenge it.
    I think the responsibility of the elected representatives, as people who have been given voice by those they represent, is to stand up to that. The government has a clear role, I think, in challenging that ideology, and in countering that ideology, and in offering a counter-narrative that gives us something else to aspire to and something else to build relationships with rather than a white supremacist ideology.
    Thank you very much.
    Ms. Ramsey.
    Thank you all for being here this morning and for your presentations.
    Picking up on that thread, I think we saw this week an attempt by Facebook to address some of the groups in Canada that are sharing this information online. We saw the banning of individuals and groups, which I think was a very good move. It wasn't across all social media platforms, unfortunately. I think it was Facebook and Instagram that did that.
    To Mr. Fogel's point earlier, the depths that exist in the Internet, even within one platform itself.... There are just layers upon layers of social media giants trying to control this themselves. It really begs the question about how they can do this on their own without government intervention, without the Canadian government being a part of that and, I think, having some basic rules around what is acceptable and what isn't, some ground rules for platforms in our own country.
    You all spoke about Pittsburgh and the Christchurch shooting, and the extensive amount of Islamophobic and anti-Semitic material that had been posted by both of these individuals. I think Canadians are asking how it is happening that this is all being posted. Why is no one going to these individuals and stopping it at that point? Is this a failure of social media? Is this a failure of policy? They are also asking how it happens that people are out there sharing these volumes of information and no one is challenging it.
    I think Mr. Fogel spoke to this clearly, but I want to ask this to the other panellists: Do you think online platforms should be able to establish their own policies to address online hate, or do you believe that Canada should establish some ground rules as well?
    I think social media companies should be establishing their own rules, but the government does have a role in pursuing legislation that sets a baseline of requirements for that so it's not relying solely on the goodwill of these corporations but reflects the commitment of this country in how we manage communications and telecommunications especially. I think there is a both/and there.

  (0930)  

    I would agree with that. When I look at the history of social media companies trying to deal with these kinds of things, it's not very promising. It's not that they have given us great results and they are doing this so well.
    There was a CBC documentary, I think, about this as well, about how these companies try to deal with provocative videos or violent videos and those kinds of things. It deviates from what we're discussing, but it's because they are run for money. It's a business really. Those kinds of videos give them more viewers and more users and more engagement, so they tend not to take down all the controversial videos out there, all the violent videos out there. They keep some of them so they can keep on engaging their audience.
    When it comes to these companies, it's a question mark really how much we want to rely on them to make the perfect rules. I think the government should have some involvement as well.
     It seems that the mainstream social media platforms do have terms of use. They do have certain regulations and requirements, but it does seem very clear as well that they just cannot seem to keep up with what is going on as far as online hate is concerned. That being the case, and to echo what all of the witnesses have said, I think we need to see government take a leadership position. Of course, it will be in partnership with social media platforms, Internet service providers and other appropriate partners, but I definitely think that government will need to take the lead in this collaborative approach to actually being able to keep up with the monitoring and fighting of online hate.
    If I may add, Ms. Ramsey, in September 2018, a report on online hate was produced and given to the Prime Minister of France. They are due to present a bill in the next few weeks.
    When I was reading through the report, some of its elements were interesting and actually quite simple and helpful. One of their recommendations is to have a universal logo on all those platforms that you can click if what is written is hateful. That helps those Internet companies. Another suggestion they have that might be interesting to look at is a time limit imposed on companies to take down or erase those comments. Another that should be looked at is a way to make complaints online, so that you don't have to drive down to the police station to make them. When you look at things to put in a national policy, these are some elements that I would invite you to explore. They're simple, and I think they could be helpful.
    It's Tracey's time, so you have to....
    Oh, I wasn't sure if you were saying it was up.
    I was wondering if Mr. Fogel was trying to get your attention. That's why I was....
    If I can add two super-quick comments....
    Of course.
    Look, here are two things about social media platforms. On the one hand, asking them to self-regulate and to, in effect, censor some of the content goes directly against their business model, which is to expand as opposed to contract. I think we have to be sensitive to this.
    I have discerned over the past 12 to 18 months a sea change, an evolution in the thinking of these platforms. I think they are scared. They did not appreciate just how big they were getting, just how powerful a tool their platforms were. I do not think they have the confidence that they can do this on their own.
    My sense, from some of the testimony they've given in various legislatures around the world, but certainly here in North America, is that they are looking for some support and leadership from government. I think they need it both for insulation, as well as for objective third party adjudication, for lack of a better term, to help direct and guide how they're going to put into place the kind of infrastructure....
    With Facebook alone, just the other day I heard an interview where a senior employee at Facebook was talking about moving from having 10,000 to 30,000 people with the dedicated task of reviewing posts online. It's crazy. I think they do need government support, and they don't see that as encroachment, but as helpful intervention.

  (0935)  

    Thank you very much.
    Ms. Khalid.
    Thank you to the witnesses for coming in today and for your very wise testimony.
    Last year I was at a mosque with a fellow member of Parliament. We had our heads covered, and the fellow member of Parliament refused to take a picture with me. I can only speculate that it was because of fear of the online mob that was waiting to comment on and judge what this member of Parliament was doing at a mosque, for example, with a person like me.
    In light of all of this online hate, what role can public figures like me and other members of Parliament and you play in combatting this? We can talk about policy and effective laws to combat this, but as Mr. Iqbal said, it's not about bending heads; it's about changing minds. How do we, as role models or as community leaders, take that initiative? I'd like to know a little bit about what you are doing to proactively combat this kind of online hatred as well.
    Could we go down the line, starting with Mr. Fogel?
     A public conversation has to take place. We always look to what we call “thought leaders”, those who will help frame and guide the discussion. Frankly, I think parliamentarians and public officials are uniquely positioned to be able to set the terms of reference for the discussion.
    As I noted earlier in my remarks, this only works if there is broad buy-in on the part of Canadians. In some respects, this is a very scary subject. It brings to mind the idea of how intrusive into people's lives “they” can be, whether it's Internet providers or government and so forth, because what you're really talking about is scrutinizing what people post, making or rendering a judgment on it and then taking action on it. That can be very intimidating. In order to get that buy-in, I think there has to be a, maybe not a uniform but certainly a united, set of messages being conveyed from the political sector about that need and also how it's safe for Canadians to be contemplating some regulation, regime or protocol that will govern this.
    For our part, as was mentioned by Ryan, we have an obligation from our pulpits, from our community centres and from our schools to be echoing that message at the more granular and local level.
    Thank you.
    Mr. Scher.
    Staying away from policy for a moment and what our roles should be, certainly within our synagogues we actively fight hate. We actively educate on the importance of these issues. Our synagogue, just as an example, often serves as a platform for bringing different people together and educating ourselves about different types of Canadians and different people within our own communities. Those types of messages tend to be very, very important and quite striking when it comes to changing the tone of conversations and discussions specifically around this issue.
    As well, just as we do in our pulpits and our congregations, I think a following of that model by parliamentarians and political leaders as well would be very striking, just as in the example you gave of being in a mosque, being a leader in that type of position, and following suit in all of our different communities, making it very clear that this is something the Government of Canada stands for.

  (0940)  

    Thank you.
    Mr. Weston.
    The short answer is that I wish that the person had shared the photo. I think a big part of the role of public figures is to make visible these relationships, to build those relationships and to normalize things that might be unfamiliar to people who are involved.
    Just last week I was talking to one of our bishops, who said he's gotten to know the local rabbi and imam because they've attended so many vigils together. They realized at one point that the only effective way to prevent any further need for vigils was to have ongoing relationships where they were not strangers to one another and where it wasn't an intimidating prospect. Sharing your presence in some of these places is effective, as is creating opportunity for people to build relationships with one another. Maybe public figures can be the meeting point. Rather than compelling people to go into a situation that they may not be comfortable with, create opportunities for people to be in a neutral space. Get to know one another so that it's not about an “other” person over there; it's about somebody who is right in front of you and with whom you can relate.
    Thank you.
    Mr. Iqbal.
    I would echo Mr. Weston. There needs to be more interfaith dialogue. I think people in your position can certainly have a role to play in encouraging interfaith connections and interfaith dialogue.
    When I was in Toronto, I was part of the Toronto Area Interfaith Council for a couple of years. The engagement we had with the city enabled us to host the Parliament of the World’s Religions in Toronto. After the unfortunate van attack in Toronto, it was the Toronto Area Interfaith Council that was able to bring so many people together in the square and have that event.
    Vigils were mentioned as well. Vigils can only be successful if the different communities are coming together. People who are in your position can help promote these kinds of connections. You have so many connections with the community and different faith groups. If you help them come together, I think it can make a difference.
     Thank you.
    I want to thank the members of the panel. You guys have been extraordinary in offering insight to us today. I wish we had more time, but we have another panel that we have to get up here.
    I'm going to suspend the meeting briefly to change panels. I'd ask the members of the next panel to come forward. I thank all of you very much for your help on this.

  (0940)  


  (0950)  

     We are now going to start our second panel on online hate. It is a pleasure to be joined by Amnesty International Canada with Mr. Alex Neve, the Secretary General. Welcome back.
    The Armenian National Committee of Canada is joining us by video conference from Toronto, represented by Mr. Shahen Mirakian, the President. Welcome, Mr. Mirakian.
    The Association for Reformed Political Action Canada is also joining us, represented by Mr. André Schutten, Legal Counsel and Director of Law and Policy. Welcome.
    Finally, the Bahá'í Community of Canada is joining us. I'd like to welcome Mr. Geoffrey Cameron, Director of the Office of Public Affairs.
    What we normally do is try to put the person by video conference first in case we lose the connection, so we're going to ask Mr. Mirakian to start.
    Mr. Mirakian, the floor is yours.
    My thanks to the chair and the members of the committee for inviting the Armenian National Committee of Canada to provide evidence to you today. My name is Shahen Mirakian and I am the President of the Armenian National Committee of Canada. I apologize for not being able to join you in person today.
    As representatives of a community that has suffered genocide, the ultimate expression of hate-based violence, we are more familiar than most with the consequences of the promotion of hate. Similarly, as a community that has routinely advocated for positions that run counter to the status quo, we are fierce defenders of freedom of expression. In our view, there is no contradiction in these two positions. Hate propaganda is a means of infringing the freedom of expression of the targeted group by delegitimizing or vilifying identifiable groups. Hate propaganda makes it impossible for members of those groups to be heard or participate in civil society in a meaningful fashion.
    Canada's history of protecting freedom of speech and freedom of expression while criminalizing the willful incitement of hate or advocacy of genocide has been a powerful example for the international community. Now Canada has to apply the lessons learned from nearly 50 years of combatting hate in the real world to the virtual world and develop a national strategy to address online hate.
    As you are aware, on April 24, 2015, the House of Commons adopted motion M-587, calling upon the government to recognize the month of April as Genocide Remembrance, Condemnation and Prevention Month. The Armenian National Committee of Canada has worked with various other non-governmental organizations, particularly the Centre for Israel and Jewish Affairs, the Ukrainian Canadian Congress, and the Humura Association, to ensure that each year the government recognizes April as Genocide Remembrance, Condemnation and Prevention Month.
    However, this April, recognition alone will not be enough. Over the past year the ANCC has worked collaboratively with a broad coalition of human rights advocacy organizations to ask for action as well. An important part of that effort was to ask the government to combat online hate. In December 2018, the ANCC joined with 17 other organizations, many of whom are providing evidence today, in sending a message to the Minister of Justice asking that a national strategy be launched to combat online hate.
     This year, for Genocide Remembrance, Condemnation and Prevention Month we are working with a broad coalition of communities that have experienced the horror of genocide to ask all Canadians to join us in requesting that the Government of Canada adopt policy solutions to the problem of online hate. We would encourage all Canadians to visit itstartswithwords.ca. At this site, Canadians can read all about what can be done to combat online hate and learn what other actions Canada can take to do its part to prevent future genocides and properly recognize those that have already taken place.
    We also want to go on the record today as strongly supporting the four policy recommendations proposed in November 2018 by the Centre for Israel and Jewish Affairs as the basis for a comprehensive national strategy for combatting online hate. Those four policy recommendations are: defining hate, monitoring hate, preventing hate, and intervening to stop hate. We also agree that there needs to be a greater use of existing tools to address online hate as well as consideration given to implementing new tools to assist authorities in responding to online hate.
    One specific area of concern we would like to highlight is law enforcement, which we believe must make hate-motivated cyber-attacks or website-hacking a priority. Since 2008, websites of Armenian community organizations have been subjected to three separate incidents of cyber-attacks. The websites of Armenian-Canadian newspapers, churches and community organizations have been replaced with anti-Armenian propaganda, including, but not limited to, denials of the Armenian genocide. Despite publicizing these incidents and reporting them to law enforcement, we are not aware of any active effort to identify the perpetrators or bring them to justice. While many cyber-attacks will never lead to actual violence, it is very possible that the perpetrators are linked to groups that either advocate for or actually engage in violence. If law enforcement prioritized identifying the parties who engage in hate-motivated cyber-attacks, they would be able to obtain intelligence on potentially violent groups and prevent hate-motivated physical attacks.
    We also believe that the regular surveys conducted by the Canadian Centre for Justice Statistics should specifically track hate-motivated cyber-vandalism and not just track victimization by individuals who have received hate-motivated messages online. Hate-motivated cyber-vandalism is a criminal act, just like hate-motivated physical vandalism, and law enforcement resources should be equally allocated to both. Canada should work with the international community to bring the perpetrators of these incidents to justice, whether or not the perpetrators are physically located in Canada.

  (0955)  

     In this regard, Canada's signing in 2005 of the additional protocol to the convention on cybercrime specifically concerning the criminalization of acts of racist or xenophobic nature was an important step, but the domestic tools must be implemented to allow for the extradition of suspects and co-operation with international partners.
    Finally, law enforcement needs to provide communities with the tools to properly report these crimes and obtain updates about the investigation. As it stands right now, we are not clear on to whom these crimes should be reported and if they are actively being investigated or how we can find out if they are being actively investigated.
     The harm done to communities by hate-motivated cyber-vandalism can be in some instances just as severe as the harm done by hate-motivated physical vandalism. This study being undertaken by this committee today and in upcoming days is a very important first step in combatting online hate.
     We are very grateful to this committee for making room on its agenda during Genocide Remembrance, Condemnation and Prevention Month to bring attention to this issue and to do its part in preventing future genocides. We are hopeful that this study will result in an effective national strategy to deal with the pressing problem of online hate promotion.
    Thank you very much.
    Thank you very much.
    We'll now move to Amnesty International Canada and Mr. Neve.
    Thank you very much, Mr. Chair. Allow me to first make two very brief opening remarks.
    First, I very much want to acknowledge that we're gathering for this hearing today on unceded and unsurrendered Algonquin territory. Given the degree to which online hate, amongst other things, very much plays out for indigenous peoples in Canada, I think that's a very important acknowledgement to begin with.
    I also can't help but make a gender observation as we begin. I'm conscious of the fact that both the earlier panel and now our panel, including me, are all men, and I note that two of the eleven members of Parliament on the committee today are women. As you will hear from me and my remarks, Amnesty International is particularly concerned about the very virulent gendered dimensions of online hate. I expect and assume that going forward this committee will seek and will have opportunities to ensure that there's going to be a very strong representation of that concern.
    I'm not taking up your time, but I have to note that the committee did not invite individuals; the committee invited groups. For both of these panels, the groups that were invited chose to send men as opposed to women. That was not the committee that asked them to choose by gender—
    Mr. Chair, I wasn't—
    —and I will now give you back your time.
     Mr. Chair, I wasn't blaming the committee. I was just making an observation.
    Obviously, the question in front of this committee brings into sharp focus two crucially important human rights matters. The first is the right to be free from discrimination, including the ugliest manifestations of discrimination that arise through abuse and violence expressed as hate, which in far too many corners of our world continues to the extremes of mass atrocities, such as crimes against humanity and genocide. It is a very important and stark reminder less than one week after we have remembered and commemorated the horror of the Rwandan genocide and very much a reminder on my mind as I'm freshly back from having spent time in the Rohingya refugee camps in Bangladesh, which of course is all about a community that has had to flee because of hatred. There's no doubt that the online world, which continues to transform and grow virtually daily, has become a troubling front line in that reality of discrimination and hate.
    The second is the right to freedom of expression, which is often referred to as the lifeblood of the human rights system. This is the right to hold, shape and share opinions and ideas, to engage with others and to take part in public debate. It is essential for so many reasons, including that free expression itself provides the avenues for exposing and addressing injustice and for evolving our understanding about society and democracy and the environment in a way that makes for a better world. Equally, there is no doubt the online world has been a very important and growing avenue for providing new possibilities for free expression.
    On any given day, Amnesty International researchers, campaigners, advocates and our millions of activists and supporters worldwide are taking action to uphold both of these essential human rights, both of which, of course, are enshrined in numerous international treaties. In international law, the right to be free from discrimination is considered to be so fundamental that nothing ever justifies its abrogation.
    The right to free expression is a right that is balanced in its very formulation. The International Covenant on Civil and Political Rights notes that it's a right that carries “special duties and responsibilities” and may therefore be subject to restrictions only if they are provided by law and are necessary for respect of the rights of others. The key word here is “necessary”.
    I would suggest to you that this word and this question—restrictions on free expression that are necessary for the respect of the rights of others—go to the very heart of your work. Please do remember that word “necessary”, because if there is a cautionary lesson from the world of human rights protection, it is that restrictions and limitations of any kind on any right are always a slippery slope and governments are very quick to push the limits. Necessity is absolutely crucial.
    Hate-based and hate-fuelled discrimination is on the rise everywhere, often made easier—or at least more obvious—by the new and accessible channels the online world offers. Misogynistic racist hate has become a devastating phenomenon on almost all social media platforms. Amnesty International has drawn particular attention to that reality on Twitter through our major Toxic Twitter research project, which over the past two years has revealed how much online abuse and violence women are subjected to. This abuse and violence is exponentially worse for women of colour, LGBTI women, indigenous women and women from other marginalized communities. Like many indigenous organizations, faith groups and anti-racism campaigners, human rights organizations and others, we have repeatedly highlighted the many ways in which the hate and racism represented in white supremacy has also found a toxic home in the digital world. It is by no means limited to simply objectionable or offensive views, but is increasingly spilling over into hate-filled online discussions that stand in the background of threats against indigenous land defenders, and of course horrific acts of mass violence and killings such as in Christchurch, Pittsburgh, Sainte-Foy in Quebec, and Yonge Street in Toronto.
    Let me wrap up with six very quick final comments and recommendations that I hope will shape your ongoing work here. First, it is commendable and important that the Canadian Criminal Code does criminalize the incitement of hatred against a growing number of identifiable groups. That does not mean, of course, that from a human rights perspective protections against hate in Canada should in any way be considered to be full and complete. Rarely if ever does the criminal law offer the whole solution to any human rights challenge.

  (1000)  

    Second, given the rapid rise, in particular, of online hate and its increasingly devastating consequences, governments are compelled to look for further action including with respect to further tools for investigating, enforcing and imposing sanctions.
    Third, hate, as we would all agree, is obviously a human rights issue that often leads to the most violent expressions of discrimination. In our country, human rights commissions have mandates that are grounded primarily in addressing discrimination. It is therefore intuitive and obvious to consider the role they could and should be playing in responding to this serious concern.
    Fourth, any move to provide a mandate to human rights commissions, including the Canadian Human Rights Commission, to address online hate should be grounded in strong recognition of the vital importance of the right to be free from discrimination and the right to free expression, and the development of clear guidelines and criteria drawing on international human rights standards that would assist investigators and adjudicators in understanding and giving shape to the crucial interplay and relationship between those rights.
    Fifth, given the tension between these two rights, the legal complexities in finding the right balance, and the rapid evolution in the nature and the reach of the online platforms involved, any move to provide a mandate to human rights commissions, for instance, must involve a very serious commitment to ensuring adequate resourcing to support the training, the expertise, the research, the outreach and the education that would be required.
    Finally, changes to the role of the Canadian Human Rights Commission or any other human rights bodies with respect to online hate absolutely should go forward as part of wider approaches to tackling the growing concerns about online hate and fulfilling the need, still unaddressed in Canada, to develop a national action plan on gender-based violence, including through the ongoing development of a national anti-racism strategy and measures to respond to Islamophobia, anti-Semitism and other religious intolerance.
    Thank you, Mr. Chair.

  (1005)  

    Thank you very much.
    We will now move to Mr. Schutten from ARPA.
    The honourable members of this committee are studying online hatred and what, if anything, the federal government can do to restrict it.
    Before we can address how to fix the problem, we first need to ask where the problem comes from and who is best suited to fix it. In a certain sense, the dark corners of the web are a window into the dark corners of the human heart. Greed, lust, hatred, anarchy, covetousness and lies infect the Internet and our hearts as well.
     Aleksandr Solzhenitsyn, writing in The Gulag Archipelago, said this:
...the line separating good and evil passes not through states, nor between classes, nor between political parties either—but right through every human heart—and through all human hearts.... And even in the best of all hearts, there remains...an un-uprooted small corner of evil.
Since then I have come to understand the truth of all the religions of the world: They struggle with the evil inside a human being (inside every human being). It is impossible to expel evil from the world in its entirety, but it is possible to constrict it within each person.
     Charles Colson, the founder of Prison Fellowship International, builds on this idea in his book Justice That Restores. He writes that there is no more urgent task than to restore the sense of community cohesion and to build a virtuous character into common life and that "without individual virtue, one cannot achieve a virtuous culture; without a virtuous culture, one cannot hire enough policemen to keep order."
    As Michael Novak has trenchantly observed, adapted here for a Canadian audience, "in a virtuous culture" we have 37 million policemen and "in a culture that mocks virtue, we cannot hire enough policemen."
    Who is best suited to offer solutions to the problem of online hatred? I don't think the honourable members of this committee realize it, but you have already made a big step in the right direction when, just over a year ago, you amended Bill C-51 to preserve the protections afforded to houses of worship in section 176 of the Criminal Code.
    Not only did you signal, rightly, that you care about the protection of vulnerable citizens in a state of prayer and worship, whether in a mosque, synagogue, temple or church, but you also preserved protections for the institutions that can inculcate that virtue in individuals so that we can have a virtuous society. If we want that virtuous society, we need to protect churches, mosques and synagogues to continue to preach peace, shalom, shalam. That's where the work against online hate starts. It is absolutely necessary for this committee, indeed all of Parliament, to understand this. Do not undermine houses of worship; protect them and expect good things from them.
    However, I'm not suggesting that the state has no other role in combatting violence and the senseless slaughter resulting from seething hatred, as witnessed in New Zealand and Pittsburgh. The Hebrew psalms speak to the proper role of the state. Psalm 72 says of the king:
For he will deliver the needy who cry out, the afflicted who have no one to help. He will take pity on the weak and the needy and save the needy from death. He will rescue them from oppression and violence, for precious is their blood in his sight.
    This psalm points to the God-given role of the state to protect from bloodshed and violence the weak and the needy, the vulnerable citizen.
    The Apostle Paul, in his letter to the Romans builds on this command. He says:
...the one in authority is God's servant for your good. But if you do wrong, be afraid, for rulers do not bear the sword for no reason. They are God's servants, agents of wrath to bring punishment on the wrongdoer.
    A clear application of this biblical passage to online hatred would be that the government does have a role in enacting swift justice to punish a wrongdoer seeking violence against another person or group of people. So where the vitriol of online hatred rises to the level of incitement to violence or threats of violence, which are crimes under Criminal Code sections 264.1 on threats, 318 on advocating genocide, and 319 on public incitement to hatred, then the police must act swiftly to investigate, to arrest, to charge and then to prosecute.
    Perhaps—and I put this out there as a thought experiment—one impediment to swift action and swift justice on the crimes of advocating genocide and public incitement to hatred is the unusual requirement that the attorney general's consent is needed to proceed. Perhaps, by removing those two subsections, we could increase the ability of police to pursue, without delay, action to stop such crimes from happening.
    However, one word of warning that ARPA Canada wants to share is that we are very concerned about overzealous attempts to fix the problem of online hate. We co-signed a letter requesting the justice committee to study this issue with a good faith understanding that we would be able to raise legitimate concerns about what would constitute going too far.

  (1010)  

     We are very concerned about any attempt to reinstate a hate speech provision in the Canadian Human Rights Act. These provisions have been shown to be ineffective and often abused. They chill freedom of expression and are applied in a demonstrably unfair way. Let me give you one example of what some commentators have described as politically correct double standards.
     In 2003, in a case called Johnson v. Music World Ltd., a complaint was made against a record label for a song called Kill the Christian. The lyrics of the song were read into the record by the complainant, and included the following, referring to Christians:
     You are the one we despise
    Day in day out your words [comprise lies]
    I will love watching you die
    Soon it will be and by your own demise
    ...Satan wants you dead
    Kill the Christian, kill the Christian
    Kill the Christian, kill the Christian
    ...The death of prediction
     Kill the Christian
    Kill the Christian, dead!
    The panel found that while the content and tone of the communication appeared on their face to be discriminatory, there was “very little vulnerability of the target group”, so there was no violation constituting hate speech. Yet three years later, in a case called Lund v. Boissoin, a panel found that a letter published in a mainstream newspaper in Red Deer, Alberta, that made disparaging remarks about homosexuality was in fact hate speech and ordered the writer to cease publishing in future in newspapers, in email, on the radio, in public speeches—including sermons—or on the Internet. The panel chair for both of those decisions was the same person: Lori Andreachuk.
    Public policy discussions, I would argue, require as broad and as open an access to expression as is possible. Freedom of expression ought to be such that all citizens feel free to speak about all public policy issues as best they can. We can preserve that freedom, and we must preserve that freedom. By putting finite resources into hate speech codes other than the Criminal Code, the government potentially will distract from true hate speech that leads to violence. That’s a distraction that will not do much to curb the kind of violence we saw in Pittsburgh or in New Zealand.
    To conclude, my requests would be as follows.
    One, take seriously the protection of other institutions in society that can inculcate virtue in our citizens, including religious institutions.
    Two, the state needs to demonstrate swift justice against these crimes. Ecclesiastes 8:11 says, “When the sentence for a crime is not quickly carried out, people’s hearts are filled with schemes to do wrong.” This committee should consider removing the requirement for the attorney general’s consent to prosecute incitements to genocide and public hatred in subsections 318(3) and 319(6) of the Criminal Code.
     Finally, we ask that this committee not entertain incorporating hate speech measures into the Canadian Human Rights Act. This distracts resources from the more pressing work of preventing violence against vulnerable citizens.
    Thank you very much.
    Thank you very much. I would appreciate it if you would share the links to the two judgments you referenced.
    Yes, absolutely.

  (1015)  

    Thank you very much.
    Mr. Cameron, the floor is yours, sir.
    I would like to thank the committee for inviting my testimony today as a representative of the Bahá'í Community of Canada. I'm also appearing as a member of the executive committee of the Canadian Interfaith Conversation, a national body that seeks to foster and promote religious dialogue and harmony.
    Bahá'ís, as members of a religion that has been in Canada since the late 1800s and that has established communities in most localities in this country, are not the targets of online hate in Canada. However, this issue is of particular concern to our community first and foremost because of core teachings of the Bahá'í faith regarding the promotion of the fundamental oneness of humanity and the elimination of all forms of prejudice. Public or private expressions of hatred towards groups of people, whether online or off-line, are inimical to these beliefs.
    We have joined with many other faith and civil society groups to call for the study of the root causes and potential solutions to the rising incidence of online hate that has been directly connected to violent attacks on particular groups. Women, Muslims, Jews, Sikhs and racial minorities have been among the most recent targets of violence that was inspired by hatred spread online.
    The recent attacks on Muslims at prayer at two mosques in Christchurch, New Zealand; the van attack in downtown Toronto; the attack on Jewish worshippers at the Tree of Life synagogue in Pittsburgh and the shooting at the Islamic Cultural Centre of Quebec City are all recent examples of killers who spent extensive time in digital worlds of hatred.
    As Professor Richard Moon has found, “Hate crimes are committed most often...by individuals who have immersed themselves in an extremist subculture that operates at the margins of public discourse, and principally on the Internet.”
    Sadly, this is also a problem with which Bahá'ís have first-hand experience in other countries. In the most egregious case of Iran, a government-supported media campaign of defamation and incitement to hatred has been directly tied to outbursts of violence and murder targeting Bahá'ís. A similar pattern has begun to proliferate in nearby Yemen.
    It is clear, then, from a growing body of experience, that the spread of online hatred targeting a defined group can lead individuals, who are perhaps already inclined to bigoted thinking, to act with violence.
    What should be done about this problem? Any lasting solution has to somehow take into consideration the roles and responsibilities of individuals, groups, corporations and the institutions of government. With regard to government, I will refrain from commenting on the question of whether section 13 should be reinstated or whether the hate speech provisions in the Criminal Code are sufficient to prosecute cases of online hate. There is a delicate balance, as others have mentioned, to be struck between guaranteeing the free exchange of ideas in the public sphere and sanctioning those whose aim is not to advance truth, but to spread hatred. Clearly the government and, by extension, the courts have a role to play in prosecuting cases of hate speech.
    It is also increasingly clear that policy intervention by government is needed to mitigate the impact of the more egregious misuses of online social networks. Despite recent steps taken by Facebook and Twitter to remove certain accounts, government also has a role to play in regulating these online platforms. Any effective policy intervention must ensure national and local community involvement in determining the standards for online platforms. As David Kaye, the UN special rapporteur on the freedom of expression, has urged, relying upon international human rights norms rather than the arbitrary judgements of commercial platforms is a better basis for the development of these standards. This includes delineating the rights and responsibilities of users, as well as safeguards to ensure that freedom of expression is not unduly curtailed.
    However, government action by itself is insufficient. There is also a role for civil society in pushing these companies further in the right direction, beyond the letter of the law. One organization, Change the Terms, has called on tech companies like Facebook, Google and Twitter to take steps to curb the use of social media, payment processors, event-scheduling pages, chat rooms and other applications for hateful activities. There are concrete steps that can be taken by these powerful companies, which are accountable both to government and to the wider society, that can create a healthier public sphere for all of us.
    Finally, there is an educational responsibility that falls to community leaders, teachers, families and parents. Changes in the attitudes, values and behaviours of individuals are a necessary part of the solution.

  (1020)  

     The online environment is ultimately a mirror reflection of our society. We live in a world in which prejudice against certain groups is propagated by many people, even those who do not intend to provoke violent reactions. Religious leaders have a particular responsibility to educate people, to promote fellowship and concord and not to stoke the fires of fanaticism and prejudice. Young people especially need access to education that teaches them from the earliest years that humanity is one family. They require education and mentorship that go beyond a simplistic condemnation of hatred or a set of dos and don'ts regarding their online activities. Youth need to develop a strong moral framework on which to base decisions about their online activities, about which content they choose to consume and share, and about how they use their powers of expression when communicating with friends and strangers online.
    Any long-term solution to online hatred has to give due consideration to this generation that is coming of age in an information environment that is confusing, polarizing and indifferent to their moral and ethical development. From where do young people learn to express themselves, using language that is intended to educate rather than to dismiss or denigrate? As they seek to learn about social issues, how will they know the difference between intelligent criticism and hateful propaganda? What ethical tools and social support are we giving to them as they navigate the online world?
    Answering these questions is a responsibility that falls not only to government; it is part of a response to online hate that we must all accept to carry forward.
    Thank you.
    Thank you very much.
    Thank you very much to all members of the panel for your interventions.
    We'll now go to questions, starting with Mr. Barrett.
    Thanks to all of the witnesses who are appearing today. It's important as we move forward in the study that everyone agree that we clearly denounce white supremacy and racism and intolerance directed to any group and denounce any hateful ideologies. I think in being aligned in that way we would have a very good basis to move forward.
    To pick up on Mr. Cameron's comments on how we can encourage the reporting of online hate propaganda—because it obviously is going to take a whole-of-society approach to address this—the government can do what governments do. None of the major players are Canadian corporations, so we need to have their voluntary co-operation in a lot of cases to have any real meaningful effect, but really, as Mr. Schutten said, 37 million people working together will be more effective than any agents of government.
    So how would you recommend that we encourage the reporting of online hate propaganda? I'll ask the four witnesses to answer that question.
    I'll begin by picking up on some of the findings that have come out of the extensive research we've done with respect to Twitter. I highlight Twitter not because we think it's the worst—and it's certainly not the only platform of concern—but simply because that is where our research has focused.
    One of the very serious concerns we noted is that many people were reporting abuse—and in our report this was focused on women in particular—and the complaints were going nowhere. There was a profound lack of transparency with respect to the mechanisms that Twitter had in place, for instance, to address that. There was virtually no public reporting of the complaints they were getting, and this raised all sorts of concerns with respect to Twitter's own accountability.
    Now, I totally agree with you that the challenge here is that these massive online platforms are not Canadian corporations. That does not mean, of course, that Canadians—individual Canadians and Canadian governments, whether federal or provincial—don't have a very important role on that side of things to keep the pressure on these online platforms to make sure that there is a real possibility of doing what you have raised and that something comes of it, so that there's actually a response and action taken with respect to legitimate online hate on those platforms and more public reporting from the companies themselves to demonstrate the extent of the problem and what they're doing about it.

  (1025)  

     Thank you.
    Reporting on online hate is tricky, and I'll say two things in this regard. One is that the whole-of-society approach makes me a little less afraid of overreach into freedom of expression, in the sense that we see organizations like Amnesty International and others doing these big studies and investigations and so on, which I think is a good first step.
    As far as individuals reporting on what they see online is concerned, the big struggle is that everybody has their own definition of hatred. We're seeing, I think more and more—I'll speak only of the Canadian context—that we lack the ability to disagree well. We don't know how to debate anymore. People who are online in particular are quick to throw labels, such as words ending in “phobe”—whether that's homophobe or Islamophobe or Christophobe, or whatever—to anything they feel uncomfortable with or they disagree with. That is stifling good, honest, rigorous debate about issues and ideas and policies and so on.
    In the first panel, there was an idea of a universal button that you could click to report an online hate crime of sorts. It seems to me that's something that could be easily abused if we're not all in agreement, at least on a baseline, of what we mean by online hate.
    I would agree with what we said in that initial letter that came to this committee about defining hate. That's going to be a very tough decision to make, but we have to do it right.
    Do I have any time left?
    You have one minute left.
    Mr. Cameron.
    Thank you.
    I think the question of what to do with reporting is secondary, in a way, to the question of exactly what standards reporting is evaluated against. This echoes other comments on the panel.
    As I mentioned in my testimony, I think there's a healthy discussion right now about the terms set by these online platforms for participation in them. I think there's a real need to reconsider what those terms are and for that reconsideration to be done not only in response to government regulation, but also in consultation with local and community groups and with reference to international human rights standards.
    I don't think it's a simple technical issue, but a reconsideration of exactly what is acceptable and what the standards are that are used to evaluate acceptable speech on these platforms.
    You have 10 seconds.
     Do you want to ask Mr. Mirakian?
    Yes, if he could—
    Very quickly, I'd like to speak about the issue of cyber-vandalism and the reporting of that, which was something I brought up before.
    If someone were to, on a hate-motivated basis, physically vandalize a structure belonging to the Armenian community, I would know how to report that fairly easily. If someone were to graffiti something onto my community centre, I know exactly who to phone from the police and how to report that.
    However, if someone vandalizes or hacks into my website and replaces the content with the same sort of propaganda that they would have spray-painted onto my community centre, I have no idea to whom to report that properly. For instance, the website may be for a Canadian organization but hosted in the United States or in a different country. The perpetrators could be from anywhere in the world, and there's no way of identifying them. They have complete anonymity. We don't know if they've ever even entered Canada.
    They may steal data from us. They've stolen email lists, for instance, and sent hateful messages to people on those lists. We have no idea, again, to whom that should be reported properly. There should be some information provided to community organizations on how to report these sorts of incidents properly. These should be tracked in the same way that physical vandalism has been; otherwise we under-report the incidence of hate crime in Canada.
    I think that's one thing I would very much recommend, that the committee work with law enforcement, especially federal law enforcement—I think they're best positioned to deal with this—to figure out how this reporting should be done, and to which law enforcement agencies, and to ask them to prioritize dealing with these types of incidents as well.
    Thank you very much.
    I'd like to note, for the benefit of the witnesses, that this committee made an amendment to a private member's bill of one of our colleagues, Chandra Arya. It gave the same protection, for example, to an Armenian community centre, with respect to the burden of establishing a hate crime, that previously was granted only to churches, mosques, or synagogues.
    We get what you said. Thank you very much.
    We'll go to Mr. Ehsassi.

  (1030)  

     Thank you, Mr. Chair.
    Thank you to all of the panellists. I think you've been very helpful in identifying this huge problem that we have and that something has to be done about it.
     Mr. Neve, we just heard from Mr. Mirakian that one of the problems we're having is that no one knows how to navigate their way through different online platforms. It seems to me that the leadership Amnesty has shown with respect to Twitter abuse.... I think in your response you were also alluding to that, that Twitter wasn't very effective in flagging online hate and being responsive to it. Have you found that some of the other online media tools have been more effective than Twitter?
    I wouldn't want to put ourselves out there as having done that comparison to the degree of rigour that I would feel confident in giving an assessment to you. I think what we're seeing, and I think I heard remarks in the earlier panel about this, is that there is starting to be a bit of breathlessness among many of the online companies in recognizing how vulnerable they are in this regard. Some of that I think is coming from the right place of recognizing that there's some real culpability and responsibility for their past approaches having led to very serious abuses and acts of violence. Part of it is obviously commercially motivated as well.
    It has been slow with Twitter, and that's been frustrating and difficult. We are getting some pickup and some changes are slowly happening. However, I think much more pressure is needed, including from governments. I think governments need to go on the record with companies like Twitter with very clear demands and recommendations around this whole area of change and prevention and accountability and oversight.
    Speaking of governments, we did hear earlier today about some jurisdictions or countries doing a better job in combatting online hate. Germany was one example that was provided. Is there any particular jurisdiction that any of our witnesses would like to refer to as an example that has the right regulatory or legal framework?
    Not to my knowledge.
    I'd have to pass on that as well.
    I will too, but I will certainly follow up on that. Obviously, we have global networks that I can access. I would be very surprised if I came back to the committee with the gold standard anywhere. I think this is a challenge, and it's a challenge that pretty well every government is struggling with and still falling short on, but I'm sure there are some good practices that we can share with you.
    Thank you.
    Mr. Mirakian, do you have a response?
    No, I'm sorry.
    One of the things we've heard is the alarming rise in online hate. We heard about bots. We heard all sorts of horrific numbers. One of the things that concerns me in particular with respect to certain communities is that they face state-sanctioned hate. In the case of the Bahá’í community, it's good to hear, Mr. Cameron, that there isn't a lot of activity in Canada that should be disconcerting.
    Have we seen state actors necessarily take on particular groups? Have you evidenced anything of the sort when it comes to the Bahá’í community in Canada?
    Do you mean overseas or in Canada?
    Overseas, yes, and your sense that other states are trying to promote online hate.
    As I mentioned in my testimony, the evidence that we have from Iran indicates that tens of thousands of pieces of state-sponsored propaganda have been published in state media, which have also then been picked up and repeated by other actors in society. There have been cases of murder with impunity that are directly linked to that violence.
    As I mentioned, although our community has not been a target of online hate here in Canada, we're not unaffected, given the number of Iranian Bahá’ís who have come to Canada, many as refugees, seeking shelter from violence that is in many cases provoked by state-sponsored propaganda, often online.

  (1035)  

    Mr. Mirakian, I'd be remiss if I didn't ask you the same question. Has there been any evidence of state-sanctioned hatred by other countries that we should know of in Canada?
     I'm not certain if it's been in Canada, but the President of Azerbaijan, for instance, once tweeted that the enemy of Azerbaijan was the Armenian diaspora, or all the Armenian people. He tweeted this in the English language for everyone to see.
    It's hard to know whether he was sanctioning the sort of attacks that we are subject to from overseas perpetrators who are repeating that sort of propaganda, or if he, himself, is just being provocative and then thinking, what happens, happens.
    Obviously, I have no way of figuring that out. I would appreciate if Canadian law enforcement were to look into that sort of thing and figure out if there is actually state sponsorship behind some of these acts of online hatred.
    Thank you.
    Mr. Ehsassi, you have 30 seconds.
    Mr. Neve, can I ask you about section 13 of the human rights act? Could you comment on that and how much of a gap has been created?
    We think this is something worth exploring. We recognize, and don't disagree with, the concerns about how some cases proceeded in the past.
    This is not the past; this is today. The realities around online hate are different than they were even five or six years ago when section 13 was abolished, and certainly in the years before that when it was being used.
    I think it does merit considering whether there is a role for the commission to play here, with all of the provisos that I highlighted earlier. It would have to go forward with a clear recognition of the importance of both rights, and the kind of training, expertise and resourcing—drawing on international standards—that really help develop a sophisticated understanding of how those two rights have a profound interplay with each other.
    Thank you very much.
    Ms. Ramsey.
    Thank you, panellists, for being here today.
    One of the challenges in online space is that people see articles and published material and believe them to be true. They don't often look at the source. I think there is a general distrust in mainstream media in our country and there is such a significant need for media literacy for people.
    Mr. Cameron, you were talking a bit about education. I think a core piece of what we're looking at here is people understanding how to identify what is a legitimate piece of media and what is something that shares perhaps hateful messages and things on the Internet, and how to distinguish between those things and determine that.
    I don't think that people generally have those types of analytical skills. The members of Parliament who sit at this table receive many emails from constituents who send us a link to something and ask what it is about. We're often able to debunk it or say that it isn't a credible source, but it's a very significant challenge.
    I wonder if you could each speak to the role you think our education systems in Canada should play in combatting hate in general, but certainly online hate. I have two teenage sons, and I don't believe that our education system is keeping pace with the culture, specifically, the online activity and technology. Our kids are on platforms that we don't even know about. There are these corners of the Internet where they are sharing information, and there probably aren't many parents or adults who are even in those spaces.
    I wonder if you can talk about how our education system could address that, and how we can address that gap for adults as well. Most of us in this room saw the Internet come and we got onto Facebook and all these different platforms and used it for whichever purposes—sharing things with family and friends—but it certainly has grown to a place where even our understanding of what's there and what's happening there is very limited.
    I wonder if you can speak to the role you think education should play in this.
    Yes, is the short answer. I couldn't agree more. I think that to frame it as media literacy is very important.
    I think we probably would all agree that even before the advent of social media, media literacy was always underplayed and overlooked in our education system. Even when it was the traditional mainstream legacy media that we were mainly concerned about, there were issues as well, including how to empower citizenry about how to engage with media and how to make use of media. That is exponentially more the case now.
    While there certainly are some good teachers, practices and modules that I'm anecdotally aware of—including some of the experiences my own children have had as they went through the education system—there is no doubt that right across the country there is, at best, profound inconsistency as to what is offered to young people around digital media literacy in a wider context of media literacy and empowerment. I think it absolutely needs to be prioritized.

  (1040)  

    I'd agree. I think education is always a good thing. I guess a question for this committee would be jurisdiction. Obviously education is a provincial matter, but certainly the federal government can encourage the provincial governments to push more on this.
     I think Internet safety is a big concern. We definitely pushed for more of that in sex education and so on, so that there would be more awareness about Internet safety among young people.
    As well, I think what would be needed as part of that is both a good and healthy definition of what hatred is and then a good education around how to disagree well. I think that would help a lot as well, and it would simmer things down a bit more online.
    The anonymous element online, too, is something that's significant. People don't use their real names; they're under an alias, or it's a bot. You don't even really know, at the end of the day, if it is really that person.
    Of course, we see horrible instances of this being used for trafficking, luring and all of these really sordid things, but when there isn't a clear identification even of who the person is, I don't know how you can attempt to stop them from what they're doing.
    I would agree that media literacy is incredibly important.
    I'm also concerned about hate-adjacent speech, the kind of prejudice that can percolate online where it becomes funny to make anti-Semitic comments or racist comments with friends in a joking way. I'm also concerned that the kind of polarization we see in our society is becoming reinforced by online algorithms that push young people especially toward content that is more and more at the extremes or margins of discourse, which then naturally cascades into exposure to more hateful material.
    I think, beyond just literacy about the sources of news, the kind of education that's required is one that goes much deeper, actually, to recognize comments as prejudicial or potentially hateful before they get to an incitement to hatred or violence.
    You have 15 seconds left.
    Okay. I will go over to our guest in Toronto, then, quickly.
    I think one place where the federal government does have a role is providing more examples in Canadian and world history of the consequences of hatred, which should be included in the resources that the federal government makes available online, especially.
    There is already a Canadian Holocaust resource web page that the Department of Canadian Heritage has online. I think there should also be similar pages for other incidents of crimes against humanity and genocide. Also, incidents of hate crime in Canada should be highlighted, whether on the web page of the Canadian Museum for Human Rights, or that of the Department of Canadian Heritage, or even Global Affairs Canada.
    Certain of these incidents are important in our history. It's important to remember things like the Christie Pits riot or.... I don't remember the exact name. I do know that every Japanese ship's name ends in Maru, but there was an incident on one with the Sikhs. Certainly the incidents between European Canadians and indigenous people should be highlighted better. These resources need to be available and people should have somewhere they can look and say, “This is what's real”.
    Thank you very much.
    Now we'll go to Ms. Khalid.
    I just have one question, and I'll be passing on the rest of my time to Mr. Virani.
    First, Mr. Neve, you referenced that a study has been commissioned by Amnesty International, if I'm not mistaken. Can you share that study with the committee, please, if possible?

  (1045)  

    Certainly. It wasn't commissioned by Amnesty, but it was carried out by Amnesty.
    It was carried out by Amnesty. Wonderful.
    It was a study that focused on the gendered dimensions of violence in the world of Twitter, called “Toxic Twitter”.
    Thank you. We would love to use that as we deliberate.
    Second, just in the interests of time, I'm going to stick with you, Mr. Neve.
    We see social media platforms give organizations and individuals the opportunity to pay to have more viewership of their messages, their opinions and whatever they're sharing. We've seen organizations raising money by playing on people's fears. We've seen organizations that are paying money to these specific interest groups to spread that very hateful message.
    Do you think perhaps some kind of framework could be developed with respect to that sponsorship piece, of getting your narrative out there? Otherwise, whoever has the most money would eventually have the loudest voice on the Internet. Could I have your comments on that, please?
     Sure. I'll be very frank in admitting that Amnesty International sometimes makes use of that.
    Exactly.
    We don't have the resources that most others do, so it's a little boost on Facebook, but that's sometimes an essential part of also getting positive messages out there.
    I think that as a wider strategy around tackling online hate goes forward, that is a key aspect of it. The strategy needs to be looking at all manifestations of how hate finds its place and gets an even greater profile in the digital world. If sometimes whoever has the deepest pockets is going to get the greatest reach with their hate, that has to be of top concern in any such strategy.
    Thank you.
    I will pass the rest of my time to Mr. Virani.
    That's generous. Thank you, Ms. Khalid.
    There are about four minutes left.
    First of all, thank you to all of you.
    It's good to see you again, Shahen, in Toronto. Thank you for being here. Your contributions are invaluable.
    I have a few points.
    Thank you for the reference to Bill C-51 and the amendments that were made. There have also been amendments by the government in respect of the security infrastructure funding, which is the funding we provide to increase surveillance capacity and security in places of worship. Unfortunately, these things have all been triggered by horrific events. It was doubled after the Quebec mosque shooting. It was doubled again after the New Zealand shooting. However, I think that's important.
    Mr. Neve, you also mentioned that the anti-racism secretariat money in budget 2019 is dedicated to developing a robust anti-racism strategy. There are issues that all of us care about. I, in particular, care deeply about these kinds of issues.
    Mr. Schutten, I want to ask you a question, because it's really germane to what we're studying here. You seem very well-versed legally, so I'm going to put to you a very strict legal question. The analogue to that provision was tested by the Supreme Court of Canada in its Whatcott decision, and section 13 was upheld. There was a minor amendment concerning belittling: belittling, the court held, is in the domain of free speech.
     However, is your issue with the text of section 13 as it then was, which has effectively been upheld by the Supreme Court of Canada, or is the issue you raised—and raised poignantly—with the decision-making that took place? As a lawyer, I know that inconsistent decision-making is the bane of any litigation lawyer. Where's the rub there?
    The Supreme Court, in the Whatcott decision, said that having a hate speech provision in the Saskatchewan Human Rights Code was one way a government could combat hate speech and so on, but it did not say that it was a necessary provision. So it's not that it's constitutionally obligatory for a government to have a hate speech provision. I would say that this is what the Supreme Court was pretty clear on.
    That said, I think that even post-Whatcott, the better policy decision is to not be policing expression, even Whatcott-style expression, because it does not rise to the level of violence. It doesn't rise to the level of the types of things that have triggered this very hearing.
     I do not defend Mr. Whatcott's way of expressing himself at all, but he's a person who is trying to engage in a public policy debate. He's doing it poorly, but that's what he's trying to do. Also, his engagement is particularly around political speech, and of all free speech, political speech is the most important to protect. We can quibble about whether or not the Supreme Court got it right, but I think it's walking a very fine line around that freedom of political expression.
    I can share with the committee, as well, a peer-reviewed journal article I've published with a law professor from Osgoode Hall on the Whatcott decision and on how we think the Supreme Court got it wrong.

  (1050)  

    I think that would be helpful.
    You have 30 seconds.
    With all of you leveraging how we better track.... And I'm cognizant that when we did the M-103 study, people came forward and said that Muslims will report to Muslim civil society groups, and Jews will report to Jewish civil society groups. B'nai Brith, which is in the room right now, has done a great report on hate crimes. How do we leverage civil society to help us do that tracking that we've heard about this morning?
    I think it's very hard for groups to track online hate. I spoke with the NCCM before testifying today and asked for their reporting on online hate against Muslims, and they weren't able to supply comprehensive reporting. This is an organization that's very well organized and does an effective job, so I think it's hard to track effectively.
    I would say there's some danger in leaving it only or primarily to communities to do this themselves. There will be resource constraints. There will be fearfulness factors. There will be all sorts of things that stand in the way of that.
    I would say that government has a role to work with communities to jointly design, resource and maybe even run the right kinds of reporting mechanisms. That really needs to be a joint effort going forward. Otherwise, at best, it's only going to be piecemeal. Some communities will have greater interest, capacity and resources, but others less so, and the last thing we would want, as a result of that, would be to have an incomplete and skewed picture of the extent of hatred in the country.
     Thank you very much.
    I want to thank all the groups that participated in this panel. You have been enormously helpful to us.
     I wish everybody a wonderful day.
    The meeting is adjourned.