Good morning, everyone. Welcome to the Standing Committee on Justice and Human Rights as we launch our study on the issue of online hate.
This is a really important issue. With increasing numbers of hate crimes being reported in Canada, groups feeling vulnerable, and the growing presence of hate on the Internet, this is a subject the committee wants to tackle.
I know that a number of groups across Canada have asked us to study this issue. Today we will hear from three, and a fourth may join us later on.
Today, representing the Centre for Israel and Jewish Affairs, we have Mr. Shimon Koffler Fogel, President and Chief Executive Officer; and Mr. Richard Marceau, Vice-President, External Affairs and General Counsel. Welcome, gentlemen.
From the Anglican Church of Canada, we're joined by Mr. Ryan Weston, the Lead Animator of Public Witness for Social and Ecological Justice. Welcome.
From the Canadian Rabbinic Caucus, we're joined by Rabbi Idan Scher. Welcome.
We might be joined by the Ahmadiyya Muslim Jama'at, represented by Imam Farhan Iqbal. If he arrives, he will follow the other members of the panel in testifying.
We will start with CIJA. Mr. Fogel, the floor is yours.
Thank you, Mr. Chair, and thank you to the members of the committee for inviting us to join this important conversation.
I'm pleased to offer reflections on behalf of the Centre for Israel and Jewish Affairs, CIJA, the advocacy agent of the Jewish Federations of Canada, which represents more than 150,000 Jewish Canadians from coast to coast.
CIJA is encouraged by the launch of this important study. Since the horrific mass shooting at the Tree of Life synagogue in Pittsburgh last October, the deadliest act of anti-Semitic violence in North American history, CIJA has mobilized thousands of Canadians to email the justice minister to call for a national strategy to counter online hate, beginning with this very study.
We've done so with the endorsement of a diverse coalition of partners, including Muslim, Christian, Sikh and LGBTQ+ organizations. It's also worth noting that various groups we have worked with to mark genocide awareness month, including Ukrainian, Armenian, Roma, Rwandan and Yazidi representatives, have united to call for a national strategy on online hate, knowing that genocide begins with words.
Increasingly, terrorist organizations and hate groups rely on online platforms to spread their toxic ideas, recruit followers and incite violence. This is a problem that spans the ideological spectrum. For example, Canadians who have been radicalized to join ISIS have often done so after extensively consuming jihadist content online.
In the case of two recent acts of white supremacist terrorism, the mass murder of Jews in Pittsburgh and Muslims in Christchurch, the killers made extensive use of social media to promote their heinous ideology. The Pittsburgh shooter reportedly posted more than 700 anti-Semitic messages online over a span of nine months leading up to the attack; and the Christchurch shooter's decision to livestream his horrific crime was a clear attempt to provoke similar atrocities.
We cannot afford to be complacent, given the link between online hate and real world violence. My hope is that this study will culminate in a unanimous call on the Government of Canada to establish a comprehensive strategy to counter online hate and provide the government with a proposed outline for that strategy.
Today I'll share four elements that we believe are essential to include: defining online hate; tracking online hate; preventing online hate; and intervening against online hate.
First, a national strategy should clearly define online hate and not assume that online platforms have the capacity to navigate these waters on their own. The focus should not be on the insensitive, inappropriate or even offensive content for which counterspeech is generally the best remedy. The explicit goal should be to counter those who glorify violence and deliberately, often systematically, demonize entire communities. While freedom of expression is a core democratic value, authorities must act in exceptional circumstances to protect Canadians from those who wilfully promote hate propaganda and seek to radicalize vulnerable individuals.
The international community's experience in defining anti-Semitism is an important model. The International Holocaust Remembrance Alliance, or IHRA, working definition of anti-Semitism, which is the world's most widely accepted definition of Jew hatred, should be included in any strategy to tackle online hate. It's a practical tool that social media providers can use to enforce user policies prohibiting hateful content and that Canadian authorities can use to enforce relevant legal provisions.
Second, a national strategy requires enhanced tracking and reporting of online hate, via strategic partnerships between the Government of Canada and technology companies. There are models worth reviewing in developing a Canadian approach, such as TAT, the “Tech Against Terrorism” initiative, a UN-mandated program that engages online companies to ensure their platforms are not exploited by extremists.
Third, a national strategy must include prevention. In the current global environment, trust in traditional media and institutions has declined as online manipulation and misinformation have increased. A campaign to strengthen Internet literacy and critical online thinking with resources to support parents and educators would help mitigate these trends.
Fourth, a national strategy must include a robust and coordinated approach to intervention and disruption. There's a debate over whether government regulation or industry self-regulation is the best approach. Canada can benefit from the experience of other democracies, especially in Europe, that are further down the road in developing policy responses to online hate.
Their successes and setbacks should be considered in crafting a made-in-Canada approach. The solutions, whether regulatory, legislative or industry-based, will flow from open and inclusive discussion with governments, service providers, consumers and groups like CIJA, who represent frequently targeted segments of society.
The committee will likely discuss the former section 13 of the Canadian Human Rights Act. CIJA has long held the view, as we stated years ago during the debate over rescinding section 13, that removing this provision leaves a gap in the legal tool box. There are multiple legitimate ways to remedy this.
One is enhanced training for police and Crown prosecutors to ensure more robust, consistent use of Criminal Code hate speech provisions. Three recent criminal convictions in Ontario—one for advocating genocide and two for wilful promotion of hatred—demonstrate that these sections of the Criminal Code remain an effective vehicle for protecting Canadians. Similarly, section 320.1 of the Criminal Code, which enables the courts to seize computer data believed on reasonable grounds to house hate propaganda, is a pragmatic tool that should be applied more often.
Another approach is to develop a new provision in the Canadian Human Rights Act on online hate. This requires addressing the clear deficiencies of section 13, which was an effective but flawed instrument. In line with recommendations offered by the Honourable Irwin Cotler, a restored section 13 would require significant safeguards to protect legitimate freedom of expression and prevent vexatious use of the section.
I know there are strong feelings on both sides of the House towards section 13. This need not be a partisan issue. With the right balance, a reasonable consensus can be achieved. I emphasize the need for consensus because the use of legal measures to counter online hate will only be truly effective if those tools enjoy broad legitimacy among Canadians. If misused, misconstrued or poorly constructed, any new legal provisions, including a renewed section 13, would risk undermining the overarching goal to protect Canadians and prevent hate propaganda from gaining sympathizers and adherents.
While Canada has not seen the same polarization that has marked American or European politics, we are not immune to it and the growth of anti-Semitism and other forms of hate in mainstream discourse that comes with it. History has repeatedly shown that Jews and other minorities are at grave risk in times of political upheaval and popular disillusionment with public institutions.
With advances in technology, these processes now unfold with alarming speed and global reach. The tactics of terrorists and hate groups have evolved and so too must public policy. While there's no way to fully mitigate the threat of hate-motivated violence, a strong national strategy to combat online hate would make a meaningful difference.
Mr. Chairman and members of the committee, I thank you for your time and would welcome any questions during the question-and-answer period.
Thank you, Mr. Chair and members of the committee, for the invitation this morning to speak to you about our concerns as the Anglican Church of Canada with regard to the proliferation of hate online and the very real impacts of this hatred for communities in this country and around the world. We're honoured to join with the voices of so many other diverse organizations in calling for stronger action on this issue across Canada.
Our religious tradition teaches that every person is imbued with inherent dignity and also particularly calls us to embody special care and concern for those who find themselves uniquely vulnerable to harm or attack. Additionally, as Christians, our faith community has enjoyed a historically privileged position in this country, so we recognize that we have a particular responsibility to speak out for the protection of others. If we are to take these commitments seriously, we must raise our voices to oppose hatred in all its forms.
As you all well know, recent years have seen a proliferation of extreme forms of hatred in online fora that encourage violence and dehumanize those who are the targets of this hate. Recent high-profile violent attacks in Canada and abroad have emphasized the reality that these sentiments do not remain online, but have tragic offline consequences as well, and that they are in need of immediate and sustained attention.
We join with many of the other witnesses here today in calling for the federal government to develop a national strategy to combat online hatred. This government has the ability to impose reasonable regulation on social media corporations, Internet service providers and other relevant corporate actors to control the proliferation of violent hate speech in online spaces. The government must also develop a strategy for more effective enforcement of existing laws regarding the public incitement of hatred, with particular attention given to the ways these attitudes are expressed online so that these activities will not go unchallenged.
Faith communities and civil society organizations must also affirm our commitment to combatting hate in the communities that we serve and to use our voice and influence to challenge such expressions wherever we encounter them. Recognizing that many of the hateful words and actions directed towards the communities impacted by hate speech are carried out in the name of religion, the active participation of faith communities and interfaith coalitions is essential to effectively combatting this reality.
We commit to continuing our work with ecumenical and interfaith partners in addressing these attitudes within our own communities, and we draw strength from the leadership and witness of many Canadian faith groups who have been actively working to combat hate and oppression for many years, some of which are appearing before you today.
It's also important to remember that hatred online is never completely detached from hatred offline: hatred that is being promoted among sympathetic networks or directed at individuals and communities in the streets of our country. Although online hatred presents new challenges in terms of the ready accessibility of such extreme views, the roots of these attitudes are based on arguments and myths with long and influential histories in Canada and around the world. We must confront these attitudes at every opportunity.
A national strategy to address online hatred, then, must also equip families, community leaders and individual Canadians to challenge expressions of hatred, extremism and violence wherever they may encounter it. Education and awareness must be key parts of any strategy to address this issue, equipping people with the tools they need to dismantle these ideologies.
While I've been speaking about online hate in fairly broad terms this morning, we must also name that there are specific communities being targeted by these sentiments and that any national strategy to combat this hatred will only be effective inasmuch as it recognizes the specific realities and repercussions of particular forms of hatred. Although an overarching strategy is certainly necessary in this work, we must also develop integrated strategies that address and debunk the myths underpinning anti-indigenous racism, anti-black racism, anti-Semitism, misogyny, Islamophobia, homophobia and transphobia, xenophobia and anti-immigrant sentiment, religious intolerance and other forms of hatred that have distinct impacts on the safety and security of identifiable groups of people in this country.
The Anglican Church of Canada is increasingly attending to the importance of online presence as a positive means of communication and education, so supporting the development and implementation of a national strategy to combat hatred online is a natural step in this work for us.
We recognize that we have the ability to reach thousands of Canadians through our services and programs across the country, as well as with our online presence. We are committed to continuing to lift up a vision of this country and this world that truly welcomes and respects everyone by offering safe, supportive spaces for all Canadians and by challenging expressions of hatred directly.
If we fail to take more concerted action in this country to combat all forms of hate—online and in person—then further high-profile acts of violence will embolden similar action by others. We must all work together to offer a positive, loving alternative to this hate, an alternative that affirms the inherent dignity of all persons in Canada and around the world.
Such an alternative requires strong, strategic direction from government that supports efforts by all stakeholders to challenge these attitudes. We are ready to collaborate in this work together with our prayers, our pulpits and our presence online, but only by working together can we confront this important issue and make our world a little safer for so many.
Thank you for having me here today. I represent the Canadian Rabbinic Caucus, a group of around 200 rabbis from across the country and from across the Jewish denominational spectrum.
On October 27, 2018, 11 Jews were murdered at the Tree of Life synagogue in Pittsburgh, Pennsylvania. The murderer had been highly active in promoting anti-Semitism on social media. It's reported that he posted more than 700 anti-Semitic messages online in the nine months or so prior to the attack. Just two hours before the attack, the murderer foreshadowed his actions in his final disturbing online post.
On Friday, March 15, 50 Muslims were murdered by a white nationalist terrorist at two mosques in Christchurch, New Zealand. These murders played out as a dystopian reality show delivered by some of America's biggest technology companies. YouTube, Facebook, Reddit and Twitter all had roles in publicizing the violence and, by extension, the hate-filled ideology behind it.
The shooter also released a 74-page manifesto denouncing Muslims and immigrants, which spread widely online. He left behind a social media trail on Twitter and Facebook that amounted to footnotes to his manifesto, and over the two days before the shooting he posted about 60 of the same links across different platforms, nearly half of which were to YouTube videos that were still active many hours after the shooting.
As these horrific attacks demonstrate, hate can be lethal, and online hate can foreshadow mass violence. There is no question that the Internet has become the newest frontier for inciting hate that manifests itself disturbingly offline.
In 2017, the World Jewish Congress, representing Jewish communities in 100 countries, released a report indicating that 382,000 anti-Semitic posts were uploaded to social media in 2016. Stated differently, that's one anti-Semitic post every 83 seconds.
Although information on online hate in Canada is limited, between 2015 and 2016, according to Cision Canada, a Toronto-based PR software and service provider, there was a 600% rise in intolerant hate speech in social media postings by Canadians. The architect of the study explains that while some of that intolerant or hateful speech was generated by bots, as determined by analyzing the high frequency of posts over a short time, the researchers noted that the bots' language was later mimicked by human users, and therefore it was just as destructive.
These numbers are staggering.
The Canadian government rightfully prides itself as a global thought and action leader in the area of protecting the rights, the safety and the quality of life of the people both within its borders and worldwide. We personally have felt this. Canadian law enforcement agencies have been exceptionally responsive in providing support to our institutions, particularly following the Pittsburgh attack.
However, what is now needed is for federal policy-makers to prevent similar atrocities by launching a national strategy to combat online hate. The explosive growth of digital communications has coincided with rising alienation from traditional media and institutions. Extremists have taken advantage, preying on vulnerable disaffected individuals through the same digital tools and collaborative online culture that now shape so much of our world.
There is, of course, no way to fully eliminate the threat of hate-motivated violence, but a strong national strategy to combat online hate can make a meaningful difference in protecting Canadians. The Centre for Israel and Jewish Affairs, CIJA, has set out a four-step policy recommendation towards fighting online hate.
Step one of that recommendation is defining hate. One very important prong of this step is for the Canadian government to define what constitutes hate. This should begin with the adoption of the International Holocaust Remembrance Alliance—IHRA—definition of anti-Semitism. The IHRA definition is a practical tool that should be used by Canadian authorities in enforcing the law and as well by social media providers in implementing policies against hateful content.
The further steps of CIJA's recommendation include tracking hate, preventing and intervening to stop hate.
On that last step, intervening to stop hate, I would like to make it very clear that we are not looking to police distasteful speech. Freedom of expression is of course a core Canadian value. We are focused on the glorification of violence and systematic propaganda affecting Jews and other communities.
We are confident that an effective balance can be struck between protecting free speech and combatting online hate that demonizes entire communities and leads to violence and murder.
This of course is a complex issue, but we are calling on the Government of Canada to take the lead in understanding it and developing the tools to counter it. We are calling on the Government of Canada to launch a national strategy to tackle online hate, working in partnership with social media platforms and Internet service providers, as well as other appropriate partners. This is a crucial step in making a difference that we so badly need.
Thank you for inviting me.
The peace and blessings of God be on all of you.
I would like to begin by recognizing this Standing Committee on Justice and Human Rights on behalf of the Ahmadiyya Muslim Jama'at. As an imam of the Ahmadiyya Muslim Jama'at Canada, I would like to offer our heartfelt regards and thanks for giving us the opportunity to share our thoughts on this important and pertinent matter.
There is no doubt that over the last decade we have seen an exponential rise in hate crimes in our society. When we look at the statistics around terrorism, gang violence and gun violence, we see a general, albeit worrying, upward trend. Similarly—and this is sometimes overlooked—with the dawn of the Internet and social media, we have seen a stark rise in online hate speech and violence as well.
To start off, we can quickly discuss what hate speech really is, as was mentioned earlier as well. Where do we draw the line between freedom of speech and hate speech? Hate speech refers to abusive or threatening speech or writing that expresses prejudice against a particular group.
To put into perspective how dangerous this is, according to a Maclean's article, online hate speech rose 600% in Canada. Some of the words monitored to demonstrate this rise were #banmuslims and #whitepower. Most recently, we are all aware of the rise of Islamophobia, which hit home in Canada with the Quebec City mosque attack. There had been a surge in mosque arson and vandalism across Canada, and eventually it culminated in that attack. As recently as last week, in London, U.K., a right-wing extremist, Steven Bishop, pled guilty to plotting a bomb attack on the Baitul Futuh mosque, one of Great Britain's largest mosques, which belongs to the Ahmadiyya Muslim Jama'at.
Ahmadi Muslims believe that free speech is a sacred freedom, insofar as it provides an indispensable flow to a human's thoughts and is a force for good. Hate speech and ideology designed to cause harm and grief must not be allowed to use the disguise of freedom of speech.
We all recognize the imminent and very real threat of hate speech. How can we solve it? Is there a way to solve this problem by putting together a 30-, 60- or 90-day plan? Most probably not, and that's because this effort is not about bending heads. It's about changing minds.
To combat the rise of hate crimes online and in person, we need to respect our differences and to continue to stay true to our notion of acceptance. Ignorance breeds suspicion, fear and anger. Familiarity breeds understanding, compassion and love. When people come together and find out how similar we are, it is only then that we can truly have feelings of sympathy, understanding, compassion and love. This is the way to combat hate. We need to open our doors and our hearts. We need to interact with one another with unconditional love. We need to recognize the rights of one another.
To truly combat the rise of hate crimes, we need to accept that this is not something that will be fixed overnight. Rather, it is something that will require a daily and regular struggle, which will help shape the way we think and interact with one another. Once we start to truly embody the profound saying of “love for all, hatred for none”, only then can we start to tackle the rise of hate crimes.
Together, we are stronger than any one individual. Love is much stronger than fear. Fear demands that we think of ourselves as somehow separate from one another. By coming together, our common bond increases our capacity for love to be dominant and allows us to live together in peace and harmony.
Thank you, Mr. Chairman.
Thank you to the witnesses.
I certainly agree that this is an important study and that all Canadians should be extremely concerned by the proliferation of hate that has increased worrisomely in the last number of years. It has hit home in communities across Canada, most significantly in the horrific mosque attack in Quebec City, and we saw it just eight hours away in Pittsburgh at the Tree of Life synagogue, which I know well, because it's a couple of blocks away from where my brother lives.
It's important that we tackle this issue and while we do so, we of course have to be cognizant—as I think all of the witnesses have indicated—of fundamental freedoms, including freedom of speech, and how we can strike that balance.
Perhaps I'll begin with Mr. Fogel. You did mention section 13, which I, with respect, believe was a flawed section for a number of reasons. You mentioned that, with the removal of what many believe to be a flawed section in the Canadian Human Rights Act, there is a gap.
When you look at section 318 or section 319 of the Criminal Code, or I think you cited section 320.1, I'd just be interested in understanding where you see the gap. Perhaps you could elaborate on what you would recommend to close what you state is a gap.
From our perspective, we were pretty agnostic about whether section 13 at the time it was debated was tweaked to be responsive to some of the deficiencies inherent in the section, or whether, if it were abandoned, the Criminal Code provisions were provided with a more robust capacity to compensate for the loss of section 13. Essentially, the problem is the one that you referenced and that I think would be familiar to everybody on the committee. We are dealing with two competing imperatives. On the one hand is the desire to ensure that people can avail themselves of the freedom to express thoughts and ideas freely, without fear of persecution or prosecution however odious those ideas might be. On the other hand, unlike our American cousins, we recognize that there is a limit to freedom of expression. When it begins to encroach on the safety and security and well-being of others, that really constitutes a red line.
The challenge with section 13 was that it was both a sword and a shield. It was providing some insulation for those who really did have malicious intent and it wasn't offering protection to those who were the targets of toxic or vitriolic hatred. We had hoped that government writ large—it's not just a federal issue but applies also at the provincial and municipal levels—would come forward and demonstrate a real political will to pick up the slack from the loss of section 13. In fact, I think Richard wrote to the attorneys general across the country, calling on them to adopt a more aggressive posture in terms of considering and acting upon potential hate crime activity, which would be able to compensate for the loss of section 13. Frankly, that hasn't been our experience. There has been some, but not enough. If we were to return to a section 13 kind of model, we would want to ensure that there were provisions with respect to evidence, and also with respect to the onus of ensuring that it was not a SLAPP or vexatious kind of lawsuit, that really would protect the ability of individuals and groups to articulate ideas even if they didn't meet with a uniformly positive response.
There were a number of other things that I think the committee could look back to in the testimony given by a variety of different stakeholders and that could offer a solution. What is clear, however, is that government writ large has to be able to employ useful tools, tools that do make a distinction between freedom of expression and freedom from hate, as they proceed along this line.
I know this is straying just a little bit from your question, but what we should note is increasingly the call from social media providers, the platforms that we are all talking about ubiquitously, who themselves are struggling to figure out what the lines are and where they should be intervening and what should be their response to things that appear online. I think there is a signal that they're looking for leadership from government to help provide them with the guidance necessary for them to put into place, whether it's human resources or algorithms or what have you.... The numbers are staggering. I have reviewed some of the stuff from Facebook alone in terms of the hundreds of millions of posts. It's hard to wrap your head around the idea of monitoring and responding to that volume of information.
On clear guidelines from government, I'll return to the issue of the IHRA definition of anti-Semitism. One of the benefits that it provides a government is that it offers a clear definition. You can then use that as a template for the algorithms you are going to put into place and the word searches you are going to use to trigger closer scrutiny and so forth.
If we were to do this across the span of looking at those specific things to which different communities...whether it's LGBTQ or the Muslim community, the Bahá'í, and so forth, I think that would go a long way in being able to entrench the kind of distinctions on freedom from hate and freedom of expression that everybody around this table I suspect is rightly concerned about.
Thank you very much.
I'm going to put the timer on, because I know there are a lot of strong advocates at the table.
First of all, welcome. Thank you for being here. As-salaam alaikum. Shalom. It's really, really important.
This is something that I take seriously, Canadians take seriously and my constituents in Parkdale-High Park take very seriously. I know that all of you are here with the best of intentions.
We are seeing an unbelievable amount of hatred, and I'm glad you outlined many aspects of it. Whether it's anti-Semitism, Islamophobia, homophobia, transphobia, anti-indigenous sentiment, anti-black sentiment, incel movements, etc., these are a cause for huge concern right here in Canada. We've seen the attacks in Quebec, Pittsburgh and New Zealand.
There was, at one point, a tool—and I want to pick up on this, but I'm going to hold you to a bit of a time limit, Mr. Fogel—that was applied in a previous iteration of the Canadian Human Rights Act, section 13. It talked about targeted discrimination based on a prohibited ground towards an identifiable group that was spread by means of a telecommunication. It emphasized that this included the Internet. That provision was removed by the previous government in or around 2012-13.
At the time, it had gone through some challenges. As early as 2006, previous iterations of the groups who are here, including B'nai Brith, the Canadian Jewish Congress, and the Friends of Simon Wiesenthal were defending that very provision. In the Whatcott decision by the Supreme Court of Canada, the analogue to that provision was upheld on the very basis that has been discussed.
I want to know whether your perspective is that was an invalid provision—so perhaps over to CIJA—and if it wasn't invalid, if you can drive at the heart of what you think needs to be added to make it more robust.
I apologize for my lengthy answers, but nobody gives me time to speak at home.
Look, section 13 was critically important. It provided the protections that you just referenced, and I think it is clear that Canadians and groups within Canada need them.
The problem was that ironically, groups or individuals we should be concerned about were using section 13 as a way of pushing back against those who were raising legitimate free expression ideas or concerns about particular topics. It was chilling, or more precisely freezing, the ability of people to offer critical comment about things of public interest without fear of being brought before some judicial process to account for what they said, because others were claiming that was triggering hate against them.
That was the vulnerability of section 13. There were a whole range of ways to deal with it. You have them in front of you. You've clearly done the research and I would invite the committee to look carefully at those, because it either has to be brought back in a better construct.... Irwin Cotler's formulation—I won't go through it now, as some of you are familiar with it and you can easily access it—was probably the most compelling way to restructure section 13, or provide direction to law enforcement, the public prosecution process, the attorneys general to become much more aggressive and active in applying the provisions of the Criminal Code.
Thank you all for being here this morning and for your presentations.
Picking up on that thread, I think we saw this week an attempt by Facebook to address some of the groups in Canada that are sharing this information online. We saw the banning of individuals and groups, which I think was a very good move. It wasn't across all social media platforms, unfortunately. I think it was Facebook and Instagram that did that.
To Mr. Fogel's point earlier about the depths that exist in the Internet, even within one platform itself.... There are just layers upon layers, with social media giants trying to control this themselves. It really begs the question about how they can do this on their own without government intervention, without the Canadian government being a part of that and, I think, having some basic rules around what is acceptable and what isn't, some ground rules for platforms in our own country.
You all spoke about Pittsburgh and the Christchurch shooting, and the extensive amount of Islamophobic and anti-Semitic material that had been posted by both of these individuals. I think Canadians are asking how it is happening that this is all being posted. Why is no one going to these individuals and stopping it at that point? Is this a failure of social media? Is this a failure of policy? They are also asking how it happens that people are out there sharing these volumes of information and no one is challenging it.
I think Mr. Fogel spoke to this clearly, but I want to ask this to the other panellists: Do you think online platforms should be able to establish their own policies to address online hate, or do you believe that Canada should establish some ground rules as well?
Look, here are two things about social media platforms. On the one hand, asking them to self-regulate and to, in effect, censor some of the content goes directly against their business model, which is to expand as opposed to contract. I think we have to be sensitive to this.
I have discerned over the past 12 to 18 months a sea change, an evolution in the thinking of these platforms. I think they are scared. They did not appreciate just how big they were getting, just how powerful a tool their platforms were. I do not think they have the confidence that they can do this on their own.
My sense, from some of the testimony they've given in various legislatures around the world, but certainly here in North America, is that they are looking for some support and leadership from government. I think they need it both for insulation, as well as for objective third party adjudication, for lack of a better term, to help direct and guide how they're going to put into place the kind of infrastructure....
With Facebook alone, just the other day I heard an interview where a senior employee at Facebook was talking about moving from having 10,000 to 30,000 people with the dedicated task of reviewing posts online. It's crazy. I think they do need government support, and they don't see that as encroachment, but as helpful intervention.
We are now going to start our second panel on online hate. It is a pleasure to be joined by Amnesty International Canada with Mr. Alex Neve, the Secretary General. Welcome back.
The Armenian National Committee of Canada is joining us by video conference from Toronto, represented by Mr. Shahen Mirakian, the President. Welcome, Mr. Mirakian.
The Association for Reformed Political Action Canada is also joining us, represented by Mr. André Schutten, Legal Counsel and Director of Law and Policy. Welcome.
Finally, the Bahá'í Community of Canada is joining us. I'd like to welcome Mr. Geoffrey Cameron, Director of the Office of Public Affairs.
What we normally do is try to put the person by video conference first in case we lose the connection, so we're going to ask Mr. Mirakian to start.
Mr. Mirakian, the floor is yours.
My thanks to the chair and the members of the committee for inviting the Armenian National Committee of Canada to provide evidence to you today. My name is Shahen Mirakian and I am the President of the Armenian National Committee of Canada. I apologize for not being able to join you in person today.
As representatives of a community that has suffered genocide, the ultimate expression of hate-based violence, we are more familiar than most with the consequences of the promotion of hate. Similarly, as a community that has routinely advocated for positions that run counter to the status quo, we are fierce defenders of freedom of expression. In our view, there is no contradiction in these two positions. Hate propaganda is a means of infringing the freedom of expression of the targeted group by delegitimizing or vilifying identifiable groups. Hate propaganda makes it impossible for members of those groups to be heard or participate in civil society in a meaningful fashion.
Canada's history of protecting freedom of speech and freedom of expression while criminalizing the willful incitement of hate or advocacy of genocide has been a powerful example for the international community. Now Canada has to apply the lessons learned from nearly 50 years of combatting hate in the real world to the virtual world and develop a national strategy to address online hate.
As you are aware, on April 24, 2015, the House of Commons adopted motion M-587, calling upon the government to recognize the month of April as Genocide Remembrance, Condemnation and Prevention Month. The Armenian National Committee of Canada has worked with various other non-governmental organizations, particularly the Centre for Israel and Jewish Affairs, the Ukrainian Canadian Congress, and the Humura Association, to ensure that each year the government recognizes April as Genocide Remembrance, Condemnation and Prevention Month.
However, this April, recognition alone will not be enough. Over the past year the ANCC has worked collaboratively with a broad coalition of human rights advocacy organizations to ask for action as well. An important part of that effort was to ask the government to combat online hate. In December 2018, the ANCC joined with 17 other organizations, many of whom are providing evidence today, in sending a message to the Minister of Justice asking that a national strategy be launched to combat online hate.
This year, for Genocide Remembrance, Condemnation and Prevention Month we are working with a broad coalition of communities that have experienced the horror of genocide to ask all Canadians to join us in requesting that the Government of Canada adopt policy solutions to the problem of online hate. We would encourage all Canadians to visit itstartswithwords.ca. At this site, Canadians can read all about what can be done to combat online hate and learn what other actions Canada can take to do its part to prevent future genocides and properly recognize those that have already taken place.
We also want to go on the record today as strongly supporting the four policy recommendations proposed in November 2018 by the Centre for Israel and Jewish Affairs as the basis for a comprehensive national strategy for combatting online hate. Those four policy recommendations are: defining hate, monitoring hate, preventing hate, and intervening to stop hate. We also agree that there needs to be a greater use of existing tools to address online hate as well as consideration given to implementing new tools to assist authorities in responding to online hate.
One specific area of concern we would like to highlight is law enforcement, which we believe must make hate-motivated cyber-attacks or website-hacking a priority. Since 2008, websites of Armenian community organizations have been subjected to three separate incidents of cyber-attacks. The websites of Armenian-Canadian newspapers, churches and community organizations have been replaced with anti-Armenian propaganda, including, but not limited to, denials of the Armenian genocide. Despite publicizing these incidents and reporting them to law enforcement, we are not aware of any active effort to identify the perpetrators or bring them to justice. While many cyber-attacks will never lead to actual violence, it is very possible that the perpetrators are linked to groups that either advocate for or actually engage in violence. If law enforcement prioritized identifying the parties who engage in hate-motivated cyber-attacks, they would be able to obtain intelligence on potentially violent groups and prevent hate-motivated physical attacks.
We also believe that the regular surveys conducted by the Canadian Centre for Justice Statistics should specifically track hate-motivated cyber-vandalism and not just track victimization by individuals who have received hate-motivated messages online. Hate-motivated cyber-vandalism is a criminal act, just like hate-motivated physical vandalism, and law enforcement resources should be equally allocated to both. Canada should work with the international community to bring the perpetrators of these incidents to justice, whether or not the perpetrators are physically located in Canada.
In this regard, Canada's signing in 2005 of the additional protocol to the convention on cybercrime specifically concerning the criminalization of acts of racist or xenophobic nature was an important step, but the domestic tools must be implemented to allow for the extradition of suspects and co-operation with international partners.
Finally, law enforcement needs to provide communities with the tools to properly report these crimes and obtain updates about the investigation. As it stands right now, we are not clear on to whom these crimes should be reported, whether they are being actively investigated, or how we can find that out.
The harm done to communities by hate-motivated cyber-vandalism can be in some instances just as severe as the harm done by hate-motivated physical vandalism. This study being undertaken by this committee today and in upcoming days is a very important first step in combatting online hate.
We are very grateful to this committee for making room on its agenda during Genocide Remembrance, Condemnation and Prevention Month to bring attention to this issue and to do its part in preventing future genocides. We are hopeful that this study will result in an effective national strategy to deal with the pressing problem of online hate promotion.
Thank you very much.
Mr. Chair, I wasn't blaming the committee. I was just making an observation.
Obviously, the question in front of this committee brings into sharp focus two crucially important human rights matters. The first is the right to be free from discrimination, including the ugliest manifestations of discrimination that arise through abuse and violence expressed as hate, which in far too many corners of our world continues to the extremes of mass atrocities, such as crimes against humanity and genocide. It is a very important and stark reminder less than one week after we have remembered and commemorated the horror of the Rwandan genocide and very much a reminder on my mind as I'm freshly back from having spent time in the Rohingya refugee camps in Bangladesh, which of course is all about a community that has had to flee because of hatred. There's no doubt that the online world, which continues to transform and grow virtually daily, has become a troubling front line in that reality of discrimination and hate.
The second is the right to freedom of expression, which is often referred to as the lifeblood of the human rights system. This is the right to hold, shape and share opinions and ideas, to engage with others and to take part in public debate. It is essential for so many reasons, including that free expression itself provides the avenues for exposing and addressing injustice and for evolving our understanding about society and democracy and the environment in a way that makes for a better world. Equally, there is no doubt the online world has been a very important and growing avenue for providing new possibilities for free expression.
On any given day, Amnesty International researchers, campaigners, advocates and our millions of activists and supporters worldwide are taking action to uphold both of these essential human rights, both of which, of course, are enshrined in numerous international treaties. In international law, the right to be free from discrimination is considered to be so fundamental that nothing ever justifies its abrogation.
The right to free expression is a right that is balanced in its very formulation. The International Covenant on Civil and Political Rights notes that it's a right that carries “special duties and responsibilities” and may therefore be subject to restrictions only if they are provided by law and are necessary for respect of the rights of others. The key word here is “necessary”.
I would suggest to you that this word and this question—restrictions on free expression that are necessary for the respect of the rights of others—go to the very heart of your work. Please do remember that word “necessary”, because if there is a cautionary lesson from the world of human rights protection, it is that restrictions and limitations of any kind on any right are always a slippery slope and governments are very quick to push the limits. Necessity is absolutely crucial.
Hate-based and hate-fuelled discrimination is on the rise everywhere, often made easier—or at least more obvious—by the new and accessible channels the online world offers. Misogynistic, racist hate has become a devastating phenomenon on almost all social media platforms. Amnesty International has drawn particular attention to that reality on Twitter through our major Toxic Twitter research project, which over the past two years has revealed how much online abuse and violence women are subjected to. This abuse and violence is exponentially worse for women of colour, LGBTI women, indigenous women and women from other marginalized communities. Like many indigenous organizations, faith groups and anti-racism campaigners, human rights organizations and others, we have repeatedly highlighted the many ways in which the hate and racism represented in white supremacy has also found a toxic home in the digital world. It is by no means limited to simply objectionable or offensive views, but is increasingly spilling over into hate-filled online discussions that stand in the background of threats against indigenous land defenders, and of course horrific acts of mass violence and killings such as in Christchurch, Pittsburgh, Sainte-Foy in Quebec, and Yonge Street in Toronto.
Let me wrap up with six very quick final comments and recommendations that I hope will shape your ongoing work here. First, it is commendable and important that the Canadian Criminal Code criminalizes the incitement of hatred against a growing number of identifiable groups. That does not mean, of course, that from a human rights perspective protections against hate in Canada should in any way be considered to be full and complete. Rarely if ever does the criminal law offer the whole solution to any human rights challenge.
Second, given the rapid rise, in particular, of online hate and its increasingly devastating consequences, governments are compelled to look for further action including with respect to further tools for investigating, enforcing and imposing sanctions.
Third, hate, as we would all agree, is obviously a human rights issue that often leads to the most violent expressions of discrimination. In our country, human rights commissions have mandates that are grounded primarily in addressing discrimination. It is therefore intuitive and obvious to consider the role they could and should be playing in responding to this serious concern.
Fourth, any move to provide a mandate to human rights commissions, including the Canadian Human Rights Commission, to address online hate should be grounded in strong recognition of the vital importance of the right to be free from discrimination and the right to free expression, and the development of clear guidelines and criteria drawing on international human rights standards that would assist investigators and adjudicators in understanding and giving shape to the crucial interplay and relationship between those rights.
Fifth, given the tension between these two rights, the legal complexities in finding the right balance, and the rapid evolution in the nature and the reach of the online platforms involved, any move to provide a mandate to human rights commissions, for instance, must involve a very serious commitment to ensuring adequate resourcing to support the training, the expertise, the research, the outreach and the education that would be required.
Finally, changes to the role of the Canadian Human Rights Commission or any other human rights bodies with respect to online hate absolutely should go forward as part of wider approaches to tackling the growing concerns about online hate and fulfilling the need, still unaddressed in Canada, to develop a national action plan on gender-based violence, including through the ongoing development of a national anti-racism strategy and measures to respond to Islamophobia, anti-Semitism and other religious intolerance.
Thank you, Mr. Chair.
The honourable members of this committee are studying online hatred and what, if anything, the federal government can do to restrict it.
Before we can address how to fix the problem, we first need to ask where the problem comes from and who is best suited to fix it. In a certain sense, the dark corners of the web are a window into the dark corners of the human heart. Greed, lust, hatred, anarchy, covetousness and lies infect the Internet and our hearts as well.
Aleksandr Solzhenitsyn, writing in The Gulag Archipelago, said this:
...the line separating good and evil passes not through states, nor between classes, nor between political parties either—but right through every human heart—and through all human hearts.... And even in the best of all hearts, there remains...an un-uprooted small corner of evil.
Since then I have come to understand the truth of all the religions of the world: They struggle with the evil inside a human being (inside every human being). It is impossible to expel evil from the world in its entirety, but it is possible to constrict it within each person.
Charles Colson, the founder of Prison Fellowship International, builds on this idea in his book Justice That Restores. He writes that there is no more urgent task than to restore the sense of community cohesion and to build a virtuous character into common life and that “without individual virtue, one cannot achieve a virtuous culture; without a virtuous culture, one cannot hire enough policemen to keep order.”
As Michael Novak has trenchantly observed, adapted to a Canadian audience, “in a virtuous culture” we have 37 million policemen and “in a culture that mocks virtue, we cannot hire enough policemen.”
Who is best suited to offer solutions to the problem of online hatred? I don't think the honourable members of this committee realize it, but you have already made a big step in the right direction when, just over a year ago, you amended Bill to preserve the protections afforded to houses of worship in section 176 of the Criminal Code.
Not only did you signal, rightly, that you care about the protection of vulnerable citizens in a state of prayer and worship, whether in a mosque, synagogue, temple or church, but you also preserved protections for the institutions that can inculcate that virtue in individuals so that we can have a virtuous society. If we want that virtuous society, we need to protect churches, mosques and synagogues to continue to preach peace, shalom, shalam. That's where the work against online hate starts. It is absolutely necessary for this committee, indeed all of Parliament, to understand this. Do not undermine houses of worship; protect them and expect good things from them.
However, I'm not suggesting that the state has no other role in combatting violence and the senseless slaughter resulting from seething hatred, as witnessed in New Zealand and Pittsburgh. The Hebrew psalms speak to the proper role of the state. Psalm 72 says of the king:
For he will deliver the needy who cry out, the afflicted who have no one to help. He will take pity on the weak and the needy and save the needy from death. He will rescue them from oppression and violence, for precious is their blood in his sight.
This psalm points to the God-given role of the state to protect from bloodshed and violence the weak and the needy, the vulnerable citizen.
The Apostle Paul, in his letter to the Romans, builds on this command. He says:
...the one in authority is God's servant for your good. But if you do wrong, be afraid, for rulers do not bear the sword for no reason. They are God's servants, agents of wrath to bring punishment on the wrongdoer.
A clear application of this biblical passage to online hatred would be that the government does have a role in enacting swift justice to punish a wrongdoer seeking violence against another person or group of people. So where the vitriol of online hatred rises to the level of incitement to violence or threats of violence, which are crimes under Criminal Code sections 264.1 on threats, 318 on advocating genocide, and 319 on public incitement to hatred, then the police must act swiftly to investigate, to arrest, to charge and then to prosecute.
Perhaps—and I put this out there as a thought experiment—one impediment to swift action and swift justice on the crimes of advocating genocide and public incitement to hatred is the unusual requirement that the attorney general's consent is needed to proceed. Perhaps, by removing those consent requirements, we could increase the ability of police to pursue, without delay, action to stop such crimes from happening.
However, one word of warning that ARPA Canada wants to share is that we are very concerned about overzealous attempts to fix the problem of online hate. We co-signed a letter requesting the justice committee to study this issue with a good faith understanding that we would be able to raise legitimate concerns about what would constitute going too far.
We are very concerned about any attempt to reinstate a hate speech provision in the Canadian Human Rights Act. These provisions have been shown to be ineffective and often abused. They chill freedom of expression and are applied in a demonstrably unfair way. Let me give you one example of what some commentators have described as politically correct double standards.
In 2003, in a case called Johnson v. Music World Ltd., a complaint was made against a record label for a song called Kill the Christian. The lyrics of the song were read into the record by the complainant, and included the following, referring to Christians:
You are the one we despise
Day in day out your words [comprise lies]
I will love watching you die
Soon it will be and by your own demise
Kill the Christian, kill the Christian
Kill the Christian, kill the Christian
...The death of prediction
Kill the Christian, dead!
The panel found that while the content and tone of the communication appeared on their face to be discriminatory, there was “very little vulnerability of the target group”, so there was no violation constituting hate speech. Yet three years later, in a case called Lund v. Boissoin, a panel found that a letter published in a mainstream newspaper in Red Deer, Alberta, that made disparaging remarks about homosexuality was in fact hate speech and ordered the writer to cease publishing in future in newspapers, in email, on the radio, in public speeches—including sermons—or on the Internet. The panel chair for both of those decisions was the same person: Lori Andreachuk.
Public policy discussions, I would argue, require as broad and as open an access to expression as is possible. Freedom of expression ought to be such that all citizens feel free to speak about all public policy issues as best they can. We can preserve that freedom, and we must preserve that freedom. By putting finite resources into hate speech codes other than the Criminal Code, the government potentially will distract from true hate speech that leads to violence. That’s a distraction that will not do much to curb the kind of violence we saw in Pittsburgh or in New Zealand.
To conclude, my requests would be as follows.
One, take seriously the protection of other institutions in society that can inculcate virtue in our citizens, including religious institutions.
Two, the state needs to demonstrate swift justice against these crimes. Ecclesiastes 8:11 says, “When the sentence for a crime is not quickly carried out, people’s hearts are filled with schemes to do wrong.” This committee should consider removing the requirement for the attorney general’s consent to prosecute incitements to genocide and public hatred in subsections 318(3) and 319(6) of the Criminal Code.
Finally, we ask that we do not entertain incorporating hate speech measures into the Canadian Human Rights Act. This distracts resources from the more pressing work of preventing violence against vulnerable citizens.
Thank you very much.
I would like to thank the committee for inviting my testimony today as a representative of the Bahá'í Community of Canada. I'm also appearing as a member of the executive committee of the Canadian Interfaith Conversation, a national body that seeks to foster and promote religious dialogue and harmony.
Bahá'ís, as members of a religion that has been in Canada since the late 1800s and that has established communities in most localities in this country, are not the targets of online hate in Canada. However, this issue is of particular concern to our community first and foremost because of core teachings of the Bahá'í faith regarding the promotion of the fundamental oneness of humanity and the elimination of all forms of prejudice. Public or private expressions of hatred towards groups of people, whether online or off-line, are inimical to these beliefs.
We have joined with many other faith and civil society groups to call for the study of the root causes and potential solutions to the rising incidence of online hate that has been directly connected to violent attacks on particular groups. Women, Muslims, Jews, Sikhs and racial minorities have been among the most recent targets of violence that was inspired by hatred spread online.
The recent attacks on Muslims at prayer at two mosques in Christchurch, New Zealand; the van attack in downtown Toronto; the attack on Jewish worshippers at the Tree of Life synagogue in Pittsburgh and the shooting at the Islamic Cultural Centre of Quebec City are all recent examples of killers who spent extensive time in digital worlds of hatred.
As Professor Richard Moon has found, “Hate crimes are committed most often...by individuals who have immersed themselves in an extremist subculture that operates at the margins of public discourse, and principally on the Internet.”
Sadly, this is also a problem with which Bahá'ís have first-hand experience in other countries. In the most egregious case of Iran, a government-supported media campaign of defamation and incitement to hatred has been directly tied to outbursts of violence and murder targeting Bahá'ís. A similar pattern has begun to proliferate in nearby Yemen.
It is clear, then, from a growing body of experience, that the spread of online hatred targeting a defined group can lead individuals, who are perhaps already inclined to bigoted thinking, to act with violence.
What should be done about this problem? Any lasting solution has to somehow take into consideration the roles and responsibilities of individuals, groups, corporations and the institutions of government. With regard to government, I will refrain from commenting on the question of whether section 13 should be reinstated or whether the hate speech provisions in the Criminal Code are sufficient to prosecute cases of online hate. There is a delicate balance, as others have mentioned, to be struck between guaranteeing the free exchange of ideas in the public sphere and sanctioning those whose aim is not to advance truth, but to spread hatred. Clearly the government and, by extension, the courts have a role to play in prosecuting cases of hate speech.
It is also increasingly clear that policy intervention by government is needed to mitigate the impact of the more egregious misuses of online social networks. Despite recent steps taken by Facebook and Twitter to remove certain accounts, government also has a role to play in regulating these online platforms. Any effective policy intervention must ensure national and local community involvement in determining the standards for online platforms. As David Kaye, the UN special rapporteur on the freedom of expression, has urged, relying upon international human rights norms rather than the arbitrary judgements of commercial platforms is a better basis for the development of these standards. This includes delineating the rights and responsibilities of users, as well as safeguards to ensure that freedom of expression is not unduly curtailed.
However, government action by itself is insufficient. There is also a role for civil society in pushing these companies further in the right direction, beyond the letter of the law. One organization, Change the Terms, has called on tech companies like Facebook, Google and Twitter to take steps to curb the use of social media, payment processors, event-scheduling pages, chat rooms and other applications for hateful activities. There are concrete steps that can be taken by these powerful companies, which are accountable both to government and to the wider society, that can create a healthier public sphere for all of us.
Finally, there is an educational responsibility that falls to community leaders, teachers, families and parents. Changes in the attitudes, values and behaviours of individuals are a necessary part of the solution.
The online environment is ultimately a mirror reflection of our society. We live in a world in which prejudice against certain groups is propagated by many people, even those who do not intend to provoke violent reactions. Religious leaders have a particular responsibility to educate people, to promote fellowship and concord and not to stoke the fires of fanaticism and prejudice. Young people especially need access to education that teaches them from the earliest years that humanity is one family. They require education and mentorship that go beyond a simplistic condemnation of hatred or a set of dos and don'ts regarding their online activities. Youth need to develop a strong moral framework on which to base decisions about their online activities, about which content they choose to consume and share, and about how they use their powers of expression when communicating with friends and strangers online.
Any long-term solution to online hatred has to give due consideration to this generation that is coming of age in an information environment that is confusing, polarizing and indifferent to their moral and ethical development. From where do young people learn to express themselves, using language that is intended to educate rather than to dismiss or denigrate? As they seek to learn about social issues, how will they know the difference between intelligent criticism and hateful propaganda? What ethical tools and social support are we giving to them as they navigate the online world?
Answering these questions is a responsibility that falls not only to government; it is part of a response to online hate that we must all accept to carry forward.
Very quickly, I'd like to speak about the issue of cyber-vandalism and the reporting of that, which was something I brought up before.
If someone were to, on a hate-motivated basis, physically vandalize a structure belonging to the Armenian community, I would know how to report that fairly easily. If someone were to graffiti something onto my community centre, I know exactly who to phone from the police and how to report that.
However, if someone vandalizes or hacks into my website and replaces the content with the same sort of propaganda that they would have spray-painted onto my community centre, I have no idea to whom to report that properly. For instance, the website may be for a Canadian organization but hosted in the United States or in a different country. The perpetrators could be from anywhere in the world, and there's no way of identifying them. They have complete anonymity. We don't know if they've ever even entered Canada.
They may steal data from us. They've stolen email lists, for instance, and sent hateful messages to people on those lists. We have no idea, again, to whom that should be reported properly. There should be some information provided to community organizations on how to report these sorts of incidents properly. These should be tracked in the same way that physical vandalism has been; otherwise we under-report the incidence of hate crime in Canada.
I think that's one thing I would very much recommend, that the committee work with law enforcement, especially federal law enforcement—I think they're best positioned to deal with this—to figure out how this reporting should be done, and to which law enforcement agencies, and to ask them to prioritize dealing with these types of incidents as well.
Thank you, panellists, for being here today.
One of the challenges in the online space is that people see articles and published material and believe them to be true. They often don't look at the source. I think there is a general distrust of mainstream media in our country, and there is a significant need for media literacy.
Mr. Cameron, you were talking a bit about education. I think a core piece of what we're looking at here is people understanding how to identify what is a legitimate piece of media and what is something that spreads hateful messages on the Internet, and how to distinguish between the two.
I don't think that people generally have those types of analytical skills. The members of Parliament who sit at this table receive many emails from constituents who send us a link to something and ask what it is about. We're often able to debunk it or say that it isn't a credible source, but it's a very significant challenge.
I wonder if you could each speak to the role you think our education systems in Canada should play in combatting hate in general, but certainly online hate. I have two teenage sons, and I don't believe that our education system is keeping pace with the culture, specifically with online activity and technology. Our kids are on platforms that we don't even know about. There are these corners of the Internet where they are sharing information, and there probably aren't many parents or adults who are even in those spaces.
I wonder if you can talk about how our education system could address that, and how we can address that gap for adults as well. Most of us in this room saw the Internet come and we got onto Facebook and all these different platforms and used it for whichever purposes—sharing things with family and friends—but it certainly has grown to a place where even our understanding of what's there and what's happening there is very limited.
I wonder if you can speak to the role you think education should play in this.
That's generous. Thank you, Ms. Khalid.
There are about four minutes left.
First of all, thank you to all of you.
It's good to see you again, Shahen, in Toronto. Thank you for being here. Your contributions are invaluable.
I have a few points.
Thank you for the reference to Bill and the amendments that were made. There have also been amendments by the government in respect of the security infrastructure funding, which is the funding we provide to increase surveillance capacity and security in places of worship. Unfortunately, these increases have all been triggered by horrific events: the funding was doubled after the Quebec City mosque shooting, and doubled again after the New Zealand shooting. Nevertheless, I think that funding is important.
Mr. Neve, you also mentioned that the anti-racism secretariat money in budget 2019 is dedicated to developing a robust anti-racism strategy. There are issues that all of us care about. I, in particular, care deeply about these kinds of issues.
Mr. Schutten, I want to ask you a question, because it's really germane to what we're studying here. Is the issue with section 13 a problem—you seem very well-versed legally, so I'm going to put to you a very strict legal question. The analogue to that provision was tested by the Supreme Court of Canada in its Whatcott decision, and section 13 was upheld, with one minor modification to the effect that belittling falls within the domain of free speech.
However, is your issue with the text of section 13 as it then was, which has effectively been upheld by the Supreme Court of Canada, or is the issue you raised—and raised poignantly—with the decision-making that took place? As a lawyer, I know that inconsistent decision-making is the bane of any litigation lawyer. Where's the rub there?
The Supreme Court, in the Whatcott decision, said that having a hate speech provision in the Saskatchewan Human Rights Code was one way a government could combat hate speech and so on, but it did not say that it was a necessary provision. So it's not that it's constitutionally obligatory for a government to have a hate speech provision. I would say that this is what the Supreme Court was pretty clear on.
That said, I think that even post-Whatcott, the better policy decision is to not be policing expression, even Whatcott-style expression, because it does not rise to the level of violence. It doesn't rise to the level of the types of things that have triggered this very hearing.
I do not defend Mr. Whatcott's way of expressing himself at all, but he's a person who is trying to engage in a public policy debate. He's doing it poorly, but that's what he's trying to do. Also, his engagement is particularly around political speech, and of all free speech, political speech is the most important to protect. We can quibble about whether or not the Supreme Court got it right, but I think it's walking a very fine line around that freedom of political expression.
I can share with the committee, as well, a peer-reviewed journal article I've published with a law professor from Osgoode Hall on the Whatcott decision and on how we think the Supreme Court got it wrong.