Thank you. On behalf of Egale Canada, I would like to thank the committee for the opportunity to speak today on this critical question of online governance.
Ensuring that there are meaningful protections against online hate and harassment, while also maintaining our commitment to the fundamental Canadian value of freedom of expression, is both difficult and of utmost importance. As part of its mission, Egale works to improve the lives of LGBTQ2SI people in Canada, by promoting human rights and inclusion through research, education, community engagement and public policy contributions.
I am the chair of Egale's legal issues committee, which is made up of LGBTQ2SI lawyers from across Canada. I am also a partner at Power Law, with a practice focused on constitutional law. I am grateful for the assistance of other members of the legal issues committee in preparing these remarks, particularly Professor Samuel Singer, Daniel Girlando and Melissa McKay.
Online hate poses a significant threat and is therefore an issue of particular concern to the LGBTQ2SI community. According to a Statistics Canada report on police-reported hate crime in Canada for 2017, hate crimes in general and hate crimes targeting members of the LGBTQ2SI community in particular are on the rise.
Police-reported hate crimes targeting sexual orientation rose 16% in 2017, compared with 2016. Crimes motivated by hatred of sexual orientation accounted for 10% of hate crimes. Police-reported data on trans-targeted hate crimes is suspect, as nearly half of reported incidents—15—occurred in 2017 alone, likely corresponding to the 2017 addition of gender identity and expression to the Criminal Code. We do know, however, from Trans Pulse, that 20% of trans people in Ontario have been physically or sexually assaulted for being trans. We also know that many survey respondents did not report these assaults to police. In fact, 24% reported having been assaulted by police.
Further, a significant proportion—15%—of hate crimes that are also cybercrimes target members of the LGBTQ2SI community. Of particular concern is that hate crimes targeting members of the LGBTQ2SI community are marked by violence. Hate crimes targeting sexual orientation were more likely to be violent than non-violent. Victims of violent hate crimes targeting sexual orientation and Aboriginal peoples were also most likely to have sustained injury. Similarly, hate crimes targeting trans or asexual people were very often violent, with 74% of incidents involving violence.
In short, online hate is of significant concern to the LGBTQ2SI community, because people are committing ever more acts of hate against us, and, all too often, those who hate us want to hurt and kill us.
The Supreme Court of Canada's unanimous decision in Whatcott, a case that specifically dealt with hate speech targeting homosexuals, and in which Egale intervened, succinctly summarized the real harms caused by hate speech. First, hate speech subjects individual members of the targeted group to humiliation and degradation, resulting in grave psychological and social consequences. Second, hate speech harms society at large, by increasing discord, and, even if only subtly and unconsciously, by convincing listeners of the inferiority of the targeted group.
The regulatory response to online hate should also take into account how certain types of speech are fundamentally at odds with the values that underlie freedom of expression, including the search for truth, and democratic participation in the marketplace of ideas.
As the Supreme Court of Canada explained in Whatcott:
a particularly insidious aspect of hate speech is that it acts to cut off any path of reply by the group under attack. It does this not only by attempting to marginalize the group so that their reply will be ignored: it also forces the group to argue for their basic humanity or social standing, as a precondition to participating in the deliberative aspects of our democracy.
This insight has considerable resonance for members of the LGBTQ2SI community, who have often been portrayed as morally depraved child abusers, as was the case with some of the flyers in Whatcott, or in debates concerning access by trans people to bathrooms corresponding to their lived gender.
Beyond online hate speech, other forms of targeted online harassment are also of vital concern for the LGBTQ2SI community. Today, I will focus on two examples that cause serious harm.
First, cyber-bullying poses a particular threat to LGBTQ2SI youth. According to a 2016 Statistics Canada report on cyber-bullying and cyberstalking among Internet users aged 15 to 29 in Canada, more than one-third of the young homosexual and bisexual population were cyber-bullied or cyberstalked, compared with 15% of the heterosexual population. Cyber-bullying and cyberstalking were also correlated with substantially higher rates of discrimination, as well as physical and sexual assault.
According to a 2015 Canada-wide survey by UBC's Stigma and Resilience Among Vulnerable Youth Centre, 50% of older trans youth experienced cyber-bullying.
The effects of cyber-bullying on LGBTQ2SI youth are serious. A 2018 systematic literature review by Abreu and Kenny found that these included suicidal ideation and attempts, depression, lower self-esteem, physical aggression, body image issues, isolation and reduced academic performance.
Second, aggressive trolling of members of the trans community has become a serious problem. Media reports indicate a growing trend, with members of the trans community who engage in public discourse online being targeted by an overwhelming volume of transphobic messages on online platforms. This form of harassment is marked by both the volume and the vitriol of the material, which has included alt-right memes and Nazi propaganda.
Further, the practice of doxing, collecting personal information on a person’s legal identity or Internet activities and publishing it to hostile publics, exposes members of the trans community to specific harms, such as revealing their deadnames, and to broader discrimination.
Such practices chill free expression, as trans people avoid participating in public discourse out of fear of reprisal.
A Norwegian study released in March “found that those who participate in online debates and comment sections, are more likely to receive hate speech than those who don’t participate online to the same extent.” It also found that members of the LGBTQ community are more likely than others to withdraw from political debate as a result.
While online hate and harassment are issues of particular concern to the LGBTQ2SI community, restrictions on online speech can also disproportionately affect that community. We know from the Little Sisters saga, when Canadian border officials equated representations of homosexuality with prohibited obscenity, that the policing of restrictions on speech can wrongly discriminate against unpopular viewpoints and groups. We also know that the Internet has become an important part of helping LGBTQ2SI individuals find or construct their identities.
In short, the issues are complex, and the stakes are high. A federal government response is needed. That response should be informed by careful study and will almost certainly require action on many fronts.
At this stage, it is evident that better regulation of online platforms is needed, but we cannot simply transpose old ideas onto this new forum. Requiring content monitoring by online platforms may be appropriate. However, there is a need to balance making platforms responsible for content from which they profit and the risk of incentivizing sweeping censorship. Creative solutions should also be explored to prevent online platforms from using algorithms that magnify and direct users towards ever more hateful and extreme content.
Additionally, more can be done through public education and information campaigns to strengthen online media literacy; to ensure a better understanding of what amounts to hate and harassment, since inflammatory and wrong understandings fuel distrust of initiatives to promote tolerance and inclusion; and to ensure broad public knowledge of the historically devastating effects of hate.
Finally, in any government response, hateful speech directed towards members of the LGBTQ2SI community must not be treated less seriously than speech directed towards other groups.
Egale Canada therefore calls upon the federal government to take a broad approach to developing a robust toolkit to combat online hate and harassment.
She is the first-ever female mayor of North Grenville and is here today as an adviser.
Founded in 2001, Equal Voice is a national, bilingual, multi-partisan, not-for-profit organization dedicated to electing more women at all levels of government in Canada.
We are very concerned about how online hate is negatively affecting women's participation in politics.
I would like to begin by bringing to your attention a study commissioned by Equal Voice in November 2018 called “Votes to Victory”. The study, conducted by Abacus Data, examined barriers to women's participation in politics. This study was wide-ranging and, while not focused directly on online hate, had some relevant findings. For instance, the study found that 76% of men and 79% of women think that women politicians are treated differently than men, and 84% of women felt that politics is not friendly, which tied for the top reason women gave for not wanting to be involved in politics, along with time away from family. I believe that this perception of unfriendliness is in large part due to the online interactions involving women politicians that people observe far too often.
This leads me to my second point—to highlight the online hate experienced by women elected officials every day. Many brave women from all parties have spoken about this openly, or posted about it, including MP Rempel, MP Caesar-Chavannes, MP Ashton, and the Honourable Catherine McKenna, to name just a few. Unfortunately, the list gets longer every single day. The gender-based online hate they have experienced simply for doing their jobs is unacceptable. If we want more women in Parliament and in legislatures across Canada, which is what Equal Voice is working towards, then we need to strengthen protections for women politicians and for women candidates.
The issue of women in power, or those running for office, being attacked online is not a new one. In politics, it is important to have online fora where people can have heated political debates, and places where people can disagree with one another.
However, as social media evolves, so do the hateful attacks, bringing forth challenging times and a need for our laws and policies to evolve with them. There is no doubt that Canada needs to enact and enforce stronger consequences for initiating or participating in online hate.
Mr. Chair, I would now like to discuss a few of the ways that Equal Voice is working to combat the issue of online hate directed at women politicians and those aspiring to be politicians.
In 2014, Equal Voice launched its #respecther campaign, to expose the everyday sexism experienced by women politicians across Canada. Events were held around the campaign to equip women to address these attacks, and to discuss what can be done to eliminate them.
Recently, in April 2019, Equal Voice launched a modern safety guide developed in partnership with Facebook Canada, available to everyone on our website. It is particularly relevant for all current and aspiring politicians. The guide provides practical advice on how to stay safer online by using existing tools that many of us are unaware of. We hope this guide will be particularly useful in the upcoming federal election.
Earlier this year, we partnered with the Public Policy Forum on an event discussing online hate. Conclusions from that discussion were clear. We must work with governments and the social media industry to find better ways to reduce online hate.
Finally, through our Systemic Change initiative, Equal Voice is working to change the culture within legislatures themselves. This project is focused on working with provincial legislatures across Canada to reduce barriers to women's participation. Many of the tools developed for this project, such as sample anti-harassment policies, are also relevant at other levels of government.
We are proud of the steps that we have taken at Equal Voice, but the actions of small not-for-profit organizations like Equal Voice will never be enough. We need the government to act to combat online hate.
Equal Voice thanks the Standing Committee on Justice and Human Rights for taking on this important study. We look forward to your report and to assisting you in whatever way we can.
Thank you, Mr. Chair, for this opportunity to give these opening remarks. I look forward to the committee's questions.
Thank you for inviting me to address this committee today.
My name is Morgane Oger. My pronouns are she, her, and hers. I am the founder of the Morgane Oger Foundation. We work to reduce the gap between Canada’s human rights laws and the experience lived on the ground of persons facing systemic discrimination, through advocacy, education, and legal means.
Although this presentation specifically addresses anti-transgender hate, we believe that the basis of our argument applies equally to all types of online hate, regardless of the motive.
Hateful acts are devastating for the victim, who feels a rejection that is difficult to shake off, and who often suffers a lasting psychological impact as a result of the trauma.
Neither insults nor the expression of divergent points of view constitute online hatred. It's the harassment. It's the incitement to discriminate. It's the deliberate publication of misinformation in order to deceive the public by giving people a sense of misplaced indignation. Hatred is meant to “pathologize” or demonize members of a community because they are who they are.
Hate propaganda acts by creating anger or disgust towards a person or group because of their identity. Hate speech incites discrimination or violence by any means available.
Canadian websites, such as The Post Millennial, Feminist Current, Woman Means Something, Canadian Christian Lobby, Culture Guard and Transanity, publish incitements to discriminate through misinformation in articles aimed at turning public opinion against the transgender community. Twitter and Facebook are awash with anti-transgender misinformation intended to justify anti-transgender discrimination.
During the 2017 B.C. general election, social conservative activist Bill Whatcott travelled to Vancouver with 1,500 flyers in hand, which urged people not to vote for me because I was transgender and for no other reason. He distributed them in the riding I was contesting. The flyers had a photo of me, describing me as a biological male, and claimed that I was promoting homosexuality and transvestism. They stated that transsexuals were prone to sexually transmitted diseases and at risk of domestic violence, alcohol abuse and suicide.
After the election, I complained to the BC Human Rights Tribunal, which ruled in my favour in its March 2019 decision. Since 2017, Bill Whatcott has continued to engage in transphobic and derogatory harassment campaigns against me and others, focusing on a claim that he is being prevented from telling the truth that a man cannot be a woman. Whatcott’s campaign includes blog posts, trips to Vancouver to distribute more flyers, audio and video interviews, a series of social media posts and a number of articles.
Eventually the story was picked up on social and traditional media and took on a life of its own, combining with other ongoing issues. Derivative articles stray further and further from the truth, and accusations proliferate.
The effects of Bill Whatcott's campaign against me continue. Two days after the ruling, Bill Whatcott came to a church where I was talking. His harassment is now mostly online and on the radio, but it doesn't end. It's never going to end. The truth is that what Mr. Whatcott did will never go away because it was widely rebroadcast online.
Because of Whatcott’s campaign, I had to teach my children to be wary of people. I had to ask them to keep an eye out for strangers. I had to explain to them why I had to do that. No mom wants to have to sit her children down and say to them that someone might want to hurt her or them because of who she is.
Shortly after the first Whatcott flyers and resulting wave of social media interest, I was attacked by a man who lunged at me at a political event. He tried to crash through a stroller with a child in it to get at me. Luckily, an undercover officer handled him without injuries, because by then, I was already under police protection.
Later in 2017, I was stopped in my back lane because of online commentary. A man, whom I didn't know, wanted to ask me about Whatcott. I was 20 metres from my home at the time, and the individual shared his displeasure. He expressed that what I was doing was wrong, that I should leave Whatcott alone, and that he and his church didn't like what I was doing.
In 2018, Whatcott announced in a Facebook video shot while hunting that he was coming back to Vancouver to distribute more flyers. He boasted about his shooting skills in the video. Vancouver police warned me and my children, and we had to upgrade our security precautions. He was in Vancouver for two weeks.
Due to the proliferation of claims made about me online, I now receive regular threats on the phone and countless threats online, some of them explicitly violent.
Because our provincial courts consider online publications to be a federal matter, and because section 13 of the Canadian Human Rights Act was repealed in 2013, there are no human rights measures in Canada today governing online hatred. If Whatcott had confined himself to sharing his flyers online through Facebook, Twitter, or his website, my complaint against him at the BC Human Rights Tribunal would have been impossible.
However, in 2013 the Supreme Court of Canada unanimously affirmed the legitimacy of human rights legislation that restricts hate speech in its Saskatchewan v. Whatcott decision. Furthermore, in its 2015 decision in Canada v. Lemire, issued after the section had been repealed, the Federal Court of Appeal found section 13 to be constitutionally sound.
The current gap in Canadian human rights law at the federal level enables the publishing of material on websites and social media that is prohibited from being published in physical form. For online hatred, the only remedy is a criminal complaint, which has a very high bar for conviction and can require special approval from a province's attorney general. Canadians need a civil recourse that effectively deals with hate publications that can reach wide audiences like they can online.
Bill Whatcott is quoted, in Oger v. Whatcott, as estimating that the online version of his flyer reached approximately 10,000 people. His future posts were widely distributed and cited in socially conservative circles in Canada and the U.S.
Another anti-trans activist, Meghan Murphy, has had over 100,000 views on her anti-transgender videos filmed in Vancouver, a city where, had their content been put to paper, it would have broken the law.
Dozens of articles on the website Feminist Current get 1,000 shares each as they eviscerate transgender women, specifically using disinformation to advocate against our existing rights.
Canada's gap in online hate legislation also has an impact outside of Canada. We have Canadian websites inciting anti-transgender hatred in other countries where legislation is being considered, for example, in the United Kingdom and Scotland right now, and this is originating from Vancouver. It is unbelievable that we are participating in preventing other people from accessing equality. Because of our legislative gaps in regard to online publications, Canada is exporting this anti-transgender hatred. We are exporting incitement to prohibited discrimination to other countries.
The Morgane Oger Foundation has some recommendations. First, we recommend that the Canadian Human Rights Act be updated to address online hatred and incitement to discriminate on prohibited grounds; second, that any online material that can be produced and then retrieved on demand for display in a browser or device should be considered in the same way as if published on paper. As we move away from paper, our laws need to adapt. Therefore, third, all social media platforms doing commerce in Canada should be required to meet or exceed Canada's human rights laws as they pertain to publications. Fourth, because display screens are the modern equivalent of paper, when they are fetching information stored on a media for the purpose of displaying it, they should be treated as publications. Fifth, publications based on the storage of material on a media for the purpose of displaying it on demand should be handled within the same jurisdiction to keep the cost of enforcement low. Finally, when an individual or organization publishes material or allows it to be published, or when the consumer is in Canada, Canadian hate laws should apply.
Thank you very much for your consideration today.
My name is Ricki Justice. I am the Acting Chair of the Pride Centre of Edmonton.
Our mission at the Pride Centre of Edmonton is to provide supports that respond to the needs of people with diverse sexual orientation, gender identities and gender expression and of the people in their lives. We really work with the most marginalized people in our community, especially with youth.
What we are seeing right now is that youth are taking their own lives and that online bullying and hate have a significant role in suicide in youth in our community. So many of our youth spend a lot of time in the online world that it becomes central to their social lives, so encountering hatred and anger directed at them through an online venue has a significant impact on their mental health.
Many Albertans who live in rural or remote locations may not have structured LGBT communities or local support, so they rely heavily on online support groups that are affected by continued online hate.
For one of our service users, the negativity that gets directed toward them through their online time, whether that's through video game chats, Facebook or other social media, has played a role in multiple suicide attempts, which they have thankfully survived. This is our daily reality.
Mainstream media has a role in reinforcing negative messages about certain groups. In my day job I work at the Edmonton Mennonite Centre for Newcomers, where we have also seen online hate towards immigrants. Mainstream media plays a major role in reinforcing negative images of refugees, for example, and the LGBTQ2S+ community.
An example of this was during the recent cancellation of the pride festival in Edmonton when a group called Shades of Colour was blamed for the cancellation because they were protesting and asking for pride festival to refocus on queer, trans, black, indigenous and people of colour who are still fighting for equity in our community.
This group received.... Well, it was quite a horrible online hate campaign, including death threats, that resulted in their basically locking themselves in their homes and feeling unsafe in their own community, which tells me that online hate really is real-world hate and that the two go hand in hand.
We also realized through this example that there is racism within the LGBTQ2S+ community and that there is a general lack of understanding of intersectionality and diversity in our community. Also within our community we find that people are hesitant to report online hate because of a fear of police and their systemic mistreatment historically, so they don't come forward.
Basically, I am advocating that we address root causes of online hate in the real world, such as social isolation, poverty and lack of education, but the Canadian government also needs to set clear expectations for social media platforms to provide information to the public regarding harmful speech on their platforms and their policies to address it.
I was very happy to hear the announcement that there will be a digital charter coming out at the end of the month, and I look forward to seeing what actions will be taken as part of that.
I would recommend that illegal content on these platforms be removed as quickly as possible, within 24 hours. I know that other countries have such regulations and that platforms take measures to dissuade users from repeatedly uploading illegal content, so it's not just taking the content down; it's making sure that the content isn't put back up again.
In Canada there is also a lack of civil society research on harmful online speech, and I think we need more of that so that we can have good evidence-based policy.
We also need public education about how to report online hate. The LGBTQ2S+ community needs to know they will be treated equitably if they report online hate, and police need to know how to handle these reports consistently.
Digital literacy for youth is very important to help them develop understanding about the sources of news but also to help them recognize and reject racist, sexist, homophobic and religion-based hate content. We also need to foster inclusivity in schools.
Last, we need to address the mental health impact created by harmful speech online through community-based mental health care supports.
Thank you very much.
Thank you, MP Barrett. It's good to see you in this setting. I usually see you in our home community.
I think it's twofold. We have a culture issue. Until very recently, Equal Voice was part of a conversation where women used to accept that the price of being in politics, and being under-represented in politics, was that you would be the target of some online hate and bullying. That just went along with the job.
What I think we've seen recently amongst all parties in most legislatures is that we are at a point where we think this is unacceptable and that no person's rights, regardless of their gender, their cultural background, or their sexual orientation, should be subject to online hate, or analogous experiences of hate, as a consequence of basic identity considerations.
What's good is that the conversation has evolved. What's challenging—and it's so great that this committee is taking on this work—is the reporting of these incidents. I recognize that social media companies are doing better at giving users control over how online hate is received. I always give this example. In my own recent election campaign, I ran a Facebook page, which is pretty common for a candidate at any level of government. I had far more control than I even understood.
While I was being trolled—minimally, by the way—I actually had a remarkably positive experience as a candidate, not just because I won but also because the dialogue was largely respectful online and offline. I was putting a lot of focus on the online aspect. I had control when trolling began. These were things we would consider to be out of order in any regular political campaign. My status as a mother was being challenged. They said I couldn't be a mayor and a parent to three children. Some of these assertions were really ridiculous. They started to go in a direction that was challenging.
Social media, Facebook in particular, gave me control over my platform. That was super-important—not for censoring but to take out comments that were unwarranted. It's a very frustrating experience for elected women to go beyond that mechanism, because reporting is very challenging. Social media companies are getting better at responding, but there is no standard.
I think you've heard around this table that we need a standard. Whether it's a digital charter or a regulatory framework that stipulates how and when social media companies can take action, I think a standard is incredibly important. We also know that through the Canadian Human Rights Commission we have lost mechanisms. The bar to demonstrate and prove hate language is now a criminal one. We have lost mechanisms that Canadians would have been able to utilize in the past. There's a loss there in terms of how you ultimately take it on.
We were quite involved in Newfoundland's finance minister's journey as a woman who was the target of online hate. At the end of the day, as you might know, she left politics maybe earlier than expected. Part of that, or all of that, was because she experienced heightened degrees of frustration owing to excessive bullying and hate language directed her way, not because of policy but because of body size, gender, and familial status, which in the end made it untenable for her to serve in public life.
Certainly, I think the reporting mechanisms have to be easier. The responsiveness has to be better. I think we need to set a standard in Canada, and that's what's really missing.
This is a gentle nudge to my Conservative colleagues, as hopefully they would vote for it in a future time when you need more money.
Morgane, thanks for sharing your raw and real comments and for creating your foundation. It's important. It's not easy, but it is critical. Keep being a voice for the voiceless. We need you to do more, and I know you will.
Ricki, thank you for being an outstanding leader and a voice in a time that has been very difficult for the LGBTQ2 community in Edmonton. It's not easy when a community has disagreement within itself. Your work at the Pride Centre of Edmonton has been exceptional, so thank you. Thank you for coming out from Edmonton today.
Colleagues, tomorrow we and others will mark the International Day Against Homophobia, Transphobia and Biphobia. Fondation Émergence started this day 16 years ago. I can't believe we're here in 2019, 50 years after the decriminalization of homosexuality, with so much work left to do.
I'm in a reflective mood. I'm 45% sad and 55% hopeful and resolved that we're going to get through this. I think we need to reflect on difference and diversity, and how difference leads to diversity, which is great. How does diversity get twisted into being the other?
Just to be who you are, just to be who we are, we go through the fires of hell and we risk losing it all. It's about being different in a society that wants everybody to conform. Everybody on the panel today is linked, because the origins of biphobia, homophobia and transphobia are found in misogyny. As soon as somebody believes that being feminine or less masculine is somehow a bad thing, the phobias come up.
I will get to some questions. I don't usually do this, but I'm in a mood.
We have to figure this out. I don't know if it's progressives or people who don't hate, or I don't know what it is, but if we could just come together and get to the root of how people are othered, then I think we stand a chance. We shouldn't give the hate platforms any more oxygen, full stop.
I want to ask you some questions. How do we stop the hate from having a platform? In the United States, if you take a look at privacy laws, you'll see that there is a $40,000 U.S. fine for every privacy breach. What if we held the platforms accountable every time they posted something hateful online? For every view, there could be a $25,000 fine. Don't you think they would move quickly? Would that kind of fine system work to actually move the platforms to do more, in your opinion?
We'll have a quick yes-or-no round. Jennifer, go ahead.
Thank you all so much for being here today.
This panel is really critical, I think, as a diversity of voices, certainly in talking about how we tackle this in different ways.
Morgane, the fact that you were successful and referencing the case, I think, is important. Talking about what you're putting online for women is important, as well as the services you provide in Edmonton.
I thank you all for the work that you're doing. It's incredibly important. As the only woman politician currently sitting at the table, I certainly have experienced this. I've had my children threatened. I know what that feels like, and I know how that feels in your home.
First of all, you're all courageous—and Morgane, certainly you for being here and sharing your very personal story. I thank you for that because it's going to take the courage of people to stand up and fight this together, to battle it by exposing themselves more than they already have. I thank you for that. Your efforts are incredibly important on behalf of all Canadians, so I thank you for that today.
It really is shocking when you think about what you pointed out: that things are allowed online that are not allowed in print. If something was handed to us, we could challenge that. We have a way to do that. We know where to go. However, when it's online, things just seem to get lost. People attempt to report, and the reporting system is certainly something that we could study entirely on its own.
Ricki, you highlighted newcomers and immigrants who are nervous to report, LGBTQ people who are nervous to report and women who are nervous to report because then it puts the spotlight on them. We see the horror stories of what happens when people put themselves out there.
Morgane, you highlighted what your family has been through, which is unacceptable in our country.
First of all, I want to congratulate you on receiving the meritorious service medal in 2018 for your service to Canada on the matter of LGBTQ2+ rights. Thank you for that and, specifically, your transgender human rights work for sure.
I want to ask you all two questions—a little more about why you feel that the online publications are more harmful than the physical. What is the difference between the harms that people are experiencing online versus something that they would see in a publication? Second, how do you feel that limiting online hatred would help your work? I can imagine the work that you would all be able to do if you didn't have to focus so much of your efforts on combatting online hate.
Maybe I'll open it with Morgane because I started with her, and then we'll work down the panel.
For people in public life, it's the degree to which you can be individually targeted. It's why social media companies are doing better—with the Twitter mute function, for example. On Facebook, if you run a campaign page, you can actually hide someone's comment. They still believe it's there. They still believe their hate is out there in the world, but in fact it's been hidden. You have more control. I think it's the degree to which it's individually targeted and it lasts and you can't counter it effectively. Clearly, the viral effect is significant.
I'm EV's past executive director. I can't tell you how many calls I would field where this was among the top three questions: What will I do when—it's not even “if”—I am the target or my family is the target? It's all online. No one's thinking about a flyer in their community. People are wondering what to do when they're the target.
It's a little bit better now, but barely. There was very little we could offer. Women have internalized the notion that if they're going to run for office, and if they have intersectional identities that are also subject to being targeted or vilified, this is simply part of public life.
I think it's good to be realistic about public life. I don't think we ever want to say to women that this is all roses and they'll have a great time. It's incredibly satisfying and now that I'm serving in elected life I can say that.
We are always up against reframing politics and the political journey because of all of the crap. So much of it now dominates your online engagement, which is absolutely required to get elected. In my own experience, I actually don't think my electoral campaign would have been viable without online engagement. I had a huge reach. It is so incredibly powerful, but then the capacity for it to be turned against you is equally, if not more, powerful. That's the dance you're doing.
With better rigour and with better standards, at least we can say there's something to work with. The non-criminal administrative route to pursuing justice is also very helpful, I think.
Thank you to the committee for inviting the Canadian Civil Liberties Association to participate in its study on online hate.
As you all know, the CCLA is a national, non-profit and non-partisan public interest organization with over 50 years of experience in promoting respect for and observance of fundamental human rights and civil liberties. CCLA is deeply committed to protecting equality rights for all and has campaigned against discrimination in its many forms. Freedom of expression has also been a cornerstone of our work since the organization's inception.
Any attempts to regulate online hate will inevitably bump against freedom of expression, because contrary to what some say, the precise contours of hate speech are not easily discerned. As a result, we have argued that the Criminal Code prohibition on the wilful promotion of hatred, and prohibitions on hate speech contained in human rights codes, are vague and unreasonably restrict freedom of expression. In our view, a mature democracy like Canada does not achieve equality by limiting freedom of expression.
I'd like to start by addressing what was formerly section 13 of the Canadian Human Rights Act, as I understand it is a subject that the committee has a great deal of interest in.
CCLA appeared before the Senate committee on the bill that ultimately repealed section 13. We supported the repeal, and continue to believe that asking human rights tribunals to play the role of censor does not fit well with the functions of tribunals.
Human rights tribunals are focused on dealing with discriminatory acts in a variety of areas. In order to address issues of systemic discrimination and to help achieve substantive equality, they need to interpret human rights statutes liberally. However, when it comes to hate speech provisions, our Supreme Court has made clear that only a very narrow interpretation is appropriate, in recognition of the fact that a broad restriction on hateful content would unduly or unreasonably limit freedom of expression.
As a result, only the very worst and most extreme forms of speech are caught, even though we know that many more subtle forms of offensive messaging may have harmful impacts.
A human rights commission or tribunal charged with prosecuting hate speech is put in a situation of conflict. In their core anti-discrimination work, they seek to protect minority groups, but in addressing hate speech complaints, they may often have to tell such groups that a very offensive expression simply doesn't rise to the level of hate speech for the purposes of the act.
In our view, section 13 was not an efficient or effective way of dealing with online hate. I'm aware that some witnesses you've heard from have suggested that section 13 should be reinstated in either its original form or modified in some way, but for the reasons I've just outlined, CCLA disagrees with this approach.
More broadly, I want to emphasize that while the committee may be considering how the Canadian Human Rights Act or the Criminal Code can be amended to deal with the problem of online hate, it should consider that these and other strictly legislative tools may not be well suited to addressing the very complex issue of hatred, because, of course, underlying the issue of online hate is the issue of hatred more broadly.
Canada's experience with prosecuting those who are alleged to promote hatred shows that these individuals often use their prosecution as a way to further promote their message and to cast themselves as martyrs for free speech and gain a wider audience. Pursuing haters through our legal system can have counterproductive effects.
CCLA believes that the government does have a role to play. The government should focus efforts on education and counter-speech. The Canadian Human Rights Commission currently has a relatively narrow public education mandate. That body or another entity could engage in much more robust education efforts, including programs that bring people from diverse communities and backgrounds together in ways that can help to address the root causes of hatred.
There's also a need for education around digital literacy. We need to be focusing on ensuring that young people understand that content on the Internet can come from anywhere and everywhere, that not all sources are credible and that information can be easily manipulated. Organizations like MediaSmarts are already doing excellent work in this area, and I understand that some of their work on online hate is being done with support provided by Public Safety Canada. More work like this, and more support from the government on work like this, is what we recommend.
The government also has a role to play in countering hateful content online with its own counter-speech that focuses on messages of inclusion and equality, and that provides resources and support to groups that engage in counter-speech.
Because it would be more interesting to try to answer your questions, I'm going to stop there. Thank you again for inviting us to appear.
Honourable members, thank you very much for the invitation to appear here today.
I'm with the Justice Centre for Constitutional Freedoms. We're a not-for-profit, non-political, non-religious organization. We're dedicated to the protection of the fundamental freedoms and constitutional rights of Canadians.
I'm going to talk about three things this morning: first, the problem with setting out to censor hate without proper parameters; second, the reality on the ground with human rights tribunals in the context of this study; and third, the dangers of state censorship and big tech combined. I will then provide you with four recommendations.
Like the Canadian Civil Liberties Association, we believe the starting point for this conversation should properly be the Constitution of this country. That is Canada's foundational document, but it is not mentioned anywhere in the outline for this committee study, and most of the witnesses before the committee made no mention of it except to urge you to infringe it as fast as possible.
Set out in paragraph 2(b) of the Canadian Charter of Rights and Freedoms is the fundamental right to have an opinion and to express it. This committee is studying online hate and preventing online hate, but it has not established parameters or definitions as to what constitutes hate. It behooves the committee to ask, what is hate and what is the incitement of hate? The reality is that crying hate has become one of the favourite tools in some circles to prevent dialogue and discredit disagreement.
You disagree with my religion, that's hate. You disagree with my politics, that's hate. You disagree with my gender identity, that's hate. You have concerns about immigration, resources and security, that's hate. If you're a single woman working out of your house as an aesthetician and you aren't comfortable waxing a pair of testicles, that's hate. You want to peacefully express your opinions on a university campus regarding abortion, you can't, because that's hate.
You just heard from a previous witness who said Meghan Murphy is hate, Feminist Current is hate and The Post Millennial is hate, all without any examples whatsoever. Therein lies the problem.
The same witness demonstrated in front of the Vancouver Public Library and compared the feminist talk going on inside to a Holocaust denial party, because the women were talking about the interests and rights of biological women.
Lastly, but not least, U.S. Senator Elizabeth Warren, within the last couple of days, described all of Fox News as hate.
None of this is hate. It's a disagreement and it's a dialogue, but it's not hate. It's protected speech under the Constitution and it is entirely legal.
I alluded to the woman in the waxing case. You've heard about this case. It made international news. The Justice Centre represented this woman. She's a single woman. She works out of her home. She has a small child and she provides aesthetician services to the community. She advertises on the Internet and tells the world that she provides waxing services to women.
She's trying to make ends meet. She doesn't have the supplies to wax somebody's scrotum. She doesn't really want to work on somebody's scrotum. She didn't start out intending to work on somebody's penis. It was irrelevant to her whether that person thought they were a man or a woman, because it was about physiology.
She had a human rights complaint made against her, which terrified her, and she told me that she went to 26 different lawyers first before she found the Justice Centre. Every single one of the 26 lawyers refused to take her case. Why? Well, they gave a variety of different reasons, according to my client. Some of them were afraid of activists; some were afraid of the different procedures at the Human Rights Tribunal. Some were afraid of representing somebody who had allegedly engaged in discrimination and they didn't want the stigma attached to representing somebody like that in that context.
There's also not much money in these cases, so they aren't particularly attractive to lawyers. That creates a significant access-to-justice problem that this committee needs to consider. It needs to consider people, like my client, who have a complaint made against them despite the fact they didn't do anything wrong.
A lot of people who have complaints against them are common people. Many have limited means and are facing a bewildering process, and even worse, they're facing the stigma of a human rights complaint. In this day and age of hypersensitivity and social media, where gossip travels around the world in an instant, being accused of discrimination in many cases is worse than a criminal accusation. It's enough to destroy your reputation. Even the lawyers don't want to be involved in it because they're afraid of stigma. They don't want to hear that you represented that bigot, that racist, that misogynist, that homophobe, that Islamophobe. How could you, in good conscience, represent these disgusting, filthy human beings?
Is the state going to appoint counsel and pay for it if people can't? In the woman's complaint, the complainant's name was withheld by the tribunal and kept private, but my client's name was publicized for the whole world to see. As a single mom, my client didn't need the complaint. She was trying to make ends meet. It caused her months of terror. Life was hard enough, and she told me that she wept when the complaint was withdrawn. I'm going to say that again: The complaint was withdrawn. It never made it to a hearing. There was never any vindication for her, simply the accusation that she had discriminated on the basis of gender identity or gender expression.
There are 14 other cases before the BC Human Rights Tribunal from the same complainant. Every single one of them, to my knowledge, requests damages against the people who refused to wax the complainant. None of them has a lawyer, to my knowledge, so there's lots of pressure to settle. Indeed, some of them have. Only the tribunal knows who the parties are until a hearing date is set, and then the parties are publicized three months in advance.
The Justice Centre offered to represent these respondents for free. We asked the BC Human Rights Tribunal, given the fact that there's an access-to-justice problem, to pass along that offer to all of the respondents. The BC Human Rights Tribunal refused to do so. That's something you need to consider, as well. Human rights tribunals are not the saviours in these cases. Often they create more problems than they fix.
I want to say a little bit about the fine under former section 13 of the Canadian Human Rights Act. It was $10,000. That fine was found to be unconstitutional at the first stage of hearings. It was overturned by the Federal Court of Appeal—it never made it to the Supreme Court of Canada. The fine for a conviction of drunk driving is $1,000. That is a crime under the Criminal Code, which is a grave social evil. What you have heard this morning is that people should be punished for the vague crime—no specifics, like the case of Meghan Murphy, who is not here to defend herself—of transphobia or misgendering. That's part of the problem you need to think about.
How much time do I have left?
We recommend four things.
First, we recommended that the Canadian Human Rights Act, if it is to be amended, be amended to define what is and is not hate speech. The Supreme Court of Canada's decision in Saskatchewan (Human Rights Commission) v. Whatcott, 2013 SCC 11, [2013] 1 SCR 467, sets out at paragraphs 90 and 91 what is hate speech. Most of what you've heard from the witnesses who are telling you something is hate speech doesn't even come close to hate speech.
Second, if there is any new legislation to be implemented, we say there ought to be defences to a complaint of hate speech mirroring the defences in subsection 319(3) of the Criminal Code, specifically that:
No person shall be convicted of an offence under subsection (2) [of 319]
(a) if he establishes that the statements communicated were true;
(b) if, in good faith, the person expressed or attempted to establish by an argument an opinion on a religious subject or an opinion based on a belief in a religious text;
I'll pause here to note that the Bible, under the parameters that you've been asked to consider, and the Koran and other religious books could be considered hate speech just because verses from them are posted online saying things like, “God created male and female”. That's not hate; that's a statement and it's entirely permissible, but it would be protected under the defences that I'm outlining here.
Subsection 319(3) continues:
(c) if the statements were relevant to any subject of public interest, the discussion of which was for the public benefit, and if on reasonable grounds he believed them to be true; or
(d) if, in good faith, he intended to point out, for the purpose of removal, matters producing or tending to produce feelings of hatred toward an identifiable group in Canada.
Third, we recommended that the maximum fine for any finding of hate be capped at no worse than the Criminal Code fine for drunk driving, at $1,000.
Fourth, we recommended that Parliament launch an initiative to encourage people to come forward with their big-tech censorship stories so that it can understand the extent of that problem, which is significant, and not embark on a mission of censorship without all of the facts.
Those are my submissions. Thank you.
Thank you to the witnesses for joining us today.
I'll be splitting my time with Mr. Erskine-Smith.
Mr. Cameron, I would like to pick up on a couple of comments you made in your presentation.
I take what you're saying about paragraph 2(b) of the charter. Freedom of expression is a fundamental freedom. Section 2 also includes other fundamental freedoms, such as religion and association. Of course, all the rights in the Charter of Rights are read together, and it's oftentimes a balancing of those fundamental freedoms, which can come into conflict. They have to be balanced.
I take issue with the fact that you think we should look at paragraph 2(b) first and that that's the most important and paramount consideration of all the rights. I disagree with that. Also, section 1 of the charter makes it very clear that all of the rights, including the fundamental freedoms, are subject to reasonable limits. The court has ruled on that, and I think it's misleading to say that paragraph 2(b) is the paramount consideration.
Another comment that you made was in your third recommendation, saying that any fine for anything involving hate speech online or whatever, should be capped at the Criminal Code fine for impaired driving, which is $1,000. That's the minimum fine, first of all, and second of all, you can go to jail for impaired driving. It is a serious offence, but of course it depends on all of the circumstances.
Third, you mentioned that the BC Human Rights Tribunal should, in some fashion or another, be promoting your legal services and giving you a platform in order to take on clients. That would help you get the word out there about your organization and what you stand for. I don't think it's the BC Human Rights Tribunal's role, at all, to be promoting any legal services over others.
I want to move, though, to something you said, which was that there's this sentiment out there that disagreeing with someone's point of view is considered hate. You went through a list of them and said “You disagree with that. That's hate.” I don't think that's true. I think the essential point here is that spreading misinformation angers people and riles people up online, and spreading that misinformation turns members of a community against one another. That's the fundamental problem we're seeing with things online that are not true, and they're being propagated by people with insincere motives, and motives that are outside the bounds of civil society, I would suggest.
What I would like to ask you, sir, is, when we see the Toronto van attack or what happened in Christchurch or the Quebec City mosque shooting, does it trouble you that those terrible individuals have been inspired by provocative and hateful content on social media platforms?