Leona Alleslev (CPC, ON):
Thank you very much.
As you know, the Conservatives have called this emergency committee meeting because of reports that the Prime Minister has used the power of his office to attempt to muzzle private citizens and respected former career diplomats David Mulroney and Guy Saint-Jacques. These are serious allegations that merit an investigation, and of course it's our hope that this committee will agree to and support our request today.
As Mr. Mulroney said in some of the public statements that he's made, discouraging private citizens with expertise in foreign relations from speaking freely is fundamentally an undemocratic idea. Unfortunately, the Prime Minister has shown a clear pattern of silencing those who would speak out against him. We're concerned that Canadians no longer have confidence in the Prime Minister when he says that he did not direct these civil servants to try to silence his critics.
We saw the Prime Minister attempt to defend himself with the same language he used during the SNC-Lavalin scandal and various other affairs, such as the Vice-Admiral Norman affair and the trip to India, and we obviously do not believe him. As Canadians know, Trudeau's early denials in the SNC-Lavalin scandal turned out to be false, and now we are wondering whether the same is true of this affair.
It is clear that the Prime Minister has shown a pattern of behaviour of attempting to silence anyone who would challenge or criticize the government's approach to anything. The foreign affairs committee today must find in favour of our motion to be able to get to the bottom of this. Anything less would be a cover-up.
We would like to move the following motion:
That the Committee invite the following witnesses to appear:
a. Minister of Foreign Affairs Chrystia Freeland;
b. Paul Thoppil, Assistant Deputy Minister for Asia-Pacific, Global Affairs Canada;
c. David Mulroney, as an individual;
d. Guy Saint-Jacques, as an individual;
e. Any other individual that the Committee deems relevant;
that, pursuant to section 10(3) of the Parliament of Canada Act, the witnesses be sworn in;
that each witness appear individually on a panel, for no less than one hour; and
that all witnesses appear no later than August 15, 2019.
Now, Mr. Chair, I'd like to speak to the motion.
Leona Alleslev (CPC, ON):
Excellent.
There are three main reasons why we feel this motion must be supported. First and foremost, this is about preserving free speech in a democracy, and about the non-partisan nature of our federal public service. Second, it's also about the rapidly deteriorating relations with China and what the government's policy actually is on China. Third, and almost as important, it's about the checks and balances of the institution of this government and the balance that a House of Commons standing committee puts forward in holding the government to account.
What do I mean when I say “free speech”? Clearly, we are in a democracy and, therefore, people who have expertise based on the history of their careers and the experience they have gained over their careers can inform citizens on government behaviour. Whether it was an academic career, a financial management career, an industry career or the public service is irrelevant. They have gained experience, and once they are private citizens, they have the opportunity to inform the public.
We cannot have a Prime Minister's Office, or the Prime Minister himself, for whatever reason, looking to prevent anyone in this country from speaking freely to an issue and informing citizens on that issue. A democracy is only as good as the information its citizens have. We know from media reports that part of the conversation was that the PMO directed these former diplomats not to speak out against the government because this is an election year.
Even more so in an election year do we need the opportunity to have experts speak out, so that when citizens go to the polls, they have accurate and complete information on which to decide whether or not the current government is the right government to lead them going forward.
China, and our relationship with China, is one of the most significant issues facing the nation at the moment, with two people imprisoned in China—wrongfully, in Canada's opinion—and the serious economic impact of our exports of soybeans, pork and other products being prevented from entering China. This is a significant diplomatic and economic relationship, and we need to know what experts think of the government's approach before we go to the polls.
We absolutely need to understand whether or not this Prime Minister has continued a pattern of attempting to prevent experts and private citizens from providing informed opinions, upon which Canadians, in a democracy, can make informed decisions about the shape and direction of their nation and, of course, about the expertise of the government.
Second, there is the non-partisan nature of the public service. If, in fact, the Prime Minister's Office attempted to take non-partisan public officials and arm-twist them into behaving in a partisan way, and if that is what happened when the assistant deputy minister, Mr. Thoppil, was asked to call these former diplomats and tell them that it's an election year and that they need to check in with the government on its policy, then the very fabric of Canada's democracy is at risk.
In Canada we have a non-partisan public service for the very reason that it spans different governments. Whether these officials were specifically directed by the PMO or the PMO merely intimated that direction, everyone in the public service knows that a call invoking the Prime Minister's Office is not to be taken lightly. It is, in many respects, a not-so-veiled threat that your behaviour needs to be a certain way. It's a very difficult position for a public servant to be in when the Prime Minister's Office calls and asks them to do something.
We as a committee have a responsibility to determine whether or not the Prime Minister's Office actually phoned a public servant directly and asked them to behave in an inappropriate, partisan way, when doing so is totally outside that individual's responsibility and goes against everything in the nature of our government.
Furthermore, we need to understand whether or not that individual in the Prime Minister's Office acted with the knowledge of the Prime Minister, because the buck stops with the Prime Minister. We also need to know whether it went through the Clerk of the Privy Council or completely skipped him, which would also be inappropriate. From free speech, to not muzzling people, to ensuring that we have a non-partisan public service and finding out whether the Prime Minister's Office is asking public servants to behave in a partisan way, these are serious allegations that we need to get to the bottom of.
Additionally, we need to talk about Canada's policy on China. Clearly the government's policy up to this point has been weak and has not achieved what we need it to achieve. If these private citizens, the former diplomats Mr. Mulroney and Mr. Saint-Jacques, were indeed told that they needed to check in with the government on its policy so that they could speak with a single voice, then perhaps the opposition, all parliamentarians and all Canadians should be afforded the same opportunity to hear what the government's policy is. We at this committee need to hear from the foreign affairs minister exactly what Canada's China policy is. If this assistant deputy minister is able to tell two private citizens that they need to check in on the country's China policy, then all Canadians have as much right to that information as well. That is the role of a committee: to ensure that information gets to the citizens of the country.
Last, but by no means least, we have a responsibility, as the legislative branch, to do these kinds of investigations. There are only about 30 members of Parliament in cabinet, and in our country, in our democracy, they form the executive branch of our government. The other 300 or so, together with those cabinet ministers, form the legislative branch. House of Commons standing committees and all members of Parliament in the legislative branch have a responsibility to represent not only the citizens in their respective ridings but also citizens across this country, to ensure that we hold the government to account. We're here to understand what the government is doing. We're here to challenge the government. We're here to represent all Canadians in holding the government to account and influencing the government's direction.
If that's not the role of members of Parliament, if the role, specifically of Liberal members of Parliament, is simply to do whatever the government says, then what is the role of members of Parliament, and how does that not undermine the very fabric and foundation of our democracy?
We saw quite clearly the conduct of the Prime Minister and senior people, both elected and unelected, because, of course, one big challenge of the Prime Minister's Office is that its staff have an incredible amount of authority and are able to dictate all kinds of things, yet, not being elected officials, they face far fewer checks and balances.
What we saw with the SNC-Lavalin scandal, and certainly with the Vice-Admiral Norman affair, was the undermining and erosion of the independence of the judicial branch, with members of Parliament, specifically Liberal members on the justice committee, shutting down any kind of open and transparent democratic investigation into the behaviour of the government. That affects not only one of the key pillars, the independence of the judicial branch, but also the checks and balances and the independence of the legislative branch.
Therefore, I am calling on all members of Parliament at this committee today to assume their responsibility to the citizens of the nation, to the office they hold and to their duty as members of Parliament to hold the government to account, ensuring not only that its policies and practices are correct but also that the institutions of government and of Parliament itself remain intact. We must vote in favour of this motion because there is far too much at stake: free speech; the jeopardizing of a non-partisan public service; not knowing what the government's China policy is and therefore being unable to hear from those who would agree with it and those who may disagree; and the need to preserve and protect the independence of the executive, the legislative and, of course, the judicial branches, along with the role and accountability of a House of Commons standing committee to investigate when the Prime Minister is potentially overstepping his role and responsibility with a pattern of behaviour that muzzles anyone who would criticize, puts partisan goals ahead of the responsibility and structure of the nation and, of course, covers up what is really going on, hiding the truth from Canadians.
The motion is to hear from these witnesses, to investigate these serious allegations and to protect and preserve Canada's democracy. I am pleading with the members of this committee to vote in favour of the motion.
Thank you.
Arif Virani (Lib., ON):
Constitutionally protected free speech exists in Canada, but it has limitations. The issue I think you're conflating is how to apply the currently existing limitations on free speech, which exist for hatred, defamation, slander, copyright and so on, to the online world. So that's the first point.
The second point is that if you're going to protect candidates, you need to protect all vulnerable candidates. Let's talk about the Islamophobia that Iqra Khalid faced. Let's talk about indigenous peoples in Saskatchewan in the wake of the death of Colten Boushie and the Gerald Stanley trial. You need to be thinking about this across the board, because there's a problem recruiting anyone who's Muslim, Jewish, indigenous or black to get into this seat and into this chamber when we have such invective spreading online.
Colin McKay, 2019-06-04 16:07:
Thank you, Chair.
Thank you to all members of the committee for the opportunity to speak with you today.
I don't mind the delay. It's the business of Parliament, and I'm just happy to be a part of it today.
As the chair just mentioned, my name is Colin McKay, and I'm the head of government affairs and public policy for Google in Canada.
We, like you, are deeply troubled by the increase in hate and violence in the world. We are alarmed by acts of terrorism and violent extremism like those in New Zealand and Sri Lanka. We are disturbed by attempts to incite hatred and violence against individuals and groups here in Canada and elsewhere. We take these issues seriously, and we want to be part of the solution.
At Google, we build products for users from all backgrounds who live in nearly 200 countries and territories around the world. It is essential that we earn and maintain their trust, especially in moments of crisis. For many issues, such as privacy, defamation or hate speech, local legislation and legal obligations may vary from country to country. Different jurisdictions have come to different conclusions about how to deal with these complex issues. Striking this balance is never easy.
To stop hate and violent extremist content online, tech companies, governments and broader society need to work together. Terrorism and violent extremism are complex societal problems that require a response, with participation from across society. We need to share knowledge and to learn from each other.
At Google we haven't waited for government intervention or regulation to take action. We've already taken concrete steps to respond to how technology is being used as a tool to spread this content. I want to state clearly that every Google product that hosts user content prohibits incitement to violence and hate speech against individuals or groups, based on particular attributes, including race, ethnicity, gender and religion.
When addressing violent extremist content online, our position is clear: We are agreed that action must be taken. Let me take some time to speak to how we've been working to identify and take down this content.
Our first step is vigorously enforcing our policies. On YouTube, we use a combination of machine learning and human review to act when terrorist and violent extremist content is uploaded. This combination makes effective use of the knowledge and experience of our expert teams, coupled with the scale and speed offered by technology.
In the first quarter of this year, for example, YouTube manually reviewed over one million videos that our systems had flagged for suspected terrorist content. Even though fewer than 90,000 of them turned out to violate our terrorism policy, we reviewed every one out of an abundance of caution.
We complement this by working with governments and NGOs on programs that promote counter-speech on our platforms—in the process elevating credible voices to speak out against hate, violence and terrorism.
Any attempt to address these challenges requires international coordination. We were actively involved in the drafting of the recently announced Christchurch Call to Action. We were also one of the founding companies of the Global Internet Forum to Counter Terrorism, an industry coalition that identifies digital fingerprints of terrorist content across our services and platforms, shares information and sponsors research on how best to curb the spread of terrorism online.
I've spoken to how we address violent extremist content. We follow similar steps when addressing hateful content on YouTube. We have tough community guidelines that prohibit content that promotes or condones violence against individuals or groups, based on race, ethnic origin, religion, disability, gender, age, nationality, veteran status, sexual orientation or gender identity. This extends to content whose primary purpose is inciting hatred on the basis of these core characteristics. We enforce these guidelines rigorously to keep hateful content off our platforms.
We also ban abusive videos and comments that cross the line into a malicious attack on a user, and we ban violent or graphic content that is primarily intended to be shocking, sensational or disrespectful.
Our actions to address violent and hateful content, as is noted in the Christchurch call I just mentioned, must be consistent with the principles of a free, open and secure Internet, without compromising human rights and fundamental freedoms, including the freedom of expression. We want to encourage the growth of vibrant communities, while identifying and addressing threats to our users and their broader society.
We believe that our guidelines are consistent with these principles, even as they continue to evolve. Recently, we extended our policy dealing with harassment, making content that promotes hoaxes much harder to find.
What does this mean in practice?
From January to March 2019, we removed over 8.2 million videos for violating YouTube's community guidelines. For context, over 500 hours of video are uploaded to YouTube every minute, so while 8.2 million is a very big number, it's a small fraction of a very large corpus. Now, 76% of these videos were first flagged by machines rather than humans, and of those detected by machines, 75% had not received a single view.
We have also cracked down on hateful and abusive comments, again by using smart detection technology and human reviewers to flag, review and remove hate speech and other abuse. In the first quarter of 2019, we removed 228 million comments that broke our guidelines, and over 99% of them were first detected by our automated systems.
We also recognize that content can sit in a grey area, where it may be offensive but does not directly violate YouTube's policies against incitement to violence and hate speech. When this occurs, we have built a policy to drastically reduce a video's visibility by making it ineligible for ads, removing its comments and excluding it from our recommendation system.
Some have questioned the role of YouTube's recommendation system in propagating questionable content. Several months ago we introduced an update to our recommendation systems to begin reducing the visibility of even more borderline content that can misinform users in harmful ways, and we'll be working to roll out this change around the world.
It's vitally important that users of our platforms and services understand both the breadth and the impact of the steps we have taken in this regard.
We have long led the industry in being transparent with our users. YouTube put out the industry's first community guidelines report, and we update it quarterly. Google has long released a transparency report with details on content removals across our products, including content removed upon request from governments or by order from law enforcement.
While our users value our services, they also trust them to work well and provide the most relevant and useful information. Hate speech and violent extremism have no place on Google or on YouTube. We believe that we have developed a responsible approach to address the evolving and complex issues that have seized our collective attention and that are the subject of your committee's ongoing work.
Thank you for this time, and I welcome any questions.
Lindsay Shepherd, 2019-06-04 9:02:
Honourable members, thank you for the invitation to appear today.
Earlier this year, I received a seven-day suspension from the social media website Twitter for violating its rules against hateful conduct. According to the Twitter rules, you may not promote violence against, threaten or harass other people on the basis of race, ethnicity, national origin, sexual orientation, gender, gender identity, religious affiliation, age, disability or serious disease.
What was in my tweet that supposedly promoted violence, threatened or harassed someone? My tweet referenced an individual whom I cannot name here today due to a publication ban in this country; this individual can only be referred to as JY. JY has taken 14 female aestheticians to the B.C. Human Rights Tribunal because they declined to perform waxing services on his male genitalia. There are also screenshots of Facebook messages between JY and others in which he appears to make very predatory comments about wanting to help 10- to 12-year-old girls with their tampons in bathroom stalls.
In the tweet that got me suspended, I referred to JY as “a guy who creeps on young girls and vulnerable working women in the Vancouver area”. I posted some of the Facebook messages he has written about his plans to approach young girls in the female washrooms. Why was it deemed hateful conduct for me to write this tweet? It's because JY purports to be a male-to-female transgender person, so by alerting people to his troubling conduct, I got kicked off Twitter for seven days because what I wrote was seen as a transgression against his gender identity.
Prominent Canadian feminist Meghan Murphy was permanently banned from Twitter for misgendering the same individual, JY, whom I have just spoken about, and for tweeting, “men aren't women, though”. These tweets also fell under Twitter's hateful conduct policy. Murphy is now suing Twitter because, as a journalist, her livelihood is largely dependent on her online presence, and she is being denied an online presence and being denied the ability to participate in the public square, as online spaces are today's public square.
I am concerned about the potential return of legislation such as section 13 of the Canadian Human Rights Act. What that legislation does is punish Canadians who, in exercising their right to peaceful, free expression, might offend a member of a protected, marginalized group. If someone with a marginalized identity experiences commentary they find offensive, they can claim the offence is an attack on their identity rather than being legitimate expression. Human rights tribunals become the tools by which those who speak their mind peacefully and non-violently are silenced.
Many other witnesses before this committee have discussed the need for a definition of hate, and many call for a need to draw the line between free speech and hate speech. As a graduate student at Wilfrid Laurier University in 2017 and 2018, I woke up to how my peers and academic superiors understand hate. When the word got out that in the classroom where I was a teaching assistant I had played an excerpt from TVOntario's The Agenda with Steve Paikin, an excerpt that featured psychologist Dr. Jordan Peterson discussing Bill C-16, compelled speech and gender pronouns, a Ph.D. student at my university said at a rally that I had played hate speech in the classroom and had violated the spirit of the Charter of Rights and Freedoms. Likewise, a professor at George Brown College, named Dr. Griffin Epstein, asserted in a letter to the Toronto Star that I had played “hate speech in the classroom”. These are just two examples.
Recently, Facebook has taken to banning white nationalists from their platform. If you poke around online, you'll see that tons of people call me a white nationalist and a white supremacist because I have offered criticisms of the practice of indigenous land acknowledgements and have cited the statistically backed-up fact that white Canadians are becoming a minority in Canada. An instructor at Wilfrid Laurier University, Dr. Christopher Stuart Taylor, used class time in his anthropology class to tell his students that I have neo-Nazi, white supremacist ideologies, which he followed by saying, “I shouldn't have said that; forget I said anything.”
I don't have a Facebook account, but if I did, would it ban me? How many people does it take to smear you as a white nationalist or white supremacist before you get banned from certain online spaces?
This committee has noted that underlying its study on online hate is a Statistics Canada finding of a 47% increase in police-reported hate crimes between 2016 and 2017. However, this increase is principally from non-violent crimes. As the Statistics Canada website reads: “police-reported hate crime in Canada rose sharply in 2017, up 47% over the previous year, and largely the result of an increase in hate-related property crimes, such as graffiti and vandalism”.
Perhaps you caught this story in the news recently. A couple of months ago at Laurentian University in Sudbury a student found some candy on a cafeteria table arranged in the shape of a swastika. This swastika-shaped candy arrangement is being investigated by the university as an incident of hatred and intimidation. However, I do not think that one isolated incident of candy arranged in a swastika is enough evidence to indicate that anyone is trying to incite hatred, target or intimidate. This is an example of how the bar for what constitutes hate is too low.
I have had so many encounters with the hypersensitivity around what constitutes hate that I know bringing back section 13 of the Canadian Human Rights Act would be a mistake. It would cast too wide a net, and extremists who are already intent on causing real-world violence will go to the deeper and darker web to communicate, while individuals who shouldn't be caught up in online hate legislation will inevitably get caught up in it.
Thank you.
John Robson, 2019-06-04 9:09:
Again, thank you very much to the members of the committee for an invitation to speak to the Standing Committee on Justice and Human Rights. I am here to speak in defence of the very fundamental human right of free speech.
I know that all the members here are extremely concerned about hate and intolerance, and I know you are horrified by the eruption of bad manners and loathsome opinions on the Internet. Too often social media seem to encourage our worst passions, but despite that—and it is a real problem—censorship is not the answer.
Censorship is an ugly word, and it may well not sound to you like what you're considering doing, in part because your motives are good, but censorship is the right word for what happens when government restricts freedom of speech for any but the narrowest of purposes, and censorship is an ugly word because censorship is an ugly thing.
There are legitimate grounds for government to restrict freedom of speech because the state exists to protect us from force and fraud. It is rightly illegal to conspire to commit crimes. It's illegal to libel or slander people. It's illegal to incite violence, and it's illegal to engage in material misrepresentation, but when governments seek to limit or prevent any communication that does anything else, including insulting or denigrating people or groups, it's censorship.
The problem with censorship is that it cuts the rattle off the snake; it doesn't drain the venom from the fangs. I want to be very clear here that a lot of the opinions that hate speech laws target are not just factually wrong, they are loathsome. My argument here isn't that neo-Nazis are fine people who happen to be misunderstood by idiots and the hypersensitive. My argument is that, in the battle of ideas, truth will prevail and that when you limit the battle of ideas, you put truth in peril.
I don't need to tell you why censorship in tyrannies is bad: they're trying to repress the truth. I don't need to tell you that if you go online you'll find yourselves called tyrants, neo-Nazis and all sorts of moronic insults. The response to this kind of thing is to rebut it, to refute it, to laugh at it, to shun it; it is not to call a cop.
What I want to do here is bring up the three arguments that John Stuart Mill made in On Liberty back in 1859 against censorship of unpopular ideas. It is important, to be clear, that it is censorship of unpopular ideas we are talking about. There is very little occasion for elected governments to try to censor popular ideas, but what Mill said is that, first and most fundamentally, an idea that people don't want to hear and that is unfamiliar and upsetting might turn out to be true.
I know you're not worrying about that when it comes to online hate, and there's no reason why you would be, but we have to protect freedom of speech because we might be wrong. We've been surprised before, and we don't have the wisdom to know in advance which ideas we shouldn't silence, because we'll eventually realize they were right, and which ideas we can safely trample underfoot because we know they are wrong.
Of course there are ideas that we would stake our souls, if we have souls, on being not merely erroneous but vicious. I hesitate, because there are certain things you don't want on the record of the committee, but I'm going to say it out loud. Here are some ideas so wrong that you might be tempted to say no one can say them: Hitler should have finished the job, or blacks are inferior, that kind of stuff. There is no possibility that we are going to realize one day that they were true and that we shouldn't have been so blind to it.
This brings me to the second of Mill's arguments in favour of free speech, the Dracula effect. Of course he didn't call it that because Bram Stoker hadn't written his book yet, but it's the principle that sunlight destroys evil, that the way we get at truth is to speak out against error, denounce it and refute it.
Open societies are a gigantic gamble that truth has nothing to fear in a contest of ideas, and the trouble with censoring hateful speech is that you drive it underground where it isn't exposed to sunlight, where it isn't refuted, where it isn't ridiculed, where it isn't shamed and where people are not shown the error of their ways, because we want to rescue the haters as well as protect society from hate.
If you keep it off the open Internet, it goes onto the dark web. It festers and breeds in dank basements. It even lets haters wrap themselves in the mantle of martyrdom. You don't want to do that in the name of truth.
The third point that Mill makes is that if you live in a society where conventional wisdom is not challenged, even things that are true tend to be accepted as stale dogma and not as living truths. When you hear correct ideas defended, and when you defend them yourself, they become vital and living parts of your life. They become something you act on, that informs your existence and makes it better.
Censorship doesn't work. It didn't even work in tyrannies. Censorship in the Soviet Union allowed communism to last longer and, in the end, to collapse more disastrously. It also didn't work in Weimar Germany, which had laws against anti-Semitism, and they didn't stop Hitler. What did people say in retrospect? They said we should have listened to what Hitler was saying. I meant to bring a copy of Mein Kampf as a prop, but I'm afraid I got busy this morning and forgot it. It belongs on every educated person's bookshelf because we need to know what hate looks like. We need to know how it could once have prevailed so we know how to fight it in others and in ourselves.
I once assigned it as a university text. I thought it would make a great headline, “Right-wing professor assigns Hitler text”. I don't even think the kids read it because it is so long. The one thing I wasn't worried about is they'd read it and become Nazis. You should not worry that if Canadians are exposed to hateful speech online it will turn them into haters. It will do the opposite. It will anger them. It will lead them to speak out against it. It will lead them to think more completely and thoroughly about tolerance and to be more tolerant people.
There are a lot more things I could say but I'm not going to steal my fellow witnesses' time.
I want to quote Queen Elizabeth I. At a time when religious differences threatened bloody civil war she said, “I have no desire to make windows into men's souls”.
That the state can prohibit acts of violence is very clear, and it's an essential duty that the state can prohibit incitement of violence. If someone stands on the street corner and says, “Kill that capitalist”, they're going to get arrested, and they should get arrested. But if someone stands on a street corner and says that the only solution to the ills of capitalism is violent proletarian revolution, they should not be arrested, because we don't need censorship to protect us from force and fraud. We certainly don't need it to protect us from truth or error. We are adults.
In free societies, from the time of Galileo and Socrates, our heroes are those who challenged conventional wisdom, shocked reputable opinion, outraged their neighbours and questioned authority. Most of them turned out to be cranks, and they're forgotten but some of them turned out to have been right. When we try to silence opinions we don't want to hear, we pay a huge price in truths we don't hear, and we drive untruths underground. In doing so we strengthen them; we do not weaken them.
Free speech lets us discover unexpected truths. It lets us refute error. It lets us live in the truth of our beliefs. It's a vitally important human right, and I implore this committee to uphold it in all its messy glory.
Thank you.
Mark Steyn
2019-06-04 9:17
Thank you very much, monsieur le président, and also honourable members of the committee. I am honoured to be here.
I would like to say a quick word—as much as I always enjoy seeing Ms. Raitt—about the defenestration of Mr. Cooper from this committee, which I understand is the business of the members of the committee.
I am concerned. I was driving into Ottawa listening to my old friend Evan Solomon on the radio, who was arguing that it was perhaps time for Mr. Cooper to be booted from caucus.
That is actually the age we live in, where people can have one infraction and their life implodes, their career implodes, they're vaporized for it. That is actually one of the most disturbing trends on the free speech issue. The surviving vice-chair of this committee said recently that Jordan Peterson should not be permitted to testify to this committee. Bernie Farber, I believe, said just last night that Lindsay Shepherd should be booted from appearing before this committee. Ms. Shepherd and Mr. Peterson are law-abiding Canadian citizens, and this practice of labelling people and demanding that they be instantly “de-platformed”, booted from polite society, is, in fact, more serious than some of the other matters before this committee.
I was here last time around, 10 years ago, when we got rid of section 13 because it was corrupt in absolutely every aspect of its operation, from minor bureaucrats indulging strange James Bond fantasies and playing undercover dress-up Nazis on the Internet to pathetic rubber-stamp jurists who gave section 13 a 100% conviction rate that even respectable chaps like Kim Jong-un and Saddam Hussein would have thought was perfectly ridiculous.
The worst aspect of it was secret trials—secret trials in Ottawa, not in Tehran or Pyongyang, but in Ottawa. I discovered it one evening before dinner and I emailed my friends at Maclean's. The eminent barrister, Julian Porter—who I see the Prime Minister recently retained as his Q.C.; that's how respectable he is—in a couple of hours wrote a motion referencing Viscount Haldane and Ambard v. Attorney-General of Trinidad and Tobago, real law, not the pseudo law of section 13, and did what John did. Julian's motion opened up that dank, fetid dungeon of pseudo justice to the public, to the people of Canada, and after 20 minutes in the cleansing sunlight that John talked about, the unimpressive jurist in that case, Athanasios Hadjis, decided that section 13 was unconstitutional and he wasn't going to have anything more to do with it. Sunshine works.
The most important aspect...while we're quoting judges, John Moulton wrote a famous essay a century ago on the realm of manners. He said the measure of a society is not what one is forbidden to do, which is to murder and steal and rape, and not what one is compelled to do, such as pay taxes or join the army or whatever. You measure a society by the space in between, the realm of manners, where free people regulate themselves. Canadians do not bash gays or lynch minorities because they are enjoined by the state not to do so. They do so because they are operating in Lord Moulton's realm of manners where free people, civilized people, regulate themselves. That is where the internal contradictions of a fractious multicultural society should be played out.
The idea of bureaucrats once again getting into this business is deeply disturbing. They didn't have enough work last time. Shortly before the Maclean's case, which was the one I was involved in, the senior counsel for the Canadian Human Rights Commission actually went to Toronto to speak to various groups to say they weren't getting enough cases and that's why people should file more complaints.
Ultimately, free speech is hate speech and hate speech is free speech. It's for the speech you hate, the speech you revile. The alternative to free speech is approved speech, and that necessarily means approved by whom? Well, approved by yourself as a citizen, if you don't want to have Lindsay Shepherd over to dinner, as Bernie Farber doesn't. That's fair enough. However, once it becomes speech approved by the state and by formal bodies, it effectively means the speech approved by the powerful.
The biggest threat to free speech at the moment is a malign alliance between governments and big tech doing the kinds of things that Lindsay spoke of. The photograph that sums it up is the one of Mr. Trudeau with Mrs. May, Ms. Ardern and President Macron the other day sitting across the table from the heads of Facebook, Twitter, Google and Apple. These are six woke billionaires who presume to regulate the opinions of all seven billion people on this planet. That is far more of a threat than some pimply 17-year-old neo-Nazi tweeting in his mother's basement somewhere out on the Prairies. That issue is the real threat to genuine liberty in our society.
I cannot believe that a mere 10 years on, we are talking about restoring this law. It was appalling, and unfortunately, this committee and the House never actually confronted it in reality.
I will finally say this on a personal note. I was born in Canada. I love Canada. I would die for Canada. I am old-fashioned enough to take the allegiance of citizenship seriously, but no monarch, no Parliament, no government, and certainly no bureaucratic agency operating the pseudo law of section 13 can claim jurisdiction over my right to think freely, to read freely, to speak freely and to argue freely.
Thank you very much, sir.
Colin Fraser
Lib. (NS)
2019-06-04 9:32
Thank you, Mr. Chair. I'll be sharing my time with Mr. Erskine-Smith.
Ms. Shepherd, I want to discuss with you a couple things that you mentioned in your presentation and also some activities that you've undertaken.
One thing that I think is missing sometimes when we talk about free speech is that it sometimes gets confused with consequence-free speech, meaning that people have to be responsible for what they do say. I agree, obviously, with the point that free speech in Canada is a protected right, that it is obviously extremely important and that we cherish it, but that it is subject to reasonable limits in our charter. Consequence-free speech is something that has to be borne in mind when responsible individuals are engaging in civil society.
I want to talk for a minute about a recent YouTube interview that you did with Mr. Gariépy. I'm sorry if I'm pronouncing that incorrectly. I'm not familiar with him. The topic of population replacement came up. I know you talked a bit in your presentation about whites becoming a minority. This YouTube channel quite often hosts white supremacists, including neo-Nazis like Richard Spencer, and former KKK grand wizard David Duke, who has appeared on that program. You appeared on it recently talking about population replacement. After you finished that statement, Mr. Gariépy then started talking about white genocide and how when whites are in the minority, as in South Africa and Haiti, white genocide occurs. You said nothing in rebuttal to that. Don't you think that free speech comes with a responsibility, especially when you're confronted with inflammatory and inciteful rhetoric?
Lindsay Shepherd
2019-06-04 9:34
I don't think I'm here to defend my personal track record. In fact, at a previous hearing, Naseem Mithoowani, one of the witnesses, was asked about her personal activities, and it was deemed that it wasn't appropriate.
David Arnot
2019-05-30 8:49
Thank you very much for the invitation and the opportunity.
There's been a proliferation of hate speech online, propaganda, radicalism and obscenity. In 2016, Cision documented a 600% increase in the amount of hate speech in social media postings between November 2015 and November 2016. In 2019, Léger Marketing indicated that 60% of Canadians report having seen hate speech on social media.
These statistics should not come as a surprise to anyone. When the federal government repealed section 13 of the Canadian Human Rights Act in 2013, we lost the capacity to protect against this. For the past six years Canadian citizens have had little ability to protect themselves against online hate speech and discrimination.
The fundamental problem is that Criminal Code provisions are often ineffective; prosecutions are few; proof of intent to promote hatred against a group beyond a reasonable doubt is almost impossible to meet. The 2008 Saskatchewan Provincial Court case of Crown v. Ahenakew demonstrates that clearly.
In the case of Saskatchewan Human Rights Commission v. Whatcott, the Supreme Court of Canada, in a unanimous decision, stated that an effective way to curb hate speech is not within the Criminal Code, but in a civil process through human rights commissions. The commission argued that the Criminal Code provisions regulate only the most extreme forms of hate speech, advocating genocide or inciting a breach of the peace. The Supreme Court specifically and narrowly defined hate speech to ensure that human rights legislation does not unreasonably infringe on freedom of expression. This is the most important contribution the Saskatchewan Human Rights Commission has made to Canadian jurisprudence. I put forward the idea that this case provides a blueprint for the work of this committee.
Justice Rothstein made the following salient points for the court.
The court described nine indicia of hate in paragraph 44, which are clear, concise and unambiguous. Freedom of speech is not a shield that can be used to protect hate speech. The courts have consistently used the hate speech definition from the 1990 Taylor case in the Supreme Court of Canada. This analysis excludes expression that is merely offensive, very hurtful or obnoxious.
The prohibition does not target expression that debates the merits of reducing the rights of vulnerable Canadian citizens; it restricts only expression that exposes members of those groups to hatred. Ideas are not the target; rather, the mode of expression of the idea is the target.
Ironically, hate speech arises in public debates and can be very restrictive and exclusionary. Legitimate debate in our democracy that is expressed in a civil manner encourages the exchange of opposing views. Hate speech is antithetical to that objective. It shuts down dialogue by making it difficult or impossible for members of a vulnerable group to respond, thereby stifling discourse. Hate speech that shuts down public debate cannot dodge prohibition on the basis that it promotes debate.
Preventative measures in human rights legislation reasonably centre on the effects rather than the intent of the hatemonger. The evil of hate propaganda is beyond doubt. Hate expression causes real harm to real people. Hate speech demeans, denigrates and dehumanizes the citizens it targets. Through hate speech individuals are told they are entitled to less than other Canadians because of the characteristics they possess.
With the advent of instant unfettered electronic communication, the opportunity for dissemination is nearly unlimited and largely uncontrolled. A realistic view of modern society must inform free speech, discourse, and the limits thereon.
The Whatcott judgment was rendered in February 2013. Later that same year, section 13 of the Canadian Human Rights Act was officially repealed. The repeal was based on the argument that it unduly fettered free speech. Opponents of the section provided only anecdotal examples to justify their position. There is no empirical evidence that human rights legislation unduly fetters legal speech. Contrary to the arguments of the free speech advocates, Canada has no democratic tradition of unbridled free speech. Freedom of speech in Canada has always been freedom governed by limits recognized in law.
Principles of freedom of speech were originally derived from common law principles and are reflected in the Constitution Act, 1867. Freedom of speech was expressly declared in the Canadian Bill of Rights, 1960. A Canadian citizen's right to freedom of expression was not given express constitutional protection until the enactment of the charter in 1982.
Despite the charter protection of freedom of expression, there are numerous limits to free expression that are justifiable in a free and democratic society. Reasonable limits to expression protect against greater harms that flow from unfettered speech.
Some of those limitations include defamation, libel, slander, perjury, child pornography, court ordered publication ban, limits on tobacco, alcohol and drug advertising, insider trading, fraud in the business sector, copyrights, trademarks, and hate speech. There are literally hundreds of legally justified limitations on freedom of expression in Canada.
However, let's remain focused on hate speech. Here are the recommendations of the Saskatchewan Human Rights Commission to this committee:
First, the Saskatchewan Human Rights Commission supports the reintroduction into the Canadian Human Rights Act of prohibitions against hateful expression, and the re-inclusion of telecommunications and the Internet within the scope of that act.
The provision could be more effective if the Canadian Human Rights Commission is permitted to commence a complaint on its own initiative on behalf of an affected group, such as a class action type of model. The Saskatchewan Human Rights Commission has that ability. Proceeds of a successful complaint could be paid to a community organization that supports the targeted group and/or fights against hate speech.
We must enact meaningful legislation that allows human rights commissions to do their job effectively and to hold those who spread online hate responsible for their actions.
Second, create legislation that holds companies financially accountable for hosting, spreading or creating content that foments online hate. Germany passed the “Facebook act”, which requires social media networks with more than two million users to take down hateful content within 24 hours or face a very significant financial penalty.
In the United Kingdom, the “Online Harms White Paper” has proposed establishing an independent regulator that would write a code of practice for social networks and Internet companies and have the ability to fine companies that don't enforce those rules. In Canada, we must follow suit.
Recently, giant tech companies such as Microsoft, Twitter, Facebook and Google came together to condemn online hate and agreed to a nine-point plan on how to curb hate. That is a very good thing. However, we cannot rely on commercial entities to determine what type of behaviour and content is acceptable. That would be a fundamental abdication of the legislative responsibility of Parliament. Instead, we need to develop a “made in Canada for Canada” plan, a plan created by governments after thorough consultations with industry stakeholders, a plan that publicly sets out rules, that monitors platform compliance and that penalizes when necessary.
Third, Canadian agencies must be given the means and mandate to monitor and investigate online hate, extremism and radicalized influences. In a time when hate and misinformation spread like wildfire online, data collection and intelligence gathering are paramount. That is why part of a “made in Canada for Canada” plan should include a partnership between federal security agencies, social media companies and Internet providers. We have arrived at a moment in our history in which words and well-intentioned platitudes no longer suffice.
The digital revolution, which has transformed society for both good and ill, has begun to disrupt our democracy. Individuals and groups, foreign and domestic, are using online misinformation, hate and extremist recruitment to erode democratic discourse and to drive a wedge between Canadian citizens.
We cannot let that happen. We need to take action. Our leaders must have the authority and the moral courage to do what is right. They must choose unity over division, understanding over ignorance, and respect over hate. They must make decisions that work towards the greater good, that respect the rule of law, reflect the charter, and in turn, make the difficult decisions that protect what it means to be a Canadian citizen.
This starts by enacting meaningful legislation that will allow governments, human rights commissions, industry, regulatory agencies and the public to effectively combat online hate and misinformation. That's where it starts, but that's not where it ends.
Fourth, we must also invest in education so that youth of tomorrow no longer—
Marie-Claude Landry
2019-05-30 9:00
Good morning.
My remarks will be offered in both official languages.
Thank you for inviting the Canadian Human Rights Commission to participate in this discussion today on online hate. I am joined by my colleague Monette Maillet, Deputy Executive Director and Senior General Counsel.
The proliferation of online hate is a clear and present danger. In recent years it has become painfully clear that allowing online hate to fester can result in horrific consequences. We are therefore encouraged that the justice committee is conducting this important study. We are pleased to see that you are hearing from several witnesses representing the people and communities most often targeted by hate.
Hate speech, and particularly online hate, is both an urgent public safety issue and a fundamental human rights issue. Hate speech violates a person's most basic human rights and freedoms: the right to equality and the right to live free from discrimination.
I will focus my remarks on three key points. First, online hate causes harm. Second, there is a gap in the law when it comes to protecting people from online hate. Third, a comprehensive strategy is needed.
The Internet has given everyone the power to have their own platform and to be a broadcaster. People can be louder than ever before and influence more people than ever before. In many ways, this is a major step forward. However, the Internet has made it possible to amplify and spread hate speech.
Far too often, people are victimized by online hate because of their race, religion, gender, sexual orientation or where they're from. Online hate has been found to cause fear and serious psychological harm. It shuts down debate and it promotes conflict, division and social tension. At its most serious, online hate incites violence, and too often, far too often, leads to tragic situations.
If Canadians targeted by online hate are expected to live their lives in a toxic atmosphere, we're basically failing them. Canada has a responsibility under international and domestic laws to promote equality and to protect all Canadians from discrimination.
This brings me to my second point. There is a gap in the law when it comes to protecting people from online hate. The now repealed section 13 of the Canadian Human Rights Act has given the commission an informed perspective on addressing online hate in Canada.
As many of you may know, section 13 was originally written into the CHRA to prevent harm from prohibited hate messages, in response to anti-Semitic messages communicated by telephone in the 1970s. Following the attacks of September 11, section 13 was broadened to include messages communicated over the Internet. For many years, it was effective in shutting down a number of extreme neo-Nazi websites. However, this approach is not well suited to respond to today's rapidly evolving technology. As you know, section 13 was deemed to be a constitutionally sound provision.
As well, the Supreme Court of Canada has confirmed that some limits to free speech are justifiable in a free and democratic society. We have noted that previous witnesses have spoken of the need for a definition of “hate”. To this end, we encourage this committee to look at the definitions put forward by the Supreme Court of Canada, as well as the hallmarks of hate developed by the Canadian Human Rights Tribunal.
In the discussion around freedom of expression and hate speech, we must not forget the fundamental right to equality and to be free from discrimination. There is no hierarchy of rights, and rights sometimes compete. The commission believes there needs to be an appropriate balance. That is going to require meaningful participation and accountability of all involved parties.
What we can say for certain is that something must be done quickly to address the proliferation of online hate. It threatens public safety, violates human rights and undermines democracy. As other witnesses have said, addressing online hate will require a proactive approach that involves tracking, intervention and prevention.
This brings me to my third point. A comprehensive strategy is needed. It will take a concerted and coordinated long-term effort that is proactive, multipronged and multi-faceted. It will take innovative thinking, technical expertise, proper resourcing, coordination and co-operation.
The strategy will need to bring together all levels of government, telecommunication and Internet providers, social media platforms, civil society, academia and, most importantly, victims of hate.
These efforts must be led by the government. The government has a duty to meet its domestic and international human rights obligations. This includes protecting citizens from hateful speech.
In conclusion, the Canadian Human Rights Commission is committed to fighting against hate and to participating in a broader, coordinated solution.
In response to evidence heard by the committee, the CHRC finds that a simple amendment to the Canadian Human Rights Act to include provisions similar to the former section 13 would be insufficient. In this modern era, this legal change alone could neither provide the scope nor the level of protection or remedies necessary to prevent online harassment or to effectively reduce hate propaganda.
If the committee or the government explores possible amendments to the Canadian Human Rights Act or to other legislation as part of a broader response to hate propaganda issues, the CHRC would be happy to contribute its expertise.
In the coming days, the CHRC will submit a number of documents, including a summary report of a recent jointly organized event to discuss online hate.
Thank you. My colleague Monette Maillet and I would be pleased to answer your questions.
Sinan Yasarlar
2019-05-28 8:46
Good morning, honourable MPs. We would like to thank the members of Parliament for allowing us to give our perspectives on online hate on behalf of the Windsor Islamic Council and the Windsor Islamic Association, of which I am the public relations director. Lina Chaker is from the Windsor Islamic Council.
Good morning to everyone.
The problem is the harm done to victims of online hate. Internet use is growing year by year and will continue to do so in the generations to come. Just as we have regulated other technologies, including television, radio, movies, magazines, and other communication platforms, we cannot ignore the Internet. The harm of online conversations transcends the digital world. We don't need to cite violent events or even the most recent attack in New Zealand to prove that online hate has real-world consequences.
Our community centres are filled with troubled youth facing negative peer pressure, social anxiety, and mental health issues. The overall international Muslim community has been shaken twice over the past couple of years by terrorism, just as other communities have been. These terrorists clearly built their Islamic knowledge from misinformed online sources that spew hate.
We have our own Canadian example from January 29, 2017, in Quebec, with evidence that motivation was driven by online hate sites.
To prevent and respond to online hate, we believe there are three important actions the Government of Canada can take.
Number one is to set strict standards and guidelines for social media companies to self-regulate their content. Number two is to more readily enforce legislation that criminalizes both online and off-line hate speech. Number three is to increase awareness about public reporting and responding to this type of behaviour online.
The first action is to impose strict self-regulation standards and penalties for social media companies. Other countries have developed strategies to impose regulations and protocols for social media companies to self-regulate the content of hate speech on their sites. For example, Australia and Germany now penalize social media sites that fail to remove hateful content with financial charges or even imprisonment.
Alternatively, some countries such as Sri Lanka...[Technical difficulty—Editor] ...social media to stop the spread of misinformation and hate. Canada should consider policies of the kind that have been adopted in Australia, Germany and even Sri Lanka to enforce the removal of hateful content and combat terrorism.
We recognize that there may be difficulties in regulating online content. However, our country currently regulates other forms of online content such as child pornography, and anti-spam legislation does exist.
Similar to this, there has to be an effort to combat online hate. For the individuals who try to bypass such regulations, we should combat that by not allowing companies to provide individuals with VPNs or other IP-blocking programs.
Number two is to introduce effective legislation to penalize those who incite hatred. In addition to penalizing social media companies for not taking down hateful content, we must penalize Canadians who spread hateful messages, whether online or off-line. Although we currently have tools to do so, such as section 319 of the Criminal Code, our community feels that they are not adequately utilized and do not encompass online hate crimes.
In fact, we had an unfortunate local example here in Windsor, Ontario. An individual was spraying graffiti all over the city, on the posts and bus stop signs, inciting hatred and harm to Muslims specifically.
These acts weren't recognized as hate crimes under section 319, which makes our community pessimistic about the prospects of that section encompassing online hate speech. The individual was charged only with a minor offence, and no other charges were pressed against him.
Recognizing this, we believe that section 13 of the Human Rights Act was a vital piece of legislation that was dedicated to online speech. However, it can be amended or restructured to be more effective. We recognize that section 13 was not heavily utilized before it was repealed. However, we do not find this to be a convincing reason not to reintroduce it.
Online hate can be responsible for other types of actions in our society, including verbal attacks against women with hijabs, trying to do harm to people of a visible minority and inciting the physical confrontations that have happened in several supermarkets, shopping areas and malls in our country.
Thus, we are not limiting the discussion of section 13, but hope that any legislation introduced to combat hate will readily be enforced for the betterment of our multicultural Canadian society. The frequency with which a piece of legislation is used should not be the basis on which we decide whether it exists or not. Rather, it should highlight to us that most people still do not know what to do when faced with online hate.
We recommend that there be more education on the consequences of promoting hate. While recognizing that education tends to be a provincial mandate, it is our belief that the Government of Canada can play a vital role. This leads us into our third and final point: educating the public on how to report incidents of hate.
Akaash Maharaj
2019-05-28 10:19
Thank you, Mr. Chair.
Committee members, the Mosaic Institute is grateful for the opportunity to participate in your deliberations on online hate. We recognize that your time is limited, and that you must be selective about the organizations you invite to appear. Thank you for including us.
Mosaic is a Canadian charitable institute that advances pluralism in societies and peace among nations. It operates through track two diplomacy and brings together people, communities and states to foster mutual understanding and to resolve conflict.
Over the years, we have convened Chinese and Tibetan youth leaders on peaceful co-existence on the Tibetan Plateau, we have assembled Sinhalese and Tamil representatives on reconciliation after the Sri Lankan civil war, and we have called together survivors of genocides to combat future global atrocities.
Fundamentally, our mission is to break cycles of hatred and violence by building empathy and common ground between peoples at strife. We have therefore seen first-hand how the speed and reach of social media have made it both a means of bringing us all together and a weapon to set us all at one another's throats.
The stakes are unutterably high. In our work with the Rohingya people, it has become clear to us that social media played a determinative role in spreading disinformation, fomenting hatred and coordinating mass slaughter, ending with the deaths of at least 10,000 innocent people and the ethnic cleansing of at least a million more. Canada is not Myanmar. Nevertheless, the ability of Parliament to contain and combat online hatred and incitement will quite literally decide whether people live or die.
It should go without saying that in a just and democratic society, there is no higher ideal, no greater ethic, no more sacrosanct imperative than freedom of expression. Peace, order and good government; liberté, égalité, fraternité; life, liberty and the pursuit of happiness—all are impossible without free public discourse. Freedom of expression becomes meaningless if it does not include freedom to offend, freedom to outrage and quite frankly, freedom to make an ass of oneself, although I'm sure that never happens in Parliament.
Voices: Oh, oh!
Mr. Akaash Maharaj: Any abridgement of freedom of expression must, therefore, be only the barest minimum necessary to preserve the dignity and security of citizens.
We believe that Canadian laws defining illicit hate speech are sufficient for that purpose, and the scope of proscribed speech need not and should not be expanded further. Legal, regulatory and social media frameworks fall short, not in defining hate but in identifying it and quarantining it before the virus spreads and wreaks its damage.
We do not underestimate the scale of the challenge that legislators and social media firms face. During the two and a half hours set aside for this hearing, there will be 54 million new tweets and 4.7 billion new Facebook posts, comments and messages.
For your consideration, here are our recommendations.
First, social media firms must, either voluntarily or under legal compulsion, adhere to a set of industry standards on the speed with which they review reports that posts violate Canadian anti-hate laws or their platforms' own terms of service. For example, the European Union standards require firms to review a majority of reports within one day.
Second, social media firms should be required to have specific conduits to prioritize complaints from trusted institutions about offending content. A complaint from a children's aid society, for one, should be treated with immediate concern.
Third, there must be financial consequences for firms that fail to remove illegal content within a set period—penalties severe enough to make the costs of inaction greater than the costs of action. Germany's Network Enforcement Act sets fines as high as 50 million euros when illegal posts stay up for more than 24 hours.
Fourth, social media firms should be required to publish regular transparency reports providing anonymized information on, among other issues, the performance of their machine learning systems at automatically intercepting proscribed posts; the speed with which firms respond to complaints from victims, trusted institutions and the public at large; and the accuracy of their responses to complaints as measured by a system of third party random sampling of posts that have been removed and posts that have been allowed to stand.
Fifth, social media firms must be more forthcoming in revealing the factors and weightings they use to decide what posts are prioritized to their users. They must give users greater and easier control to adjust those settings. Too often social media platforms privilege content that engages users by stoking fear and hatred. A business model based on dividing our communities should be no more acceptable than one based on burning down our cities.
Sixth, Parliament should enact the necessary appropriations and regulations to ensure that CSIS and the Communications Security Establishment have both the mandate and the means to identify and disrupt organized efforts by hostile state and transnational actors who exploit social media to sow hatred and polarization amongst Canadians in an effort to destabilize our nation.
Seventh, Parliament should consider legislative instruments to ensure that individuals and organizations that engage in incitement to hatred bear vicarious civil liability for any violent and harassing acts committed by third parties influenced by their posts.
Eighth, the federal government should fund school programs to build young Canadians' abilities to resist polarization and hatred, and to cultivate critical thinking and empathy. The best defence against hatred is a population determined not to hate.
Finally, especially in this election year, I would put it to you that parliamentarians must lead by example. Everyone in this room knows that the guardians of our democracy are not ministers but legislators. We look to you to stand between our leaders and the levers of power to ensure that public office and public resources are used only in the public interest. More than that, we look to you to be the mirror of our better selves and to broker the mutual understanding that makes it possible for a vast and pluralistic society to thrive together as one people and one country.
During the upcoming campaign, you and your parties will face your own choices on social media: whether to campaign by degrading your opponents, whether to animate your supporters through appeals to anger or whether to summon the better angels of our natures. Your choices will set the tone of social media this summer more decisively than any piece of legislation or any regulation you might enact. I hope you will rise to the occasion.
Thank you.
View Arif Virani Profile
Lib. (ON)
I'm going to go very quickly.
You're all doing amazing work. Thank you.
Thank you particularly, Mr. Galloway, for bringing a voice that we haven't heard very often.
Shalini, we're very proud to always have SALCO at the committee.
Mr. Maharaj, I will pick up on where you left off and just say that it is critical, not just this year, but any year, that parliamentarians exercise discipline and appropriate behaviour. I will say that I am troubled when we have senators of this Parliament question the presence of white supremacy. I'm also troubled by reports today that we have elected officials potentially making announcements about immigration policy in front of hotels that were the site of arson attacks in Toronto. I'll leave it at that.
I have a question for all four of you that relates to section 13 of the CHRA. It's a bit specific because I'm a bit of a specific lawyer and we like to get into the weeds a bit.
The specific aspects are that the old version of section 13 had an exemption for the telecommunication provider. Do you think that should remain, or do you want more accountability for the telecommunication provider and the social media platform?
Second, can we quell the free speech antipathy by simply having a rider in there, which may be superfluous, saying that nothing in this clause is meant to derogate from the constitutionally protected right to freedom of expression?
Third, do we need a definition of “hatred” incorporated into it? This was the suggestion by Irwin Cotler, a previous attorney general, in a private member's bill.
Fourth, should we have some sort of threshold for what constitutes the type of hatred that would trigger section 13 so that we don't get single instances but more of a mass-orchestrated attack?
If all four of you could opine on all or any parts of those, that would be terrific. Thank you.
View Shalini Konanur Profile
2019-05-28 11:06
On your point around putting a catch-all at the end to say that nothing in this section limits freedom of expression, I don't agree that we need that. Our Supreme Court has been clear that none of our freedoms are absolute; they can always be subject to reasonable limits.
I don't think we should water down section 13 by saying that.