Mr. Chair, members of the committee, good morning.
I would first like to acknowledge that I am joining you from Montreal, on the traditional territory of the Mohawk and other Haudenosaunee peoples.
Thank you for inviting me to speak to you today. With me, as you said, are Joëlle Montminy, senior assistant deputy minister, cultural affairs, and Pierre-Marc Perreault, acting director, digital citizen initiative.
Like you and many other Canadians, I am concerned by the disturbing rise and spread of hateful, violent and exploitive content online and on social media.
[English]
As a legislator and father of four children, I find some of the content on these platforms to be profoundly inhumane.
[Translation]
I am also deeply troubled by the consequences and the echoes of that content in the real world.
The overall benefits of the digital economy and social media are without question. In fact, I published a book, shortly before I took up politics, wherein I talked about the benefits of the digital economy, of artificial intelligence in particular, but also about some unintended negative consequences.
In Canada, more than 9 out of 10 adults use at least one online platform, and since the beginning of the pandemic, online platforms have played an even more important role in our lives.
[English]
We use social media platforms like Facebook, Twitter, Instagram and YouTube to stay connected to our families, friends and colleagues. We use them to work, to conduct business, to reach new markets and audiences, to make our voices and opinions heard, and to engage in necessary and vital democratic debate. However, we have also seen how social media can have negative and very harmful impacts.
[Translation]
Every day, Internet users share damaging content, whether hate speech, child sexual exploitation material, terrorist propaganda or words meant to incite violence.
[English]
This content has led and contributed to violent outbursts such as the attack on the Islamic Cultural Centre in Quebec City in 2017, and similar attacks in Christchurch, New Zealand, in 2019.
Canadians and people all over the world have watched these events and others unfold on the news with shock and fear. We all understand the connections between these events and hateful, harmful online discourse. We worry about our own safety and security online. We worry about what our children and our loved ones will be exposed to.
According to a recent poll by the Canadian Race Relations Foundation, an overwhelming 93% of Canadians believe that online hate and racism are a problem, and at least 60% believe that the government has an obligation to prevent the spread of hateful and racist content online.
In addition, the poll revealed that racialized groups in Canada are more than three times more likely to experience racism online than non-racialized Canadians.
[Translation]
Since the beginning of the COVID‑19 pandemic, we have seen a rise in anti-Asian hate speech on the Internet and a steady increase in anti-Semitic rhetoric, further fuelled by recent events.
A June 2020 study by the Institute for Strategic Dialogue found that Canadians use more than 6,600 online services, pages and accounts hosted on various social media platforms to convey ideologies tinged with white supremacism, misogyny or extremism. This type of content wreaks havoc and destroys lives. It is intimidating and undermines constructive exchange. In doing so, it prevents us from having a true democratic debate and undermines free speech.
The facts speak for themselves. We must act, and we must act now. We believe that every person has the right to express themselves and participate in Internet exchanges to the fullest extent possible, without fear and without intimidation or concern for their safety. We believe that the Internet should be an inclusive place where we can safely express ourselves.
Our government is therefore committed to taking concrete steps to address harmful content online, particularly content involving child sexual exploitation, terrorism, incitement to violence, hate speech and the non-consensual sharing of intimate images.
In fact, this is one of the priorities outlined in the mandate letter given to me by Prime Minister Justin Trudeau. So we have begun the process to develop legislation that will address the concerns of Canadians.
[English]
Over the past few months my office and I have engaged with over 140 stakeholders from both civil society organizations and the digital technology sector regarding this issue. This has included seven round-table discussions. We also spoke with indigenous groups, racialized Canadians, elected provincial officials, municipal officials and our international partners to assess our options and begin to develop a proposed approach.
In addition, given the global nature of the problem, I have hosted a virtual meeting with my counterparts from Australia, Finland, France and Germany—who were part of the multi-stakeholder working group on diversity of content online—to discuss the importance of a healthy digital ecosystem and how to work collectively.
[Translation]
I am also working closely with my ministerial colleagues to find the best possible solution.
[English]
Our collaborative work aims to ensure that Canada's approach is focused on protecting Canadians and continued respect for their rights, including freedom of opinion and expression under the Charter of Rights and Freedoms. The goal is to develop a proposal that establishes an appropriate balance between protecting speech and preventing harm.
Let me be clear. Our objective is not to reduce freedom of expression but to increase it for all users, and to ensure that no voices are being suppressed because of harmful content.
[Translation]
We want to build a society where radicalization, hatred, and violence have no place, where everyone is free to express themselves, where exchanges are not divisive, but an opportunity to connect, understand, and help each other. We are continuing our work and hope to act as quickly and effectively as possible. I sincerely hope that I can count on the committee's support and move forward to build a more transparent, accountable and equitable digital world.
I thank you for your attention and will be happy to answer any questions you may have.
As I said, we have been hard at work for more than a year to prepare this legislation. In my case, we've held consultations with more than 140 organizations. Consultations were also held on some of the more legal aspects of the legislation and on issues pertaining to the Criminal Code.
It is a complex issue. There are only a handful of countries in the world that have introduced legislation to do that, namely France and Germany; I spoke earlier about Australia, and the United Kingdom tabled a white paper on this just this past December. I was on the phone recently with the heritage minister in the U.K. to discuss that.
It is a complex issue, but nonetheless an issue we want to tackle. You referred to the 24-hour takedown notion, which is, in fact, in the mandate letter the Prime Minister gave to me at the beginning of the mandate. It's a more novel element; very few countries are doing that. The Australians are just introducing this in their legislation. We want to ensure that we find this right balance, and that's what we're working towards. It is still my intention to introduce the legislation in the very near future, but let me give you, perhaps, one other example of how online hate affects Canadians, and more specifically, indigenous people in this country.
I want to give you two quick examples, if I may. In 2018, two women in Flin Flon, Manitoba were charged with uttering threats and inciting hatred after posting a photo of a vandalized car, saying that indigenous people would be killed and calling for a “shoot an Indian day”. In 2020, two known nationalist groups called the Proud Boys and the Sons of Odin used social media to threaten and attack members of the Wet'suwet'en community during the pipeline protest. In fact, data from Statistics Canada show that police-reported hate crimes against indigenous people are on the rise. Between 2016 and 2018, incidents targeting first nations, Métis and Inuit communities rose by 17% during those two years alone.
Good morning, Minister. I hope you are well on this Monday, as we approach the end of the parliamentary session.
First of all, I congratulate you on all the work you have done on Bill . Of course, I am very disappointed with what is happening right now. In December, the committee made a point of meeting with witnesses to get to the bottom of everything that was going on with child pornography. However, because we are on the Standing Committee on Access to Information, Privacy and Ethics, we had to address other issues.
Today, I would like to shed some light on all of the testimony that we have heard. Initially, our motion was to invite Pornhub executives. We've heard a lot of comments, and I'd like to express a concern that I have.
We talked about the Five Eyes group and how this is a global issue. That being said, our current position is unfortunately not at the forefront. As you said earlier, other countries have already introduced similar legislation or are in the process of doing so. Canada does not have any concrete bills in the works on this topic.
How is Canada positioning itself? How do we position ourselves internationally in terms of protecting our fundamental rights?
Thank you very much, Chair.
I want to thank you, Minister Guilbeault, for coming to the committee today and talking about a very important topic.
First of all, I want to go back to your opening statement. You cited an increase in xenophobic and Islamophobic behaviour and speech online over recent months. As a member of the Asian-Canadian community, I observe and witness first-hand some of these intolerable behaviours online.
I have to say that the pandemic is changing people's socialized behaviour. More and more, people are spending time on social media. Then we have some of these bad actors using various platforms, seeing them as tools of disguise, seeing them as a protection, and also utilizing bots and trolls and saying all kinds of things they otherwise wouldn't say in public.
You mentioned that children in the country are being victimized, and the platforms are not doing anything. That's precisely what we are talking about today.
We know that social media companies, including the one we are doing a study on, have been acting unilaterally and opaquely. Sometimes they introduce half measures after public pressure, but they haven't been serious about consulting with industry experts and listening to the recommendations of the audience and the groups of victims.
In your opinion, what can the giants do to respect Canadians' will and Canadian law in terms of protecting the general public? It's in their best interest as well, because that's their audience and their client base. A very few bad actors are contaminating the online environment.
Can you talk a little about that?
My remarks will be a little different. I want to talk to you. What just happened is a concrete example. I think, or rather I know, that I'm the only one who can make this type of comment.
Our conscience is telling us that we must protect our children, our youth. We need to legislate and move quickly to do so as well. We're in the committee making the case that this is important and necessary. We're trying to speed things up, but we've lost a tremendous amount of time. You'll argue that I'm a new member of Parliament. However, the fact remains that people are watching us.
Despite our willingness to help our constituents, the political scene ensures that the pursuit of power takes precedence. We're seeing this right now. We're seeing pre‑campaigning, filibustering and so on. It's all about drawing things out. Minister Guilbeault, I believe that, in order to help our people, we should have had a meeting and a specific bill already in hand. However, we didn't even pass Bill C‑10, which I find extremely disappointing.
People back home are telling me things. If you ask the people back home, they'll tell you to stop carrying on the political games and the pursuit of power. We need to help our people. I'm ashamed of that part. I won't give up. Why won't I? Because my party is the only one that can claim that it promotes and protects the interests of Quebeckers. We aren't looking for power. On the contrary, we don't want it anymore.
That said, Minister Guilbeault, you spoke about five categories of illegal activities included in your bill. I don't know what they are and I would like you to identify them.
Hello. Good afternoon. My name is Charles DeBarber and I'm a senior privacy analyst with Phoenix Advocates and Consultants. My background is U.S. Army cyber-intelligence and cybersecurity.
I began my work with victims of non-consensual pornography, or NCP, in 2015, when I worked for the elite firm Fortalice. As the program manager for open source intelligence, I assisted victims of NCP through our reputation services. Since departing Fortalice in 2018, I have done freelance work on behalf of victims of revenge porn, extortion schemes and cyberstalking, and on purging content for victims of human trafficking. I've written bespoke information guides for clients to help protect their digital privacy and to reduce the chances of their being a target of successful doxing.
My background gives me deep insight into the sources of content on the Internet, and today I want to share with you guys some knowledge about the surface web, deep web and dark web. In addition, I'd like to share some research about the sources of adult NCP on these three layers.
As a disclaimer, I want to be clear that my data regarding NCP is limited in a few ways. First, my data is limited to the 90-plus cases that I've undertaken since 2019. You'll see these are sourced as “PAC Research 2016 to 2021”. I recognize there's a selection bias to that data due to it being from only our casework. Second, much of my information on NCP involving children is largely anecdotal, as I've never produced statistics on it. In addition, the bulk of my work has been with adult victims. Third, I am discussing the concepts of surface web, deep web and dark web and how they relate to the volumes and types of NCP often found on them. This is not to paint any of these layers as good or bad. The dark web has an especially heinous reputation, but remember that there are people who use the dark web to subvert censorship or express their free speech in countries where freedom of speech is very limited.
You'll see in the handout the beautiful iceberg graph that is commonly used to explain the three layers. You have surface web, deep web and dark web. We'll start with the surface web.
The surface web is basically the Internet content indexed by search engines themselves and things you can directly jump to from search engines. It's aggregated web content that can be found with web crawlers, also known as spider bots or spiders. Make note of that, because it is very important for one of the points I'll make later. The surface web is the minority of online content, around 4% to 5%.
What's the deep web? That's the majority of the web, more than 90% of it. It's Internet content that's not part of the surface web and is not indexed in search engines. It's mostly content that is not readily accessible through standard means, such as search engines. As I said, it's the majority of content on the Internet.
Then there's the dark web. It's part of the deep web, but what makes it different is that you have to use encryption software and special software to access it—things like Tor Browser or Freenet or Freegate. It's also used interchangeably with dark net. It can be called both.
NCP comes in many forms. Some of the key forms for adult victims include revenge porn, non-consensual surveillance, human trafficking and data or device breaches. We have the following statistics from our casework. The majority of adult NCP, 73.5% of our cases, was found on the surface web. We believe that the reason for this is that adult NCP easily blends in with amateur pornography. The ease of use and popularity of video- and image-sharing sites on the surface web is the main cause of this.
On top of that, the deep web accounts for about 23.2%. These are often private forums for pirated content, BitTorrent sites, and VoIP and messaging apps like Discord communities. The more compartmented nature of the deep web leads to a lower volume of content that is also less viral.
The dark web accounts for little of our content. Content there, in our experience, includes things that we consider highly illegal, things you would find only on the dark web because they are highly illegal. This could be things like hidden bathroom cam footage, extremely violent content, child pornography and bestiality. NCP blends in with amateur pornography and is readily available on upper layers. There's no reason to go to the dark web for it. Only a minority of Internet users have enough expertise and knowledge of the dark web to use it anyway. The even more compartmentalized nature of the dark web just keeps people off it. This results in more extreme and illegal content being relegated to the dark web.
In our casework, only about 3.3% is dark web content.
There are a few observations I would like to share with the committee. I've removed over 100,000 pieces of NCP content in the last five years. My average client has between 400 and 1,200 pieces of content, and that could be the same picture, video or handful of pictures, but shared on many different sites. Viral content can run to 6,000 pieces of content and above. Very rarely do I utilize the NCP removal processes created by search engines such as Google or Bing or social media like Facebook, Twitter or Reddit.
I normally use the copyright removal process here in the United States, known as the Digital Millennium Copyright Act. The NCP process often is more complicated and takes longer for victims who have to follow it for every piece of content. Imagine, if you have 400 pieces of content out there, that might be 400 different applications you have to put out. These companies, frankly, respect intellectual property more than victims, because the copyright process is so much easier.
The removal process is costly in both time and resources. I utilize automation, which is not cheap. For a client with more than 400 pieces of content, it would usually cost $2,000 for automated removal and $5,000 for bespoke removal services, and that just mitigates the problem. Victims using it manually require a certain level of understanding of information systems, search engines and web caching, and that is if the victim can find most of the content without using automated aggregators. My junior analysts, some of them with information systems and computer science backgrounds, take up to a month of hands-on work to learn how to effectively purge content. The average victim is expected to have this expertise if they cannot afford professional services. The tools for victims to effectively mitigate their digital footprint of content aren’t readily available.
Great strides have been made to get Silicon Valley to recognize the issue, and I don’t wish to demean those efforts or that recognition. Laws in my home country are now in 48 states and two territories to protect victims of NCP. However, picking up the pieces after NCP floods surface web sites is still an uphill battle. We’ve worked tirelessly so clients can google their name without NCP coming up. One of our clients lives in fear of her 10-year-old using the computer and googling her name. Others have lost job opportunities, housing opportunities and relationships. Many of our clients have contemplated or attempted suicide.
Finally, video upload sites that allow pornography, such as Pornhub or Xvideos, have exacerbated the problem. This is one of the big points I want to make. Content goes viral a lot faster with these sites, and these sites use what is called search engine optimization to flood Google with their content. Even if the content is deleted within 72 hours, it often takes days, frankly, for a victim to even find out that they're a victim. Smaller video upload sites then aggregate this material from search engines and repost it, making this a feedback loop that keeps feeding the search engines and makes it a viral issue.
The issue has become so significant that when a victim's name is posted in a video title, it gets aggregated and used as a search engine keyword by porn sites that don't even host their content. Their name simply becomes a random keyword, and God forbid you have a unique name. Imagine googling your name and seeing hundreds of porn sites come up because your name is a keyword empowered by SEO techniques.
We need to find a balance between verification and privacy. That's very easy for me to say, but sites having a reasonable policy for age verification is required. I compliment Pornhub on adopting a verified content policy in late 2020. I'm very angry [Technical difficulty—Editor] and I badly want them held accountable for that, but I want to make sure it's also not so cumbersome that sex workers who are free agents can't operate without reasonable privacy.
Search engines—and this is a key one, and I would recommend you put this forward, or at least encourage them to change their policies—shouldn't allow indexing from adult video image upload sites that do not come from verified accounts. This means that, with verified accounts, the spiders can be turned on so that they can feed into Google, Bing and so on. However, spiders should be turned off on any website where any Joe Schmo can come and upload content, whether it be videos or images. They should be turned off on that content until it is verified. That keeps it from hitting search engines in 72 hours.
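The indexing control Mr. DeBarber recommends already has a standard mechanism: well-behaved search-engine spiders consult a site's robots.txt file before crawling. As a minimal sketch of the idea (the site and the path names are hypothetical, not an actual platform's layout), a platform could exclude unverified uploads from indexing like this:

```python
# Sketch: how a well-behaved spider decides whether it may crawl a page.
# Paths are hypothetical; the pattern is what matters: content under
# /unverified/ stays out of search engines until the uploader is verified.
from urllib import robotparser

ROBOTS_TXT = """\
User-agent: *
Disallow: /unverified/
Allow: /verified/
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# A compliant crawler checks the policy before fetching or indexing a URL.
print(rp.can_fetch("*", "https://example.com/unverified/video123"))  # False
print(rp.can_fetch("*", "https://example.com/verified/video456"))    # True
```

A site can achieve the same per-page effect with a `<meta name="robots" content="noindex">` tag. Note that both mechanisms are advisory: they keep content out of major search engines only because compliant crawlers choose to honour them.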
Remember, with all NCP, you're really fighting time, and that keeps it from going viral a lot more quickly, quite frankly. It makes the clean-up process significantly better, and it can mitigate it. Furthermore, it would probably protect the intellectual property of other sex workers. As I said, Pornhub and other major tube sites have more or less put NCP into the express lane via SEO techniques.
Finally, the doxing of victims and sex workers is a very serious issue. Despite many of my clients being Jane Does, I can't get Google to delist web pages that post the real names of victims. I wish there was a policy that allowed the delisting of the real names of Jane Does, of sex workers, that exist on sites such as the defunct Porn Wikileaks, which were very dangerous for them and were made for doxing victims.
I'm very open to questions you may have and appreciate your welcoming me today. I'm honoured to be here.
Thank you.
Good afternoon, everyone. I think Mr. DeBarber mentioned most of the content that I wanted to share with you, but maybe I'm talking from another perspective, as a researcher. I'm also going to share some of my latest findings, which I have already published.
As a short bio, I am Arash Habibi Lashkari, assistant professor in the faculty of computer science at UNB, research coordinator at the Canadian Institute for Cybersecurity and also a senior member of the IEEE.
In the past two and a half decades, I have been involved in different projects in academia and industry related to designing, developing and implementing the next generation of technologies for detecting and preventing disruptive threats.
Actually, on the academic side, I can share with you that I have over 20 years of teaching experience spanning several international universities. On the research side, I have published 10 books and around 90 research articles on a variety of cybersecurity-related topics. I have also received 15 awards in international computer security competitions, including three gold medals. In 2017, I was recognized as one of the top 100 Canadian researchers who will shape the future of Canada. My main research areas are Internet and Internet traffic analysis, malware detection and also threat hunting.
As has been requested here, today I am talking about the dark and deep web and also the dark and deep net, but I'm trying to make it simpler so that it's possible to easily visualize and so that everybody can imagine it.
We have three layers, and the first one, which is the common layer, we call the “surface web”. This is everything that is available and open, everything that can be found as you search the different search engines such as Google, Bing, Baidu and others. We call this the “indexed web”, which means the websites that have been indexed by the search engines.
The second one is the deep web, which is the portion of the Internet that is hidden from the search engines, and we call this the "unindexed web". It mainly includes personal information, such as payment information, medical records and corporate private data, or content we reach, for example, through a VPN, a virtual private network.
The third one is the dark web, and this portion is deliberately hidden from search engines and includes web content that exists on darknets. These websites are accessible only through special software and browsers that allow users and website operators to remain anonymous and untraceable. There are several projects going on to support the darknet, such as Tor, The Onion Router; I2P, the Invisible Internet Project; and Riffle, a collaborative project between MIT and EPFL created in response to problems with the Tor network.
What is the source of the basic darknet? In 1971 and 1972, two Stanford students, using an ARPANET account at the AI laboratory, tried to engage in a commercial transaction with their counterparts at MIT. This means that before Amazon and before eBay, the seminal act of e-commerce was a drug deal, and the students used this network to quietly arrange for the sale of an undetermined amount of marijuana through the precursor to the Internet we know today.
What is the new version of the darknet, or the modern darknet? In 1990 the lack of security on the Internet—and its ability to be useful in tracking and surveillance—became clear, and in 1995 three guys from NRL, which is the U.S. Naval Research Lab, asked themselves if there was any way to create Internet connections that didn't reveal who was talking to whom, even to someone, for example, monitoring the network. The answer was onion routing.
The goal of onion routing was to have a way to use the Internet with as much privacy as possible, and the idea was to route traffic through multiple servers and encrypt it each step of the way, making it completely anonymized.
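The layered scheme described above can be made concrete with a toy sketch. This is purely illustrative: the XOR "cipher" is a stand-in for real encryption, the relay names and keys are made up, and this is not the actual Tor protocol. The point is the structure: the client wraps the message in one layer per relay, and each relay peels off exactly one layer, learning only the next hop, never the whole route.

```python
# Toy illustration of onion routing's layered encryption (NOT real
# cryptography and NOT the actual Tor protocol).
import base64
import json
import secrets

def xor(data: bytes, key: bytes) -> bytes:
    """Toy symmetric cipher: XOR with a repeating key (illustrative only)."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

# Three hypothetical relays, each sharing one key with the client.
keys = {name: secrets.token_bytes(32) for name in ("relay1", "relay2", "relay3")}
route = ["relay1", "relay2", "relay3"]

def wrap(message: bytes) -> bytes:
    """Client side: build the onion, innermost layer first."""
    packet, next_hop = message, "destination"
    for name in reversed(route):
        layer = json.dumps({"next": next_hop,
                            "data": base64.b64encode(packet).decode()}).encode()
        packet, next_hop = xor(layer, keys[name]), name
    return packet

def relay_network(packet: bytes) -> bytes:
    """Each relay removes its own layer and forwards to the hop named inside."""
    hop = route[0]
    while True:
        layer = json.loads(xor(packet, keys[hop]))
        inner = base64.b64decode(layer["data"])
        if layer["next"] == "destination":
            return inner  # the last relay delivers the plaintext
        packet, hop = inner, layer["next"]

message = b"hello from the client"
assert relay_network(wrap(message)) == message
```

Each relay can decrypt only its own layer, so relay1 sees who is sending but not what is said, and relay3 sees the plaintext destination traffic but not who originated it.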
In 2000, one student from MIT, Roger, had already started to work with one of those people at the NRL and created a new project named Tor, or The Onion Router. After that, another student, a classmate, joined the team. They received funds from the EFF, and in 2006 they officially opened the non-profit organization.
My latest research results—all of them have been published in 2016, 2017 and 2020—show that it is possible to actually detect users who are connecting to the dark or deep web in a short period of time—around 10 to 15 seconds. Also, we can detect the type of software or application they are using, but from their machine, not from the Internet. From the Internet, everything is completely anonymized, but from the actual user's machine it is possible to detect their activity somehow.
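By way of illustration, traffic-characterization research of this kind typically extracts time-based features from captured flows and feeds them to a classifier. The sketch below is a rough, hypothetical example of such features (the function and the sample timestamps are invented for illustration; this is not the actual published pipeline):

```python
# Hypothetical sketch: simple time-based flow features of the kind used in
# traffic-characterization research (illustrative; not the actual pipeline).
from statistics import mean, stdev

def flow_features(timestamps):
    """Summarize one flow's packet arrival times (seconds) as features."""
    iats = [b - a for a, b in zip(timestamps, timestamps[1:])]  # inter-arrival times
    return {
        "duration": timestamps[-1] - timestamps[0],
        "iat_mean": mean(iats),
        "iat_std": stdev(iats) if len(iats) > 1 else 0.0,
        "iat_min": min(iats),
        "iat_max": max(iats),
    }

# Hypothetical packet capture for a single flow.
features = flow_features([0.0, 0.4, 0.5, 1.7, 2.0])
```

Features like these are computed on the user's side of the connection, which is consistent with the point above: the traffic pattern is visible at the machine even when the content and destination are anonymized on the network.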
I am ready for any questions the committee may have.
Thank you.
Hello, friends. I feel like most of us have met before, but in case we haven't, I'll quickly introduce myself.
My name is Melissa Lukings. I'm a juris doctor candidate in the University of New Brunswick's faculty of law. I'm also a cybersecurity law and legal researcher, an alumnus of Memorial University of Newfoundland with a B.A. in linguistics, and a social justice and legal reform advocate. I have intersectional lived experience as related to previous testimonial evidence, which was invited to be heard by this committee before. I sent in some handouts. Everyone can read about my background there. I don't really want to waste time on that. I just want to go right into what I wanted to say.
My message to you today, basically, is one of concern at the overbroad and ambiguous nature of some of the proposed legislation that has been put forward.
Here are the issues.
We're being told that the rationale behind the proposed regulations and the push for digital content censorship is to prevent the prevalence and dissemination of non-consensual pornographic material, child pornography and other abusive material, which tends to pop up mostly on the surface web, as we heard earlier. We also want to deter and detect illegal material, prevent it from being uploaded and, optimistically, reduce the instances of human trafficking done via a connection in Canada, and/or with some ties to Canada.
The last time I was here, I expressed my concern that creating more intensive regulations of any sort on surface web content will inevitably push fringe traffic onto dark forums, which are much more difficult to detect and where an influx of user access would saturate an already challenging area for law enforcement. As Dr. Lashkari pointed out, although you can detect dark web traffic from the user's source computer, it cannot be detected from inside the network, which presents a challenge.
We have some graphics that we've created. They're all in your handouts. They explain how all the different aspects of the dark web work, so if you have any questions, we have illustrations for that.
When I was last here, the response was that it's not the intention of the federal government to push human trafficking, sexual exploitation, illegal content, violence, child porn and all of that onto the dark web. That's great.
Also, as a side note, I really enjoyed being a professor for, like, a minute in your last meeting. Thanks. That was super fun. I made a GIF.
True, we don't want to push these things onto the dark web, and that's great. You wouldn't want to sweep these under the metaphorical rug that is the hidden Internet, yet we're continuing to discuss the creation of additional regulations as if there's not a direct consequence of doing so, even though there is. It's not just a matter of NIMBY or not in my backyard when it comes to illegal content. Hiding it doesn't make it go away. It just hides it from sight, which isn't really a way to address these issues.
On point number four on my notes, when I was last here, I found it really frustrating that the adult entertainment issue and sex work in general had been conflated with sexual exploitation, abuse and trafficking within discussions at this very committee.
Indeed, MP Arnold Viersen was so taken by the emailed testimony of people with common experiences in commercialized sexual activity that he felt it was appropriate to waste his speaking time reading out victim porn-type emails from unknown persons, rather than engaging with the spoken testimony of people who also had common experiences in commercialized sexual activity and who had been invited to be heard at the committee hearing.
That's not okay. Hearings are usually for being heard. You're supposed to be hearing from the people who you invite and who are to be heard at your hearing. That's why it's called a “hearing”. Anyway, that's that.
Through highly inaccurate media portrayals, the dark web has become nearly synonymous with illegal activities. However, it is also used....
An hon. member: [Inaudible—Editor]
Ms. Melissa Lukings: Chris, are you okay? Do you want me to stop?
The committee is dealing with a Canadian-controlled private corporation, a CCPC, which is a private commercial organization based in Canada, with its headquarters here. It is a Canadian company. We know this, and that's fine. Commercial organizations in Canada are bound by the Personal Information Protection and Electronic Documents Act. PIPEDA outlines the rules and remedies, including the fines and other penalties, for corporations that fail to abide by the provisions specified in the act.
Beyond the corporate level, we also have the Criminal Code of Canada, which outlines the criminal offences and punishments for committing such offences. We have these. We need to apply them. Everyone is bound by the Criminal Code of Canada.
Why, then, do we need additional regulations? Why do we need more oversight when we have not yet tried to simply apply the law we already have? We have these laws. We can use them, so let's use them. That's what they're for. What's the point in even having these statutes if you're not going to apply them when they're needed? What are we doing here?
We're here because a portion of those involved have decided to conflate the issue of corporate negligence with highly sexualized and emotive criminal activity—read again, child rape porn testimony. It elicits an emotional response—the sympathetic nervous system and all of that. It doesn't matter. This is about a corporation and user-generated content. It does not matter what is depicted in the content as much as it matters that the content, whatever it may be, should not have gotten past the corporation's screening system before being made live on the site. When the issue was brought to its attention, the corporation responded inadequately at first, so we need corporate law. We need to look at liability and feasibility standards.
Why has this become a forum for grandstanding religious ideologies? I'm sure you've all heard about Exodus Cry in the news, if you've been following it. Exodus Cry is a fundamentalist Christian organization from the United States, founded on religious ideology. Why is it relevant to a question of corporate liability in Canada? It isn't. It doesn't make any sense.
Why are we arguing about exploitation? Why are we discussing mass censorship? Is that not a massive overreaction to a simple corporate negligence question? It seems glaringly obvious to me, so why are we not discussing reasonable options for encouraging corporations to better serve their users?
Also, I have some opinions about the genderedness of this. You can read about it in my notes.
When it comes down to it, you can't eliminate sex. We're humans, and there is always going to be a demand for sex. You can't eliminate sex work because the demand exists. You can't eliminate extramarital sex or porn or masturbation or demand for sexual services, but sexual assault is illegal, even when that person is your spouse. We need it to be that way. We want to protect people. If you're saying you can do certain things only within the context of marriage, you're setting yourself up for failure. It's true.
Yes, I said “masturbation” in a hearing. Oh my God.
You cannot eliminate base human desires, so you can't eliminate sex. That would be silly. It's okay to not like these things, and just because you don't like a thing or you feel that a thing is not for you, it doesn't mean it's inherently evil and should be eliminated. It doesn't work that way. It's not about and should not be about pornography or the actual content of online material here. This is about creating reasonable laws that work for Canada, Canadian corporations and everyone residing within Canada. We don't need new regulations; we don't need a new regulator, and we don't need online censorship. We need to use the tools we already have, which were designed for a reason. Why be redundant?
That is my diatribe.
Thank you for having me. I will take any questions you throw at me.
:
Thank you so much, and thank you, Ms. Lukings.
Actually, I can highlight this point. When we are looking to detect a person who is using this part of the network, it is completely impossible from the Internet side. Because of the three layers of encryption, unrolling them and tracing back to find the source is impossible. If we have access to the machines, from the user side we can monitor the behaviour of the user. We can detect who is using Tor connections, for example, and with which software application and for which purpose—for example, for audio, for a video call, for a chat, or for uploading or downloading.
This is the key point that I think we need to consider. Even if you have rules or regulations here in the law, it is not possible to trace those users from the Internet side. It's not possible, except, for example, for ISPs that deliver Internet service in different cities or provinces. They can do some monitoring that shows who is actually using this type of secure connection.
There is another concern, which is that we are not actually able to detect whether they are going to be involved in child pornography. Maybe they are journalists who want to use this anonymization to get their voices out; maybe they want to talk about something that some governments have not given them permission for. This is the key point. We need to be careful [Technical difficulty—Editor] become law here, it should be clear. Can we recognize who is using this super-secure or anonymized connection, and for which purposes?
The key point is that, unfortunately, we cannot easily detect it. It would need a huge amount of research. Maybe after five or, I don't know, 10 years, there will be some solutions we can use. At this moment, as I'm talking to you, there is no clear solution. We can detect the type of activity, but we just can't determine who is connecting to this network, for how many hours, or which application they are going to use.
This is just an additional part that I would like to add to the point Ms. Lukings already highlighted.
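[Editor's note: The "three layers of encryption" the witness describes can be sketched as a toy onion-routing example. This is illustrative only, not Tor's actual protocol; the cipher, relay names and keys below are all made up for demonstration. The point it shows is that each relay can peel only its own layer and learns only the next hop, never both the sender and the final content.]

```python
# Toy illustration of layered ("onion") encryption: NOT real Tor.
import base64
import hashlib
import json


def keystream_xor(data: bytes, key: bytes) -> bytes:
    """Toy symmetric cipher: XOR against a SHA-256-derived keystream.
    Applying it twice with the same key recovers the original bytes."""
    stream = b""
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))


def wrap(message: bytes, route: list) -> bytes:
    """Client side: wrap the message in one layer per relay, innermost
    (exit) layer first, so each relay can peel only its own layer."""
    hop_names = [name for name, _ in route] + ["destination"]
    blob = message
    for i in range(len(route) - 1, -1, -1):
        _, key = route[i]
        layer = json.dumps({
            "next": hop_names[i + 1],  # a relay learns only the next hop
            "blob": base64.b64encode(blob).decode(),
        }).encode()
        blob = keystream_xor(layer, key)
    return blob


def peel(blob: bytes, key: bytes):
    """Relay side: remove exactly one layer with this relay's own key."""
    layer = json.loads(keystream_xor(blob, key))
    return layer["next"], base64.b64decode(layer["blob"])


route = [("entry", b"key-1"), ("middle", b"key-2"), ("exit", b"key-3")]
onion = wrap(b"hello over three hops", route)

# No single relay sees the full path: each one only peels its layer.
hop, onion = peel(onion, b"key-1")   # entry relay learns: "middle"
hop, onion = peel(onion, b"key-2")   # middle relay learns: "exit"
hop, plain = peel(onion, b"key-3")   # exit relay learns: "destination"
```

This is why, as the witness says, tracing backward from the Internet side is not feasible: an observer at any one hop holds only one of the three keys.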
Thank you, Mr. Chair.
Thank you, Ms. Shanahan.
I listened very carefully to our witnesses. My questions will be more geared towards Mr. DeBarber.
I gather that we must correct and improve a response and ensure that it's done properly. However, there are many challenges.
I was surprised to learn that, for 400 images, there can be 400 removal requests, and that it may cost $2,000 for an automated removal and about $5,000 for a custom removal. So we're talking about money.
In terms of access to the individual, service providers must provide some moderation. However, we've just completely switched gears, since there must be access to the machine. I heard that very clearly.
Obviously, this is about consent, but it's also about identity. As committee members, our job is to protect people's identity. With respect to the surface web and the dark web, I was wondering whether the notion of consent and identification was straightforward. I can give myself another name or I can use a keyword, as Ms. Lukings said earlier. I'm concerned about this. That's my first question. I'd like to hear your thoughts on this, Mr. DeBarber.
I respect the notion of consent. We won't take away what people like. However, we want to make sure that non‑consenting individuals, including minors, can't become victims. I'd like to hear your comments on this as well.