Steven Guilbeault
Lib. (QC)
Bill C-10 is not about content moderation. The CRTC, in its last 50 years of existence, has never done content moderation, and Bill C-10 doesn't give the CRTC the ability to do content moderation.
Rachael Harder
CPC (AB)
There are two sections in this bill that were of significance: proposed subsection 2(2.1), which protects individuals; and proposed section 4.1, which protects their content.
Proposed subsection 2(2.1), on individuals, was kept in, but the section that protects their content, what they post online, was taken out. Therefore, they no longer have that protection. Why?
Steven Guilbeault
Lib. (QC)
You might have heard, like I did a few minutes ago, Justice Deputy Minister Drouin answer that question very clearly, specifying that the powers given to the CRTC are very narrow and targeted and don't have to do with content moderation.
Melissa Lukings
2021-04-19 12:09
The current rules under PIPEDA assign fines to companies that violate its privacy provisions. All businesses have to follow these rules, which set out a specific format for how they collect, use, handle and disclose personal information, and for how they allow people to access their own information as held and used by a company.
When we do the digital charter implementation act, it wouldn't be far-fetched at all to increase the fine for online platforms without banning them entirely or making it impossible for them to operate. It's a harm reduction idea. It's a safer idea than forcing people onto the dark web, where we literally have our hands tied. We can't intervene or help at all. It's recognizing that with the digital charter implementation act, we have the opportunity to look to the future and say, all right, as much as we might like to say that it is not okay to do this to people, by banning things and by prohibiting them, we're forcing them underground. How did Prohibition work out?
We have this opportunity now to actually talk about it. What do we expect from social media? What do we expect from other platforms without putting the criminalization...? For me, this isn't about criminalization. It's about the rules for companies. Without criminalizing the actual people who are in the content, we can put the onus on the company to do user verification.
Think about the same types of things your bank might use. You have a PIN. Some online platforms will require you to submit your driver's licence. If someone who is a user uploads content that has not been made consensually, that can be flagged and be sent immediately to the moderator. I actually also own a website and run a website, so I know how this works. That can be flagged and sent to the website owner. They can then go and look up the user. You have their driver's licence. You can track them. It's perfect. It works really well—way better than the dark web.
By increasing the amount of controls and security that the company has to do, without regulating the actual people who are involved but putting the onus on the company, it reduces the criminalization of sex workers. It helps us to locate and assist people who are being exploited or who are having images uploaded non-consensually. It gives them more power, because when you flag the video, it immediately comes down.
We can do that. We have the technology to do all these things. We can do it automatically. Automation is a real thing.
Colin Carrie
CPC (ON)
2021-04-19 12:12
As one of my colleagues said, as technology improves, we're trying to keep up with it.
Ms. Melissa Lukings: Exactly.
Mr. Colin Carrie: It's almost impossible. I've done a lot of work on human trafficking. I think everybody is in agreement there. When you have a young person plead with this committee for a way we could work with regulators to get images taken down, anything you could send to the committee that would enlighten us would be greatly appreciated.
Talking about sex work is a whole other study.
David Lametti
Lib. (QC)
Thank you, Mr. Chair.
I'm accompanied today by François Daigle, the associate deputy minister of the Department of Justice. Thank you for the invitation to appear before you today.
I'd like to make some general comments on some of the issues raised during previous meetings of the committee's study.
I'd like to emphasize that the government is committed to keeping our children safe, including online, as Minister Blair just said. Canada's criminal laws in this area are among the most comprehensive in the world.
The Criminal Code prohibits all forms of making, distributing, transmitting, making available, accessing, selling, advertising, exporting and possessing child pornography, which the Criminal Code broadly defines as material involving the depiction of sexual exploitation of persons under the age of 18 years.
The Criminal Code also prohibits luring—that is, communicating with a young person, using a computer, including online, for the purpose of facilitating the commission of a sexual offence against that young person. It prohibits agreeing to or making arrangements with another person to commit a sexual offence against a child, and it prohibits providing sexually explicit material to a young person for the purpose of facilitating the commission of a sexual offence against that young person.
Furthermore, the Criminal Code also prohibits voyeurism and the non-consensual distribution of intimate images, which are particularly germane to both the online world and the discussion we are having today.
Offences of a general application may also apply to criminal conduct that takes place online or that is facilitated by the use of the Internet. For example, criminal harassment and human trafficking offences may apply, depending upon the facts of the case.
Courts are also authorized to order the removal of child sexual exploitation material and other criminal content, such as intimate images, voyeuristic material or hate propaganda, where it is being made available to the public from a server in Canada.
In addition to the Criminal Code, as Minister of Justice, I'm responsible for the Act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service. This act doesn't have a short title, but law practitioners refer to it as the mandatory reporting act, or MRA.
Under the mandatory reporting act, Internet service providers in Canada have two main obligations. The first is to contact the Canadian Centre for Child Protection when they receive child pornography complaints from their subscribers. This centre is the non-governmental agency that operates Cybertip.ca, the national tipline for reporting the online sexual exploitation of children.
The second obligation of Internet service providers is to inform the provincial or territorial police when there are reasonable grounds to believe that their Internet services have been used to commit a child pornography offence.
While Canada's laws are comprehensive, it is my understanding that there has been some concern as to how they are being interpreted and implemented, especially in relation to the troubling media reports about MindGeek and its Pornhub site.
Since I am the Minister of Justice, it would not be appropriate for me to comment on ongoing or potential investigations or prosecutions. I would also note that responsibility for the administration of criminal justice, including the investigation and prosecution of crimes such as sexual exploitation offences, falls largely to my provincial colleagues and counterparts.
However, as the Prime Minister stated during question period on February 3:
...cracking down on illegal online content is something we are taking very, very seriously. Whether it is hate speech, terrorism, child exploitation or any other illegal acts....
In fact, the government takes these measures so seriously that the Prime Minister has given four ministers the mandate to address different aspects of online harms. Minister Blair and I are two of these ministers. As he has mentioned, the Minister of Canadian Heritage is one of the lead [Technical difficulty—Editor] as well.
While the Internet has provided many benefits to Canada and the world, it has also provided criminals with a medium that extends their reach—and thus, their victim base—and a medium that elevates the level of complexity of investigations. One complicating factor is that telecommunications networks and services transcend international borders, while the enforcement authority of police, such as the RCMP, is generally limited to their domestic jurisdiction.
Further, under international law, court orders are generally enforceable only within the jurisdiction of a state. With limited exceptions, their enforcement requires the consent of the other state in which they are sought to be enforced.
Canada is obviously not the only country facing these challenges, which is why we continue to work with our international partners to facilitate co-operation in the investigation and prosecution of these crimes, notably by strengthening bilateral co-operation and negotiating new mutual legal assistance treaties in criminal matters.
Although mutual legal assistance treaties are a universally accepted method of requesting and obtaining international assistance in criminal matters, even in emergency situations, they weren't designed for the Internet age, where digital evidence is a common component of most criminal investigations and where timeliness is essential to the collection of this evidence because of its volatility.
Canada is actively working with its international partners to address these issues. For example, we are currently participating in the negotiation of a second protocol to the Council of Europe Convention on Cybercrime to enhance international co-operation on cross-border access to data.
Thank you.
Martin Champoux
BQ (QC)
2021-03-29 11:28
I will interpret that response as a no. So I have to conclude that you don't have any francophone moderators in Quebec. It was a simple question that you could have answered with yes or no, but you are telling me that you do not want to disclose this information. That's all right.
Mr. Chan, you remember the sad events in Christchurch. I was asking you if you control the content that goes out on your platform, because we're discussing what information Facebook allows, and you have some control over what is broadcast on your platform. For 17 minutes, the Christchurch killer broadcast his actions live on the Facebook platform.
Do you think you could have stopped that broadcast at that time?
Kevin Chan
2021-03-29 11:29
We were able to detect it and remove it, ultimately, as you point out. Of course we regret the tragedy and we regret that we were not even faster. We have obviously learned a lot from that terrible incident, not just at Facebook. To be fair, we've worked across the sector to build systems and protocols—with governments as well—to ensure that the entire system actually works, not just on Facebook, but across companies, across platforms and with governments. We've built these protocols to move much faster should the regrettable and unfortunate thing happen again.
Kevin Chan
2021-03-29 12:13
There are two ways of enforcing our systems, to be honest. One is the automated system, as I think one of your colleagues mentioned, which uses artificial intelligence. Some of the technology was developed in Canada: machine learning to go and find all these things.
In fact, I have some statistics here. In terms of hate speech, in the last quarter of 2020, our automated systems found over 97% of hate speech directed at groups automatically, before any human had seen them or reported them. That's where we are. Now, 97% is not 100%, so we still have a ways to go, but we're getting better every day. That's our posture. That's the way we do it right now.
The other piece, though, is that because speech is important from a contextual standpoint, we have to be careful in the grey zones to determine whether a piece of speech is in fact an attack on a community and not something else, such as spreading awareness about anti-Asian racism. We need humans as well, so part of that 35,000-person team I referred to consists of people who look at the context and ask: this image, this video or this text was shared, but was it shared to attack Asians, or to raise awareness about discrimination and racism? That context matters in terms of whether or not we would enforce and take it down.
It is really a parallel process that meets when we need to get more context. We have automated systems that go and find things automatically. We're constantly improving, but we're at about 97% of proactive identification and we need humans to verify some of the more challenging ones, where the speech is grey and we have to be sure of the context. Then, in the most complicated cases, they get escalated to people like me and Rachel, where we will look at specific pieces of content emanating from Canada, consult with experts and think through whether or not we're going to be drawing the line in the right place.
Lianna McDonald
2021-02-22 11:11
Good morning, Chairperson and distinguished members of the committee. Thank you for giving us this opportunity to present.
I am Lianna McDonald, executive director of the Canadian Centre for Child Protection, a charity dedicated to the personal safety of children. Joining me today is Lloyd Richardson, our director of technology.
By way of background, our agency operates Cybertip.ca, which is Canada’s tip line for reporting the online sexual exploitation of children. The tip line has been operating for over 18 years and currently receives, on average, 3,000 or more public reports per month.
Our agency has witnessed the many ways in which technology has been weaponized against children and how the proliferation of child sexual abuse material, otherwise known as CSAM, and non-consensual material fosters ongoing harm to children and youth. Over the last decade, there has been an explosion of digital media platforms hosting user-generated pornographic content. This, coupled with a complete absence of meaningful regulation, has created the perfect storm whereby transparency and accountability are notably absent. Children have been forced to pay a terrible price for this.
We know that every image or video of CSAM that is publicly available is a source of revictimization for the child in that image or video. For this reason, in 2017 we created Project Arachnid. Processing tens of thousands of images per second, this powerful tool detects known CSAM for the purpose of quickly identifying and triggering the removal of this illegal and harmful content. Project Arachnid has provided our agency with an important lens into how the absence of a regulatory framework fails children. To date, Arachnid has processed more than 126 billion images and has issued over 6.7 million takedown notices to providers around the globe. We keep records of all these notices we send, how long it takes for a platform to remove CSAM once advised of its existence, and data on the uploading of the same or similar images on platforms.
At this point, we would like to share what we have seen on MindGeek’s platforms. Arachnid has detected and confirmed instances of what we believe to be CSAM on their platform at least 193 times in the past three years. These sightings include 66 images of prepubescent CSAM involving very young children; 74 images of indicative CSAM, meaning that the child in the image appears pubescent and roughly between the ages of 11 and 14; and 53 images of post-pubescent CSAM, meaning that sexual maturation of the child may be complete and we have confirmation that the child in the image is under the age of 18.
We do not believe the above numbers are representative of the scope and scale of this problem. These numbers are limited to obvious CSAM of very young children and of identified teenagers. There is likely CSAM involving many other teens that we would not know about, because many victims and survivors are trying to deal with the removal issue on their own. We know this.
MindGeek testified that moderators manually review all content that is uploaded to their services. This is very difficult to take seriously. We know that CSAM has been published on their website in the past. We have some examples to share.
The following image was detected by Arachnid. This image is a still frame taken from a CSAM video of an identified sexual abuse survivor. The child was pubescent, between the ages of 11 and 13, at the time of the recording. The image shows an adult male sexually assaulting the child by inserting his penis in her mouth. He is holding the child’s hair and head with one hand and his penis with the other hand. Only his midsection is visible in the image, whereas the child’s face is completely visible. A removal request was generated by Project Arachnid. It took at least four days for that image to come down.
The next example was detected also by Project Arachnid. It is a CSAM image of two unidentified sexual abuse victims. The children pictured in the image are approximately 6 to 8 years of age. The boy is lying on his back with his legs spread. The girl is lying on top of him with her face between his legs. Her own legs are straddling his head. The girl has the boy’s penis in her mouth. Her face is completely visible. The image came down the same day we sent the notice requesting this removal.
We have other examples, but my time is limited.
While the spotlight is currently focused on MindGeek, we want to make it clear that this type of online harm is occurring daily across many mainstream and not-so-mainstream companies operating websites, social media and messaging services. Any of them could have been put under this microscope as MindGeek has been by this committee. It is clear that whatever companies claim they are doing to keep CSAM off their servers, it is not enough.
Let's not lose sight of the core problem that led to this moment. We've allowed digital spaces where children and adults intersect to operate with no oversight. To add insult to injury, we have also allowed individual companies to decide the scale and scope of their moderation practices. This has left many victims and survivors at the mercy of these companies to decide if they take action or not.
Our two-decades-long social experiment with an unregulated Internet has shown that tech companies are failing to prioritize the protection of children online. Not only has CSAM been allowed to fester online, but children have also been harmed by the ease with which they can access graphic and violent pornographic content. Through our collective inaction we have facilitated the development of an online space that has virtually no rules and certainly no oversight, and that consistently prioritizes profits over the welfare and protection of children. We do not accept this standard in other forms of media, including television, radio and print. Equally, we should not accept it in the digital space.
This is a global issue. It needs a global coordinated response with strong clear laws that require tech companies to do this: implement tools to combat the relentless reuploading of illegal content; hire trained and effectively supervised staff to carry out moderation and content removal tasks at scale; keep detailed records of user reports and responses that can be audited; be accountable for moderation and removal decisions and the harm that flows to individuals when companies fail in this capacity; and finally, build in, by design, features that prioritize the best interests and rights of children.
In closing, Canada needs to assume a leadership role in cleaning up the nightmare that has resulted from an online world that is lacking any regulatory and legal oversight. It is clear that relying upon the voluntary actions of companies has failed society and children miserably. The time has come to impose some guardrails in this space and show the leadership that our children deserve.
I thank you for your time.
Daniel Bernhard
2021-02-22 11:19
Madam Chair, honourable members of the committee, thank you for inviting me to appear today.
My name is Daniel Bernhard, and I am the executive director of Friends of Canadian Broadcasting, an independent citizens' organization that promotes Canadian culture, values and sovereignty on air and online.
Last September, Friends released “Platform for Harm”, a comprehensive legal analysis showing that under long-standing Canadian common law, platforms like Pornhub and Facebook are already liable for the user-generated content they promote.
On February 5, Pornhub executives gave contemptuous and, frankly, contemptible testimony to this committee, attempting to explain away all the illegal content they promoted to millions of Canadians and millions more around the world.
Amoral as the Pornhub executives appear to be, it would be a mistake, in my opinion, to treat their behaviour as a strictly moral failing. As Mr. Angus said that day, the activity that you are studying is quite possibly criminal.
Pornhub does not dispute having disseminated vast amounts of child sexual abuse material, and Ms. McDonald just confirmed that fact. On February 5, the company's executives acknowledged that 80% of their content was unverified, some 10 million videos, and they acknowledged that they transmitted and recommended large amounts of illegal content to the public.
Of course, Pornhub's leaders tried to blame everybody but themselves. Their first defence is ignorance. They claim they can't remove illegal content from the platform because until a user flags it for them, they don't know it's there. In any case, they claim that responsibility lies with the person who uploaded the content and not with them. However, the law does not support this position. Yes, uploaders are liable, but so are platforms promoting illegal content if they know about it in advance and publish it anyway or if they are made aware of it post-publication and neglect to remove it.
This brings us to their second defence, incompetence. Given the high cost of human moderation, Pornhub employs software to find offending content, yet they hold themselves blameless when their software doesn't actually work. As Mark Zuckerberg has done so many times, Pornhub promised you that they'll do better. “Will do better” isn't a defence. It's a confession.
I wish Pornhub were an outlier, but it's not. In 2018, the U.S. National Center for Missing and Exploited Children received over 18 million referrals of child sexual abuse materials, according to the New York Times. Most of it was found on Facebook. There were more than 50,000 reports per day. That's just what they caught. The volume of user-uploaded, platform-promoted child sexual abuse material is now so vast that the FBI must prioritize cases involving infants and toddlers, and according to the New York Times, “are essentially not able to respond to reports of anybody older than that”.
These platforms also disseminate a great deal of illegal content that is not of a sexual nature, including incitement to violence, death threats, and the sale of drugs and illegal weapons. The Alliance to Counter Crime Online regularly discovers such content on Facebook, YouTube and Amazon. There is even an illegal market for human remains on Facebook.
The volume of content that these platforms handle does not excuse them from disseminating and recommending illegal material. If widespread distribution of illegal content is an unavoidable side effect of your business, then your business should not exist, period.
Can you imagine an airline being allowed to carry passengers when every other flight crashes? Imagine if they just said that flying is hard and kept going. Yet Pornhub and Facebook would have you believe just that: that operating illegally is fine because they can't operate otherwise. That's like saying, “Give me a break officer. Of course I couldn't drive straight. I had way too much to drink.”
The government promises new legislation to hold platforms liable in some way for the content that they promote and this is a welcome development. But do we really need a new law to tell us that broadcasting child sexual assault material is illegal? How would you react if CTV did? Exactly.
In closing, our research is clear. In Canada, platforms are already liable for circulating illegal user-generated content. Why hasn't the Pornhub case led to charges? Perhaps you can invite RCMP Commissioner Lucki to answer that question. Ministers Blair and Lametti could also weigh in. I'd be curious to hear what they have to say.
Don't get me wrong. The work that you are doing to draw attention to Pornhub's atrocious behaviour is vital, but you should also be asking why this case is being tried at committee and not in court.
Here's the question: Does Pornhub's CEO belong in Hansard or in handcuffs? This is a basic question of law and order and of Canada's sovereignty over its media industries. It is an urgent question. Canadian children, young women and girls cannot wait for a new law and neither should we.
Thank you very much. I welcome your questions.
John F. Clark
2021-02-22 11:25
Good morning, Madam Chair Shanahan and honourable members of the committee.
My name is John Clark. I am the president and CEO of the U.S.-based National Center for Missing and Exploited Children, sometimes known as NCMEC.
I am honoured to be here today to provide the committee with NCMEC's perspective on the growing problem of child sexual exploitation online, the role of combatting the dangers children can encounter on the Internet, and NCMEC's experience with the website Pornhub.
Before I begin with my testimony, I'd like to clarify for the committee that NCMEC and Pornhub are not partners. We do not have a partnership with Pornhub. Pornhub has registered to voluntarily report instances of child sexual abuse material on its website to NCMEC. This does not create a partnership between NCMEC and Pornhub, as Pornhub recently claimed during some of their testimony.
NCMEC was created in 1984 by child advocates as a private, non-profit organization to help find missing children, reduce child sexual exploitation and prevent child victimization. Today I will focus on NCMEC's mission to reduce online child sexual exploitation.
NCMEC's core program to combat online child sexual exploitation is the CyberTipline. The CyberTipline is a tool for members of the public and electronic service providers, or ESPs, to report child sexual abuse material to NCMEC.
Since we created the CyberTipline over 23 years ago, the number of reports we receive has exploded. In 2019 we received 16.9 million reports to the CyberTipline. Last year we received over 21 million reports of international and domestic online child sexual abuse. We have received a total of over 84 million reports since the CyberTipline began.
A United States federal law requires a U.S.-based ESP to report apparent child sexual abuse material to NCMEC's CyberTipline. This law does not apply to ESPs that are based in other countries. However, several non-U.S. ESPs, including Pornhub, have chosen to voluntarily register with NCMEC and report child sexual abuse material to the CyberTipline.
The number of reports of child sexual exploitation received by NCMEC is heartbreaking and daunting. So, too, are the many new trends NCMEC has seen in recent years: a tremendous increase in sexual abuse videos reported to NCMEC; reports of increasingly graphic and violent sexual abuse images; and videos of infants and young children, including on-demand sexual abuse in a pay-per-view format and videos showing the rape of young children.
A broader range of online platforms are being used to access, store, trade and download child sexual abuse material, including chats, videos and messaging apps, video- and photo-sharing platforms, social media and dating sites, gaming platforms and email systems.
NCMEC is fortunate to work with certain technology companies that employ significant time and financial resources on measures to combat online child sexual abuse on their platforms. These measures include large teams of well-trained human content moderators; sophisticated technology tools to detect abusive content, report it to NCMEC and prevent it from even being posted; engagement in voluntary initiatives to combat online child sexual exploitation offered by NCMEC and other ESPs; failproof and readily accessible ways for users to report content; and immediate removal of content reported as being child sexual abuse.
NCMEC applauds the companies that adopt these measures. Some companies, however, do not adopt child protection measures at all. Others adopt half-measures as PR strategies to try to show commitment to child protection while minimizing disruption to their operations.
Too many companies operate business models that are inherently dangerous. Many of these sites also fail to adopt basic safeguards, or do so only after too many children have been exploited and abused on their sites.
In March 2020, MindGeek voluntarily registered to report child sexual abuse material, or CSAM, on several of its websites to NCMEC's CyberTipline. These websites include Pornhub, as well as RedTube, Tube8 and YouPorn. Between April 2020 and December 2020, Pornhub submitted over 13,000 reports related to CSAM through NCMEC's CyberTipline; however, Pornhub recently informed NCMEC that 9,000 of these reports were duplicative. NCMEC has not been able to verify Pornhub's claim.
After MindGeek's testimony before this committee earlier this month, MindGeek signed agreements with NCMEC to access our hash-sharing databases. These arrangements would allow MindGeek to access hashes of CSAM and sexually exploitive content that have been tagged and shared by NCMEC with other non-profits and ESPs to detect and remove content. Pornhub has not taken steps yet to access these databases or use these hashes.
Over the past year NCMEC has been contacted by several survivors asking for our help in removing sexually abusive content of themselves as children that was on Pornhub. Several of these survivors told us they had contacted Pornhub asking them to remove the content, but the content still remained up on the Pornhub website. In several of these instances NCMEC was able to contact Pornhub directly, which then resulted in the content being removed from the website.
We often focus on the tremendous number of CyberTipline reports that NCMEC receives and the huge volume of child sexual abuse material contained in these reports. However, our focus should more appropriately be on the child victims and the impact the continuous distribution of these images has on their lives. This is the true social tragedy of online child sexual exploitation.
NCMEC commends the committee for listening to the voices of the survivors in approaching these issues relating to Pornhub. By working closely with the survivors, NCMEC has learned the trauma suffered by these child victims is unique. The continued sharing and recirculation of a child's sexually abusive images and videos inflicts significant revictimization on the child. When any website, whether it's Pornhub or another site, allows a child's sexually abusive video to be uploaded, tagged with a graphic description of their abuse and downloaded and shared, it causes devastating harm to the child. It is essential for these websites to have effective means to review content before it's posted, to remove content when it's reported as child sexual exploitation, to give the benefit of doubt to the child or the parent or lawyer when they report content as child sexual exploitation, and to block the recirculation of abusive content once it has been removed.
Child survivors and the children who have yet to be identified and recovered from their abuse depend on us to hold technology companies accountable for the content on their platforms.
I want to thank you for the opportunity to appear before this committee. This is an increasingly important topic. I look forward to answering the committee's questions regarding NCMEC's work on these issues.
Arnold Viersen
CPC (AB)
Mr. Richardson, I'll ask you this seeing as you're the tech guy here.
The big trouble we've been studying at this committee is around the age and the consent of the folks who are depicted in these videos. We hear a lot about how long it took to take a video down and things like that, but surely there would be methods of ensuring that these videos never show up in the first place.
I was wondering if you could comment on that. If you're bragging that you are the leading tech company in the world, surely there's technology to keep this stuff off the Internet to begin with.
Lloyd Richardson
2021-02-22 12:05
There is, but I would kind of invert that a little bit. It's not a technical issue.
Let's go back to the 1980s, before the popularized Internet: we had pornography, and we didn't see child sexual abuse material showing up in Playboy magazine. It's not necessarily a technical issue. If you're in fact moderating everything that comes onto your platform, this should never happen. We don't see child pornography show up on the CBC's services, because moderation happens; there is control over the content. That's not to say you can't leverage technology, as we do in Project Arachnid, to proactively detect known child sexual abuse material. But let's not fixate on the new and fancy: "Oh, I have an AI classifier that can automatically detect child pornography." That's great and all, but it's never going to detect everything, and it's not going to have the accuracy you get from actual human moderators looking at material. It's an addition to something that's already there, so it's important not to belabour the technological side of things.
Daniel Bernhard
2021-02-22 12:07
It was just to say that I agree. Platforms want to operate at a certain scale, which requires them not to validate any of the content that comes up, yet that seems to result in illegal outcomes. It's not really for us to say how they should deal with this, but simply that if illegal content is there, they should face the consequences.
To Mr. Richardson's point, I have one final issue. It's not just CBC, CTV and the like who make sure that their content is lawful. They also have to make sure the advertising they run is lawful, and that op-eds and other third-party contributed content are lawful. Otherwise they are jointly liable. This is how the law works, and I see no reason why it shouldn't apply in the case of Pornhub, Facebook, Amazon or any other provider that is recommending and facilitating illegal behaviour through its service.