
SECU Committee Meeting




Standing Committee on Public Safety and National Security



Monday, May 27, 2024

[Recorded by Electronic Apparatus]



     I call this meeting to order.
     Welcome to meeting number 108 of the House of Commons Standing Committee on Public Safety and National Security.
    Pursuant to the order of reference referred to the committee on Wednesday, December 13, 2023, the committee is resuming its study of Bill S-210, an act to restrict young persons' online access to sexually explicit material.
    Before we begin, I would like to ask all members and other in-person participants to consult the cards on the table for guidance to prevent audio feedback incidents.
     Please take note of the following preventative measures in place to protect the health and safety of all participants, including the interpreters. Only use an approved black earpiece. The former grey earpieces must no longer be used. Keep your earpiece away from all microphones at all times. When you are not using your earpiece, place it face down on the sticker placed on the table for this purpose.
     Thank you all for your consideration.
    I have a point of order, Chair.
    Today's meeting is taking place in a hybrid format.
    I would like to make a few comments for the benefit of members and witnesses. Please wait until I recognize you by name before speaking. As a reminder, all comments should be addressed through the chair.
    First, we have some bills to pay. The clerk distributed on Monday, May 6—
    Chair, can I raise a point of order?
    Mr. Genuis, go ahead on your point of order.
     Thank you, Chair.
     I know there's been some disagreement about aspects of the agenda over the last few meetings, but there have been some discussions among the parties. We may have an agreement; we may not. I want to propose something, and see if there's unanimous consent for it, and where we are at.
     I propose that there be unanimous consent for the following: That the draft report on the Bernardo prison transfer be immediately distributed; that the report be considered at the next meeting of the committee; that the chair seek additional resources in the week of June 3 for the study of S-210; and that the Minister of Public Safety be invited to appear for two hours in the week of June 10, with one hour dedicated to auto theft and one hour dedicated to the main estimates, and that the Minister of Transport be invited to appear separately for one hour on auto theft.
     I think that has been distributed, and I'm hoping there might be agreement on it.
     Is there unanimous consent?
    Some hon. members: No.
    The Chair: I see no unanimous consent. It's back to business.
    The clerk distributed on Monday, May 6—
    Mr. Chair, I have another point of order. I wonder if I could propose.... Is it possible to have a five-minute suspension for discussion on that?
     Why don't we get through a little bit of this, and then we'll go back to it. Maybe raise another point of order to see if we can do that.
    The clerk distributed on Monday, May 6, and May 24, a draft budget on Bill S-210 in the amount of $15,000.
     Does the committee wish to adopt the budget? Can I have a show of hands?
    (Motion agreed to)
    The Chair: Similarly, I note that the deadline to report Bill S-210 to the House is Friday, June 7.
     Is it the will of the committee to request an extension of 30 sitting days for the consideration of Bill S-210 and that the chair present a report to the House? May I have a show of hands on this, please.
    No, there isn't, Chair.
    All in favour?
    Chair, it's not a vote. You can't call a vote. If you're moving the motion, I'd like to speak to it.
    The chair is entitled to seek guidance from the committee whenever—
     You cannot end debate on it, and this will be an issue in the House if you do, Chair. You can't end debate on it.
     If this is a matter to be debated, then I'd like to debate it.
    Make it an issue in the House.
     You cannot arbitrarily cut off speakers or limit debate on a matter before the committee. That's a clear violation of the Standing Orders, and it is appealable to the House.
     If you'd like to move that an extension be requested, then I would like to speak on that matter, and I will be heard.
     I'm not making such a motion. I'm asking for guidance from the committee.
    There is not agreement of the committee.
    There are two ways that committees come to decisions, either by agreement or by a motion. If there is a motion before this committee, then I wish to speak to it.
     You cannot get agreement of the committee when members dissent unless debate ends. That is how the rules work.
    The decision of the chair is to seek advice of the committee on this matter.
     Do you wish to challenge the chair on the matter?
    Chair, I wish to speak on the motion before the committee. You need to respect the rules. This is not allowed, and I will speak to the motion. If you're putting a motion on the floor, on an extension, I wish to speak to the motion.
    The committee has before it a number of matters to consider, and those matters—
     I have a point of order, Mr. Chair.
    Go ahead, Mr. Noormohamed, on a point of order.
    I don't believe there was an actual motion on the floor, so there should be no debate.
    There is no actual motion on the floor.
    There is not agreement of the committee.
    Excuse me. Please don't interrupt.
     I've asked the committee for advice, for guidance—
    On a point of order, Chair, there is not agreement of the committee to proceed in this fashion. The committee does not agree. You cannot demand agreement without allowing people to speak on it.
    If you let the clerk speak, he will confirm exactly what I'm saying.
    The chair—
    The chair has to respect the rules.
    The chair has decided to ask the committee for guidance.
    The chair must respect the rules.
    Is there a motion on the floor?
    If you wish to challenge the chair on this matter, please do so.
     Chair, is there a motion on the floor?


     There is no motion on the floor.
    Then there is no agreement.
    Chair, you will respect the rules.
    Back up from the mike, please.
    You need to respect the rules, Chair.
    The decision of the chair is to seek advice of the committee—
     Chair, is there a motion before the committee or not?
    Mr. Genuis, the chair has the floor. You do not.
    The chair is asking the committee for advice on a matter that is of time urgency.
     The committee does not agree.
    You do not speak for the committee. I will ask the committee. If you wish to challenge—
     Sir, you cannot presume the agreement of the committee without a motion. I'm not going to back down on defending the rules or the prerogatives of this committee. I know the rules better than you do, Chair. Ask the clerk. Get the clerk's opinion on this, because the clerk knows that I'm right. The Speaker will know that I'm right.
    For you to think that just because you sit in that chair you can show flagrant disregard for the rules and the way they protect the privileges of members is disgraceful.
    If you want to put a motion before the committee, then we can have a motion and we can debate it. If you are not putting a motion before the committee, then you have no agreement of the committee.
    I have a point of order, Chair.
    Ms. Hepfner, go ahead on a point of order.
    It seems to me that in this instance the points of order are being used to talk over the chair.
    Mr. Genuis keeps taking control of the floor from you, Chair. I'm finding it very difficult to follow. He keeps talking over you.
    Is that part of the rules of the House? Can you talk over the chair? If you want to bully your point across, do you just take the floor? Is that how it works around here?
    I'm finding this very difficult to follow, Chair, because you keep getting talked over.
    Thank you very much for your point of order.
    The committee does not agree.
    Mr. Genuis, I have the floor.
    In the same manner as I just obtained the committee's approval of the budget, I'm asking for the committee—
    No, Chair, there was unanimous agreement on the budget.
    I have a point of order, Mr. Chair.
     They may have voted unanimously, but I did not ask for unanimous consent.
    The committee agreed unanimously on the budget.
     It is noted in Bosc and Gagnon that any efforts to undermine or disrespect the authority of the chair can be considered disorder and misconduct. At this stage, Mr. Genuis's language and his conduct towards you and the functioning of this committee, without having given you the opportunity to finish what you were saying, would appear to be sitting somewhere in that zone.
     On that point, Chair....
    We'll suspend for a couple minutes so we can discuss your other matter, but meanwhile, the chair is—
    There is not agreement.
    You do not agree.
    The committee does not agree.
    You do not agree.
    The committee does not agree. There is no agreement of the committee.
    We'll suspend for five minutes.



     I call this meeting back to order—I hope.
    There were a lot of serious negotiations in the back room.
    Mr. Genuis, did you wish to speak on that point?
    Yes, Mr. Chair.
    I think there's an agreement that we distribute the draft Bernardo report, that an auto theft report be drafted and distributed, that the study of Bill C-70 begin on Thursday, and that we proceed with the hearings as scheduled today.


    Do we have unanimous agreement?
That the draft report on the study of rights of victims of crime, reclassification and transfer of Federal Offenders, be distributed immediately to committee members; that the analysts prepare a draft report on the study of the growing problem of car thefts in Canada and distribute it; and that the study of Bill C-70, An Act respecting countering foreign interference, be initiated on Thursday, May 30, 2024.
    Some hon. members: Agreed.
    The Chair: There's a question from the clerk about the auto theft report. We haven't given instructions yet to the analysts.
     The analysts are very wise and capable people.
    Well, that is true. That is an absolute. Let's leave it at this point, where they can go forward as best they can, because they teach us, anyway.
     That being the case, we have unanimous consent for that agreement.
    Are you clear, Mr. Clerk, on what that is?
    Okay. Very well. Let us get back to Bill S-210.
     Let me first acknowledge that Mr. Genuis was not wrong before. I hate to say that, but it's true.
    Mr. Chair, I want to acknowledge that it was very gracious of you, and I want to compliment you for saying so. I want to say it's a credit to your character to acknowledge what the rule is. I appreciate it.
    We're going to start hiring you for procedural advice.
     Mr. Noormohamed, go ahead, please.
     I just want to say that we all managed to do this together. I think it is a very important signal that, when we have a commitment to do something together and have common purpose, we can actually get stuff done. I want to thank my former colleagues from this committee, whom I miss dearly, for allowing us to do this and for all working together.
    We're all going to the Canadian heritage committee now.
    Actually, this process is not particularly unusual for this place at this time of year, nor is the sudden rapprochement towards the end.
    To our witnesses, thank you for waiting. We'll continue with your remarks now, followed by questions. You were slated for a full hour. We have resources to give you that full hour. Mr. Dufresne is out there with Ms. Lara Ives. I've advised them that, when we start, we will bring them in an hour later, so we will have that full panel, as well.
    That being the case, I would like to now welcome the witness who is the sponsor of Bill S-210 in the Senate. I should note that Senator Miville-Dechêne has been shepherding this bill through the various processes since 2020. I think she's launched it four times.
     Anyway, we're glad to at last have you able to speak.
    We have the Honourable Julie Miville-Dechêne, senator. We also have, as an individual, Jérôme Lussier, director of parliamentary affairs for the office of Senator Miville-Dechêne.
     I now invite the senator to make an opening statement of up to five minutes.
    Please go ahead.


     Thank you for the invitation to talk about Bill S-210. I'd be happy to answer questions in French, but I'm going to give my speech in English, because that's the language in which most of the criticism has been voiced.
    To answer your question directly, let me say the following: This bill has been the subject of two studies at committee, because of the election. We have heard from 24 witnesses, and there were 28 briefs. So we can say that this bill has been thoroughly studied in the Senate.


     Bill S-210 seeks to apply to online porn the rules that normally apply off-line. The bill does three things. First, it requires websites that offer porn to verify the age of users before they can access that content. Second, it sets up an enforcement mechanism that can result in non-complying websites getting blocked in Canada. Third, it provides that acceptable age verification methods will be decided in regulation, to be adopted after consultation and with input from experts. The bill specifies that any approved method must be reliable, collect information solely for age verification purposes, destroy any personal information once the verification is done and comply with best practices.
    Polls indicate that close to 80% of Canadians support age verification to access porn online, but Bill S-210 has been attacked, and I want to correct some misrepresentations.
    It has been said that no country has done this. This is false. Germany, France, the U.K., the European Union and several U.S. states have passed laws and regulations to impose age verification to access porn. Spain is expected to launch a pilot project soon. Australia, which had paused this work, announced last week that it would move ahead.
    It has been painted as a partisan or ideological bill to control sexuality. False. Age verification is supported by the socialist government of Spain and the conservative government in the U.K. In California, an age verification bill was recently approved unanimously by two legislative committees. This is not partisan legislation.
    It has been called an attack on free expression. False again. Bill S-210 would not affect the availability of porn for adults; it would simply prevent children from accessing it. In Europe, porn sites have challenged age verification laws and have failed at every stage. In the U.S.—a country known for its robust speech protection—porn sites have challenged the laws and they have failed all the way to the Supreme Court.
    It has been said that age verification would mean submitting personal identification to porn sites. False again. In Europe and elsewhere, age verification is typically done by third party companies using methods that transmit no personal information to porn sites. These are among the best practices we would expect in Canada as well.
    It has been said that this bill would block all forms of nudity. False. The bill uses the standard definition of pornography found in the Criminal Code. The bill also provides for usual exceptions for art, science and education.
    It has been said that this bill would impose age verification on all websites. False. Bill S-210 only requires age verification to access porn content. If a website contains porn and non-porn content, age verification is only required to access the porn content.
    It has been said that there's no way to check someone's age without compromising privacy. False. France is developing a double anonymous method. The U.K. regulator has recommended age estimation approaches that collect no information. Australia has explained its recent decision to move ahead with age verification by saying that the technology it looked at only a year ago has already improved.
    Finally, it has been said that age verification is useless because kids will find ways around it. This is once again false. Actual studies show that only a small number of children know how to evade these restrictions. It's possible that some older teenagers and adults will use VPNs to bypass age verification, but it's highly unlikely that large numbers of eight-, 10- and 12-year-olds will do so.
    I will be happy to take questions, but please, please, don't let Canada be the last place on earth where pornographers are more protected than children.


    Thank you very much. That was 11 seconds over time. That's pretty good; we'll let it go.
    We'll go to questions by Mr. Genuis for six minutes, please.
     Thank you, Senator.
    I'm going to go quite quickly through a series of questions. I'd be grateful if you could be as short as possible—even providing a yes or no answer where appropriate.
    You said that for the Senate study, there were 24 witnesses and 28 briefs. Is that correct?
     Yes. It was in two committees, because we had to do it twice. The bill passed twice in the Senate.
    After those hearings, the bill was passed unanimously and unopposed by all senators. Is that correct?
    Does Bill S-210 create or call for the creation of a digital ID?
     Absolutely not.
    Would it be possible to use the provisions of this bill to create a digital ID?
     No. There's zero mention of a digital ID in this bill.
    Thank you.
     Bill S-210 requires that some age verification method be used, but it does not prescribe the method. Is that correct? Why not explicitly define the age verification method in statute?


     Well, this is how we write proper legislation. Age verifications are technical by nature. We have said in the bill—it's very clear in clause 11—that precautions have to be taken, best practice has to be used and privacy has to be respected. But to choose an age verification method in the bill would be a grave error, because those techniques change. We want it in the regulation that the specialists, the experts and the government choose the best age verification method at the time that they will act on it.
    It's as technology evolves. Okay.
    Are there age verification methods that do not involve identity verification?
     Yes. I can quote from Ofcom, which is the regulatory agency in Britain, on the kinds of age assurance that could be highly effective. They're at the stage of putting in regulation. They have passed a whole bill about it. They mention face age estimation. You've heard about age estimation. It works by analyzing the features of a user's face to estimate their age, but there's no collection at all of identity. It's done through estimation, plus or minus two years.
    That's one of them. There are obviously others.
    So there are other methods, besides facial estimation, that would still provide age estimation without identity verification.
    Could the verification process be done by a third party, such that a person's identity could never be linked with their pornography use habits, even if there were a data breach?
    Once again, if you look at our bill, clause 11 says that “best practices” have to be followed. Obviously, third party verification is one of the best practices. This is what Germany, France and the U.K. are proposing or have implemented already. The idea, obviously, is not to give any personal data to either porn sites or platforms on the web—
    Just to clarify, that separates the verification from the site.
    Hon. Julie Miville-Dechêne: Yes.
    Mr. Garnett Genuis: You get confirmation of age, and then you take that somewhere else.
    Well, it's called, in fact, double anonymity, because on one side, the third party does the verification with all the best practices and submits to all the privacy laws. It just sends to the porn sites a token. The only question is whether this customer is 18 years old or more. The porn site never knows who is coming to the porn site.
    That's excellent.
    Would the Privacy Act still apply? Does this bill amend or weaken in any way the protections associated with the Privacy Act?
    It applies fully.
     Thank you.
    Would this bill apply to educational materials, artwork involving nudity or other depictions of the human body that are not pornographic in nature?
    No. As I said earlier, in clause 6 there's an exception for educational, scientific and artistic material. All three are excluded from any age verification.
     Thank you.
    In the absence of meaningful age verification, what is the average age today at which children in Canada are accessing pornography?
     This is one of the worrisome figures. The average age, according to the studies, varies between 11 and 13 years old, but you should know that in terms of the first encounter with porn, 50% of 13-year-olds, 27% of 11-year-olds and 10% of nine-year-olds have already watched or been in contact with porn. We're talking here about huge figures.
     Senator, that is horrifying to me. Those are absolutely horrifying numbers.
    Do you believe showing children sexually explicit pictures or videos is a form of child abuse?
     I do, and because the Canadian Centre for Child Protection has an extremely good reputation in Canada, I will quote some of what they've said about the fact that all children have access to porn without any kind of verification of their age.
    The centre says the harms to children exposed to sexually explicit material include “difficulty forming healthy relationships”, “harmful sexual beliefs and behaviours” and “a normalization of sexual harm.” Because porn is there, it's normalized; the porn they see, which can contain violence and degrading acts, is normal for them, so obviously they're more susceptible to being harmed by a pornographer.


    Thank you, Mr. Genuis.
    We will go now to Mr. Noormohamed for six minutes, please.
     Thank you so much.
    It's nice to finally get to a place where we can have a good conversation about this bill.
    Senator, I think we all agree that what you are trying to accomplish is very important, and I think there are many means by which to do that. I would submit that Bill C-63 takes into consideration many of the issues you seek to resolve.
    One of the concerns I did want to hear from you on is the whole issue of online privacy. Could you briefly explain what impact this bill might have on online privacy for Canadians? Would there be any concerns with respect to the privacy of online users?
     I have to say, first of all, that Bill C-63 doesn't talk about age verification. There's nothing in this bill about age verification—the words are not even used—and there's very little on pornography. Bill C-63 talks about the very vague concept of “age appropriate design”. It says there should be age-appropriate design; I'm sorry, but age-appropriate design is not age verification. It could be at some point if a committee so decides, but it's not in the bill. That's the first thing I wanted to say.
    Regarding privacy, we have laws in Canada. Why would—
    If I could, though, one of the concerns about this is the whole issue of personal information. Is that right?
    Yes. I'm going to answer that now.
    First of all, on this question, obviously we have privacy laws in Canada. Why would those privacy laws suddenly not apply to any company doing third party verification? That's nonsense.
    I will quote some people who have more expertise on that than I do. The Age Verification Providers Association says, “Privacy-preserving age verification technology is already used at scale; it is highly effective and our members have already completed over a billion age checks.” The AVPA also says that, “Age verification can be designed to completely protect the identity of the users”.
    On that basis, who would design the technology you're proposing? What technology are you proposing would be used? Would it be private sector or government technology?
    No, it would not be government technology.
    The private sector would be the entity that—
    Hon. Julie Miville-Dechêne: First of all—
    Mr. Taleeb Noormohamed: Let me finish the question.
    What you're saying is the private sector would have access to individual Canadians' data. Is that what you are suggesting?
    No, I'm suggesting that all of those questions will be resolved in the regulations. What I'm saying to you is that we have, in clause 11 of the bill, five principles that will ensure privacy will be respected. I will read you the main one about the age-verification method that will be chosen. It says it “maintains user privacy and protects user personal information”.
    On this point, though, one of the concerns people have raised is that subclause 11(2) permits the use of what some would argue are inherently unsafe age-verification methods by Internet services, whether those are the uploading of data or scanning of social media activity.
    How do we assure Canadians they would not have to turn all this information over in order for this age verification to take place?
    One of the things I'm trying to make sure of is that we achieve the goal.


     I'm not following you on subclause 11(2). What are you saying here?
    Under subclause 11(2), as I read it—and it could be my read of it—it would permit the use of any number of options in terms of age verification.
    The options would have to follow those rules because they're in the bill. No method that doesn't respect privacy could be chosen in regulation because regulation, as you know, has to follow the principles that are in the bill.
    Let me try to simplify this.
    In order for someone's age to be verified, who would control, hold, protect and safeguard the data that is shared?
     The way it is done in other countries...for example, Germany has a system of accreditation. Companies that specialize in age verification can apply, and they would need that accreditation to do age verification on Canadians.
    In that regard, though, you've seen age restriction technologies and policies challenged in courts and jurisdictions in the U.S. and in the U.K.
    How do we make sure that this doesn't end up with a challenge in the courts?
    Up to now, in the U.S., the porn companies that have sued have lost. Now in the U.S., which is really very strong in terms of freedom of expression, there's already a court decision that says that those laws are not unconstitutional. That's really interesting because years ago it was the contrary.
    Now the problem of young people—children—watching porn is so prevalent that the courts have obviously changed their idea on it. I would say also—
     I want to be clear that nobody here is saying let's give children unfettered access to porn, so let that not be the message you're hearing.
    What I am trying to do is to make sure that whatever we do does not end up being challenged successfully in the courts.
    For example, you've talked a little bit about the U.S. In the U.K., there has been a series of challenges and other jurisdictions have seen challenges around this. In Canada, we have an ongoing debate in respect of people's privacy.
    How do we ensure that members of the LGBTQ2S+ community in particular, and others, do not find themselves on the wrong end of the long arm of the law? To be clear, this is not about saying let us give unfettered access to pornography to children. Let that not be the takeaway.
    My question is, how do we handle this in a way that protects young people, that provides people the assurance that what is being done here falls within the grounds of Canada's privacy laws, and that we're doing this in a manner that is consistent with the law?
     I'll let the Senator answer, and then we'll go to Madam Michaud.
    On the question of lawsuits, in France and Germany, all the lawsuits have been lost by the porn companies, so we are not facing a great challenge at this point.
    On your second question, this bill's goal is to protect all children—all minors—from porn. What you're saying here is that members of the LGBTQ community could be particularly vulnerable to the privacy issue.
     I haven't seen any study on that. We are saying that privacy will be respected. On betting and on all kinds of other questions, we are giving some of our identity to a lot of people. Why would it be different here and why would this community be targeted?
    I'm sorry, I don't get it. In Canada, with the laws we have, what makes you think their names would come out?



    Ms. Michaud for six minutes.
    Senator, thank you for coming. I'm eager to hear you speak about Bill S-210, an important bill that aims to restrict young people's online access to sexually explicit material. We could also say that it aims to protect young people from learning about sex from online porn.
    I liked the way you described the bill when you said that its purpose is to do online what we do offline. Things were much easier when children weren't allowed to go to convenience stores to buy magazines containing pornographic material. Now it's a little more complicated, with everything available online.
    You will have noticed though that this bill does not meet with unanimous approval. The government voted against sending it to a parliamentary committee for study. It's thanks to the vote of the other opposition parties that the committee is able to study this bill today. I've read it and I think it's a good bill. In fact, the Bloc Québécois supports it.
    Age verification is obviously not a simple matter. We have read about what's being done in other countries. You mentioned Germany, the U.K., France, some U.S. states, Spain and Australia, among others. From what I have read about the concrete measures taken by those countries, I note that the law has yet to be applied in many cases, or that it will only be applied in future pilot projects. So we don't necessarily have a clear indication of what is being done and what Canada could do, or examples from which to draw inspiration.
    I'd like to hear your comments on a question you raised earlier, and I'll allow you to answer in French: Why did you choose to have the operating provisions in the regulations rather than right in the bill?
    What kind of regulations would you like to see the government put in place? Aren't you worried about putting everything in the regulations, given that the government doesn't want anything to do with Bill S-210?
     I'm going to start by talking about other countries.
    Germany has been verifying age for many years by means of regulations. We also heard testimony on this subject at the Senate. There have never been any leaks of personal information. There are 80 different systems, implemented in collaboration with accredited third parties. So it's a system that works, because this country obviously started regulating before Pornhub came along. However, international platforms refuse to comply.
    As for Great Britain, a first bill wasn't successful. A second one has been passed, and they're working on regulations, which will be forthcoming. I know their bill was one of the models that the Canadian government looked at, because it's quite rigorous, it took years to prepare, and it's excellent. It deals extensively with pornography and makes clear that both pornographic sites and social media have to be addressed. Kids are looking at pornography on Twitter as much as on porn sites. So we have to worry about the whole picture.
    So there are countries that do verify age, but you're right: We're still in the early stages. One cannot say that Germany is a tiny country, though, and its system works. France will be moving ahead in September.
    To come back to your question about regulations, it's true that the government has said quite clearly that it doesn't like my bill. However, I'm one of those people who believes that governments have responsibilities. I trust that, once the bill becomes law, the government will consult experts and look for the best age verification methods, ones that will best protect Canadians. So I don't see why there should be any bad faith with respect to regulations, since creating a poor age-verification system won't benefit a single Canadian.
    Thank you.
    In your opinion, would Germany be the best example to follow?
    No, it's one example.
    I would say that the British law as drafted is far more similar to those we produce in our legal system and government.
    In the regulations to the act, there is now talk of age estimation, age assurance. Yes, there are ID cards, and yes, there are more traditional methods, but there is also age estimation. Without collecting any data, this method estimates a person's age to within two years. If this method doesn't work, you can switch to another method.
    What's very interesting about the British is that they say it's the responsibility of those who distribute pornography to ensure that these methods, which would normally be applied by third parties, work. So we're not taking responsibility away from the porn sites. We're still saying that it's up to them, if there's a choice of methods, not only to choose one, but also to make sure it works.


    I only have a few seconds left, but I want to hear your thoughts on the fact that, according to the government, we don't need Bill S-210, since there's Bill C-63. To my knowledge, they're not the same at all. Bill C‑63 is extremely important, to be sure, but it's not identical to Bill S‑210. Do you share that opinion?
     I supported the first part of Bill C‑63, which deals with the whole issue of children. There are some very good things in there but, strangely, not a word about age verification or pornography, which are nonetheless a major source of harm associated with social media. I'm very disappointed. No matter how many times I'm told that the commission that will be set up one day might decide that age verification is the right way to go, it doesn't reassure me. This is all setting us back.
    For me, this is a pressing public health issue. For the past 15 to 20 years, our children have been subjected to unrestricted pornography, and it's changing the way they view sexuality. Sexuality should be a wonderful thing, and I'm all for healthy sexuality, but that's not what pornography shows. I understand that it's legal for adults, but it was never meant to be consumed by children or used to teach them about sex.
    Not only does it take away all their dreams and any mystery, but it also makes them adopt appalling behaviours that are more violent. According to a study that completely blew my mind, over 47% of young men and women say that sexual relationships are necessarily violent, and that girls expect it. Look where we've ended up.
    Thank you.


     We'll go now to Mr. MacGregor.
    You have six minutes, please.
     Thank you very much, Mr. Chair.
    Thank you, Senator, for joining us today to help inform the committee on this bill.
    I voted in favour of this bill at second reading. I agree with the principle. Many of the members are parents—not only members around this table, but members in the House of Commons—so we come at this not just with a professional interest, but also with a personal interest.
    That being said, it's generated a lot of correspondence from a lot of my constituents and, indeed, many people who know I'm a member of this committee. I come at this personally by trying to balance parents' real concerns over children's access, but I also want to find out a bit more about the people who are raising privacy concerns. That's why I want to pay particular attention to this study.
    One of the questions I have.... We have received a number of briefs on this bill, as well as recommendations. One of the recommendations that came was with regard to the ability to verify users' ages at the point of access on devices, rather than on websites.
     Are you familiar with this technology?
    Yes. It's—
     Can you explain the differences, and why you chose one path over the other?
    This technology is promoted by Ethical Capital Partners, which is now the new owner of Pornhub.
    They've been going around Parliament to promote this technology. When we met, I said to them, “Listen, if you think you can convince the big companies—Google and others—to do it, that's fine, but at this point, we have porn sites that are responsible for having let children watch porn for the last 20 years. They are saying, 'Oh, no. We're not the ones that are going to verify their age. Let somebody else do it'”.
    Obviously, to start with, it's a little strange, and then there's this whole idea of having.... I have nothing against having some age verification on the telephone, but nobody does it around the world. This technology may be possible one day, but at this point, it's not the case.


    Is that because of the sheer variety of devices out there? Some people might be using a laptop, a phone or an old desktop computer.
    It's because you would have to have a lot of.... Google, Microsoft and all of them would have to agree to do that, and we have no indication at all that any country is using this method. Nobody has taken this route yet.
     I'm not saying it's not a great route, but it doesn't exist now.
     I want to get into some of the weeds of the bill. When I go through your bill, on page 3, there is subclause 6(2), "Defence—legitimate purpose":
No organization shall be convicted of an offence under section 5 if the act that is alleged to constitute the offence has a legitimate purpose related to science, medicine, education or the arts.
     If I were to go out in the street and ask people about the term “legitimate purpose”, I think it's a term that's very open to interpretation.
    Can you explain how you see it, as the bill's sponsor?
    I understand that this defence is used in different circumstances. Obviously, there is a question of interpretation. People say that simply showing a nude on social media would be problematic, but not only do they misunderstand this particular definition, it's larger than that. Sexually explicit material does not refer to simple nudity. It's a term in the Criminal Code, and this definition is very specific.
     I will read it in French. I'm sorry; I have to—
    I don't need you to read it.
    I have it in front of me, and it's in the Criminal Code.
    The goal has to be sexual in nature. It's not only nudity; it has to be there for excitement.
    You feel that clearly helps define the term “legitimate purpose”, the fact that it is in existence in the Criminal Code. Okay, that's good. That's what I wanted to know.
     The other part I wanted to get to is on page 5, paragraph 9(5)(a). This is about the effect of the Federal Court order. In this section, it says that the court order may have the effect of preventing persons in Canada from being able to access material other than sexually explicit material made available by the organization that has been given notice.
     What other material do you think people could be prevented from accessing, and why did you include this section in the bill?
     Can you clarify this a little bit more for me?
     Okay. If you'll permit, I will answer in French on that.


     First, it's important to understand that, before a website can be blocked, the person responsible for it will receive a detailed notice. They will then have 20 days to decide whether or not to comply with the law. After that, the case will be referred to the Federal Court, which will deliberate to decide whether we've really reached the stage where the site needs to be blocked. I want to say this because it's a method that is in keeping with Canadian legal standards, which protect those whom we want to punish by giving them the recourse to speak in court and defend themselves.
    If, after all that, the pornographic site is deemed not to be taking action, the question of blocking arises. When an ISP is told that it must block a site, it can do so. If a website contains both pornographic and non-pornographic material, once you block it, you're blocking more than just the pornographic content. In this case, the reason is simple: it's up to the site to decide whether or not it complies with the law. Let's take the analogy of a bar. If a bar lets minors in and lets them drink alcohol, the bar will eventually lose its liquor licence, preventing it from serving alcohol not only to minors, but also to adults.



     Thank you.
    Mr. MacGregor, thank you.


    I'm sorry; your time is up.


    Mr. Genuis, you have five minutes.
     Thank you, Senator.
     I think it's really important to clarify this issue of site blocking, because I think there's been some misinformation around it. To be really clear, and perhaps you can clarify this, but site blocking is only a penalty for those who persistently violate the law. If I commit a crime, I might go to jail, and if I'm sent to jail, that violates my mobility rights and my ability to do all kinds of other things that I would normally be able to do. That's because I've committed a crime, and I'm being penalized for it.
     In the same way, this bill isn't about site blocking in general. It only allows that to be used as a penalty for those who persistently refuse to follow the law.
    Is that correct?
     This is correct.
    I could add here that, in Germany, Twitter was carrying both non-porn material and porn material. The government said, "You're not doing age verification." Finally, what happened was that Twitter dropped all of its porn content. They decided by themselves because they didn't want to be subject to the age verification law.
    Twitter just took 60 pages off its site in Germany, so regulations work. That's what I want to say here.
    Thank you.
    On the social media issue, some critics have said that this would apply to social media that chooses to host pornographic content and show it to children, but the point, it seems to me, is that if you can't serve alcohol to children in a bar, you also can't serve alcohol to children in an ice cream stand. If we determine that an activity should be age-protected, then that age verification necessarily has to exist in every context where that content could be shown to children.
    The solution is that it's not age verification for those sites in general, but it is specifically for the pornographic content we're talking about, because Twitter shouldn't be allowed to show pornography to nine-year-olds and neither should any other website.
    It's fairly easy to do because, if you've been on Twitter and you've tried to find porn, you will see that there's a page that announces that sensitive content is coming.
    Twitter knows where it is, so it's easy to find and it can be age-blocked, age-verified—all of those things work.
     I would also expand on this analogy: children can't go into a bar and drink, but they are allowed to go to a restaurant, where the server is not allowed to give them a drink.
    It's not so much where, but it's the pornographic content that we're aiming at. This is where it has to be stopped because it's obviously not only with porn sites, but also on social media platforms.
    This is where, many times, children have their first opportunity—even through pop-ups—to see pornography.
    It seems obvious to me that in a world of rapidly evolving technology, there's no way you could put in the text of legislation precisely the kind of age verification method that would be used. That would forestall the process of gradual improvement in effectiveness.
    For those who are skeptical of government power in general, what are the safeguards on the regulation-making process that are in the bill and that exist in general?
     I would go back to clause 11, because this came out of the process in the Senate. This was an amendment because, obviously, we heard the same concerns, so we said, let's put real safeguards in the bill.
    They are in clause 11. The method that will be chosen has to be reliable, maintain user privacy and protect users' personal information, collect and use personal information solely for age verification purposes, destroy any personal information collected for age verification purposes once the verification is completed—we're talking here about destroying it—and then generally comply with the best practices.
    In truth, how could we have chosen an age-verification method? I've been on this for four years and the world has evolved and technology has evolved.


    Senator, that's extremely clear in the text of the bill.
    To conclude my questions, how do you account for the volume of misinformation and outright disinformation?
    I think some of it may be coming from certain industry groups, but how do you account for the volume of just nonsense, frankly? There are some people who are scared about what would happen if this bill is passed because they don't understand what it's about at all.
    How do you account for that, and what's your response to that?
    It's easy to scare people with disinformation. For me, I'm incapable of understanding that this falsehood becomes more important than protecting all children.
    As we all know, our rights, defended by the charter, are not absolute. Yes, freedom of expression exists, and adults will continue to be able to watch porn, but what about the children? They also have rights.
    The United Nations is asking countries now to have age-verification systems. When I started this bill, it was not as well known. Now that time has passed, it's a real public safety issue, and I think it has to be addressed.


    Thank you.


     Mr. Gaheer, go ahead, please, for five minutes.
    Thank you, Senator, for appearing before the committee.
    The legislation you've proposed doesn't include a definition of sexually explicit material or pornography, but rather it refers to a section of the Criminal Code about making sexually explicit material available to children. Is that right?
    Pornography, sexually explicit material, doesn't exist as such in the Criminal Code with respect to adults. It exists only with respect to child pornography. The known term is “sexually explicit material.” I would read you the definition, because when we read—
    Why didn't you provide a definition of that in your proposed bill? Why are you relying on the definition?
     This is the way it's done. In the definition, I'm referring to sexually explicit material. In bills, you don't give the Criminal Code definition. It stands by itself in the Criminal Code, because if the definition is changed, you don't want that to affect your bill.
    Sure, that's fair.
    I want to read into the record what it is.
    No? You don't want me to read it?
    It's just that I have limited time. I do want to get on to my question if that's okay. I apologize.
    As you are relying on that external definition, were you aware that the majority of online streaming organizations—we can think of household names like HBO, Hulu and Netflix, which are commonly available to families across this country—are not accessed for their pornographic content? They do host content that sometimes does contain sensitive aspects and some aspects of sexually explicit material. Were you aware that they would also be implicated within the wording of the bill as it's currently presented?
     No, I didn't consult with them.
    I think you have to understand the definition of sexual—
    I apologize—
    It's not just nudity and it's not just sensitive material.
    I will quote Professor Trudel, who's a specialist on these questions:


In Sharpe, the Supreme Court of Canada clarified that the expression “explicit sexual activity” refers to intimate sexual activity represented in a graphic and unambiguous fashion, and intended to cause sexual stimulation to those who consume such material.
    We're not talking about trivial images here.



    I'm sorry, but I do have to interrupt.
    The aspect I'm getting at is that when you have online content, especially on Netflix, HBO or Hulu, the interpretation could be different depending on whether it's meant to cover a scene or an entire movie.
    Do you believe that the non-pornographic content in these shows that are available across this country should be evaluated and regulated and should carry the same legal liability as would content that you propose to capture under your bill?
    I don't really get your question, in the sense that there are two different things. You know, scenes of nudity or scenes of love are not the same as sexually explicit material. It has been defined in the Criminal Code and refined by the jurisprudence.
    We're talking here about the equivalent of pornography. It's just that the expression used in the Criminal Code is “sexually explicit material”. It's the same thing.
     The worry is that the wording of the bill as it's currently presented is overly broad and will capture these areas of content that are widely available in Canada.
     This is the definition in the Criminal Code. There's a definition of pornography. It's “sexually explicit material”. If you don't use this definition, what else do you use?
    The idea here is to protect the children, the minors, from pornography. This is the tool we have that has been of service for decades on that matter.
     Sure. As this bill was being drafted, and the language was being drafted, were there any consultations done, or any sort of work done to ensure that content that's not supposed to be covered is not covered?
     Yes. I told you that Pierre Trudel was one of the experts testifying, and we have, obviously, consulted with experts.
    I have been working on this bill for four years, so, obviously, it's not a perfect bill. Sexually explicit material refers to things that are there to excite with close-ups of sexual organs. We are not talking here about very rosy pictures. This is very different. I have watched pornography to prepare this bill. You would not find it difficult to differentiate what you see on HBO and what you see on a porn site. I'm sorry to say that it's very, very different.
    Thank you.
    We will have to wrap that up.


    I'll now give the floor to Ms. Michaud for two and a half minutes.
    Thank you, Mr. Chair.
    Senator, this is a very interesting conversation. I imagine that it's something you've had to constantly repeat over the past four years.
    You said that the bill had been the subject of two studies at the Senate. Inevitably, changes have been made, including to section 11. You talked a bit about it earlier, saying that there were concerns about access to personal information or privacy.
    Could you tell us about the changes made to the original bill and whether, in your opinion, those are changes for the better?
     Yes. I totally agreed with the changes to the first version of the bill.
    Let's not forget that I started to work on this bill during the pandemic when everything was so complicated. We made corrections. In particular, I introduced amendments to correct two errors.
    The first mistake was that, initially, individuals and not organizations were named in the offence. That created the risk of complicating the lives of sex workers in particular. We didn't want to target those individuals, but rather the organizations stipulated in the Criminal Code. So we made that change.
    We've also greatly strengthened the second section of the bill to allow recourse to the Federal Court. Initially, as in some countries where a site is blocked when an offence has been observed or when a site doesn't comply with the law, a site could be blocked if the designated authority deemed it necessary. We added recourse to the Federal Court precisely to ensure that there would be no political or ideological intervention in these decisions, and that they were really made on the basis of facts that could be cross-checked by the court.


     The bill sets out the date on which the legislation, but not the regulations, would come into force. In a bill such as this, is it possible to stipulate that the regulations on age verification must come into force before a specific year?
    We worked on Bill C‑21, but the regulations that followed the passage of the old Bill C‑71 weren't even in force yet. It took several years for that to happen.
    How can we make sure that the government works quickly on an age-verification system if this bill ultimately passes?
     You have to live on hope. You're absolutely right. Age verification has to be perfected and the regulations have to be ready for this bill to come into force. I think one year is a reasonable period, given that we'll be able to see what's being done in France and Great Britain, two countries we know well and which will have all the necessary framework and regulations in place within the coming months. The government will therefore have examples to follow.
    Thank you, Ms. Michaud.


     We will go now to Mr. MacGregor, for two and a half minutes, please.
     Thank you, Mr. Chair.
    Senator, I think the one part I wanted to focus on—as my colleagues have—is the regulations section of this bill and specifically clause 11. Now, subclause 11(1) is very easy to understand, but on subclause 11(2), I had some questions about the wording, because it says:
Before prescribing an age verification method under subsection (1), the Governor in Council must consider whether the method
    Then you have points (a) through to (e). I question the choice of the wording “must consider”, because that seems to give the government a way out, like, “Oh, we considered these five points”. Wouldn't it be stronger to say something like “the Governor in Council must verify that it is reliable, that it maintains privacy, that it collects and uses personal information solely for age verification”?
     For me as a parent looking at this, if I actually had wording in the bill that the Governor in Council must do this rather than consider it.... Do you see where some people might have concerns? The language in the bill might be giving the government a way out. Yes, it has to consider it, but it doesn't have to verify that it's actually in place.
     I would say that obviously "consider" has some strength, but the age verification methods are all different. From my reading, some of those won't apply. If you do age estimation, for example, you're not collecting personal data, so some of those rules will not apply. Depending on the age verification method that is put in place, you will apply them to a greater or lesser extent. I think that's why there was prudence in choosing "consider" instead of "must apply".
     Okay. That was the rationale. It was because of the different technologies that are out there. They wouldn't necessarily all satisfy all of these points.
    Given the privacy concerns that are being expressed, wouldn't we want them all to comply with these points? I don't know if you would want four out of five in this case. Would you?
     If there's no collecting of personal information, it doesn't need to apply.
     I think it leaves some leeway for the different methods chosen, but I take your point.
     Thank you, Mr. MacGregor.
    We will go to Mr. Shipley for five minutes, please.
    Thank you, Senator, for being here today.
    My first question has to do with.... I wasn't sure, but someone earlier in the hour—I'm not sure if it was the chair or you, Senator—mentioned that you've tried four different times—I believe that was the number—to get this bill through. Could you expand on that? Where and why has that been stopped other times?
     That's interesting. It's been my whole life for the last four years.... Well, not all of it, because I'm a senator working on other things, too, but yes, I started in 2020. In fact, I started when I went to demonstrate in front of Pornhub against illegal sexual material. You know, there were some really serious accusations and allegations of child pornography on the site, so I went there. After the demonstration, I started asking myself, "What are the children doing during this pandemic?" They were on their screens. That's where the whole idea started to become real.
    Starting in 2020, I had a first round with my bill, which went through the committee. I had tough questions and a lot of questions. I made some changes. We were still in the pandemic, and then the election was called. In this process, the bill was passed in the Senate and then it died. Then it was off the books, so I had to start all over again after the election, from first reading, and do the whole process again with different people in the committee. The more I went through it, the more supporters I had. At the end, there was no opposition in committee and no opposition in the Senate when this bill was passed.
    It took a long time, but I'm told that it's not completely out of the ordinary that private members' bills can take time. In this case, I was not served because of the election arriving at the time that it arrived.


     Thanks. I'm glad I asked that, because I thought the number was four. It sounds like it's been twice. Thanks for the clarification on that.
    One of the concerns, obviously.... I don't think anybody in this room feels that children should be accessing any types of pornography, but one of the issues, quite frankly, is the age verification and the process. You mentioned and the bill mentions that this is going to be done by a third party company. Are there currently any Canadian companies that do this type of business? Who would they do it for? I mean specifically the age verification.
     Yoti is one of the main players. It has a place of business, I think, on the west coast, but it's originally from Britain. Yoti is one of the big players in age verification.
    There are many of them, but I think, as for Canadian.... There is one in Ontario.
    I'm sorry to interrupt, though. Do you know who or what...?
    There's a joke there, but I'll leave Mr. Bittle alone on that one. He really threw me off.
     Was it about using Yoti?
    Who or what industries are they age verifying? You mentioned the company, but what are they doing now—for whom and for what?
     Well, that's a good question.
    I know there is age verification, for example, in online betting. However, is it done by an age verifier or, in this case, by credit card? I don't think we have a lot of age verifying in Canada.
     I'm sorry. I'm not able to intelligently answer your question. Do you want me to send you an answer?
    An answer would be good, because, as I said, I'd like to know.
    I think I have not researched that part.
    Look at that. I've thrown you right for a loop, so there you go. That's why we're here.
    Yes, I would like to know the names of a couple of companies and what industries they're currently doing age verification for.
    One of the other studies we recently did in here was about cybersecurity. It was astounding to hear about some of the companies, governments and big organizations that have been hacked. A lot of private information has been taken or ransomed.
    Are you concerned about that at all, with this? We heard of one Canadian company. I'm sure there are others, hopefully. We don't know the size of them. We don't know who they're doing.... Do you have any concerns about security and cybersecurity?
    I think we're all concerned. We have companies like Desjardins in Quebec, where we put our money and which have those types of problems. It can obviously happen everywhere.
    However, I am wondering why it would be more of a risk with pornography than with banking and all kinds of other sectors. Once you start to be on the web, transmit information on the web and do banking on the web, all those risks exist. However, as I said on this question, Germany has the longest experience, but there have not been any data leaks or big scandals on the data privacy side.


    Thank you, Madame.
    Before I carry on, feel free, obviously, to submit any further information you may wish to the committee.
     Yes, I will, because I am excited to answer a good question.
     It will be distributed.
    We'll go now to Mr. Bittle for the hammer.
    You have five more minutes.
     Thank you very much, Mr. Chair.
    Before I ask my question, Senator, I want to say that I think we all appreciate the goal you're trying to accomplish. I just have many concerns about how we're attempting to accomplish this.
    I'll give you an example.
    One of my favourite shows of all time is Game of Thrones. I love that show. I became obsessed with it and watched it throughout. I'm watching the spinoffs. It's on a Canadian provider—Crave. It would be covered by this legislation, as it's defined. I have staff who I know are fans of Bridgerton on Netflix, which is another Internet service provider that provides what is defined as “sexually explicit content”.
    I know you're trying to prevent children from seeing pornography, but you didn't define “pornography” in your legislation. You defined “sexually explicit material”. Things like the shows I mentioned, and even Academy Award-winning movies, such as Schindler's List, The English Patient, Shakespeare in Love, Gladiator, Crash, 12 Years a Slave, The Shape of Water, Green Book and Parasite, based on the definition in the Criminal Code, would all be covered.
    My question to you is this: Why use this overly broad definition? Why didn't you come up with a stronger, stricter definition of “pornography”, rather than using the “sexually explicit” definition in the Criminal Code?


     Mr. Bittle, I'll respond in French, if you don't mind.
    First, I completely disagree with your premise, since neither Game of Thrones nor Bridgerton—which I watch religiously—contain pornographic or sexually explicit material—


     Pardon me.


     —as defined in the Criminal Code. I don't think you're referring to the definition of “sexually explicit material” as set out in the Criminal Code.


     I'm sorry to interrupt you there, Senator. I don't have much time. I'm looking right at the definition now—subsection 171.1(5)—and this is for “sexually explicit material”. It says:
the dominant characteristic of which is the depiction, for a sexual purpose, of a person's genital organs or anal region or, if the person is female, her breasts;
    That is very broad when it comes to what Canadians see on television, which they access without a digital ID. You're proposing to change or alter that. I think Canadians expect that when they go on to Netflix or Crave or platforms such as Amazon Prime, they're able to access these shows without needing a digital ID to do so.
    There's an old Supreme Court decision in the United States about pornography—"I know it when I see it". You're essentially taking that position now and pointing us to this Criminal Code definition; but it is very broad, Senator, and it is going to capture so much more content than you intend it to. However, this is the language of the legislation.
    How can Canadians be assured that they won't be brought into this based on the wording of the legislation that I've just read—and I know you've been trying to read it to some of my other colleagues?
    Yes, because, contrary to you, I don't think this is that broad. I believe that if you look at the jurisprudence on how those three particular words have been interpreted by the courts.... You will recall that earlier I quoted Professor Trudel, who was quoting the Sharpe decision.


     Is he a judge?
     No, he's not a judge, but he's a legal expert.
    You're right that we can always say there's a danger. However, what kind of definition would we suggest?
     Again, Senator, you're in a "we know when we see it" kind of position. I don't have much more time left. You've left this very vague for the agency that's going to police this.
    Really, the only existing agency the federal government has is the CRTC. Is this who you expect will be policing this legislation?
     I did not include it, because I didn't want to have to weigh in on that. I think those decisions—
    However, that's the most likely agency to deal with it, yes or no?
    Is it the RCMP?
     No, no. It's just that—
    It will be the CRTC. They are the most likely competent agency to deal with this, basically.
     It has remained undefined, because, as you know, your own government these days is trying to have a commission, an ombudsman, and all of those things.
    However, you'll agree with me that it's most likely the CRTC. You don't want it in there, because it wouldn't be popular, but you'll agree with me that the CRTC is the most likely agency to deal with this.
     I'm not agreeing with you.
    Is some mythical agency that the government will create a better agency to deal with it?
    Mr. Bittle—
    With respect to agencies that exist, Senator, you'll agree with me that this is the agency that is most likely to police this particular piece of legislation.
     You may answer, and then we're going to have to cut it off.
    Well, I'm not going to be pushed to answer a question like "you'll agree with me". It's not the kind of question I like. We have not defined in the bill who it is, so you can certainly think what you're thinking, but I would come back to the following fact. What would you have done to define pornography without taking the definition in the Criminal Code, which is one of our main laws in Canada, which is serious, and which has been interpreted by all the courts as being pornography, not as artsy pictures of whatever?
    You're right that there's a dose.... It's not scientific, but I would say this definition is pretty good and has been used in the past to show that this is not light.... This is pornography we're talking about when we're using sexually explicit material.


    Thank you.


    Therefore, we disagree.
    Thank you both.
    Thank you to Mr. Lussier and the senator for bearing with us here tonight.
    That concludes this portion of the meeting. We'll suspend and bring in the next panel.
    We're suspended.



     I call this meeting back to order.
     I would like to welcome our witnesses for this second hour.
    From the Office of the Privacy Commissioner of Canada, we have Mr. Philippe Dufresne, the Privacy Commissioner of Canada, and Lara Ives, executive director of the policy, research and parliamentary affairs directorate. From the Department of Canadian Heritage, we have Owen Ripley, associate assistant deputy minister of cultural affairs; Katie O'Meara, policy analyst; and Galen Teschner-Weaver, policy analyst.
    Mr. Dufresne, thank you for waiting. You've been very patient. I now invite you to make an opening statement of five minutes, please.



    Thank you to the Members of the Committee for this invitation to appear on your study of Bill S‑210, An Act to restrict young persons’ online access to sexually explicit material.
    As Privacy Commissioner of Canada, my mandate is to protect and promote individuals’ fundamental right to privacy. This includes providing advice, guidance, and recommendations for protecting personal information, and overseeing compliance with Canada’s two federal privacy laws—the Privacy Act, which applies to federal government institutions, and the Personal Information Protection and Electronic Documents Act, which is Canada's federal private-sector privacy law.
    In January, I launched my Strategic Plan for my Office, which is focused on three priority areas: maximizing the OPC’s impact in promoting and protecting the fundamental right to privacy; addressing and advocating for privacy in this time of technological change; and championing the privacy rights of children.


    I support the purposes of Bill S-210, which include protecting the mental health of young people from the harmful effects of being exposed to sexually explicit material, but the bill raises some privacy implications, and I would propose some changes to address them.
     As drafted, the bill provides that any organization that makes sexually explicit material available online to a young person for commercial purposes is guilty of an offence and liable to a fine that would increase in amount, depending on whether it was a first or subsequent offence. A defence is available if an organization believed the young person was at least 18 years of age, having implemented a prescribed age verification method to limit access to the sexually explicit material.
    Age verification can raise privacy implications, as it generally requires the collection of personal information, which could include biometrics or identity documentation. As drafted, the bill would apply to services, such as social media and search engines, that may make available some sexually explicit material, but may be primarily focused on other content. This could result in age verification requirements, including when the majority of content may not be of a sexually explicit nature.
    To address this, the committee could consider restricting the requirement for age verification to websites that primarily provide sexually explicit material for commercial purposes.


    Before prescribing an age-verification method, the Governor in Council would need to consider certain criteria, including whether the method maintains user privacy and protects users’ personal information. These criteria are important and beneficial.
    I would recommend adding additional criteria to the list to ensure that the prescribed methods are sufficiently privacy protective. Specifically, this could include assessing whether the prescribed methods are proportionate and limit the collection of personal information to what is strictly necessary for the verification. Age-verification methods should also prevent tracking or profiling of individuals across visits to websites or services.


    Internationally, various jurisdictions have taken action to prevent children from accessing pornography, but some of these laws have a narrower application than Bill S-210. For example, Texas and Utah only require age verification measures on sites that meet a certain threshold of pornographic content. Some regulators have also worked to mitigate the privacy risk associated with the use of age verification technologies. For example, Spanish and French regulators have worked with researchers to develop and evaluate potential age verification mechanisms.


    My office is conducting further research in this area and is a member of an international working group with other privacy regulators to share information on age-verification methods and learn from each other’s experiences. Notably, members of this working group intend to publish a joint statement of principles for age assurance later this year. My office is also developing guidance for organizations on age assurance and privacy, and will launch an exploratory consultation on this next month.



     Finally, should Bill S-210 be adopted, I would be happy to provide advice on regulations that pertain to privacy and the protection of personal information at the appropriate time. I will be pleased to take your questions.
     Thank you.
     Thank you, Mr. Dufresne.
     We'll go now to Mr. Ripley.
    I invite you to make an opening statement of up to five minutes.


    Mr. Chair, thank you for inviting me to discuss Bill S‑210. As the associate assistant deputy minister for cultural affairs at the Department of Canadian Heritage, I will be responsible for the Online Harms Act that is being proposed as part of Bill C‑63.
    While Bill C‑63 was being drafted, the department heard directly from experts, survivors from civil society and members of the public on what should be done to combat the proliferation of harmful content online.
    A common theme emerged from these consultations: the vulnerability of children online and the need to take proactive measures to protect them. With this in mind, the future online harms act proposes a duty to protect children, which will require platforms to incorporate age-appropriate design features for children. Bill C‑63 also proposes a specialized regulatory authority that will have the skills and expertise to develop regulations, guidance and codes of practice, in consultation with experts and civil society.


     Bill S-210 seeks to achieve a similarly admirable goal of protecting children online. However, the bill is highly problematic for a number of reasons, including a scope that is much too broad in terms of regulated services, as well as regulated content; possible risk to Canadians' privacy, especially considering the current state of age-verification frameworks internationally; structural incoherence that seems to mix criminal elements with regulatory elements; a troubling dependence on website blocking as the primary enforcement mechanism; and a lack of clarity around implementation and an unrealistic implementation timeline.
    I'll briefly unpack a few of these concerns in greater detail.
    As drafted, Bill S-210 would capture a broad range of websites and services that make sexually explicit material available on the Internet for commercial purposes, including search engines, social media platforms, streaming and video-on-demand applications, and Internet service providers. Moreover, the bill's definition of sexually explicit material is not limited to pornography but instead extends to a broader range of mainstream entertainment content with nudity or sex scenes, including content that would be found on services like Netflix, Disney+, or CBC Gem. Mandating age-verification requirements for this scope of services and content would have far-reaching implications for how Canadians access and use the Internet.
    While efforts are under way globally in other jurisdictions to develop and prescribe age-verification technologies, there is still a lack of consensus that they are sufficiently accurate and sufficiently privacy-respecting. For example, France and Australia remain concerned that the technology is not yet sufficiently mature, and the testing of various approaches is ongoing. Over the next couple of years, the U.K. will ultimately require age assurance for certain types of services under its Online Safety Act. Ofcom is currently consulting on the principles that should guide the rollout of these technologies. However, the requirement is not yet in force, and services do not yet have to deploy age assurance at scale. In jurisdictions that have already moved ahead, such as certain U.S. states or Germany, there continue to be questions about privacy, effectiveness and overall compliance.
    In short, these international examples show that mandates regarding age verification or age assurance are still a work in progress. There is also no other jurisdiction proposing a framework comparable in scope to Bill S-210. Website blocking remains a highly contentious enforcement instrument that poses a range of challenges and could impact Canadians' freedom of speech and Canada's commitment to an open and free Internet and to net neutrality.



    I want to state once again that the government remains committed to better protecting children online. However, the government feels that the answer is not to prescribe a specific technology that puts privacy at risk and violates our commitment to an open Internet. It is critical that any measures developed to achieve this goal create a framework for protecting children online that is both flexible and well-informed.
    Thank you for your attention. I look forward to any questions you may have.


     Thank you for your remarks.
    We'll now go to our questions.
    We'll start with Mr. Caputo, for six minutes, please.
    Thank you very much, Chair, and thank you to the witnesses for being here on this important matter.
    It goes without saying that everybody around this table believes that protecting children from materials they shouldn't be consuming is of great importance, particularly sexually explicit materials. I have a 10-year-old son, and it's, unfortunately, time to have the talk. We heard in the senator's prior testimony that about 10% of nine-year-olds... I believe that was the statistic. I had to step out, but I was going to verify that statistic. Clearly, there is a pressing and substantial public interest in restricting children from this type of material.
    To the Privacy Commissioner, did you appear at the Senate committee, or send any briefs, or anything like that?
     No, I did not.
    Thank you.
    The focus of a lot of what I'm looking at here... The Privacy Commissioner did speak about regulations.
    Could any of the concerns you mentioned be allayed through appropriate regulations?
     I mentioned one aspect with respect to the scope of the organizations captured, so that may be something that would require a change to the statute. However, yes, as to the regulations, we recommend necessity and proportionality. We recommend no tracking of individuals with this information. These are consistent with international best practices and agreements among my colleagues.
    They could be put into regulations. If the statute adds the criteria, it just provides more certainty that the government will follow through on that. Either way, I will certainly be making recommendations to the government at the regulation-making stage with respect to that content.
    For example, on the tracking of individuals, I'm looking at the regulations section. I'm focused on subclause 11(2). You're probably well aware of that subclause. I'll let you find subclause 11(2).
    Mr. Philippe Dufresne: I didn't hear the end of your question.
    Mr. Frank Caputo: Could you turn to subclause 11(2), please?
     When we're looking at the tracking of individuals, which is probably chief among your concerns as Privacy Commissioner... When I look at paragraphs 11(2)(c) and 11(2)(d), they state:
(c) collects and uses personal information solely for age-verification purposes, except to the extent required by law;
(d) destroys any personal information collected for age-verification purposes once the verification is completed;
    Especially, when it comes to paragraph 11(2)(d), does that not address the concern with tracking? My reading of the legislation is that tracking wouldn't be permitted on that basis.
    I want to make sure there is as much clarity as possible on those points.
    Certainly, that would be an argument to put forward in making sure that the regulations preclude that. If that were the understood interpretation by the government in adopting the regulation, that would lead to the right outcome.
    I am flagging that monitoring, profiling, or using this for advertising, or any other different purpose, is a concern. We should be explicit about that, either in the statute, or, at minimum, in the regulations themselves.
    The reason I'm asking this is not to push back, but just to give you my reading, because when I read paragraph 11(2)(c), for instance, it says, “collects and uses personal information solely for age-verification purposes”. When I read that, to me that would preclude the use of this information for marketing. I don't think you can get much clearer than the term “solely”, for instance.
    Do you see where I'm coming from there?


    I understand that. When we're looking at age verification, given some of the concerns, and in order to reassure Canadians about what we are talking about here, the question is whether there is a clear contradiction between age verification and privacy. The answer is no. You can have both of those things, but you need an age verification system that is appropriate and designed with privacy top of mind. Again, it's making sure this is not information that can lead to those outcomes.
     Where I'm going with this is that when I read subclause 11(2), it's my view that an appropriate balance can be struck based on the instructions that this section gives us in terms of how the regulations should operate. In other words, data is only to be used for age verification. It's not to be used for any other reason and it is to be destroyed once the goal—as in age verification—has been accomplished.
     To me, that seems like it's going fairly far in addressing any privacy concerns. The whole point here is to preclude children from accessing harmful information.
    We're saying that they have to record it only for this purpose, they can't use it for any other purpose, and they must destroy it. I'm just trying to see how we could actually go any further in regulation than that.
    Other data protection authorities—for instance, my Spanish colleagues—have set this out explicitly as a principle, saying that this should not be used for tracking and this should not be used for profiling. Therefore, I think being absolutely clear on that, either in the bill or in the regulations themselves, is important.
     Technology can evolve. Tracking technologies can evolve. This has to be technology neutral. We need to make sure that the principle will capture whatever advances in technology and that it is used to verify age and for nothing else.
    I think the fact that it could be used in regulation satisfies where I was going.
    Thank you very much.
     Thank you, Mr. Caputo.
     We go now to Mr. Bittle for six minutes, please.
    Thank you so much, Mr. Chair. I'd like to thank the witnesses for being here.
     I agree with Mr. Caputo that, around the table, we'd all like to see the same thing: children not having access to this material.
     I'd like to start with Mr. Dufresne.
    You mentioned in your comments changes required perhaps to narrow the focus of the legislation. I mentioned in previous questioning that I'm a huge Game of Thrones fan. I won't ask you if you've seen it. I don't want to put you on the hot seat, if that is a question to put one on the hot seat.
    Would shows like that put a company like Crave, which I believe is owned by Bell, potentially under the scope of this legislation, if we don't bring in amendments like you suggest?
    Again, it would be up to the courts and the organization that would look into that.
    If you look at the definition of sexually explicit material in the Criminal Code, it describes “photographic, film, video or...visual representation” and a person “engaged in explicit sexual activity”. It could be “written material whose dominant characteristic is the description, for sexual purpose, of [explicit] sexual activity”.
     It raises the question of what will be captured, and the need to bring clarity to that.
    I understand that the purpose of this is to capture pornographic websites and other types of visual images, and to protect children from them. That's one of the points I'm highlighting.
     I guess I'll ask the same question of Mr. Ripley.
    When I originally saw this bill, I started thinking of movies I have seen. I listed a few, like Schindler's List, Green Book, Crash and Gladiator. These are movies that I think we all know or can all agree are artistic. I believe they all have won an Academy Award for best picture, though we may disagree on whether some of those films should have won in that year.
     Mr. Ripley, would this bill capture those films if they were put on a website?
     The government's position is that, yes, they would be captured because the definition of sexually explicit material that is being proposed would include things like the depiction of sex scenes or nudity where it's nudity for a sexual purpose. Those services would be required to institute age verification to access that kind of content.
     I think what needs to be understood is that the definition being proposed makes sense in the context of the Criminal Code, where it's speaking to a variety of offences, including when adults may be engaging with minors using sexually explicit content. It makes sense in that context.
    However, it is not limited strictly to pornography. It is broader than pornography and would capture things like sex scenes or nudity.


     To Mr. Ripley, I know you talked about the CRTC a lot when I was on a different committee and you were a witness there, but it would seem to be—and perhaps you can correct me—the most likely organization to be policing this legislation.
    I wonder if you could explain that, or if there are any other organizations you're aware of that would be involved in that.
    Thank you for the question.
    The government is concerned about the implementation proposed by the bill. To be clear, the way this would play out is that there would be a minister designated, and then that minister would need to make a proposal to the Governor in Council or to cabinet about which federal department or agency would be best placed to administer this piece of legislation. If it were adopted as is, indeed, a minister would have to canvass the existing agencies or departments and make an assessment about which was best able.
    I would highlight that in the context of the online harms bill, the government ultimately came to the conclusion that no entity exists, whether it be the CRTC or, with all respect to the person sitting next to me, the Privacy Commissioner, that is well equipped to play that role. The government is of the opinion that we need an entity that has the right expertise and the right framework in order to regulate the online space as it relates to online harms. That is why the government is proposing the creation of a new entity to make sure the oversight of this space is done appropriately with the right safeguards in place.
    Mr. Dufresne, do you think your office would be in a position to police this legislation?
    For this particular legislation, we would take on the mandate that Parliament would give us and fulfill it to the best of our abilities. I don't think the intention is that this is something my office would deal with, with the caveat that, in terms of regulation making, we are happy to provide advice and input and to be consulted on that to ensure that this is done through a privacy lens.
    Thank you so much.
     I don't have much time, but perhaps I could turn to Mr. Ripley for him to expand on Bill C-63, the online harms act, with respect to what the government is intending to do to protect individuals from harms that are on the Internet.
    Thank you for the question.
    The online harms act would apply to social media services. It proposes three core duties, one of which, as was alluded to earlier, is the duty to protect children. That is fashioned as quite a flexible duty, which would permit the digital safety commission to put in place a number of different kinds of measures or obligations to better protect Canadian children in the online space through the use of age-appropriate design.
    The government's view is that this is a sufficiently flexible duty that could accommodate a question of whether there are certain services, for example, that should use age-assurance or age-verification mechanisms. The government's view is that the framework has the appropriate safeguards in place, and there would actually be a regulator with that mandate and the necessary expertise, through consultation with civil society and experts, to do that in an accurate way and in a privacy-respecting way, if that were to be considered.
     Thank you so much.
     Thank you.


    Ms. Michaud, you have the floor for six minutes.
    Thank you, Mr. Chair.
    Thank you to the witnesses for being here.
    Mr. Ripley, this is a very interesting conversation. I did not really understand the government's reluctance to support this bill, but now it's becoming a little clearer. I understand that, in your opinion, the definition of sexually explicit material is too broad. I'm looking for a solution.
    Would this bill be more acceptable to you if it had used the term “pornography” and defined it outright, instead of the term “sexually explicit material”? Would that make a big difference to you? Would it reduce the likelihood that the bill would apply to nudity or a sex scene in a film?


    Thank you for the question.
    I'm not in a position to give an opinion on potential amendments. However, if I understood correctly, the senator's intention is to target pornography. As you mentioned, the definition of “sexually explicit material” is broader than that of “pornography”. If the senator and the members of the committee really intend to target pornography, we could think about how to limit the current broader definition.
    You also have concerns about the protection of privacy and personal information.
    Comparisons are often made with Bill C‑63, but in my opinion, the two are quite different. Bill C‑63 aims to protect children from harmful online content, which is commendable. Bill S‑210 seeks to limit access to pornography.
    The regulator you want to create through Bill C‑63 seems as though it could be very effective in playing that kind of role. The digital safety commission could play the same role as commissions in other countries. The same goes for the age verification processes.
    Can you tell us what concerns you have regarding privacy, as well as any other concerns?
    Thank you for those questions.
    In the sense that the purpose of Bill C‑63 is to promote online safety and reduce harm, the duty to protect children, which is referred to in section 64 of the proposed act, is quite flexible. According to the proposed section, “an operator has a duty, in respect of a regulated service that it operates, to protect children by complying with section 65.” Section 66 of the proposed act gives the commission the power to establish a series of duties or measures that must be incorporated into the service.
    According to the government, the proposed act provides the flexibility needed to better protect children on social media. During the consultations, it is certainly legitimate to wonder whether the appropriate response is to require some services to adopt age verification. Once again, there will be a specialized regulator with the necessary expertise. In addition, there are mechanisms to consult civil society and experts to ensure that these decisions are well-thought-out.
    Thank you.
    I imagine you looked at what is being done in other countries, particularly Germany, the United Kingdom and France, which were mentioned earlier.
    When it comes to age verification, the senator suggested dealing with third parties instead of directly with pornography sites. Do you have any concerns about that? Do you think that's an acceptable way to proceed?
    We are following developments at the international level, absolutely. The government does not deny that there is a lot of movement in this area.
    Again, our reading of the issue is that the technology has not necessarily reached maturity. Internationally, Australia and France are still looking at these issues. The French are in the process of testing some solutions, but they have not yet completed their work. In the United States, as you pointed out, infrastructure needs to be put in place. Louisiana lawmakers introduced verification measures to block access to minors. Other American states have proposed similar legislation, but we see that some services have withdrawn their access in those states because there is still no infrastructure in place.
    Clearly, a number of solutions are possible. I'm thinking in particular of security tokens, where age verification is done by a third party. As you mentioned, we can also look at device verification. Another solution is facial scanning technology that tries to determine the user's age. It is important that these solutions be deployed in a context where safeguards are in place to ensure that privacy rights are respected. We don't want to create a framework that puts a duty in place without safeguards.


    Thank you, Mr. Ripley.
    Thank you, Ms. Michaud.


     We will go now to Mr. MacGregor for six minutes, please.
     Thank you very much, Mr. Chair.
    I would like to thank our witnesses for being here today.
     Mr. Dufresne, I would like to start with you. I just need some clarification. In your opening remarks, you recommended that the committee “consider restricting the requirement for age verification to websites that primarily provide sexually explicit material for commercial purposes”. Where exactly would you like to see us do that?
    I know that in the interpretation section of this bill, it has this definition for organization: “organization has the same meaning as in section 2 of the Criminal Code”. Is that where you'd like to see a little bit more specificity, or can you identify where?
    Sure. It could be in the definition or it could be under the offence, talking about “makes available”, or primarily makes available. We have pointed to international comparators in Texas and Utah where they talk about “substantially”, or they sometimes reference even “a third” of the material.
    Just a quantitative requirement could be a way of targeting those broader pornographic websites.
    Thank you.
     You also had an exchange with Mr. Caputo on clause 11, where I know you had recommended some additional criteria. My question is really on subclause 11(2), on the wording before we get to the list. I asked the senator about this as well.
    It just says, "Before prescribing an age-verification method under subsection (1), the Governor in Council must consider whether the method". Are you satisfied with language that says "must consider", or would you prefer language that forces the Governor in Council to actually follow these? For me personally, when I look at that, I think that gives the government a little bit of a way out: "Oh, we addressed three out of five. We did consider everything."
    What if the Governor in Council were forced to verify that it is reliable, that it maintains privacy and that it collects and uses personal information solely for age verification purposes? Do you have an opinion on that?
    I think you asked a good question, and I think the senator gave a good answer to your question—that this is in the bill and there's an expectation that the Governor in Council will consider this. It's explicit.
     Would I prefer stronger language, that the mechanism “must include” the following things? I think that's stronger language, and I would be supportive of that, but on the requirement of “must consider”, I also tend to presume good faith on the government that they're going to draft regulations. The statute says this. If it's completely absent, then it's an obvious element to raise. I would certainly, as I say, expect to be consulted on the development of regulations of that nature, and I would be making recommendations to make sure they are privacy protective.
     Thank you, I appreciate that.
    Mr. Ripley, I would like to turn to you.
    I know that you hesitated in forming an opinion on possible amendments this committee might consider. That being said, we as committee members need to know, if we are going to make amendments, how they're going to be perceived by the government.
    I know that you were hesitant to expand a little bit on the definition of what “sexually explicit material” is, but I guess I want to know, from the department's point of view, how open to interpretation you are.
    Do you need us as a committee to create much more specific language? We want to know, from your point of view, what you need us to do. Help us to help you.


     Again, I would reiterate that I'm not in a place to take a position on behalf of the government with respect to amendments. What I would point you towards is that the problematic aspect of the definition in the Criminal Code is really paragraph 171.1(5)(a), where, unlike in paragraphs (b) and (c), for example, you do not see the caveat of "for a sexual purpose".
    Again, in paragraph (a), the language is broad enough to capture sex scenes or nudity regardless of whether they are being shown for a sexual purpose, for example. Again, it would capture entertainment content.
    But, again, if we go ahead in the bill to page 4, subclause 8(1) talks about the enforcement authority:
If the enforcement authority has reasonable grounds to believe that an organization committed an offence under section 5;
     How do you interpret “reasonable grounds”? Do we not trust people to know the difference between a movie on Netflix and obvious pornography?
    A movie, as Mr. Bittle said, that has very sexually explicit scenes is not being commercially made available for that single purpose; it's part of a story, whereas I think we all know that pornography's main raison d'être is the sexually explicit material itself.
     The challenge is that the definition that the bill is based on, the definition of “sexually explicit material”, is not limited to pornography. The offence in clause 5 of the bill is as follows:
Any organization that, for commercial purposes, makes available sexually explicit material on the Internet to a young person is guilty of an offence....
     The reasonable grounds to believe concern whether they've made sexually explicit material available. Again, there's nothing in the definition of "sexually explicit material" that limits it to pornography.
    As I mentioned, paragraph 171.1(5)(a) of the definition in the Criminal Code states:
(a) a photographic, film, video, or other visual representation, whether or not it was made by electronic or mechanical means,
(i) that shows a person who is engaged in or is depicted as engaged in explicit sexual activity...
    There's no limitation there that it's pornographic.
    And then subparagraph 171.1(5)(a)(ii) states:
(ii) the dominant characteristic of which is the depiction, for a sexual purpose...
    There you see the sexual purpose.
    Again, there are certain places where you see this caveat, and that first paragraph that I read doesn't have that limitation.
     I also want to refer to the fact that those two are separated by an "or", not an "and", which is an important point to make.
    Thank you very much.
     Thank you, Mr. MacGregor.
    We'll start our second round now. This is a five-minute, five-minute, two-and-a-half-minute and two-and-a-half-minute round.
    We will end this round after Mr. MacGregor. I think that's the consensus.
    We will start with Mr. Genuis for five minutes, please.
     Thank you, Chair.
    Thank you to our witnesses.
    We're obviously familiar with the Liberal government's position on this bill. With respect to the officials, of course your role is to support that position. Your role as an official is not to come here and state your disagreement with government policy, even if you might privately disagree with it.
    I will just say that I think that many of the arguments you put forward were clearly refuted by the senator already. I also want to say that I think Bill C-63 is a real disaster. It raises actual censorship issues. It has nothing on age verification. It's far, far broader than Bill S-210 at every level. It's enforced by vaguely empowered bureaucratic agencies and it includes dealing with speech.
    Most Canadians who have seen what your government did.... To be fair, I understand your role as a non-partisan public servant, tasked with providing fearless advice and faithful implementation. However, what the Liberal government has put forward in Bill C-63 is not being well received across the board.
    On the issues with section 171, I'm looking at the Criminal Code and trying to understand the argument here.
    We have one definition of sexually explicit material in the Criminal Code. Implicitly, it's being suggested that maybe we could have multiple different definitions of sexually explicit material operating at the same time. However, it seems eminently logical that you would have one definition that relies on the existing jurisprudence.
    As Mr. Bittle has suggested, if this definition covers Game of Thrones, then it's already a problem, because it already violates the Criminal Code if, in the commission of another offence, you were to show a child that material. Therefore, you already could run afoul of the Criminal Code if you put on Game of Thrones in your home for your 16-year-old. That's not happening. No one is getting arrested and going to jail for letting their 16-year-old watch Game of Thrones. If that's not happening already offline, then maybe that suggests this extensive reinterpretation of what the existing law already says is a little bit exaggerated.
    In this context, we also know that Pornhub has been represented by a well-connected Liberal lobbyist who has met with Liberals in the lead-up to the vote.
    I want to ask the Privacy Commissioner about what he said in terms of potential amendments.
    How would this apply on social media? I'm going to just pose the question. I have young children. I obviously don't want them accessing the major, well-known pornography websites. I also don't want them seeing pornographic material on any other website that they might go to for a legitimate purpose. Therefore, if my children are on social media—they're not—or if they were on another website, if they were watching a YouTube video on that, whatever it was, I would want to ensure that 6-, 7-, 8-, 9-, 10-, 11- and 12-year-olds were not accessing pornography, regardless of the platform and regardless of the percentage of that company's overall business model.
    I don't really understand philosophically why it would make sense or protect anyone's privacy to have an exemption for sites where it's just a small part of what they do, because if the point is to protect children, then the point is to protect children wherever they are.
    I'd be curious for your response to that.


    There have been discussions. If we look at the U.K. age appropriate design code, or at the general design of sites for children, there can be a range of tools that can be used. There could be parental approval or there could be education, etc.
    A big part of the discussion is the risk-based approach: the higher the risk, the stronger the tool you're going to use. Age verification is a higher-level tool than parental approval, education or outreach.
    All of those things can be considered, but if the goal is to get at pornographic websites, I raise the question in terms of the numerical element. So—
     I'll give it back to you in a moment, but I would just say that the goal is to prevent my children, and other people's children, from accessing sexually explicit material on the Internet. In the same way that we don't let children access alcohol, if they're accessing alcohol in a bar or at a hotel that has many other lines of business, the point is that children shouldn't be accessing alcohol. Given that there already are age gates on, for instance, some of these sites....
    The senator mentioned something in this regard. It's not really in effect, but Twitter is already letting you know that you're going into a particular area here. It doesn't seem like it would be difficult to apply that age verification principle everywhere that kind of material exists.
     Some of the principles that we often put forward are necessity, proportionality and making sure that you're using the least amount of personal information for your goals. That's why I'm talking about necessity and proportionality in the regulations, not tracking...using, again, the smallest amount and asking Canadians to provide the least amount of personal information possible. That said, age verification is appropriate in appropriate contexts.
    I hope you're significantly engaged on the regulation, obviously, because you have a lot of expertise to offer at that point.
    Thank you very much.
     Thank you.
    We go now to Mr. MacDonald. You have five minutes, please.
     Thank you for being here.
    This is an extremely important topic. I think we're all trying to get to the same result, and that's to protect children as much as possible.
    I go back to Mr. Dufresne. I was trying to find some information on the program that's being used in Spain, so I wonder, do you have the name of that program they use for verification?
    They're developing information on potential third party tools that would verify your age. The goal is that, when you're verified by this tool and then you go to a website, you're not giving your age to that website: It just says, “I've been verified.” They issued documentation. They call it the “Decalogue of Principles”, and they talk about, “What are the things we want to see in there?” One of them is, “No tracking.” One of them is risk—mitigating risk and using it for the appropriate purposes. They have been working on potential tools, as have my French colleagues. Australia is doing a pilot project to test this out. There's work happening, and we're going to continue this work with our colleagues internationally to develop principles and then work with industry.


    In these two proposals that you have put forward to your counterparts in Spain or the U.S., do you see anything that indicates whether this is being incorporated into the age verification technology they could potentially use?
    We're working very closely with them, with the goal of issuing a joint statement of principles to say, “Here's what we want to see, as a community, in terms of age verification tools. Here are the principles.” I'll be doing some consultation later in Canada to see, “Here's what we propose and how we get this balance,” while again making sure that you're using this to identify the age, and not the specific age but the fact that the person is a minor or an adult; that you can do this anonymously; that there are some checks and balances; and that there are safeguards for that information. You don't want that information to be breached and then lead to challenges. We're looking at a full spectrum of principles, but also the technological implications.
    We heard a bit in here today about reporting mechanisms, enforcement—maybe we're not clear on it—and digital literacy partnerships with tech companies. For my own clarification, let's say we're using a third party technical company and dealing with Pornhub. I'm assuming that Pornhub is getting only the information that says, "Yes, this individual is 18 years old," or, "No, the individual is not 18 years old." If there's some infraction with the technical company, who's liable? Is it Pornhub, which hired the technical company, or is government responsible for the technical company? Where does that liability lie?
    The bill as drafted says that it's a defence for the organization—in this case Pornhub—to believe the person was not a minor, and they would have done that by using the age-verification mechanism prescribed by law. If there were a breach of the mechanism, if there were a concern there, I think that would then get raised and would raise questions about that tool and the provider.
    So Pornhub would not necessarily have the responsibility or the liability; it would be the technical company?
    We'd have to see the case-by-case scenario. Pornhub would have to establish that it had met its obligation under the act, which is to verify the age with the prescribed mechanism.
     I know the senator mentioned—and I never had a question for the senator, obviously—the double-anonymous verification method. Can anybody explain exactly what that is, for me or for us?
     My understanding would be that, again, there are a variety of systems in place to protect an individual's privacy: for example, a token-based system whereby one entity validates that you are 18, and you then present that token when you go to access a pornographic website. There's no identifying information in the token that you use. The same would go, for example, for other technologies around facial scanning in an environment where no personal information is being held.
    Again, the objective would be to make sure that that infrastructure is in place, but what we are seeing internationally is that jurisdictions are still working through this. It's been years, and there's not yet consensus on the accuracy and privacy-protecting nature of these tools. In this context, we have a one-year implementation timeline in which the minister would have to choose an entity; that entity would then have to do the appropriate consultations and put the framework in place.
    Otherwise, every service that is in violation of this bill theoretically commits an offence and is, therefore, subject to potential website blocking.


     Thank you, Mr. MacDonald.


    Ms. Michaud, it's over to you for two and a half minutes.
    Thank you, Mr. Chair.
    Mr. Ripley, I want to go back to the definition of sexually explicit material. My colleague Mr. Genuis raised a good point. If there's a concern that nudity or a sex scene in a film or series will be considered sexually explicit material if Bill S‑210 comes into force, it raises questions about the sexually explicit material that already exists. Some movie scenes are already considered as such.
    That's why I'm not sure what your concern is.
    Thank you. That's a good question.
    The Criminal Code includes that definition, but in the context of other types of offences. Showing sexually explicit material to a child could very well violate the Criminal Code in a particular context, but only in connection with those other offences. That's the answer to your question: it's not just a matter of setting limits, and it's not the same thing.
    As I mentioned at the outset, the government is also concerned about the mix of criminal law and regulatory law. If the intent is to create a regulatory framework, the positive obligation to limit access to pornography for children should be clarified. A regulatory framework should then be developed with appropriate penalties. However, the way the bill is currently drafted, it isn't clear whether Parliament's intent is to create a criminal offence or to establish a regulatory framework.
    How do you think the bill should be clarified in that regard?
    If Parliament intends to create a regulatory framework, it should clearly establish a positive obligation and then use the tools that are usually used in a regulatory context, such as administrative monetary penalties.
    However, the bill creates a new offence, for which certain defences are provided, but it also allows the agency responsible for enforcing the act to seek an order to limit access to certain websites. That means we could find ourselves in a situation—the bill confirms this—where websites that provide many types of content, including pornography, would be blocked. The question is whether website blocking is an appropriate and proportionate response that respects the rights of Canadians to access information and content. However, with all due respect to the committee, the government does not believe that website blocking is a proportionate response to the framework proposed here.
    Thank you.


     We'll go now to Mr. MacGregor for two and a half minutes, please.
    Thank you, Mr. Chair.
    The testimony today has been interesting. I think it's led the committee down a few different paths as we consider Bill S-210.
     Mr. Ripley, we've heard a lot about the potential pitfalls with the term “sexually explicit material” as referenced in the Criminal Code and how it could be overly broad. If we go down further in the bill—still on page three, in proposed subclause 6(2)—it says:
No organization shall be convicted of an offence under section 5 if the act that is alleged to constitute the offence has a legitimate purpose related to science, medicine, education or the arts.
    Wouldn't that “legitimate purpose” phrase, from the department's standpoint, save streaming companies like Netflix, since they would be under the arts category? How do you interpret that section?
    Does that add further clarity to the concerns you raised about the definition of sexually explicit material?


     Thank you for the question.
    My understanding is that this type of exception is also drawn from the Criminal Code context, in a situation where, again, you want to make sure that someone does not face a criminal prosecution for engaging in one of these activities. It isn't clear to the government how this exception would necessarily be interpreted.
    In a context where—again, from the government's read of the bill—it would apply to a very wide range of Internet services, from search engines to Internet service providers and from social media to websites, you are creating a good deal of business uncertainty by saying that Netflix and Disney+ are going to have to make the case that it's the arts—
    Can I just interrupt?
    Is that because the term “legitimate purpose” is open to interpretation? Is that the problematic element?
    The challenge, to me, would actually be communicating that, for example, entertainment content on those streaming services falls under the arts. It would be incumbent upon them to show it.
    I would also just highlight, from the previous exchange, that there is no defence of believing that the person accessing the content was 18 or over. The way the bill is structured, it is an offence to make this kind of content available; the defence is whether you have deployed one of the prescribed age-verification technologies set out in the regulations.
    Again, it is a very binary framework that is being set up. The entity that is going to be charged with administering it has very few enforcement mechanisms available to deal with the kind of nuances you are bringing up.
    Thank you, Mr. MacGregor.
    Thank you to all of our witnesses for being here tonight and for bearing with us until this late hour.
    Thank you to our analysts and interpreters, as well, and to all of our staff.
    With that, I believe we have a consensus that it's time to adjourn. We are adjourned.