
CHPC Committee Meeting








Standing Committee on Canadian Heritage


NUMBER 015 | 1st SESSION | 45th PARLIAMENT

EVIDENCE

Monday, November 24, 2025

[Recorded by Electronic Apparatus]

(1530)

[English]

     I call this meeting to order.
     Welcome to meeting number 15 of the Standing Committee on Canadian Heritage.
    I believe that we all know the guidelines with regard to our earpieces. Please don't just throw them down. There's a nice little sticker. If you could just put them there in order to protect the ears of the interpreters, that would be wonderful.
    Today, we have two witnesses joining us. We have Mr. Daniel Clark and Ms. Wanda Polzin Holman, who is joining us virtually from Edmonton, I believe.
     In just a moment, I will give each of you an opportunity to give five-minute opening remarks, and then we will go into questions from members. During that period of time, we start with questions from the Conservatives, then the Liberals and then the Bloc Québécois. We'll continue to rotate through.
    As you know, the study we are discussing today—and this is our first meeting on this study—has to do with the impact of social media, and in particular influencers on social media, on those under the age of 18. We look forward to hearing from the experts who are with us today and learning what you have to share with us.
    With that, I will hand the opportunity to speak over to Mr. Clark.
    You have the floor for five minutes.
     Madam Chair and committee members, I would like to thank you for the opportunity to testify before you today.
    I've been studying the ethics of children in social media production for the last three years, with research published in the Journal of Business Ethics, a top-ranked journal, entitled “The Child Labor in Social Media: Kidfluencers, Ethics of Care, and Exploitation”. This work is necessary, as a recent Harris poll found that 29% of children aged eight to 12 aspire to a career on YouTube—more than any other career—and are at risk of economic exploitation, consent violations, privacy loss and other harms.
    There are two concepts in this field. “Sharenting”, a portmanteau of “share” and “parenting”, is where parents build social media channels around their children, with other parents as the audience. “Kidfluencing” is where children influence other children or adults through their own social media personality. These concepts differ in the child's role and audience, but they share many ethical concerns.
    Using the UN Convention on the Rights of the Child as our framework, our research suggests that the following children's rights are at risk.
    First, there's the right to consent. These channels are owned and operated by parents. Children participating without meaningful consent of their own is a constant risk.
    There's the right to privacy. Even moderately successful channels can expose children to millions of strangers. Their everyday follies, embarrassments and charms exist forever in social media, potentially haunting them for life.
    There's freedom from economic exploitation. These channels can generate thousands to millions of dollars monthly from the child's involvement. The children should benefit proportionally, but there's no guarantee that they do.
    There's the right to education. Being a social media star requires significant time. Kidfluencing may seem like play, but it's work, like acting or performance. It is not uncommon for kidfluencers to be home-schooled. Where is the time for school, play and sleep?
    There's freedom from harm. Parents may put children at physical risk creating compelling content. There's also the risk of child predators forming parasocial relationships with child influencers. Recently, a study found that 95% of adult influencers had been subject to stalking behaviours, and 40% felt fear as a result. Unfortunately, we don't have such data about child influencers.
    There's freedom of expression. Children are brand ambassadors endorsing products and expressing opinions often not their own. When the child is 25, their 15-year-old opinions might prevent them from getting a job or otherwise commit them to a position.
    Over the past year, I've been interviewed numerous times about this paper. I'm always asked, “What can we do about it?”
    Anything in the sphere of child welfare is deferred to the decision-making of the parents—except that, when parents are generating income, sometimes significant income, some parents' decision-making may be compromised. The platforms these videos appear on, such as Instagram, TikTok and YouTube, earned $11 billion in advertising to children in 2022. By limiting account ownership to the age of majority, they place an onus on the parents to ensure that their children are protected. They set guidance about what is and is not allowed on their platforms—violence, gratuitous nudity or sex, etc.—but this is clearly not enough to ensure that online child actors are free from exploitation.
    That means the federal government may have a role to responsibly regulate this practice. We need to protect these children. Internationally, other jurisdictions have made some progress toward protecting children in these arrangements and their future earnings. It's time Canada also took action.
    We need to protect these children from the negative impacts that may arise from the parasocial elements of global exposure through social media. We're all aware of the harm that befell child actors like Macaulay Culkin, Gary Coleman, Shirley Temple and countless others through exploitation. While it's debatable that there is even a need for kidfluencing in any capacity, if it is allowed in this country, then children who are the subject of this social media enterprise deserve as much protection and recourse from the harms as our law can provide.
    Thank you, Madam Chair. I look forward to your questions.
(1535)
    Excellent. Thank you very much, Mr. Clark.
     We will now go to Wanda Polzin Holman for five minutes.
     Good afternoon, Madam Chair and committee members.
    My name is Dr. Wanda Polzin Holman, and I'm the CEO of Little Warriors.
     Little Warriors is a national charitable organization, and we have been recognized through numerous scientific and clinical journals as being a leader in the field of child sexual abuse awareness, prevention, advocacy and evidence-based treatment. I've been involved with Little Warriors for over eight years, including a previous role as clinical director. I'm clinically trained and have obtained a master's degree and a doctorate. I am currently a registered clinical social worker.
    I appreciate the opportunity to share Little Warriors' perspectives with the standing committee. On behalf of the children and families we serve, I'm very appreciative of the committee for undertaking this important study. It is indeed an area requiring further understanding and actions.
    There are some key issues that we have observed at Little Warriors with regard to children and adolescents, which I would like to highlight.
    First and foremost, as a result of social media influencers, we are witnessing significant deterioration with regard to mental health issues, including an overall increase in levels of stress, cyber-bullying, suicidality, anxiety, depression, self-concept concerns and radicalization of gender bias. Also, there are concerns related to sextortion and online grooming, as well as luring of our children. This happens both in plain sight—as we are all on social media—and in very subtle ways through gaming platforms and social media, which parents and educators may not always be apprised of.
     Families, even those who do their best to ensure proper controls on devices, are very concerned about their children's online experiences. We have seen this first-hand at Little Warriors when treating children and adolescents who have fallen into the hands of predators.
    We understand that there are ongoing issues related to inappropriate content, online interactions with unsafe individuals and algorithm-driven risks. We are seeing gaps in the digital literacy of children's educators as well as the parents, and the platforms are constantly changing, which is concerning for us all.
    Additionally, there are issues regarding loss of privacy that children and teens do not always comprehend, and there are concerns related to children's digital footprints. These could obviously have long-lasting negative implications for them.
    There are gaps in legislation, deterrence and penalties regarding online and in-person harm, and access to children by potential predators across geographical borders. Overall, at Little Warriors we are concerned about child exploitation and sexual abuse, and about the lack of clear and consistent sentencing and regulations. As Canadians, we seem to understand that there are controls required for other aspects relating to children's safety, but we have yet to address social media content harms.
     In light of these concerns, I'm hopeful that this review will result in decisive action to protect children and to uphold accountability. Specifically, first, ensure that survivor-centred supports, including prevention programs such as Little Warriors' Prevent It! program, are included in new policy measures to support schools, charities and other in-person and online community organizations to expand prevention and support resources.
    Second, review sentencing gaps and issues of deterrence. We have witnessed child sexual abuse offenders being released into the community with warnings, only to be found reoffending a short time later. Protecting children must take precedence over the rights of offenders who perpetrate abuse.
    Third, legislate stronger, more consistent sentencing provisions for offences related to the possession, access and distribution of child sexual abuse and exploitation materials.
    Fourth, make concessions for individuals to donate to charitable organizations and to financially support organizations such as Little Warriors that invest in prevention efforts and work with survivors.
    The work of this committee is a defining moment for Canada to act with moral clarity and to ensure better safeguards to protect vulnerable children, online and in person.
(1540)
     I appreciate your time today and look forward to questions.
    Thank you very much for your time.
    I will go to our first member of Parliament to ask questions today, and that is Mr. Waugh.
    Mr. Waugh, you have the floor for six minutes.
    Thank you, Madam Chair.
    Thank you, Dr. Polzin Holman and Professor Clark, for being here.
     My first question is for Little Warriors.
    Wanda, I think the public is seeing increased occurrences of exploitation—and this is a word that we've talked a lot about here in the House of Commons this week. We're seeing it at all levels, but can you give us some stats on children? You made four or five points, but do you have any stats that maybe you could share with us here today about the exploitation of children online?
    I certainly do. I'm also happy to provide more statistics afterwards. Some of the recent statistics that I'm aware of are specifically with regard to the sextortion of children and youth. Since 2020, there's been an 80% increase in reported sextortion cases, and victim demographics are most often youth aged 12 to 17.
    In addition, we also know that Cybertip, for example, has reported a 150% increase with regard to sextortion and online luring between June 2022 and August 2023. Those are the most recent statistics that I'm aware of.
(1545)
     You also mentioned gaps in legislation, and I would like you to talk a little bit about that, if you don't mind. We've talked about it a little bit here in Parliament, but you brought it up and you talked about the gaps in legislation, mainly the sentencing gaps that we've seen over the last number of years.
    Just bring us up to date on what you are seeing in the Edmonton and Alberta areas, and the sentencing gaps that could be the job of parliamentarians when we do go through some legislation.
     Well, I know that there have been ongoing issues that have been brought up with regard to the online harms act, and I understand that the reason we're coming together is to explore some pieces related to that.
    We know that there are similar things that are needed to limit the amount of child sex exploitation. We need some regulations to support schools as well as parents in their understanding, as well as understanding across communities.
    We know that people, from our perspective at Little Warriors, have not been held accountable in the way that they need to be. The deterrence is very minimal at this point. We have children who come to us for child sexual abuse treatment as a result of being harmed online and sexually abused and exploited. Many times, the offenders and perpetrators are released without serving any time or having any consequences that relate specifically to the crime. What I mean by that is that the children and teens come to us and require intensive supports and treatment for what has happened to them. Very often, the perpetrators are released into the public, sometimes with notifications, and they are reoffending.
    We've had several situations at Little Warriors where this has happened, and there is just not enough deterrence in place for them to stop what they're doing. It's very difficult to continue to follow offenders and perpetrators online as a result of the ongoing changes that are happening, and the ways that they're doing it through gaming platforms and social media platforms for children as young as seven or eight years old.
     I come from Saskatchewan. There isn't a week that goes by when we don't hear—whether it's through city police in Saskatoon, Regina, Prince Albert, Moose Jaw or other locations, or through the RCMP—that they have convicted somebody. We never hear what happens after the conviction.
    It was interesting to hear when you were talking about it, because those who have been exploited.... What recommendations would you give to this committee, then, about Internet safety? In my city, we have a whole department at the Saskatoon Police Service actually designated to look for perpetrators who are online.
     I think it's a really important question. I think that, in terms of recommendations of safety, the public needs to understand exactly what the numbers are, what is happening and the number of online offenders who are present who cross international borders and have very little deterrence from reaching out to children.
    In Edmonton, we had a very unfortunate case that I'm sure everyone is aware of that crossed into the United States, and unfortunately that particular child and her family have forever been changed. Fortunately, the offender was charged in the United States, but we've had other children whose offenders were in Canada and were released very early or were released and have fled the country, or had other situations where there was very little ability to follow up on their sentencing. That's very important, as well as supporting children and families to understand the complexities and changes that are happening with regard to safety measures online.
(1550)
    Thank you, Dr. Polzin Holman.
    The next person with the mic is MP Al Soud.
    You have the floor for six minutes.
    Thank you, Madam Chair.
    Thank you to our witnesses for being with us today. It is greatly appreciated.
    Social media platforms and influencer culture now play a defining role in the lives of children and adolescents. They shape everything from their self-image to their social interactions. I am part of a generation that has very directly seen and experienced the online and digital environment. It's not just on social media; it's in video games as well, specifically in lobbies. I'm also part of a generation that has notoriously found ways around age verification processes. I think that's much of what I'd like to discuss today.
    Professor Clark, you have been an associate professor of entrepreneurship at the Ivey Business School since July 2025. Your current research focuses on the cognition and decision-making of entrepreneurs. You made reference to an article earlier called “The Child Labor in Social Media: Kidfluencers, Ethics of Care, and Exploitation”.
    You are cited in Western News saying, “Consent isn't a one-time event; it must be continuous, informed and freely given. For kidfluencers, let's be real, it isn't”. I'm curious. Given that children cannot provide ongoing informed consent, what safeguards do you believe platforms or governments should require to ensure that minors' images, data and labour are not exploited in influencer environments?
     That is a great question. You're right. The simple fact is that there is no consent for very young children. At the very least, there is maybe assent in that you know that the child is not doing it truly against their wishes, but they can't consent to all the implications that come with it.
    To be perfectly honest, this is one of my arguments toward the fact that there is an age, probably somewhere around 14 to 16, when young people can take back control of their digital identities, but before that, I do not see the benefits of allowing children to post and feature in social media content to a wide audience. Two things enhance the risk here: the amount of time they spend making content and, more importantly for what you're pointing to, the amount of exposure they get. You get past a thousand people. You get up to the millions and tens of millions and hundreds of millions of exposures. You are magnifying the risk enormously, and no child has the capacity to understand what they're consenting to when you're talking about those large numbers.
    My own six-year-old struggles with the difference between five and five million, so I can't imagine too many other kids really understand how big the exposure is.
     Thank you for that.
    Growing up, I, too, was quite interested in the YouTube space. My father, at the time and to this day, was very reluctant at the idea of seeing me join or engage in YouTube in any way, shape or form. In hindsight, it made perfect sense. I'm not particularly talented. I'm not a great musician in any way, shape or form, but it did ultimately stand to benefit me significantly.
    In your view, who is currently benefiting from this gap, and who should be held accountable for protecting children from being commercialized online? What policy mechanisms do you think might help us do that?
     The number one beneficiary in this space is the platforms. There's no halfway about it. These are massive companies, making billions in revenue, specifically in the advertising from child content and the advertising to children. The fact that there is no control over this is a giant mistake. The fact that we've been asking them to police themselves is a massive mistake, and it is not in anybody's best interest to do so.
     Beyond that, the primary beneficiary financially is the parents. If you are under the age of 12, you cannot have a YouTube account and you cannot have a TikTok account. Your point about age verification is taken. However, if you want to get paid from those things, you certainly can't get paid through PayPal or the other mechanisms at that age. The kids, then, are employees of their parents, and anything that's happening to them as a result of that is because their parents are putting them in that position to be their employees. The protections ultimately should fall on the parents or, as a proxy, on us as a society.
(1555)
    That same article states that “each kidfluencing venture is a privately owned enterprise and the 'employees' in question are minors”. Given these indicators, how do you believe platforms and regulators distinguish between legitimate participation in social media and ventures that are exploitative, and what concrete steps do you believe we can take to prevent children from being put in those situations?
     I recall that a couple of weeks ago we had Meta here, and Meta made reference to the idea of the app stores taking on that burden of essentially ensuring age verification. Do you believe that might be a venue to be explored?
    The age verification component is valuable when it talks about downloading the app and who's watching social media. However, when it comes to putting content on social media, we effectively punt this to the parents. We say that if the parents are okay with this video—with this production—because they're the ones who have to own the account, that's effectively their responsibility.
     Honestly, I don't think age verification is really going to help us here, beyond a point. Ultimately, we have a responsibility to say that there's harm being done, that this harm is being done irrespective of the de facto age, and that people are profiting from that. That is a broader, wider, more societal problem.
    Thank you very much.
    We will now go to Mr. Champoux for six minutes.

[Translation]

    Thank you, Madam Chair.
    I will start by not thanking Mr. Al Soud for making us feel a bit like dinosaurs when he said he belonged to the YouTube generation. Ours was the vinyl, 8-track cassette, camcorder and Super 8 camera generation, so a huge thank you for reminding us we belong to different generations. That said, I am happy there are different generations around the table to talk about matters affecting everyone.
    Mr. Clark, I am going to continue the discussion you started with my colleague Mr. Al Soud about potential regulations on user age and age verification.
    Australia has passed legislation banning social media for young people under the age of 16. In your opinion, is this an applicable solution? Could we use it as inspiration? Is it effective? Would those who truly intend to use their children as moneymakers find it all too easy to get around as you said?

[English]

    I think that's a great point, and I think what's happening in Australia is a good step.
    We can talk about how people might get around these regulations, but that's a deliberate act. You have to want to get around them. You have to be willing to falsify. You have to be willing to obfuscate. You have to be willing to take a proactive act to break the rules here.
     I'd rather there be rules in place that are imperfect than to have nothing in place. I think you could go straight to bans, or you could have usurious fines, or you could.... There are lots of ways to go about this that I think would, at the very least, reduce the harm.
    While I think eliminating the harm is impossible—and I think your point there is very well taken, and the same thing with Mr. Al Soud—I'm happy to live in a world of harm reduction right now, because there's none right now.

[Translation]

    Will the platforms find ways to circumvent regulations, since they are usually quite good at doing this as soon as regulations are put in place? Will they succeed in circumventing the regulations or legislation that may be adopted? How could we enforce the law? We could be somewhat cynical and think that, no matter what we do, the platforms will always adapt and find ways to circumvent any measures, make and abide by their own rules, to some extent.

[English]

     Yes, I think there is a real risk that the platforms will allow loopholes to exist. Until you tell them that they have to close that loophole, they will say, “Oh, do you know what? They didn't say anything about that. We're going to leave that one open.” Absolutely, that is always a possibility.
     I don't believe the platforms here are good actors. They are not thinking about the best interests of the people who are creating content for their platforms. They are thinking, primarily, about the other side. They're thinking about users. They want to facilitate use, watching, viewing and advertising as much as humanly possible.
    I wish I were an expert on age verification technologies—I'm not. We need better and independent age verification technologies because, as long as we allow the platforms to be in charge of this, it is in their best interest to be bad at it. They don't want to keep people off the platforms, either as content producers or as watchers. That's their audience, so you're absolutely right.
(1600)

[Translation]

    Thank you, Mr. Clark.
    Ms. Polzin Holman, I want to say I was interested in your observations on the effects and dangers on mental health.
    I am going to ask you essentially the same question. Do you think that copying Australia's model could be a solution, by prohibiting access to social media and this kind of online content based on age as much as possible? Australia set the age at 16 years, but it will be up to us to decide whether we want to set it at 14, 16 or 18 years of age. In your opinion, what potential impacts could such regulations have?

[English]

     Thank you very much for the question.
    I agree with Mr. Clark. I think the idea, as it relates to what Australia is doing, is a good first step—having some rules in place versus having nothing. Looking at harm prevention is very important, specifically as it relates to the issues that are coming up related to mental health as well as child exploitation.
    Platforms may try to circumvent any laws that are attempted to be put in place, but, certainly, I think this would limit and address issues of non-intentional consent that's happening with children. We are seeing this, as I mentioned previously, with seven- and eight-year-olds, who are simply clicking the “yes” button—“Yes, I'm over the age of 18.” There's no way to verify otherwise, and it's putting them at enormous harm from online predators.

[Translation]

    Earlier, you talked about radicalization, which is also a worrying phenomenon. Can the effects of radicalization be observed at such a young age?

[English]

    Give a quick answer, please.
    Yes.

[Translation]

    Thank you very much, that is indeed a quick answer.

[English]

    Thank you.
     Next up, we have Mr. Généreux, for five minutes.

[Translation]

    Thank you, Mr. Clark and Ms. Polzin Holman.
    Ms. Polzin Holman, is there a French or francophone version of Little Warriors in Canada?

[English]

    Unfortunately, there is not a French version at this point.
    We do have the Prevent It! program, however, and we have created a Prevent It! version that is French, for French-speaking Canadians. Beyond that, we're looking at Spanish as well as other languages, because it's being looked at as an area outside of Canada.

[Translation]

    Thank you.
    Mr. Clark and Ms. Polzin Holman, based on your experience, you have obviously seen technology evolve over many years. Based on your observations, what are the fundamental differences between technology today and technology from 5, 10 or 15 years ago? I note that the Internet has existed for 15 or 20 years now. In your opinion, what is the most fundamental change with regard to what is happening now?
    In your answer, I ask you to bear in mind that AI adds to all the dangers society faces when it comes to technological tools.
    What are the main things you have noted in terms of harms or risks to young people in the past versus those they face now?
(1605)

[English]

     Who would you like to go first?
    Go ahead, Mr. Clark.
    I think there are a few things. First of all, you're right; artificial intelligence is changing the game because of the speed at which new content can be created. You can put a few prompts into a video-generating engine, a few into audio generation and a few into an editing engine, and within 20 to 40 minutes you can have a brand new video—one that may involve no original human content—up on YouTube. You can recycle and reuse existing content to create new content incredibly fast. The AI on this is terrifying.
     The other thing that I think is really accelerating this is the ease of access to the Internet. We've all had smartphones capable of playing video, but think about how 5G has increased the quality of and access to video consumption in just the past few years. You can download high-definition videos to your phone anywhere in North America in the blink of an eye. The ability and the access exist 24 hours a day.
     If you have a kid, that means they probably have an iPad. They probably have a phone. They probably have access to a computer. Keeping them away from all these technological bases is almost a form of child abuse these days, in their eyes, because they want to be on the Internet in some way, shape or form. It becomes ridiculously difficult to police all of them and have the right protections on all of them.
    I would say that the ease of access and the AI component, the ability to create new content, are probably the two that are really accelerating this.

[Translation]

    Ms. Polzin Holman, do you have anything to add?

[English]

    I would agree with what Mr. Clark is saying. At Little Warriors, we've unfortunately already seen the issues with regard to deepfakes related to child sexual abuse, taking images non-consensually and manipulating them in a way that produces child sexual abuse and exploitation material with child and teen images.
    The concern, and certainly what we've seen, is the level of cyber-bullying coming from adults on the Internet as well as from other children in communities. Unfortunately, when children have such easy access to these tools, as Mr. Clark mentioned, they're using them to manipulate images and share images without fully understanding the implications and the effects on the individual depicted.

[Translation]

    I am going to talk about my own experience. When I was young, there were no ill intentions behind our games, how we played with others and our interactions with other young children.
    Ms. Polzin Holman, is it your impression that, now, young people intentionally want to harm or hurt other young people or make it seem like that? You talked about deepfake images and videos, and Mr. Clark also spoke earlier about how easy it was to do that.
    Have you noticed an increase in this kind of thing? Essentially, my question is this: is our society evolving towards—

[English]

     I'm sorry. You have to wrap up.

[Translation]

    No.

[English]

    Voices: Oh, oh!

[Translation]

    Is society going in the wrong direction because of these technologies?

[English]

    Dr. Polzin Holman, I'm so sorry. Unfortunately, I'm not going to let you answer that question. Hopefully, you'll have an opportunity to speak to that in a moment or two.
    With that, I will give the floor to Ms. Royer for five minutes.
    My first question is for you, Mr. Clark, beginning with some of the extraordinary statistics you shared. You said that 80% of youth surveyed aspire to a career as an influencer. I think that's what you said.
    It's 29%.
    Okay. Thank you for the clarification. That's already incredibly high.
    Daniel Clark: Yes.
    Zoe Royer: The other one was that $11 billion was earned by children in 2022. I guess that would be their parents, really, or the account holders.
(1610)
     If $11 billion was earned in advertising, the amount that was shared with the account holders would be substantially less, but we do know that the top-earning kidfluencers routinely make north of $30 million a year.
    How do algorithm-driven recommendation systems amplify harmful or age-inappropriate content for minors?
     If you've ever gotten lost down a YouTube hole, you know the power of the algorithms. They effectively reach into your brain, figure out what it is you like or want and just keep feeding you more and more of the same.
    For my six-year-old daughter, it's videos about Minecraft and Roblox. There's an unending stream of them. It keeps her online, keeps her watching advertising and keeps her as a consumer, unless my wife or I basically say, “Enough.”
    There is no end to the Internet anymore. Once upon a time, I used to joke that I'd reached the end of the Internet. That doesn't exist anymore. The algorithms are always able to find and surface content that will keep me on board and keep earning them money.
    Is there certain influencer content that generates higher-risk profiles than others? Would it be Minecraft searches? Would it be beauty or sports? What are some of the top ones?
    The question of risk comes down to which group you're talking about.
    For, let's say, preteen kids—anybody who's sub-12 years old—you're looking at things that allow them to engage in a world and get lost in a world. Gaming content is a big one. Ones that feature personalities that look and act like them—other 10-year-olds, seven-year-olds and eight-year-olds—allow them to escape into a world where they have friends instantly.
    My daughter frequently says, “Oh, I'm just watching my friend.” I ask her how this person is her friend. She says, “Well, they talk to me. They watch me and I watch them.” She wants to make videos so that her friends can watch her. There's that particularly immersive world to which preteens are susceptible.
    For teens, it's about creating a more idealized, socialized world where they have larger friend groups. It might be, as you say, fashion videos. It might be kids hanging out and doing fun things or doing silly stunts, etc. There are different pockets that appeal to boys and there are different pockets that appeal to girls, but it is a fundamentally different architecture where it's more about specifically the kids driving their own world in the teenage years, whereas in the sub-teen years, it's about creating a world for the kids to get immersed into.
    That's interesting.
    What do you think would be the long-term societal impacts if the current trend in youth exposure remains and if it remains unaddressed? What would be the long-term impacts?
    It would be devastating.
    We use proxies of former child TV and movie stars a lot as a warning sign. It's the kid who got all of their money stolen by their parents or the kid who became an alcoholic or a drug addict or committed suicide. All those situations were fundamentally contained by the fact that there's a relatively small amount of TV and movie content that's produced.
    Now, multiply that by infinity, because there's no limit to how much social media content can be produced. You're potentially creating a pool of victimized young people that is enormous. If we thought it was bad to watch Macaulay Culkin get all of his money stolen by his parents, imagine that multiplied by tens of thousands of kids being pillaged of their income, which is taken by their parents or whomever.
    That's the time. Thank you so much.
    Next is Mr. Champoux for two and a half minutes.

[Translation]

    Thank you, Madam Chair.
    Ms. Polzin Holman, you mentioned deepfakes earlier. In November 2024, one year ago, the committee conducted a study on the harm caused by viewing illegal sexually explicit material online. That study led to recommendations, and some of them were quite simple. I would like to know what you think about some of those recommendations.
     To summarize, the fourth recommendation called for digital platforms to implement processes for detecting and reporting illegal, sexually explicit content, such as child sexual exploitation material and the non-consensual distribution of intimate images, including deepfakes.
    The fifth recommendation was also fairly easy to implement. Witnesses who appeared before the committee requested that section 162.1 of the Criminal Code, which defines intimate images, be amended to include the concept of sexually explicit deepfakes. However, this has not been done.
    In your opinion, are these worthwhile tools to put in place? Should we put these requests to the government again, in order to implement the recommendations set out in the study we undertook in the previous Parliament?
(1615)

[English]

     Thank you so much.
    I completely agree. As outlined in those recommendations, that would clearly be a good place for us to begin setting clear guidelines and expectations. Just to add to that, without them, I think we will be looking at further issues related to child exploitation, luring and other online harms, as well as other issues that come forward as a result and require treatment and intervention, which is very costly.

[Translation]

    Do you sometimes feel like we're going around in circles and that, no matter what we do, almost nothing changes, because the platforms will always be extremely vigilant when it comes to regulations and will always manage to find loopholes, as Mr. Clark said earlier? Are we doing this work for nothing? At some point, it is discouraging.

[English]

    I understand the discouragement. We certainly see that in the work we do at Little Warriors. Again, though, having nothing in place amounts to giving up and failing to ensure the safety of our vulnerable children and teens. I think we need some kind of rules in place. Having nothing and not taking responsibility as a society is just not acceptable.

[Translation]

    Thank you.

[English]

    Thank you.
    We'll now go to Mr. Diotte for five minutes.
    Thank you, Madam Chair.
    Thank you to both witnesses. It's a really fascinating topic.
     I wanted to talk to Dr. Polzin Holman.
    First of all, thank you for your good work. I'm very familiar with your organization, being a fellow Edmontonian. You have a great reputation.
     I wanted to bring this home to talk about a real-world story. I noticed that on your website you talk about the case of Michelle, a mother whose story reminds us of why Little Warriors was created. Can you tell us a bit about Michelle's story and the story she shared about her daughter and how her daughter fell victim?
     Thank you, Mr. Diotte. I appreciate your kind words.
     I certainly can say a little bit more about the child and family that were impacted. The child was treated at Little Warriors in our intensive episodic treatment centre. The reason she came to us is that there was an online predator. The parents of the child, who at the time was somewhere around 12 years of age, were very much involved in her life, setting controls, checking in and supporting her digital literacy.
    The issue actually came when the child was at school and was able to access social media platforms, where the offender took advantage of safeguards that weren't in place and continued to reach out. They came to Edmonton, not once but twice, and met her at her school. At that point, they took her across the border into the United States and significantly sexually assaulted and raped her.
    This, unfortunately, is a very significant example of what can happen and what the harms are. Unfortunately, this is not an isolated incident. We've had numerous other situations of boys and girls under the age of 18 who have had very similar situations and have been assaulted as a result and manipulated in various ways. It is a problem that definitely needs to be addressed. We can't settle for what is currently in place.
(1620)
    Thanks for that.
     You mentioned sexual offenders being released with mere warnings. Can you talk a bit about that and how often that occurs?
    Unfortunately, we have situations.... Mr. Diotte, you would also be familiar with the case. There was an offender who was reported at a day care just outside of Edmonton, where he had access to children under the age of five. He was awaiting trial and actually fled the country. Since that time, we are aware of other children, again four, five or six years old, who unfortunately experienced abuse at his hands in the day care setting.
    We've had numerous situations where children and teens who are awaiting treatment and can't get into treatment have seen their offenders released because there's not enough information, the sentence was insufficient or they served only a short period of time. Those offenders were released and have reoffended.
    What would you like to see done in that regard? What are some immediate steps that could be taken there?
    It's not my area of expertise to say. Certainly, I think we need to be looking at our sentencing and perhaps having mandatory minimums, as well as looking at the process and the speed at which we support vulnerable children, youth and their families to report sexual abuse.
     I take it that you would not be a fan of the recent Supreme Court decision against mandatory minimums for child pornography.
    I know it's a complex issue. I understand some of the rationale behind that.
    From the perspective of supporting vulnerable children and youth, my position is always to ensure that our focus is on supporting children and youth and ensuring that they're safe. Offenders should be serving their sentences and be treated with very significant penalties for their actions.
     Thank you, Doctor.
    I'm going to give the floor to Mr. Ntumba for five minutes.

[Translation]

    Thank you very much, Madam Chair.
    I salute both witnesses and thank them for their opening remarks, which were quite interesting.
    Mr. Clark, I don't know if the same thing happens to you, but whenever I talk to my children and mention a product, such as a brand of water or a watch, I automatically see sales offers on my Facebook or Instagram page. Yet I haven't searched for the product, and it is not Black Friday.
    Are our devices listening to us? If so, how can we fix that?
(1625)

[English]

    My wife would agree with you that somebody's listening in.
    To a certain extent, this is happening in a lot of ways. You type anything into Google, and that gets saved as a cookie that other platforms can draw upon to create content and push advertising towards you that is specific to your interests and the things you like. A big part of the Internet phenomenon is that anything you do on your phone can feed that profile. There is considerable risk that things like Alexa and Siri could indeed be listening to you and making sense of what you want.
    Also, there's WhatsApp, for example. There's the ability of the platforms to mine what you're doing on WhatsApp, and particularly Meta, obviously, because they own WhatsApp. Then they can port that information about you over to Facebook or Instagram, their other platforms, and use that to figure out what you like, where you're going and what you're interested in and direct content towards you. That is absolutely happening.

[Translation]

    Earlier, you mentioned cookies. Generally speaking, cookies contain a lot of information; there are several pages to read and the font size is small. Personally, I admit that I have never read all of these pages.
    If we decided to read them, would we discover that we agreed to be monitored by clicking on the “Accept” button? We use our devices to communicate with the outside world, of course, but using them is no longer safe.

[English]

    Yes, they change the fine print in the user agreements on a regular basis. Even if you actually did read it with a fine-toothed comb and spent many hours going through every single line, a month later they'd likely change it on you and it would look completely different.
    Yes, you are effectively giving them permission to do pretty much whatever they want when you click on the box.

[Translation]

    Ms. Polzin Holman, I know a little about your background and your commitment to combatting child porn.
    As young parents, we want to take photos of our children and post them on Facebook, Meta, or Instagram, for example. Recently, Mr. Généreux and I went to Kelowna. In a workshop we attended, we heard that some people used one child's photo to make an ID card for another child elsewhere.
    As parents, we know what photos we post online. How can we protect those images? Should they not be posted?

[English]

    Given what we know, I don't think we can protect them, other than by not putting images online. We have had training through the RCMP and other specialists in the area of online images. What we know is that even ordinary photos that parents put up on Facebook or other social media platforms, of their children playing in the bath or something like that, can be taken and manipulated into sexual content without consent and without awareness. We are aware through the training we've received that there are children's and teens' images being manipulated and used in ways we would never even necessarily know about.
     Dr. Polzin Holman, thank you very much for your remarks.
    At this point in time, I am going to take five minutes for questions, and then we'll continue from there.
    My first question is for you, Dr. Polzin Holman.
     I'm wondering if you can comment a little bit further on.... Maybe we'll start here, actually. On your website, you state that more than 95% of child sexual abuse cases in Canada go unreported. I'm just curious as to why that number is so high.
(1630)
     There are several reasons why that number is so high. First and foremost is that with many children and teens who are sexually abused, it is happening with offenders who are very well known to them; family members or other people within their social circles are offending against them. Over time, they and/or their caregivers have been manipulated and groomed, either in person or online. Oftentimes, by the time the offences take place, there has been a lengthy period of time when they've been involved with certain activities and been exploited, so they begin to feel that they are somehow partially to blame, or there's embarrassment or shame. Oftentimes, offenders will manipulate them and tell them that they will be hurt or their family will be hurt.
    There are many reasons. I recently read a study—not a Canadian study—that found that for many adult survivors of child sexual abuse, the first time they share the offence is actually in their fifties. I don't have Canadian stats on that, but there are many reasons why that number is as high as it is.
    Thank you for your reflections on that.
    I'm curious if you can reflect a little bit more on what unique harms arise when the sexual abuse, grooming or exploitation takes place online versus in person.
    I think there are many of the same harms. Obviously, there are significant impacts for safety and significant impacts to mental health and wellness, as we've talked about.
    I think the difference is that when things are going on online, we don't know the extent to which it is happening, and it crosses borders. It doesn't just stay in the home or the community; it can be transmitted halfway across the world. The images are shared, and no one would know that. While there are similarities—and I don't think that one is better or worse, less disruptive or less disturbing—I think the issue is really with regard to how far-reaching it is.
    Doctor, in your time with us today, if there was one thing that you wanted us to take away from our time with you, given your experience over many years—probably decades—of working with children in these vulnerable situations, what would be the thing you would want us to take away?
    I think it's understanding the level of concern. Oftentimes, people think that this happens to other people or other people's children. This is happening. I guarantee that once you start talking about this and removing the barriers for communication and discussion around this, the prevalence is very high.
    We know that one in four girls and one in six boys under the age of 18 is sexually abused in some way, and we know that 95% isn't reported. When we talk about the numbers, it's a very significant number. We need to be taking it seriously and recognizing that this isn't about someone else's child. This is about all of our children and all of our teens, and within all of our families and communities.
     Thank you very much for those reflections. They're very much appreciated.
    I have a quick question for you, Mr. Clark. You said that $11 billion is earned in ad revenue in the United States of America. I'm curious to know whether you have any stats for Canada.
(1635)
    Unfortunately, I don't. I believe that was actually the total earnings of the platform. That would include Canada as well. I don't know what the Canadian ad revenue is for media consumed by children under the age of 18.
    Thank you very much.
    MP Myles, you have five minutes.
     Thanks, Madam Chair.
    I have been on social media for most of my career. I've witnessed how it influences my own mood and how I react to the things I post. I was a musician for years. Social media behaves very differently from TV or movies, as you said before. The thing I find about it particularly—I have young kids—is the addictiveness. Sometimes I wonder if we're going to look back at this time and say, "We were giving cigarettes to our kids", because in many ways it's really similar. I see the same gap. I see how addictive it is. We're probably all addicted to it in some way. I'm a disciplined person, and I still find it hard to discipline myself to turn the thing off and stop. The algorithm knows what I want to see, what's going to get me upset and what's going to make me happy.
    Maybe both of you could answer this. In terms of regulation, when we're talking about who uses it and how they use it, how can we do this when the algorithm itself is so addictive? Is there any possibility, from your discussions, to address the actual addictiveness of the algorithm? Of course, the platforms aren't going to be interested in that, and maybe that's overstepping, but at some point I think we need to talk about this as a highly addictive thing in the same way that we talk about cigarettes. We regulate cigarettes. We regulate their advertisement to kids. We regulate how people interact with them, how they're sold and all those kinds of things.
    This is a little bit outside my area of expertise, but I actually think I want to weigh in on this a little bit. There is a privacy issue here. You are a unique individual to YouTube or to Instagram. They know you. They know what you like. They know what you want. They know what you've watched before. They have a file on you somewhere.
     Maybe they shouldn't. Maybe they shouldn't be allowed to capture personal usage data on us so that they can target us. I don't know about our ability to regulate this, and chances are this data does not exist in Canada. It's probably in some vault somewhere in the States or in Iceland or wherever, but the simple fact is that they are collecting information. They are using it to figure out who you are, what you like and what you want.
    I swear, I think YouTube knows me better than my wife does, and we've been married for 10 years. She knows me very well, but she can't tell you what I want to watch on YouTube the way YouTube can. That's a frightening thing for me. I wonder about that degree of privacy and the platforms' ability to know us so well.
     I would just add to something you brought forward with regard to addictiveness and brain development. We know that the brains of children and teens are not fully developed; they aren't fully developed until well into their twenties. We have seen the effects of the addictiveness of the algorithms, and the changes in socialization and interests, even in infants who are given iPads and other devices. We know that the reward centre of the brain is completely connected to those algorithms.
    It's very clear from a neurodevelopmental perspective that this kind of thing is actually changing brains, and is therefore changing the trajectory of our children's development.
    That's exactly what I fear, that we're going to look back at this and say, “Oh, man, we really missed this. We were unaware.”
    What you were saying was interesting, Dr. Clark, about some of the opportunities. Maybe it's not about looking at the algorithm in general but at where in that algorithm it becomes more addictive.
    I think the other thing, and maybe we can speak about this a little bit more too, is that these algorithms are agnostic. They don't just feed you what is positive. They also feed you what makes you angry. As a content creator, you can see something really blow up on the Internet. It might feel good, but something can blow up because of the negativity of it as well, as we've all probably experienced. We get lots of comments, and they might not be positive. When you think about that in the adolescent mind or in the young mind in terms of self-image, these can be devastating. I mean, it's one thing to get applause for a song you sing. It's another thing to become a meme because you didn't sing well and you become a joke.
     It's different from television in that way. This algorithm is agnostic. It just wants to gain a reaction, whether it be negative or positive. That certainly has an effect on one's concept of self-image.
     Maybe you can speak to that, Dr. Polzin Holman, from your own experience.
(1640)
     I think you've said it very nicely.
    I would add that we're not just seeing the positives or the things that make us angry; we're seeing things that are not real. With AI, people are not able to determine—and certainly children and teens are not able to determine—what's actually real and what is fake.
     I think this changes a lot of things with regard to how we view the world and how we see ourselves. Your point is well taken: Are we missing an opportunity to step in and address this issue? I would wholeheartedly agree.
    Thank you.
     Mr. Champoux, please go ahead for two and a half minutes.

[Translation]

    Ms. Polzin Holman, I found Mr. Myles's parallel with cigarettes very interesting. In the 1970s and 1980s, people smoked everywhere, including in classrooms at school and in CEGEP, and in smoking rooms. Then we realized how harmful it was to our health and began to implement regulations. Everyone agreed on this, since it was obvious.
    What we are experiencing today is an equally worrying phenomenon, even if it is obviously on a different scale. It is just as worrying to see what is happening with young people's exposure to platforms, social media and the Internet in general. My 14-year-old daughter questions what her teacher teaches her at school because she saw something that contradicts what he has told her. She gives credibility to what she has seen online, for example from an influencer on TikTok.
    I think we also have a duty to educate. Parents should do it much earlier, as a preventive measure rather than a reaction.
    Do you think we are not doing enough? Do you think that we are taking a long time to wake up, as was the case with cigarettes?

[English]

    I think that's a good comparison. I have two things to say about it.
    First and foremost, with children and teens, we know that development happens in certain stages, and that their experiences inform how they move through those stages. When we don't have controls, when we don't have the ability to walk them through or talk them through the things they're experiencing at those stages, it affects them as individual human beings.
    On social media, we're seeing freedom of speech often manipulated and used as a disguise for anything and everything goes, versus slowing things down enough that parents, educators and all of us in our children's and teens' lives have the ability to intervene. We don't have the ability to intervene when things are so fast-paced and when influencers, as you mentioned, are being seen as experts.

[Translation]

    Thank you very much, Doctor.

[English]

    Thank you very much.
    Next, we have Mr. Généreux for five minutes.

[Translation]

    Thank you, Madam Chair.
    Ms. Polzin Holman, I would like you to tell us a little bit about your organization. I think it does some very interesting work.
    Earlier, you talked about what I would call digital literacy among young people, whether on the part of parents or organizations such as yours. You mentioned the possibility of getting more funding, particularly through tax credits, in order to be able to do this explanatory and educational work. This is what I understood earlier. The donors or philanthropists who already fund you could be more numerous and bring in more money not only to your organization, but also to Canadian organizations like yours.
    Could you tell me a bit more?
(1645)

[English]

    Thank you so much for your comments. I'm certainly in full agreement.
    As a charity, our organization has quadrupled in size in the last eight years. When I first started, we had a children's program to support children and their families who had been sexually abused. Since that time, we've expanded to support adolescents. Since the pandemic, we've had to increase our services to support families through intensive family programs for intergenerational trauma. We've also expanded our online psycho-educational programs for children, teens and families, and we support educators and other professionals in the field across Canada.
    We know that we're one organization. There are other really great organizations doing similar work, but our funding has not expanded. We continue to fundraise for these things, and we know that the issues and the concerns continue to grow, as demonstrated by our own growth to be able to support children and families in various ways, both online and in person, and within different developmental areas.

[Translation]

    I would simply like to correct something Mr. Ntumba said earlier. He said we all went to Kelowna together, but it was, in fact, Toronto.
    It was with Joël Godin.
    That's right. In Toronto, we took a course on AI. It was impressive to see young people coming into class thinking they were experts because they had heard influencers say certain things that contradicted what their teacher had said. This ties in with what my colleague from the Bloc said. In itself, arguing about an issue or debating it with a teacher or an adult is not a bad thing, because it is a learning experience.
    Mr. Clark, earlier we made a comparison with cigarettes. Parents have a role to play in all this. Do you think that today, despite incredible advances in technology, particularly AI, this parental role has diminished or become less important? We have just completed a study on AI.
    Should parents be even more informed or involved in managing this?
    I note that Quebec has decided to ban cellphones in classrooms. In my opinion, this is an excellent decision. As Mr. Myles said, we are addicted to our cellphones. Last week, I was asked to put my cellphone away because I was no longer talking to anyone. We talk to ourselves.

[English]

     I think parents obviously can be an important part here. There's no halfway about it. The parents are not less important. They're probably more important than they've ever been, but they don't necessarily have the tools they potentially need to effect the protection of their children.
     Let me give you an example. In 2002, a young man in Quebec, Ghyslain Raza, was fooling around with a golf club as if it were a lightsaber. His friend videotaped it, and the video was put online, where it attracted a billion views. That kid then got absolutely pilloried online through trolls and through abuse, to the point where he had significant psychological repercussions from it. Nowhere in there was the parent really in a position to do anything about it. It was a video that was just taken and put online, and it just exploded.
    Yes, parents are going to do everything they can to try to protect their kids, but their kids have so many points of access to this information. They can use a computer at school or at a coffee shop. They can use their phones anywhere.
    Either we're going to become Luddites and give up technology, which I don't think any of us are prepared to do.... We don't necessarily think that's good for our kids either. As parents, we can do our best, but we need help, we need tools and we need the ability to have support.
    As you say, the rule about blocking phones in schools is a great idea, absolutely, but it only covers part of the problem.
(1650)
     That's the time. Thank you very much, Mr. Clark.
     We will now go to MP Myles for five minutes.
    I will just keep going on this topic, because I think part of it comes down to awareness, particularly about the behaviour of the algorithm. I think we're still behind in understanding exactly how it reacts.
    One thing I am particularly concerned about, and Monsieur Champoux mentioned it as well, is the idea of radicalization and polarization. Because of the nature of the algorithm, it seeks strong reaction, both negative and positive. A moderate position that doesn't get anybody super angry or super happy doesn't have much of a life online.
    I wonder about the effect on young people's capacity for civil public discourse and for being able to disagree. We're quite good at it here in this room, I think, and that's very important for countering polarization and for the future of democracy in our country. I do worry about the effect of the algorithm on young people's ability to have civil discourse. As you know, if you're just looking through the comments...but even the way the algorithm itself behaves and reacts makes me fearful of these things.
     How can we make people aware of these things? How can we address these issues so people understand that discourse in person looks very different—I hope?
     I think we can all learn from hockey a little bit here. You have a game where people are checking each other and slashing each other and shooting pucks at each other's heads. At the end of the game, you go up to each other and shake hands. You have some high-fives.
    I think we have to learn to have intellectual battles the same way we have physical battles and know that it's not personal. If we talk about the role of schools and the role of parents, I think if you could engage kids in a way that allowed them to learn how to have arguments—it's not about you; it's about the ideas—and how not to take everything so personally but just have free-flowing, progressive ideas, it would be great. I'd love that.
    I mean, that's what being an academic is all about. That's what we try to preach in academia. If we could figure out how to teach our kids to do that, that would be amazing.
    The challenge, of course, is that in a hockey game, you are in the room with the person. There is a human being on the other side. In a university class, there's a human being. You sense that they have feelings and dignity. Online, you don't have that. There is a facelessness to it. That is the danger of this.
    Just in terms of it becoming the norm of behaviour, how do you address that? They're not in the same room. They're not seeing each other at the end of the game.
     It's a great question. I don't know if you can address it. The whole value of the Internet for some people is the anonymity. They are not personally held to account by the same rules of society that we have face to face. In fact, it's one of the reasons that some kids are so comfortable being online, because this live and in-person judgment is hard. It is risky. Online, you can say anything. You can be anything. You can do anything. Tomorrow is a fresh new day.
    Dr. Polzin Holman, please, do you have anything to add?
     I would agree fully. From the perspective of offenders and perpetrators who are specifically looking to befriend children and teens and sometimes even single parents, for example, you can present in any way you want online. You're not having that discussion in person.
    One thing we teach parents or caregivers at Little Warriors in our Be Brave bridge program, which is an online psycho-educational component of our program, is to really have discussions about things. Issues can't be black and white, and parents can't simply control everything, because teens specifically go out of their way to try to access things when their parents say, no, you can't do this.
    The issue is partially that, but it's also partially giving parents the tools to be able to have the very important conversations about safety and risk and why they're concerned, because there can be discord between them.
(1655)
    Thank you very much, both of you.
    Fantastic. Thank you.
    MP Diotte, you have the floor for five minutes.
     Thank you, Madam Chair.
    Professor Clark, you mentioned, I believe, that some child influencers can make upwards of $30 million. Can you name, say, five or six of the top influencers in Canada?
     That's a much harder thing to do. It seems that all the really big money influencers are in the United States or Europe or Asia. Honestly, I don't know of any Canadian kids who are in that strata of income. I would also add that $30 million is what they earn from the platforms. It's not what they earn from advertising revenue and other forms of monetization that they do.
    Who are some who are fairly popular right across the board? Obviously, it's a global village. Can you name five or six? I wouldn't know who they are.
     The biggest YouTuber in the world is a channel called MrBeast. He has transformed the space. He is the first YouTube billionaire. He has product marketing. He now has a Netflix series.
     The most powerful kid is a kid named Ryan Kaji. He is the first person to break the $30-million-per-year level as a child. He started at the age of six. I believe he's 15, and he has kind of phased out of kidfluencing. However, his parents, who own the channel, obviously have passed it on to his younger brother and sister, so they're the ones who are making the money for the family. They have a company with 50 employees. They have cameramen, sound people, writers, editors, etc. It is a massive organization.
    Those are two of the really big ones that I think are worth noting.
    However, I have a list in my paper that I'm happy to send you, which includes all of the top 1,000 influencers on YouTube. It's everything from something that's utterly mundane to stuff that is radicalized and very political. They all have tremendous following, and they have a tremendous ability to change and affect behaviours.
    I guess, for one thing, if parents were watching this committee, they'd probably be fairly alarmed by just how pervasive it is. I guess the big question they would be asking is what they can do. Parents, as you rightly noted, are the best keepers of their children. We don't want governments telling parents what to do with their children, obviously.
    What would you give as advice to parents to help them deal with this phenomenon?
    The classic thing, which was introduced 20 years ago, is just as true today: Make online activities communal. If you're using your iPad, you have to be in the living room, where we can all see what you're doing, or on the family computer. The point is to limit the ability to do this in their room alone without being able to watch it. That gets progressively harder the older the kids get. You have privacy issues. You want the kids to develop their own morals of right and wrong.
     At least with my young daughter, we basically say, yes, she can be online, but we're going to watch what she's doing, and we're going to be there. If we hear something and don't like the way that person is talking, we can say to change the channel, to watch something else, to do something else. Honestly, without that, I don't know what we would do.
    That's a really good point, but is there anything else that's sort of top of mind?
     I would also say to limit the amount of individual access that they have to social media. Don't let them have a Facebook account. Don't let them have an Instagram account or, at the very least, don't allow them to post. They can follow other people and watch other people, but their ability to put themselves out there is limited.
     Fundamentally, you have to think about online activity as a game of Russian roulette. The barrel can have millions and millions of slots and one bullet, but the more people do online and the more exposed they are and the more views they get, the more the number of slots starts shrinking, to the point where you are almost guaranteed that something bad is going to happen to you when you get into that realm of 100 million views.
    However, everything is a risky behaviour. Everything you do online is a risky behaviour, and it's about how risk-tolerant you are. I don't want kids under the age of 14 having to make decisions about risk. They're generally very bad at it.
(1700)
    Thank you. Those were good points.
    Thank you, Mr. Clark.
    I'll turn the floor over to Mr. Ntumba for five minutes.

[Translation]

    Thank you, Madam Chair.
    Mr. Clark, I would like to come back to influencers—Mr. Champoux and Mr. Myles also mentioned them earlier. As their title suggests, these people have a lot of influence on young people, on the current generation, and on adults in some cases. Influencers talk about finances and weight loss as much as they do about skin care, for example. In some cases, they do so simply out of habit. In most cases, they do not have the formal training or skills to do so.
    How do you see regulating these people who talk to our children and us? Should they have special knowledge about the subjects they talk about on social media?

[English]

     That's a great question.
    The problem is that there are so many channels available. Most influencers are not on just one channel. Take somebody like a podcaster, like Joe Rogan. He has enormous influence in the United States—political influence, social influence. He can create ideas and he can create trends on a whim, but you don't have to listen only to his podcast, because it's sliced and diced and ends up on YouTube. It ends up on TikTok. It ends up on Instagram. It ends up on Facebook. I don't want to be a free ad for all the platforms out there, but the simple fact is that one voice can have so much reach because it can be amplified in these ways.
    To be honest, there is literally nothing we can do to monitor what they're saying or control what they're saying, because they have free speech, and when they're adults, they truly do. The best thing we can do is try to figure out how to limit the forms of consumption, or at least give control to parents over forms of consumption.

[Translation]

    You are right to say that control needs to be given back to the parents. However, social media has found a really clever way to create accounts for children. YouTube Kids has been created, and Instagram for kids will soon be launched. Parents manage the account with their child and tell themselves they control the situation, but the child is still being exposed to screens.
    How can we limit screen time? Children could tell their parents that it is set up as a kids' account, and ask them why they cannot have access to these platforms. In today's world, we often talk about rights and freedoms. We often hear we have the right to express ourselves and that parents do not have the right to deny their children access to these platforms.

[English]

     Yes, you're right. YouTube for kids and Instagram for kids are like watering down wine: You're still drinking alcohol, and if you drink enough alcohol, you're still going to get drunk.
    YouTube Kids doesn't allow advertising. That's awesome, excellent, wonderful. However, they do allow product placement and they do allow other forms of sponsorship and spokesmanship. What they allow content-wise changes based on how old the kid is, so if you're under three, all you're getting is a bunch of cartoons or canned media, and that's fine, but when you start getting close to 10 or 11 or 12, you get more adult content that has a wider range of influence.
    Honestly, I think the only beneficiaries from YouTube Kids are the platforms. By creating platforms specifically aimed at kids, all they're doing is turning them into the addicts that Mr. Myles was talking about. They are just creating a behaviour that is going to spill over. It doesn't really protect them that much in the here and now, and it certainly doesn't protect them in the long run, because the content is.... The medium is the message.
(1705)

[Translation]

    Thank you, Mr. Clark.
    Ms. Polzin Holman, in your opinion, what public policies could help reduce the psychological risks of excessive content consumption on social media?

[English]

     I think that going back to supporting parents and caregivers with the right kind of information is really important. The platforms are constantly changing, so without having access to understanding how to support their children, it's very difficult. Many times, the children and teens themselves are more savvy on the computer than their parents are, so for the parents, having the right resources is incredibly important to protect their children.
    I think the limits and whatnot, as Mr. Clark has spoken about, do need to come from parents, but also, as has been mentioned, it's very difficult. You can have a parent who is very good at that in one household, but if the child goes on a play date with another child, suddenly there's open access in another home. How do we ensure that all parents have access and that this access is ongoing? Very few parents understand that when their child is playing Roblox, for example, they can have access to friends online who may be 60-year-old adults posing as 12-year-olds. Parents need the information, and it's constantly changing and it needs to be updated.
    The second thing I would say about the influencers is that right now, the influencers are motivated in lots of ways, but is there potentially a way to motivate influencers to support mental health and wellness and awareness of these things? I think those areas should be and can be explored further.
     Thank you.
    It's over to Mr. Champoux for two and a half minutes.

[Translation]

    Thank you, Chair.
    Mr. Clark, earlier, I used the example of my 14-year-old teenage daughter, who gives more credibility to some influencer she saw on TikTok than to what a teacher tells her about a given subject.
    This leads me to reflect on the educational models in Quebec and Canada. In your opinion, given the influence of social media on young people, isn't it time for Quebec and Canada to rethink the way we teach elementary and high school children? Isn't the traditional classroom, meaning a 50- or 60-minute lesson on a given subject, a thing of the past? Isn't it time to rethink this model, particularly in light of what young people are consuming on social media, among other things?

[English]

    I'll punt the question about what the best model of education is to my wife, who is an elementary school teacher, but we do think it's a good idea for our kids to learn to question everything.
    Fundamentally, as an academic and somebody who teaches in university, I want them to learn how to question things and ask their own questions, but here's the thing: As you say, if your 14-year-old is discounting what they learn in the classroom and then getting their information from social media because they hear things they like and they hear things that confirm their world view and their perspectives and give them permission to act, behave and think the way they want, that's not better.
    We fundamentally start chasing the opinions we like and the opinions that matter to us, and that is where influencers actually get their power.

[Translation]

    It is also a matter of format. Isn't the format that influencers use to communicate information something we could use to get young people more interested in core subjects?

[English]

    There's a really interesting point there.
    Let's face it, to Mr. Myles' point of view, there is nothing as addictive as a short video, like a 60-, 80- or 180-second video. It requires very little engagement, but you get very quick hits of the endorphins Ms. Polzin Holman was talking about.
    You may remember hearing that every time you hear your phone buzz, you get an endorphin rush because it means somebody is engaging with you and somebody is interested in you. Those short videos basically work the same way. They create a rush of endorphins as you get something interesting or fun or cool or exciting, and they never let your brain go.
    I worry about us using the exact same techniques in the classroom, because you get a teacher who is basically manipulating your endorphin levels, but I do think it is a very effective way to communicate and it does actually seem to create something that children respond to.
(1710)

[Translation]

    Professor Clark, if we manage to do this with algebra, we will become millionaires, you and I both. We should try it.
    Thank you very much.

[English]

    All right. I will be taking the Conservative time slot for five minutes.
    My first question goes to Ms. Polzin Holman.
    You talked about this in your opening remarks, but I'm hoping you can expand on it quite a bit. On the Internet, social media platforms and gaming platforms are often used to lure, groom and eventually exploit young people. You work with these individuals after these events have transpired, and, of course, they have a significant effect on them.
    If you could, I would like you to talk about how this works in really simplistic terms. I think we have members of the Canadian public who are unaware of the fact that gaming and social media platforms can be used in this way and the danger that is posed to young people.
     I think there are numerous ways this happens online. We know that there are social platforms such as Instagram and Snapchat, as well as Roblox and other games where children can be playing even in front of their parents in their living room or dining room. Parents can see it happening. The children are plugged into headphones, and they're having conversations with friends. They don't know if their friends, like I said, are actually who they say they are. They might have certain photos of them. They can have conversations. Oftentimes, it's not verbally; it's through different texts or instant messaging.
    What happens is that the grooming process, from the perspective of the children we see at Little Warriors, happens immediately. There are predators out there who are looking for vulnerable kids. They start gaming, and it all seems very innocuous, but over time, what happens is that they start learning that the child is maybe in a single-parent household where there's not a lot of supervision, that the parent is working two jobs or that the child is alone at certain times. Then the child has a picture behind them of their school information or is wearing a hoodie or a sweatshirt. Those things can get very easily manipulated.
    We have parents who have been very involved in their children's lives and have checked their social media. They have accounts that they can look in at any time, and they wouldn't necessarily see that there's an issue until there are different stages of the grooming process—when they're meeting up, sending images and then manipulating children and teens because they have sent inappropriate materials. Sometimes they're actually paying for different access to games or for coins, for example, that they can use to improve their standing in certain games, and they can send that through the Internet without anybody really knowing anything at all.
    This happens online, as I said, much more than parents are aware of. It's frightening to know the extent to which this is happening and the ease with which it is happening. Unfortunately, as I've mentioned, it has ended very dramatically and very sadly with children being sexually exploited and abused.
(1715)
    How does it transpire, then? How does it go from sitting in your living room, where your parents are aware of your conversations and what's transpiring, to being sexually abused? How does that happen?
     Well, sometimes it even happens that the offender is grooming the parent. It may even be that they're talking with the parent online. We've had that happen, too. They can manipulate their voice. They can manipulate their messaging. With the child, it happens very insidiously. It's very small, little things, as Mr. Clark talked about. They start talking about the same likes—that they like hockey and that they like certain games—and they are watching each other playing the games. Then they'll start asking to send pictures, and it just keeps building and building to a place where suddenly the child has revealed their location, their school information, their age. They've sent photos, and none of that can be taken back. If a child is feeling that they'll get in trouble with their parents or caregivers, they're not about to share that information even with their friends or siblings or their parents.
     Thank you very much. I appreciate that.
    For our final question round, we are going to Ms. Royer.
     Thank you very much.
    I would be remiss if I didn't mention Carol Todd. She's an educator in school district 43, which is in my riding of Port Moody—Coquitlam. Her daughter, Amanda Todd, committed suicide in 2012 at the age of 14. At the time, her daughter was being groomed by an anonymous online "friend" in the Netherlands. Carol has been a great advocate for tabling legislation.
    My first question is for Dr. Polzin Holman. It's around the fact that our government tried to bring forward legislation to mitigate the risk of exposure to harmful content, including content that victimizes children. I know that this legislation died in January this year, when Parliament was prorogued. I'd like to hear from you, Doctor, about what within the online harms act, Bill C-63, you feel should come back in its original form, or what might need to be changed or strengthened.
    I am fully in support of the online harms act. That act, or something similar, is needed to ensure the protection of our children and our teens.
    I think there are a few key areas. One is that we need to ensure that whatever we put forward limits child sexual exploitation, and there needs to be deterrence. There needs to be some kind of deterrent for anyone who's producing the material or using the material. Currently, it's very unclear and it's not consistent across provinces and across judgments.
    The second is that regulation is needed for schools and for other community activities. It's very difficult for educators to take on this task on their own. Oftentimes, as I mentioned, the platforms are changing. The information needs to be shared with parents and caregivers as well, on an ongoing basis, and it needs to be updated.
    The third piece is that, at a community level, there need to be updates and supports for programs such as the Little Warriors' Prevent It! program, or for other organizations that are doing similar programming, so that parents and educators, as well as children and teens, have the right information and can continue to have conversations about this.
    Just limiting things and putting laws onto these areas is not enough. We have to continue to have the conversations, because things are constantly changing and being updated.
(1720)
    Given that her daughter would have had her 15th birthday in three days' time, I know that Carol was very upset that this legislation kind of died. We will certainly take your comments into consideration when, hopefully, bringing it back.
    I'm going to share my time with Mr. Al Soud.
    Thank you, Ms. Royer.
    Dr. Polzin Holman, when you testified at the Standing Committee on the Status of Women last year, you noted—and I pray that the number here is wrong—“recent reports indicate that over the past five years, online sexual luring of Canadian children is up 815%.” Is that accurate?
    I leave this question open-ended. Could you speak to it a little, please?
     It is accurate. I took my stats from specific areas. I can provide this committee with more information and similar statistics. We are certainly seeing that.
    I think the biggest change point we've certainly recognized is that, during the pandemic, there were so many things that just completely opened up for children and youth as a result of being online and having access to devices for school. For those reasons, it served a purpose. However, since that time, I don't think any of those controls have been restored or the time spent online pulled back. Even though children and teens are now in school, they're constantly on devices. Schools that have taken the stance of not having children on their phones have certainly limited things like cyber-bullying as well.
    There are numerous harms that continue to present themselves. The problem is not going away. As mentioned, with AI and other areas, there are just new things that are opening up for us to address. At some point, we need to take a stand on creating some ways to address this, but it's going to be a work in progress because we know that things are constantly changing.
    Well, folks, thank you. Thank you to all of the members for being here and for asking such good questions.
    Thank you to Mr. Clark and Dr. Polzin Holman for being here as well. We very much appreciate the information you shared with us today. It will be very helpful in the study that we do, and then, of course, in the report that will be drafted and the recommendations that will be made to the government. Thank you.
     Very quickly here, Dr. Polzin Holman, I want to follow up with you. You made a comment earlier and said that it would be possible to provide the committee with some more statistics. If you could draft just a short brief outlining that, it would be deeply appreciated. Thank you so much.
    The meeting is adjourned.