:
I call this meeting to order.
Welcome to meeting number 15 of the Standing Committee on Canadian Heritage.
I believe that we all know the guidelines with regard to our earpieces. Please don't just throw them down. There's a nice little sticker. If you could just put them there in order to protect the ears of the interpreters, that would be wonderful.
Today, we have two witnesses joining us. We have Mr. Daniel Clark and Ms. Wanda Polzin Holman, who is joining us virtually from Edmonton, I believe.
In just a moment, I will give each of you an opportunity to give five-minute opening remarks, and then we will go into questions from members. During that period of time, we start with questions from the Conservatives, then the Liberals and then the Bloc Québécois. We'll continue to rotate through.
As you know, the study we are discussing today—and this is our first meeting on this study—has to do with the impact of social media, and in particular influencers on social media, on those under the age of 18. We look forward to hearing from the experts who are with us today and learning what you have to share with us.
With that, I will hand the opportunity to speak over to Mr. Clark.
You have the floor for five minutes.
:
Madam Chair and committee members, I would like to thank you for the opportunity to testify before you today.
I've been studying the ethics of children in social media production for the last three years, with research published in the Journal of Business Ethics, a top-ranked journal, entitled “The Child Labor in Social Media: Kidfluencers, Ethics of Care, and Exploitation”. This work is necessary, as a recent Harris poll found that 29% of children from eight to 12 aspire to a career on YouTube—more than any other career—and are at risk of economic exploitation, consent violations, privacy loss and other harms.
There are two concepts in this field. “Sharenting”, a portmanteau of “share” and “parenting”, is where parents build social media channels around their children for other parents as the audience. “Kidfluencing” is where children influence other children or adults through their own social media personality. These concepts differ in the child's role and audience, but they share many ethical concerns.
Using the UN Convention on the Rights of the Child as our framework, our research suggests that the following children's rights are at risk.
First, there's the right to consent. These channels are owned and operated by parents, so there is a constant risk that children participate at their parents' behest rather than with their own informed consent.
There's the right to privacy. Even moderately successful channels can expose children to millions of strangers. Their everyday follies, embarrassments and charms exist forever in social media, potentially haunting them for life.
There's freedom from economic exploitation. These channels can generate thousands to millions of dollars monthly from the child's involvement. The children should benefit proportionally, but there's no guarantee that they do.
There's the right to education. Being a social media star requires significant time. Kidfluencing may seem like play, but it's work, like acting or performance. It is not uncommon for kidfluencers to be home-schooled. Where is the time for school, play and sleep?
There's freedom from harm. Parents may put children at physical risk creating compelling content. There's also the risk of child predators forming parasocial relationships with child influencers. Recently, a study found that 95% of adult influencers had been subjected to stalking behaviours, and 40% felt fear as a result. Unfortunately, we don't have such data about child influencers.
There's freedom of expression. Children are brand ambassadors endorsing products and expressing opinions often not their own. When the child is 25, their 15-year-old opinions might prevent them from getting a job or otherwise lock them into a position.
Over the past year, I've been interviewed numerous times about this paper. I'm always asked, “What can we do about it?”
Anything in the sphere of child welfare is deferred to the decision-making of the parents—except that, when parents are generating income, sometimes significant income, some parents' decision-making may be compromised. The platforms these videos appear on, such as Instagram, TikTok and YouTube, earned $11 billion from advertising to children in 2022. By limiting account ownership to the age of majority, they place the onus on parents to ensure that their children are protected. They set guidance about what is and is not allowed on their platforms—violence, gratuitous nudity or sex, etc.—but this is clearly not enough to ensure that online child actors are free from exploitation.
That means the federal government may have a role to responsibly regulate this practice. We need to protect these children. Internationally, other jurisdictions have made some progress toward protecting children in these arrangements and their future earnings. It's time Canada also took action.
We need to protect these children from the negative impacts that may arise from the parasocial elements of global exposure through social media. We're all aware of the harm that befell child actors like Macaulay Culkin, Gary Coleman, Shirley Temple and countless others through exploitation. While it's debatable whether there is even a need for kidfluencing in any capacity, if it is allowed in this country, then children who are the subject of this social media enterprise deserve as much protection and recourse from the harms as our law can provide.
Thank you, Madam Chair. I look forward to your questions.
:
Good afternoon, Madam Chair and committee members.
My name is Dr. Wanda Polzin Holman, and I'm the CEO of Little Warriors.
Little Warriors is a national charitable organization, and we have been recognized through numerous scientific and clinical journals as being a leader in the field of child sexual abuse awareness, prevention, advocacy and evidence-based treatment. I've been involved with Little Warriors for over eight years, including a previous role as clinical director. I'm clinically trained and have obtained a master's degree and a doctorate. I am currently a registered clinical social worker.
I appreciate the opportunity to share Little Warriors' perspectives with the standing committee. On behalf of the children and families we serve, I'm very appreciative of the committee for undertaking this important study. It is indeed an area requiring further understanding and actions.
There are some key issues that we have observed at Little Warriors with regard to children and adolescents, which I would like to highlight.
First and foremost, as a result of social media influencers, we are witnessing significant deterioration with regard to mental health issues, including an overall increase in levels of stress, cyber-bullying, suicidality, anxiety, depression, self-concept concerns and radicalization of gender bias. Also, there are concerns related to sextortion and online grooming, as well as luring of our children. This happens both in plain sight—as we are all on social media—and in very subtle ways through gaming platforms and social media, which parents and educators may not always be apprised of.
Families, even those who do their best to ensure proper controls on devices, are very concerned about their children's online experiences. We have seen this first-hand at Little Warriors when treating children and adolescents who have fallen into the hands of predators.
We understand that there are ongoing issues related to inappropriate content, online interactions with unsafe individuals and algorithm-driven risks. We are seeing gaps in the digital literacy of children's educators as well as the parents, and the platforms are constantly changing, which is concerning for us all.
Additionally, there are issues regarding loss of privacy that children and teens do not always comprehend, and there are concerns related to children's digital footprints. These could obviously have long-lasting negative implications for them.
There are gaps in legislation, deterrence and penalties regarding online and in-person harm, and access to children by potential predators across geographical borders. Overall, at Little Warriors we are concerned about child exploitation and sexual abuse, and about the lack of clear and consistent sentencing and regulations. As Canadians, we seem to understand that there are controls required for other aspects relating to children's safety, but we have yet to address social media content harms.
In light of these concerns, I'm hopeful that this review will result in decisive action to protect children and to uphold accountability. Specifically, first, ensure that survivor-centred supports, including prevention programs such as Little Warriors' Prevent It! program, are included in new policy measures to support schools, charities and other in-person and online community organizations in expanding prevention and support resources.
Second, review sentencing gaps and issues of deterrence. We have witnessed child sexual abuse offenders being released into the community with warnings, only to be found reoffending a short time later. Protecting children must take precedence over the rights of offenders who perpetrate abuse.
Third, legislate stronger, more consistent sentencing provisions for offences related to the possession, access and distribution of child sexual abuse and exploitation materials.
Fourth, make tax concessions for individuals who donate to and financially support charitable organizations such as Little Warriors that invest in prevention efforts and work with survivors.
The work of this committee is a defining moment for Canada to act with moral clarity and to ensure better safeguards to protect vulnerable children, online and in person.
I appreciate your time today and look forward to questions.
:
Thank you, Madam Chair.
Thank you, Dr. Polzin Holman and Professor Clark, for being here.
My first question is for Little Warriors.
Wanda, I think the public is seeing increased occurrences of exploitation—and this is a word that we've talked a lot about here in the House of Commons this week. We're seeing it at all levels, but can you give us some stats on children? You made four or five points, but do you have any stats that maybe you could share with us here today about the exploitation of children online?
:
Well, I know that there have been ongoing issues that have been brought up with regard to the online harms act, and I understand that the reason we're coming together is to explore some pieces related to that.
We know that similar measures are needed to limit child sexual exploitation. We need regulations that support schools and parents in their understanding, as well as understanding across communities.
We know that people, from our perspective at Little Warriors, have not been held accountable in the way that they need to be. The deterrence is very minimal at this point. We have children who come to us for child sexual abuse treatment as a result of being harmed online and sexually abused and exploited. Many times, the offenders and perpetrators are released without serving any time or having any consequences that relate specifically to the crime. What I mean by that is that the children and teens come to us and require intensive supports and treatment for what has happened to them. Very often, the perpetrators are released into the public, sometimes with notifications, and they are reoffending.
We've had several situations at Little Warriors where this has happened, and there is just not enough deterrence in place for them to stop what they're doing. It's very difficult to continue to follow offenders and perpetrators online as a result of the ongoing changes that are happening, and the ways that they're doing it through gaming platforms and social media platforms for children as young as seven or eight years old.
:
Thank you, Madam Chair.
Thank you to our witnesses for being with us today. It is greatly appreciated.
Social media platforms and influencer culture now play a defining role in the lives of children and adolescents. They shape everything from their self-image to their social interactions. I am part of a generation that has very directly seen and experienced the online and digital environment. It's not just on social media; it's in video games as well, specifically in lobbies. I'm also part of a generation that has notoriously found ways around age verification processes. I think that's much of what I'd like to discuss today.
Professor Clark, you have been an associate professor of entrepreneurship at the Ivey Business School since July 2025. Your current research focuses on the cognition and decision-making of entrepreneurs. You made reference to an article earlier called “The Child Labor in Social Media: Kidfluencers, Ethics of Care, and Exploitation”.
You are cited in Western News saying, “Consent isn't a one-time event; it must be continuous, informed and freely given. For kidfluencers, let's be real, it isn't”. I'm curious. Given that children cannot provide ongoing informed consent, what safeguards do you believe platforms or governments should require to ensure that minors' images, data and labour are not exploited in influencer environments?
:
That is a great question. You're right. The simple fact is that there is no consent for very young children. At the very least, there is maybe assent in that you know that the child is not doing it truly against their wishes, but they can't consent to all the implications that come with it.
To be perfectly honest, this is one of my arguments for setting an age, probably somewhere around 14 to 16, at which young people can take back control of their digital identities. Before that, I do not see the benefits of allowing children to post and feature in social media content to a wide audience. Two things enhance the risk here: the amount of time they spend making content and, more importantly for what you're pointing to, the amount of exposure they get. You get past a thousand people. You get up to the millions and tens of millions and hundreds of millions of exposures. You are magnifying the risk infinitely, and no child has the capacity to understand what they're consenting to when you're talking about those large numbers.
My own six-year-old struggles with the difference between five and five million, so I can't imagine too many other kids really understand how big the exposure is.
Growing up, I, too, was quite interested in the YouTube space. My father, at the time and to this day, was very reluctant at the idea of seeing me join or engage in YouTube in any way, shape or form. In hindsight, it made perfect sense. I'm not particularly talented; I'm not a great musician in any way, shape or form. His reluctance ultimately stood to benefit me significantly.
In your view, who is currently benefiting from this gap, and who should be held accountable for protecting children from being commercialized online? What policy mechanisms do you think might help us do that?
:
The number one beneficiary in this space is the platforms. There's no halfway about it. These are massive companies, making billions in revenue, specifically in the advertising from child content and the advertising to children. The fact that there is no control over this is a giant mistake. The fact that we've been asking them to police themselves is a massive mistake, and it is not in anybody's best interest to do so.
Beyond that, the primary beneficiary financially is the parents. If you are under the age of 12, you cannot have a YouTube account and you cannot have a TikTok account. Your point about age verification is taken. However, if you want to get paid from those things, you certainly can't get paid through PayPal or the other mechanisms at that age. The kids, then, are employees of their parents, and anything that's happening to them as a result of that is because their parents are putting them in that position to be their employees. The protections ultimately should fall on the parents or, as a proxy, on us as a society.
:
Thank you, Madam Chair.
I will start by not thanking Mr. Al Soud for making us feel a bit like dinosaurs when he said he belonged to the YouTube generation. Ours was the vinyl, 8-track cassette, camcorder and Super 8 camera generation, so a huge thank you for reminding us we belong to different generations. That said, I am happy there are different generations around the table to talk about matters affecting everyone.
Mr. Clark, I am going to continue the discussion you started with my colleague Mr. Al Soud about potential regulations on user age and age verification.
Australia has passed legislation banning social media for young people under the age of 16. In your opinion, is this an applicable solution? Could we use it as inspiration? Is it effective? Would those who truly intend to use their children as moneymakers find it all too easy to get around, as you said?
:
I think that's a great point, and I think what's happening in Australia is a good step.
We can talk about how people might get around these regulations, but that's a deliberate act. You have to want to get around them. You have to be willing to falsify. You have to be willing to obfuscate. You have to be willing to take a proactive act to break the rules here.
I'd rather there be rules in place that are imperfect than to have nothing in place. I think you could go straight to bans, or you could have usurious fines, or you could.... There are lots of ways to go about this that I think would, at the very least, reduce the harm.
While I think eliminating the harm is impossible—and I think your point there is very well taken, and the same goes for Mr. Al Soud's—I'm happy to live in a world of harm reduction, because right now there's none.
:
Yes, I think there is a real risk that the platforms will allow loopholes to exist. Until you tell them that they have to close that loophole, they will say, “Oh, do you know what? They didn't say anything about that. We're going to leave that one open.” Absolutely, that is always a possibility.
I don't believe the platforms here are good actors. They are not thinking about the best interests of the people who are creating content for their platforms. They are thinking, primarily, about the other side. They're thinking about users. They want to facilitate use, watching, viewing and advertising as much as humanly possible.
I wish I were an expert on age verification technologies—I'm not. We need better and independent age verification technologies because, as long as we allow the platforms to be in charge of this, it is in their best interest to be bad at it. They don't want to keep people off the platforms, either as content producers or as watchers. That's their audience, so you're absolutely right.
:
Thank you very much for the question.
I agree with Mr. Clark. I think the idea, as it relates to what Australia is doing, is a good first step—having some rules in place versus having nothing. Looking at harm prevention is very important, specifically as it relates to the issues that are coming up related to mental health as well as child exploitation.
Platforms may try to circumvent any laws that are put in place, but, certainly, I think this would limit and address the issue of uninformed consent among children. We are seeing this, as I mentioned previously, with seven- and eight-year-olds who are simply clicking the “yes” button: “Yes, I'm over the age of 18.” There's no way to verify otherwise, and it's exposing them to enormous harm from online predators.
Mr. Clark and Ms. Polzin Holman, based on your experience, you have obviously seen technology evolve over many years. Based on your observations, what are the fundamental differences between technology today and technology from 5, 10 or 15 years ago? I note that the Internet has existed for 15 or 20 years now. In your opinion, what is the most fundamental change with regard to what is happening now?
In your answer, I ask you to bear in mind that AI adds to all the dangers society faces when it comes to technological tools.
What are the main things you have noted in terms of harms or risks to young people in the past versus those they face now?
:
I think there are a few things. First of all, you're right; artificial intelligence is changing the game because of the speed at which new content can be created. You can put a few prompts into a video-generating engine, a few into audio generation and a few into an editing engine, and within 20 to 40 minutes you can have a brand new video—one that may involve no original human content—up on YouTube. You can recycle and reuse existing content to create new content incredibly fast. The AI on this is terrifying.
The other thing that I think is really accelerating this is the ease of access to the Internet. We've all had smartphones capable of playing video, but think about how 5G has increased the quality of and access to video consumption in just the past few years. You can download high-definition videos to your phone anywhere in North America in the blink of an eye. The ability and the access exist 24 hours a day.
If you have a kid, that means they probably have an iPad, a phone and access to a computer. Keeping them away from all of these technologies feels, in their eyes, almost like a form of child abuse these days, because they want to be on the Internet in some way, shape or form. It becomes ridiculously difficult to police all of these devices and have the right protections on each of them.
I would say that the ease of access and the AI component, the ability to create new content, are probably the two that are really accelerating this.
:
If you've ever gotten lost down a YouTube hole, you know the power of the algorithms. They effectively reach into your brain, figure out what it is you like or want and just keep feeding you more and more of the same.
For my six-year-old daughter, it's videos about Minecraft and Roblox. There's an unending stream of them. It keeps her online, keeps her watching advertising and keeps her as a consumer, unless my wife or I basically say, “Enough.”
There is no end to the Internet anymore. Once upon a time, I used to joke that I'd reached the end of the Internet. That doesn't exist anymore. The algorithms are always able to find and push content that will keep me on board and keep earning them money.
:
The questions about risk come to which group you're talking about.
For, let's say, preteen kids—anybody who's sub-12 years old—you're looking at things that allow them to engage in a world and get lost in a world. Gaming content is a big one. Ones that feature personalities that look and act like them—other 10-year-olds, seven-year-olds and eight-year-olds—allow them to escape into a world where they have friends instantly.
My daughter frequently says, “Oh, I'm just watching my friend.” I ask her how this person is her friend. She says, “Well, they talk to me. They watch me and I watch them.” She wants to make videos so that her friends can watch her. There's that particularly immersive world to which preteens are susceptible.
For teens, it's about creating a more idealized, socialized world where they have larger friend groups. It might be, as you say, fashion videos. It might be kids hanging out and doing fun things or doing silly stunts, etc. There are different pockets that appeal to boys and there are different pockets that appeal to girls, but it is a fundamentally different architecture where it's more about specifically the kids driving their own world in the teenage years, whereas in the sub-teen years, it's about creating a world for the kids to get immersed into.
:
It would be devastating.
We often use former child TV and movie stars as warning signs: the kid who got all of their money stolen by their parents, or the kid who became an alcoholic or a drug addict or committed suicide. All those situations were fundamentally contained by the fact that there's a relatively small amount of TV and movie content produced.
Now, multiply that by infinity, because there's no limit to how much social media content can be produced. You're potentially creating a pool of victimized young people that is enormous. If we thought it was bad to watch Macaulay Culkin get all of his money stolen by his parents, imagine that multiplied by tens of thousands of kids being pillaged of their income, which is taken by their parents or whomever.
:
Thank you, Madam Chair.
Ms. Polzin Holman, you mentioned deepfakes earlier. In November 2024, one year ago, the committee conducted a study on the harm caused by viewing illegal sexually explicit material online. That study led to recommendations, and some of them were quite simple. I would like to know what you think about some of those recommendations.
To summarize, the fourth recommendation called for digital platforms to implement processes for detecting and reporting illegal, sexually explicit content, such as child sexual exploitation material and the non-consensual distribution of intimate images, including deepfakes.
The fifth recommendation was also fairly easy to implement. Witnesses who appeared before the committee requested that section 162.1 of the Criminal Code, which defines intimate images, be amended to include the concept of sexually explicit deepfakes. However, this has not been done.
In your opinion, are these interesting tools to put in place? Should we reformulate these requests to government, in order to implement the recommendations set out in the study we undertook in the previous Parliament?
:
Thank you, Madam Chair.
Thank you to both witnesses. It's a really fascinating topic.
I wanted to talk to Dr. Polzin Holman.
First of all, thank you for your good work. I'm very familiar with your organization, being a fellow Edmontonian. You have a great reputation.
I wanted to bring this home to talk about a real-world story. I noticed that on your website you talk about the case of Michelle, a mother whose story reminds us of why Little Warriors was created. Can you tell us a bit about what Michelle shared about her daughter and how her daughter fell victim?
:
Thank you, Mr. Diotte. I appreciate your kind words.
I certainly can say a little bit more about the child and family that were impacted. The child was treated at Little Warriors in our intensive episodic treatment centre. The reason she came to us is that there was an online predator. The parents of the child, who at the time was somewhere around 12 years of age, were very much involved in this child's life and very much involved in setting controls and checking and supporting their child's digital literacy.
The issue actually came when the child was at school and was able to access social media platforms, where the offender took advantage of safeguards that weren't in place and continued to reach out. They came to Edmonton, not once but twice, and met her at her school. At that point, they took her across the border into the United States and significantly sexually assaulted and raped her.
This, unfortunately, is a very significant example of what can happen and what the harms are. Unfortunately, this is not an isolated incident. We've had numerous other situations of boys and girls under the age of 18 who have had very similar situations and have been assaulted as a result and manipulated in various ways. It is a problem that definitely needs to be addressed. We can't settle for what is currently in place.
:
Thank you very much, Madam Chair.
I salute both witnesses and thank them for their opening remarks, which were quite interesting.
Mr. Clark, I don't know if the same thing happens to you, but whenever I talk to my children and mention a product, such as a brand of water or a watch, I automatically see sales offers on my Facebook or Instagram page. Yet I haven't searched for the product, and it is not Black Friday.
Are we being tapped? If so, how can we fix that?
:
My wife would agree with you that somebody's listening in.
To a certain extent, there are a lot of ways this is happening. You type anything into Google, and that gets saved as a cookie that other platforms can draw upon to create content and push advertising towards you that is specific to your interests. A big part of the Internet phenomenon is that almost anything you do on your phone can feed that process. There is considerable risk that things like Alexa and Siri could indeed be listening to you and making sense of what you want.
Also, there's WhatsApp, for example. There's the ability of the platforms to mine what you're doing on WhatsApp, and particularly Meta, obviously, because they own WhatsApp. Then they can port that information about you over to Facebook or Instagram, their other platforms, and use that to figure out what you like, where you're going and what you're interested in and direct content towards you. That is absolutely happening.
:
Ms. Polzin Holman, I know a little about your background and your commitment to combatting child pornography.
As young parents, we want to take photos of our children and post them on Facebook, Meta, or Instagram, for example. Recently, Mr. Généreux and I went to Kelowna. In a workshop we attended, we heard that some people used one child's photo to make an ID card for another child elsewhere.
As parents, we know what photos we post online. How can we protect those images? Should they not be posted?
:
Dr. Polzin Holman, thank you very much for your remarks.
At this point in time, I am going to take five minutes for questions, and then we'll continue from there.
My first question is for you, Dr. Polzin Holman.
I'm wondering if you can comment a little bit further on.... Maybe we'll start here, actually. On your website, you state that more than 95% of child sexual abuse cases in Canada go unreported. I'm just curious as to why that number is so high.
I have been on social media for most of my career. I've witnessed how it influences my own mood and how I react to the things I post. I was a musician for years. You know, social media behaves very differently from TV or movies, as you said before. The thing I find about it particularly—I have young kids—is the addictiveness. Sometimes I wonder if we're going to look back at this time and say, “We were giving cigarettes to our kids”, because in many ways it's really similar. I see the same gap. I see how addictive it is. We're probably all addicted to it in some way. I'm a disciplined person, and even I find it hard to discipline myself to turn the thing off and stop. The algorithm knows what I want to see, what's going to get me upset and what's going to make me happy.
Maybe both of you could answer this. In terms of regulation, when we're talking about who uses it and how they use it, how can we do this when the algorithm itself is so addictive? Is there any possibility, from your discussions, to address the actual addictiveness of the algorithm? Of course, the platforms aren't going to be interested in that, and maybe that's overstepping, but at some point I think we need to talk about this as a highly addictive thing in the same way that we talk about cigarettes. We regulate cigarettes. We regulate their advertisement to kids. We regulate how people interact with them, how they're sold and all those kinds of things.
:
This is a little bit outside my area of expertise, but I actually think I want to weigh in on this a little bit. There is a privacy issue here. You are a unique individual to YouTube or to Instagram. They know you. They know what you like. They know what you want. They know what you've watched before. They have a file on you somewhere.
Maybe they shouldn't. Maybe they shouldn't be allowed to capture personal usage data on us so that they can target us. I don't know about our ability to regulate this, and chances are this data does not exist in Canada. It's probably in some vault somewhere in the States or in Iceland or wherever, but the simple fact is that they are collecting information. They are using it to figure out who you are, what you like and what you want.
I swear, I think YouTube knows me better than my wife does, and we've been married for 10 years. She knows me very well, but she can't tell you what I want to watch on YouTube the way YouTube can. That's a frightening thing for me. I wonder about that degree of privacy and the platforms' ability to know us so well.
:
That's exactly what I fear, that we're going to look back at this and say, “Oh, man, we really missed this. We were unaware.”
What you were saying was interesting, Dr. Clark, about some of the opportunities. Maybe it's not about looking at the algorithm in general but at where in that algorithm it becomes more addictive.
I think the other thing, and maybe we can speak about this a little bit more too, is that these algorithms are agnostic. They don't just feed you what is positive. They also feed you what makes you angry. As a content creator, you can see something really blow up on the Internet. It might feel good, but something can blow up because of the negativity of it as well, as we've all probably experienced. We get lots of comments, and they might not be positive. When you think about that in the adolescent mind or in the young mind in terms of self-image, these can be devastating. I mean, it's one thing to get applause for a song you sing. It's another thing to become a meme because you didn't sing well and you become a joke.
It's different from television in that way. This algorithm is agnostic. It just wants to gain a reaction, whether it be negative or positive. That certainly has an effect on one's concept of self-image.
Maybe you can speak to that, Dr. Polzin Holman, from your own experience.
:
Ms. Polzin Holman, I found Mr. Myles's parallel with cigarettes very interesting. In the 1970s and 1980s, people smoked everywhere, including in classrooms, at school and in CEGEP, and in smoking rooms. Then we realized how harmful it was to our health and began to implement regulations. Everyone agreed on this, since it was obvious.
What we are experiencing today is an equally worrying phenomenon, even if it is obviously on a different scale. It is just as worrying to see what is happening with young people's exposure to platforms, social media, and the Internet in general. My 14-year-old daughter questions what her teacher teaches her at school because she saw something that contradicts what he has told her. She gives credibility to what she has seen online, for example, to an influencer on TikTok.
I think we also have a duty to educate. Parents should do it much earlier, as a preventive measure rather than a reaction.
Do you think we are not doing enough? Do you think that we are taking a long time to wake up, as was the case with cigarettes?
:
I think that's a good comparison. I have two things to say about it.
First and foremost, with children and teens, we know that development happens in certain stages, and children need our experience to inform how they move through those stages. When we don't have controls, when we don't have the ability to walk them through or talk them through certain things that they're experiencing in those stages, it impacts them as individual human beings.
We're seeing in social media that freedom of speech is often manipulated and disguised in terms of anything and everything goes versus slowing it down enough so that parents, educators and all of us in our children's and teens' lives have the ability to intervene. We don't have the ability to intervene when things are so fast-paced and when influencers, as you mentioned, are being seen as experts.
:
Thank you, Madam Chair.
Ms. Polzin Holman, I would like you to tell us a little bit about your organization. I think it does some very interesting work.
Earlier, you talked about what I would call digital literacy among young people, whether on the part of parents or organizations such as yours. You mentioned the possibility of getting more funding, particularly through tax credits, in order to be able to do this explanatory and educational work. This is what I understood earlier. The donors or philanthropists who already fund you could be more numerous and bring in more money not only to your organization, but also to Canadian organizations like yours.
Could you tell me a bit more?
:
Thank you so much for your comments. I'm certainly in full agreement.
As a charity organization, in the last eight years, our organization has quadrupled in size. When I first started, eight years ago, we had a children's program to support children and their families who had been sexually abused. Since that time, we've expanded to support adolescents. Since the pandemic, we've had to increase our services to support families in intensive family programs for intergenerational trauma. We've also expanded our online psycho-educational programs for children, teens and families. We support educators and other professionals in the field, across Canada.
We know that we're one organization. There are other really great organizations doing similar work, but our funding has not expanded. We continue to fundraise for these things, and we know that the issues and the concerns continue to grow, as demonstrated by our own growth to be able to support children and families in various ways, both online and in person, and within different developmental areas.
:
That's right. In Toronto, we took a course on AI. It was impressive to see young people coming into class thinking they were experts because they had heard influencers say certain things that contradicted what their teacher had said. This ties in with what my colleague from the Bloc said. In itself, arguing about an issue or debating it with a teacher or an adult is not a bad thing, because it is a learning experience.
Mr. Clark, earlier we made a comparison with cigarettes. Parents have a role to play in all this. Do you think that today, despite incredible advances in technology, particularly AI, this parental role has diminished or become less important? We have just completed a study on AI.
Should parents be even more informed or involved in managing this?
I note that Quebec has decided to ban cellphones in classrooms. In my opinion, this is an excellent decision. As Mr. Myles said, we are addicted to our cellphones. Last week, I was asked to put my cellphone away because I was no longer talking to anyone. We talk to ourselves.
:
I think parents obviously can be an important part here. There's no halfway about it. Parents are not less important. They're probably more important than they've ever been, but they don't necessarily have the tools they need to protect their children.
Let me give you an example. In 2002, a young man in Quebec, Ghyslain Raza, was fooling around with a golf club as if it was a lightsaber. His friend videotaped it and then posted it online. It attracted a billion views. That kid then got absolutely pilloried online through trolls and through abuse, to the point where he had significant psychological repercussions from it. Nowhere in there was the parent really in a position to do anything about it. It was a video that was just taken and put online, and it just exploded.
Yes, parents are going to do everything they can to try to protect their kids, but their kids have so many points of access to this information. They can use a computer at school or at a coffee shop. They can use their phones anywhere.
Either we're going to become Luddites and give up technology, which I don't think any of us are prepared to do.... We don't necessarily think that's good for our kids either. As parents, we can do our best, but we need help, we need tools and we need the ability to have support.
As you say, the rule about blocking phones in schools is a great idea, absolutely, but it only covers part of the problem.
:
I will just keep going on this topic, because I think part of it comes down to awareness, particularly about the behaviour of the algorithm. I think we're still behind in understanding exactly how it reacts.
One thing I am particularly concerned about, and Monsieur Champoux mentioned it as well, is the idea of radicalization and polarization. Because of the nature of the algorithm, it seeks strong reaction, both negative and positive. A moderate position that doesn't get anybody super angry or super happy doesn't have much of a life online.
I wonder about the effect on young people of civil public discourse and being able to disagree. We're quite good at it here in this room, I think, which is a very important thing for polarization in the future of democracy in our country. I do worry about this effect of the algorithm on young people's abilities to have civil discourse. As you know, if you're just looking through the comments...but even the way the algorithm itself behaves and reacts makes me fearful of these things.
How can we make people aware of these things? How can we address these issues so people understand that discourse in person looks very different—I hope?
:
I think we can all learn from hockey a little bit here. You have a game where people are checking each other and slashing each other and shooting pucks at each other's heads. At the end of the game, you go up to each other and shake hands. You have some high-fives.
I think we have to learn to have intellectual battles the same way we have physical battles and know that it's not personal. If we talk about the role of schools and the role of parents, I think if you could engage kids in a way that allowed them to learn how to have arguments—it's not about you; it's about the ideas—and how not to take everything so personally but just have free-flowing, progressive ideas, it would be great. I'd love that.
I mean, that's what being an academic is all about. That's what we try to preach in academia. If we could figure out how to teach our kids to do that, that would be amazing.
:
I would agree fully. From the perspective of offenders and perpetrators who are specifically looking to befriend children and teens and sometimes even single parents, for example, you can present in any way you want online. You're not having that discussion in person.
One thing we teach parents or caregivers at Little Warriors in our Be Brave bridge program, which is an online psycho-educational component of our program, is to really have discussions about things. Issues can't be treated as black and white, and parents can't just control everything, because teens specifically go out of their way to try to access things when their parents say, “No, you can't do this.”
The issue is partially that, but it's also partially giving parents the tools to be able to have the very important conversations about safety and risk and why they're concerned, because there can be discord between them.
:
The biggest YouTuber in the world is MrBeast. He has transformed the space. He is the first YouTube billionaire. He has product marketing. He now has a Netflix series.
The most powerful kid is a kid named Ryan Kaji. He is the first person to break the $30-million-per-year level as a child. He started at the age of six. I believe he's 15, and he has kind of phased out of kidfluencing. However, his parents, who own the channel, obviously have passed it on to his younger brother and sister, so they're the ones who are making the money for the family. They have a company with 50 employees. They have cameramen, sound people, writers, editors, etc. It is a massive organization.
Those are two of the really big ones that I think are worth noting.
However, I have a list in my paper that I'm happy to send you, which includes all of the top 1,000 influencers on YouTube. It's everything from something that's utterly mundane to stuff that is radicalized and very political. They all have tremendous following, and they have a tremendous ability to change and affect behaviours.
:
The classic thing, which was introduced 20 years ago, is just as true today: Make online activities communal. If you're using your iPad, you have to be in the living room, where we can all see what you're doing, or on the family computer. The point is to limit their ability to do this alone in their room, where no one can watch. That gets progressively harder the older the kids get. You have privacy issues. You want the kids to develop their own morals of right and wrong.
At least with my young daughter, we basically say, yes, she can be online, but we're going to watch what she's doing, and we're going to be there. If we hear something and don't like the way that person is talking, we can say to change the channel, to watch something else, to do something else. Honestly, without that, I don't know what we would do.
:
I would also say to limit the amount of individual access that they have to social media. Don't let them have a Facebook account. Don't let them have an Instagram account or, at the very least, don't allow them to post. They can follow other people and watch other people, but their ability to put themselves out there is limited.
Fundamentally, you have to think about online activity as a game of Russian roulette. The barrel can have millions and millions of slots and one bullet, but the more people do online and the more exposed they are and the more views they get, the more the number of slots starts shrinking, to the point where you are almost guaranteed that something bad is going to happen to you when you get into that realm of 100 million views.
However, everything is a risky behaviour. Everything you do online is a risky behaviour, and it's about how risk-tolerant you are. I don't want kids under the age of 14 having to make decisions about risk. They're generally very bad at it.
:
Thank you, Madam Chair.
Mr. Clark, I would like to come back to influencers—Mr. Champoux and Mr. Myles also mentioned them earlier. As their title suggests, these people have a lot of influence on young people, on the current generation, and on adults in some cases. Influencers talk about finances and weight loss as much as they do about skin care, for example. In some cases, they do so simply out of habit. In most cases, they do not have the formal training or skills to do so.
How do you see regulating these people who talk to our children and us? Should they have special knowledge about the subjects they talk about on social media?
:
That's a great question.
The problem is that there are so many channels available. Most influencers are not on just one channel. Take somebody like a podcaster, like Joe Rogan. He has enormous influence in the United States—political influence, social influence. He can create ideas and he can create trends on a whim, but you don't have to listen only to his podcast, because it's sliced and diced and ends up on YouTube. It ends up on TikTok. It ends up on Instagram. It ends up on Facebook. I don't want to be a free ad for all the platforms out there, but the simple fact is that one voice can have so much reach because it can be amplified in these ways.
To be honest, there is literally nothing we can do to monitor what they're saying or control what they're saying, because they have free speech, and when they're adults, they truly do. The best thing we can do is try to figure out how to limit the forms of consumption, or at least give control to parents over forms of consumption.
:
Yes, you're right. YouTube for kids and Instagram for kids are like watering down wine: You're still drinking alcohol, and if you drink enough alcohol, you're still going to get drunk.
YouTube Kids doesn't allow advertising. That's awesome, excellent, wonderful. However, they do allow product placement and they do allow other forms of sponsorship and spokesmanship. What they allow content-wise changes based on how old the kid is, so if you're under three, all you're getting is a bunch of cartoons or canned media, and that's fine, but when you start getting close to 10 or 11 or 12, you get more adult content that has a wider range of influence.
Honestly, I think the only beneficiaries from YouTube Kids are the platforms. By creating platforms specifically aimed at kids, all they're doing is turning them into the addicts that Mr. Myles was talking about. They are just creating a behaviour that is going to spill over. It doesn't really protect them that much in the here and now, and it certainly doesn't protect them in the long run, because the content is.... The medium is the message.
:
I think that going back to supporting parents and caregivers with the right kind of information is really important. The platforms are constantly changing, so without having access to understanding how to support their children, it's very difficult. Many times, the children and teens themselves are more savvy on the computer than their parents are, so for the parents, having the right resources is incredibly important to protect their children.
I think the limits and whatnot, as Mr. Clark has spoken about, do need to come from parents, but also, as has been mentioned, it's very difficult. You can have a parent who is very good at that in one household, but if the child goes on a play date with another child, suddenly there's open access in another home. How do we ensure that all parents have access and that this access is ongoing? Very few parents understand that when their child is playing Roblox, for example, they can have access to friends online who may be 60-year-old adults posing as 12-year-olds. Parents need the information, and it's constantly changing and it needs to be updated.
The second thing I would say about the influencers is that right now, the influencers are motivated in lots of ways, but is there potentially a way to motivate influencers to support mental health and wellness and awareness of these things? I think those areas should be and can be explored further.
Mr. Clark, earlier, I used the example of my 14-year-old daughter who gives more credibility to some influencer she saw on TikTok than to what a teacher tells her about a given subject.
This leads me to reflect on the educational models in Quebec and Canada. In your opinion, given the influence of social media on young people, isn't it time for Quebec and Canada to rethink the way we teach elementary and high school children? Isn't the traditional classroom, meaning a 50- or 60-minute lesson on a given subject, a thing of the past? Isn't it time to rethink this model, particularly in light of what young people are consuming on social media, among other things?
:
I'll punt the question about what the best model of education is to my wife, who is an elementary school teacher, but we do think it's a good idea for our kids to learn to question everything.
Fundamentally, as an academic and somebody who teaches in university, I want them to learn how to question things and ask their own questions, but here's the thing: As you say, if your 14-year-old is discounting what they learn in the classroom and then getting their information from social media because they hear things they like and they hear things that confirm their world view and their perspectives and give them permission to act, behave and think the way they want, that's not better.
We fundamentally start chasing the opinions we like and the opinions that matter to us, and that is where influencers actually get their power.
:
There's a really interesting point there.
Let's face it, to Mr. Myles's point, there is nothing as addictive as a short video, one of 60, 80 or 180 seconds. It requires very little engagement, but you get very quick hits of the endorphins Ms. Polzin Holman was talking about.
You may remember hearing that every time your phone buzzes, you get an endorphin rush, because it means somebody is engaging with you and somebody is interested in you. Those short videos basically work the same way. They create a rush of endorphins as you get something interesting or fun or cool or exciting, and they never let your brain go.
I worry about us using the exact same techniques in the classroom, because you get a teacher who is basically manipulating your endorphin levels, but I do think it is a very effective way to communicate and it does actually seem to create something that children respond to.
:
All right. I will be taking the Conservative time slot for five minutes.
My first question goes to Ms. Polzin Holman.
You talked about this in your opening remarks, but I'm hoping you can expand on it quite a bit. On the Internet, social media platforms and gaming platforms are often used to lure, groom and eventually exploit young people. You work with these individuals after these events have transpired, and, of course, they have a significant effect on them.
If you could, I would like you to talk about how this works in really simplistic terms. I think we have members of the Canadian public who are unaware of the fact that gaming and social media platforms can be used in this way and the danger that is posed to young people.
:
I think there are numerous ways this happens online. We know that there are social platforms such as Instagram and Snapchat, as well as Roblox and other games where children can be playing even in front of their parents in their living room or dining room. Parents can see it happening. The children are plugged into headphones, and they're having conversations with friends. They don't know if their friends, like I said, are actually who they say they are. They might have certain photos of them. They can have conversations. Oftentimes, it's not verbally; it's through different texts or instant messaging.
What happens is that the grooming process, from the perspective of the children we see at Little Warriors, happens immediately. There are predators out there who are looking for vulnerable kids. They start gaming, and it all seems very innocuous, but over time, what happens is that they start learning that the child is maybe in a single-parent household where there's not a lot of supervision, that the parent is working two jobs or that the child is alone at certain times. Then the child has a picture behind them of their school information or is wearing a hoodie or a sweatshirt. Those things can get very easily manipulated.
We have parents who have been very involved in their children's lives and have checked their social media. They have accounts that they can look into at any time, and they wouldn't necessarily see that there's an issue until the later stages of the grooming process—when predators are meeting up with children, getting them to send images, and then manipulating children and teens because they have sent inappropriate materials. Sometimes predators are actually paying for access to games or for coins, for example, that children can use to improve their standing in certain games, and they can send that through the Internet without anybody really knowing anything at all.
This happens online, as I said, much more than parents are aware of. It's frightening to know the extent to which this is happening and the ease with which it is happening. Unfortunately, as I've mentioned, it has ended very dramatically and very sadly with children being sexually exploited and abused.
I would be remiss if I didn't mention Carol Todd. She's an educator in school district 43, in my riding of Port Moody—Coquitlam. Her daughter, Amanda Todd, died by suicide in 2012 at the age of 14. She was being groomed at the time by an anonymous online “friend” in the Netherlands. Carol has been a great advocate around tabling legislation.
My first question is for Dr. Polzin Holman. It's around the fact that our government tried to bring forward legislation to mitigate the risk of exposure to harmful content, including content that victimizes children. I know that this legislation died in January this year, when Parliament was prorogued. I'd like to hear from you, Doctor, about what within the online harms act, Bill , you feel should come back in its original form, or what might need to be changed or strengthened.
:
I am fully in support of the online harms act. That act, or something similar, is needed to ensure the protection of our children and our teens.
I think there are a few key areas. One is that we need to ensure that whatever we put forward limits child sexual exploitation, and there needs to be deterrence. There needs to be some kind of deterrent for anyone who is producing or using the material. Currently, it's very unclear, and it's not consistent across provinces and across judgments.
The second is that regulation is needed for schools and for other community activities. It's very difficult for educators to take on this task on their own. Oftentimes, as I mentioned, the platforms are changing. The information needs to be shared with parents and caregivers as well, on an ongoing basis, and it needs to be updated.
The third piece is that, at a community level, there need to be updates and supports for programs such as the Little Warriors' Prevent It! program, or for other organizations that are doing similar programming, so that parents and educators, as well as children and teens, have the right information and can continue to have conversations about this.
Just limiting things and putting laws onto these areas is not enough. We have to continue to have the conversations, because things are constantly changing and being updated.
:
It is accurate. I took my stats from specific areas. I can provide this committee with more information regarding similar stats. We are certainly seeing that.
I think the biggest change point we've certainly recognized is that, during the pandemic, there were so many things that just completely opened up for children and youth as a result of being online and having access to devices for school. For those reasons, it served a purpose. However, since that time, I don't think any of the controls and any of the time spent have been pulled back. Even though children and teens are now in school, they're constantly on devices. Schools that have taken the stance to not have children on their phones have certainly limited things like cyber-bullying as well.
There are numerous harms that continue to present themselves. The problem is not going away. As mentioned, with AI and other areas, there are just new things that are opening up for us to address. At some point, we need to take a stand on creating some ways to address this, but it's going to be a work in progress because we know that things are constantly changing.
:
Well, folks, thank you. Thank you to all of the members for being here and for asking such good questions.
Thank you to Mr. Clark and Dr. Polzin Holman for being here as well. We very much appreciate the information you shared with us today. It will be very helpful in the study that we do, and then, of course, in the report that will be drafted and the recommendations that will be made to the government. Thank you.
Very quickly here, Dr. Polzin Holman, I want to follow up with you. You made a comment earlier and said that it would be possible to provide the committee with some more statistics. If you could draft just a short brief outlining that, it would be deeply appreciated. Thank you so much.
The meeting is adjourned.