:
I call this meeting to order. Welcome to meeting number 17 of the Standing Committee on Canadian Heritage.
Before we begin, I'd ask all in-person participants to read the guidelines written on the updated cards on your table. These are measures in place to help prevent audio incidents and to protect the health and safety of all participants, including the interpreters. You will also notice there's a QR code to a short awareness video, should you need that. Pursuant to the routine motion adopted by the committee, I can confirm that all witnesses online have completed the required connection tests in advance of this meeting. Please wait until I recognize you by name before you speak. All comments should be addressed through the chair.
Pursuant to Standing Order 108(2) and the motion adopted by the committee on Wednesday, November 5, 2025, this committee is meeting to study the effects of influencers and social media content on children and adolescents.
We have with us today, by video conference, Fenwick McKelvey, associate professor of information and communication technology policy at Concordia University.
[Translation]
From the Association des créatrices et créateurs de contenu du Québec, we welcome Laurence Labrosse‑Héroux and Farnell Morisset.
[English]
From B'nai Brith Canada, we have Richard Robertson, by video conference.
From the Canadian Centre for Child Protection, we have Monique St. Germain, by video conference.
Also by video conference, we have Dr. Jocelyn Monsma Selby from Connecting to Protect.
Everybody has five minutes for an opening statement.
We'll start with Fenwick McKelvey.
Go ahead, sir.
:
I am an associate professor in information and communication technology policy at Concordia University. I'm presenting along with Dr. Elizabeth Dubois, who cannot be here today, and sharing research developed through the Pol Comm Tech Lab's project on political and civic influencers.
We are presenting today as experts in digital media systems and automated media, with growing attention to influencers, real and virtual. While our research largely attends to broader media dynamics, we emphasize some trends necessary for understanding how these changes may affect children, and especially youth.
Research shows that children have lived through profound changes in our media systems. Twitch and TikTok, as well as games like Roblox and Fortnite, normalize and teach children to use cryptocurrencies, such that 80% of respondents in a youth survey claim to have invested in an in-game currency. AI usage, too, is dominated by Gen Z users, with many of our students being exposed to AI content daily.
We are just beginning to appreciate the scale of these changes. Fifty-five per cent of teens use TikTok, whereas just 22% of adults use it. The gap in media habits seems to be growing, so we need to be mindful not to panic over the change. Youth have sophisticated media habits; they are neither helpless nor without support. Parents and schools play important roles in teaching media literacy. Most parents have talked to their children about online safety.
One major challenge has been the rise of online influencers in our media system. An influencer is “a highly visible subset of digital content creators defined by their substantial following, distinctive brand persona, and patterned relationships with commercial sponsors”, according to Brooke Erin Duffy in 2020. Influencers vary in quality and reliability, but speaking as a professor and from what we've encountered in our research, I know that our students know people who have careers as influencers, and many of our students can better identify with influencers than with journalists as a career choice. Seventy-eight per cent of youth follow an influencer. Dr. Elizabeth Dubois studies the role of political influencers, a key consideration in how youth form political opinions today.
Good as their strategies may be, youth have to navigate a complex media environment that at times is adversarial. Influencers may be used intentionally to perpetuate known harms, such as cyber-bullying, negative mental and physical health impacts and disinformation, but they are not the only challenge to today's media system.
Media systems are increasingly flooded with fake, scammy and AI-generated content. The next turn in this ongoing experiment on youth will involve AI. Internet users of all kinds are also impacted by scams from unaccountable online advertising and by ineffective online safety measures on many of our platforms. Reuters reports that $16 billion—10% of Meta's advertising revenue—comes from scam advertising. Being online now requires constant attention to avoid being scammed.
Influencers, AI and media systems more broadly have complex positive and negative effects, but increasingly, the problem is not connectivity, exposure or being online; it's about good habits, good supports and good choices.
Policy can improve this environment through better accountability for advertising and advertising-supported industries like social media and very large online platforms; better protection against scams, and more accountability for advertising firms and platforms that do not effectively address scams, harmful apps and false advertising on their platforms; and better standards to help influencers demonstrate their trustworthiness and reliability.
These objectives can be achieved through support to ensure age-appropriate design in platforms and through efforts to ensure that platforms fulfill their social mission; through enforcement and scaling of existing regulation by the CRTC and the Competition Bureau, compelling these regulators to be more proactive against false advertising, to combat stereotyping and to work to create good jobs for this new class of media creators—influencers; and finally, through continued support for public service media and greater support for frontline workers teaching media education.
Thank you very much. That concludes my comments.
We're very pleased to be here to discuss issues that directly concern our industry and our community.
I'll start by introducing myself. My name is Laurence Labrosse‑Héroux and I'm the co‑founder and general manager of the Association des créatrices et créateurs de contenu du Québec, or ACREA. I'm joined by my colleague Farnell Morisset, who is himself a content creator and a member of the board of directors. In fact, given that he creates content that is focused on politics and news, Farnell is doing a lot, in my opinion, to combat disinformation.
We're here today to represent Quebec's content creators. ACREA's mission is to bring together Quebec's content creators within a single community in order to develop unified talking points, lend credibility to the industry, get organized and increase our influence. Over the course of the year, we mainly organize small events to bring together content creators. We're also the organization that produces the Gala InfluenceCréation, the official content creator gala, which promotes content creation in Quebec. Our organization has been in existence for about two years, and it currently has almost 1,000 members.
ACREA aims to offer its members various resources to help them in their journey as content creators. We offer various online training courses, one of which focuses specifically on mental health. This fall, we also organized a 14‑unit training session, which included two full sections dealing with disinformation and public relations. We're aware of the issues affecting our community and are on top of them. We aim to ensure that our members are aware of these issues as well. Naturally, because we represent a rapidly developing industry, there's a lot that has to be done. Our community is slowly getting organized.
There's still a lot to do. We're also aware that we have a direct influence on Gen Z and Gen Alpha, who are very present on social media. That's where they get all the information that makes up their daily lives and their culture. There's a lot to be done in this industry, and we believe it's very important to make sure that content creators are aware of the influence they can have on their own audiences, particularly young people. We have seen a number of examples of abuses in this regard in recent years.
Currently, we get the sense that the web reacts to the web. That doesn't mean that things are always well regulated or well managed, but content creators, who generally specialize in specific fields, do go to the trouble of responding to abuses that can occur on the web.
We're very pleased to be here with you today to talk about such important issues. We hope we can shed some light on the reality we experience in this respect.
:
Thank you, Madam Chair.
Honourable committee members, my name is Richard Robertson. I'm B'nai Brith Canada's director of research and advocacy.
B'nai Brith Canada is Canada's oldest human rights organization and the voice of Canada's grassroots Jewish community. Our organization, which was established in 1875, is dedicated to eradicating racism, anti-Semitism and hatred in all of its forms, as well as championing the rights of the marginalized.
The radicalization and indoctrination of Canadian youth online represents a dangerous threat to our national security and the continued vitality of communities across Canada. The spread of disinformation and misinformation online propagated by influencers on social media is having a devastating impact on the well-being of Canadian children and adolescents. Not only are they being victimized by the obscene content they are encountering, but our online spaces are sadly being used to indoctrinate them into violent extremism.
Canada has a duty to protect its next generation from the dangers of increasingly virulent and polarized online content. As a society, we cannot afford to remain idle while our youth are subjected to the material disseminated by nefarious actors and the proliferation of radical ideologies on social media and other digital platforms.
This committee, through its present study, has the opportunity to make recommendations that will provide additional protections for young Canadians as they navigate online spaces. If implemented, our recommendations will ensure that our nation is proactively confronting the risk posed by the radicalization of Canadian youth online. To assist the committee, B'nai Brith Canada presents the following three recommendations.
Our first recommendation is that the committee recommend that the Standing Committee on Public Safety and National Security commence a study on the threat of youth radicalization online in Canada.
In July 2025, the Canadian Security Intelligence Service, CSIS, published its public report for 2024, which discussed violent extremism and youth radicalization. The report indicates that Canada has seen a growing trend of youth involved in counterterrorism investigations, some of them as young as 13 years old. In its submission to the committee, B'nai Brith Canada highlighted a series of cases involving youth who were radicalized online and subsequently charged with offences ranging from terrorism to distributing child pornography. The transformation, in part through online radicalization, of the role of youth from victim to perpetrator, as indicated in CSIS's 2024 public report, warrants the Standing Committee on Public Safety and National Security commencing a study on the threat of youth radicalization online to assist the federal government in better understanding the issue at hand.
Our second recommendation is that the committee recommend a national program be developed to enhance the digital literacy of youth relating to the misinformation and disinformation they may encounter online.
The need for youth digital literacy in Canada has been acknowledged by multiple actors. In a 2011 RCMP report titled “Youth Online and at Risk: Radicalization Facilitated by the Internet”, the radicalization of youth is described not as a new phenomenon but as one that is only growing in pervasiveness. The report lists several methods to counter youth radicalization, including website helplines and reporting mechanisms, but ultimately calls for a multi-sphered approach to open dialogue and education for youth.
Over the past decade, efforts have been made to enhance the availability of digital literacy programs for Canadian youth. However, as B'nai Brith Canada suggests in our submission, these efforts have been advanced by non-governmental actors. There exists a need for a coordinated national program. The time is now for such a program to be developed and implemented by our federal government.
Our final recommendation is that the committee recommend the Government of Canada take tangible steps to act on the commitments made with regard to social media and online harms in the 2025 G7 interior and security ministers communiqué. These commitments are enumerated in our submission. We urge the committee to use its report to insist that Canada uphold the obligations it has made to reduce online extremism and to create a safer online environment for Canada's youth.
Thank you.
My name is Monique St. Germain, and I am general counsel for the Canadian Centre for Child Protection, which is a national charity with the goal of reducing the incidence of missing and sexually exploited children.
We operate Cybertip.ca, which is Canada's tip line to report the online sexual exploitation of children.
In 2024 alone, Cybertip processed over 29,000 reports, most of which involved child sexual abuse and exploitation material, also known as CSAM. The next most common reporting category related to online luring and sextortion.
To tackle the explosive growth in online CSAM, we launched Project Arachnid in 2017. It is an innovative, victim-centred set of tools that targets the detection and removal of CSAM online.
Operating at scale, Project Arachnid issues roughly 10,000 requests for removal every day, and some days it's over 20,000. To date, over 67 million notices have been issued to over 1,500 service providers worldwide. It is because we operate Project Arachnid that we understand the challenges of content removal and the immense harm to children when content is not promptly removed. It is through Cybertip.ca that we hear every day from Canadian children and families impacted by something happening online.
In addition to processing those reports, in 2024 alone, we managed nearly 2,800 requests from survivors, youth and caregivers for assistance and support. This unique lens equips us to understand how children are being targeted, victimized and sextorted on the platforms they use every day.
We understand the focus of this committee to be specifically on the issue of child influencers and social media harms to children. On the issue of child influencers, while these accounts can be a source of income for the child and their family, this comes at a personal cost to the child and their safety.
The followers of these types of accounts tend to overwhelmingly be men with a sexual interest in children. In addition, the images child influencers share are often reposted in online forums and chats amongst groups of users who comment on and sexualize these children. This heightens the risk to the individual child and to children generally.
The way social media works makes it easy for those who have a sexual interest in children to not only find child social media accounts, but also to connect with like-minded individuals who share their sexual desires about children. Images of these children are then shared within these groups to fuel sexual discussions about the child. This is likely to have repercussions for the child, extending into adulthood.
Adding fuel to the fire are algorithms. Once a user of a social media platform engages with content in some way, such as liking it or sharing it on their own account, the algorithms are tuned to ensure that the user will see even more of that type of content. The algorithms effectively amplify the content within certain user groups and connect users together who may not otherwise have been connected.
The reality is that social media is focused on ways to increase user engagement, as that's what makes these companies money. To maximize engagement—and thus profits—social media companies have developed these sophisticated algorithms, which help ensure that the user sees more of the content they like.
This is a complex child safety issue. As such, we welcome measures by the federal government to tackle the companies' role in this directly, as well as measures by provincial governments to tackle it through existing child welfare and labour legislation.
On the federal side, we need to regulate a duty of care on platforms with Canadian users or where they utilize content depicting Canadians. Mandating basic safety and design expectations that need to be adhered to is critical. We also need to mandate the detection and removal of known CSAM. Young people need easily understandable and readily accessible ways to have content involving them removed quickly.
Bill , introduced in the last Parliament, while not expressly dealing with child influencers, would have moved Canada in a positive, meaningful direction toward tackling issues like this. It would have imposed a duty of care on companies. It contemplated the development of an age-appropriate design code for children, and it included specific measures to ensure that certain types of sexual content were promptly removed.
We would also add that there's an urgent need to implement age assurance tools and to increase the use of tools like Project Arachnid to enhance removal and to prevent the re-upload of CSAM.
Thank you for allowing us to be part of this committee's study.
:
Thank you, honourable Chair, and committee, for the opportunity to be here today.
The regulation of influencers and social media content affecting children and adolescents is a rapidly evolving global issue. Canadian lawmakers must critically assess and learn from international regulatory successes and failures.
My submission is based on 44 years of clinical practice and several years of chairing a global working group, in conjunction with the University of Calgary's Faculty of Social Work, not only to research but also to address the harms from children accessing sexually explicit material online. We hope to have a publication available at the end of the first quarter of 2026.
I would like to encourage one of you to spearhead the opportunity to develop a non-partisan bill for regulation of Internet access to protect Canadian children. When child protection is framed as a universal societal priority rather than a political issue, we see the most powerful results. Do not assume that because certain countries are using certain approaches, those are the most successful options for child protection. Be aware of these approaches and the research each one has used, who is supporting it and whether they have a financial stake in what they are advocating.
Global models show success through comprehensive research and hearings, input from child safety advocates, mental health professionals, parents, tech companies, industry stakeholders, independent regulatory bodies and cross-party support.
Examples of successful approaches are the Kids Online Safety and Privacy Act and COPPA 2.0 in the U.S., the Online Safety Act in the U.K., the Digital Services Act and the Audiovisual Media Services Directive in the EU, and Brazil's ECA Digital law. Other countries with independent regulators—Germany, South Africa, Korea and Spain—are all coming on [Technical difficulty—Editor]. You need to look, in turn, at all of the international laws, treaties and conventions.
A single guiding principle is article 5 of the UNCRC, the United Nations Convention on the Rights of the Child, concerning the importance of the child's evolving capacities. A child is considered to be under 18 years of age. Therefore, I question Australia's choice of 16. Is this arbitrary? Is it based on research?
Also, the International Centre for Missing and Exploited Children has developed an international model that involves over 68 countries—there might be more than that now—emphasizing the need for a non-partisan evidence-based legal framework.
As for the risks and harms, algorithms on platforms like TikTok, YouTube and Instagram quickly adapt to underage users, exposing vulnerable children and struggling adolescents to significantly more problematic and distressing content. Users behaving like eight-year-olds receive almost seven times more child-directed content than 16-year-olds. Algorithms also react to accounts behaving like struggling adolescents, which receive 30% more problematic and over 70% more distressing content than their non-struggling peers.
If you're a teen with access to OpenAI's Sora 2, you can easily generate AI videos of school shootings and other harmful, disturbing content, despite what CEO Sam Altman has claimed. A recent study from the University of Cambridge's MRC Cognition and Brain Sciences Unit now identifies adolescence as the stage between nine and 32 years of age and as a period of heightened risk for mental health disorders. This is a much longer time span than we have ever been aware of, and that research came out in October 2025.
In 2023, the IWF, the Internet Watch Foundation, hashed 2,401 self-generated sexually explicit images and videos of three- to six-year-olds. Ninety-one per cent were of girls, and 62% were assessed as showing children in sexual poses displaying their genitals to the camera. It takes just a few clicks to find child sexual abuse material online, including AI-generated images, across many platforms.
Andrew Tate's online presence as an influencer has made him a hero to millions of young men, but his ideology is deeply toxic and extremist. He is known for commercializing misogyny and encouraging followers to adopt harmful attitudes towards women.
Current Canadian laws under the Criminal Code and Alberta's Protection of Children from Sexual Exploitation Act are not sufficient. There is a need for regulation that is fit for purpose and safety by design, requiring platforms to identify, report and remove illegal and harmful content immediately.
:
Yes. To date, we don't have any specific legislation that talks about a duty of care for platforms. What we've done is default to the American model, where we're not really doing anything about the companies themselves.
There was a lot of discussion leading up to the introduction of Bill . Our executive director was part of a committee that was advising on which direction Canada should go in. The duty of care model is something that was adopted in the U.K. under their online safety legislation. It does appear to be something that is a step in the right direction in terms of holding the companies themselves accountable.
We certainly have very robust criminal laws in Canada to tackle when somebody is committing a criminal offence against an individual, and we use those criminal offence laws quite a bit. What we do not do, though, is use those criminal laws against companies, and there are very good reasons for that in terms of how difficult it is to prosecute a company.
I'm aware of one known instance where we did prosecute a company for its role in making available child sexual abuse material. The charges were laid in 2012, and the guilty plea was not received until 2020. For eight years, there were numerous court appearances, and a lot of public money was spent on this prosecution. Now, I'm glad that we had that prosecution, but I think eight years is far too long for a resolution on something like this. Criminal law is perhaps not the best fit when it comes to companies.
:
Thank you, Madam Chair. It's great to see you, as always.
Thank you to our witnesses for being with us today.
Last week I asked a certain witness a question, and the response went quite viral. I'd like to take a quick moment to clear up a misconception. From reading the comments, it's clear to me that some seem to think this study is about banning young Canadian men from going to the gym or from listening to and watching CBum. The gym has been an important part of my life for as long as I can recall. There are videos and pictures of me watching my father bench-press at a commercial gym from before I could say the word “bench”. However, the gym alone is not the reason I am who I am today. That credit goes to my parents, who created a safe environment for me and who ensured that the influences around me were good ones. Doing that is becoming increasingly challenging, sometimes impossible. That is what this study is looking to address.
I will now turn to my questions.
Dr. Monsma Selby, at the justice and human rights committee in December 2024, you stated that it “takes just three clicks to find child sexual abuse imagery or child sexual exploitation material” on the Internet. You alluded to this earlier. What does that reveal about the way in which platform design exposes children to, or shields them from, exploitative material?
[Translation]
Ms. Labrosse‑Héroux, according to the ACREA website, the creation of social media content is undeniably a large part of the culture of younger generations.
According to what you're observing among creators in Quebec, how do influencers shape the goals, self‑image and day-to-day decisions of children and adolescents?
How has that changed over time, for example, compared to celebrities? What's the difference between the two?
:
If I may, I'll answer the question as a content creator and influencer myself. The parasocial relationship, I think, is not necessarily new either. Its impact is significant because the device through which content is consumed is built to encourage two-way relationships, unlike with television or radio.
I communicate with my audience, which is made up of strangers, in the same way I communicate with my nephew on FaceTime. A social adjustment may need to be made. I remember working on a project where we were studying how newly established media have been received throughout history. People used to dress nicely to watch television at home until standards and habits changed. There's been a change in culture and, naturally, because of the nature of the device and the two-way relationship, people can respond right away. It's very easy to make a clip or video in response to someone else and to make sure that the person will see it as well.
We are witnessing a change in parasocial relationships that are different from the television era because interaction is now easier and, in theory, much more accessible. However, this isn't a fundamentally new phenomenon. Rather, it's a phenomenon that adapts to technological tools and, as we become more accustomed to it, we'll likely create different social standards around that as well.
:
Thank you very much, Madam Chair.
I also want to thank the witnesses for joining us today for this very important study. It has been very surprising so far. I must admit that the perspectives and points of view that have been shared since we started this study are really interesting. It is all very informative.
We're seeing that many testimonies continue to highlight the harm that is caused by young people's exposure to social media. We also talked about the legal aspects as well as the lack of education and digital literacy.
As part of the meetings we held to prepare for this study, I had the pleasure of having an extremely rewarding and refreshing conversation with Laurence LH. That's her real name. In fact, Labrosse-Héroux is an alias used for the meeting, since we needed a full name. The meeting was rewarding because it opened the door to something other than the stark observations we may be inclined to make. That's why I find it interesting that we are having this conversation in public with ACREA.
First, Ms. Labrosse‑Héroux, tell us a little more about the association.
What kind of content creators and influencers are members?
How do you attract these people, apart from the gala, which is indeed a magnet in this respect?
Also, how do you see the future? How do you see the organization growing to have even more influence among young Quebec and Canadian content creators?
:
Basically, our association is a group of content creators, from very small to very large creators. It could be people with 2,000 subscribers, 300,000 subscribers or 1.3 million subscribers. It's very varied. The topics they discuss online are also varied. Some content creators are specialized. For others, the draw is their personality itself; we see them a bit more as influencers. Some deal with specific topics. For example, Farnell Morisset is a content creator who focuses on current events. Others deal with health, psychology, body positivity or sport. There are a million topics that content creators on the web talk about.
Our association was created in order to build community. Content creators often work alone at home. We wanted to break down that barrier and create a group to let people know that resources exist and that we are a community. People attend the content creator gala, Gala InfluenceCréation, every year because of that.
I also think that, socially and politically, there is a heartfelt need in Quebec to rediscover culture right now. As an association, we're here to promote the idea that just because we turn off the television, it doesn't mean that we've stopped consuming Quebec or Canadian culture. Rather, we've migrated to a new platform. We need to speak to young people where they are, quite simply.
:
This weekend, I read a report on Radio‑Canada's website that I found really interesting. I don't know if you've read it. There's a problem with flies in Mexico. Mexico is breeding sterile flies in order to release them in areas where livestock are being affected by this type of fly. The wild flies reproduce, and their larvae literally eat the livestock. It's something to be seen.
In short, Mexico's technique—I apologize, it was the weekend and I was in a laid‑back mode—made me think about what we could do with social media in general. Mexico's technique is to release more sterile flies than the number of flies that are in the wild in order to stamp out these harmful flies.
I think that this could be a solution. We could put out into the community as many people as possible who have good intentions and who follow certain rules in order to counter the negative effects of others so we could get back to a healthy environment.
I know my time is almost up. Perhaps we can come back to this topic later.
Very briefly, in a few seconds, do you think my analogy is flawed or do you think it makes sense?
:
Thank you, Madam Chair.
I want to thank all the witnesses for being here with us.
Mr. Morisset, you have heard the comments made by the other witnesses today. We like to believe that, generally speaking, people act in good faith, choose to be well informed and want to engage appropriately. However, we are also hearing that there is an extremely alarming increase in sextortion and, more specifically, that young people are being exposed to senseless sexual content.
Don't you think that, no matter what the circumstances, this represents a real danger?
Earlier, you said that the web reacts to the web. I don't know if it was you or Ms. Labrosse‑Héroux who said that.
What does that mean?
:
I'll give you an example to help explain it.
There was some misinformation circulating claiming that people shouldn't use sunscreen because it is toxic. A group of content creators relayed this information, and then other content creators who are doctors said directly in the comments section that people should stop spreading rumours and that the information made no sense. That's what I meant. That's how the web reacts to the web.
Of course, that doesn't solve the problems raised by the other witnesses here. However, when it comes to misinformation, many content creators who specialize in particular fields are tagged in videos, sometimes dozens of times over. They then take the time to respond to those videos and dispel beliefs that are circulating in their field.
:
Obviously, we live in North America, and Quebec is a francophone region within it. It's like a small French-speaking village compared to North America as a whole.
When it comes to influencers, how does the francophone world compare to the English-speaking world of North America, and by that I mean English Canada and the United States? The francophone world is a small slice of the Canadian and North American influencer pie.
Despite everything, is French doing well in terms of influencers' reach?
There is reason to think that there is more English-language content, but are young francophone individuals influenced as much by francophone content as by English-language content?
:
I would tend to say that the algorithm drives the algorithm, as we have heard from other witnesses. The more French-language content you consume, the more French-language content you will be sent. It is determined by consumption.
We do not have any specific studies that give us a super-accurate picture of the situation. However, when it comes to our members and the content creators who are in the organization, it's clear that the vast majority of their subscribers are Quebeckers.
Once they reach a certain threshold, we see that they also turn to the global francophone community. Some also have very large audiences in France. It really depends on each individual case. Is the content created by our members consumed in Texas? The answer is definitely no. However, is it consumed in Sherbrooke and Quebec City? That is most likely the case.
My colleague could also tell you about his experience.
:
Thank you very much, Madam Chair.
Thank you to the committee members as well.
Thank you to the witnesses.
As the chair mentioned, I'm not a regular member of the committee, but this is a really important topic, from a professional, national policy standpoint, but also, for many of us, from a personal one. I have two kids who are under the age of 10 and, for better or for worse, at some point in the near future they will be entering the online world. Therefore, making sure it's a tool that is as safe as possible for them, like any other tool, is a really important thing.
Ms. St. Germain, I want to ask you a few questions.
In your opening comments you touched on the issues around age when it comes to access to social media. A constituent of mine, Jenna Poste, is a member of a group called Unplugged Canada, which has been advocating for raising the minimum age for social media use by teenagers in Canada to 16. Former colleagues of mine in the Nova Scotia legislature have introduced legislation that would do the same in Nova Scotia, and you touched on it. I'm just wondering if you think it's a good idea to move to age 16. Many of these limits are set at age 13 right now. If it is a good idea, how would we enforce that, and how would we make sure it actually functions?
:
Canada is out of step with a lot of its counterparts in other jurisdictions that have already moved to, or started with, a higher age. The age of 16 seems a sensible place to go from the perspective of our agency. The difficulty, of course, is that our country has been significantly influenced by decisions made on the other side of the border. In the United States, the age of digital adulthood was effectively set at 13 years. Canada really did nothing to counter that for years, so there's going to need to be a lot of public education if there is a move in that direction.
We do believe, though, based on what we hear from parents and on the better understanding that the families and youth who come to us have of these harms, that education about the risks would be welcome.
There are, for example, parental controls that are made available on the different platforms that are supposed to help parents in this space. The difficulty, though, is that new apps are coming online every day, so you might learn how to use the parental controls on something like Facebook or Twitter, but you're going to have to learn all sorts of different ways to do that when it comes to another platform. That's why our organization has been advocating that we need some ground rules for all of them.
We also have to recognize that social media is but one of many types of platforms that are causing harm to children in different ways. There are the apps and platforms that children themselves are using, and then there are the apps, websites and other things in which images and information about children are being exchanged, often without their knowledge.
:
Either you or someone else touched on the fact that the off-line world can be heavily regulated for things we know are harmful to children. When that was mentioned, I thought of cigarettes, for example. We recognized a long time ago that cigarettes are certainly dangerous for all people, but especially for young people, so we banned advertising and we made it more difficult for people to access cigarettes. That's easier when it's a physical thing that exists in the real world.
Obviously, the difficulty with social media and technology is that it's digital and it's more ephemeral, if I could say that. You talked about the algorithms. They are part of the problem, obviously, because they feed on themselves and create this cycle that is often negative, and it spirals.
When you're dealing with companies that are titanic, is there a way to adjust or control this so these algorithms do not feed on themselves and keep pushing more negative and harmful things onto young people in particular who may not be able to distinguish and differentiate that as well as an adult might?
:
That's where the duty of care comes in. Knowing that its user is a young person, the algorithm should behave differently than it does with an adult, who may be better positioned to recognize when an algorithm is at work.
We are aware of several lawsuits that have been launched in the United States against AI platforms, for example, related to conversations going on with children that are completely unmoderated and unregulated. In some instances, particularly with very vulnerable children, it can convince them to do things they would not otherwise do. If that was happening in the real world, a person would intervene. There would be some sort of therapeutic or mental health response that would happen, but that's not happening when the conversation is happening with AI.
We have these big problems looming in terms of AI. We already have the existing problems from social media, which we have not tended to yet. Therefore, from the perspective of our organization, we really want to see the Government of Canada and all provincial governments use every power at their disposal to do something about this. Yes, there are going to be difficulties along the way, but we cannot be afraid to use the tools we have to do the things we need to do to protect our citizens, particularly those who are very vulnerable, like children.
:
I find it very interesting that we are discussing age limit laws that have been adopted in other countries. I admit that, before starting this study, I was quite readily in favour of the idea of prohibiting youth under the age of 16 from having access to social media. The more we talk about it today, the more I think that, ultimately, it may not be such a good idea after all.
We could draw a parallel with other things that have been banned for young people, like cigarettes and cigarette advertising. We know that cigarettes are entirely bad for them. The same goes for drugs and alcohol, which they should avoid altogether—although for adults, it may be different when it comes to alcohol—but social media is not necessarily entirely bad. We are facing this situation because we did not react early enough or take charge early enough. Now the idea is to implement measures that may simply be a knee-jerk reaction to the panic that has set in.
I would like to ask the representatives from ACREA whether we should legislate access to social media or the age at which people can access it, or should we instead put all possible resources into educating and improving digital literacy? We could target the young people who are consuming social media as well as those who are creating content.
More and more young content creators are entering this uncharted territory, where there are very few rules to follow. Isn't this where we should be focusing our energy, as opposed to coercion, say?
:
We do not have a position on age or age legislation. We would need to consult our members, as we do not want to misrepresent their views. However, there is a nuance to be made. Cigarettes are entirely bad; there is no way to indulge in them only partially.
Social media, on the other hand, can be consumed in bits and pieces. There are different ways to consume social media. It may be appropriate, even beneficial, for young people to have access to certain educational and positive content published on social media—because there is plenty of that kind of content out there. However, it may not be appropriate to subject them to an unregulated algorithm that is entirely controlled by the platforms.
Perhaps there is an age when it would be appropriate to allow young people to interact, but only in writing. It may be appropriate to make videos at a certain age, but not to allow comments. There are a variety of ways to answer the question, which is not limited to asking at what age young people should be allowed to use social media.
What do you think?
:
We cannot divorce the misinformation and disinformation spreading online from the acts of violent extremism we are seeing on our streets. We've heard from our national security apparatus, from CSIS, the RCMP and other stakeholders that increasingly, individuals, specifically youth, are being driven to extremism or being radicalized online.
There are numerous reports of individuals who were radicalized online now facing criminal charges in this country. Many of them are young members of our society. We've seen groups such as the Goyim Defense League, 764, the Maniac Murder Cult and terrorist organizations like Samidoun and others continuing to spread misinformation and disinformation online. The impact of this on Canadian society should not be taken lightly. This is a national security threat.
You have individuals hearing disinformation online. For instance, the gentleman who stabbed a woman at an Ottawa supermarket was radicalized in his own online echo chamber. He became increasingly radicalized, and the result was that a Canadian citizen almost died.
There's been a lot of talk today about sexual exploitation. When you have groups like 764, a nihilistic group intent on bringing about the downfall of western society, actively targeting youth online and encouraging youth to distribute and produce child pornography, those are threats to our national security.
We've seen increasingly through our monitoring of these extremist threats that they are growing online. Their pervasiveness is growing. The multiplicity of the platforms they are using to spread radical ideology online is increasing by the day, and we are behind the times. That is why our second recommendation is digital literacy for Canadian youth.
The train has left the station. Legislation is fantastic, and setting age limits is fantastic. However, we need to ensure that these youth aren't flying blind into a storm and that they are equipped with the tools to actively confront the misinformation and disinformation they're seeing.
You asked about the connection with the Jewish community. Over the past two years, we've seen an increase of 124% in anti-Semitism. We're dealing with an epidemic of online radicalization and a national crisis of anti-Semitism.
In our 2024 audit, we pointed to the use of AI to disseminate hateful content. There were images of Holocaust survivors on roller coasters at concentration camps and images of Anne Frank using a ballpoint pen, which is a reference to a Holocaust conspiracy theory. All of these were generated with AI.
Anti-Semitism, online radicalization, the misuse of social media and the nefarious acts of social media influencers are inseparable in today's society.
:
Thank you very much, Madam Chair.
Good morning, everyone.
I want to come back to Mr. Champoux's analogy about flies. It struck a chord with me, but I want to take it in the opposite direction.
You said that there are many people with good intentions who share interesting content online. However, there are still people who act more like supporters of Mr. Trump's MAGA movement and who are followed by millions of people.
Can we put something in place to attract more good flies, as Mr. Champoux said, who will serve as a counterweight to these people?
To echo Ms. Labrosse‑Héroux's comments about the web, namely that the web responds to the web, I think that what you are describing is indeed a very real situation. However, we should not ignore the current reality, which is that within social media ecosystems, there is a kind of natural immune system that is not perfect, but it does exist.
If someone presents or creates content that is deeply misleading, inaccurate or even harmful, there are a lot of good flies, so to speak. These are people who will respond and who, generally speaking, will bring a great deal of credibility to their responses. It's not perfect, and we don't claim otherwise, but we shouldn't pretend that this capacity doesn't exist, that we are not able to respond, to set things right, to mount an immune response that is still somewhat effective.
:
If I may, I'd like to add some comments.
It's really up to the person consuming social media to do the necessary checks. If you see a video on TikTok about some news story, for example, look at the first five comments, and that's often where you'll see the first warning signs. If the content was generated by AI, someone will report it. If it's fake news, someone will write that it's not a real source.
We can certainly do a better job of identifying misinformation. However, we also need to raise awareness among social media users in general, so that they know they need to find a way to verify information. This may be less obvious for young people, but it's a bit like what happened on Facebook ten years ago when we started seeing fake news. We saw that certain generations were completely missing the point, meaning they weren't able to distinguish real information from fake information.
:
Okay. Thank you very much.
Mr. Morisset or Ms. Labrosse‑Héroux, one of you also spoke about the freedom and power that our online interactions generate, directly in relation to algorithms. I also heard earlier that, ultimately, each organization manages its own algorithms.
Are algorithms truly democratic, presenting things to us based on our habits, or is there someone behind the scenes who has programmed them to suggest things for us to follow that would be to their advantage?
:
We're increasingly seeing young Canadians who are influenced by the various forms of what is called violent extremism—political violent extremism, religious violent extremism, ideological violent extremism. In this current case, it's political violent extremism that seems to have been the impetus for the attack.
What we need to look at for the purpose of this study is how these individuals are being radicalized. For a young individual to be targeted online, to be indoctrinated, is something we've never had to grapple with before as a society, and it is a national security threat because, as you correctly pointed out, the ramifications are severe.
For far too long, our security apparatus has been ringing the alarm bells about the need for a whole-of-society approach to confront violent extremism as it is permeating the online environment.
Sadly, that is just one of dozens of examples across Canada of an individual being radicalized online, and but for the good work of our national security apparatus and our law enforcement, we would be seeing far more casualties.
I'd like to go back to our second recommendation, which is digital literacy.
We can equip our youth to confront the information. It may be within Parliament's purview to decide that spreading misinformation or disinformation is not illegal, but we can prepare youth to be less susceptible to its ramifications.
If content is going to be awful but lawful, that doesn't mean that it's not impactful, and our youth are especially vulnerable to the ramifications of such content, so there's a need for us to ensure that Canadian youth are prepared for the information that they're witnessing online.
It's not so much about policing the information that's being spread online. It's about preparing youth so that they're not vulnerable to it and won't be indoctrinated in a negative way.
I'd also like to point out our third recommendation, which involves holding Canada to the commitments it made recently as a result of the meeting of the G7 interior and security ministers. There's work to be done, by working with these platforms, to root out the unlawful content that is online. Right now it's far too easy for harmful content to continue to exist online in perpetuity. We need to work to ensure that we're removing the unlawful content and preparing our youth to handle the awful but lawful content.
:
I really appreciate the conversation today on the importance of digital literacy and educating children and our general population, because people are so unaware of the dangers of how our social media apps work and the algorithms on everything. However, I think there is a solution if we look at the broader picture. Certain levels need certain solutions.
If we look at individuals accessing the Internet, period—and I've been a part of discussions around sexually explicit material in the adult industry and children accessing it—and at all of these issues around children accessing the Internet, the question is this: How do we protect children, and how do we protect the majority of children?
By singling out certain social media apps or certain adult industry players, we are not protecting the majority of children. I think we need to look at solutions that are very pragmatic, not solutions that are going to favour one particular industry.
When I look at what's going on all over the globe right now, I see age verification popping up in numerous places. It is seen as successful, but as the reports start to come out, you're going to see that it's like whack-a-mole with a lot of these sites. There are so many of them, and you might think that you're getting Meta sites, but then other sites are going to pop up.
I think the issue is, and you're looking at what's happening in Australia—
:
Thank you, Madam Chair.
Thank you very much to all the witnesses. This is a fascinating discussion. I always come in with a certain idea. By the time the discussion is halfway through, I have a bunch more questions.
It seems to me that when we're talking about age verification and comparisons with things like alcohol or gambling—things that are considered mostly bad—part of the challenge around age is that we have age limits on those things because of an understanding about the point in someone's life at which they are able to perceive and understand the risk of the thing they are undertaking. This is my concern with social media. At one level, we can talk about the content that is online, but at the other level, we need to talk about the algorithm itself.
[Translation]
I think Mr. Morisset said something similar, namely that it is not always the content, but rather the algorithm that can be dangerous for children. Perhaps there is a way to share content without the algorithm.
[English]
For me, sometimes the idea of sharing content is actually not what concerns me as much as the algorithm itself. We've talked about radicalization. We've talked about the idea of explicit images being put in front of people who are looking for them. We know that's how the algorithm works. I tend to think there's a very big difference between how a television works and how social media works. Social media is driven by negative response as much as by positive response, and there's the proximity of comments, as we mentioned as well, which can be really, really harmful for young people.
Monique St. Germain, maybe you can speak to some of the nuances between content and the algorithm itself. Is there a way for us to actually regulate the algorithm?
If anybody else wants to speak about it, that would be great too.
:
Certain types of content, such as child sexual abuse material, everyone agrees will be harmful. No one seems to be disputing that this ought not to be online. That's simple. Where it gets more complicated is when we're talking about content that's perhaps encouraging self-harm or encouraging or contributing to the rise in eating disorders and those sorts of things. That's where we start getting into this trickier space of when it crosses the line. That's where the algorithms can really contribute to the harm. They are pushing more and more of this type of content as opposed to more of the balanced type of exposure you might get if you were truly getting results coming up based on a search that was talking about both sides of the equation.
The algorithms themselves and the way in which they are used to push content, particularly when the user is a young person, that's one thing. That's perhaps an area that could be tackled through regulation. Underpinning all of that, of course, is that we need to know who the young users are, not who they are but that the user is a young user. There are many different ways to go about that, but realistically, because of the way the Internet works, there needs to be a multipronged approach. There can't just be one approach. Device-level is one thing, but there's also the individual websites and the individual platforms, particularly the ones that deliver content that we know young people should not be exposed to, such as explicit pornography and those types of things. Maybe you need to have heightened expectations and requirements in terms of who can access that content and why.
What we're going to start seeing—what we're already seeing—is that in the criminal court context, a lot of younger and younger people are getting involved in the criminal justice system. They are telling the courts they were looking at porn and they had been since they were very young. It's driving a lot of other behaviours. Maybe we can't completely pin it on that—we can't say this causes that—but we can certainly see enough correlations out there that we have to start doing something.
:
I may have something of interest to share with the committee.
One of my friends, who is also a fellow content creator, Sylvain Duclos, is a high school math teacher in Lévis. He started a TikTok channel to offer homework help and explain math concepts. Of course, his target audience was high school students, so they were not 18 years old.
I think this is a very good initiative. We should try to encourage this idea and promote it. It should not be blocked in a cavalier, overzealous manner out of concern for protecting young people.
A few months ago, TikTok launched a separate algorithm with a separate news feed called STEM. This algorithm can identify educational content in the fields of science, technology, engineering and mathematics, and then, when someone decides to follow that feed, it redirects them to content similar to that of Sylvain Duclos.
Another friend of mine, Michelle Houde, is a health professional. She creates sex education content that is, at times, perfectly suited and appropriate for teenagers. There is social value there, too.
I am concerned that, by casting too wide a net in the effort to protect children from sexual content, we may also block this type of content. I believe that the platforms themselves are the algorithm experts; therefore, they are the best people to talk to.
However, I would simply ask the committee not to assume, out of a desire to protect young people from risky content, that there is no good content for them on social media.
:
The potential solutions are really interesting. I am learning a great deal from this discussion.
Right now, we're talking about responsible and positive use of social media. TikTok is generally perceived as the big bad guy, because it's likely the most widely used social media platform. It may have an undeserved bad reputation, but it is often the platform with the most problematic content, so to speak. However, knowing that there are initiatives like the ones we've just discussed, I believe we need to protect them. We need to take that into account in our recommendations.
I personally think we should include recommendations from today's meeting in the report, as they will shed light on the idea of age verification for access to social media. As Mr. Morisset said earlier, this could simply involve restricting certain activities, allowing access to content but not necessarily to conversations.
However, in a context where TikTok offers help with math homework, it is a bit tricky to prohibit young people from asking questions. It's a complex situation. If we decide to legislate on this, we have our work cut out for us.
To return to the subject of algorithms, Mr. Morisset, you say that we need to talk to the platforms. Believe me, when we talk to the platforms and mention the word “algorithms”, they draw the shutters and turn out the lights, and there is nothing we can do about it. The conversation is over and people are on the defensive. So, we also need to be able to have this conversation with everyone.
Ms. Labrosse‑Héroux, when you spoke earlier about consuming culture, you said that content creators and influencers are also vehicles of culture. We also need to take them into account when we talk about the need to promote Quebec culture.
The discussion is focusing more on regulation, but how could we use the carrot rather than the stick?
What could we do to encourage content creators to share more artistic and cultural content, such as songs, videos, films and series produced by Quebeckers?
Is there a way to help content creators and convince them to do more to promote our culture?
:
That's a very good question.
The timing is perfect, because this afternoon the Coalition for the Diversity of Cultural Expressions, or CDCE, launched its call for content creation projects seeking funding. This is the first call for projects aimed at content creators. Obviously, it will fund whatever falls within the CDCE's definition of culture, and artists are part of that.
We could also rethink how we define culture. For the CDCE, cuisine may not count as a cultural element, but it may actually be culture. These are avenues for reflection to consider later.
Promoting our culture is already very well established, at least among Quebec content creators; I will speak to the field I know. There is a strong sense of nationalism and of protecting Quebec culture among francophone content creators, even when it comes to speaking French. This is already deeply rooted.
The more official institutions recognize that content creation is part of culture, the more of it there will be. There is still a lot of room to welcome content creators, both in Quebec and in Canada. Our market is very small compared with those of the United States and France, for example, where content creation is 10 times more prevalent than here. It will come with time.
The more government institutions encourage us to create, the more creators will do so.
:
We receive a lot of reports through Cybertip.ca, and we know from our work with police that this is a significant issue. There are individual offenders who target children online in different ways, but there are also what appear to be organized criminal networks operating offshore that are targeting young people. In many instances, they target young males: they pretend to be female and manipulate the boy into some sort of sexual chat, where the male child in Canada believes he is talking to a female peer. As soon as an image or video is created or shared, the person on the other side turns around and starts threatening to release it to friends, family and everyone else.
A lot of the time, the connection happens on a mainstream social media site like Instagram, and the conversation is then moved to a private channel like Snapchat. In fact, Snapchat is implicated in a lot of the reports made to Cybertip. In recent research we released last week, half of the responding children, all of whom were in Canada and had experienced some form of online harm, implicated Snapchat. Not all of those kids, of course, were manipulated by an organized criminal network, but this is a big part of what we know is happening.
We also know that the tactics being used are very scary and very aggressive. The young people who come to our organization for assistance are terrified of what is going to happen next. A lot of these kids try to manage these situations on their own and reach out for help only when they feel they have no other options. Some kids talk to their parents, and then their parents come in.
We know that it's very challenging for law enforcement to investigate these types of scenarios when they originate with an organized criminal network outside the country. There are all sorts of barriers for police in getting the up-to-the-minute technical information they need. Obviously, investigating online crimes is a lot more complicated than going out and interviewing witnesses in the real world; you need the co-operation of other countries. There have been some developments on that front.
Canada, along with a number of other countries, recently signed a cybercrime treaty in Hanoi, Vietnam. That should help with the exchange of information. Still, this problem is facing not just Canadian young people but young people in many developed nations, because a lot of this is driven by financial extortion. These networks are out to make money. It's just like what we used to see happening to senior citizens, which is still happening. Young people are now seen as a target: young people in our countries have money, and criminal networks know it.
Mr. McKelvey, it is your time, my friend. I see you there waiting patiently to answer a question.
You obviously have some thoughts on all of this. My particular question revolves around the age at which kids are able to assess the risks of social media, knowing that, of course, there are some great examples of math education online through social media but also that the algorithm will favour content potentially very harmful to kids.
Do you have any thoughts on this? Is there an age appropriateness or a time at which these risks can be understood and perceived appropriately?
:
There are two components to unpack there.
One is understanding how algorithms work and what steps we have or have not taken in Canada on that front. One point to emphasize is the difficulty of assessing how any algorithmic or AI system works; effectively describing what these systems are doing is a consistent concern of the research community.
The second is the distinction between knowing exactly and guessing that a system is targeting a particular group. That distinction matters to me when it comes to calls to limit or ban access to social media for certain groups: knowing, as opposed to guessing, requires a fairly invasive privacy or data collection regime.
If you're looking instead at age-appropriate design, where a platform decides it is likely a youth viewing the content and takes measures to recommend good content or applies a different recommendation strategy, we don't actually know what those strategies are, because platforms don't usually disclose how they work.
That would be a different measure: not necessarily banning, but making sure platforms meet certain design criteria in situations where there's a probability they're interacting with a youth or a minor.
:
Mr. McKelvey, I'm sorry you waited until the end. I hope you had enough chances to convey your thoughts in this meeting. However, that is all the time we have for questions today.
I really appreciate all of the witnesses joining us today.
If there's anything you didn't get on the record, or if there's anything that occurs to you later and you have data to share, please send it in an email to our clerk. We can include it in our report. Our analysts and our members will all see that information, so there is another opportunity; you're not done here. We'd really love to hear more from you, if there's more you have to say.
Witnesses, you're free to leave the meeting. We're not going to change to in camera or anything, but your part here is done. We really appreciate your time.
With that, we're going to move into a discussion of committee business.
We have a new minister. We had invited our minister, who had agreed to be here with us on December 8, which is next Monday. We could still invite the new minister to come and testify. I doubt we'd get very fulsome testimony from him, as he'll only have been in the role for a short time. However, I'm leaving it to members to discuss what we'd like to do.
The clerk has also advised that he can try to fill that time with witnesses on this study instead.
Then, I think we should talk about whether we want to continue this study beyond the four meetings we've already laid out.
I have Mr. Champoux first.
:
I think it would be good for us to invite the new minister. We are supposed to meet again on Monday, but that might be too soon.
However, I still think it would be good to see whether Mr. Miller is available to meet with us here in committee. Of course, it would be good for him to have time to review his new files a bit. However, I am certain that he is already up to speed on the subjects we are discussing here.
My first choice would be to meet with Minister Miller next Monday. If that is not possible, I would like to discuss the possibility of adding meetings to this study, which seems to have generated quite a bit of interest. I know that several groups have asked to be heard as witnesses.
If the minister can come see us next week, I would like us to discuss adding a number of meetings after the holiday break. I got the impression that this was more or less unanimous, but we can discuss it.
:
I can appreciate the question around the minister. I know the minister was invited for December 8, and originally Minister Guilbeault had said yes; he had accepted that invitation, which we were very pleased with. Now there has been a change in who the minister is, but the position remains the same and the file remains the same. I think we can continue to extend the invitation to the position, which happens to be held by a new minister now, Minister Miller, and see what he says.
The other question is that it seems to be somewhat assumed that this study is going to continue, but the members at this table have not actually come to that conclusion.
According to the motion, this was a four-meeting study, which will come to an end this Wednesday. There is another conversation that needs to be had, with regard to whether members at this table wish to extend the current study and, if so, for how long.
I don't want there to be assumptions and would like it to be explicitly defined, please.
:
Incidentally, I completely agree with Mrs. Thomas that we have two issues to discuss. No one has actually concluded that we will continue with this study. Continuing is what I proposed, and I thought others were in agreement, but we have not concluded that we will do so. We also need to have that discussion.
The first question we need to answer is whether we are going ahead with having the minister come on Monday. We also need to ask ourselves whether we would still want department officials to come if the minister does not.
Personally, I would like us to maintain the decision to invite the minister to appear. If the minister is unable to come, I would prefer to wait until after the holidays to hear from the minister and the officials, rather than hearing from the officials alone.
To resolve the issue of the minister's invitation to appear, I would like us to extend the invitation to him. Let's proceed as if the minister will be with us next Monday, as planned. If that is not possible, I would postpone the meeting with the minister and the officials until 2026.
:
Thank you very much, Chair.
Respectfully, I have a question about that. I understand that this is a really important study. The thing with studies like this is that they can extend for quite some time. It's hard to know when the natural cut-off point is because you do hear such interesting testimony and it all does seem very relevant and important, so I give you that, for sure.
My honourable Bloc colleague put forward a number of witnesses he wanted to hear from. As of Wednesday, he will have heard from most, if not all, of them, I believe. There are other witnesses, of course, who were suggested by the Liberals and the Conservatives.
I can see extending this study by two more meetings. I have a hard time seeing it go beyond that, just because I know there are other topics we want to look at as well.