Evidence of meeting #17 for Canadian Heritage in the 45th Parliament, 1st session. (The original version is on Parliament’s site, as are the minutes.)

A video is available from Parliament.

Before the committee

Fenwick McKelvey  Associate Professor, Information and Communication Technology Policy, Concordia University, As an Individual
Laurence Labrosse-Héroux  General Manager, Association des créatrices et créateurs de contenu du Québec
Richard Robertson  Director, Research and Advocacy, B'nai Brith Canada
Monique St. Germain  General Counsel, Canadian Centre for Child Protection
Jocelyn Monsma Selby  Chair, Clinical Therapist and Forensic Evaluator, Connecting to Protect
Farnell Morisset  Secretary, Board of Directors, Association des créatrices et créateurs de contenu du Québec

The Chair Liberal Lisa Hepfner

I call this meeting to order. Welcome to meeting number 17 of the Standing Committee on Canadian Heritage.

Before we begin, I'd ask all in-person participants to read the guidelines written on the updated cards on your table. These are measures in place to help prevent audio incidents and to protect the health and safety of all participants, including the interpreters. You will also notice there's a QR code to a short awareness video, should you need that. Pursuant to the routine motion adopted by the committee, I can confirm that all witnesses online have completed the required connection tests in advance of this meeting. Please wait until I recognize you by name before you speak. All comments should be addressed through the chair.

Pursuant to Standing Order 108(2) and the motion adopted by the committee on Wednesday, November 5, 2025, this committee is meeting to study the effects of influencers and social media content on children and adolescents.

We have with us today, by video conference, Fenwick McKelvey, associate professor of information and communication technology policy at Concordia University.

From the Association des créatrices et créateurs de contenu du Québec, we welcome Laurence Labrosse‑Héroux and Farnell Morisset.

From B'nai Brith Canada, we have Richard Robertson, by video conference.

From the Canadian Centre for Child Protection, we have Monique St. Germain, by video conference.

Also by video conference, we have Dr. Jocelyn Monsma Selby from Connecting to Protect.

Everybody has five minutes for an opening statement.

We'll start with Fenwick McKelvey.

Go ahead, sir.

Fenwick McKelvey Associate Professor, Information and Communication Technology Policy, Concordia University, As an Individual

I am an associate professor in information and communication technology policy at Concordia University. I'm presenting along with Dr. Elizabeth Dubois, who cannot be here today, and sharing research developed at the Pol Comm Tech Lab's project on political and civic influencers.

We are presenting today as experts in digital media systems and automated media, with growing attention to influencers, real and virtual. While our research largely attends to broader media dynamics, we emphasize some trends necessary for understanding how these changes may affect children, and especially youth.

Research shows that children have lived through profound changes in our media systems. Twitch and TikTok, as well as games like Roblox and Fortnite, normalize and teach children to use cryptocurrencies, such that 80% of respondents in a youth survey claim to have invested in an in-game currency. AI usage as well is dominated by Gen Z users, with many of our students being exposed to AI content daily.

We are just beginning to appreciate the scale of these changes. Fifty-five per cent of teens use TikTok, whereas just 22% of adults use it. The gap in media habits seems to be growing, so we need to be mindful not to panic over the change. Youth have sophisticated media habits; they are neither helpless nor without support. Parents and schools play important roles in teaching media literacy. Most parents have talked to their children about online safety.

One major challenge has been the rise of online influencers in our media system. An influencer is “a highly visible subset of digital content creators defined by their substantial following, distinctive brand persona, and patterned relationships with commercial sponsors”, according to Brooke Erin Duffy in 2020. Influencers vary in quality and reliability, but speaking as a professor and from what we've encountered in our research, I know that our students know people who have careers as influencers, and many of our students can better identify with influencers than with journalists as a career choice. Seventy-eight per cent of youth follow an influencer. Dr. Elizabeth Dubois studies the role of political influencers, a key consideration in how youth form political opinions today.

However good their strategies may be, youth have to navigate a complex media environment that at times is adversarial. Influencers may be used intentionally to perpetuate known harms, such as cyber-bullying, negative mental and physical health impacts and disinformation, but they are not the only challenge in today's media system.

Media systems are increasingly flooded with fake, scammy and AI-generated content. The next turn in this ongoing experiment on youth will involve AI. Internet users of all kinds are also impacted by scams from unaccountable online advertising and ineffective online safety measures in many of our platforms. Reuters reports that $16 billion—10% of Meta's advertising revenue—comes from scam advertising. Being online now requires constant attention to avoid being scammed.

Influencers, AI and media systems more broadly have complex positive and negative effects, but increasingly, the problem is not connectivity, exposure or being online; it's about good habits, good supports and good choices.

Policy can improve this environment through better accountability for advertising and advertising-supported industries like social media and very large online platforms; better protection against scams, and more accountability for advertising firms and platforms that do not effectively address scams, harmful apps and false advertising on their platforms; and better standards to help influencers demonstrate their trustworthiness and reliability.

These objectives can happen through support to ensure age-appropriate design in platforms and through efforts to ensure that platforms fulfill their social mission; enforcement and scaling of existing regulation through the CRTC and the Competition Bureau to compel these regulators to be more proactive against false advertising, to combat stereotyping and to work to create good jobs for this new class of media creators—influencers; and finally, continued support for public service media and greater support for frontline workers teaching media education.

Thank you very much. That concludes my comments.

The Chair Liberal Lisa Hepfner

You didn't use the full five minutes. Thank you very much, sir.

We will go to the room now.

Laurence Labrosse‑Héroux and Farnell Morisset, on behalf of the Association des créatrices et créateurs de contenu du Québec, between the two of you, you have the floor for five minutes.

Laurence Labrosse-Héroux General Manager, Association des créatrices et créateurs de contenu du Québec

Good morning, everyone.

We're very pleased to be here to discuss issues that directly concern our industry and our community.

I'll start by introducing myself. My name is Laurence Labrosse‑Héroux and I'm the co‑founder and general manager of the Association des créatrices et créateurs de contenu du Québec, or ACREA. I'm joined by my colleague Farnell Morisset, who is himself a content creator and a member of the board of directors. In fact, given that he creates content that is focused on politics and news, Farnell is doing a lot, in my opinion, to combat disinformation.

We're here today to represent Quebec's content creators. ACREA's mission is to bring together Quebec's content creators within a single community in order to develop unified talking points, lend credibility to the industry, get organized and increase our influence. Over the course of the year, we mainly organize small events to bring together content creators. We're also the organization that produces the Gala InfluenceCréation, the official content creator gala, which promotes content creation in Quebec. Our organization has been in existence for about two years, and it currently has almost 1,000 members.

ACREA aims to offer its members various resources to help them in their journey as content creators. We offer various online training courses, one of which focuses specifically on mental health. This fall, we also organized a 14‑unit training session, which included two full sections dealing with disinformation and public relations. We're aware of the issues affecting our community and are on top of them. We aim to ensure that our members are aware of these issues as well. Naturally, because we represent a rapidly developing industry, there's a lot that has to be done. Our community is slowly getting organized.

There's still a lot to do. We're also aware that we have a direct influence on Gen Z and Gen Alpha, who are very present on social media. That's where they get all the information that makes up their daily lives and their culture. There's a lot to be done in this industry, and we believe it's very important to make sure that content creators are aware of the influence they can have on their own audiences, particularly young people. We have seen a number of examples of abuses in this regard in recent years.

Currently, we get the sense that the web reacts to the web. That doesn't mean that things are always well regulated or well managed, but content creators, who generally specialize in specific fields, do go to the trouble of responding to abuses that can occur on the web.

We're very pleased to be here with you today to talk about such important issues. We hope we can shed some light on the reality we experience in this respect.

The Chair Liberal Lisa Hepfner

You're all very precise today.

Next, we will turn to B'nai Brith Canada and Richard Robertson.

Sir, you have the floor for five minutes, should you need five minutes.

Richard Robertson Director, Research and Advocacy, B'nai Brith Canada

Thank you, Madam Chair.

Honourable committee members, my name is Richard Robertson. I'm B'nai Brith Canada's director of research and advocacy.

B'nai Brith Canada is Canada's oldest human rights organization and the voice of Canada's grassroots Jewish community. Our organization, which was established in 1875, is dedicated to eradicating racism, anti-Semitism and hatred in all of its forms, as well as championing the rights of the marginalized.

The radicalization and indoctrination of Canadian youth online represents a dangerous threat to our national security and the continued vitality of communities across Canada. The spread of disinformation and misinformation online propagated by influencers on social media is having a devastating impact on the well-being of Canadian children and adolescents. Not only are they being victimized by the obscene content they are encountering, but our online spaces are sadly being used to indoctrinate them into violent extremism.

Canada has a duty to protect its next generation from the dangers of increasingly virulent and polarized online content. As a society, we cannot afford to remain idle while our youth are subjected to the disseminations of nefarious actors and the proliferation of radical ideologies on social media and other digital platforms.

This committee, through its present study, has the opportunity to make recommendations that will provide additional protections for young Canadians as they navigate online spaces. If implemented, our recommendations will ensure that our nation is proactively confronting the risk posed by the radicalization of Canadian youth online. To assist the committee, B'nai Brith Canada presents the following three recommendations.

Our first recommendation is that the committee recommend that the Standing Committee on Public Safety and National Security commence a study on the threat of youth radicalization online in Canada.

In July 2025, the Canadian Security Intelligence Service, CSIS, published its public report for 2024, which discussed violent extremism and youth radicalization. The report indicates that Canada has seen a growing trend of youth involved in counterterrorism investigations, some of them as young as 13 years old. In its submission to the committee, B'nai Brith Canada highlighted a series of cases involving youth who were radicalized online and subsequently charged with offences ranging from terrorism to distributing child pornography. The transformation, in part through online radicalization, of the role of youth from victim to perpetrator, as indicated in CSIS's 2024 public report, warrants the Standing Committee on Public Safety and National Security commencing a study on the threat of youth radicalization online to assist the federal government in better understanding the issue at hand.

Our second recommendation is that the committee recommend a national program be developed to enhance the digital literacy of youth relating to the misinformation and disinformation they may encounter online.

The need for youth digital literacy in Canada has been acknowledged by multiple actors. A 2011 RCMP report titled “Youth Online and at Risk: Radicalization Facilitated by the Internet” describes the radicalization of youth not as a new phenomenon but as one that is only growing in pervasiveness. The report lists several methods to counter youth radicalization, including website helplines and reporting mechanisms, but ultimately calls for a multi-sphered approach of open dialogue and education for youth.

Over the past decade, efforts have been made to enhance the availability of digital literacy programs for Canadian youth. However, as B'nai Brith Canada notes in our submission, these efforts have been advanced by non-governmental actors. There remains a need for a coordinated national program. The time is now for such a program to be developed and implemented by our federal government.

Our final recommendation is that the committee recommend the Government of Canada take tangible steps to act on the commitments made with regard to social media and online harms in the 2025 G7 interior and security ministers communiqué. These commitments are enumerated in our submission. We urge the committee to use its report to insist that Canada uphold the obligations it has made to reduce online extremism and to create a safer online environment for Canada's youth.

Thank you.

The Chair Liberal Lisa Hepfner

Thank you.

Next, we'll turn to the Canadian Centre for Child Protection.

Monique St. Germain, you have five minutes.

Monique St. Germain General Counsel, Canadian Centre for Child Protection

Thank you.

My name is Monique St. Germain, and I am general counsel for the Canadian Centre for Child Protection, which is a national charity with the goal of reducing the incidence of missing and sexually exploited children.

We operate Cybertip.ca, which is Canada's tip line to report the online sexual exploitation of children.

In 2024 alone, Cybertip processed over 29,000 reports, most of which involved child sexual abuse and exploitation material, also known as CSAM. The next most common reporting category was tied to online luring or sextortion.

To tackle the explosive growth in online CSAM, we launched Project Arachnid in 2017. It is an innovative, victim-centred set of tools that targets the detection and removal of CSAM online.

Operating at scale, Project Arachnid issues roughly 10,000 requests for removal every day, and some days it's over 20,000. To date, over 67 million notices have been issued to over 1,500 service providers worldwide. It is because we operate Project Arachnid that we understand the challenges of content removal and the immense harm to children when content is not promptly removed. It is through Cybertip.ca that we hear every day from Canadian children and families impacted by something happening online.

In addition to processing those reports, in 2024 alone, we managed nearly 2,800 requests from survivors, youth and caregivers for assistance and support. This unique lens equips us to understand how children are being targeted, victimized and sextorted on the platforms they use every day.

We understand the focus of this committee to be specifically on the issue of child influencers and social media harms to children. On the issue of child influencers, while we understand that these accounts can be a source of income for the child and their family, this comes at a personal cost to the child and their safety.

The followers of these types of accounts tend to overwhelmingly be men with a sexual interest in children. In addition, the images child influencers share are often reposted in online forums and chats amongst groups of users who comment on and sexualize these children. This heightens the risk to the individual child and to children generally.

The way social media works makes it easy for those who have a sexual interest in children to not only find child social media accounts, but also to connect with like-minded individuals who share their sexual desires about children. Images of these children are then shared within these groups to fuel sexual discussions about the child. This is likely to have repercussions for the child, extending into adulthood.

Adding fuel to the fire are algorithms. Once a user of a social media platform engages with content in some way, such as liking it or sharing it on their own account, the algorithms are tuned to ensure that the user will see even more of that type of content. The algorithms effectively amplify the content within certain user groups and connect users together who may not otherwise have been connected.

The reality is that social media is focused on ways to increase user engagement, as that is what makes these companies money. To maximize engagement—and thus profits—social media companies have developed these sophisticated algorithms, which help ensure that the user sees more of the content they like.

This is a complex child safety issue. As such, we welcome measures by the federal government to tackle the companies' role in this directly, as well as measures by provincial governments to tackle it through existing child welfare and labour legislation.

On the federal side, we need to impose a duty of care on platforms that have Canadian users or that use content depicting Canadians. Mandating basic safety and design expectations that must be adhered to is critical. We also need to mandate the detection and removal of known CSAM. Young people need easily understandable and readily accessible ways to have content involving them removed quickly.

Bill C-63, introduced in the last Parliament, while not expressly dealing with child influencers, would have started Canada in a positive, meaningful direction to begin tackling issues like this. It would have imposed the duty of care on companies. It contemplated the development of an age-appropriate design code for children, and it included specific measures to ensure that types of sexual content were promptly removed.

We would also add that there's an urgent need to implement age assurance tools and to increase the use of tools like Project Arachnid to enhance removal and to prevent the re-upload of CSAM.

Thank you for allowing us to be part of this committee's study.

The Chair Liberal Lisa Hepfner

Thank you for your evident expertise and sobering testimony.

Finally, we will go to Connecting to Protect and Dr. Jocelyn Monsma Selby.

You have the floor now, Madam, for five minutes.

Jocelyn Monsma Selby Chair, Clinical Therapist and Forensic Evaluator, Connecting to Protect

Thank you, honourable Chair, and committee, for the opportunity to be here today.

The regulation of influencers and social media content affecting children and adolescents is a rapidly evolving global issue. Canadian lawmakers must critically assess and learn from international regulatory successes and failures.

My submission is based on 44 years of clinical practice experience and several years of chairing a global working group in conjunction with the University of Calgary Faculty of Social Work not only to research but to address the harms from children accessing sexually explicit material online. Hopefully we'll have a publication that is available at the end of the first quarter in 2026.

I would like to encourage one of you to spearhead the opportunity to develop a non-partisan bill for the regulation of Internet access to protect Canadian children. When child protection is framed as a universal societal priority rather than a political issue, we see the most powerful results. Do not assume that because certain countries are using certain approaches, those are the most successful options for child protection. Be aware of these approaches and the research each one has used, who is supporting it and whether they have a financial stake in what they are advocating.

Global models show success through comprehensive research and hearings, input from child safety advocates, mental health professionals, parents, tech companies, industry stakeholders, independent regulatory bodies and cross-party support.

Examples of successful approaches are the Kids Online Safety and Privacy Act and COPPA 2.0 in the U.S., the Online Safety Act in the U.K., the Digital Services Act and the audiovisual media services directive in the EU and Brazil's ECA Digital law. Other countries with independent regulators—Germany, South Africa, Korea and Spain—are all coming on [Technical difficulty—Editor]. You need to look, in turn, at all of the international laws, treaties and conventions.

A single guiding principle is article 5 of the UNCRC, the United Nations Convention on the Rights of the Child, concerning the importance of the child's evolving capacities. A child is considered to be under 18 years of age. Therefore, I question Australia's choice of 16. Is this arbitrary? Is it based on research?

Also, the International Centre for Missing and Exploited Children has developed an international model that involves over 68 countries—there might be more than that now—emphasizing the need for a non-partisan evidence-based legal framework.

As for the risks and harms, algorithms on platforms like TikTok, YouTube and Instagram quickly adapt to underage users, exposing vulnerable children and struggling adolescents to significantly more problematic and distressing content. Users behaving like eight-year-olds receive almost seven times more child-directed content than 16-year-olds. Algorithms also react to accounts behaving like struggling adolescents, which receive 30% more problematic and over 70% more distressing content than their non-struggling peers.

If you're a teen with access to OpenAI's Sora 2, you can easily generate AI videos of school shootings and other harmful, disturbing content, despite what CEO Sam Altman has claimed. A recent study from the University of Cambridge’s MRC Cognition and Brain Sciences Unit now describes adolescence as the stage between nine and 32 years of age, a period of heightened risk for mental health disorders. This is a much longer time span than we have ever been aware of, and that is current research that came out in October 2025.

In 2023, the IWF, the Internet Watch Foundation, hashed 2,401 self-generated sexually explicit images and videos of three- to six-year-olds. Ninety-one per cent were of girls, and 62% were assessed as images showing children in sexual poses displaying their genitals to the camera. It takes just a few clicks to find child sexual abuse material online, including AI-generated images across many platforms.

As an influencer, Andrew Tate has an online presence that has made him a hero to millions of young men, but his ideology is toxic and extremist. He is known for commercializing misogyny and encouraging followers to adopt harmful attitudes towards women.

Current Canadian laws under the Criminal Code and Alberta's Protection of Children from Sexual Exploitation Act are not sufficient. There is a need for regulation that is fit for purpose and safety by design, requiring platforms to identify, report and remove illegal and harmful content immediately.

The Chair Liberal Lisa Hepfner

Dr. Selby, I'm sorry to interrupt, but we are out of time. If you would wrap up your comments and then hopefully, we can get to more of your points later in our questioning.

Jocelyn Monsma Selby Chair, Clinical Therapist and Forensic Evaluator, Connecting to Protect

I will. Absolutely.

In Canada, we have PIPEDA, the Personal Information Protection and Electronic Documents Act, that is only enforceable if there is a breach of privacy. Canadians must close the gap between rapid Internet development and needed regulation, moving beyond a partisan debate to protect children and vulnerable individuals with pragmatic leadership.

Thank you for the opportunity to be here today.

The Chair Liberal Lisa Hepfner

Thank you for joining us.

We'll turn to questions now, starting with—

Martin Champoux Bloc Drummond, QC

Excuse me, Madam Chair.

The Chair Liberal Lisa Hepfner

Mr. Champoux, you have the floor.

Martin Champoux Bloc Drummond, QC

Before we start the question period, Madam Chair, I want us to set aside some time at the end of today's meeting to discuss committee business.

As we all know, a new Minister of Canadian Identity and Culture was appointed today, which throws a wrench in our plans for our break in the coming weeks. We will need to discuss that.

We also need to discuss how we see things moving forward for the current study. I know that some have expressed a desire to add meetings. Madam Chair, I want to make sure that we set aside at least 15 minutes at the end of today's meeting to discuss these issues.

The Chair Liberal Lisa Hepfner

Are there any other comments?

Rachael Thomas Conservative Lethbridge, AB

It was translated as 45 minutes. That seems to be a bit much. Conservatives would be in support of 10 minutes at the end of the meeting, if possible.

The Chair Liberal Lisa Hepfner

I understood Mr. Champoux to say 15 minutes.

Rachael Thomas Conservative Lethbridge, AB

Okay. We have a lot of witnesses, and I just want to make sure that they have an opportunity today.

Martin Champoux Bloc Drummond, QC

Ten minutes is okay.

The Chair Liberal Lisa Hepfner

I agree.

Thank you very much, Mrs. Thomas.

Now you have the floor for six minutes.

Rachael Thomas Conservative Lethbridge, AB

Thank you so much.

Ms. St. Germain, thank you so much for joining us virtually here today. I'm compelled by your testimony. I have a few quick questions for you.

First off, I'm hoping that you can dive into explaining a little bit more. You talked about the importance of duty of care with regard to platforms, and you suggested that legislation is needed in this regard. I'm hoping that you can expand on that for this committee.

Monique St. Germain General Counsel, Canadian Centre for Child Protection

Yes. To date, we don't have any specific legislation that imposes a duty of care on platforms. What we've done is default to the American model, where we're not really doing anything about the companies themselves.

There was a lot of discussion leading up to the introduction of Bill C-63. Our executive director was part of a committee that was advising on which direction Canada should go in. The duty of care model is something that was adopted in the U.K. under their online safety legislation. It does appear to be something that is a step in the right direction in terms of holding the companies themselves accountable.

We certainly have very robust criminal laws in Canada to tackle when somebody is committing a criminal offence against an individual, and we use those criminal offence laws quite a bit. What we do not do, though, is use those criminal laws against companies, and there are very good reasons for that in terms of how difficult it is to prosecute a company.

I'm aware of one known instance where we did prosecute a company for its role in making available child sexual abuse material. The charges were laid in 2012, and the guilty plea was not received until 2020. For eight years, there were numerous court appearances, and a lot of public money was spent on this prosecution. Now, I'm glad that we had that prosecution, but I think eight years is far too long for a resolution on something like this. Criminal law is perhaps not the best fit when it comes to companies.

Rachael Thomas Conservative Lethbridge, AB

I can appreciate that. If criminal law is not maybe the best fit when it comes to large companies, what is the best way, then? What is the best way to enforce compliance? How do we create teeth?