Evidence of meeting #15 for Canadian Heritage in the 45th Parliament, 1st session. (The original version is on Parliament’s site, as are the minutes.)

A recording is available from Parliament.

Before the committee

Daniel Clark, Professor, Ivey Business School, Western University, As an Individual
Wanda Polzin Holman, Chief Executive Officer, Little Warriors

3:30 p.m.

Conservative

The Vice-Chair Conservative Rachael Thomas

I call this meeting to order.

Welcome to meeting number 15 of the Standing Committee on Canadian Heritage.

I believe that we all know the guidelines with regard to our earpieces. Please don't just throw them down. There's a nice little sticker. If you could just put them there in order to protect the ears of the interpreters, that would be wonderful.

Today, we have two witnesses joining us. We have Mr. Daniel Clark and Ms. Wanda Polzin Holman, who is joining us virtually from Edmonton, I believe.

In just a moment, I will give each of you an opportunity to give five-minute opening remarks, and then we will go into questions from members. During that period of time, we start with questions from the Conservatives, then the Liberals and then the Bloc Québécois. We'll continue to rotate through.

As you know, the study we are discussing today—and this is our first meeting on this study—has to do with the impact of social media, and in particular influencers on social media, on those under the age of 18. We look forward to hearing from those experts who are with us today and learning what you have to share with us.

With that, I will hand the opportunity to speak over to Mr. Clark.

You have the floor for five minutes.

Daniel Clark Professor, Ivey Business School, Western University, As an Individual

Madam Chair and committee members, I would like to thank you for the opportunity to testify before you today.

I've been studying the ethics of children in social media production for the last three years, with research published in the Journal of Business Ethics, a top-ranked journal, entitled “The Child Labor in Social Media: Kidfluencers, Ethics of Care, and Exploitation”. This work is necessary, as a recent Harris poll found that 29% of children aged eight to 12 aspire to a career on YouTube—more than any other career—and are at risk of economic exploitation, consent violations, privacy loss and other harms.

There are two concepts in this field. “Sharenting”, a portmanteau of “share” and “parenting”, is where parents build social media channels around their children for other parents as the audience. “Kidfluencing” is where children influence other children or adults through their own social media personality. These concepts differ in the child's role and audience, but they share many ethical concerns.

Using the UN Convention on the Rights of the Child as our framework, our research suggests that the following children's rights are at risk.

First, there's the right to consent. These channels are owned and operated by parents. Because children participate at their parents' direction, the risk that they have not genuinely consented is constant.

There's the right to privacy. Even moderately successful channels can expose children to millions of strangers. Their everyday follies, embarrassments and charms exist forever in social media, potentially haunting them for life.

There's freedom from economic exploitation. These channels can generate thousands to millions monthly from the child's involvement. The children should benefit proportionally, but there's no guarantee that they do.

There's the right to education. Being a social media star requires significant time. Kidfluencing may seem like play, but it's work, like acting or performance. It is not uncommon for kidfluencers to be home-schooled. Where is the time for school, play and sleep?

There's freedom from harm. Parents may put children at physical risk creating compelling content. There's also the risk of child predators forming parasocial relationships with child influencers. Recently, a study found that 95% of adult influencers had been subject to stalking behaviours, and 40% felt fear as a result. Unfortunately, we don't have such data about child influencers.

There's freedom of expression. Children are brand ambassadors endorsing products and expressing opinions often not their own. When the child is 25, their 15-year-old opinions might prevent them from getting a job or otherwise committing to a position.

Over the past year, I've been interviewed numerous times about this paper. I'm always asked, “What can we do about it?”

Anything in the sphere of child welfare is deferred to the decision-making of the parents—except that, when parents are generating income, sometimes significant income, some parents' decision-making may be compromised. The platforms these videos appear on, such as Instagram, TikTok and YouTube, earned $11 billion in advertising to children in 2022. By limiting account ownership to the age of majority, they place an onus on the parents to ensure that their children are protected. They set guidance about what is and is not allowed on their platforms—violence, gratuitous nudity or sex, etc.—but this is clearly not enough to ensure that online child actors are free from exploitation.

That means the federal government may have a role in responsibly regulating this practice. We need to protect these children. Internationally, other jurisdictions have made some progress toward protecting children in these arrangements and their future earnings. It's time Canada also took action.

We need to protect these children from the negative impacts that may arise from the parasocial elements of global exposure through social media. We're all aware of the harm that befell child actors like Macaulay Culkin, Gary Coleman, Shirley Temple and countless others through exploitation. While it's debatable that there is even a need for kidfluencing in any capacity, if it is allowed in this country, then children who are the subject of this social media enterprise deserve as much protection and recourse from the harms as our law can provide.

Thank you, Madam Chair. I look forward to your questions.

3:35 p.m.

Conservative

The Vice-Chair Conservative Rachael Thomas

Excellent. Thank you very much, Mr. Clark.

We will now go to Wanda Polzin Holman for five minutes.

Wanda Polzin Holman Chief Executive Officer, Little Warriors

Good afternoon, Madam Chair and committee members.

My name is Dr. Wanda Polzin Holman, and I'm the CEO of Little Warriors.

Little Warriors is a national charitable organization, and we have been recognized through numerous scientific and clinical journals as being a leader in the field of child sexual abuse awareness, prevention, advocacy and evidence-based treatment. I've been involved with Little Warriors for over eight years, including a previous role as clinical director. I'm clinically trained and have obtained a master's degree and a doctorate. I am currently a registered clinical social worker.

I appreciate the opportunity to share Little Warriors' perspectives with the standing committee. On behalf of the children and families we serve, I'm very appreciative of the committee for undertaking this important study. It is indeed an area requiring further understanding and actions.

There are some key issues that we have observed at Little Warriors with regard to children and adolescents, which I would like to highlight.

First and foremost, as a result of social media influencers, we are witnessing significant deterioration with regard to mental health issues, including an overall increase in levels of stress, cyber-bullying, suicidality, anxiety, depression, self-concept concerns and radicalization of gender bias. Also, there are concerns related to sextortion and online grooming, as well as luring of our children. This happens both in plain sight—as we are all on social media—and in very subtle ways through gaming platforms and social media, which parents and educators may not always be apprised of.

Families, even those who do their best to ensure proper controls on devices, are very concerned about their children's online experiences. We have seen this first-hand at Little Warriors when treating children and adolescents who have fallen into the hands of predators.

We understand that there are ongoing issues related to inappropriate content, online interactions with unsafe individuals and algorithm-driven risks. We are seeing gaps in the digital literacy of children's educators as well as the parents, and the platforms are constantly changing, which is concerning for us all.

Additionally, there are issues regarding loss of privacy that children and teens do not always comprehend, and there are concerns related to children's digital footprints. These could obviously have long-lasting negative implications for them.

There are gaps in legislation, deterrence and penalties regarding online and in-person harm, and access to children by potential predators across geographical borders. Overall, at Little Warriors we are concerned about child exploitation and sexual abuse, and about the lack of clear and consistent sentencing and regulations. As Canadians, we seem to understand that there are controls required for other aspects relating to children's safety, but we have yet to address social media content harms.

In light of these concerns, I'm hopeful that this review will result in decisive action to protect children and to uphold accountability. Specifically, first, ensure that survivor-centred supports, including prevention programs such as Little Warriors' Prevent It! program, are included in new policy measures to support schools, charities and other in-person and online community organizations to expand prevention and support resources.

Second, review sentencing gaps and issues of deterrence. We have witnessed child sexual abuse offenders being released into the community with warnings, only to be found reoffending a short time later. Protecting children must take precedence over the rights of offenders who perpetrate abuse.

Third, legislate stronger, more consistent sentencing provisions for offences related to the possession, access and distribution of child sexual abuse and exploitation materials.

Fourth, make concessions for individuals to donate to charitable organizations and to financially support organizations such as Little Warriors that invest in prevention efforts and work with survivors.

The work of this committee is a defining moment for Canada to act with moral clarity and to ensure better safeguards to protect vulnerable children, online and in person.

I appreciate your time today and look forward to questions.

3:40 p.m.

Conservative

The Vice-Chair Conservative Rachael Thomas

Thank you very much for your time.

I will go to our first member of Parliament to ask questions today, and that is Mr. Waugh.

Mr. Waugh, you have the floor for six minutes.

3:40 p.m.

Conservative

Kevin Waugh Conservative Saskatoon South, SK

Thank you, Madam Chair.

Thank you, Dr. Polzin Holman and Professor Clark, for being here.

My first question is for Little Warriors.

Wanda, I think the public is seeing increased occurrences of exploitation—and this is a word that we've talked a lot about here in the House of Commons this week. We're seeing it at all levels, but can you give us some stats on children? You made four or five points, but do you have any stats that maybe you could share with us here today about the exploitation of children online?

3:40 p.m.

Chief Executive Officer, Little Warriors

Wanda Polzin Holman

I certainly do. I'm also happy to provide more statistics afterwards. Some of the recent statistics that I'm aware of are specifically with regard to the sextortion of children and youth. Since 2020, there's been an 80% increase in reported sextortion cases, and victim demographics are most often youth aged 12 to 17.

In addition, we also know that Cybertip, for example, has reported a 150% increase with regard to sextortion and online luring between June 2022 and August 2023. Those are the most recent statistics that I'm aware of.

3:45 p.m.

Conservative

Kevin Waugh Conservative Saskatoon South, SK

You also mentioned gaps in legislation, and I would like you to talk a little bit about that, if you don't mind. We've talked about it a little bit here in Parliament, but you brought it up and you talked about the gaps in legislation, mainly the sentencing gaps that we've seen over the last number of years.

Just bring us up to date on what you are seeing in the Edmonton and Alberta areas, and the sentencing gaps that could be the job of parliamentarians when we do go through some legislation.

3:45 p.m.

Chief Executive Officer, Little Warriors

Wanda Polzin Holman

Well, I know that there have been ongoing issues that have been brought up with regard to the online harms act, and I understand that the reason we're coming together is to explore some pieces related to that.

We know that similar measures are needed to limit child sexual exploitation. We need regulations that support understanding among schools, parents and communities more broadly.

We know that people, from our perspective at Little Warriors, have not been held accountable in the way that they need to be. The deterrence is very minimal at this point. We have children who come to us for child sexual abuse treatment as a result of being harmed online and sexually abused and exploited. Many times, the offenders and perpetrators are released without serving any time or having any consequences that relate specifically to the crime. What I mean by that is that the children and teens come to us and require intensive supports and treatment for what has happened to them. Very often, the perpetrators are released into the public, sometimes with notifications, and they are reoffending.

We've had several situations at Little Warriors where this has happened, and there is just not enough deterrence in place for them to stop what they're doing. It's very difficult to continue to follow offenders and perpetrators online as a result of the ongoing changes that are happening, and the ways that they're doing it through gaming platforms and social media platforms for children as young as seven or eight years old.

3:45 p.m.

Conservative

Kevin Waugh Conservative Saskatoon South, SK

I come from Saskatchewan. There isn't a week that goes by when we don't hear—whether it's through city police in Saskatoon, Regina, Prince Albert, Moose Jaw or other locations, or through the RCMP—that they have convicted somebody. We never hear what happens after the conviction.

It was interesting to hear when you were talking about it, because those who have been exploited.... What recommendations would you give to this committee, then, about Internet safety? In my city, we have a whole department at the Saskatoon Police Service actually designated to look for perpetrators who are online.

3:45 p.m.

Chief Executive Officer, Little Warriors

Wanda Polzin Holman

I think it's a really important question. I think that, in terms of recommendations of safety, the public needs to understand exactly what the numbers are, what is happening and the number of online offenders who are present who cross international borders and have very little deterrence from reaching out to children.

In Edmonton, we had a very unfortunate case that I'm sure everyone is aware of that crossed into the United States, and unfortunately that particular child and her family have forever been changed. Fortunately, the offender was charged in the United States, but we've had other children whose offenders were in Canada and were released very early or were released and have fled the country, or had other situations where there was very little ability to follow up on their sentencing. That's very important, as well as supporting children and families to understand the complexities and changes that are happening with regard to safety measures online.

3:50 p.m.

Conservative

The Vice-Chair Conservative Rachael Thomas

Thank you, Dr. Polzin Holman.

The next person with the mic is MP Al Soud.

You have the floor for six minutes.

Fares Al Soud Liberal Mississauga Centre, ON

Thank you, Madam Chair.

Thank you to our witnesses for being with us today. It is greatly appreciated.

Social media platforms and influencer culture now play a defining role in the lives of children and adolescents. They shape everything from their self-image to their social interactions. I am part of a generation that has very directly seen and experienced the online and digital environment. It's not just on social media; it's in video games as well, specifically in lobbies. I'm also part of a generation that has notoriously found ways around age verification processes. I think that's much of what I'd like to discuss today.

Professor Clark, you have been an associate professor of entrepreneurship at the Ivey Business School since July 2025. Your current research focuses on the cognition and decision-making of entrepreneurs. You made reference to an article earlier called “The Child Labor in Social Media: Kidfluencers, Ethics of Care, and Exploitation”.

You are cited in Western News saying, “Consent isn't a one-time event; it must be continuous, informed and freely given. For kidfluencers, let's be real, it isn't”. I'm curious. Given that children cannot provide ongoing informed consent, what safeguards do you believe platforms or governments should require to ensure that minors' images, data and labour are not exploited in influencer environments?

3:50 p.m.

Professor, Ivey Business School, Western University, As an Individual

Daniel Clark

That is a great question. You're right. The simple fact is that there is no consent for very young children. At the very least, there is maybe assent in that you know that the child is not doing it truly against their wishes, but they can't consent to all the implications that come with it.

To be perfectly honest, this is one of my arguments: that there is an age, probably somewhere around 14 to 16, at which young people can take back control of their digital identities, but before that, I do not see the benefits of allowing children to post and feature in social media content to a wide audience. Two things enhance the risk here: the amount of time they spend making content and, more importantly for what you're pointing to, the amount of exposure they get. You get past a thousand people. You get up to the millions and tens of millions and hundreds of millions of exposures. You are magnifying the risk infinitely, and no child has the capacity to understand what they're consenting to when you're talking about those large numbers.

My own six-year-old struggles with the difference between five and five million, so I can't imagine too many other kids really understand how big the exposure is.

Fares Al Soud Liberal Mississauga Centre, ON

Thank you for that.

Growing up, I, too, was quite interested in the YouTube space. My father, at the time and to this day, was very reluctant about the idea of seeing me join or engage in YouTube in any way, shape or form. In hindsight, it made perfect sense. I'm not particularly talented. I'm not a great musician in any way, shape or form, but his reluctance ultimately stood to benefit me significantly.

In your view, who is currently benefiting from this gap, and who should be held accountable for protecting children from being commercialized online? What policy mechanisms do you think might help us do that?

3:50 p.m.

Professor, Ivey Business School, Western University, As an Individual

Daniel Clark

The number one beneficiary in this space is the platforms. There's no halfway about it. These are massive companies, making billions in revenue, specifically in the advertising from child content and the advertising to children. The fact that there is no control over this is a giant mistake. The fact that we've been asking them to police themselves is a massive mistake, and it is not in anybody's best interest to do so.

Beyond that, the primary beneficiary financially is the parents. If you are under the age of 12, you cannot have a YouTube account and you cannot have a TikTok account. Your point about age verification is taken. However, if you want to get paid from those things, you certainly can't get paid through PayPal or the other mechanisms at that age. The kids, then, are employees of their parents, and anything that's happening to them as a result of that is because their parents are putting them in that position to be their employees. The protections ultimately should fall on the parents or, as a proxy, on us as a society.

Fares Al Soud Liberal Mississauga Centre, ON

That same article states that “each kidfluencing venture is a privately owned enterprise and the 'employees' in question are minors”. Given these indicators, how do you believe platforms and regulators distinguish between legitimate participation in social media and ventures that are exploitative, and what concrete steps do you believe we can take to prevent children from being put in those situations?

I recall that a couple of weeks ago we had Meta here, and Meta made reference to the idea of the app stores taking on that burden of essentially ensuring age verification. Do you believe that might be a venue to be explored?

3:55 p.m.

Professor, Ivey Business School, Western University, As an Individual

Daniel Clark

The age verification component is valuable when it talks about downloading the app and who's watching social media. However, when it comes to putting content on social media, we effectively punt this to the parents. We say that if the parents are okay with this video—with this production—because they're the ones who have to own the account, that's effectively their responsibility.

Honestly, I don't think age verification is really going to help us here, beyond a point. Ultimately, we have a responsibility to say that there's harm being done, that this harm is being done irrespective of the de facto age, and that people are profiting from that. That is a broader, wider, more societal problem.

3:55 p.m.

Conservative

The Vice-Chair Conservative Rachael Thomas

Thank you very much.

We will now go to Mr. Champoux for six minutes.

Martin Champoux Bloc Drummond, QC

Thank you, Madam Chair.

I will start by not thanking Mr. Al Soud for making us feel a bit like dinosaurs when he said he belonged to the YouTube generation. Ours was the vinyl, 8-track cassette, camcorder and Super 8 camera generation, so a huge thank you for reminding us we belong to different generations. That said, I am happy there are different generations around the table to talk about matters affecting everyone.

Mr. Clark, I am going to continue the discussion you started with my colleague Mr. Al Soud about potential regulations on user age and age verification.

Australia has passed legislation banning social media for young people under the age of 16. In your opinion, is this an applicable solution? Could we use it as inspiration? Is it effective? Would those who truly intend to use their children as moneymakers find it all too easy to get around, as you said?

3:55 p.m.

Professor, Ivey Business School, Western University, As an Individual

Daniel Clark

I think that's a great point, and I think what's happening in Australia is a good step.

We can talk about how people might get around these regulations, but that's a deliberate act. You have to want to get around them. You have to be willing to falsify. You have to be willing to obfuscate. You have to be willing to take a proactive act to break the rules here.

I'd rather have imperfect rules in place than nothing at all. I think you could go straight to bans, or you could have punishing fines, or you could.... There are lots of ways to go about this that I think would, at the very least, reduce the harm.

While I think eliminating the harm is impossible—and I think your point there is very well taken, and the same thing with Mr. Al Soud—I'm happy to live in a world of harm reduction right now, because there's none right now.

Martin Champoux Bloc Drummond, QC

Will the platforms find ways to circumvent regulations, since they are usually quite good at doing this as soon as regulations are put in place? Will they succeed in circumventing the regulations or legislation that may be adopted? How could we enforce the law? We could be somewhat cynical and think that, no matter what we do, the platforms will always adapt, find ways to circumvent any measures and, to some extent, make and abide by their own rules.

3:55 p.m.

Professor, Ivey Business School, Western University, As an Individual

Daniel Clark

Yes, I think there is a real risk that the platforms will allow loopholes to exist. Until you tell them that they have to close that loophole, they will say, “Oh, do you know what? They didn't say anything about that. We're going to leave that one open.” Absolutely, that is always a possibility.

I don't believe the platforms here are good actors. They are not thinking about the best interests of the people who are creating content for their platforms. They are thinking, primarily, about the other side. They're thinking about users. They want to facilitate use, watching, viewing and advertising as much as humanly possible.

I wish I were an expert on age verification technologies—I'm not. We need better and independent age verification technologies because, as long as we allow the platforms to be in charge of this, it is in their best interest to be bad at it. They don't want to keep people off the platforms, either as content producers or as watchers. That's their audience, so you're absolutely right.

Martin Champoux Bloc Drummond, QC

Thank you, Mr. Clark.

Ms. Polzin Holman, I want to say I was interested in your observations on the effects and dangers on mental health.

I am going to ask you essentially the same question. Do you think that copying Australia's model could be a solution, by prohibiting access to social media and this kind of online content based on age as much as possible? Australia set the age at 16 years, but it will be up to us to decide whether we want to set it at 14, 16 or 18 years of age. In your opinion, what potential impacts could such regulations have?

4 p.m.

Chief Executive Officer, Little Warriors

Wanda Polzin Holman

Thank you very much for the question.

I agree with Mr. Clark. I think the idea, as it relates to what Australia is doing, is a good first step—having some rules in place versus having nothing. Looking at harm prevention is very important, specifically as it relates to the issues that are coming up related to mental health as well as child exploitation.

Platforms may try to circumvent any laws that are put in place, but I certainly think this would limit and address the issue of children giving consent they cannot meaningfully give. We are seeing this, as I mentioned previously, with seven- and eight-year-olds, who are simply clicking the “yes” button—“Yes, I'm over the age of 18.” There's no way to verify otherwise, and it's putting them at enormous risk of harm from online predators.