Evidence of meeting #19 for Canadian Heritage in the 45th Parliament, 1st session. (The original version is on Parliament's site, as are the minutes.)

A recording is available from Parliament.

Before the committee

Perry Mason, Restorative Justice Consultant, As an Individual
Tiana Sharifi, Chief Executive Officer, Center for Exploitation Education
Dimitri Pavlounis, Director, Research, Civix
Ève Tessier-Bouchard, Editor, Les As de l'info, Les Coops de l'information
Stacy Hanson, High School Counsellor, Saskatoon Public Schools
André Côté, Interim Executive Director, The Dais, Toronto Metropolitan University

The Chair Liberal Lisa Hepfner

I'm going to call this meeting to order.

Welcome to meeting number 19 of the Standing Committee on Canadian Heritage.

We don't have any in-person participants, so I'll skip that part.

Pursuant to the routine motion adopted by the committee, I can confirm that all witnesses have completed the required connection test in advance of this meeting.

Please wait until I recognize you by name before you speak. All comments should be addressed through the chair.

Pursuant to Standing Order 108(2) and the motion adopted by this committee on Wednesday, November 5, 2025, the committee is meeting to study the effects of influencers and social media content on children and adolescents.

We have six witnesses with us online today, starting with Perry Mason. Yes, in Hamilton we have a detective named Perry Mason—not a crack defence lawyer, but impressive all the same. It's good to see you again, sir.

Tiana Sharifi from the Center for Exploitation Education is here with us, as are Dimitri Pavlounis from Civix, Ève Tessier-Bouchard from Les Coops de l'information, Stacy Hanson from Saskatoon Public Schools and André Côté from The Dais at Toronto Metropolitan University.

Welcome to you all. We will give each of you five minutes for an opening statement.

We're going to start with Mr. Mason. For full disclosure, Mr. Mason is occupying my constituency office today.

Perry Mason Restorative Justice Consultant, As an Individual

I'm keeping your chair warm.

The Chair Liberal Lisa Hepfner

Very good. It's an excellent office you have there.

Sir, you have the floor for five minutes for an opening statement. You can start at any time.

3:45 p.m.

Perry Mason Restorative Justice Consultant, As an Individual

I'd like to talk about the crisis of misunderstanding. I'm here today because I see the crisis through a unique, trifocal lens.

I'm a grandfather raising a digital native. I'm a former school resource officer who watched the digital transition hit our schools. Perhaps more importantly, my neurodivergence grants me a specific strength, which is pattern recognition. I don't just see incidents; I see the meta patterns connecting the last 30 years of youth culture. I'm here to report something simple and perhaps uncomfortable: The map we adults are using may be wrong.

To explain the mistake I think we're making, I have to take you north. I had the distinct privilege of visiting the Cree nation in northern Quebec twice to facilitate restorative justice circles. I was even honoured to speak on CBC Cree radio. When I arrived, I had to check my assumptions. To an outsider, these communities look isolated, but when I sat in those circles and when I engaged with curiosity instead of judgment, I didn't find isolation. I found a vibrant, self-sustaining culture operating on a frequency that the south often fails to tune into. They weren't lost; they were sovereign.

This is a lesson I apply to our children. Our youth have migrated to a new territory. They have their own language, their own culture and their own economy. Just as we fail when we try to regulate cultures that we don't understand, I think we're failing to regulate the digital generation—I believe it's called gen Z—because we, I think, refuse or are unable to learn their language.

To prove how borderless this reality is, let me tell you what my grandson watches or has watched. Visualize a standard influencer vlog. There is upbeat music, slick editing and a young woman laughing while eating ice cream. Now here's the reveal: She is broadcasting from North Korea. To my generation, North Korea means a nuclear threat. To his generation, it's just another channel. He doesn't see propaganda. He sees content. He admires her not because she supports a regime but because she knows how to beat the algorithm and evade the restrictions. He respects the hacker ethic, for lack of a better term. That's the reality gap. We see a dictatorship; he sees a creator winning the game.

I guess a good term to call him and others is “digital sovereign”. We assume these kids are victims. A couple of nights ago, I sat down with my 22-year-old grandson for a two-hour, unfiltered and, to be frank, unexpected conversation, and he corrected me. He told me he hasn't failed to launch. He built an e-commerce business. He has travelled to Europe. He has lived in penthouses and sailed on yachts. He told me that he can go back to that luxury whenever he wants, but right now, the value proposition is here. He's a digital sovereign. He's physically present, but socially and economically, he's living on a plane that this committee may not even have mapped.

Here's the critical “but”. Digital sovereignty comes with a loss of consequences. In the real world, words have weight. In their world, words are noise. Slurs become punctuation. Morality becomes an algorithmic score. Because nothing seems to stick, they become uniquely vulnerable to predators.

I know this darkness personally through my relationships with Carol Todd, Amanda Todd's mother, and Leah Parsons, Rehtaeh Parsons' mother. The danger isn't that kids are wasting time. The danger is that they are being groomed in a world where the rules of real life don't apply. When a predator tells a child, “I'm the only one who gets you”, the child believes it because we have stopped trying to understand them.

My message is simple: You can't regulate a reality that you may not fully understand. If you try to legislate safety from the outside, you'll fail, perhaps. The only way to protect them is to do what I did with my grandson: drop the judgment, enter with curiosity, translate and don't punish.

Finally, I want to be transparent about how these remarks were prepared. As a neurodivergent thinker with ADHD, my mind works in patterns and curiosity, but struggles with linear culture. To prepare for today, I used artificial intelligence, not to generate my ideas, but to organize them. I supplied the lived experience; the AI supplied the executive function.

I share this for a reason. I used the tools of the digital sovereign to communicate with this committee. If a grandfather can partner with a machine to be understood, surely we can find a way to connect with the generation that lives inside it.

Thank you.

The Chair Liberal Lisa Hepfner

Thank you, sir.

We'll turn to Tiana Sharifi, from the Center for Exploitation Education.

Go ahead. You have the floor for five minutes.

Tiana Sharifi Chief Executive Officer, Center for Exploitation Education

Thank you for the opportunity to speak today.

My name is Tiana Sharifi. I'm the founder and CEO of the Center for Exploitation Education, specializing in child and youth sexual exploitation prevention. I've been a subject matter expert for over a decade. I'm a mother and I come in with a unique angle: I am in many ways an influencer myself. I've built a large online following on social media, where my educational content has reached millions of families it wouldn't have otherwise, and it has saved the lives of young people who have directly reached out for help.

Not all social media platforms are created equal, because they carry different levels of risk. Some are proactive about child safety and want their platforms to be used for good, while others have no aim beyond monetizing their users and consumers.

For those of us working in the anti-sexual exploitation field, the largest threats to the well-being of children online are unmistakably clear. First, influencer culture is feeding the sexualization and exploitation of youth. Second, boys are being pulled into harmful gender beliefs that normalize violence and predation. Third, grooming, sextortion and child sexual abuse materials are rising as offenders operate freely online.

I want to begin with the first. Influencers today have an enormous impact. Many create helpful content for adults, and as a whole, they're not inherently harmful. However, the problem is that the influencers children follow are not entirely appropriate for them. Young people look to influencers to understand gender norms, relationships, sexuality and even what level of objectification is considered normal, and this is inherently harmful.

Influencer culture has also normalized the commodification of people. Young girls in particular are being turned into products economically, not metaphorically. They see bodies, intimacy and personal lives monetized, teaching them that people can be bought, sold and consumed. This normalization has consequences. I have seen students being groomed into the idea of participating in OnlyFans and sugar dating. Of course, the moment they turn 18, we're just going to suddenly label that participation as consent.

At the same time, boys are learning who they're supposed to be as men, and they're learning it from some of the worst voices online. I've noticed a sharp shift in young boys when I give presentations, and I now hear rhetoric such as “girls must be submissive” and “rejection is disrespect”. These are messages straight from the manosphere, which includes incel communities, male supremacy, and anti-feminist influencers who package misogyny as empowerment for our young boys.

These ideologies are reaching our boys at the exact moment they are forming their identities. If you are a boy and you create an account on pretty much any social media platform, the content you receive is engineered to be shocking, violent and misogynistic. Boys are being shaped by these predators, misogynists and extremists telling them that aggression equals confidence and empathy equals weakness. This content is not coming from obscure corners of the Internet; it's coming from the mainstream platforms themselves.

The algorithms that are pushing sexualized content to girls and misogynistic content to boys are the algorithms that are making grooming faster, easier and more scalable for predatory people. Grooming, image-based abuse, and sextortion are skyrocketing, as we know, because offenders no longer need to seek opportunity. The platforms are creating it for them.

Many platforms use misleading language that creates a false sense of security for kids. Snapchat's “My Eyes Only” folder and Instagram's vanish mode suggest privacy, secrecy and control, when in reality, these features make exploitation easier and detection harder. Some apps are even predatory by design, encouraging kids to swipe to meet strangers or interact with AI bots that initiate sexual conversations.

We cannot keep asking children and parents to navigate a system that was never designed for their safety. The government must work with the platforms, and if a platform refuses to engage, it should lose access to the Canadian market, period.

My key recommendations include imposing monetary consequences on platforms that fail to comply with Canadian regulations, establishing straightforward and quick reporting channels with time restrictions on response times, placing liability on the platforms, and mandating minimum age requirements and enforced age verification, to say the least. I have a number of others, but I know that I'm short on time.

The Chair Liberal Lisa Hepfner

You still had 20 seconds, but hopefully we'll get to more of those recommendations as we continue this study today.

We'll turn now to Dimitri Pavlounis from Civix.

You have five minutes. Go ahead.

Dimitri Pavlounis Director, Research, Civix

Thank you, Madam Chair and members of the committee.

My name is Dimitri Pavlounis. I'm the research director at Civix. We are a national civic education charity dedicated to building the habits and skills of citizenship.

We work primarily with K-to-12 teachers from every province and territory, providing free programs in both English and French. We are best known for our flagship student vote program, but we provide many other programs around civic engagement, civic discourse and digital media literacy.

Over the last while, you've heard about many different social challenges, each intersecting with social media in distinct ways. Some of these challenges might be best addressed through regulation; others might require support or education frameworks. Many likely need both.

While I will focus today on the need for media literacy education, I want to be clear that we don't see this as the only or even the best solution to all of these challenges. Rather, we see it as an essential part of a broader national strategy to help young Canadians navigate their online lives.

Media literacy is of course a large umbrella term, encapsulating many different concepts and competencies, but at Civix we approach digital media literacy as a necessary component of informed citizenship. Our digital literacy program teaches empirically supported skills that have been shown to significantly increase students' ability to navigate information and make more informed judgments about the content they see online. This includes discerning between true and false, but it also involves navigating all of the agenda-driven, polarizing or manipulative material that blurs the line between true and false and that reflects much of what we and young people actually encounter online.

Since the program launched, over 8,000 teachers have registered and over 4,000 have attended one of our training workshops. Today, I want to share four things we have learned that help explain the current state of digital literacy education and what we believe is needed to better support youth in the future.

First, the good news is that this is not primarily a curricular problem. While curricula could certainly be strengthened and streamlined, media literacy is already in the curriculum in every province and territory in Canada. The problem, at least when it comes to teaching information evaluation, is that just because something is in the curriculum, that does not guarantee it is prioritized or being taught effectively. As a case in point, in our study of over 2,300 students from across Canada, we found that many students are being taught methods that either don't work or backfire in practice. These findings align with research from elsewhere, including research in the U.S. context from Stanford.

I want to be clear: This is not the fault of teachers. Much of this content is brand new to them, and educators often lack support to keep up with best practices, especially within a rapidly shifting media landscape. Most importantly, many ineffective methods still routinely appear in resources directed at educators, which just muddy the waters and cause confusion.

Second, even if our focus is on addressing the disinformation problem, we cannot focus on discernment alone. Media literacy education in all forms must address the social-emotional factors that make people susceptible to misleading and harmful messaging. Information on social media doesn't exist in a vacuum, and people don't typically incorporate information into their mental models simply because they are exposed to it, but rather because it resonates with them authentically or fulfills some emotional or cognitive need.

As such, it's not enough to just teach young people how to discern credibility. They must also have authentic opportunities to reflect on how the content they see online produces meaning and how, true or not and harmful or not, it may appeal to our cognitive biases and contribute to our sense of identity or community. These essential conversations could and should happen in many spaces, including in the home and in schools, where they can occur with trusted adults in safe and structured ways that are already supported by existing curricula.

Third, no amount of media literacy education can make up for a rapidly eroding media ecosystem. National efforts grounded in evidence-based practices are essential, but these efforts are futile without a healthy information environment to support them. Social media regulation alone won't improve conditions without significant investments in accessible, high-quality information that is meaningful and relevant to the lives of Canadian youth.

Finally, I want to advocate for including the voices of young people in your consultations and decision-making. Currently, I'm supporting a student-led project in New Brunswick on AI and education. These high-schoolers, like youth across the country, care deeply about the impacts of technology on their lives. Most importantly, their lived experiences provide insight that cannot be communicated through statistics, survey results or adult voices. If we are concerned that young people are turning to social media in part because they are disillusioned with our democratic institutions or with traditional forms of expertise, inviting them into these conversations about one of the defining civic issues of their lives can go a long way toward building trust and can lead to better policy outcomes.

Thank you for your time. I look forward to your questions.

4 p.m.

The Chair Liberal Lisa Hepfner

Thank you, sir.

Next we have Ève Tessier‑Bouchard, editor at Les As de l'info, Les Coops de l'information.

Ms. Tessier‑Bouchard, you have five minutes.

Ève Tessier-Bouchard Editor, Les As de l'info, Les Coops de l'information

Madam Chair, honourable members of this important committee, thank you very much for inviting me to speak with you today.

I currently run Canada's only French-language daily newspaper for children, called Les As de l'info. This media outlet is aimed specifically at children aged 8 to 12 and explains the news to them every day on a website. To date, there are exactly 26,291 children registered on our website, which is free of charge.

We are not a social media network like the ones we are talking about today, but rather a secure community where children are protected by a username and an avatar, and where they can engage, learn, interact with each other, comment and give their opinions in all kinds of ways.

Les As de l'info is a small social media network for information that is moderated seven days a week. We ask for parental consent upon registration and foster an atmosphere of respect on the site, while encouraging the children, most of whom are still in elementary school, to express their critical thinking skills.

Our site is free, so it takes a lot of effort to finance it. However, we are determined to keep it that way because it opens the door to all curious children from all social backgrounds, which is essential. What we are trying to do is offer concepts of digital citizenship and media and information literacy, and above all, inspire children to get involved as young citizens.

By the way, they are not just the citizens of tomorrow. They may be young, but they are already members of our society and need to be included in the conversation. In fact, we conducted a Léger poll of children aged 8 to 12 in 2024: 60% of them clearly told us that they felt the government did not think enough about children before making decisions; 66% of them said they were capable of advising the government. So, take note.

We also want to instill a sense of competence in our readers, to enable them to be little pollinators of good information in their communities. Last week, you met Marie‑Ève Carignan from the UNESCO-PREV chair, and she told you about a joint study we conducted which shows that children can indeed become leaders in the fight against misinformation if they are well supported, if they feel confident, and if they understand the issues at stake.

We know that 25% of 8-year-olds say they have an account on at least one social media platform. For this age group, it is often YouTube and TikTok. In other words, it is part of their lives. Our collective concern should be this: How can we support them in discovering and using these tools, which are of their time, and how can we prevent them from becoming digitally illiterate while protecting them from negative content, conspiracy theories, or content that could lead to radicalization?

Les As de l'info believes that the best approach to dealing with the influence of social media on young people is to focus on the importance of information, education, support for children and teenagers, as well as the availability of verified content online. As we know, the media is blocked by GAFAM in our country, and this makes our task much more difficult. The discoverability of reliable content is really being hampered.

Investing in digital education, as Les Coops de l'information, which support Les As de l'info, are doing, is not within the reach of all media outlets here, most of which are struggling financially. Is the solution to ban social media outright for young people, as some countries are doing? Do we need greater vigilance, a more restrictive security barrier at the entrance to social media networks? Should we perhaps establish age categories for accessing content? Do we need to make a significant investment in educating young people, in digital education for young people from childhood onwards?

For Les As de l'info and for me, as someone who has been working in youth content for 35 years and is a mother and grandmother, the education and skills of children and teens are key. It's bigger than just prohibition. You need to gain experience in navigation to be a good captain and to avoid pitfalls, both on the water and on the web.

Thank you.

The Chair Liberal Lisa Hepfner

Thank you.

Next we have Stacy Hanson, a high school counsellor from Saskatoon Public Schools. I can probably guess who invited you here today.

You have the floor, ma'am, for five minutes.

Stacy Hanson High School Counsellor, Saskatoon Public Schools

Thank you for the opportunity to speak today.

My name is Stacy Hanson. I'm a high school counsellor. The concerns I'm sharing today are from counsellors and restorative action program facilitators across Saskatoon public secondary schools. They reflect patterns we see regularly, not isolated incidents.

Our first concern is around targeted online harassment. In one case, a group of students manipulated a video in which they provoked a conflict, edited out their role and posted only the victim's retaliation to frame him as a racist. The online hate became so severe that he feared for his safety, carried a weapon to school and ultimately transferred schools and changed his name.

Targeted harassment also includes coordinated social exclusion. Group chats and Snapchat stories are used to incite entire peer groups against a single person or student. Students create warning pages about socially vulnerable peers, spreading accusations without evidence. The speed, reach and public nature of these attacks magnify the harm far beyond what was possible before social media. In one case, the victim of this harassment took his own life.

We also see teens impersonating other students and even teachers by creating fake accounts. These accounts post troubling images or messages that damage reputations, causing embarrassment, shame and significant mental health impacts.

Online harassment now extends far beyond traditional bullying. A common solution we recommend is for students to block the person targeting them, but teens have learned to circumvent this. They create new accounts or new group chats, or have friends add the victim back into conversations. Blocking no longer stops the behaviour. In fact, it often escalates it. As a result, young people have effectively no way to escape their harassers. It follows them 24-7, even in their own homes.

Our second concern is sexual exploitation and grooming. One 15-year-old gained more than half a million followers on a Chinese version of TikTok, and then was targeted by adults posing as executives from an adult content site. They coached her to send sexualized videos and profit-shared with her. She earned thousands per month without her family knowing. Police in Canada and in China intervened to prevent further harm.

In another case, a grade 11 student was groomed by a 35-year-old man she met on the online game Roblox. He flew from the United States to Saskatchewan to meet her while her parents were at work. When the school intervened, they learned that she fully believed that this was a loving relationship, demonstrating how easily teens form bonds with people they've never met in person.

Our third concern is around sexting, sextortion and coercion. Teens share intimate images believing they will stay private, and they often don't. Images are used for manipulation or revenge after breakups. Where youth once threatened self-harm to prevent relationships from ending, many now threaten to release private photos. Girls especially experience deep shame. Students also hide explicit content in encrypted or disguised apps, making detection difficult. Families receive inconsistent guidance about when police will intervene.

Our fourth area of concern is exposure to harmful online communities. Students access content promoting self-harm, suicide, disordered eating and other high-risk behaviours. Algorithms push this content to vulnerable teens, who become drawn in quietly and privately. We saw a rise in this during COVID, and it remains a serious concern.

We also see dangerous social media trends. Recently, RAP facilitators in multiple schools reported a TikTok trend where youth cut their faces under their eyes and along their cheeks and cover the wounds with band-aids. Unlike traditional self-harm, this behaviour is meant to be visible and public.

Students are also forming emotional relationships with AI chatbots—romantic, friendship and even counselling relationships—leading to isolation, harmful advice and exposure to unfiltered content. Just this fall, a small committee and I created an infographic to highlight the many risks of these chatbots for students.

Influencer messaging is also affecting our youth. Figures like Andrew Tate have normalized misogynistic attitudes among some male students, influencing their behaviour toward peers and female teachers.

Across all these situations, students are chronically online. They stay up late on social media or gaming and maintain large networks of people they've never met. Nearly every conflict we address now has a digital component—fake accounts, impersonation, group chats, viral posts or altered images.

Our last concern is platform accountability. Harmful accounts using school names or staff and student photos are difficult to remove. Fake pages remain active for long periods, and AI-altered images circulate widely with little recourse.

Based on what we're seeing, we offer several recommendations: one, platform accountability and rapid response; two, age verification measures; three, regulation of algorithmic exposure; four, stronger protection against image-based abuse; and five, prevention of online grooming. We would especially like platforms to detect and disrupt adult-minor contact, with mandated reporting to law enforcement when grooming behaviours are detected.

These issues are widespread and deeply harmful. Without stronger protections and platform accountability, more young people will be exploited, isolated and traumatized.

Thank you for the opportunity to share what we're seeing on the front lines. I welcome your questions.

The Chair Liberal Lisa Hepfner

That's a very disturbing dispatch from the front lines. Thank you very much for that.

Finally, we have André Côté from The Dais at Toronto Metropolitan University.

You have the floor for five minutes.

André Côté Interim Executive Director, The Dais, Toronto Metropolitan University

Thank you very much, Chair.

Thanks, committee.

It's an honour to be here speaking with the other witnesses, who bring alarming but amazing perspectives on these issues.

My name is André Côté. I'm the executive director of The Dais. We are a think tank at Toronto Metropolitan University. We are focused on public policy and leadership development at the intersection of technology, education and democracy.

We have been doing work on social media in Canada since back around 2018, with a major focus on kids and tech issues. This includes our survey of online harms in Canada, which is the longest-running survey of its kind and looks at social media use, the harms Canadians are experiencing and attitudes about what government should do about it.

We have a national screen break project that's focused on mobilizing for effective phone-free school policies across the country, and we have other research in areas like digital literacy, AI and children's privacy, social media labelling and AI deepfakes, misinformation and disinformation. I'll pull from some of this in my remarks.

Our survey of online harms reinforces that Canadian youth are the most significant users of social media, which is probably not really news. They are also significantly more likely to be exposed to various categories of harms, many of which we've heard about in more detail through some of the other speakers.

As a caveat to begin, our survey covers Canadians aged 16 plus, so we track young Canadians, capturing older adolescents to age 29. It's still highly relevant, I think.

Young Canadians spend far more time on online platforms than older generations, and they're more likely to use platforms like Instagram, YouTube, TikTok, Snapchat and X, whereas older Canadians are more so on Facebook and YouTube. Young Canadians are also about 50% more likely to report exposure to the worst types of online harms—or some of them. They're targeted by online harassment and hate speech, they see violent content, and intimate images are shared without their consent. We heard many more examples from Stacy and others.

They are subject to the manipulative design of these platforms, which we've heard about from other speakers. Our research finds that as AI-generated content is flooding our online spaces, deepfake labelling by platforms isn't effective, meaning that users can't tell the difference between what's real and what's fake. As a result, as we've heard, a fifth of Canadian youth aged 12 to 17 report negative mental health effects related to their online activity.

Second, I'd point out that beyond the impacts on mental health and physical health, social media is a threat to young Canadians' civic health and to Canada's democracy. Others have talked about the impacts on the information ecosystem. They are the only generation that is more likely to go to YouTube or Instagram for news than to traditional media like television or news websites, despite Meta's platforms, like Instagram, blocking news content. We also find that greater social media use is linked to greater belief in misinformation and disinformation. In short, younger Canadians are getting their media on platforms where the conversation is dominated and pushed by algorithms and influencers.

Third, Canadians are demanding that government take action to improve online safety for children and youth, but also for everybody. Our research finds that seven in 10 Canadians support government intervention to require online platforms to behave responsibly in protecting users, even where there are some trade-offs, and there's near universal support for specific types of measures.

We tested a whole bunch of them but focused on protecting children online. For instance, we found about 90% support for requiring platforms to quickly remove CSAM and report it to police; north of 80% support for requiring platforms to develop measures for child users, like parental controls; and a bunch more specific things that Canadians really get behind.

From our phone-free schools work across Canada, a clear and consistent message is that it's not just the phones but the social media apps on the phones that are the challenge, driving distraction, bullying and other bad outcomes in schools. Stacy spoke to this in great detail.

I think a key point is that students, parents, teachers and school board leaders of all kinds are calling for policy-makers to act. This is out of their control; they need governments to step in. Coalitions and campaigns for safer online spaces are coming together across party lines to demand action on this.

In sum, my message would be this: Let's get on with it. The government needs to table a new online safety bill, and Parliament needs to move quickly to pass it.

We've now spent the better part of five years working through this with two bills that didn't quite make it across the finish line. The core of Bill C-63, from the previous Parliament, worked. Civil society, youth, experts and even opposition parties supported core parts around the duty of care and the transparency in parts 1 and 4 of the bill.

Holding these big tech platforms to account requires an independent regulator, in our view, such as a digital safety commission or something of that sort. The bill should include beefed-up youth protections, including age-appropriate design standards and rights to opt out from algorithmic feeds. I'm sure we could have a longer list.

Finally, the bill should include new provisions for AI, including bringing AI chatbots in scope as a regulated service and addressing deepfakes on social platforms.

The very last thing I'd say is on phone-free schools. One of the most inspiring parts of our work over the last year or so has been the engagement with youth and the extent to which they want to lead on these issues. I really liked hearing, from Dimitri, Ève and most of the other witnesses, this idea of youth really needing a voice in this. I think, to Perry's point, a lot of us don't fully understand the worlds they're living in, and we need to be respectful of their experiences and include them in this process.

Thank you very much.

The Chair Liberal Lisa Hepfner

Thank you, sir.

We'll now move to questions from members, starting with Ms. Thomas for six minutes.

4:20 p.m.

Conservative

Rachael Thomas Conservative Lethbridge, AB

Thank you so much, everyone, for being here with us today.

My first questions are going to Tiana.

I'm just curious. Toward the end of your opening remarks, you commented on recommendations that you would leave with the committee, and I know you were cut short. I invite you to repeat the ones you listed during your opening remarks and to go a little deeper into your explanation of those recommendations and why they would be so important to us. Of course, you may add any additional recommendations that you didn't have the chance to get to at that time.

4:20 p.m.

Chief Executive Officer, Center for Exploitation Education

Tiana Sharifi

I appreciate that question, especially because I definitely felt the need to get to a few that were left unmentioned.

The first is that there have to be monetary consequences for the platforms that fail to comply with Canada's safety regulations and fail to work with the government. I say that because, although they're what we call “platforms”, they're businesses and corporations. They make hundreds of millions and billions of dollars off their consumers. In any other space, we make sure we have regulations in place to keep corporations accountable to their consumers. Unfortunately, what we've seen is that it's the monetary piece that keeps these platforms accountable. In B.C., we have legislation with regard to non-consensual intimate image sharing whereby we hold companies accountable monetarily.

The second one is that there should be straightforward reporting channels, with time restrictions on response times. Reporting anything related to sexual exploitation, child sexual exploitation or sexual content should not be dealt with by an automatic or automated AI bot that decides whether or not it meets a threshold. It should be met with actual agents and individuals. There needs to be a very quick response time, especially with regard to child sexual exploitation reports, and again, there need to be consequences in place if the platforms are not meeting those needs.

Another is mandated minimum age requirements and enforced age verifications. Yes, social media is a powerful tool, but I don't believe it's appropriate at all under a certain age. I don't believe that children under the age of 16 should have access to social media platforms, because they're not made for children. The ones they are connecting with are not made for children. I know that's a contentious point for many people.

I strongly encourage algorithm protection so that minors are not exposed to harmful or inappropriate influencers and so that the content that will be put forth to them the second they sign up is not predetermined and predestined just because their identity is male or female or because of their age.

I would like to see a ban on targeted advertising to minors, because it's just a different ball game. It's not the same as watching TV or a movie and having an ad come in. I'd like to see oversight for child influencer accounts to prevent exploitation.

Also, although many of these platforms are trying to put this in place, we definitely need a default prevention of adult contact with minors. That's where the age verification piece, I believe, is incredibly important. This includes restricting direct messages and providing heightened review of sexualized content for accounts with large followings of minors.

4:20 p.m.

Conservative

Rachael Thomas Conservative Lethbridge, AB

Thank you very much for that. I appreciate it.

I just want you to go a bit more into detail on that. You said “algorithm protection”. What exactly does that mean?

4:20 p.m.

Chief Executive Officer, Center for Exploitation Education

Tiana Sharifi

As it stands right now, depending on how you sign up on a platform, a lot of them ask for your gender—or assume your gender—with your age. What I've seen is that if you are a boy or a man and you sign up with a social media account, it doesn't matter what your interests are. You are fed a very specific type of content. What you're going to consume is predetermined and predestined.

That content is oftentimes shocking, violent and provocative. Then it leads, especially for young boys, who aren't able to maintain much control over their consumption of content, to content that makes them vulnerable. When I say “algorithm protection”, I'm talking about these platforms not being able to predetermine an algorithm based on somebody's identity when they're signing up, particularly if they are a minor.

4:25 p.m.

Conservative

Rachael Thomas Conservative Lethbridge, AB

Thank you very much. I understand and definitely see your point there.

I want to dive a bit deeper into one more point you made, and that was with regard to increasing the minimum age a person needs to be in order to have a social media account.

Recently, we've seen Australia implement legislation that requires a person to be 16 years of age or older to hold an account. I'd be curious about your thoughts on that. You also talked about meaningful age verification, so I suppose that would be the way you'd suggest we make sure that's enforceable. Maybe you can just dive a bit deeper into that.

4:25 p.m.

Chief Executive Officer, Center for Exploitation Education

Tiana Sharifi

Absolutely.

We're seeing Denmark and Australia do that. I just don't want to see Canada left behind.

I think it's necessary because of the way these social media platforms work and the dynamics. These are companies that intentionally use their systems, and even the language they use to define the tools the platform provides.... It's harmful and it's going to result in harm.

It holds the platforms accountable when we have a minimum age. Again, not all platforms are created the same, so verification can be flexible depending on what kind of platform we're talking about. I think it would be effective.

As to age verification, people are concerned about security, but these companies already have all the information they need. A kid showing their passport photo so they can be protected from exploitation, violence and self-harm is well worth it when that information is already out there for these companies.

4:25 p.m.

Conservative

Rachael Thomas Conservative Lethbridge, AB

Thank you.

The Chair Liberal Lisa Hepfner

Mr. Al Soud, you have six minutes.

Fares Al Soud Liberal Mississauga Centre, ON

Thank you, Madam Chair, and thank you all for being with us today.

Ms. Tessier-Bouchard, I almost asked you a question last week before noticing that you were not at our meeting.

I am all the more pleased to be able to ask you questions today.

As you know, the goal of Les As de l'info is to give children the keys to understanding, so that they don't catastrophize scenarios that fuel anxiety.

I took the time to read a few articles, which are really very interesting. I have here a few titles: “Tu ne rêves pas! Les ratons sont de plus en plus mignons!” (“You're not dreaming! Raccoons are getting cuter and cuter!”), “POUR ou CONTRE les soins pour la peau à partir de 3 ans” (“FOR or AGAINST skin care starting at age 3”), and “La photo de la semaine: à Gaza, vive les mariés!” (“The photo of the week: in Gaza, long live the newlyweds!”).

How do you determine what is appropriate for a young audience? In your opinion, what can social media learn from your approach?

4:25 p.m.

Editor, Les As de l'info, Les Coops de l'information

Ève Tessier-Bouchard

We are working from the premise that a child who understands things is already much less anxious. What causes anxiety are all the scenarios a person can imagine based on a headline, a snippet of information they hear on the radio or in conversations between adults. When we understand the basis of a piece of information, we experience less anxiety. This has been tested and proven time and time again.

So how do we decide what to talk about? If there is something in the news that is likely to be heard repeatedly on the radio, as I was saying, or even seen on social media—we know that children use social media—then we will definitely talk about it, regardless of how serious the subject is.

We have excellent support. We have psychologists, grief experts and philosophers for children. We are careful about the words we use and the images we choose to show and not to show. So we sort through the material. However, we do not sort based on the subject matter, but rather on the angle, i.e., how we are going to approach the subject. We ask ourselves what can be filtered out so that the essentials remain and the children understand what we are talking about without having all the details, especially those that might scar or shock them.

There is no topic that we don't talk about. We sometimes make choices. If the news is very negative, we make choices. For example, if we believe that we have posted enough negative stories for the week, we will post, as you mentioned, a little raccoon here and there, a small animal, or a nice story. We try to strike a balance.

We listen to children. Sometimes, very sensitive children comment on our posts. They say that it causes them a lot of pain. It's important to know that we have moderators on the site seven days a week. We sometimes communicate with the child, and if they tell us something that we consider to be dangerous for them, we take action.

Last week, we talked about bullying. A child wrote to us in a comment saying that his bullies lived next door to him and that neither teachers nor adults were helping him. So we did the following: since his email address was linked to a school, we contacted the school, told them that a child might need help, and gave them his email address. We also wrote to the child to let him know that we weren't abandoning him. We told him that there were resources available and we shared those with him. We provide a certain amount of support.

Children learn on the site how to behave as good digital citizens and how to comment respectfully. If a child makes comments that are not respectful, we don't ban them right away. Instead, we contact them to ask if they are proud of their comment, as it does not really comply with our rules. In 99% of cases, the child will say that they didn't even know their comment was being read. That's media literacy, that's digital life education.

We let them know that their comment has indeed been read and that there are real people behind the screen reading their posts. We tell them that if they aren't nice, people might get hurt. We ask them if they want to rewrite the comment and tell them that, if so, we will delete the old one.

It's a lot of work. Obviously, it's an investment to have two people reading all the comments. There are 110,000 page views per week and there are a lot of comments. We do this because it's absolutely essential to educate children about digital life.

I draw a parallel to not giving children access to social media before the age of 16. Let's take the example of a child who has not had access to sweets before the age of 16. If they have not been informed about the harmful effects of sugar and the health risks of eating too much sugar, they will be no better equipped at 16 than they were at 8.

For our part, we focus heavily on education and support. I think the same applies to social media. My colleagues talked about this today. Age restrictions could be put in place at the entrance to websites, but young people are crafty. They can enter a false age or use an older brother's or sister's account.

We must therefore provide them with better support, but it is also essential that we educate them at the same time. We have lost an entire generation. We must catch up and invest heavily in digital citizenship education.