Evidence of meeting #18 for Canadian Heritage in the 45th Parliament, 1st session. (The original version is on Parliament’s site, as are the minutes.)

A video is available from Parliament.

Before the committee

Morin  Full Professor, UNESCO Chair in the Prevention of Violent Radicalization and Extremism, Université de Sherbrooke, as an individual
Bridgman  Director, Media Ecosystem Observatory
Cooper  Vice-President, Data and Partnerships, Mental Health Research Canada
Paul  Director, Tech Transparency Project
Carignan  Full Professor, UNESCO Chair in the Prevention of Violent Radicalization and Extremism, Université de Sherbrooke, as an individual

The Chair Liberal Lisa Hepfner

I call this meeting to order.

Welcome to meeting number 18 of the Standing Committee on Canadian Heritage.

Before we begin, I ask our two in-person participants to look for the green card in front of you. There are guidelines and measures in place to help prevent audio feedback incidents and protect the health and safety of all participants, including the interpreters. There is a QR code on that card, as well, if you need further instruction.

Pursuant to the routine motion adopted by the committee, I can confirm that all witnesses have completed the required connection tests in advance of this meeting. We do have some witnesses online today.

Welcome. Please wait until I recognize you by name before you speak. All comments should be addressed through the chair.

Pursuant to Standing Order 108(2) and the motion adopted by this committee on Wednesday, November 5, 2025, the committee is meeting to study the effects of influencers and social media content on children and adolescents.

With us today is David Morin, full professor and UNESCO Chair in the Prevention of Violent Radicalization and Extremism, Université de Sherbrooke. From the Media Ecosystem Observatory, we have Aengus Bridgman, director. Online, we have Michael Cooper, vice-president of data and partnerships from Mental Health Research Canada. We also have Katie Paul, director of the Tech Transparency Project.

Welcome.

I will note that we have another witness joining us at 5:30, Marie-Eve Carignan, also from the UNESCO Chair at the Université de Sherbrooke. We will give her five minutes to speak when she arrives.

Each delegation or witness has five minutes to give some opening remarks.

We'll start with you, Mr. Morin. You have five minutes, starting now. You have the floor.

David Morin Full Professor, UNESCO Chair in the Prevention of Violent Radicalization and Extremism, Université de Sherbrooke, as an individual

Thank you very much, Madam Chair.

Thank you for inviting me and giving me the opportunity to speak to you today about a rather specific aspect of social media, namely the link between exposure to hateful content and violent extremism, one of the dark sides of social media.

My daughter would be very upset with me for not starting by noting that social media has many virtues. Overall, it is often very helpful and great for young people. However, today I’m going to talk to you specifically about one aspect, namely the link between social media and violent extremism.

I will start with three very recent examples in Canada.

The first is the arrest of a teenager in Nova Scotia who was charged with child pornography, among other things. He was part of what’s now called “nihilist extremism”, which glorifies violence and cruelty by using references or codes related, among other things, to Nazism and jihadism. That teenager also belonged to an online movement called group 764, which recruits young people to commit violent acts, including mutilation and suicide.

I’m mentioning this example because, obviously, the 764 movement recruits a lot of people on digital social networks, and these individuals are getting younger and younger.

The second example is the arrest of a young jihadist this summer in Montreal. Radicalized online in the context of the Israeli‑Palestinian conflict, he pledged allegiance to the Islamic State and was preparing to commit a violent act. It reminds us that the virtual caliphate of the Islamic State, and its online communities, play an important role in that terrorist organization.

The third example is that of Patrick Gordon MacDonald, alias the “Dark Foreigner.” He was sentenced to prison on charges of terrorism and hate propaganda. He was promoting a violent far‑right ideology for the neo‑Nazi accelerationist group Atomwaffen Division. Here too, the Atomwaffen Division was extremely active online, and it has since been added to the Canadian list of terrorist entities. This reminds us that, long before many other groups, the far right in the United States understood the enormous potential of social media to spread its extremist messages.

I’ll talk to you very quickly about the Internet today, digital social networks and violent extremism. What are the current trends?

I would like to emphasize three points.

First, it should be remembered that extremist movements on social media today know how to exploit periods of polarization and attempt to recruit people by targeting younger and younger individuals. There is therefore a trend toward people being radicalized online at ever younger ages and over an increasingly short period of time.

Next, it’s important to know that mainstream platforms, where we find radical but nonviolent content, are being used as a gateway to then direct young people toward much more violent content on different platforms. That’s an important point.

Finally, and I want to stress this point, today, video games with online connectivity features are being increasingly used to ultimately try to recruit young people into all sorts of violent extremism. This last element obviously relates to the issue of generative artificial intelligence, which will multiply the possibilities for these extremist groups to radicalize young people.

I wanted to talk to you today about the results of systematic reviews on the potential effects on young people of online exposure to hate. What does the evidence say? It says that exposure to extremist content online today does indeed seem to be linked to the adoption of radical attitudes, regardless of the type of media in question. Exposure to extremist content online also seems to be linked to the adoption of extremist behaviour, not only in the virtual world but also in real life. It’s important to note that. Finally, I would like to add that exposure to hateful content on the Internet is not the only factor. We must also consider the other factors in an individual’s life that may lead to radicalization, such as personal crises, mental health issues, belonging to a radical group, etc.

Indeed, the evidence reminds us today that exposure to hate speech has repercussions on social attitudes. It increases negative attitudes toward targeted groups and decreases general positive attitudes; it also has potential effects on mental health, as well as societal consequences for trust between social groups, aggressive behaviour and the normalization of violence.

I will note certain elements. According to Statistics Canada, in 2022, 71% of young Canadians aged 15 to 24 reported having seen hateful content online in the previous 12 months compared to 49% of the general population. According to the police, more than a third of the victims of hate cybercrimes were under the age of 25. The Royal Canadian Mounted Police, the RCMP, also noted that, between April 2023 and March 2024, 25 people were charged with terrorism, and seven of those accused were minors. In that context, obviously, the status quo is not acceptable.

I repeat, it’s not necessarily about having an approach that’s solely punitive and overly restrictive. There are examples elsewhere; we can see what’s happening right now in Australia, the United Kingdom and Europe. We need to take matters into our own hands and do it in a targeted manner. What we stress a lot is first placing the primary responsibility on platforms to regulate harmful online content, and then having other actors in society work on prevention and awareness.

In conclusion, Madam Chair, I would like to note the importance of accountability for politicians, women and men alike. It is their duty to make responsible statements that do not fuel the growing polarization in our society. This obviously does not prevent them from addressing sensitive and controversial issues and engaging in politics, since politics is all about debate.

Thank you for today’s initiative, which is undoubtedly another step on this long and winding road.

Thank you.

The Chair Liberal Lisa Hepfner

Thank you.

Mr. Bridgman, you are next. You have the floor for five minutes.

Aengus Bridgman Director, Media Ecosystem Observatory

Thank you, Madam Chair.

Thank you for the invitation to speak here today. I want to open by saying that my expertise is as a scholar of the information ecosystem and the overall information environment. I'm not an expert on children or youth. Nevertheless, I find our studies of influencers and the information environment very pertinent for this study and very pertinent for this committee.

Recently, we ran a study looking at the rise of influencers in Canada. We know now that amongst the youngest cohort we were able to survey, over four-fifths of Canadians, 81% of youth, are now typically getting their news from influencers. They're getting their news, their political information and their entertainment content there. That is the base of their political and social life. This has enormous repercussions for our political reality and for the training of youth in the political process.

I want to highlight two major findings from that recent influencer study that I think are particularly pertinent. The first is the way in which influencers spread and come to appear on the screens of youth here in Canada. The primary way in which influencers reach new listeners, new adherents, is through the recommendation algorithm. It is not through explicit preference. It is not through social relationships. It is through the algorithm. In your day-to-day behaviour on social media, it is the platform itself that is determining what you see, not any intention on your part. This reduction in intentionality, and the way in which youth in particular consume and think about information, is enormously important. We haven't really appreciated its consequences.

If we think back to 20 to 30 years ago, the way you chose to get your information and where you got your information was very much about a choice that you would make. You would go out and you would make a decision for a paper, for a TV channel or for people to talk to. It is not so today. For the youth of today, your choice is the platform. In some ways, that is determined by your social status and by your friend group. Then, once on the platform, your choices are much less important than your behaviours and your actions that you don't even know you're necessarily engaging in. That loss of intention is enormously important for political and cultural socialization.

Number two is that influencers are now central to the political conversation. They make up the majority of engagement. The majority of Canadian eyeballs that see political content online are now seeing influencer content. We have a system, a set of norms and rules, around speech, around disclosure and around transparency that grew up in an era when influencers didn't exist and when it was unimaginable that a private citizen with a telephone in their bedroom would be able to reach millions of Canadians, but that is the state we are in right now. Our regulatory approach, particularly during elections but outside elections as well, is completely unprepared and is ill-adapted to the new reality.

I have three recommendations for this study. First, this is what it is. This is not a phenomenon unique to Canada. Influencers and social media are now the primary sources of social life for youth. Any policy or approach that doesn't take adequate account of that is doomed to fail. We need to operate within that regime. We need to operate within this idea that youth like their social media. They want to continue to use it. We can better protect them, and we can better frame and supervise that space, but it is what it is.

Second, algorithmic discovery is the key mechanism and the key way this stuff is shared. That algorithmic discovery is not a neutral process. It is a process by which platforms have made a series of choices about what content gets amplified and shared and which influencers are seen. They would like you to think that there is no decision, that it is some black box beyond anyone's control, but that is not the case. There are decisions behind it. That is one of the key levers available.

The last thing I want to leave you with is that, look, the line between entertainment, culture, community and political information has never been blurrier. For youth today, in their day-to-day consumption of information, politics, entertainment, culture and TikTok dances are all intermeshed and together. That creates an environment where they can become incredibly informed, but it also creates some dangers. Some of these dangers are that our media literacy training programs, the way we have taught people to consume news in this country, are completely ill-adapted for an environment where all of this is blended together. I urge this committee to reflect on and to account for that.

I'll leave it there.

The Chair Liberal Lisa Hepfner

That was perfect timing. Thank you.

We'll go online now to Michael Cooper with Mental Health Research Canada.

Mr. Cooper, you have the floor for five minutes.

Michael Cooper Vice-President, Data and Partnerships, Mental Health Research Canada

Thank you for having me here, and my apologies that I could not be there in person. I very much would have liked to be.

Again, my name is Michael Cooper. I'm the vice-president of data and partnerships here at Mental Health Research Canada. We have been funded to collect ongoing trackers of mental health indicators since 2020, as a pandemic response, and since that time we've evolved to include a number of cross-sectional issues that intersect mental health. I can share some of them here today.

Specifically, I want to share a few things we've learned about those aged 16 and older. We don't collect any data for anyone under the age of 16, though I can speak to a bit of other research on that topic. I want to mention that we've been tracking online gambling specifically among youth. The algorithms are serving youth a great deal of content on that particular issue, and that is driving problematic gambling.

I also can speak a bit about screen time. One of the things we've been tracking is the volume of screen time. We've identified that for a number of youth—essentially, for anyone who consumes more than six hours of personal screen time per day—there are significant mental health implications, from anxiety to depression and to suicide ideation. We've published some reporting on that. Of course, we've seen that youth aged 16 to 24 are the group most likely to spend more than six hours a day on screen time. Therefore, they would be the ones most impacted by these indicators.

The other issue I wanted to speak about a bit is how we have tracked social media specifically. We've tracked what youth are doing on social media: what sorts of activities; cyber-bullying; what their experiences have been along the lines of FOMO, the fear of missing out; and whether or not they're experiencing issues around comparing themselves to others as well. I've put together a deck and have sent it along to the group if you're interested in asking any questions about that specifically.

I do want to speak on a few other issues that are more general around mental health and relevant to social media. We have been tracking long-term trends on mental health since the 1970s. The indicators are not clinical in nature, but they do track general mental health. We saw a significant shift in about 2004 for a lot of these youth in terms of their mental health, which would have corresponded to when a lot of smartphones ended up in individuals' hands. We saw another shift in 2020, through the pandemic, and no recovery since that time. I want to highlight that this is another area of research we are privy to as well.

I want to highlight the social connection aspect of it. Individuals who are more connected to their community, to family and to loved ones are far more likely to have positive mental health indicators and to seek out help. We do know that for a number of individuals, the experience they're having online through social media is shallow, and that lack of deeper engagement with others may be one reason their mental health is poor if they're spending so much time on social media.

The other thing I wanted to speak to very quickly is this idea of influencers. I do not track influencers. However, I am an avid consumer of research, and I know that a tremendous amount of research exists on understanding how youth process advertising, especially. We have this from past studies by Concerned Children's Advertisers.

We have a great amount of data on this. We know that youth are not fully developed in terms of their ability to discern between informational content and selling content. We also know that this becomes especially difficult when the line is deliberately blurred. If there's not a price tag at the end of an ad, most youth would not be able to identify that it is in fact an advertisement. When I think about influencers, I'm thinking about the fact that a lot of these influencers are being used to sell products to youth, essentially circumventing a lot of the children's advertising safeguards, like those from Concerned Children's Advertisers, that we've had in place over the years.

I'm more than happy to speak to any of these topics. These are areas we have data and expertise in. I probably have 300 stats on these issues. I don't want to just throw numbers at you, but I can assure you that we are tracking these issues and others, such as body dysmorphia and eating disorders. We do know that about one in four young women is experiencing a high risk of eating disorders. A lot of that ties in to social media as well; we're seeing connections with high social media use. There are a lot of very troubling statistics around what's happening in mental health as it pertains to social media for youth.

Thank you.

The Chair Liberal Lisa Hepfner

Thank you very much.

I'm coming up with lots of questions after the testimony we've had so far today.

Katie Paul, from the Tech Transparency Project, you are up next. You have five minutes starting now.

Katie Paul Director, Tech Transparency Project

Thank you so much for the opportunity to speak with you today about the impacts of social media on young people.

My name's Katie Paul, and I'm the director of the non-profit Tech Transparency Project in Washington, D.C. We are a non-partisan research organization that investigates the influence and impact of big tech on the public.

Our research has found that big tech platforms have not only amplified harm to children, but often profited in the process. Recent reports from a multi-district lawsuit in the United States revealed that big tech companies like Meta and YouTube are internally aware, based on their own research, of the potential harms of their content to children. That research was then buried by the companies so they could continue to profit from that harm.

The revelations from the lawsuit track with years of research from the Tech Transparency Project. Our investigations in 2021 and 2022 found widespread drug trafficking on Instagram that was algorithmically pushed to accounts for users under the age of 16. Meta's platform design and algorithms make it easier for kids to contact drug dealers than to log off the platform. The study found that while it takes only two clicks for a teen to find and connect with a drug dealer on Instagram, it takes five clicks to log out of the platform.

Instagram's automated technologies also undermine the company's own efforts to address drugs. For instance, while Instagram banned hashtags for popular drugs like MDMA, its search autofill recommended alternative hashtags for those drugs, driving kids directly toward dealers.

The problem isn't just platform design. Meta also directly profits from pushing drugs to users on its platform. A series of TTP investigations found that Facebook routinely approved ads pushing pill parties, alcohol, gambling and vaping, as well as extreme weight loss to kids under the age of 18.

Meta's primary business model relies on advertising. It's the company's main product, but it has little oversight and quality control. Meta does little to implement safety when it comes to ads. In July of last year, our organization published a report that found Meta has run hundreds of ads for deadly drugs like cocaine and fentanyl. These ads are not simply content posted by third parties. Meta has reviewed, approved and is profiting from these advertisements. These kinds of advertisements continue today, as was reported by the Toronto Star in a recent investigation.

The problem isn't limited to ads for drugs. In October of last year, TTP found that Meta was also running hundreds of ads for weapons, in some cases amounting to international arms trafficking. These were not ads for big box stores or local gun dealers. They were illicit ads selling ghost guns, fully automatic weapons and illegal gun parts. These ads not only help put illegal trafficked weapons into the hands of people across North America, but they also undermine the business of legitimate licensed gun dealers.

Ads for both guns and drugs follow the same pattern. They feature an image or a video of the illicit content and link to a private messaging service like Telegram or WhatsApp, which is also owned by Meta, to conduct transactions.

Meta is perhaps the most critical piece of this puzzle. These dealers buy ads from Meta to get their product in front of as many people as possible. They could not attain this reach without the help of Facebook or Instagram.

While these social media and tech companies are aware of the harms of their platforms, they don't take action to mitigate those harms until after the potential consequences have been raised. OpenAI, which is facing a major lawsuit for its AI chatbot's role in a teen suicide, created a teen version of the chatbot only after it was sued by the family of Adam Raine, after ChatGPT provided instructions on how to make a noose and encouraged Raine to commit suicide.

In 2024, Meta launched its teen Instagram accounts, holding up the feature as a move for parents to help keep kids safe on the platform that they failed to effectively moderate. The move was largely part of a broader effort by Meta to stave off the implications of civil lawsuits and a wave of pending regulations from lawmakers in the U.S. and abroad. What Meta had pitched as new features to keep teens safe was simply a repackaging of things the company had already claimed it was implementing years earlier. TTP recently tested these accounts and found that the content Meta had claimed was barred from teens—notably graphic content and fight content—was served readily to teen accounts despite the heavily promoted claims of protections. This continues today.

As companies like Meta have come under pressure, they have funded organizations like ConnectSafely and the National PTA to ensure they launder their narrative through paid allies.

These social media companies and chatbots are among the most well resourced and technologically advanced in the world, but those profits have been built on decades of harm to children, which the companies are aware of but take no action to address unless faced with the potential of repercussions.

They have the capital and capabilities, but have proven time and again that they cannot be trusted to act in good faith. It's imperative for national governments to effectively regulate these companies for their role in profiting from the harms to the most vulnerable population.

Thank you very much.

The Chair Liberal Lisa Hepfner

Thank you.

We will now turn to members for questions, starting with Mrs. Thomas, for six minutes.

4:55 p.m.

Conservative

Rachael Thomas Conservative Lethbridge, AB

Thanks to our witnesses for taking the opportunity to be with us here today. It's much appreciated.

My first question is going to go to you, Ms. Paul.

You said that big tech companies, such as Meta, profit from these ads that are advertising illicit drugs, weapons or gambling to underage individuals. For us to have a better understanding of this, would you have examples of these ads that you could supply to the committee, so that we could see what they look like?

4:55 p.m.

Director, Tech Transparency Project

Katie Paul

Yes. I have submitted a write-up of the testimony, with lots of links to citations, including a report that has multiple slide shows on the ads with regard to drugs and weapons, as well as the teen ad account tests that we ran over a three-year period, and the ads that were submitted and approved by the platform for those.

4:55 p.m.

Conservative

Rachael Thomas Conservative Lethbridge, AB

Perfect. My apologies that I have not seen that. I was just told that it is in translation, which is why I have not yet received it. I look forward to being able to review that. Thank you so much for sending that our way, Ms. Paul.

My next question is going to Mr. Morin.

I understand you've done quite a bit of research with regard to the radicalization of young people. Last month, the CSIS director, Dan Rogers, warned about hateful ideologies, including anti-Semitism. He said that young people are being radicalized in a dangerous way, and that this has been amplified since October 7.

Can you explain what the link is between anti-Semitism and radicalization here in Canada, and what can be done about that?

4:55 p.m.

Full Professor, UNESCO Chair in the Prevention of Violent Radicalization and Extremism, Université de Sherbrooke, as an individual

David Morin

Thank you very much for this question. Can you give me half an hour to respond?

Yes, absolutely. In the ecosystems we’ve been monitoring since the attacks of October 7, 2023, we have indeed observed a convergence of hate speech directed at the Jewish community, which is conflated with the Israeli government without any nuance.

You refer to influencers. I’ll avoid naming names, but there are obviously radical Islamist groups in Canada and some prominent figures in Quebec who are trying to take advantage of the feelings of injustice and anger among some young people concerning the situation in Gaza to promote a narrative that emphasizes the supposed incompatibility between Islam and western values.

Obviously, this kind of speech tends to radicalize some of our youth, which is why it’s important to have extremely nuanced political discourse. Again, I don’t want to name names, but some groups have a storefront presence on major social media platforms, while others are on much more alternative platforms, where they still reach a fairly significant, albeit targeted, audience. I’m not sure that naming them would necessarily resonate.

As for anti-Semitism, as you know, it did not originate with the attacks of October 7. Anti-Semitism has been present in our societies for a long time, but this type of conflict indeed contributes to reactivating it. In my opinion, we should better regulate hate speech because, as the statistics show, hate speech against the Jewish community has significantly increased in recent years. It does not seem to be weakening; it has stabilized, but not actually decreased.

I hope I’ve answered your question.

5 p.m.

Conservative

Rachael Thomas Conservative Lethbridge, AB

Thank you.

That's a good start. Can you break down further how social media platforms are used to enhance that radicalization of young people, or that pursuit of young people for the purposes of radicalization?

5 p.m.

Full Professor, UNESCO Chair in the Prevention of Violent Radicalization and Extremism, Université de Sherbrooke, as an individual

David Morin

Social media does several things. First, it obviously allows audiences to be reached and targeted, and I think my colleagues have said that well. It’s therefore possible to go into virtual spaces where we know, for example, that young people will be playing online war games, etc. That’s one example.

We know they can be reached there. Through messaging functions, it’s possible to contact these young people and quietly ask them questions about their life experiences and their political views. Indeed, that’s where the most vulnerable individuals are identified and gradually radicalized.

We’ve seen it a lot from the Islamic State. It really invented a kind of banner under which, on digital social networks, individuals could pledge allegiance to the group and commit a knife attack, a vehicle ramming and so on, without ever having been directly solicited.

Again, social media is one tool among others in the tool box of terrorist organizations. Many people go on social media and, fortunately, do not become radicalized. I wouldn’t want anyone to think otherwise. On the other hand, we see that young people who are being radicalized do show very high consumption of digital social networks. There’s no doubt about it, and it’s a consensus among researchers working on issues of violent radicalization.

So it’s this ability that social media have to reach people. Obviously, there are also all the encrypted platforms that allow for the exchange of information. In addition, there are also all the sources of funding today using cryptocurrency, which make it possible to fund terrorist organizations or groups.

So it’s an extremely useful and powerful tool for terrorist organizations.

5 p.m.

Liberal

The Chair Liberal Lisa Hepfner

Thank you.

Mr. Al Soud, you're next for six minutes.

5 p.m.

Liberal

Fares Al Soud Liberal Mississauga Centre, ON

Thank you, Madam Chair.

Thank you all for those opening remarks and for being with us.

I asked about this on Monday, and I thought the response was extremely interesting. I'd like to focus a little on the parasocial relationship between influencers and young consumers. Many adolescents follow celebrities, like Drake and other top streamers, who openly promote online gambling platforms. Even when platforms claim to restrict access to minors, the content itself is watched, primarily, by youth. Based on your expertise, how concerned should we be that gambling-style content is normalizing high-risk gambling behaviour among children and teens?

I open this question up a bit. I know, Mr. Cooper, that MHRC has previously considered this impact of gambling at large. I'd welcome your thoughts.

5:05 p.m.

Vice-President, Data and Partnerships, Mental Health Research Canada

Michael Cooper

I'm happy to get started on this.

As you know, gambling is illegal for those under 18. We did not ask about that specifically, but what we can identify is that there is a tremendous amount of bending the rules. Ontario has been the one to legalize single-game sports betting through private operators, but we're seeing those ads come into every province where, of course, those platforms are not legal. In some provinces, operators are running ads saying, "Don't gamble on Bet365, but bet on our local platforms."

There's some great work coming out of UBC and the centre for gambling research there; Dr. Clark has done a lot of work looking at the neurology of what's happening with youth and, specifically, with gambling-like activities. Think of going into Roblox and buying a randomized loot box, then getting some random item inside it, or going on Call of Duty or some other video game and getting a randomized loot box. It's essentially the same dopamine hit you get from that sort of experience. You're essentially participating in a gambling-adjacent activity, and these are available at any age.

There are even reports, which we're seeing in some other countries as well, of youth taking artificial currencies like Robux and actually being able to gamble that currency. It's not regulated because it's not a legal currency, so there are lots of ways that organizations are getting around this.

We do know that youth are being inundated with ads for these sorts of things, and it is essentially rewiring their brains in terms of both expectations and what they're prepared to do. It is, again, basically gambling in all but name.

I'll stop now because I'm sure other witnesses want to answer that as well.

The Chair Liberal Lisa Hepfner

Actually, your audio cut out there for a bit, so I was going to cut you off. Hopefully, we all heard the answer.

Were we good with translation? All right.

You still have several minutes, Mr. Al Soud.

Fares Al Soud Liberal Mississauga Centre, ON

Would anyone else like to add to that before I jump into my next question? It seems not.

Touching on this exact piece of loot boxes, I'm quite interested.... Do you believe Canada needs age-appropriate design rules or restrictions targeted specifically at gambling-style content, separate from traditional gambling legislation?

5:05 p.m.

Vice-President, Data and Partnerships, Mental Health Research Canada

Michael Cooper

If I could jump in there very quickly, we study 16 and up, so—

The Chair Liberal Lisa Hepfner

I'm sorry, Mr. Cooper. Your audio is not working for us. We're going to have somebody call you to try to fix that problem.

Fares Al Soud Liberal Mississauga Centre, ON

I'd be happy to jump into a separate question, if you like. Could I ask how much time I have left?

The Chair Liberal Lisa Hepfner

I paused the time for this. You have three minutes.

Fares Al Soud Liberal Mississauga Centre, ON

Fantastic.

Ms. Tessier‑Bouchard, as media focused on—