Evidence of meeting #127 for Justice and Human Rights in the 44th Parliament, 1st Session. (The original version is on Parliament's site, as are the minutes.)

A recording is available from Parliament.

Witnesses

Frances Haugen  Advocate, Social Platforms Transparency and Accountability, As an Individual
Marni Panas  Canadian Certified Inclusion Professional, As an Individual
Jocelyn Monsma Selby  Chair, Clinical Therapist and Forensic Evaluator, Connecting to Protect
Andrew Clement  Professor Emeritus, Faculty of Information, University of Toronto, As an Individual
Guillaume Rousseau  Full Professor and Director, Graduate Applied State Law and Policy Programs, Université de Sherbrooke, As an Individual
Joanna Baron  Executive Director, Canadian Constitution Foundation

The Chair Liberal Lena Metlege Diab

Good morning, everyone.

I call this meeting to order.

Welcome to meeting 127 of the House of Commons Standing Committee on Justice and Human Rights.

Pursuant to Standing Order 108(2) and the motion adopted by the committee on December 2, 2024, the committee is meeting in public to continue its pre‑study of the subject matter of Bill C‑63.

Before I welcome the witnesses for the first panel, I have a few introductory remarks to make.

For those appearing in person, please use your microphone and your headset. When you are not speaking, keep your earpiece away from the microphone so that we do not cause feedback for our interpreters; this is also for their health and safety. For those in the room and those appearing virtually, please wait to be recognized by the chair.

I'm speaking French right now. The English participants should be hearing the English interpretation.

If you did not understand what I just said in French, you do not have your device turned the proper way. I would ask that you ensure that you have your device turned the right way so that you understand the language of your choice and we're not interrupted midstream.

Please mute your electronic devices.

If you are appearing virtually, unmute yourself only when you are recognized by the chair.

I will now introduce our three panellists this morning.

First, we have Frances Haugen.

She is an advocate for social platform transparency and accountability. She is appearing by video conference.

Marni Panas is a Canadian certified inclusion professional.

From Connecting to Protect, we have Jocelyn Monsma Selby, chair, clinical therapist and forensic evaluator, by video conference.

I will give each of you up to five minutes to say your introductory remarks. I understand that it's a little bit difficult, particularly if you're on your screen. When you have 30 seconds left, I will let you know. When the time is up, I will interrupt you as softly and delicately as possible, whether during your five-minute remarks or during your answers to members' questions.

I want to let you know that we have Senator Kristopher Wells with us today. He will be here for the first hour. Welcome, Senator.

I will now ask Ms. Frances Haugen to please start.

You have up to five minutes.

Frances Haugen Advocate, Social Platforms Transparency and Accountability, As an Individual

Thank you for inviting me today.

You have probably had the opportunity to hear from a lot of people about the harms of social media, so I will not repeat the laundry list again. Instead, I'd like to focus on two topics that hopefully will give context to that testimony and provide urgency for action.

First, I want to emphasize that we are profoundly underestimating the severity of social media's impact on children, due to limitations in how we observe and measure these effects. When researchers and policy-makers discuss the harmful effects of social media, they typically point to studies of teenagers documenting rates of self-harm, eating disorders and declining mental health among 16-year-olds, but these studies are echoes of the past, capturing the aftermath of social media exposure that began years earlier, typically around 12 or 13.

What's alarming is that when we talk to today's 12-year-olds and 13-year-olds, we discover that they started on social media around age eight or nine. In 2022, 30% of American children between the ages of seven and nine were already active on social media platforms. That number is probably higher today. This creates what I call the telescope effect in our understanding of social media's impact. Like astronomers observing distant galaxies, we're always looking at information about the past: how social platforms were designed in the past and past usage patterns. This may be okay when we look at the stars, because the heavens change slowly, but when we examine the digital lives of teenagers, their rapidly changing world means that we end up continually surprised that rates of harm keep going up.

A seven-year-old is influenced and impacted in a meaningfully different way than a 13-year-old is. The children starting social media use today are doing so at ever younger ages, during even more crucial developmental periods and with even more sophisticated and engaging platforms than today's teenagers whom we're currently studying. If we don't act, we're on track to wake up in 10 years to realize that we've fundamentally altered a generation's development in ways that we failed to anticipate or prevent.

My second point concerns the emerging and under-reported threat of the rise of AI avatars and their impact on children's social development. These AI avatars are sophisticated virtual companions. They use artificial intelligence to engage in conversation, respond to emotions and build what feel like genuine relationships with users. They're designed to be always available, eternally patient and perfectly attuned to their users' interests and needs.

The leading provider of these AI avatars proudly announces that the average user—predominantly children under 18—spends two hours daily interacting with these virtual companions. This statistic should alarm us. Learning to navigate real human relationships is inherently challenging and sometimes uncomfortable. It requires compromise, patience and the ability to engage with others' interests and needs, not just our own. AI avatars, in contrast, offer a path of least resistance. They never disagree uncomfortably. They never have conflicting needs, and they never require the complex emotional labour that real friendships demand.

We need to expand our understanding of what constitutes social media. These AI-driven spaces represent a new frontier of potential harm, where the artificial ease of virtual relationships further erodes children's ability and motivation to build genuine human connections. If we don't act now to understand and regulate these technologies, we risk being blindsided by their effects, just as we were with social media platforms.

In conclusion, the problems we're seeing with social media are reflections of broader societal issues. The adults most negatively impacted by social media are often those already marginalized in our society, whether geographically, physically or economically. While in-person socialization carries real costs in terms of transportation, activities and time, online socialization appears free at first. The true cost is paid in terms of mental health, development and human connection.

Similarly, and perhaps most critically, the children most likely to become deeply enmeshed in these virtual worlds, whether traditional social media or AI-driven spaces, are likely to be our most vulnerable and marginalized youth. These are often the children with fewer opportunities for in-person social interaction, fewer resources for supervised activities and fewer adult mentors to guide them through the challenges of growing up or to provide context and support when they face online harm.

We must act now to ensure that children have appropriate and safe digital spaces, because their ability to meaningfully build relationships and connect will shape the world we all live in for decades to come.

Thank you.

The Chair Liberal Lena Metlege Diab

Thank you very much.

I will now ask Ms. Panas to please proceed.

Marni Panas Canadian Certified Inclusion Professional, As an Individual

I am Marni Panas. I use the pronouns “she” and “her”. I am a Canadian certified inclusion professional. I led the development of diversity and inclusion activities at Alberta Health Services, Canada's largest health care services provider. I am the director of DEI for one of Canada's most respected corporations, and I am the board chair for the Canadian Centre for Diversity and Inclusion.

Today, I am speaking on behalf of myself and my own experiences. I'm here to vehemently defend every Canadian's right to freedom of expression, the foundation of our democracy. However, I and millions like me do not have freedom of expression, because it is safer to be racist, homophobic, sexist and transphobic online than it is to be Black, gay, a woman or transgender online. Online hate is real hate. It descends into our streets. It endangers Canadians in real life.

In September 2021, I took the stage at a university in my hometown of Camrose, Alberta, to deliver a lecture on LGBTQ2S+ inclusion, a lecture I've delivered to thousands of students, medical professionals, and leaders around the world. While I was on stage, unbeknownst to me, a student, like many other youth who have been radicalized by online hate, was livestreaming my presentation on Facebook and several far-right online platforms. By the time I got off stage, thousands of people were commenting on my appearance, my identity and my family. The worst of the comments included threats to watch my back. My next lecture was cancelled. Police escorted me off campus for my own safety.

In March 2023, I was invited to participate on a panel celebrating International Women's Day to raise awareness for an organization in Calgary that works to protect women and children from domestic violence. Because of the many online threats of violence directed towards me, the Calgary Police Service and my employer's protective services unit had to escort me in and out of the Calgary Public Library, where the event was being held.

Last February, emboldened by the introduction of anti-trans legislation in Alberta, people harassed and threatened me and others online at levels I had never experienced before, even trying to intimidate me by contacting my employer. I'm grateful for the support of my current employer, who once again had to step in to have my back.

It is rarely the people spewing hate online who are the greatest threat, but words are never just words. It is the people who read, listen and believe in hate speech who become emboldened to act on what's been said. These words and the actions they fuel have followed me to my community, my workplace and even my doorstep. The impact of this relentless harassment for simply living my life publicly, proudly and joyfully as me has profoundly impacted my mental health, my well-being and my sense of safety where I live and work, leaving me withdrawn from the communities I cherish and leaving me wondering every time someone recognizes me on the street whether this is the moment where online hate turns to real physical violence. I feel far less safe in my community and in my country than I ever have before.

No, I don't have freedom of expression. There is a cost to being visible. There is a cost to speaking out. There is a cost to speaking before you today, knowing that this is being broadcast online. Most often, the cost just isn't worth it. The people all too often silenced are those who desperately need these online platforms the most to find community and support. This is made worse when the same platforms allow disinformation to be spread that aims to dehumanize and villainize LGBTQ2S+ people, contributing to the significant rise in anti-LGBTQ2S+ violence as highlighted by CSIS this past year.

The status quo is no longer acceptable. Platforms need to be held accountable for the hateful content they host and the disinformation they allow to spread. The federal government needs to act. We can't wait. I've been called brave, courageous and even resilient, but I'd rather simply just be safe. People have a right to freely exist without fear because of who they are and whom they love. This is needed in online spaces, too. In fact, our communities and our democracy depend on it.

Uphold freedom of expression. Pass Bill C-63, and protect us all from online harms.

Thank you.

The Chair Liberal Lena Metlege Diab

Thank you as well.

I now turn to Jocelyn Monsma Selby, please.

Dr. Jocelyn Monsma Selby Chair, Clinical Therapist and Forensic Evaluator, Connecting to Protect

Honourable Chair Diab and all members of the Standing Committee on Justice and Human Rights, thank you for the opportunity to be here today.

My first point is that, in Canada, our current legal framework addresses child sexual abuse and exploitation via the Criminal Code and the law for protection of children from sexual exploitation. However, we should not be relying on a broad duty of care by any Internet platform. There should be a law requiring the identification of illegal sexually explicit images and immediate action to report and take them down. We need regulation that is fit for purpose and safety by design.

My second point is this. Bill C-63 reads, “reduce harms caused to persons in Canada as a result of harmful content online and ensure that the operators of social media services...respect...their duties under that Act.” This is a glitch. All Internet platforms need accountability, not just social media sites. It takes just three clicks to find child sexual abuse imagery or child sexual exploitation material on the regular Net, and this includes images generated by artificial intelligence found through accessing many, many online platforms, including the dark web. These images are disguised within websites and embedded in emojis and hidden links, requiring the viewer to follow a digital pathway that can disappear as quickly as the next link is clicked on.

In 2022, the Internet Watch Foundation, the IWF, found a 360% increase in reports of self-generated child sexual abuse material involving seven-year-olds to 10-year-olds, which is now more prevalent than non-self-generated content. This trend continued into 2023, when the IWF hashed 2,401 self-generated sexually explicit images and videos of three-year-olds to six-year-olds. Of those images, 91% were girls showing themselves in sexual poses, displaying their genitals to the camera. It's normal for children to have curiosity, explore their bodies or experiment sexually, but that is not what the IWF found. What is shocking is the unsupervised access of children using digital devices.

My third point is with regard to guidelines respecting the protection of children in relation to regulating services and age of consent to data processing and in using social media. There is a duty to make certain content inaccessible. Caution should be used in passing regulation based on precedents set out in other countries. We need to look in turn at all the international laws, treaties and conventions. A single guiding principle is in article 5 of the UNCRC, concerning the importance of having regard for an individual child's “evolving capacities” at any moment in time in their interactions with the online world.

My fourth point is the establishment of a digital safety office of Canada, a digital safety commission and a digital safety ombudsperson. Could Canada benefit by establishing an online safety office and a children's commissioner or ombudsperson? The answer is yes, and several countries have been blazing a trail for us. These countries are part of a global online safety regulators network that aims to create a coordinated approach to online safety issues. Canada, sadly, is not at the table.

Last week, I was invited to attend a global summit in Abu Dhabi, sponsored by WeProtect and the UAE government. I was the only child protection representative from Canada, and I'm a self-funded third party voice.

I have a few final thoughts.

It took 50 years after the invention of the Gutenberg press to print 20 million books. It took Ford 10 years to produce 10 million Model Ts. It took Playboy approximately two years to sell over a million copies each month. It took the global Internet in 1995 two years to reach 20 million users. It took Facebook 10 months to reach one million users. Today, Meta's ecosystem—including Instagram, WhatsApp and Messenger—has approximately 2.93 billion daily active users.

We need to close the gap between the rapid development and access of the Internet and needed regulation. We cannot have a continued partisan approach, lacking civility, to develop the important regulations needed to protect children and vulnerable individuals.

The Chair Liberal Lena Metlege Diab

Thank you very much.

11:15 a.m.

Chair, Clinical Therapist and Forensic Evaluator, Connecting to Protect

Dr. Jocelyn Monsma Selby

Thank you for the opportunity to appear today.

The Chair Liberal Lena Metlege Diab

You'll be able to answer questions as well.

We will start with our first six-minute round.

I will ask Ms. Ferreri to please start.

11:15 a.m.

Conservative

Michelle Ferreri Conservative Peterborough—Kawartha, ON

Thank you so much, Madam Chair.

Thank you to our witnesses.

We're talking here about one of the most serious bills, I think, that have come before Parliament, certainly in my time and many others' time. That is Bill C-63.

I want to start with you, Ms. Selby. This is on record from the Canadian Constitution Foundation:

“Bill C-63 combined things that have no reason to go together,” Van Geyn said. “The issue of the online sexual exploitation of children through pornography is urgent and serious and should not be lumped in together with the government’s controversial plans to criminalize all kinds of speech and allow for civil remedies through the Canadian Human Rights Commission for speech,” she added.

My question for you is this: Shouldn't we have a stand-alone bill or legislation that protects children from online perverts? Shouldn't that be its own legislation?

11:20 a.m.

Chair, Clinical Therapist and Forensic Evaluator, Connecting to Protect

Dr. Jocelyn Monsma Selby

We definitely need regulation to protect children from sexual exploitation online. I wouldn't use the term “online perverts”. I think there are many groomers and there are many reasons that children are exploited on the Internet. There's obviously a market for it, or it wouldn't be such a problem.

11:20 a.m.

Conservative

Michelle Ferreri Conservative Peterborough—Kawartha, ON

I guess I'd push back on that. I would certainly call those groomers “perverts”. I guess we just have a difference of language, but I see your point.

What I'm trying to ask you is this: Shouldn't we have a stand-alone policy that enforces laws to protect these children and that also ensures that social media platforms have a duty of care to ensure they're not allowing this to happen?

11:20 a.m.

Chair, Clinical Therapist and Forensic Evaluator, Connecting to Protect

Dr. Jocelyn Monsma Selby

This is where we have a glitch. I believe all Internet platforms have a duty of care. We need regulation that is what I would call “best in practice” to protect all children and vulnerable individuals on the Internet. Under the age assurance umbrella, you have numerous tools. If they are legislated to happen at the device level, everyone gets protected.

Picking and choosing just certain media sites is not the only answer here. Very few people are protected when you take that approach. You need a broader approach to all Internet providers and all Internet platforms.

11:20 a.m.

Conservative

Michelle Ferreri Conservative Peterborough—Kawartha, ON

Certainly. I think we would agree on that, for sure.

I guess what I'm trying to say is.... There's a bill that the Conservatives have. It's called Bill C-412. It was put forward by my Conservative colleague. It deals with exactly what you're saying immediately, as opposed to Bill C-63, where the Liberals have combined two separate issues that are not targeting the predators online and the sexual exploitation.

To Ms. Haugen's point, the brains of these young children are forever changed. There's not a parent out there who isn't concerned about this.

11:20 a.m.

Chair, Clinical Therapist and Forensic Evaluator, Connecting to Protect

Dr. Jocelyn Monsma Selby

I view it as child sexual abuse via digital images. There isn't a child protection expert on the planet who would agree that it's okay for children to have this kind of exposure.

11:20 a.m.

Conservative

Michelle Ferreri Conservative Peterborough—Kawartha, ON

I agree 100%. I think that's where we're going with this. Bill C-412 directly deals with this immediately, as opposed to Bill C-63, which has combined too many issues that will not hold these perpetrators, whom I will call “perverts”, to account, as well as social media platforms that, to your point exactly, need to have accountability.

Ms. Haugen, I was very interested in your testimony. It was profound. You hit a lot of nails on the head in terms of the impact social media is having on our children and exposing them too young to this. Without mandatory parental controls or algorithmic accountability in Bill C-63, how do we ensure that platforms are actually protecting children?

11:20 a.m.

Advocate, Social Platforms Transparency and Accountability, As an Individual

Frances Haugen

That's a great question. Having a positive duty of care is a really critical component of the Canadian bill, because it says you have to be actively thinking about how your product might be misused and be designing proactively for it.

Parental controls can be really powerful. They are one set of tools, but not all children have parents who understand technology well enough. Remember, most parents today didn't grow up as a 10-year-old with a smart phone, or I hope not. It means that we need to make sure there is, at a minimum, a floor or a net that is catching all children.

We also need to ask whether we should be putting the obligation on parents, when they have so much to deal with already, to also stay abreast of exactly what threat is coming from where and what setting and toggle they need to put on their phones.

11:20 a.m.

Conservative

Michelle Ferreri Conservative Peterborough—Kawartha, ON

Are you familiar with Bill C-412, which my Conservative colleague put forward?

11:25 a.m.

Advocate, Social Platforms Transparency and Accountability, As an Individual

Frances Haugen

I don't know all of the details for it.

11:25 a.m.

Conservative

Michelle Ferreri Conservative Peterborough—Kawartha, ON

I would love to direct it to you or send it to you. I think it's addressing what you're saying. It gets to the heart of the issue quickly and more efficiently than Bill C-63.

I have another question for you. How does Bill C-63 ensure that platforms understand their obligations without explicit definitions?

11:25 a.m.

Advocate, Social Platforms Transparency and Accountability, As an Individual

Frances Haugen

That's a great question.

One of the challenges when writing any kind of Internet regulation law is that technology moves very quickly. For example, right now the Europeans are suggesting things like banning addictive features. In technology, it can be really hard to define what an addictive feature is. If I say, “This is the thing you're not allowed to do”, what usually happens is one of two things: either the definition is specific enough to be easy to understand, in which case the tech companies immediately make a slight twist and say, “Well, it no longer falls under the definition”, or you write the rules at such a high level that you have to ask what it even means to have an addictive feature.

Duty of care is a nice, flexible in-between where you say, “Hey, you need to be demonstrating proactively that you're looking out for the needs of children and designing safety by design.”

The Chair Liberal Lena Metlege Diab

Thank you very much.

We'll now move to the next six minutes, please.

Ms. Dhillon.

Anju Dhillon Liberal Dorval—Lachine—LaSalle, QC

Thank you, Madam Chair.

Thank you to all of our witnesses for being here this morning.

I'll start with Ms. Marni Panas.

Ms. Panas, I'm very sorry about what you've gone through: not being able to express yourself and also being threatened for who you are. Can I please ask you if you think that having freedom of expression includes this kind of hatred that you have faced? Please, can you elaborate a little bit?

11:25 a.m.

Canadian Certified Inclusion Professional, As an Individual

Marni Panas

I have so many privileges that I actually do okay. That's with all of my privileges. I can't imagine children and youth and people who don't have privileges like the support of my employer and the people around me. That's with all of those supports.

There is freedom of expression, but there are consequences. We all face consequences for speaking. You folks can't just say anything in the House of Commons without some consequence. That has to occur online, and it has to occur in all of our spaces.

Today, again, I do not have freedom of expression. Even just visibly posting a picture of my partner and me being happy, dancing at a concert, comes at a cost. That cost is often ridicule. That cost turns into harassment. Then that cost turns into people believing the disinformation that is spread online, which leads to policies that restrict my ability to even participate fully in society. It goes so far. This is for somebody who has all of the privileges that I have.

Anju Dhillon Liberal Dorval—Lachine—LaSalle, QC

I don't know if you had the chance to hear last week's testimony when Jane Doe came. It was a very painful testimony. It was painful for all of us to hear what kind of evil can exist in this world.

These parents came. There was one parent whose child was part of the armed forces and committed suicide. They're begging. We're talking about how parents should not be held responsible, completely responsible, because there's also a duty on governments and the platforms. We know that Bill C-63 applies to all online platforms. They're begging for us to do something as quickly as possible to mitigate the damages that are already done and that could come in the future.

We keep hearing things about regulatory bodies and delays. Do you not think that, at this point, it's better to pass something rather than nothing? Nothing is perfect, but at least something can give you support. We can give you support.