Evidence of meeting #127 for Justice and Human Rights in the 44th Parliament, 1st Session. (The original version is on Parliament’s site, as are the minutes.)

A recording is available from Parliament.

Also speaking

Frances Haugen  Advocate, Social Platforms Transparency and Accountability, As an Individual
Marni Panas  Canadian Certified Inclusion Professional, As an Individual
Jocelyn Monsma Selby  Chair, Clinical Therapist and Forensic Evaluator, Connecting to Protect
Andrew Clement  Professor Emeritus, Faculty of Information, University of Toronto, As an Individual
Guillaume Rousseau  Full Professor and Director, Graduate Applied State Law and Policy Programs, Université de Sherbrooke, As an Individual
Joanna Baron  Executive Director, Canadian Constitution Foundation

11:25 a.m.

Canadian Certified Inclusion Professional, As an Individual

Marni Panas

Yes, I'm not sacrificing the good for perfection. The good needs to happen now.

The fact is that if online platforms honoured their own standards of practice and the community standards they already have in place, we probably wouldn't be here. They all come out with these great standards of practice, but any time I report anybody not following those, they're ignored. We're ignored. We need something now. Lives are being lost to this.

What's important is that there are a lot of youth who find community in online platforms. That's essential when you think of rural populations and when you think of people like myself. The very first time I found somebody like me, when for the first time in my life I realized that I'm not alone and that there are other people like me, was a life-saving moment for me. That was 20 years ago, when the Internet first started. That saved my life.

We need to protect those environments for youth and people to find social connection in a healthy and meaningful way. That has been robbed from them. The impact of that is violence, death, isolation, loneliness and having to hide the most important parts of your identity. That needs to change now. We can no longer wait. Too many lives have been interrupted. Too many lives have been lost because of the harms experienced online.

Anju Dhillon Liberal Dorval—Lachine—LaSalle, QC

I'll continue with a story that just came out about an AI chatbot that encouraged a child to commit suicide. It was saying, “Come home. Come home, my king.” He was 14. He had become addicted to the chatbot because that was his only friend. I guess he didn't have many real-life friends.

Can you talk to us a little bit about how you see this going forward in terms of the addiction children have towards social media platforms, chatbots and things like that?

11:30 a.m.

Canadian Certified Inclusion Professional, As an Individual

Marni Panas

When I was growing up as a child in rural Alberta, I got really good at being alone. I got really good at keeping my secrets—the secrets of my gender identity—because that was where my secrets were safest. I know that many youth are like that even today.

Having somebody, whether real or artificial, reach out and show some attention feels good. It feels validating. Then you start to seek it out. That's where the dangers lie. That might be your only place for that validation, which then leads to significant harms and violence. That is who I'm concerned about.

Anju Dhillon Liberal Dorval—Lachine—LaSalle, QC

Thank you so much, Ms. Panas.

Ms. Haugen, I think you wanted to jump in as well. Please go ahead.

11:30 a.m.

Advocate, Social Platforms Transparency and Accountability, As an Individual

Frances Haugen

I was going to say that I think “addiction” is maybe not the exact word. People form relationships. We form relationships with those we spend the most time with. In the case of the child who killed himself, it wasn't so much that he got addicted as that he fell in love with this person he talked to every day for hours and hours. The reality is that if you had intimate conversations where you always felt safe with someone and they always validated you, you might fall in love with them too.

It wasn't that the child didn't have any friends. I worry that sometimes we look at these issues around relationships with AI and say, “Oh, that person must be so pathetic. You'd only turn to an AI if you had nothing else.” His parents didn't know anything was wrong. All they found out was what they saw on his phone afterwards. He lamented that he would never be able to live a life with this person he had fallen in love with.

The Chair Liberal Lena Metlege Diab

Thank you very much for that.

Mr. Fortin, you have the floor for six minutes.

Rhéal Fortin Bloc Rivière-du-Nord, QC

Thank you, Madam Chair.

My first question is for Ms. Panas.

I understand that you've looked at Bill C‑63, which provides for the creation of three bodies, including an ombudsperson's office and a commission.

How do you assess the effectiveness of the complaint process with those organizations?

11:30 a.m.

Canadian Certified Inclusion Professional, As an Individual

Marni Panas

Right now, we do not have a meaningful process in place at all. We can't go online. There are no supports in place to go to, so we remain silent. We'll just withdraw from the tools and the platforms if we don't find safety there.

I think not having a place to go is a real problem. When we look at the processes that will be put in place, they're certainly better than leaving us to try to fend for ourselves on this complex issue. Most people with the experiences I've had don't even bother going through the platforms, and we have no other recourse.

Rhéal Fortin Bloc Rivière-du-Nord, QC

Do you think the complaint process provided for in Bill C‑63 is effective? I'm thinking of complaints made to the ombudsperson, for instance.

11:30 a.m.

Canadian Certified Inclusion Professional, As an Individual

Marni Panas

It will certainly be much more effective than what we have in place today.

Rhéal Fortin Bloc Rivière-du-Nord, QC

Ms. Haugen, could you tell us about the issue of violations of freedom of expression and privacy? We obviously agree that we need to better protect everyone, and especially our children, on digital platforms, but we must always keep in mind the problem of violations of freedom of expression. It's a bit of a juggling act, so to speak.

How do you see it? Does it go too far, does it go far enough, or should it go further?

How can we protect our freedom of expression and our right to privacy while protecting our children on digital platforms?

11:35 a.m.

Advocate, Social Platforms Transparency and Accountability, As an Individual

Frances Haugen

You have to be very careful when writing these laws. You can either write them from the perspective of saying platforms are responsible for disclosing risk and demonstrating progress to reduce risk, or you can write them to say that every instance of hate speech has a penalty. The challenge with the latter kind, where you say you have a zero-tolerance perspective—no hate speech—is that computers can't accurately and reliably identify what is okay and what is not okay. What it will mean is things like erasing trans people from the Internet, because the computer can't tell whether a comment is hateful. It would mean erasing religious minorities: Can you talk about religion confidently if there is a $40,000 violation for the platform?

As long as the law stays within the bounds of saying you must disclose what risks you believe exist and demonstrate your progress, that can be okay, but we need to be careful not to believe that we can erase hate from these platforms without accepting that we will also erase lots of legitimate speech, because the computers are just not that smart.

Rhéal Fortin Bloc Rivière-du-Nord, QC

I'll quickly read the definition of “intimate content” proposed in Bill C‑63:

(b) a visual recording…that falsely presents in a reasonably convincing manner a person as being nude or exposing their sexual organs or anal region or engaged in explicit sexual activity, including a deepfake that presents a person in that manner, if it is reasonable to suspect that the person does not consent to the recording being communicated.

That seems like a rather long definition that seeks to cover a number of areas. Maybe I wouldn't have done any better, so it's not really a criticism.

Do you think that's a good definition, or should it be amended differently?

11:35 a.m.

Advocate, Social Platforms Transparency and Accountability, As an Individual

Frances Haugen

The issue of non-consensual intimate partner images—some call this “revenge porn”—is an example where we accept that the computer will get it wrong and be more aggressive. It sometimes might take down an image of a person who looks very similar to that person. It's one of these questions around whether we accept false positives and false negatives. Having a broader definition for something like that is okay if the consequence is that a little bit of pornography disappears from the Internet.

For more controversial topics, relying on censorship is much harder, because the scope, the complexity and the diversity of ideas are so broad. It's much harder for computers to do than simply identifying whether this is the same person and the same image as what was reported. In that case, you're matching one-to-one instead of one-to-many.

Rhéal Fortin Bloc Rivière-du-Nord, QC

When it comes to deepfakes, do you think there are any effective measures we can take to combat this problem, and if so, what are they?

11:35 a.m.

Advocate, Social Platforms Transparency and Accountability, As an Individual

Frances Haugen

This is a great example of where safety by design and having disclosure on how platforms operate are so important, because it is unlikely that we will be able to reliably identify whether an image is real or a deepfake; the computers will keep getting better and better at making these. That means we need to, instead, ask questions about whether we are weaponizing the platforms or making ourselves vulnerable to people abusing these images.

The Chair Liberal Lena Metlege Diab

Thank you.

Mr. MacGregor, go ahead, please.

Alistair MacGregor NDP Cowichan—Malahat—Langford, BC

Thank you very much, Madam Chair.

I'd like to thank all the witnesses who have joined us today to help guide the committee through this study.

Ms. Panas, I'd like to start with you.

Thank you for showing up today and explaining just how your previous experience—your life experience—makes this approach a very important thing for our committee to consider. Often, when a party comes forward with a policy idea on regulating the Internet or online spaces, the first charge levelled against policy-makers is that they're taking away freedom of speech and freedom of expression, but I think you have quite clearly explained how, by not doing anything.... The status quo is actually affecting your freedom of expression right now.

I want to talk about this concept of a public space or the public square. When we're in a room, like we are right now, everyone has an equal voice and we can all hear each other equally, but in an online space, especially on social media platforms, the platform itself is not a passive bystander. It can actively promote content or actively suppress it, and it can direct people to certain dark corners of the Internet.

My other committee is the public safety committee. We've been looking at how our foreign adversaries make use of online platforms to spread disinformation, and there's quite a lot of overlap with the subject matter we're dealing with today. We've had witnesses at that committee talking not only about whether we need to take a law approach or a regulatory approach, but also about trying to instill a digital literacy strategy.

Do you have any thoughts on equipping Canadians with the skills they may need to navigate the online space?

11:40 a.m.

Canadian Certified Inclusion Professional, As an Individual

Marni Panas

Thank you so much for the question and for the comments of support.

Yes, it is scary being here. It's not scary because of you folks—you folks are pretty friendly—but I know the moment I leave this space.... I know the people who are watching me right now, and what it will mean to me online. It's terrifying, quite honestly.

This is such a complex issue. The Internet is so complex. Literacy is part of it. We need a multi-faceted approach to supporting this. We need education supports, but certainly online accountability as well.

You know, when I think about literacy, it's a really interesting word. X, for example, has banned the word “cis”, as in “cisgender”. It's a Latin word, essentially a biological and chemical term rooted in science, that has been banned because of what banning it implies: a denial of transgender people's existence. That's the whole purpose.

Literacy would rely on the platforms actually using language that is appropriate, rather than banning language, which serves to eliminate me from society. People need the literacy, but the platforms have to be held accountable for ensuring it.

Alistair MacGregor NDP Cowichan—Malahat—Langford, BC

Thank you very much.

Ms. Haugen, I'd like to turn to you for my next question. We're in a kind of legislative deadlock right now in the House of Commons. There's pretty much nothing getting done in our main chamber. It's been like that since the end of September. In fact, we don't even have Bill C-63 properly before this committee. This is a prestudy. It hasn't even passed second reading.

The fact of the matter is that this Parliament is rapidly running out of runway. Bill C-63 is still a long way away from the Governor General's desk. You have just talked about how rapidly this technology is evolving. It may be that we don't actually have a proper legislative approach to this problem for another two or three years.

What are some of the things a future Parliament has to take note of? We have this draft of Bill C-63, but what are some of the other things we may need to think of in a future piece of draft legislation?

11:40 a.m.

Advocate, Social Platforms Transparency and Accountability, As an Individual

Frances Haugen

One of the reasons I'm so excited about the approach Canada took was that you guys did more rounds of citizen assemblies than anyone else in the world did. You actually had conversations. Groups of Canadians went and argued about trade-offs on how to approach the Internet. This is what came out, other than the hate speech attachments that have been added on at the end. As a result, I think the bill overall is pretty resilient. It addresses a bunch of core things that need to be addressed.

The place where I would encourage you guys to be a little more open-minded or to do a little more future-proofing would be to ensure that the concept of what is a social platform is able to evolve. For example, virtual reality is easy to laugh at right now. If you go walk around Meta Horizon Worlds, which is Facebook's virtual reality space, it's overwhelmingly full of people under the age of 12. Age assurance is important for that reason. Those who talk to AI chatbots are overwhelmingly under the age of 18.

Think a little more expansively about what it means to be social, because children are starting to say.... Games are another space that effectively functions as a social network. As long as you're thinking a little bit more expansively about what's under the tent and the structure overall, and saying that we need to have a proactive duty of care and we need to care about transparency and these issues, that is what's important.

Alistair MacGregor NDP Cowichan—Malahat—Langford, BC

I have just a few seconds left in this round. Let me close by saying that, again, at my public safety committee, we have had witnesses who are complete and total experts in the AI space, and its rapid pace of development leaves them greatly concerned.

11:45 a.m.

Advocate, Social Platforms Transparency and Accountability, As an Individual

Frances Haugen

I would love to talk to your committee, because I worked in that space at Google.

Alistair MacGregor NDP Cowichan—Malahat—Langford, BC

We'll keep you in mind.

Thank you.

The Chair Liberal Lena Metlege Diab

Thank you very much.

Now we'll move to our second round.

We will go to Mr. Brock for five minutes, please.