Evidence of meeting #125 for Justice and Human Rights in the 44th Parliament, 1st Session. (The original version is on Parliament’s site, as are the minutes.)

A recording is available from Parliament.

Also speaking

Carol Todd  Founder and Mother, Amanda Todd Legacy Society
Lianna McDonald  Executive Director, Canadian Centre for Child Protection
Barbie Lavers  As an Individual
Miranda Jordan-Smith  Executive, As an Individual
Tim McSorley  National Coordinator, International Civil Liberties Monitoring Group
Frances Haugen  Advocate, Social Platforms Transparency and Accountability, As an Individual

12:15 p.m.

National Coordinator, International Civil Liberties Monitoring Group

Tim McSorley

Okay—I'll finish up here.

There are several examples, but one I'll share is that the definition of “incitement to terrorism”, in fact, includes only “actively encourages” an act. There's no definition of what “actively encourages” includes. It's a much lower threshold than “incitement”, and, in fact, goes against what is already in the Criminal Code, which punishes either “instructing” or “counselling”—

The Chair Liberal Lena Metlege Diab

Thank you very much. We'll get back to you with the questions.

12:15 p.m.

National Coordinator, International Civil Liberties Monitoring Group

Tim McSorley

Thank you.

The Chair Liberal Lena Metlege Diab

Now we have Madam Haugen for up to five minutes, please.

Frances Haugen Advocate, Social Platforms Transparency and Accountability, As an Individual

Hello. Thank you for inviting me. I was invited only yesterday, so unfortunately you'll have to listen to my stream of consciousness.

I want to address three main issues. The first is with regard to what can be known and what is unknown. You've probably heard lots of things in the media around what the damages and risks are to kids. I thought the testimony regarding Miranda's child was incredibly compelling. There are many, many stories like this.

The Chair Liberal Lena Metlege Diab

Ms. Haugen, please speak closer to your microphone. Apparently, they're not able to hear you well.

12:20 p.m.

Advocate, Social Platforms Transparency and Accountability, As an Individual

Frances Haugen

Everything was fine on the audio check yesterday.

Is this any better?

The Chair Liberal Lena Metlege Diab

Yes. Please continue.

12:20 p.m.

Advocate, Social Platforms Transparency and Accountability, As an Individual

Frances Haugen

Good. I'm sorry. I was invited only yesterday, and none of the headphones on the approved list were available in Puerto Rico.

You can go and read all these horrific accounts of the impacts on kids, but the thing I want to emphasize is what can be known and what cannot be known today. There are lags in the impact of when we see these effects. We look at 16-year-olds today and we say that we know what the harms are of social media, but the 16- and 17-year-olds today came online at 12 and 13.

Rhéal Fortin Bloc Rivière-du-Nord, QC

Madam Chair, there is no interpretation.

The Chair Liberal Lena Metlege Diab

Okay. I will suspend for a moment while we try to figure something out in the room.

Panellists, give me a moment, please.

12:20 p.m.

Advocate, Social Platforms Transparency and Accountability, As an Individual

Frances Haugen

I'm sorry. I was using the exact same set-up a couple of days ago on CNN, and it was fine, so I don't really know what to change.

Is that a little better?

The Chair Liberal Lena Metlege Diab

Ms. Haugen, they will have to give you a call and reschedule you to attend at a different time. We have members in the room who do not speak English. They require interpretation, and that is absolutely—

Rhéal Fortin Bloc Rivière-du-Nord, QC

The point is not that people don't speak English. The point is that Parliament is bilingual. We work in both official languages.

The Chair Liberal Lena Metlege Diab

Exactly. We must speak in both official languages. Unfortunately, interpretation cannot be offered because of the equipment you have.

They will give you a call and reschedule you for another time that is convenient. If you would like to stay and listen, that's okay, but you don't have to. I apologize for that.

We will continue with the panellists we have in the time we have left.

Madam Rempel Garner, you have six minutes, please.

12:20 p.m.

Conservative

Michelle Rempel Conservative Calgary Nose Hill, AB

Thank you, Chair.

Ms. Jordan-Smith, thank you for your testimony today and for your courage in speaking out on this issue.

One thing that struck me about your testimony was that you talked about how your daughter was victimized through a platform that you weren't even aware she was using. It strikes me that in order to have a duty of care that would address the fact that technology changes all the time—there will always be some new platform that kids are on—we need to have a very clear but also broad definition of who, or what, a duty of care would apply to. It can't just be Meta or a couple of the known players, can it?

I've been giving some thought to what that could mean. I tend towards having a broader term. The term I would like to use is something like “online operator”, which would mean the owner or operator of a platform, such as a service online or application that connects to the Internet, or that is used or that reasonably could be expected to be used by a minor, including a social media service and an online video gaming service, so that it's very clear that as new platforms come up in the future, as technology changes, you as a parent aren't having to guess whether or not your child is being exposed to a platform that might not be covered by the law.

Would you support that type of recommendation?

12:20 p.m.

Executive, As an Individual

Miranda Jordan-Smith

I would support it, because I think it's the best way to capture it. It would encapsulate all types of online activity, and I think that's what is important.

12:20 p.m.

Conservative

Michelle Rempel Conservative Calgary Nose Hill, AB

Thank you.

Then, bridging from the who or the what to what they're responsible for, I'd like to very briefly suggest some things that online platforms or operators should be responsible for: a significant duty of care to prevent physical harm or incitement of such harm, including online bullying and harassment; online sexual violence against a minor, including any conduct directed at a minor online; the creation or dissemination of imagery that is sexually exploitative, humiliates them, is harmful to their dignity or invades their privacy; the promotion and marketing of products or services that are currently unlawful for minors; and patterns that indicate or encourage addiction-like behaviour.

Would you say we're on the right track there in terms of looking at the scope of things an online operator would have to ensure that minors were not subjected to?

12:25 p.m.

Executive, As an Individual

Miranda Jordan-Smith

I think, minimally, that would be ideal.

A few additional thoughts would be that, again, there should be age restrictions and that there is a responsibility on tech companies to identify who their users should be. My daughter was on a platform that didn't have any age restrictions, so to me, that's completely irresponsible.

12:25 p.m.

Conservative

Michelle Rempel Conservative Calgary Nose Hill, AB

I'm glad you brought this up, because it was actually my next question. It's a question for both you and Mr. McSorley.

The government, in Bill C-63, has not thought about age verification at all. It's punting this to a regulator that has not yet been created, and that's going to be two or three years down the road.

Witnesses on the other panel have suggested that age verification can be done right now through algorithms, and I agree with that. You can detect someone's age using an algorithm. If Meta knows somebody wants to buy a KitchenAid spatula, it knows how old they are.

I'm wondering, between the two of you, if the way that we should be squaring the circle on age verification to protect personal information, while also ensuring that minors are not subjected to harm, is by requiring online operators to use algorithms or other technological means to determine age within a degree of accuracy.

Does that make sense to you, Ms. Jordan-Smith?

12:25 p.m.

Executive, As an Individual

Miranda Jordan-Smith

I think so. I think they need to determine an appropriate age for the users of their platform.

Verification seems to me like a normal thing that is already happening online, where some providers are self-regulating. As an example, to be verified on LinkedIn, you have to upload your driver's licence. For me to take a course at Oxford, I had to upload my passport so they could verify my identity, that I'm actually the person taking that course. I don't see it as a huge deal.

Michelle Rempel Conservative Calgary Nose Hill, AB

I'd just like to get to Mr. McSorley quickly on that.

Does that seem like a way to square the circle?

12:25 p.m.

National Coordinator, International Civil Liberties Monitoring Group

Tim McSorley

I think it would be.

One thing that we've raised in our brief, and I think others will raise, is that there's a lack of requirements for algorithmic transparency from social media platforms in the bill. If that were integrated, I think that would answer lots of the concerns.

12:25 p.m.

Conservative

Michelle Rempel Conservative Calgary Nose Hill, AB

Now, do both of you think this should be something that is put in a legislated duty of care as opposed to being punted off to a regulator where parliamentarians would have no say—or you would have no say—in what that looked like? Does that make sense to you, Mr. McSorley?

12:25 p.m.

National Coordinator, International Civil Liberties Monitoring Group

Tim McSorley

I'm not enough of an expert on where the technology is at right now to say whether or not it should be legislated or what the best approach would be.