Evidence of meeting #125 for Justice and Human Rights in the 44th Parliament, 1st Session. (The original version is on Parliament’s site, as are the minutes.)

A recording is available from Parliament.

On the agenda

MPs speaking

Also speaking

Carol Todd  Founder and Mother, Amanda Todd Legacy Society
Lianna McDonald  Executive Director, Canadian Centre for Child Protection
Barbie Lavers  As an Individual
Miranda Jordan-Smith  Executive, As an Individual
Tim McSorley  National Coordinator, International Civil Liberties Monitoring Group
Frances Haugen  Advocate, Social Platforms Transparency and Accountability, As an Individual

Rhéal Fortin Bloc Rivière-du-Nord, QC

Very well. I just want to make sure it was properly explained to you. I am not blaming you. Witnesses must be told how interpretation works beforehand, because it is important for all Canadians, both those who speak French and those who speak English, to be able to hear your testimony. It is part of my role to make sure everyone fully understands you, because your testimony is important and must be understood by everyone. That said, I am aware it’s not necessarily obvious, when it is the first time.

As I was saying earlier, I thank you for being with us. Your testimony is touching, like that from Ms. Todd and Ms. Lavers, who preceded you. We are aware of the seriousness of your daughter’s victimization. Rest assured we will keep it in mind throughout our work on Bill C‑63.

The question I was asking you—before we realized you were not hearing the interpretation—was on Bill C‑63. The minister announced he could divide it so that we can work more quickly on every aspect of it, especially the issue of online harm. What is the most urgent, in my opinion, is protecting our children, and I think most of us feel the same way.

What do you think about the idea of dividing Bill C‑63 in order to study the Online Harms Act and the issue of online hate separately?

12:40 p.m.

Executive, As an Individual

Miranda Jordan-Smith

For me, it's whatever is easiest to administer. If there are contentious components to Bill C-63, then I feel as though I'd capitulate to government folks who know how things are administered to extrapolate components of the Criminal Code or pieces that might be up for debate and then create other pieces of legislation that might work better within the system.

I guess that's all I can really say on that topic. I don't see an issue with them being separated, so long as they're effective and they work within the system.

Rhéal Fortin Bloc Rivière-du-Nord, QC

Thank you, Ms. Jordan‑Smith.

Bill C‑63 provides for the creation of the Digital Safety Commission of Canada, the position of Digital Safety Ombudsperson of Canada and the Digital Safety Office of Canada.

Are you aware of their respective roles? What do you have to say about them?

12:40 p.m.

Executive, As an Individual

Miranda Jordan-Smith

Yes. I've read about it. I suppose, speaking candidly, I look at that as being a function of government. I sort of rely on what would actually work systemically. Those are components that I just don't have the expertise in. I suppose my experience is more boots on the ground and a lived experience that was absolutely horrific. I can just outline what the issues are in our case and in cases that I've seen.

Rhéal Fortin Bloc Rivière-du-Nord, QC

Thank you, Ms. Jordan‑Smith. Excuse me for interrupting you. I do not have a lot of time left.

It was my understanding that the measure which seems most urgent to you is verifying users’ age on social media. Did I understand correctly? If not, could you specify which measure you think we should focus on?

12:40 p.m.

Executive, As an Individual

Miranda Jordan-Smith

To me, age verification seems like a natural given, because I see it already happening. The other piece is mandating that the providers or operators actually monitor these sites. Right now there's an issue with AI as well: it's not detecting certain words that are sexually charged, not just racial or violent terms. In the case of my daughter, the material we'd found on the platform she was attached to was still live up until a few months ago, when our lawyer had it removed. I think AI monitoring and age restriction and verification are key and fundamental to having some control over the Internet.

Rhéal Fortin Bloc Rivière-du-Nord, QC

Thank you.

Beyond age verification, are there other aspects of the issue we should look into? For example, should we go further and ask the manufacturers of electronic devices—such as computers, telephones and tablets—to add a mechanism for controlling what appears on them? Obviously, I agree there is also the issue of platforms, which I did not raise. However, when it comes to the equipment, such as computers and telephones, do you think something else should be done as well?

The Chair Liberal Lena Metlege Diab

Please be very brief, Madam, if you have a response to that.

12:45 p.m.

Executive, As an Individual

Miranda Jordan-Smith

I mean, I'm not opposed to it. I wouldn't have an issue with it. Again, I think you'll run into other Canadians who say it's an infringement. I look at it as all being focused around safety. My perspective is that I would have no problem with it, and I know other people who wouldn't.

The Chair Liberal Lena Metlege Diab

Thank you very much.

Mr. MacGregor, you have six minutes, please.

Alistair MacGregor NDP Cowichan—Malahat—Langford, BC

Thank you, Madam Chair.

I would like to welcome Mr. McSorley to the committee. He was a witness at my other committee, the public safety committee. We really appreciated that.

In your exchange with Ms. Rempel Garner, the subject of “algorithmic transparency” came up. It's a term that I am familiar with and am very much interested in. When people are posting online on these platforms, the platforms are not just passive bystanders. Their algorithms can both amplify and suppress. Algorithms can be very useful. They can direct people towards their interests; they can help make searching much more efficient, but they can also push people down to some very dark corners. I think over the last number of years we have seen the real-world results of that.

My colleague Peter Julian has come up with a bill, Bill C-292. I'm sure there's a variety of ways to approach this, but in terms of taking a more active role in promoting algorithmic transparency, how do you figure that fits into this subject matter that we're discussing today?

12:45 p.m.

National Coordinator, International Civil Liberties Monitoring Group

Tim McSorley

I think it is very important, because as we address different forms of harms, we need to look at modelling different approaches. That's why, in our comments, we're not proposing changes in terms of addressing child sexual abuse material or other things, but focusing specifically around national security and anti-terrorism concerns.

That said, in terms of algorithmic transparency, we think that it would be important to, overall, have a mandate for these platforms to have to be open about the development of their algorithms and what kind of information is being fed into them.

As we've argued in other places around the current artificial intelligence and data act, there need to be third party assessments to ensure that these algorithms are doing their job, not only in ensuring that they're efficient in what they're being asked to do but also in ensuring that there aren't negative repercussions. We know that already, with the use of artificial intelligence and algorithms, there have been documented cases of bias around age, gender and race, so it's important that there be openness, and that's something that's missing from Bill C-63.

Alistair MacGregor NDP Cowichan—Malahat—Langford, BC

Thank you.

In your opening statement, you were talking, I think, about how anything posted on social media could be viewed in one context as inciting terrorist violence but in another as perfectly acceptable speech. Can you elaborate on this and maybe provide an example?

12:45 p.m.

National Coordinator, International Civil Liberties Monitoring Group

Tim McSorley

Sure, definitely. I think that there are a few areas where we could look at that. We could look at that domestically, where there are individuals marching in the street for one cause—and we've been active on raising concerns about the characterization of any support for Palestinian human rights as being in support of terrorism—but a march for another issue that unfurls in the exact same way as the exact same call for an action would not be characterized that way. We think of the Occupy movement. There's a concern that, even though there's no direct call for violence, because of the stigma of simply labelling something as potentially being in support of terrorism, it could be viewed as a harm.

I'd like to expand on that, too, because one of our concerns is how this will be applied internationally. Some countries define human rights defenders in Egypt, or people resisting in Ukraine, as terrorists. How would the platforms be expected to decide how to monitor all that? If it were limited simply to incitement to violence, without that subjective decision-making around it, it would be a lot clearer for the platforms. It would be clearer for the audience, and it would also be easier to challenge before the digital safety commission if there were any issues.

Alistair MacGregor NDP Cowichan—Malahat—Langford, BC

In other words, as legislators, we need to very much pay attention to the subjective interpretations of the laws that we are proposing. That is very well taken.

We've also had a lot of conversations about how we want platforms to take more responsibility for the content, but do you have concerns at all about platforms proactively monitoring all content and how they would deal with the collection and retention of private information? Can you elaborate on those concepts for the committee?

12:50 p.m.

National Coordinator, International Civil Liberties Monitoring Group

Tim McSorley

One of the concerns we had with the original iteration of this bill was that it would have mandated platforms to monitor essentially all content that was going up. It no longer does that, but we're concerned that it doesn't stop them from doing so either. The reason is that if they were to do that, they would by default have to rely almost entirely on algorithmic decision-making, and, as we said, transparency around that is not included in the bill. It would almost by default result in overmoderation: they would have to lean towards taking down content and dealing with it later, rather than defining it narrowly.

In some cases, for child sexual abuse material, there are hashes and things that can be used to specifically identify particular types of content, and that would avoid having to monitor all online content, but that's missing from this bill in terms of an obligation for the platform to not engage in that activity.

Alistair MacGregor NDP Cowichan—Malahat—Langford, BC

Thank you.

The Chair Liberal Lena Metlege Diab

Thank you very much, Mr. MacGregor.

We still have a bit of time, and I'm going to shorten the time frame.

I will divide the next rounds as follows: the Conservatives will have three minutes, the Liberals three minutes, the Bloc Québécois a minute and a half, and the New Democratic Party a minute and a half.

Ms. Rempel Garner, you have three minutes.

12:50 p.m.

Conservative

Michelle Rempel Conservative Calgary Nose Hill, AB

Thank you, Madam Chair.

I'm going to go back to you, Ms. Jordan-Smith, to pick up on a line of questions from my colleague Mr. Fortin.

He asked you if you knew what the regulators did, and I think you gave a very succinct answer. You said that would be up to the government. It's concerning to me, though, that you don't know what they do. I'm not saying that pejoratively; I'm saying it from the perspective of a parent who's gone through so much loss. I feel that the stated goal of Bill C-63 is for you to know what protections you have upon its passage, but they don't exist, because all it does is create a regulator where there's no guarantee that the protections that you're asking for are going to be legislated by Parliament.

In that, my preference would be that Parliament legislate that duty of care immediately, so that either law enforcement or existing regulatory bodies could take action immediately.

Does that make sense to you?

12:50 p.m.

Executive, As an Individual

Miranda Jordan-Smith

Yes, I think it needs to happen, because right now we have nothing.

12:50 p.m.

Conservative

Michelle Rempel Conservative Calgary Nose Hill, AB

Exactly.

There is a part of Bill C-63, in proposed section 4, where it talks about enhancing reporting requirements. Some of my colleagues have suggested that we need a regulator to do that. In the bill itself, it says that these reporting requirements would go to a law enforcement body that already exists.

Would you support those provisions that are enhancing laws that already exist and that would go through law enforcement? Is that perhaps what the government should be focusing on while also ensuring that there's a legislative duty of care, so that if one of us asked you again whether you know what this law does or what protections you're afforded, you'd be able to answer that with a degree of certainty that brought you some peace in your heart?

12:50 p.m.

Executive, As an Individual

Miranda Jordan-Smith

Just to clarify, what I said was that I don't know how it would be administered. I'm not ignorant of the bill, but I feel as though there has to be something in place that guides the citizens of Canada on how to engage with the Internet as well as providers. That's where I feel as though the bill is strong.

When we talk about how it's carried out, or whether there are ombudsmen or different regulating bodies, that's the piece where, I fully admit, I don't know how it would function. That could be a weakness.

12:50 p.m.

Conservative

Michelle Rempel Conservative Calgary Nose Hill, AB

With my last 30 seconds, I want to thank you for all your work on this. I'd like to follow up with you after this meeting. I would like to send you a copy of Bill C-412, which actually specifies that with great certainty. It's a bill before Parliament that we could pass today to actually get these protections, with some certainty, for parents like you. I think that's what we all want to do here. We don't want to wait another two or three years.

Thank you.

The Chair Liberal Lena Metlege Diab

Thank you very much.

Now, for the three minutes, I'm going to go to Madame Brière, please.

Élisabeth Brière Liberal Sherbrooke, QC

Thank you, Madam Chair.

I'd like to ask Ms. Lianna McDonald a question, if she's still online.