Evidence of meeting #125 for Justice and Human Rights in the 44th Parliament, 1st Session. (The original version is on Parliament's site, as are the minutes.)

A recording is available from Parliament.

Witnesses

Carol Todd  Founder and Mother, Amanda Todd Legacy Society
Lianna McDonald  Executive Director, Canadian Centre for Child Protection
Barbie Lavers  As an Individual
Miranda Jordan-Smith  Executive, As an Individual
Tim McSorley  National Coordinator, International Civil Liberties Monitoring Group
Frances Haugen  Advocate, Social Platforms Transparency and Accountability, As an Individual

11:35 a.m.

Founder and Mother, Amanda Todd Legacy Society

Carol Todd

Thank you for allowing me to do this.

I will continue about why I feel that Bill C-63 is important.

I also want to say that we aren't the only country that has done this. The U.K. has an Online Safety Act that was established and written into law in 2023, and Australia had the Online Safety Act put into law in 2021. Also, the EU has an online harms act that is similar to what Canada is doing. Canada has been in collaboration with the U.K., Australia and the EU regarding Bill C-63.

Why is this important? It's important because it protects children. What I don't understand—and this is from my own thinking—are all the people who are negative on Bill C-63, saying that it's not about children and it's not about protection. They focus on the parts that Minister Virani has said he and his cabinet would rewrite. It is about protecting children. It's about protecting children and families from the online behaviours of others.

We can't do this without the tech companies' help. It's really important that we understand this. There are so many people who don't understand this. I read the negative comments, and, personally, it just infuriates me, because my daughter died 12 years ago, and I've waited 12 years for this to happen. Parliamentarians and political groups are arguing about this not being necessary, and we're going.... It just hurts me. It hurts me as a Canadian.

We need accountability and transparency. We need to support the victims. Passing Bill C-63 is not just about regulation; it's about taking a stand for the safety and dignity of all Canadians. This is about ensuring that our digital spaces are as safe and respectful as our physical ones.

By supporting this bill, we are committing to a future in which the Internet is a place of opportunity and connection, free from threats of harm and exploitation. Passing Bill C-63 would demonstrate the federal government's commitment to adapting to the digital age and ensuring that the Internet remains a safe space for all users. It balances the need for free expression with the imperative to protect individuals from harm, making it a necessary and timely piece of legislation.

It's also essential to recognize the collective effort in creating platforms that address the challenges faced by children, women and men.

We've come to realize that what happened to Amanda could happen to anyone. As Amanda herself said, “Everyone has a story.” When these stories emerge, and they belong to your child, your relatives or your grandchildren, they carry more weight.

No one is immune to becoming a statistic, and, as I have previously shared, I have waited 12 years for this, because on day one of Amanda's death, I knew things needed to change in terms of law, legislation and online safety. I can't bring my child back, but we can certainly keep other children safe.

Thank you for this time.

Rhéal Fortin Bloc Rivière-du-Nord, QC

Thank you, Ms. Todd.

Ms. Lavers, I offer you the remaining minute to add your comments, should you have any.

11:40 a.m.

As an Individual

Barbie Lavers

Thank you.

I think what I would like to say is that our children are so precious, and I would ask you as a committee to go home and hug your children, your nieces, your nephews and your grandchildren and just think about what Carol and I and so many other parents have had to endure because of unsafe social media platforms. Just take that home with you and really think about it, because Harry and Amanda could still be here with us if this conversation were not necessary.

Rhéal Fortin Bloc Rivière-du-Nord, QC

I thank you all. The members of our committee will think a lot about Harry and Amanda.

Thank you, Madam Chair.

The Chair Liberal Lena Metlege Diab

Thank you, Mr. Fortin.

Now we will conclude with Mr. MacGregor, please.

Alistair MacGregor NDP Cowichan—Malahat—Langford, BC

Thank you very much, Madam Chair.

I'd like to echo colleagues in thanking the witnesses for joining our committee and helping us wade through a very difficult subject. I'm a father of three daughters. I have 12-year-old twins, so we are dealing with that as parents, with them getting access to the Internet, and the challenges of finding ways to allow them to do that safely.

Ms. Todd and Ms. Lavers, I'd like to start with you, because part of the debate on the subject of Bill C-63 has been on whether we should just modernize existing laws and changes to the Criminal Code or whether we should add another layer of bureaucracy.

Briefly, when you had your experiences in reporting this to the police and when the police were trying to make use of existing Criminal Code provisions to solve this for your children, can you talk about some of the limitations you experienced with that and illustrate why you think more is needed based on your personal experiences?

11:40 a.m.

Founder and Mother, Amanda Todd Legacy Society

Carol Todd

Do you want to go first, Barbie?

11:40 a.m.

As an Individual

Barbie Lavers

Sure. Thanks, Carol.

In our experience, the RCMP worked with the FBI in the United States, but tracking down the IP address of who had contacted Harry was difficult. When they did track it down, it was basically like a call centre type of set-up, and people worked there to extort and sextort. This is a job, just as if they were working at Bell Aliant and taking calls, but they're calling out, and they search for people.

I don't think that just having the Criminal Code is enough, as Lianna said. I think there have to be stronger guidelines and regulations in order to hold these companies accountable, because they could do it now if they wanted; they have the ability. I have no doubt in my mind that they do, but they don't want to do it, because they use the algorithms that they have to make money and not to keep people safe.

Alistair MacGregor NDP Cowichan—Malahat—Langford, BC

Thank you.

I just want to be sure Ms. Todd can get in, because I also want to direct a question to Ms. McDonald.

Please go ahead, Ms. Todd.

11:45 a.m.

Founder and Mother, Amanda Todd Legacy Society

Carol Todd

My thinking as a teacher-educator—and I speak to parents, teachers and communities—is that there's an aspect of prevention, intervention and reaction, and legislation becomes a reactionary phase: “Something's happened, and what are we going to do next?” We need more prevention and intervention.

When I first had to report when this was happening to Amanda, and I reported it to our local RCMP, it was a very challenging and difficult situation. You have to remember that all this started 14 years ago, two years prior to her death. It came back to me that they couldn't find the IP address coming out of the States. It was under a VPN, and they couldn't find anything. This was when she was alive.

After she died, through an investigation in the Netherlands and the U.K., they found an IP address for a fellow who was victimizing other young girls, and this happened to be Amanda's predator. Through information found on Facebook, Amanda's name popped up under the account that she had. Ultimately, the Dutch police contacted the Canadian RCMP, and that's how Amanda's predator got caught.

Things have changed in the last 12 years, and I understand that, but there needs to be more incentive for law enforcement to take on these cases.

Alistair MacGregor NDP Cowichan—Malahat—Langford, BC

Thank you.

11:45 a.m.

Founder and Mother, Amanda Todd Legacy Society

Carol Todd

Not all cases will go to court.

Alistair MacGregor NDP Cowichan—Malahat—Langford, BC

Thank you very much for that, and just because I'm running out of time, I would like to get to you, Ms. McDonald.

On your website, your organization has a statement that “exclusion of private messaging features...leaves a substantial threat to children unaddressed.”

I'm curious about how we approach this, because, of course, there are great privacy concerns in place now. My 12-year-olds are using children's messenger, so we have full control over their contact list, and, in fact, the parents of their friends also have full control, so we have a lot of oversight.

In what ways would you like the law to be crafted to address what you think is a glaring omission in this bill while still respecting the very real privacy concerns that have been raised with the potential of such an approach?

The Chair Liberal Lena Metlege Diab

Give a brief response if possible. There are 40 seconds left.

11:45 a.m.

Executive Director, Canadian Centre for Child Protection

Lianna McDonald

Just to make the point clear, yes, we are concerned that private messaging is not brought into scope. I think the concern is that we see many of these organized crime groups targeting Canadian children in their own homes and bedrooms and basically moving over to these types of applications.

Our organization has produced a paper on our site that outlines how we can capture some of the metadata without capturing the direct communication. There are a number of ways and opportunities for us to build that in. Certainly that is something that we will continue to raise.

The Chair Liberal Lena Metlege Diab

Thank you very much.

Thank you to our witnesses. Normally we would go for another few minutes but, unfortunately, the bells are ringing, and it's probably going to take us another 30 minutes before we start.

We're going to suspend, and while we're suspended we'll run technical tests with the second panel's witnesses. Then we'll come back for our second panel session.

James Maloney Liberal Etobicoke—Lakeshore, ON

Why don't we continue with this panel after the...? I mean, it's....

11:45 a.m.

Michelle Rempel Conservative Calgary Nose Hill, AB

It's up to you.

James Maloney Liberal Etobicoke—Lakeshore, ON

There are 17 or 18 minutes.

The Chair Liberal Lena Metlege Diab

I'm going to suspend for now, though.

The Chair Liberal Lena Metlege Diab

I am now going to start the process.

Ms. Haugen, you will get a phone call from the clerk or somebody from the room regarding interpretation, if you don't mind answering that.

I will welcome our witnesses. We have two witnesses by video conference and one in the room.

We have Madam Frances Haugen, advocate, social platforms transparency and accountability; and we have Madam Miranda Jordan-Smith, executive, both by video conference.

With us in the room, from the Coalition pour la surveillance internationale des libertés civiles (International Civil Liberties Monitoring Group), we have Mr. Tim McSorley, national coordinator.

Please wait, each of you, until I recognize you by name before speaking.

For those participating by video conference, please ensure that you have selected, on the bottom of your screen, the language of your choice, because questions will be coming in both languages.

I also ask that you wait to be asked to speak, whether you're a member or a witness, and that you go through the chair.

I will now ask Madam Miranda Jordan-Smith to please commence.

You have up to five minutes.

Miranda Jordan-Smith Executive, As an Individual

Thank you for having me here today.

As mentioned, my name is Miranda. I'm here today to represent the astronomical and increasing number of victims who have been subjected to online harm. Please allow me to share with you the story of my daughter's abuse.

At the age of 12, my daughter had a cellphone, which we ensured was equipped with parental controls. She was not on social media at all. Her screen time was limited, and her contacts needed to be approved. Her father could see all of the activity on her phone.

Therefore, it was shocking to us to learn that our daughter, at the age of 12, could be groomed and manipulated online on a school device that carried a music platform that did not have any age restrictions. It had a chat function, like many, and it was not monitored adequately by the tech provider to detect the online predator she was speaking to. For one year, she was groomed by an online predator, who presented as a peer.

In June 2022, at the age of 13, she was abducted right beside her school by the predator, a 40-year-old man. When my daughter did not arrive home on the school bus, I reported her as missing.

From there, a full-scale search for her ensued, with volunteer crews on the ground, knocking on doors and putting up posters. The police in Edmonton merged their historical crimes, missing persons, cybercrime and human trafficking divisions in the hope that our daughter would be found safe.

For days, we had sleepless and tearful nights, wondering what happened to her. We engaged the media heavily, and our appeals made international news, with the New York Post and the U.K.'s Guardian.

After a week of our daughter missing, I woke to officers at our door, knowing that they had an update. We knew that either they had found her alive or our daughter would be returned to us in a body bag.

Naturally, we were overjoyed to learn that our daughter was found. The FBI had seized her from a hotel room in Portland, Oregon, and she was being held at a children's hospital there, where they administered a rape kit and an assessment of her abuse. Immediately, we jumped on a plane to retrieve her from Portland, and we brought her home.

While the criminal case is still pending, with a federal trial date set for January 13, 2025, the abuse that my daughter suffered is unbearable, impossible to comprehend. Her perpetrator faces 70 to 77 years in prison for a litany of crimes, some of which include kidnapping, rape, sodomy, putting a child on display, possessing and developing child pornography, and crossing an international border with sexual intent.

My daughter was stuffed into the perpetrator's trunk, and this act alone could have killed her.

For the last two years, my family has been on a healing journey. The pain and the damage of these horrific events are complex and largely irreparable. We are learning to coexist with it.

Today I appeal to you to understand the damage of an unregulated Internet and what it creates. Tech companies need to be held accountable and ensure they are acting in a legal and ethical manner. The online harms bill is a step in the right direction.

While I know that some people feel regulation is an infringement on one's freedom of speech or privacy, I must tell you that my family has no privacy and no anonymity. Everyone knows who we are now, and we have to live with judgment or misconceptions around, "This could not happen to my child," or that our daughter is somehow gullible, or that she comes from a poor socio-economic background, none of which is true.

I often think about regulation. To drive a car, one needs a licence. To fish or hunt, one needs a licence. To go into a porn shop and access pornographic material, one must produce identification. Why is the Internet not regulated the same way, so that users have to verify who they are?

I think it's time for online reform in Canada; otherwise, more children will become victims. The impact is great for families and communities across the country. Already, the U.K. has progressive legislation, and Australia just passed regulation requiring social media users to be at least 16.

I appeal to you today, as members of Parliament, to make changes that will have a profound and lasting impact for the citizens of Canada, because it is my position and my lived experience that no child is safe on the web. If this can happen to us, it could happen to anyone.

Thank you.

The Chair Liberal Lena Metlege Diab

Thank you very much.

Mr. McSorley, you have up to five minutes, please.

Tim McSorley National Coordinator, International Civil Liberties Monitoring Group

Thank you very much, Chair.

Thank you to the committee for this invitation to speak to Bill C-63.

I'm grateful to be here on behalf of the International Civil Liberties Monitoring Group, a coalition of 44 Canadian civil society organizations that work to defend civil liberties in the context of national security and anti-terrorism measures.

The provisions of this bill, particularly in regard to part 1 of the online harms act, are vastly improved over the government's original 2021 proposal, and we believe that it will respond to urgent and important issues. However, there are still areas of serious concern that must be addressed, especially regarding undue restrictions on free expression and infringement on privacy.

This includes, in part 1 of the act, the following concerns. First, the overly broad definition of the harm of "content that incites violent extremism or terrorism" will lead to overmoderation and censorship. Further, given the inclusion of the online harm of "content that incites violence", it is redundant and unnecessary.

Second, the definition of "content that incites violence" itself is overly broad and will lead to content advocating protest being made inaccessible on social media platforms.

Third, the act fails to prevent platforms from proactively monitoring, essentially surveilling, all content uploaded to their sites.

Fourth, a lack of clarity in the definition of what is considered “a regulated service” could lead to platforms being required to break encryption tools that provide privacy and security online.

Fifth, proposed requirements for platforms to retain certain kinds of data could lead to the unwarranted collection and retention of the private information of social media users.

Finally, sixth, there has been little consideration of how this law will inhibit the access of Canadians and people in Canada to content shared by people in other countries.

Briefly, on part 2 of the act, this section amends Canada's existing hate-crime offences and creates a new stand-alone hate crime offence, and it is only tangentially related to part 1. It has raised serious concerns among human rights and civil liberties advocates in regard to the breadth of the offences and the associated penalties. We've called for parts 2 and 3 to be split from part 1 in order to be considered separately, and we're very pleased to see the government's announcement yesterday that it intends to do just that.

I'd be happy to speak to any of these issues during questions, and I've submitted a more detailed brief to the committee with specific amendments on these issues. However, I'd like to try to focus in the time I have on the first two points that I've made regarding “content that incites violent extremism or terrorism”, as well as a definition of “content that incites violence”.

The harm of “content that incites violent extremism or terrorism” is problematic for three reasons and should be removed from the act. First, it is redundant and unnecessary. The definitions of “content that incites violent extremism or terrorism” and “content that incites violence” are nearly identical, the major difference being that the first includes a motivating factor for the violence it is attempting to prevent. These two forms of harms are also treated the same throughout the online harms act, including requirements for platforms to retain information related to these harms for a year to aid in possible investigations.

Moreover, and maybe most importantly, incitement to violence alone would clearly capture any incitement to violence that arises from terrorist or extremist content. Further definition of what motivates the incitement to violence is unnecessary.

Second, if included, incitement to terrorism will result in the unjustified censorship of user content. “Terrorism”, and with it “extremism”, are subjective terms based on interpretation of the motivations for a certain act. The same opinion expressed in one context may be viewed as support for terrorism and therefore violent, while, in another, it may be viewed as legitimate and legally protected political speech.

Acts of dissent become stigmatized and criminalized not because of the acts themselves but because of the alleged motivation behind the acts. As we have seen, this leads to unacceptable incidents of racial, religious and political profiling in pursuit of fighting terrorism.

Studies have also extensively documented how social media platforms already overmoderate content that expresses dissenting views under the auspices of removing “terrorist content”. The result is that, by including terrorism as a motivating factor for posts that incite violence, the act will be biased against language that is not, in fact, urging violence but is seen as doing so because of personal or societal views of what is considered terrorism or extremism.

I note also that “extremism” is not defined in Canadian law. This ties into the third key part that we're concerned about, and that's that parts of the language used in this definition are undefined in Canadian law or the Criminal Code. This contradicts the government's main justification for all seven harms—that they align with the Criminal Code and do not expand existing offences.

The Chair Liberal Lena Metlege Diab

You have 30 seconds.