An Act to enact the Online Harms Act, to amend the Criminal Code, the Canadian Human Rights Act and An Act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service and to make consequential and related amendments to other Acts

Sponsor

Arif Virani (Liberal)

Status

Second reading (House), as of Sept. 23, 2024

Summary

This is from the published bill. The Library of Parliament has also written a full legislative summary of the bill.

Part 1 of this enactment enacts the Online Harms Act, whose purpose is to, among other things, promote the online safety of persons in Canada, reduce harms caused to persons in Canada as a result of harmful content online and ensure that the operators of social media services in respect of which that Act applies are transparent and accountable with respect to their duties under that Act.
That Act, among other things,
(a) establishes the Digital Safety Commission of Canada, whose mandate is to administer and enforce that Act, ensure that operators of social media services in respect of which that Act applies are transparent and accountable with respect to their duties under that Act and contribute to the development of standards with respect to online safety;
(b) creates the position of Digital Safety Ombudsperson of Canada, whose mandate is to provide support to users of social media services in respect of which that Act applies and advocate for the public interest in relation to online safety;
(c) establishes the Digital Safety Office of Canada, whose mandate is to support the Digital Safety Commission of Canada and the Digital Safety Ombudsperson of Canada in the fulfillment of their mandates;
(d) imposes on the operators of social media services in respect of which that Act applies
(i) a duty to act responsibly in respect of the services that they operate, including by implementing measures that are adequate to mitigate the risk that users will be exposed to harmful content on the services and submitting digital safety plans to the Digital Safety Commission of Canada,
(ii) a duty to protect children in respect of the services that they operate by integrating into the services design features that are provided for by regulations,
(iii) a duty to make content that sexually victimizes a child or revictimizes a survivor and intimate content communicated without consent inaccessible to persons in Canada in certain circumstances, and
(iv) a duty to keep all records that are necessary to determine whether they are complying with their duties under that Act;
(e) authorizes the Digital Safety Commission of Canada to accredit certain persons that conduct research or engage in education, advocacy or awareness activities that are related to that Act for the purposes of enabling those persons to have access to inventories of electronic data and to electronic data of the operators of social media services in respect of which that Act applies;
(f) provides that persons in Canada may make a complaint to the Digital Safety Commission of Canada that content on a social media service in respect of which that Act applies is content that sexually victimizes a child or revictimizes a survivor or intimate content communicated without consent and authorizes the Commission to make orders requiring the operators of those services to make that content inaccessible to persons in Canada;
(g) authorizes the Governor in Council to make regulations respecting the payment of charges by the operators of social media services in respect of which that Act applies, for the purpose of recovering costs incurred in relation to that Act.
Part 1 also makes consequential amendments to other Acts.
Part 2 amends the Criminal Code to, among other things,
(a) create a hate crime offence of committing an offence under that Act or any other Act of Parliament that is motivated by hatred based on certain factors;
(b) create a recognizance to keep the peace relating to hate propaganda and hate crime offences;
(c) define “hatred” for the purposes of the new offence and the hate propaganda offences; and
(d) increase the maximum sentences for the hate propaganda offences.
It also makes related amendments to other Acts.
Part 3 amends the Canadian Human Rights Act to provide that it is a discriminatory practice to communicate or cause to be communicated hate speech by means of the Internet or any other means of telecommunication in a context in which the hate speech is likely to foment detestation or vilification of an individual or group of individuals on the basis of a prohibited ground of discrimination. It authorizes the Canadian Human Rights Commission to deal with complaints alleging that discriminatory practice and authorizes the Canadian Human Rights Tribunal to inquire into such complaints and order remedies.
Part 4 amends An Act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service to, among other things,
(a) clarify the types of Internet services covered by that Act;
(b) simplify the mandatory notification process set out in section 3 by providing that all notifications be sent to a law enforcement body designated in the regulations;
(c) require that transmission data be provided with the mandatory notice in cases where the content is manifestly child pornography;
(d) extend the period of preservation of data related to an offence;
(e) extend the limitation period for the prosecution of an offence under that Act; and
(f) add certain regulation-making powers.
Part 5 contains a coordinating amendment.

Elsewhere

All sorts of information on this bill is available at LEGISinfo, an excellent resource from the Library of Parliament. You can also read the full text of the bill.

December 12th, 2024 / 11:45 a.m.


Canadian Certified Inclusion Professional, As an Individual

Marni Panas

Look, to even get to a situation where I have the courts involved and police involved would require me—a person who is already unsafe online, a person who is already facing enormous costs just in being visible—to have to report that, to have to be believed by the police in the first place that these things are happening, and to have to address all the biases that are inherently built in law enforcement against trans people. I'm more likely to just do nothing and probably withdraw. That is the consequence. You can give them all the tools they want, but that requires reporting and that requires people to believe and to have a safe process.

Bill C-63 provides means for us to be able to do that in a way where I feel I would be believed for the first time, I would be supported for the first time and I would find some avenue to get that far.

By the time it's gone to the police—

Larry Brock Conservative Brantford—Brant, ON

Okay.

I'll start with you, Ms. Panas. I listened very carefully to your opening statement. You reiterated in some of the questions put to you that ultimately you feel safe in this environment, but the same cannot be said when you actually leave this building. You talked about various avenues of online harassment.

Let's face it: That's the reality Canadians are facing. It's not necessarily just children and teenagers. It's also adults. There is a legal definition of criminal harassment in the Criminal Code of Canada, but what's sadly lacking in the Criminal Code of Canada is provisions to deal with online harassment. Sadly—and this is a direct indictment against the Liberal government—Bill C-63 contains no provisions at all that deal with online harassment. Bill C-412 does. I don't know if you've had a chance to dive into Bill C-412 to take a look at the provisions that deal with online harassment.

The question I put to you, Ms. Panas, is this: Do you think law enforcement and judges should have more tools to provide “no contact” orders for criminal harassment online? Do you think that's a good idea?

Larry Brock Conservative Brantford—Brant, ON

Thank you, Chair.

I thank the witnesses for their attendance. I echo the commentary of my colleague Ms. Ferreri that this is such an important discussion we're having today.

Just to clarify, Ms. Haugen, I heard you say that you are not familiar with Bill C-412, which ostensibly achieves the same result in terms of keeping kids safe online. We get to it in a vastly different way versus Bill C-63. It's unfortunate that you haven't had a chance to review that.

Can the same be said for you, Ms. Selby, that you are not familiar with Bill C-412?

Alistair MacGregor NDP Cowichan—Malahat—Langford, BC

Thank you very much.

Ms. Haugen, I'd like to turn to you for my next question. We're in a kind of legislative deadlock right now in the House of Commons. There's pretty much nothing getting done in our main chamber. It's been like that since the end of September. In fact, we don't even have Bill C-63 properly before this committee. This is a prestudy. It hasn't even passed second reading.

The fact of the matter is that this Parliament is rapidly running out of runway. Bill C-63 is still a long way away from the Governor General's desk. You have just talked about how rapidly this technology is evolving. It may be that we don't actually have a proper legislative approach to this problem for another two or three years.

What are some of the things a future Parliament has to take note of? We have this draft of Bill C-63, but what are some of the other things we may need to think of in a future piece of draft legislation?

Rhéal Fortin Bloc Rivière-du-Nord, QC

I'll quickly read the definition of “intimate content” proposed in Bill C‑63:

(b) a visual recording…that falsely presents in a reasonably convincing manner a person as being nude or exposing their sexual organs or anal region or engaged in explicit sexual activity, including a deepfake that presents a person in that manner, if it is reasonable to suspect that the person does not consent to the recording being communicated.

That seems like a rather long definition that seeks to cover a number of areas. Maybe I wouldn't have done any better. So it's not really a criticism.

Do you think that's a good definition, or should it be amended differently?

Rhéal Fortin Bloc Rivière-du-Nord, QC

Thank you, Madam Chair.

My first question is for Ms. Panas.

I understand that you've looked at Bill C‑63, which provides for the creation of three bodies, including an ombudsperson's office and a commission.

How do you assess the effectiveness of the complaint process with those organizations?

Anju Dhillon Liberal Dorval—Lachine—LaSalle, QC

I don't know if you had the chance to hear last week's testimony when Jane Doe came. It was a very painful testimony. It was painful for all of us to hear what kind of evil can exist in this world.

These parents came. There was one parent whose child was part of the armed forces and committed suicide. They're begging. We're talking about how parents should not be held responsible, completely responsible, because there's also a duty on governments and the platforms. We know that Bill C-63 applies to all online platforms. They're begging for us to do something as quickly as possible to mitigate the damages that are already done and that could come in the future.

We keep hearing things about regulatory bodies and delays. Do you not think that, at this point, it's better to pass something rather than nothing? Nothing is perfect, but at least something can give you support. We can give you support.

Michelle Ferreri Conservative Peterborough—Kawartha, ON

I would love to direct it to you or send it to you. I think it's addressing what you're saying. It gets to the heart of the issue quickly and more efficiently than Bill C-63.

I have another question for you. How does Bill C-63 ensure that platforms understand their obligations without explicit definitions?

Michelle Ferreri Conservative Peterborough—Kawartha, ON

I agree 100%. I think that's where we're going with this. Bill C-412 directly deals with this immediately, as opposed to Bill C-63, which has combined too many issues that will not hold these perpetrators, whom I will call “perverts”, to account, as well as social media platforms that, to your point exactly, need to have accountability.

Ms. Haugen, I was very interested in your testimony. It was profound. You hit a lot of nails on the head in terms of the impact social media is having on our children and exposing them too young to this. Without mandatory parental controls or algorithmic accountability in Bill C-63, how do we ensure that platforms are actually protecting children?

Michelle Ferreri Conservative Peterborough—Kawartha, ON

Certainly. I think we would agree on that, for sure.

I guess what I'm trying to say is.... There's a bill that the Conservatives have. It's called Bill C-412. It was put forward by my Conservative colleague. It deals with exactly what you're saying immediately, as opposed to Bill C-63, where the Liberals have combined two separate issues that are not targeting the predators online and the sexual exploitation.

To Ms. Haugen's point, the brains of these young children are forever changed. There's not a parent out there who isn't concerned about this.

Michelle Ferreri Conservative Peterborough—Kawartha, ON

Thank you so much, Madam Chair.

Thank you to our witnesses.

We're talking here about one of the most serious bills, I think, that have come before Parliament, certainly in my time and many others' time. That is Bill C-63.

I want to start with you, Ms. Selby. This is on record from the Canadian Constitution Foundation:

“Bill C-63 combined things that have no reason to go together,” Van Geyn said. “The issue of the online sexual exploitation of children through pornography is urgent and serious and should not be lumped in together with the government’s controversial plans to criminalize all kinds of speech and allow for civil remedies through the Canadian Human Rights Commission for speech,” she added.

My question for you is this: Shouldn't we have a stand-alone bill or legislation that protects children from online perverts? Shouldn't that be its own legislation?

Dr. Jocelyn Monsma Selby Chair, Clinical Therapist and Forensic Evaluator, Connecting to Protect

Honourable Chair Diab and all members of the Standing Committee on Justice and Human Rights, thank you for the opportunity to be here today.

My first point is that, in Canada, our current legal framework addresses child sexual abuse and exploitation via the Criminal Code and the law for protection of children from sexual exploitation. However, we should not be relying on a broad duty of care by any Internet platform. There should be law requiring the identification and immediate action to report and take down illegal sexually explicit images. We need regulation that is fit for purpose and safety by design.

My second point is this. Bill C-63 reads, “reduce harms caused to persons in Canada as a result of harmful content online and ensure that the operators of social media services...respect...their duties under that Act.” This is a glitch. All Internet platforms need accountability, not just social media sites. It takes just three clicks to find child sexual abuse imagery or child sexual exploitation material on the regular Net, and this includes images generated by artificial intelligence found through accessing many, many online platforms, including the dark web. These images are disguised within websites and embedded in emojis and hidden links, requiring the viewer to follow a digital pathway that can disappear as quickly as the next link is clicked on.

In 2022, the IWF found a 360% increase in self-generated child sexual abuse reports of seven-year-olds to 10-year-olds, more prevalent than non-self-generated content. This trend has continued into 2023, when the IWF hashed 2,401 self-generated sexually explicit images and videos of three-year-olds to six-year-olds. Of those images, 91% were girls showing themselves in sexual poses, displaying their genitals to the camera. It's normal for children to have curiosity, explore their bodies or experiment sexually, but that is not what the IWF found. What is shocking is the unsupervised access of children using digital devices.

My third point is with regard to guidelines respecting the protection of children in relation to regulating services and age of consent to data processing and in using social media. There is a duty to make certain content inaccessible. Caution should be used in passing regulation based on precedents set out in other countries. We need to look in turn at all the international laws, treaties and conventions. A single guiding principle is in article 5 of the UNCRC, concerning the importance of having regard for an individual child's “evolving capacities” at any moment in time in their interactions with the online world.

My fourth point is the establishment of a digital safety office of Canada, a digital safety commission and a digital safety ombudsperson. Could Canada benefit by establishing an online safety office and a children's commissioner or ombudsperson? The answer is yes, and several countries have been blazing a trail for us. These countries are part of a global online safety regulators network that aims to create a coordinated approach to online safety issues. Canada, sadly, is not at the table.

Last week, I was invited to attend a global summit in Abu Dhabi, sponsored by WeProtect and the UAE government. I was the only child protection representative from Canada, and I'm a self-funded third party voice.

I have a few final thoughts.

It took 50 years from the development of the Gutenberg Press to develop 20 million books. It took Ford 10 years to develop 10 million Model Ts. It took Playboy approximately two years to sell over a million copies each month. It took the global Internet in 1995 two years to develop 20 million users. It took Facebook 10 months to reach one million users. Today, Meta's ecosystem—including Instagram, WhatsApp and Messenger—has approximately 2.93 billion daily active users.

We need to close the gap between the rapid development and access of the Internet and needed regulation. We cannot have a continued partisan approach, lacking civility, to develop the important regulations needed to protect children and vulnerable individuals.

Marni Panas Canadian Certified Inclusion Professional, As an Individual

I am Marni Panas. I use the pronouns “she” and “her”. I am a Canadian certified inclusion professional. I led the development of diversity and inclusion activities at Alberta Health Services, Canada's largest health care services provider. I am the director of DEI for one of Canada's most respected corporations, and I am the board chair for the Canadian Centre for Diversity and Inclusion.

Today, I am speaking on behalf of myself and my own experiences. I'm here to vehemently defend every Canadian's right to freedom of expression, the foundation of our democracy. However, I and millions like me do not have freedom of expression, because it is safer to be racist, homophobic, sexist and transphobic online than it is to be Black, gay, a woman or transgender online. Online hate is real hate. It descends into our streets. It endangers Canadians in real life.

In September 2021, I took the stage at a university in my hometown of Camrose, Alberta, to deliver a lecture on LGBTQ2S+ inclusion, a lecture I've delivered to thousands of students, medical professionals, and leaders around the world. While I was on stage, unbeknownst to me, a student, like many other youth who have been radicalized by online hate, was livestreaming my presentation on Facebook and several far-right online platforms. By the time I got off stage, thousands of people were commenting on my appearance, my identity and my family. The worst of the comments included threats to watch my back. My next lecture was cancelled. Police escorted me off campus for my own safety.

In March 2023, I was invited to participate on a panel celebrating International Women's Day to raise awareness for an organization in Calgary that works to protect women and children from domestic violence. Because of the many online threats of violence directed towards me, the Calgary Police Service and my employer's protective services unit had to escort me in and out of the Calgary Public Library, where the event was being held.

Last February, emboldened by the introduction of anti-trans legislation in Alberta, people harassed and threatened me and others online at levels I had never experienced before, even trying to intimidate me by contacting my employer. I'm grateful for the support of my current employer, who once again had to step in to have my back.

It is rarely the people spewing hate online who are the greatest threat, but words are never just words. It is the people who read, listen and believe in hate speech who become emboldened to act on what's been said. These words and the actions they fuel have followed me to my community, my workplace and even my doorstep. The impact of this relentless harassment for simply living my life publicly, proudly and joyfully as me has profoundly impacted my mental health, my well-being and my sense of safety where I live and work, leaving me withdrawn from the communities I cherish and leaving me wondering every time someone recognizes me on the street whether this is the moment where online hate turns to real physical violence. I feel far less safe in my community and in my country than I ever have before.

No, I don't have freedom of expression. There is a cost to being visible. There is a cost to speaking out. There is a cost to speaking before you today, knowing that this is being broadcast online. Most often, the cost just isn't worth it. The people all too often silenced are those who desperately need these online platforms the most to find community and support. This is made worse when the same platforms allow disinformation to be spread that aims to dehumanize and villainize LGBTQ2S+ people, contributing to the significant rise in anti-LGBTQ2S+ violence as highlighted by CSIS this past year.

The status quo is no longer acceptable. Platforms need to be held accountable for the hateful content they host and the disinformation they allow to spread. The federal government needs to act. We can't wait. I've been called brave, courageous and even resilient, but I'd rather simply just be safe. People have a right to freely exist without fear because of who they are and whom they love. This is needed in online spaces, too. In fact, our communities and our democracy depend on it.

Uphold freedom of expression. Pass Bill C-63, and protect us all from online harms.

Thank you.

The Chair Liberal Lena Metlege Diab

Good morning, everyone.

I call this meeting to order.

Welcome to meeting 127 of the House of Commons Standing Committee on Justice and Human Rights.

Pursuant to Standing Order 108(2) and the motion adopted by the committee on December 2, 2024, the committee is meeting in public to continue its pre‑study of the subject matter of Bill C‑63.

Before I welcome the witnesses for the first panel, I have a few introductory remarks to make.

For those appearing in person, please use your microphone and your headset. Move it away from the microphone so that we do not give a hard time to our interpreters. It's also for their safety and health. For those in the room and those appearing virtually, please wait to be recognized by the chair.

I'm speaking French right now. The English participants should be hearing the English interpretation.

If you did not understand what I just said in French, you do not have your device turned the proper way. I would ask that you ensure that you have your device turned the right way so that you understand the language of your choice and we're not interrupted midstream.

Please mute your electronic devices.

If you are appearing virtually, unmute yourself only when you are recognized by the chair.

I will now introduce our three panellists this morning.

First, we have Frances Haugen.

She is an advocate for social platform transparency and accountability. She is appearing by video conference.

Marni Panas is a Canadian certified inclusion professional.

From Connecting to Protect, we have Jocelyn Monsma Selby, chair, clinical therapist and forensic evaluator, by video conference.

I will give each of you up to five minutes to say your introductory remarks. I understand that it's a little bit difficult, particularly if you're on your screen. When you have 30 seconds left, I will let you know. When the time is up, I will interrupt you as softly and delicately as possible, whether during your five-minute remarks or during your answers to members' questions.

I want to let you know that we have Senator Kristopher Wells with us today. He will be here for the first hour. Welcome, Senator.

I will now ask Ms. Frances Haugen to please start.

You have up to five minutes.

Jamil Jivani Conservative Durham, ON

Thank you, Mr. Chair.

Mr. Marcoux, finally I get to ask you my question. I apologize for the delay.

Let's go back to the concerns that many Canadians have about the creation of a massive bureaucracy through Bill C-63. I'd be curious if other Canadians who share your concerns, your objectives concerning the protection of children.... Do you appreciate why they are not favourable toward Bill C-63's expansion of the bureaucracy? Do you see why there are concerns about that posing a threat to freedom of expression in our country? Would you be able to find common ground with Canadians who share your concerns related to the protection of children online but do not appreciate the way that Bill C-63 proposes to go about it?