An Act to enact the Online Harms Act, to amend the Criminal Code, the Canadian Human Rights Act and An Act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service and to make consequential and related amendments to other Acts

Sponsor

Arif Virani, Liberal

Status

Second reading (House), as of Sept. 23, 2024

Summary

This is from the published bill. The Library of Parliament has also written a full legislative summary of the bill.

Part 1 of this enactment enacts the Online Harms Act, whose purpose is to, among other things, promote the online safety of persons in Canada, reduce harms caused to persons in Canada as a result of harmful content online and ensure that the operators of social media services in respect of which that Act applies are transparent and accountable with respect to their duties under that Act.
That Act, among other things,
(a) establishes the Digital Safety Commission of Canada, whose mandate is to administer and enforce that Act, ensure that operators of social media services in respect of which that Act applies are transparent and accountable with respect to their duties under that Act and contribute to the development of standards with respect to online safety;
(b) creates the position of Digital Safety Ombudsperson of Canada, whose mandate is to provide support to users of social media services in respect of which that Act applies and advocate for the public interest in relation to online safety;
(c) establishes the Digital Safety Office of Canada, whose mandate is to support the Digital Safety Commission of Canada and the Digital Safety Ombudsperson of Canada in the fulfillment of their mandates;
(d) imposes on the operators of social media services in respect of which that Act applies
(i) a duty to act responsibly in respect of the services that they operate, including by implementing measures that are adequate to mitigate the risk that users will be exposed to harmful content on the services and submitting digital safety plans to the Digital Safety Commission of Canada,
(ii) a duty to protect children in respect of the services that they operate by integrating into the services design features that are provided for by regulations,
(iii) a duty to make content that sexually victimizes a child or revictimizes a survivor and intimate content communicated without consent inaccessible to persons in Canada in certain circumstances, and
(iv) a duty to keep all records that are necessary to determine whether they are complying with their duties under that Act;
(e) authorizes the Digital Safety Commission of Canada to accredit certain persons that conduct research or engage in education, advocacy or awareness activities that are related to that Act for the purposes of enabling those persons to have access to inventories of electronic data and to electronic data of the operators of social media services in respect of which that Act applies;
(f) provides that persons in Canada may make a complaint to the Digital Safety Commission of Canada that content on a social media service in respect of which that Act applies is content that sexually victimizes a child or revictimizes a survivor or intimate content communicated without consent and authorizes the Commission to make orders requiring the operators of those services to make that content inaccessible to persons in Canada;
(g) authorizes the Governor in Council to make regulations respecting the payment of charges by the operators of social media services in respect of which that Act applies, for the purpose of recovering costs incurred in relation to that Act.
Part 1 also makes consequential amendments to other Acts.
Part 2 amends the Criminal Code to, among other things,
(a) create a hate crime offence of committing an offence under that Act or any other Act of Parliament that is motivated by hatred based on certain factors;
(b) create a recognizance to keep the peace relating to hate propaganda and hate crime offences;
(c) define “hatred” for the purposes of the new offence and the hate propaganda offences; and
(d) increase the maximum sentences for the hate propaganda offences.
It also makes related amendments to other Acts.
Part 3 amends the Canadian Human Rights Act to provide that it is a discriminatory practice to communicate or cause to be communicated hate speech by means of the Internet or any other means of telecommunication in a context in which the hate speech is likely to foment detestation or vilification of an individual or group of individuals on the basis of a prohibited ground of discrimination. It authorizes the Canadian Human Rights Commission to deal with complaints alleging that discriminatory practice and authorizes the Canadian Human Rights Tribunal to inquire into such complaints and order remedies.
Part 4 amends An Act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service to, among other things,
(a) clarify the types of Internet services covered by that Act;
(b) simplify the mandatory notification process set out in section 3 by providing that all notifications be sent to a law enforcement body designated in the regulations;
(c) require that transmission data be provided with the mandatory notice in cases where the content is manifestly child pornography;
(d) extend the period of preservation of data related to an offence;
(e) extend the limitation period for the prosecution of an offence under that Act; and
(f) add certain regulation-making powers.
Part 5 contains a coordinating amendment.

Elsewhere

All sorts of information on this bill is available at LEGISinfo, an excellent resource from the Library of Parliament. You can also read the full text of the bill.

Justice / Oral Questions

December 17th, 2024 / 3:15 p.m.

Francesco Sorbara Liberal Vaughan—Woodbridge, ON

Mr. Speaker, children in Canada need protection from online harm. The abuse that occurs online is endangering our kids, and it is time we acted to prevent more families from being harmed. Our government has risen to this challenge, putting forward a plan to help parents and children. Bill C-63, the online harms act, would create safety measures that would save lives. The Conservatives are now the only roadblock to making the bill a reality in Canada.

The safety of our children should not be political. Can the Minister of Justice please discuss the importance of this critical legislation and why we need it passed now?

Justice / Oral Questions

December 16th, 2024 / 3:10 p.m.

Joyce Murray Liberal Vancouver Quadra, BC

Mr. Speaker, children in Canada are just not safe online. Our government wants to join the many countries that have now adopted online safety regulations, yet the Conservatives are preventing our online harms act from moving forward. Shockingly, they are blocking our efforts to remove child sex abuse material from the Internet. How disgusting.

Can the justice minister please describe the importance of Bill C-63 to parents and children, and explain why Canadians so urgently need this law now?

Justice / Oral Questions

December 16th, 2024 / 3:05 p.m.

Lena Metlege Diab Liberal Halifax West, NS

Mr. Speaker, our government takes the safety of children seriously. That is why we put forward a comprehensive plan to bring Canada into the 21st century and change our online world, making it safer for kids and better for all. The Conservatives are blocking the plan, and they are standing in the way of a better future for our kids online.

Parents want the online harms act. Experts want the online harms act. Can the Minister of Justice explain why Bill C-63 must be passed to keep our kids safe?

Pam Damoff Liberal Oakville North—Burlington, ON

Thanks so much, Chair.

I'm so happy to get another opportunity to ask our incredible witnesses questions.

For the record, the legislation the Conservatives mentioned is a private member's bill that has no hope of ever seeing the light of day. It also puts the onus on a victim to come forward and get digital or social media companies to respond.

Tyler, I know from my own experience that reaching out to them results in nothing. Our Sergeant-at-Arms says you can't even reach out to Twitter anymore.

If my colleagues haven't read it yet, I want to focus on a report from the Office of the Federal Ombudsperson for Victims of Crime that came out last week. It's called “Strengthening Access to Justice for Victims of Hate Crime in Canada”. It's an outstanding report. I asked him to submit it to the committee for evidence.

I want to read you some of the stats.

Tyler, you mentioned one particular death threat. I know that's not the only one you've encountered.

It says:

72% of police officers said their police service did not have a dedicated hate crime unit. Of those that did, 44% had only one officer

44% of victim services had fewer than 5 paid staff

73% of victim services cited limited resources as a significant barrier to providing adequate support

77% of police officers and 82% of victim service workers believed the proposed standalone hate crime offence in Bill C-63 would be helpful or very helpful.

It also says:

Throughout the years, discriminatory laws have marginalized 2SLGBTQIA+ people, and recent data suggest they are more likely to suffer physical harm from hate crimes than other targeted groups.

He's made 13 outstanding recommendations that I hope colleagues will take the time to read.

Again, I'll start with the Tylers in the room. Then, if we have time, I'll go online.

Have you gone to the police to report hate crimes, and what has been your experience if you have?

Tyler Boyce, I'll start with you.

Anna Roberts Conservative King—Vaughan, ON

Thank you, Madam Chair.

Thank you to the witnesses.

I'm going to start by asking if anyone has heard of Bill C-412.

No.

Bill C-412 is a better alternative to Bill C-63, the online harms act. It will keep Canadians safe online without infringing on their civil liberties. The online harms act creates a costly censorship bureaucracy, which the PBO has estimated at $200 million—arguably the most expensive in the world. Bill C-412 gives Canadians more protection online through existing regulations and the justice system.

The reason I ask is that I understand free speech. I get it. However, what I'm getting from all the witnesses is that we're not holding people accountable. I feel it's important that if you commit a crime, you should be held accountable. If we don't stop the perpetrators from hurting people.... It was said earlier by Ms. Baker, I believe, that 91% of 2SLGBTQI+...do not report.

How can we make this a better world if we don't hold these individuals to account?

I'll start with you, Mr. Boyce.

John Horgan / Oral Questions

December 12th, 2024 / 3:20 p.m.

Michelle Rempel Conservative Calgary Nose Hill, AB

Mr. Speaker, if you seek it, you will find unanimous consent for the following motion, given that Bill C-63, the so-called—

John Horgan / Oral Questions

December 12th, 2024 / 3:20 p.m.

Arif Virani Liberal Parkdale—High Park, ON

Mr. Speaker, I rise on a point of order.

Regarding Bill C-63, if you seek it, I believe you will find unanimous consent for—

Alistair MacGregor NDP Cowichan—Malahat—Langford, BC

Thank you.

In the course of the conversation around Bill C-63, my Conservative colleagues have mentioned one of their own bills, Bill C-412. I want to mention another private member's bill, brought in by my colleague MP Peter Julian, Bill C-292, the online algorithm transparency act.

I'm just wondering if you could talk a little bit about the features in that legislation and maybe how Bill C-63 might not be hitting the mark of where we need to be in this space.

Joanna Baron Executive Director, Canadian Constitution Foundation

Good afternoon. Thank you for the opportunity to present before this committee.

I represent the Canadian Constitution Foundation, a national legal charity that defends fundamental freedoms. We have participated in Whatcott, Fleming, Ward and other seminal Supreme Court of Canada decisions on freedom of expression. We view this bill, Bill C-63, as posing a grave threat to all Canadians' right to free speech and a flourishing democracy.

We welcome the minister's announcement that he intends to split the bill with regard to parts 1 and 4, but we remain concerned about the constitutionality of aspects of part 1, as well as parts 2 and 3 in their entirety.

First I'll address portions of the bill that expand sanctions for offences related to hate speech, including “harmful content” and “content that foments hatred”. I am referring to both the mandate of the new digital safety commissioner, created in part 1 of the bill, and the expanded penalties for hate crimes in part 2.

Part 1 of the bill imposes obligations on an operator to “implement measures that are adequate to mitigate the risk that users...will be exposed to harmful content”. This includes “content that foments hatred”. This office will cost around $200 million over five years and will be able to impose fines on platforms running into the millions of dollars.

Part 2 of the bill, meanwhile, increases penalties for existing hate crimes, including promoting genocide, which would now be punishable by up to life imprisonment. It also creates a new stand-alone offence, in proposed section 320.1001, for any federal offence motivated by hatred, also punishable by up to life imprisonment.

As the previous witness mentioned, and I agree with many of his comments, hate speech is an inherently subjective concept. These expanded penalties and regulatory obligations pose a risk of gross disproportionality and excessive chill of protected expression. In Whatcott, the Supreme Court of Canada said that hatred encompasses only the most “extreme manifestations [captured] by the words 'detestation' and 'vilification'”. Only that type of speech can be penalized without violating the charter.

Bill C-63 adopts this language in proposed subsection 319(7): “hatred means the emotion that involves detestation or vilification”. But “detestation” is really just a synonym for “hate”, and vilification is a highly subjective concept. We are in a present moment of passionate and often fraught disagreement in our society, where a lot of claims are made that are understood differently depending on context.

For example, calling someone a Zionist currently may land as vilification or, more dubiously, promotion of genocide, or as praise, depending on the speaker and the audience. Just a few days ago, a former CBC producer, Shenaz Kermalli, accused MP Kevin Vuong of hateful expression for posing with an individual wearing an “F Hamas” sweatshirt on social media. That's the problem with criminalizing language. It's subjective. It shifts depending on context.

These concerns become pressing with the expanded sanctions proposed in part 2. Even if our judges can be relied upon to respect the principles of proportionality when sentencing an offender under section 320, for example, the range of available sentences in the law will now include life imprisonment. It's not a frivolous possibility that prosecutors can refer judges to a range of sentencing up to life imprisonment for a crime such as vandalism if it is alleged that the crime was motivated by hate.

The reality is that it's virtually impossible to identify in advance, predictably, a line that separates the merely “awful but lawful” from criminal hate speech. This lack of clarity poses an urgent threat to online discourse, which is our current town square and should brook this type of passionate and adversarial disagreement. When these types of sanctions are in play, everyone has an incentive to err on the side of caution. Platforms will flag and remove content that is actually protected expression, and individuals will self-censor.

Finally, I will briefly address part 3 of the bill. It brings back a civil remedy for online hate speech, which allows members of the public to bring complaints before the Canadian Human Rights Commission. This would be disastrous. You should not go forward with this proposal. Even if most alleged instances are dismissed for not meeting the threshold of hate speech, the penalties for individuals found liable—up to $50,000 paid to the government plus $20,000 to the victim—are severe enough that we can infer that the new regime will lead to widespread soft-pedalling of expression for fear of crossing the line. It will interfere severely with press freedom to publish controversial opinions, which are necessary for a flourishing civil society. And the process is punishment, even if the case does not proceed. We will see more people punished for protected expression.

Thank you. I welcome your questions.

Guillaume Rousseau Full Professor and Director, Graduate Applied State Law and Policy Programs, Université de Sherbrooke, As an Individual

Good morning, everyone. Thank you for inviting me to speak to Bill C‑63.

I apologize for my appearance. I had surgery yesterday, which is why I'm wearing a bandage. Although I have a few scars on my head, my mind is working fine. I should be able to make this presentation and answer your questions.

As a constitutional lawyer, I mainly want to draw your attention to the issue of freedom of expression and, since I'm from Quebec, I also want to draw your attention to the fact that Bill C‑63 is very similar to Bill 59, which was studied in Quebec in 2015 and 2016.

For those who, like me, fought against Bill 59, it's a bit like Groundhog Day, since Bill C‑63 contains extremely similar elements, including the prohibition on hate speech. It reminds us that Quebec and federal jurisdictions are not always sufficiently exclusive and that there is a lot of overlap. I will stop my digression on Canadian federalism here, but I would like to point out in passing that I have just tabled a report with the Quebec advisory committee on constitutional issues within the Canadian federation. If you're interested in this issue, you should know that a report has just been submitted to the Government of Quebec.

Bill 59, which was studied in 2015 and 2016, banned hate speech, and it was considered very problematic in terms of freedom of expression. In the end, the government of the day decided to set aside part of the bill and not adopt the hate speech component of the bill in order to keep the other part of the bill, which was much more consensual and dealt in particular with the regulation of underage marriages. With respect to Bill C‑63, I hope we are preparing for a similar outcome.

I think the bill contains a lot of interesting things about sexual victimization and “revenge porn”. I believe the equivalent term in French is “pornodivulgation”. I think this whole area of protecting minors and protecting them from sexual victimization is very important. However, everything to do with hate seems much more problematic to me.

Sometimes, people talk about splitting the bill, saying that part 1 isn't a problem, and that parts 2 and 3 are more problematic. For my part, I draw your attention to the fact that, even in part 1, the definition of harmful content includes content that promotes hatred. Even in part 1, there's this mix between the issue of protecting minors from certain elements of pornography and the issue of hate. In my opinion, if we want to rework the bill properly, we must not only not adopt parts 2 and 3, but also eliminate hate from part 1.

The problem with everything to do with hate in the bill is that the definition is very vague and very broad. Hate is defined as detestation and vilification, but the definitions of detestation and vilification often include a reference to hate. It's all a bit circular. It's very vague and, for that reason, it's very difficult for individuals to know what their obligations are, to know what they can and cannot say.

I understand that this definition is inspired by the Supreme Court's Whatcott case, but there are two problems in this regard.

First, this definition was given in a human rights case, but here we want to use it as a model in criminal law. In terms of evidence, in particular, these two areas are very distinct. Second, I understand why we are taking our cues from the Supreme Court when it comes to definitions, because that means that the provision of the act is less likely to be struck down. I understand it on a technical level, but on the substance, a definition that isn't clear and isn't good isn't clear and isn't good, even if it comes from the Supreme Court.

I want to repeat this famous sentence: The Supreme Court is not final because it is infallible; it is infallible because it is final.

As legislators, you really have to ask yourselves whether the definition is clear, rather than just whether it is the Supreme Court's definition. Ultimately, if you absolutely want a definition inspired by the Supreme Court, I would recommend the definition in the Keegstra decision, which is a criminal law decision. It's a little clearer and a little less problematic than the Whatcott-inspired definition.

That said, if you go along with what I'm proposing and remove the hate component from the bill, it will raise the following question: If we create a bill that is more targeted on sexual victimization and the protection of minors, will we need a commission, an ombudsperson, an office and all the bureaucracy that is planned when the purpose of the act is more limited? We will therefore have to rethink the bill so that it is less bureaucratic.

Finally, I draw your attention to the fact that the bill should include the abolition of exemptions that allow hate speech in the name of religion. We were talking earlier about Bill C‑63 and Bill C‑412, but there's also Bill C‑367, which I invite you to study.

Thank you.

Professor Andrew Clement Professor Emeritus, Faculty of Information, University of Toronto, As an Individual

Thank you, Madam Chair and committee members, for the opportunity to contribute to your important prestudy of Bill C-63, the online harms act.

I'm Andrew Clement, a professor emeritus in the faculty of information at the University of Toronto, speaking on my own behalf. I'm a computer scientist by training and have long studied the social and policy implications of computerization. I'm also a grandfather of two young girls, so I bring both a professional and a personal interest to the complex issues you're having to grapple with.

I will confine my remarks to redressing a glaring absence in part 1 of the bill—a bill I generally support—which is the need for algorithmic transparency. Several witnesses have made a point about this. The work of Frances Haugen is particularly important in this respect.

Social media operators, broadly defined, provide their users with access to large quantities of various kinds of content, but they're not simply passive purveyors of information. They actively curate this content, making some content inaccessible while amplifying other content, based primarily on calculations of what users are most likely to respond to by clicking, liking, sharing, commenting on, etc.

An overriding priority for operators is to keep people on their site and exposed to revenue-producing advertising. In the blink of an eye, they select the specific content to display to an individual following precise instructions, based on a combination of the individual's characteristics—for example, demographics, behaviour and social network—and features of the content, such as keywords, income potential and assigned labels. This is referred to as an “algorithmic content curation practice”, or “algorithmic practice” for short.
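
To make the practice described above concrete, here is a minimal, purely illustrative sketch of an engagement-driven curation step. It is not drawn from the bill, the testimony or any platform's actual system; every feature name, weight and label in it is hypothetical.

```python
# A toy model of "algorithmic content curation": score each item for a
# given user, then amplify the top-scoring items. Hypothetical sketch only;
# real ranking systems are learned models with vastly more features.

from dataclasses import dataclass

@dataclass
class User:
    interests: set   # topics the user engages with
    follows: set     # accounts in the user's social network

@dataclass
class Item:
    author: str
    keywords: set
    ad_revenue_potential: float  # income potential assigned by the operator
    labels: set                  # assigned labels, e.g. topic or moderation tags

def engagement_score(user, item):
    """Predict how likely the user is to click, like, share or comment.

    A hand-weighted linear stand-in for the learned models platforms use;
    the weights are invented for illustration.
    """
    score = 2.0 * len(user.interests & item.keywords)   # topical match
    if item.author in user.follows:                     # social-network signal
        score += 3.0
    score += 1.5 * item.ad_revenue_potential            # revenue priority
    if "sensational" in item.labels:                    # engagement bait
        score += 2.5
    return score

def curate_feed(user, inventory, k=10):
    """Return the k items shown most prominently; the rest are,
    in effect, made far less accessible."""
    ranked = sorted(inventory, key=lambda item: engagement_score(user, item),
                    reverse=True)
    return ranked[:k]

# Example: the same inventory yields different feeds for different users.
alice = User(interests={"hockey", "politics"}, follows={"teamcanada"})
posts = [
    Item("teamcanada", {"hockey"}, 0.2, set()),
    Item("newsco", {"politics"}, 0.8, {"sensational"}),
    Item("bakery", {"recipes"}, 0.1, set()),
]
for item in curate_feed(alice, posts, k=2):
    print(item.author, engagement_score(alice, item))
```

The transparency amendments discussed later in this testimony target exactly the elements this sketch makes visible: which user and content features are used, and how heavily each is weighted.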

These algorithmic practices determine what appears most prominently in the tiny display space of personal devices and thereby guide users through the vast array of content possibilities. In conjunction with carefully designed interactive features, such curation practices have become so compelling, even addictive, that they hold the attention of U.S. teens, among others, for nearly five hours a day. Disturbingly, their time spent on social media is strongly correlated with adverse mental health outcomes and with a rapid rise in suicide rates starting around 2012. We've heard vivid testimony about this from your other witnesses. Leading operators are aware of the adverse effects of their practices but resist reform, because it would undermine their business models.

While we need multiple approaches to promote safety online, a much better understanding of algorithmic curation practices is surely one of the most important.

Canadians have begun calling for operators to be more transparent about their curation practices. The Citizens' Assembly on Democratic Expression recommended that digital service providers “be required to disclose...the...inner workings of their algorithms”. Respondents to the online consultation regarding this proposed online harms legislation noted “the importance of...algorithmic transparency when setting out a regulatory regime.” Your sister standing committee, the Standing Committee on Public Safety and National Security, has made a similar recommendation: “That the Government of Canada work with platforms to encourage algorithmic transparency...for better content moderation decisions.”

Internationally, the U.S., the EU and others have developed or are developing regulatory regimes that address online platforms' algorithmic practices. Most large social media services or online operators in Canada also operate in the EU, where they are already subject to algorithmic transparency requirements found in several laws, including the Digital Services Act. It requires that “online platforms...consistently ensure that recipients of their service are appropriately informed about how recommender systems impact the way information is displayed, and can influence how information is presented to them.”

While Bill C-63 requires operators to provide detailed information about the harmful content accessible on the service, it is surprisingly silent on the algorithmic practices that are vital for determining the accessibility, the reach and the effects of such content. This lapse is easily remedied through amendments—first, by adding a definition of “algorithmic content curation practice”, and second, by adding requirements for the inclusion of algorithmic content curation practices in the digital safety plans in clause 62 and in the electronic data accessible to accredited persons in clauses 73 and 74. I will offer specific amendment wording in a written submission.

Thank you for your attention, and I welcome your questions.

Rhéal Fortin Bloc Rivière-du-Nord, QC

Thank you, Madam Chair.

I want to go back to Ms. Haugen, this time on the issue of private messaging. It was discussed that it should be included in Bill C‑63, and it was proposed that certain obligations be imposed on social media companies, including:

…reporting unusual friend requests from strangers in remote locations…removing invitations to expand one's network through friend recommendations based on location and interests…providing easy-to-use complaint mechanisms…providing user accountability tools, such as account blocking.

All that seems reasonable to me, but the fact remains that we're talking about breaking into individuals' private messages. I have the same question about freedom of expression and privacy: Aren't we going too far? Shouldn't private messaging be left private, or is there really a need for provisions to enable the operators of these services to better control what goes on there and the messages their users receive and send?

James Maloney Liberal Etobicoke—Lakeshore, ON

Thank you.

We've heard from witnesses already who have lived through horrific experiences with their children and families, who have tried to use the courts and the criminal process to address this and who have tried to deal directly with the social media platforms. It simply doesn't work. That is why the digital safety commission and the ombudsperson are so critical: so that these situations can be responded to quickly.

Ms. Selby, I take it you support part 1 of Bill C-63.

James Maloney Liberal Etobicoke—Lakeshore, ON

Thank you, Madam Chair.

I want to thank all the witnesses for being here today. There's a lot to cover.

Ms. Panas, I'll start with something you said. You talked about feeling comfortable and feeling safe online. Last Christmas Day, I posted a video. I was standing in front of a Christmas tree at a community centre wishing everybody a merry Christmas. The first five or six or 10 comments were, “I hope you lose the next election”, “Rot in hell”—blah blah blah—and those were the nice ones. But I sloughed it off. I have big shoulders. It doesn't matter. That's not what this bill is about. This bill is about protecting people who don't have that ability and who are the most vulnerable.

I want to pick up on what Mr. Brock was trying to do. I want to thank you for your answers about the difference between Bill C-63, which you support, and Bill C-412, which I consider to be.... Well, it doesn't matter what I think. We've had witnesses who have said it's far too narrow and doesn't accomplish the goals we're trying to achieve here. One witness said that she thought it confused tort law with criminal law, which I agree with.

I want to deal with this right off the bat. If something is posted online that's offensive and that involves some of the things we're talking about—I won't use the examples—Bill C-63 provides a method to have it taken down from the Internet right away. Contrast that with the so-called solution of Bill C-412, which would require somebody to go out and retain a lawyer, put together some sort of application or motion, go before a judge and try to convince him or her that this should be taken down.

First of all, you're dealing with people who are the most vulnerable, who don't know how to find a lawyer, who can't afford a lawyer, who have to find a lawyer who knows how to deal with this and appear before a judge who has no expertise in this. It's an insulting joke dressed up as policy. It's not effective. I'd like to get that off the table.

I'm assuming you agree with that, Ms. Panas. You've already highlighted the importance of having the ability to deal with this quickly.

Larry Brock Conservative Brantford—Brant, ON

I'm going to interrupt you there.

Bill C-63 does not provide an avenue for you to deal with online criminal harassment. It is a glaring oversight. Bill C-412 provides a ready and able mechanism that addresses some of the concerns you deal with.

I just wanted to highlight that to you and encourage you to review that.