An Act to enact the Online Harms Act, to amend the Criminal Code, the Canadian Human Rights Act and An Act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service and to make consequential and related amendments to other Acts

Sponsor

Arif Virani Liberal

Status

Second reading (House), as of Sept. 23, 2024

Summary

This is from the published bill. The Library of Parliament has also written a full legislative summary of the bill.

Part 1 of this enactment enacts the Online Harms Act, whose purpose is to, among other things, promote the online safety of persons in Canada, reduce harms caused to persons in Canada as a result of harmful content online and ensure that the operators of social media services in respect of which that Act applies are transparent and accountable with respect to their duties under that Act.
That Act, among other things,
(a) establishes the Digital Safety Commission of Canada, whose mandate is to administer and enforce that Act, ensure that operators of social media services in respect of which that Act applies are transparent and accountable with respect to their duties under that Act and contribute to the development of standards with respect to online safety;
(b) creates the position of Digital Safety Ombudsperson of Canada, whose mandate is to provide support to users of social media services in respect of which that Act applies and advocate for the public interest in relation to online safety;
(c) establishes the Digital Safety Office of Canada, whose mandate is to support the Digital Safety Commission of Canada and the Digital Safety Ombudsperson of Canada in the fulfillment of their mandates;
(d) imposes on the operators of social media services in respect of which that Act applies
(i) a duty to act responsibly in respect of the services that they operate, including by implementing measures that are adequate to mitigate the risk that users will be exposed to harmful content on the services and submitting digital safety plans to the Digital Safety Commission of Canada,
(ii) a duty to protect children in respect of the services that they operate by integrating into the services design features that are provided for by regulations,
(iii) a duty to make content that sexually victimizes a child or revictimizes a survivor and intimate content communicated without consent inaccessible to persons in Canada in certain circumstances, and
(iv) a duty to keep all records that are necessary to determine whether they are complying with their duties under that Act;
(e) authorizes the Digital Safety Commission of Canada to accredit certain persons that conduct research or engage in education, advocacy or awareness activities that are related to that Act for the purposes of enabling those persons to have access to inventories of electronic data and to electronic data of the operators of social media services in respect of which that Act applies;
(f) provides that persons in Canada may make a complaint to the Digital Safety Commission of Canada that content on a social media service in respect of which that Act applies is content that sexually victimizes a child or revictimizes a survivor or intimate content communicated without consent and authorizes the Commission to make orders requiring the operators of those services to make that content inaccessible to persons in Canada;
(g) authorizes the Governor in Council to make regulations respecting the payment of charges by the operators of social media services in respect of which that Act applies, for the purpose of recovering costs incurred in relation to that Act.
Part 1 also makes consequential amendments to other Acts.
Part 2 amends the Criminal Code to, among other things,
(a) create a hate crime offence of committing an offence under that Act or any other Act of Parliament that is motivated by hatred based on certain factors;
(b) create a recognizance to keep the peace relating to hate propaganda and hate crime offences;
(c) define “hatred” for the purposes of the new offence and the hate propaganda offences; and
(d) increase the maximum sentences for the hate propaganda offences.
It also makes related amendments to other Acts.
Part 3 amends the Canadian Human Rights Act to provide that it is a discriminatory practice to communicate or cause to be communicated hate speech by means of the Internet or any other means of telecommunication in a context in which the hate speech is likely to foment detestation or vilification of an individual or group of individuals on the basis of a prohibited ground of discrimination. It authorizes the Canadian Human Rights Commission to deal with complaints alleging that discriminatory practice and authorizes the Canadian Human Rights Tribunal to inquire into such complaints and order remedies.
Part 4 amends An Act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service to, among other things,
(a) clarify the types of Internet services covered by that Act;
(b) simplify the mandatory notification process set out in section 3 by providing that all notifications be sent to a law enforcement body designated in the regulations;
(c) require that transmission data be provided with the mandatory notice in cases where the content is manifestly child pornography;
(d) extend the period of preservation of data related to an offence;
(e) extend the limitation period for the prosecution of an offence under that Act; and
(f) add certain regulation-making powers.
Part 5 contains a coordinating amendment.

Élisabeth Brière Liberal Sherbrooke, QC

Since the beginning of the study, a number of parents, particularly mothers, have told us horrible stories about what their children had experienced. Some young people have even committed suicide.

Do you believe that Bill C‑63 will really allow us to achieve the goals as they are set out?

Matthew Hatfield

I think there's a lot to appreciate in Bill C-412. We do think that bill is worthy of study if this bill does not pass, but I think Bill C-63 would accomplish more for Canadians over a longer period of time than Bill C-412. Bill C-412 is a narrow bill, perhaps too narrow. When it comes to the harms that both bills address, I think having a regulator involved is really beneficial.

Now, if you look at privacy law, we don't just say, “Here are your privacy laws on paper, and here's a private right of action; go to it. Our privacy is defended.” We have found it extraordinarily valuable to have a Privacy Commissioner who can assist Canadians in asserting their privacy rights. Our hope is that the digital safety commission will function similarly.

Jamil Jivani Conservative Durham, ON

Thank you, Mr. Chair.

My first question is for Mr. Hatfield.

Thank you for your presentation.

I'm curious, given the very clear concerns you've expressed relating to parts 2 and 3 of Bill C-63, why you're not more concerned about some sections of part 1, particularly those related to the digital safety commission, the digital safety office and the digital safety ombudsperson, which would lay some of the bureaucratic groundwork that makes parts 2 and 3 possible.

Are you concerned about those sections of part 1? Would you care to give us some specific concerns you have related to part 1, which we're focused on today?

Matthew Hatfield Executive Director, OpenMedia

Good evening. I'm Matt Hatfield, the executive director of OpenMedia, a non-partisan, grassroots community of over 250,000 people in Canada working for an open, affordable and surveillance-free Internet.

I'm joining you from the unceded territory of the Stó:lō, Tsleil-Waututh, Squamish and Musqueam nations.

It's a pretty remarkable thing to be here today to talk about the online harms bill. When Canadians first saw what this bill might look like as a white paper back in 2021, we didn't much like what we saw. OpenMedia called it a blueprint for making Canada's Internet one of the most censored and surveilled in the democratic world, and we were far from alone in being concerned.

For once, our government listened. The rush to legislate stopped. National consultations were organized across the country on how to get regulation right with a wide range of stakeholders and experts on harms and speech. The resulting part 1 of Bill C-63 is an enormous, night-and-day improvement. Simple-minded punitive approaches that would have done more harm than good are gone, and nuances and distinctions made throughout show real sophistication about how the Internet works and how different harms should be managed. Packaging part 1—the online harms act itself—with changes to the Criminal Code and Human Rights Act proposed alongside it badly obscured that good work. That's why, alongside our peers, we called for these parts to be separated and why we warmly welcome the government's decision to separate those parts out.

I'll focus here on part 1 and part 4.

OpenMedia has said for years that Canadians do not have to sacrifice our fundamental freedoms to make very meaningful improvements to our online safety. The refocused Bill C-63 is the proof. Instead of trying to solve everything unpleasant on the Internet at once, Bill C-63 focuses on seven types of already-illegal content in Canada, and treats the worst and most easily identifiable content—child abuse material and adult material shared without consent—most severely. That's the right call. Instead of criminalizing platforms for the ugly actions of a small number of users, which would predictably make them wildly overcorrect to surveil and censor all of us, Bill C-63 asks them to write their own assessments of the risks posed by these seven types of content and document how they try to mitigate that risk. That's the right call again. It will put the vast engineering talent of platforms to work for the Canadian public, thinking creatively about ways to reduce these specific illegal harms. It will also make them explain what they are doing as they do it, so we can assess whether it makes sense and correct it if it does not.

However, I want to be very clear: it is not the time to pass Bill C-63 and call it quits. It's just the opposite. Because the parts that are now being separated raised so many concerns, not nearly enough attention has been paid to refining part 1. I know you'll be hearing from a range of legal and policy experts about concerns they have with some of the part 1 wording, along with recommended fixes. I hope you will listen very carefully to all of them and adopt many of the fixes they suggest.

This is not the time to be a rubber stamp. The new digital safety commission is granted extraordinary power to review, guide and make binding decisions on how platforms moderate the public expression of Canadians in the online spaces we use the most. That's appropriate if, and only if, you make sure it carefully considers and minimizes impacts on our freedom of expression and privacy. It isn't good enough for the commission to think about our rights only in its explicit decisions. A badly designed platform safety plan could reduce an online harm but have a wildly disproportionate impact on our privacy or freedom of expression. You need to make sure platforms and the regulator make written assessments of the impact of their plans on our rights and ensure that any impact is small and proportionate to the harm mitigated. Bill C-63's protections of private, encrypted communication, and against platforms surveilling their users, need to be strengthened further and made airtight.

OpenMedia has a unique role in this discussion because we are both a rights-defending community that will always stand up for our fundamental freedoms and a community of consumer advocates who fight for common-sense regulation that empowers us and improves our daily lives. If you do your work at this committee, you can make Bill C-63 a win on both counts. Since 2021, members of our community have sent nearly 22,000 messages to government asking you to get online harms right. Taking your time to study Bill C-63 carefully and make appropriate fixes before passing it would fulfill years of our activism and make our Internet a better, healthier place for many years to come.

Thank you, and I look forward to your questions.

Étienne-Alexis Boucher President, Droits collectifs Québec

Good evening, parliamentarians, honourable members of the House of Commons Standing Committee on Justice and Human Rights.

Thank you for this opportunity to speak as part of the pre‑study on Bill C‑63, which concerns online hate speech.

My name is Étienne‑Alexis Boucher. I'm the president of Droits collectifs Québec. I was supposed to be joined by François Côté, senior legal officer at Droits collectifs Québec. Unfortunately, he can't join us on account of the brand of his microphone.

Droits collectifs Québec is a non‑profit organization governed by an independent board of directors. It identifies as an agent of social transformation and operates throughout Quebec. Our mission is to help advocate for collective rights in Quebec, particularly with regard to people's language and constitutional rights. Our approach is non‑partisan. The organization's work encompasses many areas of action, including public education, social mobilization, political representation and legal action.

I've just given a brief overview of the organization. I would now like to focus on the Quebec consensus, which covers two aspects. We've already addressed the first, and it was touched on by the witnesses in the first panel earlier. We heard particularly poignant testimony from the mother of a young woman whose intimate images were shared.

While Ottawa refused to budge on this issue, Quebec ended up taking the lead and became a pioneer in the field. The matter normally falls under the Criminal Code, over which Quebec unfortunately has no power, at least for now. Using its own constitutional prerogatives, however, the National Assembly adopted measures concerning the sharing of intimate content without consent. In other words, since the federal government wasn't addressing the issue, Quebec responded to the Quebec consensus with this initiative.

Another example of the Quebec consensus is the National Assembly's unanimous adoption of the request to repeal paragraphs 319(3)(b) and 319(3.1)(b) of the Criminal Code. These paragraphs state that “no person shall be convicted of an offence” of wilfully promoting hatred against an identifiable group “if, in good faith, the person expressed or attempted to establish by an argument an opinion on a religious subject or an opinion based on a belief in a religious text.”

This exception in the name of religious freedom has no place in a modern state such as Canada. We know that the Constitution of 1867 states that power in Canada is granted by divine right. Even the head of state isn't chosen democratically by the citizens of Canada, but by God. However, it's now the 21st century. I don't think that freedom of religion should rank higher than freedom of conscience, for example, or freedom of political opinion, when everyone acknowledges that certain limits on them are valid. For example, teachers may not, in the course of their duties, express opinions on the political status of Quebec or Canada. These limits on a basic freedom are perfectly justifiable.

However, we find it completely unacceptable to turn something normally considered a crime into a non-crime in the name of freedom of religion. As a result, we encourage parliamentarians to heed the call of Quebec's justice minister. Once again, the vast majority of Quebeckers agree: the justice minister expressed a widely held consensus that hate speech based on religion is simply unacceptable.

There have been concrete examples. We've already seen the abuses and effects resulting from this exception. In a fully public manner, in front of hundreds of thousands of individuals, if we count the people who viewed the images widely available on social media, a call to genocide was made in the name of a religion.

Unfortunately, this call could not be criminally prosecuted, probably because of the exception. Again, we think this is unacceptable. This position is held by the Quebec government and by organizations such as the Rassemblement pour la laïcité, of which I am the vice-president. Ours is an umbrella organization for dozens of organizations representing thousands of people.

Dr. Emily Laidlaw Associate Professor and Canada Research Chair in Cybersecurity Law, University of Calgary, As an Individual

Thank you for the invitation to appear before you.

My name is Emily Laidlaw. I'm a Canada research chair and associate professor of law at the University of Calgary.

At the last committee meeting, and earlier today, you heard horrific stories, bringing home the harms this legislation aims to address. With my time, I'd like to focus on the legal structure for achieving these goals, why law is needed, why part 1 of Bill C-63 is structured the way it is and what amendments are needed.

My area of expertise is technology law and human rights: specifically, platform regulation, freedom of expression and privacy. I have spent my career examining how best to write these kinds of laws. I will make three points with my time.

First, why do we need a law in the first place? When the Internet was commercialized in the 1990s, tech companies became powerful arbiters of expression. They set the rules and decide how to enforce them. Their power has only grown over time.

Social media are essentially data and advertising businesses and, now, AI businesses. How they deliver their services to consumers and how they design their products can directly cause harm. For example, the way they design their algorithms makes decisions that affect our mental health, pushing content that encourages self-harm and hate. They use persuasive techniques to nudge addictive behaviour, such as endless scrolling rewards and constant notifications.

Thus far in Canada, we have largely relied on corporate self-governance. The EU, U.K. and U.S. passed legislation decades ago. Many jurisdictions are now on second-generation versions of these laws, and a network of regulators is working together to create global coherence.

Meanwhile, Canada has never passed a comprehensive law in this space. The law that does apply is piecemeal, mainly a bit of defamation, privacy and competition law, circling important dimensions of the problem, but not dealing with it directly.

Where does that leave us in Canada? Part 1 of Bill C-63 is the product of years of consultation, to which I contributed. In my view, with amendments, it is the best legal structure to address online harms.

That brings me to my second point. This legislation impacts the right to freedom of expression.

Our expert panel spent considerable time on how best to protect freedom of expression, and the graduated approach we recommended is reflected in this bill.

There are three levels to this graduated approach.

First, the greatest interference with freedom of expression is content removal, and the bill requires it for only two types of content, the worst of the worst, the content we all agree should be taken down: child sexual abuse material and the non-consensual disclosure of intimate images, both of which are crimes.

At the next level is a special duty to protect children, recognizing their unique vulnerability. The duty requires that social media integrate safety by design into their products and services.

The third, the foundation, is that social media have a duty to act responsibly. This does not require content removal. It requires that social media mitigate the risks of exposure to harmful content.

In my view, the bill aligns with global standards because it's focused on systemic risks of harm and takes a risk mitigation approach, coupled with transparency obligations.

Third, I am not here to advocate that the bill be passed as is. The bill is not perfect. It should be carefully studied and amended.

There are also other parts of the bill that don't necessarily need to be amended but that entail hard choices worth debating: the scope of the bill, meaning which harms are included and which are not; which social media are included, based on size or type; the regulatory structure, meaning a new versus an existing body and what powers it should have; and what should be included in the legislation versus left to be developed later in codes of practice or regulations.

There are, however, amendments that I do think are crucial. I'll close with this list. I have three.

One, the duty to act responsibly should also include a duty to have due regard for fundamental rights in how companies mitigate risk. Otherwise, social media might implement sloppy solutions in the name of safety that disproportionately impact rights. This type of provision is in the EU and U.K. legislation.

Two, the duty to act responsibly and duty to protect children should clearly cover algorithmic accountability and transparency. I think it's loosely covered in the current bill, but it should be fleshed out and made explicit.

Three, the child protection section should be reframed around the best interests of the child. In addition, the definitions of harmful content for children should be amended in two main ways. First, content that induces a child to harm themselves should be narrowly scoped so that children exploring their identity are not accidentally captured. Second, addictive design features should be added to the list.

Thank you for your time. I look forward to our discussion.

Michelle Ferreri Conservative Peterborough—Kawartha, ON

Exactly. With Bill C-63, you would still have to go through a person, a regulating body, so let's say it's an ombudsperson. They would then have to have a meeting with the regulating body, and then they would have to go to the social media platform.

What we're saying is that instead of having that go-between, you would get to go right to a judge, and the judge would say, okay, this is the person, because there's a duty of care for the social media platform to remove that image instantly.

Witness 1

I am trusting those who are in charge of Bill C-63 with what they're doing for the protection of all children in Canada.

Michelle Ferreri Conservative Peterborough—Kawartha, ON

I appreciate that you don't know what that bill is, so that's totally fair and I'm happy to share it with you.

I can tell you that with Bill C-63 there is still the concern of its being years down the road. What I'm saying is that we all want the same thing: we want the protection of children today. But if you implement a regulating body, and you don't have a duty of care on the social media platforms, then it's not instant, because the regulating body then has to have meeting after meeting, and so on.

Do you see what I'm saying? It's not direct to the person. Does that make sense?

Witness 1

First, what I'm going to say is that I'm not familiar with the bill you just spoke about.

My concern is what's in Bill C-63. I see it adding protection and moving things forward for my child and the other children.

Michelle Ferreri Conservative Peterborough—Kawartha, ON

Thank you.

Jane, what you've done today is very courageous. People don't know this is happening. They have no idea. I believe that halfway to beating this is.... Obviously, we have to do legislation and implement change, but people don't believe that parents traffic their children. People don't believe that children are used as sexual tools online daily, as you've testified here today. They don't know because they don't want to believe that humanity is that horrific.

I want to tell you thank you. We can't fix anything if we don't acknowledge what has actually happened. Thank you for that.

There are a couple of things I want to point out. The big thing we're trying to sort out here is the best recommendation so that we have implementation as soon as possible to protect children online. We've had witness testimony on sextortion. Children are taking their lives.

Jane, you're traumatized for the rest of your life. Your child is traumatized for the rest of her life. The impact on the community is significant.

Right now, the way that Bill C-63 is written, it calls for—and I'll use the language from it—a digital safety commission of Canada, a digital safety office of Canada, the position of a digital safety ombudsperson, and a mandate for the commission and ombudsperson to follow. This is another aspect of not having instant action.

To my Liberal colleague's point about an immediate takedown of the image, you're not going to have that with Bill C-63. You need a regulatory body to be put in place, which could take years.

What we're saying in Bill C-412 is that we would implement this instantly through the actual social media platform. A judge would have the capacity to instantly name the person who has the image, release their name and charge them. The duty of care then falls on the social media platforms to implement age verification—which we know they can do through algorithms.

The issue we're having with Bill C-63 is the same issue we've seen with other regulating bodies: the action doesn't come with the intention.

The example I will give you is the ombudsperson we have in this country for victims. They've seen an increase of 477%, and nothing happens after the victims go to the ombudsperson, right? There's no action tied to it.

My question for you, Jane, is this. Would you like to see a bill like Bill C-412 that implements instant action on the social media platforms and enables judges to ensure that those names are released so that there is actually a takedown and not just an intention of takedown?

Peter Julian NDP New Westminster—Burnaby, BC

The message you're sending us is very clear: that we need to take action. I think all members of the committee understand that. I can't thank you enough for coming forward today to share that with us.

I have questions for the other witnesses.

Now I'm going to turn to Ms. Bussières McNicoll and Ms. Claveau.

Part 1 of Bill C‑63 establishes fines. Operators are liable to “a fine of not more than 3% of the person’s gross global revenue or $10 million, whichever is greater”.

It says that, on summary conviction, an operator is liable to “a fine of not more than 2% of the person’s gross global revenue or $5 million, whichever is greater”.

Individuals are liable to “a fine of not more than $50,000”. That seems pretty low given the repercussions of the offence in question, such as the impact on Witness 1, her daughter and family.

It's one thing to put a legislative framework in place, but it's another to establish penalties in order to end the scourge. It's clear that the case involving Witness 1's daughter calls for significant penalties.

What do you think of the penalties I just mentioned and the approach outlined in the bill?

I would like Ms. Bussières McNicoll to answer first.

Michel Marchand Member, Criminal Law Expert Group, Barreau du Québec

Good afternoon.

An emotion of an intense and extreme nature is what is being used as an objective test.

It is important, however, to distinguish between the test set out in Keegstra and Mugesera, which were criminal law decisions, and the test set out in Whatcott and other human rights decisions. The decision was made to rework the test in Whatcott.

Basically, the test selected was the one established in the decisions I just mentioned. It was simply adjusted to clarify that the emotion must be assessed as it would reasonably be expected to arise: not in the person at the source of the content in question, but in the person on the receiving end of the content.

I think the definitions set out by the Supreme Court for the term “hatred” are very clear. It's about taking those criteria and incorporating them into the Criminal Code.

As I see it, the current provisions in Bill C‑63 set a lower standard than the test established in Mugesera.

I think it's important to be very careful because when you get into freedom of expression and freedom of religion, people have rights. The Supreme Court considered the issue very seriously and thoroughly, examining hundreds of pages of material before making the findings it did and rendering its decision.

James Maloney Liberal Etobicoke—Lakeshore, ON

Thank you, Mr. Chair.

I want to thank all the witnesses for joining us today.

Jane—I'll refer to you that way—thank you very much for sharing your horrific story with us. We're here talking about a bill presented by the government, Bill C-63, and particularly part 1. My question for you is one that you've somewhat addressed already. To quote you, “The unregulated Internet has damaged my child”, and it continues to do so.

An important part of part 1 of the bill, which is the part we're focusing on, is the so-called takedown provisions. Criminal Code provisions are one thing, but, as you alluded to, there's real importance in having the ability to instantly address a problem when it arises and have something removed from the Internet as soon as possible.

Can you expand on the importance of that, in your view? Also, if this is not passed into legislation now, can you explain what impact that might have on your family and others?

Tako Van Popta Conservative Langley—Aldergrove, BC

Fair enough.

In your testimony, you referenced parts 1, 2 and 3. You're happy that parts 2 and 3 have now been removed. I know that your organization recommended that, so the minister listened to your recommendation. Congratulations.

My question is on whether part 4 of Bill C-63 could be carved out and dealt with separately to accelerate the protection it would afford to people who are sexually harassed. Part 4, just for your reference, amends An Act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service. Part 1, on the other hand, creates a regulatory body, which will be time-consuming and expensive to set up. If part 4 is separated out completely, it could be dealt with very quickly.

What's your opinion on that?