An Act to enact the Online Harms Act, to amend the Criminal Code, the Canadian Human Rights Act and An Act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service and to make consequential and related amendments to other Acts

Sponsor

Arif Virani Liberal

Status

Second reading (House), as of Sept. 23, 2024


Summary

This is from the published bill. The Library of Parliament has also written a full legislative summary of the bill.

Part 1 of this enactment enacts the Online Harms Act, whose purpose is to, among other things, promote the online safety of persons in Canada, reduce harms caused to persons in Canada as a result of harmful content online and ensure that the operators of social media services in respect of which that Act applies are transparent and accountable with respect to their duties under that Act.
That Act, among other things,
(a) establishes the Digital Safety Commission of Canada, whose mandate is to administer and enforce that Act, ensure that operators of social media services in respect of which that Act applies are transparent and accountable with respect to their duties under that Act and contribute to the development of standards with respect to online safety;
(b) creates the position of Digital Safety Ombudsperson of Canada, whose mandate is to provide support to users of social media services in respect of which that Act applies and advocate for the public interest in relation to online safety;
(c) establishes the Digital Safety Office of Canada, whose mandate is to support the Digital Safety Commission of Canada and the Digital Safety Ombudsperson of Canada in the fulfillment of their mandates;
(d) imposes on the operators of social media services in respect of which that Act applies
(i) a duty to act responsibly in respect of the services that they operate, including by implementing measures that are adequate to mitigate the risk that users will be exposed to harmful content on the services and submitting digital safety plans to the Digital Safety Commission of Canada,
(ii) a duty to protect children in respect of the services that they operate by integrating into the services design features that are provided for by regulations,
(iii) a duty to make content that sexually victimizes a child or revictimizes a survivor and intimate content communicated without consent inaccessible to persons in Canada in certain circumstances, and
(iv) a duty to keep all records that are necessary to determine whether they are complying with their duties under that Act;
(e) authorizes the Digital Safety Commission of Canada to accredit certain persons that conduct research or engage in education, advocacy or awareness activities that are related to that Act for the purposes of enabling those persons to have access to inventories of electronic data and to electronic data of the operators of social media services in respect of which that Act applies;
(f) provides that persons in Canada may make a complaint to the Digital Safety Commission of Canada that content on a social media service in respect of which that Act applies is content that sexually victimizes a child or revictimizes a survivor or intimate content communicated without consent and authorizes the Commission to make orders requiring the operators of those services to make that content inaccessible to persons in Canada;
(g) authorizes the Governor in Council to make regulations respecting the payment of charges by the operators of social media services in respect of which that Act applies, for the purpose of recovering costs incurred in relation to that Act.
Part 1 also makes consequential amendments to other Acts.
Part 2 amends the Criminal Code to, among other things,
(a) create a hate crime offence of committing an offence under that Act or any other Act of Parliament that is motivated by hatred based on certain factors;
(b) create a recognizance to keep the peace relating to hate propaganda and hate crime offences;
(c) define “hatred” for the purposes of the new offence and the hate propaganda offences; and
(d) increase the maximum sentences for the hate propaganda offences.
It also makes related amendments to other Acts.
Part 3 amends the Canadian Human Rights Act to provide that it is a discriminatory practice to communicate or cause to be communicated hate speech by means of the Internet or any other means of telecommunication in a context in which the hate speech is likely to foment detestation or vilification of an individual or group of individuals on the basis of a prohibited ground of discrimination. It authorizes the Canadian Human Rights Commission to deal with complaints alleging that discriminatory practice and authorizes the Canadian Human Rights Tribunal to inquire into such complaints and order remedies.
Part 4 amends An Act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service to, among other things,
(a) clarify the types of Internet services covered by that Act;
(b) simplify the mandatory notification process set out in section 3 by providing that all notifications be sent to a law enforcement body designated in the regulations;
(c) require that transmission data be provided with the mandatory notice in cases where the content is manifestly child pornography;
(d) extend the period of preservation of data related to an offence;
(e) extend the limitation period for the prosecution of an offence under that Act; and
(f) add certain regulation-making powers.
Part 5 contains a coordinating amendment.


Alleged Premature Disclosure of Bill C-63 (Privilege, Oral Questions)

March 21st, 2024 / 3:15 p.m.



Andrew Scheer Conservative Regina—Qu'Appelle, SK

Mr. Speaker, I wanted to make a very brief intervention in response to the government House leader's parliamentary secretary's response to my question of privilege on Bill C-63 and the leak that occurred.

The parliamentary secretary's 25-minute submission extensively quoted the Internet. What it did not do, however, was explain exactly how the sources whom Travis Dhanraj and Rachel Aiello spoke to were lucky enough to state precisely which of the options the government consulted on would make it into the bill.

Had the reporting been based on the published consultation documents, the media reports would have said so, but they did not. They quoted “sources” who were “not authorized to speak publicly on the matter before the bill is tabled in Parliament.” The parliamentary secretary's implication that the sources were all stakeholders uninformed about the ways of Parliament is demonstrably untrue. CTV's source was “a senior government source”. The CBC attributed its article to “two sources, including one with the federal government”. Besides, had these sources actually all been stakeholders speaking about previous consultations, why would they have sought anonymity to begin with, let alone specify the need for anonymity, because the bill had not yet been introduced?

As I said back on February 26, the leakers knew what they were doing. They knew it was wrong, and they knew why it was wrong. We are not talking about general aspects of the bill that might have been shared with stakeholders during consultation processes. We are talking about very detailed information that was in the legislation and was leaked to the media before it was tabled in the House. That is the issue we are asking you to rule on, Mr. Speaker.

Arif Virani Liberal Parkdale—High Park, ON

My concluding remarks would be, with respect to Bill S-210 proposed by Senator Miville-Dechêne, that there are very legitimate questions that relate to privacy interests. We need to understand that age verification and age-appropriate design features are entrenched in Bill C-63, something that Monsieur Fortin seemed to misunderstand.

Second, the idea of uploading age-verification material such as one's government ID is something that has been roundly criticized, including by law enforcement, who are concerned about what that kind of privacy disclosure would do in terms of perpetuating financial crimes against Canadians.

What we need to be doing here is keeping Canadians safe by ensuring that their age-appropriate design measures have been informed by a conversation between law enforcement, government and the platforms themselves. There are examples of how to do this, and we're keen to work on those examples and to get this important bill into this committee so we can debate the best ways forward.

Thank you.

Élisabeth Brière Liberal Sherbrooke, QC

Thank you, Madam Chair.

Good morning, Minister. I'd like to thank you and your entire team for being with us this morning.

We are living in an increasingly divided world. Even though everyone is entitled to their own opinion, people are either for or against different issues. We are quick to put people into categories, to see them as being on one side or another and slap labels on them. In this increasingly complex world, and perhaps as my previous role taught me, I think it would help if people were more caring, attentive and open to each other.

In your opening remarks, you referred to Bill C‑63, which aims to protect children online. We have been hearing a lot about this bill. I have two questions for you.

First, do you believe that the definition of “hate speech” in Bill C‑63 will really make it possible to achieve the goal of protecting children online?

Second, the bill seems to apply pre-emptively, even before a person has said or done anything. I wonder if you could tell me your thoughts on that.

Rob Moore Conservative Fundy Royal, NB

Thank you, Minister.

Madam Chair, we'll have lots of time to debate Bill C‑63 in the future. I think the verdict is coming out very quickly on that. I want to use what's left of my time to now move my motion regarding former minister David Lametti on the issue of ex-judge Delisle, where the minister ordered a new trial.

I'm moving that motion now, Madam Speaker.

Rob Moore Conservative Fundy Royal, NB

Thank you, Chair.

Minister, we're here on the estimates today. You spent your entire opening remarks on a defence of Bill C-63. I recall your predecessor, Minister Lametti, when he was here. I asked him a question on the issue of MAID, when I think 25 constitutional experts said the minister's opinion on the matter was wrong. I asked the minister who was right, him or those 25 constitutional experts, and he said he was.

That kind of hubris is probably a good reason why he's no longer here and now you are, but we're starting to see that same thing on Bill C-63 with you, when virtually everyone has come out and said this was an effort to trample down freedom of speech. Margaret Atwood described Bill C-63 as “Orwellian”. David Thomas, who was chairperson of the Canadian Human Rights Tribunal, said:

The Liberal government's proposed Bill C-63, the online harms act, is terrible law that will unduly impose restrictions on Canadians' sacred Charter right to freedom of expression. That is what the Liberals intend. By drafting a vague law creating a draconian regime to address online “harms”, they will win their wars without firing a bullet.

There's a diverse group of people who feel that Bill C-63 is an outrageous infringement on Canadians' rights. We also see a government that will not stand up for the most vulnerable.

You had the opportunity, Minister, to introduce a bill that would have protected children, but your government, true to form, could not resist taking aim at their political opponents. This is not about hate speech, it's about speech that Liberals hate, and shutting that down.

Now Bill C-63, if it unfortunately were to pass, will also be struck down by the courts. If you were in a position to appeal that, I have no doubt you would. That brings me to my question on your government's radical agenda.

You've decided to file a number of appeals in recent court rulings. You've appealed a ruling that found the invocation of the Emergencies Act was unconstitutional. You appealed a ruling that found that the plastic bag ban and the plastic straw ban that Canadians hate so much was unconstitutional. You were quick to appeal those. But when the Supreme Court ruled the six-month minimum sentence for the crime of child luring was unconstitutional, you chose not to file an appeal.

Why is it that, when your government's radical agenda is challenged in the courts, you're quick to appeal, but when vulnerable Canadians' lives are at stake, you choose not to appeal?

Arif Virani Liberal Parkdale—High Park, ON

Thank you, Mr. Garrison, for your leadership on the first part of what you talked about and the courage that you continue to show as a parliamentarian, and also for your leadership and that of Laurel Collins on coercive control.

In terms of supporting victims, we are constantly and actively thinking about how to better support victims, including victims of intimate partner violence. Please take a cue from what we did in Bill C-75 and in Bill C-48 with respect to the reverse onus on bail for survivors of intimate partner violence. Issues about support and funding are always on the table.

Also, please understand that when you talk about a 24-hour takedown of things like revenge porn, you're dealing with an aspect of coercive control that exists right now. That's in Bill C-63.

You also mentioned, in your opening, hearing from voices. I think two of the most salient voices that I heard from were the two that were at the press conference with me: Jane, the mother of a child who has been sexually abused and repeatedly exploited online, and Carla Beauvais, a woman who has been intimidated and has retreated from participating in the public space.

I would also suggest taking your cues from the groups that were also there beside me. The National Council of Canadian Muslims and the Centre for Israel and Jewish Affairs have, in the last six months, not seen eye to eye on a lot of issues. On this bill, they do see eye to eye. They both support this, as do the special envoys on anti-Semitism and Islamophobia. Those are important voices to be hearing from, and that's what I will continue to do.

Randall Garrison NDP Esquimalt—Saanich—Sooke, BC

No, I won't.

I want to thank the minister for his very clear presentation on Bill C-63.

I want to add two things to this discussion. One is that the loudest voices on this bill often do not include those who are most likely to be subjected to hate crime campaigns. When it comes before this committee, I'm looking forward to a diversity of witnesses who can talk about the real-world impacts that online hate has. We've seen it again and again. It's often well organized.

I stood outside the House of Commons and defended the rights of trans kids. Within one day, I had 700 emails with the same defamatory and hateful two-word phrase used to describe me. I am a privileged person. I have a staff. I have all the resources and support I need. However, when you think about what happens to trans kids and their families when they are subjected to these online hate crimes, it has very real consequences.

I'm looking forward to us being able to hear from diverse voices and, in particular, those who are most impacted. I know this is not really a question to you at this point.

We have other important work we've been doing in this committee. I want to turn to Bill C-332, which just passed this committee and was sent back to the House. This is the bill on controlling and coercive behaviour. This committee has been dealing with this topic for more than three years. One of the things that we quite clearly said was that the passage of this bill is a tool for dealing with the epidemic of intimate partner violence, but it's not the only tool.

I guess I'm asking two things here.

What other plans does the Department of Justice have to provide the necessary and associated supports for survivors of intimate partner violence?

What plans are there to do the educational work that will be necessary?

The bill says it will be proclaimed at a time chosen by cabinet. I'm assuming there will be a plan to get ready for this. I'm interested in what's going to happen with that plan. It has unanimous support, so I don't think it's premature to be asking about this at this point.

Arif Virani Liberal Parkdale—High Park, ON

The point I want to make about Bill S‑210 is that Bill C‑63 already contains age verification mechanisms. Furthermore, we must always protect the privacy rights of Canadians. In other words—

Rhéal Fortin Bloc Rivière-du-Nord, QC

Thank you, Chair.

Thank you for being here, Minister.

I have several questions running through my head, but I'll have to prioritize them. I wish I had more time, but I understand that's the way it has to be done.

First, I have some questions about the legal aid system for immigrants and refugees. I'm sure you understand that this issue is of great concern to the Bloc Québécois. In Quebec, the amount owed by the federal government is a problem. In fact, the Quebec government is not getting paid, yet it continues to spend on newcomers.

There's also the question of official languages. A total of $1.2 million has been earmarked for official languages and I'm interested in hearing how that money will be distributed among the provinces.

In addition, there's obviously the whole issue of systemic racism. You want to help judges impose sentences that take this into account. How is that going to work? How are we going to define systemic racism?

There's the question of cybersecurity, in courthouses, etc.

There are plenty of important issues, essential even, that I won't necessarily be able to address this morning, unfortunately. However, I will try.

There's also Bill C‑63, which you told us about in your opening remarks. I'm not sure how it relates to the Supplementary Estimates (C), but it is an important question, regardless. With respect to this bill, I am curious as to why you didn't introduce the age verification process, as proposed by Senator Julie Miville-Dechêne. Her proposal seemed relatively wise to me, but there's no mention of it at all in Bill C‑63.

The Bloc Québécois is in the same boat. We've proposed abolishing the two religious exceptions in the Criminal Code, which I think is essential in the current context. How is it possible that someone can still build their defence around the idea that they committed a hate crime or spread hatred because of a religious text? That is completely absurd and contrary to the values shared by all Quebeckers and, I'm certain, by the rest of Canada too.

These are all essential questions, but I'm going to focus on two important elements.

First, our committee recently passed a bill that aims to create a commission to review errors in the justice system. This is obviously something that had to be done; congratulations. I think it was high time for a major clean‑up. The commission will comprise nine members. I've tabled an amendment to the effect that these nine commissioners should be bilingual. In fact, I'm a little surprised that this wasn't planned from the outset. Still, it seems a very modest goal. Nine bilingual commissioners across Canada shouldn't be too hard to achieve. However, I've run into an objection from some of my colleagues, including one of your Liberal colleagues.

I'd like to hear your thoughts on this. If we want the justice system to be bilingual, shouldn't we necessarily make an effort by asking for bilingualism among these nine commissioners? It's not as though there are 900 of them; there are nine.

March 21st, 2024 / 8:20 a.m.



Arif Virani Liberal Parkdale—High Park, ON, Minister of Justice and Attorney General of Canada

Thank you, Chair, and members of the Committee.

Thank you for inviting me to join you today.

I would like to begin by acknowledging that we are meeting on the traditional unceded territory of the Algonquin Anishinaabe Nation.

As I am sure you have seen, a few weeks ago, I introduced Bill C‑63, the Online Harms Act. I want to both explain the vital importance of the Online Harms Act and dispel misunderstandings about what it does and doesn't do.

The premise of this legislation is simple: we all expect to be safe in our homes, neighbourhoods and communities. We should be able to expect the same kind of security in our online communities. We need to address the online harms that threaten us, and especially our children, every day.

Let me start by talking about our children.

There are currently no safety standards mandated for the online platforms that kids use every day. In contrast, my children's LEGO in our basement is subject to rigorous safety standards and testing before my two boys get their hands on it. I know that these days my children spend much more time online than playing with their LEGO. The most dangerous toys in my home right now and in every Canadian home are the screens our children are on. Social media is everywhere. It brings unchecked dangers and horrific content. This, frankly, terrifies me. We need to make the Internet safe for our young people around the country.

As parents, one of the first things we teach all of our kids is how to cross the road. We tell them to wait for the green light. We tell them to look in both directions. We trust our children, but we also have faith in the rules of the road and that drivers will respect the rules of the road. We trust that cars will stop at a red light and obey the speed limit. Safety depends on a basic network of trust. This is exactly what we are desperately lacking in the digital world. The proposed online harms act would establish rules of the road for platforms so that we can teach our kids to be safe online, with the knowledge that platforms are also doing their part.

Now, let's talk about hate crimes.

The total number of police-reported hate crimes in Canada has reached its highest level on record, nearly doubling the rate recorded in 2019.

Police across the country are calling the increase “staggering”. Toronto Police Chief Myron Demkiw said this week that hate crime calls in Toronto have increased by 93% since last October. Communities and law enforcement have been calling on governments to act.

Bill C-63 creates a new stand-alone hate crime offence to make sure that hate crimes are properly prosecuted and identified. Under our current legal system, hate motivation for a crime is only considered as an afterthought at the sentencing stage; it is not part of the offence itself. The threshold for criminal hatred is high. Comments that offend, humiliate or insult do not meet the standard of hatred. They are what we call awful but lawful. The definition of hatred that we are embedding in the Criminal Code comes straight from the Supreme Court of Canada in the Keegstra and Whatcott decisions. We did not make up the definition of hatred that we are proposing.

It has been disappointing, though not surprising, to see the wildly inaccurate assertions made by some commentators about how sentencing for this new hate crime provision would work. I have heard some claim that, under this provision, someone who commits an offence under the National Parks Act would now be subject to a life sentence. That is simply false.

In Canada, judges impose sentences following sentencing ranges established through past decisions. Judges are required by law—and every member of this committee who is a lawyer will know this—to impose sentences that are proportionate to the offence committed. In other words, the punishment must always fit the crime. If judges impose sentences that are unfit, we have appeal courts that can overturn those sentences.

You may be asking, “Well, why not specify that, Minister? Why put a maximum sentence of life in the new hate crime offence-laying provision?”

Let me explain.

First, it's important to remember that a maximum sentence is not an average sentence; it's an absolute ceiling.

Second, the new hate crime offence captures any existing offence if it was hate-motivated. That can run the gamut from a hate-motivated theft all the way to a hate-motivated attempted murder. The sentencing range entrenched in Bill C-63 was designed to mirror the existing sentencing options for all of these potential underlying offences, from the most minor to the most serious offences on the books, such as attempted murder, which can attract, right now, a life sentence.

This does not mean that minor offences will suddenly receive extremely harsh sentences. This would violate all the legal principles that sentencing judges are required to follow. Hate-motivated murder will result in a life sentence. A minor infraction will certainly not result in it.

Another criticism I have heard is that this bill could stifle freedom of expression. This is simply not true. On the contrary, this bill strengthens freedom of expression. There are people in Canada who cannot speak out because they legitimately fear for their safety. When they speak out, they are mistreated and subjected to truly despicable threats, intimidation and harassment.

This is carefully balanced. We consulted. We looked abroad.

We do not automatically take down material within 24 hours except for child sexual abuse material or revenge pornography. We do not touch private communications. We do not affect individual websites that do not host user-generated content.

This bill protects children and gives everyone the tools they need to protect themselves online. We do not tolerate hate speech in the public square. Nor must we tolerate hate speech online.

We have seen the consequences of unchecked online hate and child sexual exploitation. Ask the families of the six people killed at the Quebec City mosque by someone who was radicalized online.

Ask the young boy orphaned by the horrific attack on four members of the Afzaal family in London, Ontario. Ask the parents of young people right across this country who have taken their own lives after being sextorted by online predators.

Finally, let me set the record straight on the peace bond provision in Bill C-63. Peace bonds are not house arrests. Peace bonds are not punishments. Peace bonds are well-established tools used to impose individually tailored conditions on someone when there is credible evidence to show that they may hurt someone or commit a crime. The proposed peace bond here would operate very similarly to existing peace bonds.

As an example, if someone posts online about their plan to deface or attack a synagogue to intimidate the Jewish community, members of the synagogue could take this information to the police and the court. They could seek to have a peace bond imposed after obtaining consent from the provincial attorney general. Decades of case law tell us that conditions must be reasonable and linked to the specific threat. Here conditions imposed on the person could include staying 100 metres away from that synagogue for a period of 12 months. If the person breached that simple condition, they could be arrested. If they abided by the conditions, they would face no consequences.

I ask you this: Why should members of that synagogue, when facing a credible threat of being targeted by a hate-motivated crime, have to wait to be attacked or to have a swastika graffitied on the front door before we act to help them? If we can prevent some attacks from happening, isn't that much better? Peace bonds are not perfect, but we believe they can be a valuable tool to keep people safe. In the face of rising hate crime, our government believes that doing nothing in an instance like this would be irresponsible.

I think that's what explains both CIJA's and the special envoy on anti-Semitism's support of Bill C-63.

As always, I am open to good faith suggestions to improve this legislation. My goal is to get it right. I look forward to debating the Online Harms Act in the House of Commons and following the committee's process as it reaches that stage. I am convinced that we all have the same goal here: we need to create a safe online world, especially for the most vulnerable members of our society—our children.

Thank you for your time.

I'm happy to take your questions.

Alleged Premature Disclosure of Bill C-63 (Privilege, Government Orders)

March 19th, 2024 / 5:15 p.m.



Kevin Lamoureux Liberal Winnipeg North, MB, Parliamentary Secretary to the Leader of the Government in the House of Commons

Mr. Speaker, I am rising to respond to a question of privilege raised by the member for Regina—Qu'Appelle on February 26 regarding the alleged premature disclosure of the content of Bill C-63, the online harms act.

I would like to begin by stating that the member is incorrect in asserting that there has been a leak of the legislation, and I will outline a comprehensive process of consultation and information being in the public domain on this issue long before the bill was placed on notice.

Online harms legislation is something that the government has been talking about for years. In 2015, the government promised to make ministerial mandate letters public, a significant departure from the secrecy around those key policy commitment documents from previous governments. As a result of the publication of the mandate letters, reporters are able to use the language from these letters to try to telegraph what the government bill on notice may contain.

In the 2021 Liberal election platform entitled “Forward. For Everyone.”, the party committed to the following:

Introduce legislation within its first 100 days to combat serious forms of harmful online content, specifically hate speech, terrorist content, content that incites violence, child sexual abuse material and the non-consensual distribution of intimate images. This would make sure that social media platforms and other online services are held accountable for the content that they host. Our legislation will recognize the importance of freedom of expression for all Canadians and will take a balanced and targeted approach to tackle extreme and harmful speech.

Strengthen the Canada Human Rights Act and the Criminal Code to more effectively combat online hate.

The December 16, 2021, mandate letter from the Prime Minister to the Minister of Justice and Attorney General of Canada asked the minister to achieve results for Canadians by delivering on the following commitment:

Continue efforts with the Minister of Canadian Heritage to develop and introduce legislation as soon as possible to combat serious forms of harmful online content to protect Canadians and hold social media platforms and other online services accountable for the content they host, including by strengthening the Canadian Human Rights Act and the Criminal Code to more effectively combat online hate and reintroduce measures to strengthen hate speech provisions, including the re-enactment of the former Section 13 provision. This legislation should be reflective of the feedback received during the recent consultations.

Furthermore, the December 16, 2021, mandate letter from the Prime Minister to the Minister of Canadian Heritage also asked the minister to achieve results for Canadians by delivering on the following commitment:

Continue efforts with the Minister of Justice and Attorney General of Canada to develop and introduce legislation as soon as possible to combat serious forms of harmful online content to protect Canadians and hold social media platforms and other online services accountable for the content they host. This legislation should be reflective of the feedback received during the recent consultations.

As we can see, the government publicly stated its intention to move ahead with online harms legislation, provided information on its plan and consulted widely on the proposal long before any bill was placed on the Notice Paper.

I will now draw to the attention of the House just how broadly the government has consulted on proposed online harms legislation.

Firstly, with regard to online consultations, from July 29 to September 25, 2021, the government published a proposed approach to address harmful content online for consultation and feedback. Two documents were presented for consultation: a discussion guide that summarized and outlined an overall approach, and a technical paper that summarized drafting instructions that could inform legislation.

I think it is worth repeating here that the government published a technical paper with the proposed framework for this legislation back in July 2021. This technical paper outlined the categories of proposed regulated harmful content; it addressed the establishment of a digital safety commissioner, a digital safety commission, regulatory powers and enforcement, etc.

Second is the round table on online safety. From July to November 2022, the Minister of Canadian Heritage conducted 19 virtual and in-person round tables across the country on the key elements of a legislative and regulatory framework on online safety. Virtual sessions were also held on the following topics: anti-Semitism, Islamophobia, anti-Black racism, anti-Asian racism, women and gender-based violence, and the tech industry.

Participants received an information document in advance of each session to prepare for the discussion. This document sought comments on the advice from the expert advisory group on online safety, which concluded its meetings on June 10. The feedback gathered from participants touched upon several key areas related to online safety.

Third is the citizens' assembly on democratic expression. The Department of Canadian Heritage, through the digital citizen initiative, is providing financial support to the Public Policy Forum's digital democracy project, which brings together academics, civil society and policy professionals to support research and policy development on disinformation and online harms. One component of this multi-year project is an annual citizens' assembly on democratic expression, which considers the impacts of digital technologies on Canadian society.

The assembly took place between June 15 and 19, 2023, in Ottawa, and focused on online safety. Participants heard views from a representative group of citizens on the core elements of a successful legislative and regulatory framework for online safety.

Furthermore, in March 2022, the government established an expert advisory group on online safety, mandated to provide advice to the Minister of Canadian Heritage on how to design the legislative and regulatory framework to address harmful content online and how to best incorporate the feedback received during the national consultation held from July to September 2021.

The expert advisory group, composed of 12 individuals, participated in 10 weekly workshops on the components of a legislative and regulatory framework for online safety. These included an introductory workshop and a summary concluding workshop.

The government undertook its work with the expert advisory group in an open and transparent manner. A Government of Canada web page, entitled “The Government's commitment to address online safety”, has been online for more than a year. It outlines all of this in great detail.

I now want to address the specific areas that the opposition House leader raised in his intervention. The member pointed to a quote from a CBC report referencing the intention to create a new regulator that would hold online platforms accountable for harmful content they host. The same website that I just referenced states the following: “The Government of Canada is committed to putting in place a transparent and accountable regulatory framework for online safety in Canada. Now, more than ever, online services must be held responsible for addressing harmful content on their platforms and creating a safe online space that protects all Canadians.”

Again, this website has been online for more than a year, long before the bill was actually placed on notice. The creation of a regulator to hold online services to account is something the government has been talking about, consulting on and committing to for a long period of time.

The member further cites a CBC article that talks about a new regulatory body to oversee a digital safety office. I would draw to the attention of the House the “Summary of Session Four: Regulatory Powers” of the expert advisory group on online safety, which states:

There was consensus on the need for a regulatory body, which could be in the form of a Digital Safety Commissioner. Experts agreed that the Commissioner should have audit powers, powers to inspect, have the powers to administer financial penalties and the powers to launch investigations to seek compliance if a systems-based approach is taken—but views differed on the extent of these powers. A few mentioned that it would be important to think about what would be practical and achievable for the role of the Commissioner. Some indicated they were reluctant to give too much power to the Commissioner, but others noted that the regulator would need to have “teeth” to force compliance.

This web page has been online for months.

I also reject the premise of what the member for Regina—Qu'Appelle stated when quoting the CBC story in question as it relates to the claim that the bill will be modelled on the European Union's Digital Services Act. This legislation is a made-in-Canada approach. The European Union model regulates more than social media and targets the marketplace and sellers. It also covers election disinformation and certain targeted ads, which our online harms legislation does not.

The member also referenced a CTV story regarding the types of online harms that the legislation would target. I would refer to the 2021 Liberal election platform, which contained the following areas as targets for the proposed legislation: “hate speech, terrorist content, content that incites violence, child sexual abuse material and the non-consensual distribution of intimate images.” These five items were the subject of the broad-based and extensive consultations I referenced earlier in my intervention.

Based on these consultations, a further two were added to the list to be considered. I would draw the attention of the House to an excerpt from the consultation entitled, “What We Heard: The Government’s proposed approach to address harmful content online”, which states, “Participants also suggested the inclusion of deep fake technology in online safety legislation”. It continues, “Many noted how child pornography and cyber blackmailing can originate from outside of Canada. Participants expressed frustration over the lack of recourse and tools available to victims to handle such instances and mentioned the need for a collaborative international effort to address online safety.”

It goes on to state:

Some respondents appreciated the proposal going beyond the Criminal Code definitions for certain types of content. They supported the decision to include material relating to child sexual exploitation in the definition that might not constitute a criminal offence, but which would nevertheless significantly harm children. A few stakeholders said that the proposal did not go far enough and that legislation could be broader by capturing content such as images of labour exploitation and domestic servitude of children. Support was also voiced for a concept of non-consensual sharing of intimate images.

It also notes:

A few respondents stated that additional types of content, such as doxing (i.e., the non-consensual disclosure of an individual’s private information), disinformation, bullying, harassment, defamation, conspiracy theories and illicit online opioid sales should also be captured by the legislative and regulatory framework.

This document has been online for more than a year.

I would also point to the expert advisory group's “Concluding Workshop Summary” web page, which states:

They emphasized the importance of preventing the same copies of some videos, like live-streamed atrocities, and child sexual abuse, from being shared again. Experts stressed that many file sharing services allow content to spread very quickly.

It goes on to say:

Experts emphasized that particularly egregious content like child sexual exploitation content would require its own solution. They explained that the equities associated with the removal of child pornography are different than other kinds of content, in that context simply does not matter with such material. In comparison, other types of content like hate speech may enjoy Charter protection in certain contexts. Some experts explained that a takedown obligation with a specific timeframe would make the most sense for child sexual exploitation content.

It also notes:

Experts disagreed on the usefulness of the five categories of harmful content previously identified in the Government’s 2021 proposal. These five categories include hate speech, terrorist content, incitement to violence, child sexual exploitation, and the non-consensual sharing of intimate images.

Another point is as follows:

A few participants pointed out how the anonymous nature of social media gives users more freedom to spread online harm such as bullying, death threats and online hate. A few participants noted that this can cause greater strain on the mental health of youth and could contribute to a feeling of loneliness, which, if unchecked, could lead to self-harm.

Again, this web page has been online for more than a year.

The member further cites the CTV article's reference to a new digital safety ombudsperson. I would point to the web page of the expert advisory group for the “Summary of Session Four: Regulatory Powers”, which states:

The Expert Group discussed the idea of an Ombudsperson and how it could relate to a Digital Safety Commissioner. Experts proposed that an Ombudsperson could be more focused on individual complaints ex post, should users not be satisfied with how a given service was responding to their concerns, flags and/or complaints. In this scheme, the Commissioner would assume the role of the regulator ex ante, with a mandate devoted to oversight and enforcement powers. Many argued that an Ombudsperson role should be embedded in the Commissioner’s office, and that information sharing between these functions would be useful. A few experts noted that the term “Ombudsperson” would be recognizable across the country as it is a common term and [has] meaning across other regimes in Canada.

It was mentioned that the Ombudsperson could play more of an adjudicative role, as distinguished from...the Commissioner’s oversight role, and would have some authority to have certain content removed off of platforms. Some experts noted that this would provide a level of comfort to victims. A few experts raised questions about where the line would be drawn between a private complaint and resolution versus the need for public authorities to be involved.

That web page has been online for months.

Additionally, during the round table on online safety and anti-Black racism, as the following summary states:

Participants were supportive of establishing a digital safety ombudsperson to hold social media platforms accountable and to be a venue for victims to report online harms. It was suggested the ombudsperson could act as a body that takes in victim complaints and works with the corresponding platform or governmental body to resolve the complaint. Some participants expressed concern over the ombudsperson's ability to process and respond to user complaints in a timely manner. To ensure the effectiveness of the ombudsperson, participants believe the body needs to have enough resources to keep pace with the complaints it receives. A few participants also noted the importance for the ombudsperson to be trained in cultural nuances to understand the cultural contexts behind content that is reported to them.

That web page has been online for more than a year.

Finally, I would draw the attention of the House to a Canadian Press article of February 21, 2024, which states, “The upcoming legislation is now expected to pave the way for a new ombudsperson to field public concerns about online content, as well as a new regulatory role that would oversee the conduct of internet platforms.” This appeared online before the bill was placed on notice.

Mr. Speaker, as your predecessor reiterated in his ruling on March 9, 2021, “it is a recognized principle that the House must be the first to learn the details of new legislative measures.” He went on to say, “...when the Chair is called on to determine whether there is a prima facie case of privilege, it must take into consideration the extent to which a member was hampered in performing their parliamentary functions and whether the alleged facts are an offence against the dignity of Parliament.” The Chair also indicated:

When it is determined that there is a prima facie case of privilege, the usual work of the House is immediately set aside in order to debate the question of privilege and decide on the response. Given the serious consequences for proceedings, it is not enough to say that the breach of privilege or contempt may have occurred, nor to cite precedence in the matter while implying that the government is presumably in the habit of acting in this way. The allegations must be clear and convincing for the Chair.

The government understands and respects the well-established practice that members have a right of first access to the legislation. It is clear that the government has been talking about and consulting widely on its plan to introduce online harms legislation for the past two years. As I have demonstrated, the public consultations have been wide-ranging and in-depth with documents and technical papers provided. All of this occurred prior to the bill's being placed on notice.

Some of the information provided by the member for Regina—Qu'Appelle is not even in the bill, most notably the reference to its being modelled on the European Union's Digital Services Act, which is simply false, as I have clearly demonstrated. The member also hangs his arguments on the usage of the vernacular “not authorized to speak publicly” in the media reports he cites. It is certainly not proof of a leak, especially when the government consulted widely and publicly released details on the content of the legislative proposal for years before any bill was actually placed on notice.

The development of the legislation has been characterized by open, public and wide-ranging consultations with specific proposals consulted on. This is how the Leader of the Opposition was able to proclaim, on February 21, before the bill was even placed on notice, that he and his party were vehemently opposed to the bill. He was able to make this statement because of the public consultation and the information that the government has shared about its plan over the last two years. I want to be clear that the government did not share the bill before it was introduced in the House, and the evidence demonstrates that there was no premature disclosure of the bill.

I would submit to the House that consulting Canadians this widely is a healthy way to produce legislation and that the evidence I have presented clearly demonstrates that there is no prima facie question of privilege. It is our view that this does not give way for the Chair to conclude that there was a breach of privilege of the House nor to give the matter precedence over all other business of the House.

Alleged Premature Disclosure of Bill C-63
Privilege
Government Orders

February 26th, 2024 / 5:15 p.m.



Conservative

Andrew Scheer Conservative Regina—Qu'Appelle, SK

Mr. Speaker, I am rising this afternoon on a question of privilege concerning the leak of key details of Bill C-63, the so-called online harms bill, which was tabled in the House earlier today.

While a lot will be said in the days, weeks and months ahead about the bill in the House, its parliamentary journey is not off to a good start. Yesterday afternoon, the CBC published on its website an article entitled “Ottawa to create regulator to hold online platforms accountable for harmful content: sources”. The article, written by Naama Weingarten and Travis Dhanraj, outlined several aspects of the bill with the information attributed to two sources “with knowledge of Monday's legislation”.

I will read brief excerpts of the CBC's report revealing details of the bill before it was tabled in Parliament.

“The Online Harms Act, expected to be introduced by the federal government on Monday, will include the creation of a new regulator that would hold online platforms accountable for harmful content they host, CBC News has confirmed.”

“The new regulatory body is expected to oversee a digital safety office with the mandate of reducing online harm and will be separate from the Canadian Radio-television and Telecommunications Commission (CRTC), sources say.”

“Sources say some components of the new bill will be modelled on the European Union's Digital Services Act. According to the European Commission, its act “regulates online intermediaries and platforms such as marketplaces, social networks, content-sharing platforms, app stores, and online travel and accommodation platforms.””

Then, today, CTV News published a second report entitled “Justice Minister to Introduce New Bill to Tackle Harmful Online Content”. In Rachel Aiello's article, she says, “According to a senior government source [Bill C-63] would be expected to put an emphasis on harms to youth including specific child protection obligations for social media and other online platforms, including enhanced preservation requirements. It targets seven types of online harms: hate speech, terrorist content, incitement to violence, the sharing of non-consensual intimate images, child exploitation, cyberbullying, and inciting self-harm, and includes measures to crack down on non-consensual artificial intelligence pornography, deepfakes and require takedown provisions for what's become known as 'revenge porn'. Further, while the sources suggested there will be no new powers for law enforcement, multiple reports have indicated the bill will propose creating a new digital safety ombudsperson to field Canadians' concerns about platform decisions around content moderation.”

As explained in footnote 125 on page 84 of the House of Commons Procedure and Practice, third edition, on March 19, 2001: “Speaker Milliken ruled that the provision of information concerning legislation to the media without any effective measures to secure the rights of the House constituted a prima facie case of contempt.”

The subsequent report of the Standing Committee on Procedure and House Affairs concluded: “This case should serve as a warning that our House will insist on the full recognition of its constitutional function and historic privileges across the full spectrum of government.”

Sadly, Mr. Speaker, the warning has had to be sounded multiple times since. Your predecessors found similar prima facie contempts on October 15, 2001, April 19, 2016 and March 10, 2020, not to mention several other close-call rulings that fell short of the necessary threshold yet saw the Chair sound cautionary notes for future reference. A number of those close-call rulings occurred under the present government, which would often answer questions of privilege with claims that no one could be certain who had leaked the bill or even when it had been leaked, citing advanced policy consultations with stakeholders.

Mr. Speaker, your immediate predecessor explained, on March 10, 2020, on page 1,892 of the Debates, the balancing act that must be observed. He said:

The rule on the confidentiality of bills on notice exists to ensure that members, in their role as legislators, are the first to know their content when they are introduced. Although it is completely legitimate to carry out consultations when developing a bill or to announce one’s intention to introduce a bill by referring to its public title available on the Notice Paper and Order Paper, it is forbidden to reveal specific measures contained in a bill at the time it is put on notice.

In the present circumstances, no such defence about stakeholders talking about their consultations can be offered. The two sources the CBC relied upon for its reporting were, according to the CBC itself, granted anonymity “because they were not authorized to speak publicly on the matter before the bill is tabled in Parliament.”

As for the CTV report, its senior government source “was not authorized to speak publicly about details yet to be made public.”

When similar comments were made by the Canadian Press in its report on the leak of the former Bill C-7 respecting medical assistance in dying, Mr. Speaker, your immediate predecessor had this to say when finding a prima facie contempt in his March 10, 2020 ruling:

Everything indicates that the act was deliberate. It is difficult to posit a misunderstanding or ignorance of the rules in this case.

Just as in 2020, the leakers knew what they were doing. They knew it was wrong and they knew why it was wrong. The House must stand up for its rights, especially against a government that appears happy to trample over them in the pursuit of legislating the curtailing of Canadians' rights.

Mr. Speaker, if you agree with me that there is a prima facie contempt, I am prepared to move the appropriate motion.

Online Harms Act
Routine Proceedings

February 26th, 2024 / 3:25 p.m.



Parkdale—High Park, Ontario

Liberal

Arif Virani Liberal Minister of Justice and Attorney General of Canada

moved for leave to introduce Bill C-63, An Act to enact the Online Harms Act, to amend the Criminal Code, the Canadian Human Rights Act and An Act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service, and to make consequential and related amendments to other Acts.

(Motions deemed adopted, bill read the first time and printed)