An Act to enact the Online Harms Act, to amend the Criminal Code, the Canadian Human Rights Act and An Act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service and to make consequential and related amendments to other Acts

This bill is from the 44th Parliament, 1st session, which ended in January 2025.

Sponsor

Arif Virani Liberal

Status

Second reading (House), as of Sept. 23, 2024
(This bill did not become law.)

Summary

This is from the published bill. The Library of Parliament has also written a full legislative summary of the bill.

Part 1 of this enactment enacts the Online Harms Act, whose purpose is to, among other things, promote the online safety of persons in Canada, reduce harms caused to persons in Canada as a result of harmful content online and ensure that the operators of social media services in respect of which that Act applies are transparent and accountable with respect to their duties under that Act.
That Act, among other things,
(a) establishes the Digital Safety Commission of Canada, whose mandate is to administer and enforce that Act, ensure that operators of social media services in respect of which that Act applies are transparent and accountable with respect to their duties under that Act and contribute to the development of standards with respect to online safety;
(b) creates the position of Digital Safety Ombudsperson of Canada, whose mandate is to provide support to users of social media services in respect of which that Act applies and advocate for the public interest in relation to online safety;
(c) establishes the Digital Safety Office of Canada, whose mandate is to support the Digital Safety Commission of Canada and the Digital Safety Ombudsperson of Canada in the fulfillment of their mandates;
(d) imposes on the operators of social media services in respect of which that Act applies
(i) a duty to act responsibly in respect of the services that they operate, including by implementing measures that are adequate to mitigate the risk that users will be exposed to harmful content on the services and submitting digital safety plans to the Digital Safety Commission of Canada,
(ii) a duty to protect children in respect of the services that they operate by integrating into the services design features that are provided for by regulations,
(iii) a duty to make content that sexually victimizes a child or revictimizes a survivor and intimate content communicated without consent inaccessible to persons in Canada in certain circumstances, and
(iv) a duty to keep all records that are necessary to determine whether they are complying with their duties under that Act;
(e) authorizes the Digital Safety Commission of Canada to accredit certain persons that conduct research or engage in education, advocacy or awareness activities that are related to that Act for the purposes of enabling those persons to have access to inventories of electronic data and to electronic data of the operators of social media services in respect of which that Act applies;
(f) provides that persons in Canada may make a complaint to the Digital Safety Commission of Canada that content on a social media service in respect of which that Act applies is content that sexually victimizes a child or revictimizes a survivor or intimate content communicated without consent and authorizes the Commission to make orders requiring the operators of those services to make that content inaccessible to persons in Canada;
(g) authorizes the Governor in Council to make regulations respecting the payment of charges by the operators of social media services in respect of which that Act applies, for the purpose of recovering costs incurred in relation to that Act.
Part 1 also makes consequential amendments to other Acts.
Part 2 amends the Criminal Code to, among other things,
(a) create a hate crime offence of committing an offence under that Act or any other Act of Parliament that is motivated by hatred based on certain factors;
(b) create a recognizance to keep the peace relating to hate propaganda and hate crime offences;
(c) define “hatred” for the purposes of the new offence and the hate propaganda offences; and
(d) increase the maximum sentences for the hate propaganda offences.
It also makes related amendments to other Acts.
Part 3 amends the Canadian Human Rights Act to provide that it is a discriminatory practice to communicate or cause to be communicated hate speech by means of the Internet or any other means of telecommunication in a context in which the hate speech is likely to foment detestation or vilification of an individual or group of individuals on the basis of a prohibited ground of discrimination. It authorizes the Canadian Human Rights Commission to deal with complaints alleging that discriminatory practice and authorizes the Canadian Human Rights Tribunal to inquire into such complaints and order remedies.
Part 4 amends An Act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service to, among other things,
(a) clarify the types of Internet services covered by that Act;
(b) simplify the mandatory notification process set out in section 3 by providing that all notifications be sent to a law enforcement body designated in the regulations;
(c) require that transmission data be provided with the mandatory notice in cases where the content is manifestly child pornography;
(d) extend the period of preservation of data related to an offence;
(e) extend the limitation period for the prosecution of an offence under that Act; and
(f) add certain regulation-making powers.
Part 5 contains a coordinating amendment.

Similar bills

C-36 (43rd Parliament, 2nd session) An Act to amend the Criminal Code and the Canadian Human Rights Act and to make related amendments to another Act (hate propaganda, hate crimes and hate speech)

Elsewhere

All sorts of information on this bill is available at LEGISinfo, an excellent resource from Parliament. You can also read the full text of the bill.

Bill numbers are reused for different bills each new session. Perhaps you were looking for one of these other C-63s:

C-63 (2017) Law Budget Implementation Act, 2017, No. 2
C-63 (2015) Law Déline Final Self-Government Agreement Act
C-63 (2013) Law Appropriation Act No. 2, 2013-14
C-63 (2009) First Nations Certainty of Land Title Act

Premature Disclosure of Bill C-63 / Privilege / Government Orders

March 19th, 2024 / 5:15 p.m.



Winnipeg North, Manitoba

Liberal

Kevin Lamoureux Liberal Parliamentary Secretary to the Leader of the Government in the House of Commons

Mr. Speaker, I am rising to respond to a question of privilege raised by the member for Regina—Qu'Appelle on February 26 regarding the alleged premature disclosure of the content of Bill C-63, the online harms act.

I would like to begin by stating that the member is incorrect in asserting that there has been a leak of the legislation. I will outline the comprehensive consultation process and the information that was in the public domain on this issue long before the bill was placed on notice.

Online harms legislation is something that the government has been talking about for years. In 2015, the government promised to make ministerial mandate letters public, a significant departure from the secrecy around those key policy commitment documents from previous governments. As a result of the publication of the mandate letters, reporters are able to use the language from these letters to try to telegraph what the government bill on notice may contain.

In the 2021 Liberal election platform, entitled “Forward. For Everyone”, the party committed to the following:

Introduce legislation within its first 100 days to combat serious forms of harmful online content, specifically hate speech, terrorist content, content that incites violence, child sexual abuse material and the non-consensual distribution of intimate images. This would make sure that social media platforms and other online services are held accountable for the content that they host. Our legislation will recognize the importance of freedom of expression for all Canadians and will take a balanced and targeted approach to tackle extreme and harmful speech.

Strengthen the Canadian Human Rights Act and the Criminal Code to more effectively combat online hate.

The December 16, 2021, mandate letter from the Prime Minister to the Minister of Justice and Attorney General of Canada asked the minister to achieve results for Canadians by delivering on the following commitment:

Continue efforts with the Minister of Canadian Heritage to develop and introduce legislation as soon as possible to combat serious forms of harmful online content to protect Canadians and hold social media platforms and other online services accountable for the content they host, including by strengthening the Canadian Human Rights Act and the Criminal Code to more effectively combat online hate and reintroduce measures to strengthen hate speech provisions, including the re-enactment of the former Section 13 provision. This legislation should be reflective of the feedback received during the recent consultations.

Furthermore, the December 16, 2021, mandate letter from the Prime Minister to the Minister of Canadian Heritage also asked the minister to achieve results for Canadians by delivering on the following commitment:

Continue efforts with the Minister of Justice and Attorney General of Canada to develop and introduce legislation as soon as possible to combat serious forms of harmful online content to protect Canadians and hold social media platforms and other online services accountable for the content they host. This legislation should be reflective of the feedback received during the recent consultations.

As we can see, the government publicly stated its intention to move ahead with online harms legislation, provided information on its plan and consulted widely on the proposal long before any bill was placed on the Notice Paper.

I will now draw to the attention of the House just how broadly the government has consulted on proposed online harms legislation.

First, with regard to online consultations: from July 29 to September 25, 2021, the government published a proposed approach to address harmful content online for consultation and feedback. Two documents were presented for consultation: a discussion guide that summarized and outlined an overall approach, and a technical paper that summarized drafting instructions that could inform legislation.

I think it is worth repeating here that the government published a technical paper with the proposed framework for this legislation back in July 2021. This technical paper outlined the categories of proposed regulated harmful content; it addressed the establishment of a digital safety commissioner, a digital safety commission, regulatory powers and enforcement, etc.

Second is the round table on online safety. From July to November 2022, the Minister of Canadian Heritage conducted 19 virtual and in-person round tables across the country on the key elements of a legislative and regulatory framework on online safety. Virtual sessions were also held on the following topics: anti-Semitism, Islamophobia, anti-Black racism, anti-Asian racism, women and gender-based violence, and the tech industry.

Participants received an information document in advance of each session to prepare for the discussion. This document sought comments on the advice from the expert advisory group on online safety, which concluded its meetings on June 10. The feedback gathered from participants touched upon several key areas related to online safety.

Third is the citizens' assembly on democratic expression. The Department of Canadian Heritage, through the digital citizen initiative, is providing financial support to the Public Policy Forum's digital democracy project, which brings together academics, civil society and policy professionals to support research and policy development on disinformation and online harms. One component of this multi-year project is an annual citizens' assembly on democratic expression, which considers the impacts of digital technologies on Canadian society.

The assembly took place between June 15 and 19, 2023, in Ottawa, and focused on online safety. It gathered the views of a representative group of citizens on the core elements of a successful legislative and regulatory framework for online safety.

Furthermore, in March 2022, the government established an expert advisory group on online safety, mandated to provide advice to the Minister of Canadian Heritage on how to design the legislative and regulatory framework to address harmful content online and how to best incorporate the feedback received during the national consultation held from July to September 2021.

The expert advisory group, composed of 12 individuals, participated in 10 weekly workshops on the components of a legislative and regulatory framework for online safety. These included an introductory workshop and a summary concluding workshop.

The government undertook its work with the expert advisory group in an open and transparent manner. A Government of Canada web page, entitled “The Government's commitment to address online safety”, has been online for more than a year. It outlines all of this in great detail.

I now want to address the specific areas that the opposition House leader raised in his intervention. The member pointed to a quote from a CBC report referencing the intention to create a new regulator that would hold online platforms accountable for harmful content they host. The same website that I just referenced states the following: “The Government of Canada is committed to putting in place a transparent and accountable regulatory framework for online safety in Canada. Now, more than ever, online services must be held responsible for addressing harmful content on their platforms and creating a safe online space that protects all Canadians.”

Again, this website has been online for more than a year, long before the bill was actually placed on notice. The creation of a regulator to hold online services to account is something the government has been talking about, consulting on and committing to for a long period of time.

The member further cites a CBC article that talks about a new regulatory body to oversee a digital safety office. I would draw to the attention of the House the “Summary of Session Four: Regulatory Powers” of the expert advisory group on online safety, which states:

There was consensus on the need for a regulatory body, which could be in the form of a Digital Safety Commissioner. Experts agreed that the Commissioner should have audit powers, powers to inspect, have the powers to administer financial penalties and the powers to launch investigations to seek compliance if a systems-based approach is taken—but views differed on the extent of these powers. A few mentioned that it would be important to think about what would be practical and achievable for the role of the Commissioner. Some indicated they were reluctant to give too much power to the Commissioner, but others noted that the regulator would need to have “teeth” to force compliance.

This web page has been online for months.

I also reject the premise of what the member for Regina—Qu'Appelle stated when quoting the CBC story in question as it relates to the claim that the bill will be modelled on the European Union's Digital Services Act. This legislation is a made-in-Canada approach. The European Union model regulates more than social media and targets the marketplace and sellers. It also covers election disinformation and certain targeted ads, which our online harms legislation does not.

The member also referenced a CTV story regarding the types of online harms that the legislation would target. I would refer to the 2021 Liberal election platform, which contained the following areas as targets for the proposed legislation: “hate speech, terrorist content, content that incites violence, child sexual abuse material and the non-consensual distribution of intimate images.” These five items were the subject of the broad-based and extensive consultations I referenced earlier in my intervention.

Based on these consultations, a further two were added to the list to be considered. I would draw the attention of the House to an excerpt from the consultation entitled, “What We Heard: The Government’s proposed approach to address harmful content online”, which states, “Participants also suggested the inclusion of deep fake technology in online safety legislation”. It continues, “Many noted how child pornography and cyber blackmailing can originate from outside of Canada. Participants expressed frustration over the lack of recourse and tools available to victims to handle such instances and mentioned the need for a collaborative international effort to address online safety.”

It goes on to state:

Some respondents appreciated the proposal going beyond the Criminal Code definitions for certain types of content. They supported the decision to include material relating to child sexual exploitation in the definition that might not constitute a criminal offence, but which would nevertheless significantly harm children. A few stakeholders said that the proposal did not go far enough and that legislation could be broader by capturing content such as images of labour exploitation and domestic servitude of children. Support was also voiced for a concept of non-consensual sharing of intimate images.

It also notes:

A few respondents stated that additional types of content, such as doxing (i.e., the non-consensual disclosure of an individual’s private information), disinformation, bullying, harassment, defamation, conspiracy theories and illicit online opioid sales should also be captured by the legislative and regulatory framework.

This document has been online for more than a year.

I would also point to the expert advisory group's “Concluding Workshop Summary” web page, which states:

They emphasized the importance of preventing the same copies of some videos, like live-streamed atrocities, and child sexual abuse, from being shared again. Experts stressed that many file sharing services allow content to spread very quickly.

It goes on to say:

Experts emphasized that particularly egregious content like child sexual exploitation content would require its own solution. They explained that the equities associated with the removal of child pornography are different than other kinds of content, in that context simply does not matter with such material. In comparison, other types of content like hate speech may enjoy Charter protection in certain contexts. Some experts explained that a takedown obligation with a specific timeframe would make the most sense for child sexual exploitation content.

It also notes:

Experts disagreed on the usefulness of the five categories of harmful content previously identified in the Government’s 2021 proposal. These five categories include hate speech, terrorist content, incitement to violence, child sexual exploitation, and the non-consensual sharing of intimate images.

Another point is as follows:

A few participants pointed out how the anonymous nature of social media gives users more freedom to spread online harm such as bullying, death threats and online hate. A few participants noted that this can cause greater strain on the mental health of youth and could contribute to a feeling of loneliness, which, if unchecked, could lead to self-harm.

Again, this web page has been online for more than a year.

The member further cites the CTV article's reference to a new digital safety ombudsperson. I would point to the web page of the expert advisory group for the “Summary of Session Four: Regulatory Powers”, which states:

The Expert Group discussed the idea of an Ombudsperson and how it could relate to a Digital Safety Commissioner. Experts proposed that an Ombudsperson could be more focused on individual complaints ex post, should users not be satisfied with how a given service was responding to their concerns, flags and/or complaints. In this scheme, the Commissioner would assume the role of the regulator ex ante, with a mandate devoted to oversight and enforcement powers. Many argued that an Ombudsperson role should be embedded in the Commissioner’s office, and that information sharing between these functions would be useful. A few experts noted that the term “Ombudsperson” would be recognizable across the country as it is a common term and [has] meaning across other regimes in Canada.

It was mentioned that the Ombudsperson could play more of an adjudicative role, as distinguished from...the Commissioner’s oversight role, and would have some authority to have certain content removed off of platforms. Some experts noted that this would provide a level of comfort to victims. A few experts raised questions about where the line would be drawn between a private complaint and resolution versus the need for public authorities to be involved.

That web page has been online for months.

Additionally, the summary of the round table on online safety and anti-Black racism states:

Participants were supportive of establishing a digital safety ombudsperson to hold social media platforms accountable and to be a venue for victims to report online harms. It was suggested the ombudsperson could act as a body that takes in victim complaints and works with the corresponding platform or governmental body to resolve the complaint. Some participants expressed concern over the ombudsperson's ability to process and respond to user complaints in a timely manner. To ensure the effectiveness of the ombudsperson, participants believe the body needs to have enough resources to keep pace with the complaints it receives. A few participants also noted the importance for the ombudsperson to be trained in cultural nuances to understand the cultural contexts behind content that is reported to them.

That web page has been online for more than a year.

Finally, I would draw the attention of the House to a Canadian Press article of February 21, 2024, which states, “The upcoming legislation is now expected to pave the way for a new ombudsperson to field public concerns about online content, as well as a new regulatory role that would oversee the conduct of internet platforms.” This appeared online before the bill was placed on notice.

Mr. Speaker, as your predecessor reiterated in his ruling on March 9, 2021, “it is a recognized principle that the House must be the first to learn the details of new legislative measures.” He went on to say, “...when the Chair is called on to determine whether there is a prima facie case of privilege, it must take into consideration the extent to which a member was hampered in performing their parliamentary functions and whether the alleged facts are an offence against the dignity of Parliament.” The Chair also indicated:

When it is determined that there is a prima facie case of privilege, the usual work of the House is immediately set aside in order to debate the question of privilege and decide on the response. Given the serious consequences for proceedings, it is not enough to say that the breach of privilege or contempt may have occurred, nor to cite precedence in the matter while implying that the government is presumably in the habit of acting in this way. The allegations must be clear and convincing for the Chair.

The government understands and respects the well-established practice that members have a right of first access to the legislation. It is clear that the government has been talking about and consulting widely on its plan to introduce online harms legislation for the past two years. As I have demonstrated, the public consultations have been wide-ranging and in-depth with documents and technical papers provided. All of this occurred prior to the bill's being placed on notice.

Some of the information provided by the member for Regina—Qu'Appelle is not even in the bill, most notably the reference to its being modelled on the European Union's Digital Services Act, which is simply false, as I have clearly demonstrated. The member also hangs his arguments on the usage of the vernacular “not authorized to speak publicly” in the media reports he cites. It is certainly not proof of a leak, especially when the government consulted widely and publicly released details on the content of the legislative proposal for years before any bill was actually placed on notice.

The development of the legislation has been characterized by open, public and wide-ranging consultations with specific proposals consulted on. This is how the Leader of the Opposition was able to proclaim, on February 21, before the bill was even placed on notice, that he and his party were vehemently opposed to the bill. He was able to make this statement because of the public consultation and the information that the government has shared about its plan over the last two years. I want to be clear that the government did not share the bill before it was introduced in the House, and the evidence demonstrates that there was no premature disclosure of the bill.

I would submit to the House that consulting Canadians this widely is a healthy way to produce legislation and that the evidence I have presented clearly demonstrates that there is no prima facie question of privilege. It is our view that this does not give grounds for the Chair to conclude that there was a breach of privilege of the House, nor to give the matter precedence over all other business of the House.

Premature Disclosure of Bill C-63 / Privilege / Government Orders

February 26th, 2024 / 5:15 p.m.



Conservative

Andrew Scheer Conservative Regina—Qu'Appelle, SK

Mr. Speaker, I am rising this afternoon on a question of privilege concerning the leak of key details of Bill C-63, the so-called online harms bill, which was tabled in the House earlier today.

While a lot will be said in the days, weeks and months ahead about the bill in the House, its parliamentary journey is not off to a good start. Yesterday afternoon, the CBC published on its website an article entitled “Ottawa to create regulator to hold online platforms accountable for harmful content: sources”. The article, written by Naama Weingarten and Travis Dhanraj, outlined several aspects of the bill with the information attributed to two sources “with knowledge of Monday's legislation”.

I will read brief excerpts of the CBC's report revealing details of the bill before it was tabled in Parliament.

“The Online Harms Act, expected to be introduced by the federal government on Monday, will include the creation of a new regulator that would hold online platforms accountable for harmful content they host, CBC News has confirmed.”

“The new regulatory body is expected to oversee a digital safety office with the mandate of reducing online harm and will be separate from the Canadian Radio-television and Telecommunications Commission (CRTC), sources say.”

“Sources say some components of the new bill will be modelled on the European Union's Digital Services Act. According to the European Commission, its act “regulates online intermediaries and platforms such as marketplaces, social networks, content-sharing platforms, app stores, and online travel and accommodation platforms.””

Then, today, CTV News published a second report entitled “Justice Minister to Introduce New Bill to Tackle Harmful Online Content”. In Rachel Aiello's article, she says, “According to a senior government source [Bill C-63] would be expected to put an emphasis on harms to youth including specific child protection obligations for social media and other online platforms, including enhanced preservation requirements. It targets seven types of online harms: hate speech, terrorist content, incitement to violence, the sharing of non-consensual intimate images, child exploitation, cyberbullying, and inciting self-harm, and includes measures to crack down on non-consensual artificial intelligence pornography, deepfakes and require takedown provisions for what's become known as 'revenge porn'. Further, while the sources suggested there will be no new powers for law enforcement, multiple reports have indicated the bill will propose creating a new digital safety ombudsperson to field Canadians' concerns about platform decisions around content moderation.”

As explained in footnote 125 on page 84 of the House of Commons Procedure and Practice, third edition, on March 19, 2001: “Speaker Milliken ruled that the provision of information concerning legislation to the media without any effective measures to secure the rights of the House constituted a prima facie case of contempt.”

The subsequent report of the Standing Committee on Procedure and House Affairs concluded: “This case should serve as a warning that our House will insist on the full recognition of its constitutional function and historic privileges across the full spectrum of government.”

Sadly, Mr. Speaker, the warning has had to be sounded multiple times since, with your predecessors finding similar prima facie contempts on October 15, 2001, April 19, 2016 and March 10, 2020, not to mention several other close-call rulings that fell short of the necessary threshold yet saw the Chair sound cautionary notes for future reference. A number of those close-call rulings occurred under the present government, which would often answer questions of privilege with claims that no one could be certain who had leaked the bill, or even when it had been leaked, citing advance policy consultations with stakeholders.

Mr. Speaker, your immediate predecessor explained, on March 10, 2020, on page 1,892 of the Debates, the balancing act that must be observed. He said:

The rule on the confidentiality of bills on notice exists to ensure that members, in their role as legislators, are the first to know their content when they are introduced. Although it is completely legitimate to carry out consultations when developing a bill or to announce one’s intention to introduce a bill by referring to its public title available on the Notice Paper and Order Paper, it is forbidden to reveal specific measures contained in a bill at the time it is put on notice.

In the present circumstances, no such defence about stakeholders talking about their consultations can be offered. The two sources the CBC relied upon for its reporting were, according to the CBC itself, granted anonymity “because they were not authorized to speak publicly on the matter before the bill is tabled in Parliament.”

As for the CTV report, its senior government source “was not authorized to speak publicly about details yet to be made public.”

When similar comments were made by the Canadian Press in its report on the leak of the former Bill C-7 respecting medical assistance in dying, Mr. Speaker, your immediate predecessor had this to say when finding a prima facie contempt in his March 10, 2020 ruling:

Everything indicates that the act was deliberate. It is difficult to posit a misunderstanding or ignorance of the rules in this case.

Just as in 2020, the leakers knew what they were doing. They knew it was wrong and they knew why it was wrong. The House must stand up for its rights, especially against a government that appears happy to trample over them in the pursuit of legislating the curtailing of Canadians' rights.

Mr. Speaker, if you agree with me that there is a prima facie contempt, I am prepared to move the appropriate motion.