An Act to enact the Online Harms Act, to amend the Criminal Code, the Canadian Human Rights Act and An Act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service and to make consequential and related amendments to other Acts

Sponsor

Arif Virani  Liberal

Status

Second reading (House), as of Sept. 23, 2024


Summary

This is from the published bill. The Library of Parliament has also written a full legislative summary of the bill.

Part 1 of this enactment enacts the Online Harms Act, whose purpose is to, among other things, promote the online safety of persons in Canada, reduce harms caused to persons in Canada as a result of harmful content online and ensure that the operators of social media services in respect of which that Act applies are transparent and accountable with respect to their duties under that Act.
That Act, among other things,
(a) establishes the Digital Safety Commission of Canada, whose mandate is to administer and enforce that Act, ensure that operators of social media services in respect of which that Act applies are transparent and accountable with respect to their duties under that Act and contribute to the development of standards with respect to online safety;
(b) creates the position of Digital Safety Ombudsperson of Canada, whose mandate is to provide support to users of social media services in respect of which that Act applies and advocate for the public interest in relation to online safety;
(c) establishes the Digital Safety Office of Canada, whose mandate is to support the Digital Safety Commission of Canada and the Digital Safety Ombudsperson of Canada in the fulfillment of their mandates;
(d) imposes on the operators of social media services in respect of which that Act applies
(i) a duty to act responsibly in respect of the services that they operate, including by implementing measures that are adequate to mitigate the risk that users will be exposed to harmful content on the services and submitting digital safety plans to the Digital Safety Commission of Canada,
(ii) a duty to protect children in respect of the services that they operate by integrating into the services design features that are provided for by regulations,
(iii) a duty to make content that sexually victimizes a child or revictimizes a survivor and intimate content communicated without consent inaccessible to persons in Canada in certain circumstances, and
(iv) a duty to keep all records that are necessary to determine whether they are complying with their duties under that Act;
(e) authorizes the Digital Safety Commission of Canada to accredit certain persons that conduct research or engage in education, advocacy or awareness activities that are related to that Act for the purposes of enabling those persons to have access to inventories of electronic data and to electronic data of the operators of social media services in respect of which that Act applies;
(f) provides that persons in Canada may make a complaint to the Digital Safety Commission of Canada that content on a social media service in respect of which that Act applies is content that sexually victimizes a child or revictimizes a survivor or intimate content communicated without consent and authorizes the Commission to make orders requiring the operators of those services to make that content inaccessible to persons in Canada;
(g) authorizes the Governor in Council to make regulations respecting the payment of charges by the operators of social media services in respect of which that Act applies, for the purpose of recovering costs incurred in relation to that Act.
Part 1 also makes consequential amendments to other Acts.
Part 2 amends the Criminal Code to, among other things,
(a) create a hate crime offence of committing an offence under that Act or any other Act of Parliament that is motivated by hatred based on certain factors;
(b) create a recognizance to keep the peace relating to hate propaganda and hate crime offences;
(c) define “hatred” for the purposes of the new offence and the hate propaganda offences; and
(d) increase the maximum sentences for the hate propaganda offences.
It also makes related amendments to other Acts.
Part 3 amends the Canadian Human Rights Act to provide that it is a discriminatory practice to communicate or cause to be communicated hate speech by means of the Internet or any other means of telecommunication in a context in which the hate speech is likely to foment detestation or vilification of an individual or group of individuals on the basis of a prohibited ground of discrimination. It authorizes the Canadian Human Rights Commission to deal with complaints alleging that discriminatory practice and authorizes the Canadian Human Rights Tribunal to inquire into such complaints and order remedies.
Part 4 amends An Act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service to, among other things,
(a) clarify the types of Internet services covered by that Act;
(b) simplify the mandatory notification process set out in section 3 by providing that all notifications be sent to a law enforcement body designated in the regulations;
(c) require that transmission data be provided with the mandatory notice in cases where the content is manifestly child pornography;
(d) extend the period of preservation of data related to an offence;
(e) extend the limitation period for the prosecution of an offence under that Act; and
(f) add certain regulation-making powers.
Part 5 contains a coordinating amendment.


December 11th, 2024 / 5:15 p.m.



Director of research and analytics, Canadian Centre for Child Protection

Jacques Marcoux

Well, fundamentally that's what Bill C-63, in principle, aims to do: It's to establish regulations on the system itself and to impose duties of care onto those companies. Something like age verification, potentially, which has been supported by some parties, would allow websites and platforms to provide age-appropriate experiences to kids. That would be one example of something that could be done.

Bill C-63 is an example of the government trying to establish a systems approach. It's an approach similar to what's already happening in the U.K., in Australia and in the EU.

December 11th, 2024 / 5:10 p.m.



Senator, Alberta, Non-affiliated

Kristopher David Wells

I do, absolutely, and in particular the hate crimes provisions that are in Bill C-63. I understand that the minister is considering splitting that bill. I really believe that those hate crimes provisions, which all law enforcement widely supports, need to be in the bill and will help combat hate in this country by having stronger legislation in the Criminal Code.

Hedy Fry Liberal Vancouver Centre, BC

Thank you very much, Chair, and welcome to the chair, Mr. Champoux.

I want to thank the witnesses for coming and I would like to thank them for taking the time to discuss the broad framework of freedom of expression, which is more than freedom of speech.

My last colleague asked a question about Bill C-63, and I want to go back to that question in a way that says I know the Conservatives do not approve of Bill C-63. They call it a “$200-million censorship bureaucracy”, but the bottom line is that the Criminal Code changes are not enough to stop this kind of online harm. We know, in fact, that taking down the harmful content, which can stay online for years afterwards, is something the Conservatives also oppose.

Can you elaborate on why it's necessary to do more than the Criminal Code and why it is necessary to remove offensive content online, as Bill C-63 proposes to do?

Jamil Jivani Conservative Durham, ON

Mr. Marcoux, we're low on time, and I do hope you'll get to continue your thoughts in future questions.

What I would just leave you with, though, is I still think that the current Liberal government and the supporters of Bill C-63 have yet to make a convincing argument to the majority of the public that a $200-million bureaucracy is the appropriate response to your concerns. I think that's a challenge we put forward to them, and they regularly continue to fail to meet it.

Thank you.

Jamil Jivani Conservative Durham, ON

Certainly I think a lot of the issues you raised, sir, are concerns shared by many of us.

One of the points of debate between Bill C-63 and Bill C-412 is whether the existing laws and frameworks in our country can be updated and strengthened to respond to your concerns. This is a primary objective of Bill C-412, compared to Bill C-63, which is focused largely on building a $200-million bureaucracy and asking the Canadian public to trust that bureaucracy to accomplish the objectives that I believe you are sincerely interested in.

I'm wondering if you could comment on whether you believe Bill C-412 is an adequate response to many of your concerns, and, if not, why you would prefer Bill C-63's highly bureaucratic, longer-term response to issues that people are looking for urgent action on.

Industry and Technology
Committees of the House
Routine Proceedings

December 10th, 2024 / 1:45 p.m.



Winnipeg North Manitoba

Liberal

Kevin Lamoureux, Parliamentary Secretary to the Leader of the Government in the House of Commons

Mr. Speaker, the member raises a very important issue about the Internet, and threats on the Internet, in a number of ways. He spent a great deal of his time focused on Bill C-27, and understandably so since that is what the motion is about. The government has taken a very holistic approach in dealing with all aspects of the Internet in the form of legislation and regulations.

Quite often in legislation, we see a framework that is absolutely essential to support healthy and strong regulations that, ultimately, protect the interests of Canadians. It has been somewhat frustrating, as the member was frustrated when talking about what is taking place in committees; on the floor of the House of Commons, it has also been frustrating. The member referred to Bill C-27 being held up in committee, but he tried to put the blame on the government.

One of the biggest differences between the government today and the government while Stephen Harper was prime minister is that we are very open to ideas, constructive criticism, and looking at ways we can improve legislation. That means we have been open to amendments and changes. There have been a number of recommendations, but there was also an extensive filibuster on Bill C-27. It was not just government members but opposition members, much like we see filibusters taking place now on other aspects of the safety of Canadians.

For seven or eight weeks now, there has been a Conservative filibuster on the floor of the House of Commons, and there are other pieces of legislation dealing with the Internet that the Conservatives continue to filibuster. I am referring to Bill C-63, which deals with things such as intimate images being spread on the Internet without consent and child exploitation. We are talking about serious issues facing Canadians, including Bill C-63, that we cannot even get to committee because the Conservative Party has made the decision to filibuster on the floor of the House of Commons.

When the member opposite talks about Bill C-27, I can assure the member that the government is very keen on the legislation. We do not see how Canadians would benefit by splitting the legislation because both aspects are really important to Canadians. We should look at where it can be improved and we are open to that. We have clearly demonstrated that, but we need a higher sense of co-operation, whether dealing with Bill C-63 in the chamber or Bill C-27 at committee. Bill C-26 deals with cybersecurity. As I said, the government is very aware of what is happening on the Internet and our responsibility as legislators to advance legislation that helps establish a framework that will protect the interests of Canadians.

Earlier, I referred to a trip I took to the Philippines in the last five days. One of the companies we visited was a Canadian company, Open Text, that employs 1,500-plus people. We sat in a room that had this huge monitor of the world, and Open Text talked about how threats to infrastructure and to individuals occur every second. We are talking about numbers in the trillions when it comes to computer threats occurring on a monthly basis. Open Text can tell where they are coming from and where they are going. It was a very interesting presentation.

No government has invested more in issues around AI than this government has, recognizing the potential good but also the extreme harm out there. We can think about different types of data banks. There are government data banks, such as Canada Revenue at the national level and health care records at the provincial level. There are the Tim Hortons, the private companies, and the data they acquire in their applications. The amount of information about Canadian individuals on the Internet is incredible. Technology has changed the lives of each and every one of us, whether we know it or not.

We can take a look at the number of cameras on our public streets, in malls and so on. We can think of the number of interactions we have on a daily or weekly basis, whether that is banking, which contains very sensitive information, or medical reports—

Matthew Hatfield

For the international examples, I have to mirror what Ms. Laidlaw said. I think it is the DSA first, and then looking at some of what our Commonwealth peers have done in Australia and the U.K.

From the perspective of partisanship, in the U.K., it was a Conservative government that moved through a bill that had some of the same parameters as Canada's bill. I would encourage everyone to remember that.

If we were simply stacking Bill C-412 with no changes onto Bill C-63 with no changes—every part, with parts 2 and 3 included against each other—in that contest, OpenMedia would prefer Bill C-412. The exciting opportunity you have here, given that we're looking at just parts 1 and 4 potentially, is to strengthen and pass a version of Bill C-63, which, I think, of the Canadian examples, provides the best overall protection.

December 9th, 2024 / 5:50 p.m.



Associate Professor and Canada Research Chair in Cybersecurity Law, University of Calgary, As an Individual

Dr. Emily Laidlaw

I recommend that the committee first study the Digital Services Act. The one thing to keep in mind is that these types of European legislation tend to be a lot shorter and leave a lot until later. I think that Bill C-63 is a little more fulsome, but especially for the algorithmic accountability, with the way it's been addressed there, it could be helpful here.

The other thing to consider for some aspects of it would be the U.K.'s Online Safety Act. We have also drawn certain aspects from Australia's eSafety Commissioner structure. I can't remember the name of the legislation at the moment.

Those are the three that I would recommend that you look at.

Matthew Hatfield

Yes, it would be healthy for Bill C-63 to go a bit further into looking into what algorithms are doing, from the framework of providing more transparency, giving researchers good access to study algorithms and determining how they're impacting the public. If MPs could agree on some language there, could get it done and could move on to the rest of the bill, then that would be healthy. I wouldn't hinge the bill's future on it, but I think that would be the appropriate approach.

Peter Julian NDP New Westminster—Burnaby, BC

Thank you very much for that.

I want to ask the same question of Mr. Hatfield.

Mr. Hatfield, you mentioned that Bill C-63, part 1, accomplishes more than a bill that has been raised around this table—Bill C-412—so you've resolved that for us. It's very clear that we should be putting the focus on Bill C-63, part 1.

To what extent do you believe that algorithm transparency is also important to achieve, and to what extent would you like to see some of the provisions of Bill C-292 incorporated into Bill C-63, part 1?

December 9th, 2024 / 5:50 p.m.



Associate Professor and Canada Research Chair in Cybersecurity Law, University of Calgary, As an Individual

Dr. Emily Laidlaw

Thank you for the question.

I agree. Algorithmic accountability should be added to Bill C-63. Earlier when I spoke, I said that it's loosely covered, but it requires a leap of faith. We need more in the legislation because the duty to act responsibly would leave it to the digital safety commission to develop codes of practice and regulations. There is scope there for algorithmic transparency and algorithmic amplification to be covered, but that kind of digital safety by design and the algorithmic accountability need to be embedded in the legislation itself.

The same goes for the children's provisions. It does cover algorithms when it comes to safety by design, but it's one very short provision. If we take your bill and some of the provisions in that, and if that becomes a blueprint to flesh out those parts of Bill C-63 in part 1, then I think that we would be in a good position.

The Vice-Chair (Mr. Rhéal Éloi Fortin), Bloc

Thank you.

Mr. Boucher, are you able to tell us about the definition of the word “hate” as proposed in Bill C‑63?

You may have heard the comments made by the witnesses who appeared in the first part of the meeting. The Barreau du Québec has expressed its opinion on this definition. We were told about a Supreme Court decision, the name of which escapes me. In that case, a judge looked at that definition.

I'd like to hear your thoughts on that.

How should that word be defined and what are the parameters that would make it possible to frame this concept?

The Vice-Chair (Mr. Rhéal Éloi Fortin), Bloc

I'm referring to part 2 of the bill, which deals with hate. The minister announced that he was going to split the bill in two. I obviously agree with that, since it was the Bloc Québécois that made the request in the first place. We agree that the bill needs to be split in two. However, until this actually comes to pass, we are conducting a prestudy of Bill C‑63 in its entirety.

I'm taking the liberty of asking you this question, even though I, too, think that my question should be asked as part of another study.

The bill reads as follows: “A person may, with the Attorney General's consent, lay an information before a provincial court judge if the person fears on reasonable grounds that another person will commit...”.

Does that sound reasonable? Are you not concerned that this could open the door to abuse in terms of whistleblowing?

The Vice-Chair (Mr. Rhéal Éloi Fortin), Bloc

Thank you.

I will now take the floor for six minutes.

I'd like to thank all the witnesses for being with us today.

Mr. Boucher, Mr. Côté, Mr. Hatfield and Ms. Laidlaw, your participation is invaluable.

Ms. Laidlaw, on the subject of hate, Bill C‑63 provides that “A person may, with the Attorney General's consent, lay an information before a provincial court judge if the person fears on reasonable grounds that another person will commit...”.

Are you not concerned that this wording is a little too vague and that it could lead to abuse?

Élisabeth Brière Liberal Sherbrooke, QC

Okay.

Ms. Laidlaw, the platforms already have their own rules, and we know that they sometimes don't follow them.

Therefore, do you believe that Bill C‑63 will be able to hold them in check?