An Act to enact the Online Harms Act, to amend the Criminal Code, the Canadian Human Rights Act and An Act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service and to make consequential and related amendments to other Acts

Sponsor

Arif Virani  Liberal

Status

Second reading (House), as of Sept. 23, 2024

Summary

This is from the published bill. The Library of Parliament has also written a full legislative summary of the bill.

Part 1 of this enactment enacts the Online Harms Act, whose purpose is to, among other things, promote the online safety of persons in Canada, reduce harms caused to persons in Canada as a result of harmful content online and ensure that the operators of social media services in respect of which that Act applies are transparent and accountable with respect to their duties under that Act.
That Act, among other things,
(a) establishes the Digital Safety Commission of Canada, whose mandate is to administer and enforce that Act, ensure that operators of social media services in respect of which that Act applies are transparent and accountable with respect to their duties under that Act and contribute to the development of standards with respect to online safety;
(b) creates the position of Digital Safety Ombudsperson of Canada, whose mandate is to provide support to users of social media services in respect of which that Act applies and advocate for the public interest in relation to online safety;
(c) establishes the Digital Safety Office of Canada, whose mandate is to support the Digital Safety Commission of Canada and the Digital Safety Ombudsperson of Canada in the fulfillment of their mandates;
(d) imposes on the operators of social media services in respect of which that Act applies
(i) a duty to act responsibly in respect of the services that they operate, including by implementing measures that are adequate to mitigate the risk that users will be exposed to harmful content on the services and submitting digital safety plans to the Digital Safety Commission of Canada,
(ii) a duty to protect children in respect of the services that they operate by integrating into the services design features that are provided for by regulations,
(iii) a duty to make content that sexually victimizes a child or revictimizes a survivor and intimate content communicated without consent inaccessible to persons in Canada in certain circumstances, and
(iv) a duty to keep all records that are necessary to determine whether they are complying with their duties under that Act;
(e) authorizes the Digital Safety Commission of Canada to accredit certain persons that conduct research or engage in education, advocacy or awareness activities that are related to that Act for the purposes of enabling those persons to have access to inventories of electronic data and to electronic data of the operators of social media services in respect of which that Act applies;
(f) provides that persons in Canada may make a complaint to the Digital Safety Commission of Canada that content on a social media service in respect of which that Act applies is content that sexually victimizes a child or revictimizes a survivor or intimate content communicated without consent and authorizes the Commission to make orders requiring the operators of those services to make that content inaccessible to persons in Canada;
(g) authorizes the Governor in Council to make regulations respecting the payment of charges by the operators of social media services in respect of which that Act applies, for the purpose of recovering costs incurred in relation to that Act.
Part 1 also makes consequential amendments to other Acts.
Part 2 amends the Criminal Code to, among other things,
(a) create a hate crime offence of committing an offence under that Act or any other Act of Parliament that is motivated by hatred based on certain factors;
(b) create a recognizance to keep the peace relating to hate propaganda and hate crime offences;
(c) define “hatred” for the purposes of the new offence and the hate propaganda offences; and
(d) increase the maximum sentences for the hate propaganda offences.
It also makes related amendments to other Acts.
Part 3 amends the Canadian Human Rights Act to provide that it is a discriminatory practice to communicate or cause to be communicated hate speech by means of the Internet or any other means of telecommunication in a context in which the hate speech is likely to foment detestation or vilification of an individual or group of individuals on the basis of a prohibited ground of discrimination. It authorizes the Canadian Human Rights Commission to deal with complaints alleging that discriminatory practice and authorizes the Canadian Human Rights Tribunal to inquire into such complaints and order remedies.
Part 4 amends An Act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service to, among other things,
(a) clarify the types of Internet services covered by that Act;
(b) simplify the mandatory notification process set out in section 3 by providing that all notifications be sent to a law enforcement body designated in the regulations;
(c) require that transmission data be provided with the mandatory notice in cases where the content is manifestly child pornography;
(d) extend the period of preservation of data related to an offence;
(e) extend the limitation period for the prosecution of an offence under that Act; and
(f) add certain regulation-making powers.
Part 5 contains a coordinating amendment.

Elsewhere

All sorts of information on this bill is available at LEGISinfo, an excellent resource from the Library of Parliament. You can also read the full text of the bill.

June 11th, 2024 / 5:05 p.m.



Associate Professor of Law, University of Colorado Law School, As an Individual

Vivek Krishnamurthy

Very well.

The only thing I will say to conclude is that Bill C-63 does not deal with messaging software, with things like WhatsApp, which are a primary vector by which this kind of content moves. I think that is a good call, because of the difficulty in doing so. It's something that requires further study, a lot of work and a lot of thought on dealing with that particular piece of the distribution problem.

Thank you, Madam Chair.

Vivek Krishnamurthy Associate Professor of Law, University of Colorado Law School, As an Individual

Thank you, Madam Chair.

I'm very honoured to be here. I apologize in advance that I also have a hard deadline, due to child care obligations, so let me get right to it.

I'm not an expert on the harms caused by what the committee is studying, that is, exposure to illegal explicit sexual content. The focus of my remarks today will be on the technological means by which this kind of content is distributed and what can be done about it in compliance with the charter.

Just to frame my remarks, I think we can distinguish between two kinds of material. There's certain material that's per se illegal. Child sexual exploitation material is always illegal, but we face a challenge with material that's what I would call “conditionally illegal”. I think non-consensual distribution of intimate imagery falls into this category, because the illegality depends on whether the distribution is consensual or not—or the creation, for that matter.

The challenge we face is in regulating the distribution of this content through means of distribution that are general purpose. Take a social media platform, whichever one you want—Instagram, TikTok—or take a messaging platform such as WhatsApp. The problem with regulating the distribution of this content on those platforms is, of course, that we use them for many positive purposes, but they can be used for ill as well.

I'd like to pivot briefly to discuss the online harms act, which is, of course, before Parliament right now and which I think offers a good approach to dealing with one part of the distribution challenge with regard to social media platforms. These are platforms that take content generated by individuals and make it available to a large number of people. I think the framework of this law is quite sensible in that it creates “a duty to act responsibly”, which gets to the systemic problem of how platforms curate and moderate content. The idea here is to reduce the risk that this kind of content does get distributed on these platforms.

The bill is, in my view, well designed, in that there's also a duty to remove content, especially child sexual exploitation material and non-consensual distribution of intimate imagery, to the extent that platforms' own moderation efforts or user reports flag that content as being unlawful. This is a very sensible approach that I think is very compliant with the charter in its broad strokes.

The challenge, however, is with the effectiveness of these laws. It's very hard to determine before the fact how effective these are, because of issues with determining both the numerator and the denominator. I don't want to take us too much into mathematical territory, but it's very hard for us to measure the prevalence of this content online or on any given platform. It's just hard to identify, in part because the legality—or not—of the content depends on the conditions in which it's distributed. Then, on the numerator, which is how well the platforms are doing the job of getting it off, again, we have issues with identifying what's in and what's out. This is a step forward, but the bill has limitations.

One way of understanding the limitations is with an analogy that a friend of mine, Peter Swire, who teaches at Georgia Tech, calls the problem of “elephants and mice”. There are some elephants in the room, which are large, powerful and visible actors. These are your Metas and your TikToks, or even a company like Pornhub, which has a very large and significant presence. These are players that can't hide from the law, but what is difficult in this space is that there are many mice. Mice are small, they're furtive and they reproduce very quickly. They move around in darkness. This law is going to be very difficult to implement with regard to those kinds of actors, the ones that we find on the darker corners of the Internet.

Again, I think Bill C-63 is a very—

June 10th, 2024 / 1:25 p.m.



Chair, Canadian Muslim Lawyers Association

Husein Panju

We're familiar with Bill C-63, which is currently before the House. It's a complex issue. I think there needs to be some more dialogue with our groups on a more directed basis. You're right: Equity-seeking groups like ours are often the victims and the targets of hate speech, but there also needs to be some more consultation to ensure that any such measures do not overly censor legitimate, non-hateful speech from equity-seeking groups as well.

Online Harms ActGovernment Orders

June 7th, 2024 / 1:10 p.m.



Winnipeg North Manitoba

Liberal

Kevin Lamoureux LiberalParliamentary Secretary to the Leader of the Government in the House of Commons

Mr. Speaker, it is a pleasure to be able to rise and speak to Bill C-63.

We often talk about the communities and neighbourhoods in which we live. We do this not only as parliamentarians but also as politicians in general, whether at the municipal, provincial, or federal level. We talk about how we want people to feel safe. People need to feel safe in their homes, in their communities and in the places where they live. That has always been a priority for the current government and, I would like to think, for all parliamentarians of all political stripes. However, sometimes we need to look at finding a better definition of what we mean when we talk about keeping people safe in our communities.

The Internet is a wonderful thing, and it plays a critical and important role in society today. In fact, I would argue that, nowadays, it is an essential service that is virtually required in all communities. We see provincial and national governments investing greatly to ensure that there is more access to the Internet. We have become more and more dependent on it in so many different ways. It is, for all intents and purposes, a part of the community.

I could go back to the days when I was a child, and my parents would tell me to go outside and play. My children, too, were encouraged to go outside and play. Then things such as Nintendo came out, and people started gravitating toward the TV and playing computer games. I have grandchildren now, and I get the opportunity to see my two grandsons quite a bit. I can tell members that, when I do, I am totally amazed at what they are participating in on the Internet and with respect to technology. There are incredible programs associated with it, from gaming to YouTube, that I would suggest are a part of the community. Therefore, when we say that we want to protect our children in our communities when they are outside, we also need to protect them when they are inside.

It is easy for mega platforms to say it is not their responsibility but that of the parent or guardian. From my perspective, that is a cop-out. We have a responsibility here, and we need to recognize that responsibility. That is what Bill C-63 is all about.

Some people will talk about freedom of speech and so forth. I am all for freedom of speech. In fact, I just got an email from a constituent who is quite upset about how the profanity and flags displayed by a particular vehicle driving around are promoting all sorts of nastiness in the community. I indicated to them that freedom of speech entitles that individual to do that.

I care deeply about the fact that we, as a political party, brought in the Charter of Rights and Freedoms, which guarantees freedom of speech and expression. At the end of the day, I will always advocate for freedom of speech, but there are limitations. I believe that, if we look at Bill C-63, we can get a better sense of the types of limitations the government is talking about. Not only that, but I believe they are a reflection of a lot of the work that has been put together in order to bring the legislation before us today.

I understand some of the comments that have been brought forward, depending on which political parties addressed the bill so far. However, the minister himself has reinforced that this is not something that was done on a napkin; it is something that has taken a great deal of time, effort and resources to make sure that we got it right. The minister was very clear about the consultations that were done, the research that took a look at what has been done in other countries, and what is being said here in our communities. There are a great number of people who have been engaged in the legislation. I suspect that once it gets to committee we will continue to hear a wide spectrum of opinions and thoughts on it.

I do not believe that as legislators we should be put off to such a degree that we do not take action. I am inclined to agree with the minister in saying that this is a holistic approach to dealing with an important issue. We should not be looking at ways to divide the legislation. Rather, we should be looking at ways it can be improved. The minister himself, earlier today, said that if members have ideas or amendments they believe will give more strength to the legislation, then let us hear them. Bring them forward.

Often there is a great deal of debate on something at second reading and not as much at third reading. I suggest that the legislation before us might be the type of legislation that would benefit from passing relatively quickly out of second reading, after some members have had the opportunity to provide their thoughts, in favour of more debate time at third reading and, more specifically, more time at the committee stage. That would allow members the opportunity, for example, to have discussions with constituents over the summer, knowing full well that the bill is at committee. I think there is a great deal of merit to that.

There was something that spoke volumes, in terms of keeping the community safe, and the impact today that the Internet has on our children in particular. Platforms have a responsibility, and we have to ensure that they are living up to that responsibility.

I want to speak about Carol Todd, the mother of Amanda Todd, to whom reference has been made already. Ultimately, I believe, she is one of the primary reasons why the legislation is so critically important. Amanda Michelle Todd was born November 27, 1996, and passed away October 10, 2012. Colleagues can do the math. She was a 15-year-old Canadian student and a victim of cyber-bullying who hanged herself at her home in Port Coquitlam, British Columbia. There is a great deal of information on the Internet about Amanda. I thank her mother, Carol, for having the courage to share the story of her daughter, because it is quite tragic.

I think there is a lot of blame that can be passed around, whether it is to the government, the private sector or society, including individuals. Carol Todd made reference to the thought that her daughter Amanda might still actually be alive if, in fact, Bill C-63 had been law at the time. She said, “As a mom, and having gone through the story that I've gone through with Amanda, this needs to be bipartisan. All parties in the House of Commons need to look in their hearts and look at young Canadians. Our job is to protect them. And parents, we can't do it alone. The government has to step in and that's what we are calling for.”

That is a personal appeal, and it is not that often I will bring up a personal appeal of this nature. I thought it was warranted because I believe it really amplifies and humanizes why this legislation is so important. Some members, as we have seen in the debate already, have indicated that they disagree with certain aspects of the legislation, and that is fine. I can appreciate that there will be diverse opinions on this legislation. However, let us not use that as a way to ultimately prevent the legislation from moving forward.

Years of consultation and work have been put into the legislation to get it to where it is today. I would suggest, given we all have had discussions related to these types of issues, during private members' bills or with constituents, we understand the importance of freedom of speech. We know why we have the Charter of Rights. We understand the basics of hate crime and we all, I believe, acknowledge that freedom of speech does have some limitations to it.

I would like to talk about some of the things we should think about, in terms of responsibilities, when we think about platforms. I want to focus on platforms in my last three minutes. Platforms have a responsibility to be responsible. It is not all about profit. There is a societal responsibility that platforms have, and if they are not prepared to take it upon themselves to be responsible, then the government does need to take more actions.

Platforms need to understand and appreciate that there are certain aspects of society, and here we are talking about children, that need to be protected. Platforms cannot pass the buck on to parents and guardians. Yes, parents and guardians have the primary responsibility, but the Internet never shuts down. Even parents and guardians have limitations. Platforms need to recognize that they also have a responsibility to protect children.

Content that sexually victimizes children, and intimate content that is shared without consent, are the types of things platforms have to do due diligence on. When such an issue is raised with platforms, there is a moral and, with the passage of this legislation, a legal obligation for them to take action. I am surprised it has taken this type of legislation to hit that point home. At the end of the day, whether a life is lost, people are bullied, or depression and mental health issues are caused because of things of that nature, platforms have to take responsibility.

There are other aspects that we need to be very much aware of. Inciting violent extremism or terrorism needs to be flagged. Content that induces a child to harm themselves also needs to be flagged. As it has been pointed out, this legislation would have a real, positive, profound impact, and it would not have to take away one's freedom of speech. It does not apply to private conversations or communications.

I will leave it at that and continue at a later date.

Online Harms ActGovernment Orders

June 7th, 2024 / 1:05 p.m.



Bloc

Martin Champoux Bloc Drummond, QC

Mr. Speaker, I know that my colleague from New Westminster—Burnaby also cares about regulating what happens on the web. We had the opportunity to work together at the Standing Committee on Canadian Heritage on various topics that have to do with this issue.

We have been waiting for Bill C‑63 for a long time. I think that there is consensus on part 1. As the Bloc Québécois has been saying all day, it is proposing that we split the bill in order to quickly pass part 1, which is one part we all agree on.

The trouble is with part 2 and the subsequent parts. There are a lot of things that deserve to be discussed. There is one in particular that raises a major red flag, as far as I am concerned. It is the idea that a person could file a complaint because they fear that at some point, someone might utter hate speech or commit a crime as described in the clauses of the bill. A complaint could be filed simply on the presumption that a person might commit this type of crime.

To me, that seems to promote a sort of climate of accusation that could lead to paranoia. It makes me think of the movie Minority Report. I am sure my colleague has heard of it. I would like his impressions of this type of thing that we find in Bill C‑63.

Online Harms ActGovernment Orders

June 7th, 2024 / 12:45 p.m.



NDP

Peter Julian NDP New Westminster—Burnaby, BC

Mr. Speaker, first of all, as we mentioned earlier, the NDP believes that certain aspects of Bill C‑63 are important and will help address a situation that calls for measures to counter online harm. However, other elements of this bill are not as clear and raise important questions.

We feel it is really necessary to pass the bill, send it to committee and give that committee the opportunity to do a thorough review. Parts of this bill are well done, but other parts need clarification and still others raise concerns. We therefore have some reservations.

This bill has been needed for years. The Liberal government promised it within 100 days of the last election, but it took almost three years, as members know. Finally, it has been introduced and is being examined. As parliamentarians, we need to do the work necessary to get answers to the questions people are asking, improve the parts of the bill that need improving and pass those parts that are sorely needed.

If parts of the bill cannot be passed or seem not to be in the public interest after a thorough examination in committee, it is our responsibility to withdraw them. However, there is no question that we need this legislation.

The harm being done to children is definitely rising. The idea that people can approach children, without restriction, to encourage them to self-harm or commit suicide should be something that our society will not tolerate. The fact that we have these web giants or platforms that promote child pornography is unacceptable. It should not be happening in our society. We have to acknowledge the importance of implementing laws to prevent this from happening. Hate speech is another issue. We are seeing a disturbing rise in violence in society, which is often fomented online.

For all of these reasons, we are going to pass this bill at second reading. We are going to send it to committee. This part of the process is very important to us. All answers must be obtained and all necessary improvements to the bill must be made in committee.

I do not think that anyone in the Parliament of Canada would like to vote against the principle of having such legislation in place. In practice, the important role of parliamentarians is to do everything in their power to produce a bill that achieves consensus, with questions answered and the necessary improvements put in place.

There is no doubt about the need for the bill. The NDP has been calling for the bill for years. The government promised it after 100 days. Canadians had to wait over 800 days before we saw the bill actually being presented.

In the meantime, the reality is that we have seen more and more cases of children being induced to harm themselves. This is profoundly disturbing to us, as parents, parliamentarians and Canadians, to see how predators have been going after children in our society. When we are talking about child pornography or inducing children to harm themselves, it is something that should be a profound concern to all of us.

Issues around the sharing of intimate content online without permission, in a way that attacks victims, are also something that we have been calling for action on. It is important for parliamentarians to take action.

We have seen a steady and disturbing rise in hate crimes. We have seen it in all aspects of racism and misogyny, homophobia and transphobia, anti-Semitism and Islamophobia. All of these toxic sources of hate are rising.

I would note two things. First, the rise in anti-Semitism is mirrored by the rise in Islamophobia. Something we have seen from the far right is that they are attacking all groups.

Second, as the ADL has pointed out, in 2022 and 2023, all the violent acts of mass murder that were ideologically motivated came from the far right in North America. These are profoundly disturbing acts. We have a responsibility to take action.

The fact that the government has delayed the bill for so long is something we are very critical of. The fact that it is before us now means that, as parliamentarians, we have the responsibility to take both the sections of the bill where there is consensus and parts of the bill where there are questions and concerns being raised that are legitimate, and we must ensure that the committee has all the resources necessary, once it is referred to the committee in principle.

That second reading vote is a vote in principle, supporting the idea of legislation in this area. However, it is at the committee stage that we will see all the witnesses who need to come forward to dissect the bill and make sure that it is the best possible legislation. From there, we determine which parts of the bill can be improved, which parts are adequate and which parts, if they raise legitimate concerns and simply do not do the job, need to be taken out.

Over the course of the next few minutes, let us go through where there is consensus and where there are legitimate questions being raised. I want to flag that the issue of resources, which has been raised by every speaker so far today, is something that the NDP takes very seriously as well.

In the Conservative government that preceded the current Liberal government, we saw the slashing of crime prevention funding. This basically meant the elimination of resources that play a valuable role in preventing crimes. In the current Liberal government, we have not seen the resources that need to go into countering online harms.

There are legitimate questions being raised about whether resources are going to be adequate for the bill to do the job that it needs to do. Those questions absolutely need to be answered in committee. If the resources are not adequate, the best bill in the world is not going to do the job to stop online harms. Therefore, the issue of resources is key for the NDP as we move forward.

With previous pieces of legislation, we have seen that the intent was good but that the resources were inadequate. The NDP, as the adults in the House, the worker bees of Parliament, as many people have attested, would then push the Liberal government hard to actually ensure adequate resources to meet the needs of the legislation.

Legislation should never be symbolic. It should accomplish a goal. If we are concerned about online harms, and so many Canadians are, then we need to ensure that the resources are adequate to do the job.

Part 1 of the bill responds to the long-delayed need to combat online harms, and a number of speakers have indicated a consensus on this approach. It is important to note the definitions, which we certainly support, in the intent of part 1 of the bill, which is also integrated into other parts of the bill. The definitions include raising concerns about “content that foments hatred”, “content that incites violence”, “content that incites violent extremism or terrorism”, “content that induces a child to harm themselves”, “content that sexually victimizes a child or revictimizes a survivor”, “content used to bully a child” and “intimate content communicated without consent”.

All of these are, I think it is fair to say, definitions that are detailed in how they address each of those categories. This is, I think, a goal all parliamentarians would share. No one wants to see the continued increase in sexual victimization of children and content that induces a child to harm themselves.

I have raised before in the House the sad and tragic story of Molly Russell. I met with her father and have spoken with the family. The tragic result of her having content forced upon her that led to her ending her own life is a tragedy that we have seen repeated many times, where the wild west of online platforms is promoting, often through secret algorithms, material that is profoundly damaging to children. This is something that is simply unacceptable in any society, yet that content proliferates online. It is often reinforced by secret algorithms.

I would suggest that, while the definitions in the bill are strong concerning the content we do not want to see, whether it is violent extremism or the victimization of children, the reality is that it is not tackling a key element of why this harmful online content expands so rapidly, and with such disturbing strength, and that is the secretive algorithms online platforms use. There is no obligation for these companies to come clean about their algorithms, yet these algorithms inflict profound damage on Canadians, victimize children and, often, encourage violence.

One of the pieces I believe needs to be addressed through the committee process of the bill is why these online platforms have no obligation at all to reveal the algorithms that produce, in such disturbing strength, this profoundly toxic content. The fact is that a child, Molly Russell, was, through the algorithms, constantly fed material that encouraged her to ultimately end her own life, and these companies, these massive corporations, are often making unbelievable profits.

I will flag one more time that Canada continues to subsidize both Meta and Google indirectly, to the tune of a billion dollars a year, while these online platforms bear no responsibility at all, which is something I find extremely disturbing. These are massive amounts of money, and they sit alongside massive profits. These significant subsidies are something we absolutely need to get a handle on. The fact that these algorithms are present, and are not being dealt with in the legislation, is a major problem.

Second, when we look at other aspects of the bill, the level of detail I have just run through in the definitions in part 1 is not mirrored in part 2 of the bill, which contains the Criminal Code amendments. The Criminal Code provisions have raised concerns because of their lack of definition. The concerns around part 2 are something that firmly needs to be dealt with at the committee stage. Answers need to be obtained, and amendments need to be brought to that section. I understand that as part of the committee process there will be rigorous questions asked on part 2. It is a concern that a number of people and organizations have raised. The committee stage is going to be crucial to improving and potentially deleting parts of the bill, subject to the rigorous questioning that would occur there.

The third part of the bill addresses the Canadian Human Rights Commission. We were opposed to the former Harper government's gutting of the commission's ability to uphold the Charter of Rights and Freedoms. Under the charter, the Constitution that governs our country, Canadians have a right to be free from discrimination, and the Harper government's cuts to that portion of the Canadian Human Rights Commission were something we found disturbing at the time. As for part 3, the question of resources, and whether the commission has the ability to actually carry out the responsibilities that would come from that part of the bill, is something we want to rigorously question witnesses on. Whether we are talking about government witnesses or the Canadian Human Rights Commission itself, it is absolutely important that we get those answers before we think about next steps for part 3.

Finally, there is part 4, an act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service. That section of the bill as well is something that, I think it is fair to say, should receive some level of consensus from parliamentarians.

In short, at second reading, as members well know, the intent of the debate and discussion is to decide whether we agree with the principle of the bill. New Democrats are in agreement with the principle of the bill, though we have broad concerns about certain parts of it. We support in principle the intent of part 1: tackling, and forcing a greater level of responsibility on, the web giants that have profited for so long with such a degree of irresponsibility, in respect of content that incites violence or violent extremism, content that induces a child to harm themselves or that sexually victimizes a child, content used to bully a child, and intimate content communicated without consent.

We look forward to a very rigorous examination at committee with the witnesses we need to bring forward. There is no doubt that there is a need for this bill and that we need to proceed as quickly as possible, but only after hearing from the appropriate witnesses and making sure we have gotten all the answers and made all the necessary improvements to this bill.

Online Harms Act (Government Orders)

June 7th, 2024 / 12:25 p.m.

Andréanne Larouche (Bloc, Shefford, QC)

Mr. Speaker, it is not easy to speak in front of the member for Salaberry—Suroît, who does outstanding work and who just gave a wonderful speech. I will see what I can add to it. I may get a little more technical than she did. She spoke from the heart, as usual, and I commend her for that. I also want to thank her for her shout-out to Bill C-319. People are still talking to me about Bill C‑319, because seniors between the ages of 65 and 74 feel forgotten. We will continue this debate over the summer. In anticipation of this bill's eventual return before the House, we will continue to try to raise public awareness of the important issue of increasing old age security by 10% for all seniors.

I have gotten a bit off today's topic. I am the critic for seniors, but I am also the critic for status of women, and it is more in that capacity that I am rising today to speak to Bill C-63. This is an issue that I hear a lot about. Many groups reach out to me about hate speech. They are saying that women are disproportionately affected. That was the theme that my colleague from Drummond and I chose on March 8 of last year. We are calling for better control over hate speech out of respect for women who are the victims of serious violence online. It is important that we have a bill on this subject. It took a while, but I will come back to that.

Today we are discussing the famous Bill C‑63, the online harms act, “whose purpose is to, among other things, promote the online safety of persons in Canada, reduce harms caused to persons in Canada as a result of harmful content online and ensure that the operators of social media services in respect of which that Act applies are transparent and accountable with respect to their duties under that Act”. This bill was introduced by the Minister of Justice. I will provide a bit of context. I will then talk a bit more about the bill. I will close with a few of the Bloc Québécois's proposals.

To begin, I would like to say that Bill C‑63 should have been introduced much sooner. The Liberals promised to legislate against online hate. As members know, in June 2021, during the second session of the 43rd Parliament, the Liberals tabled Bill C-36, which was a first draft that laid out their intentions. This bill faced criticism, so they chose to let it die on the Order Paper. In July 2021, the government launched consultations on a new regulatory framework for online safety. It then set up an expert advisory group to help it draft a new bill. We saw that things were dragging on, so in 2022 we again asked about bringing back the bill. We wanted the government to keep its promises. This bill comes at a time when tensions are high and discourse is strained, particularly because of the war between Israel and Hamas. Some activists fear that hate speech will be used to silence critics. The Minister of Justice defended himself by saying that the highest level of proof would have to be produced before a conviction could be handed down.

Second, I would like to go back over a few aspects of the bill. Under this bill, operators who refuse to comply with the law, or with a decision of the commission, could face fines of up to 8% of their gross global revenues or $25 million, whichever is greater, depending on the nature of the offence. Bill C‑63 increases the maximum penalties for hate crimes. It even includes a definition of hatred as the "emotion that involves detestation or vilification and that is stronger than disdain or dislike". This legislation includes tough new provisions stipulating that a person who commits a hate-motivated crime, under any federal law, can be sentenced to life in prison. Even more surprising, people can file a complaint before a provincial court judge if they have reasonable grounds to suspect that someone is going to commit one of these offences.
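The penalty ceiling described above is a simple "greater of two amounts" rule. As a minimal sketch, assuming the figures cited in the speech (8% of gross global revenue or $25 million) and a hypothetical function name, the cap works out like this:

```python
def max_penalty(gross_global_revenue: float) -> float:
    """Upper bound on a fine as described in the speech:
    8% of gross global revenue or $25 million, whichever is greater.
    This is an illustration of the speaker's figures, not the bill text."""
    return max(0.08 * gross_global_revenue, 25_000_000)

# For an operator with $1 billion in gross global revenue,
# the percentage-based cap ($80 million) exceeds the $25-million floor.
print(max_penalty(1_000_000_000))
```

For smaller operators, the $25-million figure acts as the binding ceiling; the percentage only dominates once gross global revenue exceeds $312.5 million.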

Bill C-63 amends the Canadian Human Rights Act to allow the Canadian Human Rights Commission to receive complaints regarding the communication of hate speech. Individuals found guilty could be subject to an order. Private conversations are excluded from the communication of hate speech. There are all kinds of things like that to examine more closely. As my colleague explained, this bill contains several parts, each with its own elements. Certain aspects will need a closer look in committee.

Bill C-63 also updates the definition of “Internet service”. The law requires Internet service providers to “notify the law enforcement body designated by the regulations...as soon as feasible and in accordance with the regulations” if they have “reasonable grounds to believe that their Internet service is being or has been used to commit a child pornography offence”.

Bill C-63 tackles two major scourges of the digital world, which I have already discussed. The first is non-consensual pornographic material or child pornography, and the second is hate speech.

The provisions to combat child pornography and the distribution of non-consensual pornographic material are generally positive. The Bloc Québécois supports them. That is why the Bloc Québécois supports part 1 of the bill.

On the other hand, some provisions of Bill C‑63 to fight against hate are problematic. The Bloc Québécois fears, as my colleague from Salaberry—Suroît explained, that the provisions of Bill C‑63 might unnecessarily restrict freedom of expression. We want to remind the House that Quebec already debated the subject in 2015. Bill 59, which sought to counter radicalization, was intended to sanction hate speech. Ultimately, Quebec legislators concluded that giving powers to the Commission des droits de la personne et des droits de la jeunesse, as Bill C‑63 would have us do with the Canadian Human Rights Commission, would do more harm than good. The Bloc Québécois is going with the consensus in Quebec on this. It believes that the Criminal Code provisions are more than sufficient to fight against hate speech. Yes, the Bloc Québécois is representing the consensus in Quebec and reiterating it here in the House.

Third, the Bloc Québécois is proposing that Bill C‑63 be divided so that we can debate part 1 separately, as I explained. This is a critical issue. Internet pornography has a disproportionate effect on children, minors and women, and we need to protect them. This part targets sexual content. Online platforms are also targeted in the other parts.

We believe that the digital safety commission must be established as quickly as possible to provide support and recourse for those who are trying to have content about them removed from platforms. We have to help them. By dividing Bill C‑63, we would be able to debate and reach a consensus on part 1 more quickly.

Parts 2, 3 and 4 also contain provisions about hate speech. That is a bit more complex. Part 1 of the bill is well structured. It forces social media operators, including platforms that distribute pornographic material, such as Pornhub, to take measures to increase the security of digital environments. In order to do so, the bill requires social media operators to act responsibly. All of that is very positive.

Part 1 also talks about allowing users to report harmful content to operators, based on seven categories defined by the law, so that it can be removed. We want Bill C-63 to be tougher on harmful content, meaning content that sexually victimizes a child or revictimizes a survivor and intimate content communicated without consent. As we have already seen, this has serious consequences for victims, including post-traumatic stress. We need to take action.

However, part 2 of the bill is more problematic, because it amends the Criminal Code to increase the maximum sentences for hate crimes. The Bloc Québécois finds it hard to see how increasing maximum sentences for this type of crime will have any effect and how it is justified. Introducing a provision that allows life imprisonment for any hate-motivated federal offence is puzzling.

Furthermore, part 2 provides that a complaint can be made against someone when there is a fear they may commit a hate crime, and orders can be made against that person. However, as explained earlier, there are already sections of the Criminal Code that deal with these situations. This part is therefore problematic.

Part 3 allows an individual to file a complaint with the Canadian Human Rights Commission for speech that foments hate, including online speech. As mentioned, the Bloc Québécois has concerns that these provisions may be used to silence ideological opponents.

Part 4 states that Internet service providers must notify the appropriate authority if they suspect that their services are being used for child pornography purposes. In short, this part should also be studied.

In conclusion, the numbers are alarming. According to Statistics Canada, violent hate crimes have increased each year since 2015. Between 2015 and 2021, the total number of victims of violent hate crimes increased by 158%. The Internet is contributing to the surge in hate. However, if we want to take serious action, I think it is important to split Bill C‑63. The Bloc Québécois has been calling for this for a long time. Part 1 is important, but parts 2, 3 and 4 need to be studied separately in committee.

I would like to acknowledge all the work accomplished on this issue by my colleagues. Specifically, I am referring to the member for Drummond, the member for Rivière-du-Nord and the member for Avignon—La Mitis—Matane—Matapédia. We really must take action.

This is an important issue that the Bloc Québécois has been working on for a very long time.

Online Harms Act (Government Orders)

June 7th, 2024 / 12:15 p.m.

Claude DeBellefeuille (Bloc, Salaberry—Suroît, QC)

Mr. Speaker, I have been authorized to share my time with the hon. member for Shefford, who does essential work for the Bloc Québécois on issues having to do with seniors. I would like to take this opportunity to remind the government that Bill C‑319, which was introduced by my colleague, was unanimously adopted in committee, with good reason. The Bloc Québécois is proposing to increase the amount of the full pension by 10% starting at age 65 and to change the way the guaranteed income supplement is calculated to benefit seniors.

There is a lot of talk about that in my riding. This bill is coming back to the House, and the government should make a commitment at some point. We are asking the government to grant a royal recommendation to Bill C‑319. In other words, if the bill is blocked again, seniors will understand that the Liberals are once again abandoning them. I am passionate about the cause of seniors, so I wanted to use my speech on Bill C‑63 to make a heartfelt plea on behalf of seniors in Quebec and to commend my colleague from Shefford for her work.

Today we are debating Bill C‑63, which amends a number of laws to tackle two major digital scourges, specifically child pornography, including online child pornography, and hate speech. This legislation was eagerly awaited. We were surprised that it took the government so long to introduce it.

We have been waiting a long time for this bill, especially part 1. The Bloc Québécois has been waiting a long time for such a bill to protect our children and people who are abused and bullied and whose reputations are jeopardized because of all the issues related to pornography. We agree with part 1 of the bill. We even made an offer to the minister. We agree with it so completely, and I believe there is a consensus about that across the House, that I think we should split the bill and pass the first part before the House rises. That way, we could implement everything needed to protect our children, teens and young adults who are currently going through difficult experiences that can change their lives and have a significant negative impact on them.

We agree that parts 2, 3 and 4 need to be discussed and debated, because the whole hate speech component of the bill is important. We agree with the minister on that. It is very important. What is currently happening on the Internet and online is unacceptable. We need to take action, but reaching an agreement on how to deal with this issue is not that easy. We need time and we need to debate it amongst ourselves.

The Bloc Québécois has a list of witnesses who could enlighten us on how we can improve the situation. We would like to hear from experts who could help us pass the best bill possible in order to protect the public, citizens and groups when it comes to the whole issue of hate speech. We also wonder why the minister, in part 2 of his bill, which deals with hate speech, did not include the two clauses of the bill introduced by the member for Beloeil—Chambly. I am talking about Bill C-367, which proposed removing the protection afforded under the Criminal Code to people who engage in hate speech on a religious basis.

We are wondering why the minister did not take the opportunity to add these clauses to his bill. These are questions that we have because to us, offering this protection is out of the question. It is out of the question to let someone use religion as an excuse to make gestures, accusations or even very threatening comments on the Internet under these sections of the Criminal Code. We are asking the minister to listen. The debates in the House and in committee are very polarized right now.

It would be extremely sad and very disappointing if we passed this bill so quickly that there was no time to debate it in order to improve it and make it the best bill it can be.

I can say that the Bloc Québécois is voting in favour of the bill at second reading. As I said, it is a complex bill. We made a proposal to the Prime Minister. We wrote to him and the leader. We also talked to the Minister of Justice to tell him to split the bill as soon as possible. That way, we could quickly protect the survivors who testified at the Standing Committee on Access to Information, Privacy and Ethics in the other Parliament. These people said that their life is unbearable, and they talked about the consequences they are suffering from being victims of sites such as Pornhub. They were used without their consent. Intimate images of them were posted without their consent. We are saying that we need to protect the people currently going through this by quickly adopting part 1. The committee could then study part 2 and hear witnesses.

I know that the member for Drummond and the member for Avignon—La Mitis—Matane—Matapédia raised this idea during committee of the whole on May 23. They tried to convince the minister, but he is still refusing to split the bill. We think that is a very bad idea. We want to repeat our offer. We do not really understand why he is so reluctant to do so. There is nothing partisan about what the Bloc Québécois is proposing. Our focus is on protecting victims on various platforms.

In closing, I know that the leaders are having discussions to finalize when the House will rise for the summer. Maybe fast-tracking a bill like this one could be part of the negotiations. However, I repeat that we are appealing to the Minister of Justice's sense of responsibility. I know he cares a lot about victims and their cause. We are sincerely asking him to postpone the passage of parts 2, 3 and 4 so that we can have more time to debate them in committee. Most importantly, we want to pass part 1 before the House rises for the summer so that we can protect people who are going through a really hard time right now because their private lives have been exposed online and they cannot get web platforms to take down their image, their photo or photos of their private parts.

We are appealing to the minister's sense of responsibility.

Online Harms Act (Government Orders)

June 7th, 2024 / 10:45 a.m.

Michelle Rempel (Conservative, Calgary Nose Hill, AB)

Mr. Speaker, third, the government must actually enforce laws that are already on the books but have not been enforced recently due to an extreme lack of political will and disingenuous politics and leadership, particularly as they relate to hate speech. This is especially important in light of the rising dangers faced by vulnerable Canadian religious communities such as, as the minister mentioned, Canada's Jewish community.

This could be done via actions such as ensuring the RCMP, including its specialized integrated national security enforcement teams and national security enforcement sections, is providing resources and working directly with the appropriate provincial and municipal police forces to share information and intelligence and protect these communities, as well as making sure that Security Infrastructure Program funding is accessible in an expedited manner so community institutions and centres can enhance security measures at their gathering places.

Fourth, for areas where modernization of existing regulations and the Criminal Code need immediate updating to reflect the digital age, and where there could be cross-partisan consensus, the government should undertake these changes in a manner that would allow for swift and non-partisan passage through Parliament.

These items could include some of the provisions discussed in Bill C-63: the duty to make content that sexually victimizes a child or revictimizes a survivor, or intimate content communicated without consent, inaccessible to persons in Canada in certain circumstances; imposing on online providers certain duties to keep all records related to sexual victimization; providing for persons in Canada to make a complaint to existing enforcement bodies, such as the CRTC or the police, rather than to a new bureaucracy that would take years to materialize and could be costly and/or ineffective; ensuring, through court orders to the operators of social media services, that content that sexually victimizes a child or revictimizes a survivor, or that is intimate content communicated without consent, is made inaccessible to persons in Canada; and enforcing the proposed amendment to An Act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service.

Other provisions the government has chosen not to include in Bill C-63, but that should have been included and that Parliament should be considering in the context of harms being conducted online, must include updating Canada's existing laws on the non-consensual distribution of intimate images to ensure that the distribution of intimate deepfakes is also criminalized, likely through a simple update to the Criminal Code. We could have done this by unanimous consent today had the government taken the initiative to do so. This is already a major problem in Canada, with girls in high schools in Winnipeg seeing intimate images of themselves, sometimes, as reports say, depicting them being sexually violated, without any ability for the law to intervene.

The government also needs to create a new criminal offence of online criminal harassment that would update the existing crime of criminal harassment to address the ease and anonymity of online criminal harassment. Specifically, this would apply to those who repeatedly send threatening and/or explicit messages or content to people across the Internet and social media when they know, or should know, that it is not welcome. This could include aggravating factors for repeatedly sending such material anonymously, and it could be accompanied by a so-called digital restraining order that would allow victims of online criminal harassment to apply to a judge, under strict circumstances, to identify the harasser and end the harassment.

This would protect privacy, remove the onus on social media platforms of guessing when they should give identities to the police, and prevent the escalation of online harassment into physical violence. It would give police and victims clear and easy-to-understand tools to prevent online harassment and the associated escalation. It would also address a major issue in intimate partner violence and make it easier to stop coercive control.

As well, I will note to the minister that members of the governing Liberal Party agreed to the need for these exact measures at a recent meeting of PROC related to online harassment of elected officials this past week.

Fifth, the government should consider a more effective and better way to regulate online platforms, likely under the authority of the CRTC and the Minister of Industry, to better protect children online while protecting charter rights.

This path could include improved measures such as defining, through legislation and precisely in law rather than through backroom regulation, the duty of care required of online platforms. Some of these duties of care have already been mentioned in questions to the ministers today. This is what Parliament should be seized with, not allowing some unnamed future regulatory body to decide it for us while big tech companies and their lobbying arms define it behind closed doors. That is our job, not theirs.

We could provide parents with safeguards, controls and transparency to prevent harm to their kids when they are online, which could be part of the duty of care. We could also require that online platforms put the interests of children first with appropriate safeguards, again, in a legislative duty of care.

There could also be measures to prevent and mitigate self-harm, mental health disorders, addictive behaviours, bullying and harassment, sexual violence and exploitation, and the promotion of marketing and products that are unlawful for minors. All of these things are instances of duty of care.

We could include measures to implement privacy-preserving and trustworthy age verification methods, which many platforms already have the capacity to do, while prohibiting the use of a digital ID in any of these mechanisms.

This path could also include measures to ensure that enforcement, including a system of administrative penalties and consequences, is done through agencies that already exist. Additionally, we could ensure that there are other remedies, such as the ability to seek a civil remedy for injury when that duty of care is violated.

This is a non-comprehensive list of online harms, but the point is, we could come to consensus in this place on simple modernization issues that would update the laws now. I hope that the government will accept this plan.

I send out a shout-out to Sean Phelan and David Murray, two strong and mighty workers. We did not have an army of bureaucrats, but we came up with this plan. I hope that Parliament considers this alternative to Bill C-63, because the safety of Canadians is at risk.

Online Harms Act (Government Orders)

June 7th, 2024 / 10:30 a.m.

Michelle Rempel (Conservative, Calgary Nose Hill, AB)

Mr. Speaker, we must protect Canadians in the digital age, but Bill C-63 is not the way to do it. It would force Canadians to make unnecessary trade-offs between the guarantee of their security and their charter rights. Today I will explain why Bill C-63 is deeply flawed and why it would not protect Canadians' rights sufficiently. More importantly, I will present a comprehensive alternative plan that is more respectful of Canadians' charter rights and would provide immediate protections for Canadians facing online harms.

The core problem with Bill C-63 is how the government has chosen to frame the myriad harms that occur in the digital space as homogenous and as capable of being solved with one approach or piece of legislation. In reality, harms that occur online are an incredibly heterogeneous set of problems requiring a multitude of tailored solutions. It may sound like the latter might be more difficult to achieve than the former, but this is not the case. It is relatively easy to inventory the multitude of problems that occur online and cause Canadians harm. From there, it should be easy to sort out how existing laws and regulatory processes that exist for the physical world could be extended to the digital world.

There are few, if any, examples of harms that are being caused in digital spaces that do not already have existing relatable laws or regulatory structures that could be extended or modified to cover them. Conversely, what the government has done for nearly a decade is try to create new, catch-all regulatory, bureaucratic and extrajudicial processes that would adapt to the needs of actors in the digital space instead of requiring them to adapt to our existing laws. All of these attempts have failed to become law, which is likely going to be the fate of Bill C-63.

This is a backward way of looking at things. It has caused nearly a decade of inaction on much-needed modernization of existing systems and has translated into law enforcement's not having the tools it needs to prevent crime, which in turn causes harm to Canadians. It has also led to a balkanization of laws and regulations across Canadian jurisdictions, a loss of investment due to the uncertainty, and a lack of coordination with the international community. Again, ultimately, it all harms Canadians.

Bill C-63 takes the same approach by listing only a few of the harms that happen in online spaces and creates a new, onerous and opaque extrajudicial bureaucracy, while creating deep problems for Canadian charter rights. For example, Bill C-63 would create a new “offence motivated by a hatred” provision that could see a life sentence applied to minor infractions under any act of Parliament, a parasitic provision that would be unchecked in the scope of the legislation. This means that words alone could lead to life imprisonment.

While the government has attempted to argue that this is not the case, saying that a serious underlying act would have to occur for the provision to apply, that is simply not how the bill is written. I ask colleagues to look at it. The bill seeks to amend section 320 of the Criminal Code, and reads, “Everyone who commits an offence under this Act or any other Act of Parliament...is guilty of an indictable offence and liable to imprisonment for life.”

At the justice committee earlier this year, the minister stated:

...the new hate crime offence captures any existing offence if it was hate-motivated. That can run the gamut from a hate-motivated theft all the way to a hate-motivated attempted murder. The sentencing range entrenched in Bill C-63 was designed to mirror the existing...options for all of these potential underlying offences, from the most minor to the most serious offences on the books....

The minister continued, saying, "this does not mean that minor offences will suddenly receive...harsh sentences. However, sentencing judges are required to follow legal principles, and...hate-motivated murder will result in a life sentence. A minor infraction will...not result in it."

In this statement, the minister admitted both that the new provision could be applied to any act of Parliament, as the bill states, and that the government would be relying upon the judiciary to ensure that maximum penalties were not levelled against minor infractions. Parliament cannot afford for the government to be this lazy, and by that I mean failing to spell out in law exactly what it intends a life sentence to apply to, as opposed to handing a highly imperfect judiciary an overbroad law that could have extreme negative consequences.

Similarly, a massive amount of concern from across the political spectrum has been raised regarding Bill C-63's introduction of a so-called hate crime peace bond, calling it a pre-crime provision for speech. This is highly problematic because it would explicitly extend the power to issue peace bonds to crimes of speech, which the bill does not adequately define, nor does it provide any assurance that it would meet a criminal standard for hate.

Equally as concerning is that Bill C-63 would create a new process for individuals and groups to complain to the Canadian Human Rights Commission that online speech directed at them is discriminatory. This process would be extrajudicial, not subject to the same evidentiary standards of a criminal court, and could take years to resolve. Findings would be based on a mere balance of probabilities rather than on the criminal standard of proof beyond a reasonable doubt.

The subjectivity of defining hate speech would undoubtedly lead to punishments for protected speech. The mere threat of human rights complaints would chill large amounts of protected speech, and the system would undoubtedly be deluged with a landslide of vexatious complaints. There certainly are no provisions in the bill to prevent any of this from happening.

Nearly a decade ago, even the Toronto Star, hardly a bastion of Conservative thought, wrote a scathing opinion piece opposing these types of provisions, and the same principle should apply today. When the highly problematic components of the bill are overlaid on the fact that we are presently living under a government that unlawfully invoked the Emergencies Act, and that routinely gaslights Canadians who legitimately question the efficacy or morality of its policies as spreading misinformation, as the Minister of Justice did in his response to my question when he said that I had mischaracterized the bill, it is not a far leap to surmise that the new provision has great potential for abuse. That could be true for a government of any political stripe.

The government's charter compliance statement, which is long, vague and only recently issued, should raise concerns for parliamentarians in this regard, as it relies on this assertion: “The effects of the Bill on freedom of expression are outweighed by the benefits of protecting members of vulnerable groups”. The government has already been found to have violated the Charter in the case of Bill C-69, based on false presumptions about which benefits outweigh others. I suspect the same would be true of Bill C-63 should it become law, which I hope it does not.

I believe in the capacity of Canadians to express themselves within the bounds of protected speech and to maintain the rule of law within our vibrant pluralism. Regardless of political stripe, we must value freedom of speech and due process, because they are what prevents violent conflict. Speech already has clearly defined limitations under Canadian law. The provisions in Bill C-63 that I have just described are anathema to these principles. To be clear, Canadians should not be expected to have their right to protected speech chilled or limited in order to be safe online, which is what Bill C-63 would ask of them.

Bill C-63 would also create a new three-headed, yet-to-exist bureaucracy. It would leave many of the actual rules the bill describes to be created and enforced by that bureaucracy, under as-yet-undefined regulations, at some later date. We cannot wait to take action in many circumstances. As one expert described it to me, it is like vaguely sketching an outline and expecting bureaucrats, not elected legislators, to colour in the picture behind closed doors, without any accountability to the Canadian public.

The government should have learned from the costs of failure when it attempted the same approach with Bill C-11 and Bill C-18, but alas, here we are. The new bureaucratic process would be slow, onerous and uncertain. If the government proceeds with it, Canadians would be left without protection, and innovators and investors would be left without the regulatory certainty needed to grow their businesses.

It would also be costly. I have asked the Parliamentary Budget Officer to conduct an analysis of the costs associated with the creation of the bureaucracy, and he has agreed to undertake the task. No parliamentarian should even consider supporting the bill without understanding the resources the government intends to allocate to the creation of the new digital safety commission, digital safety ombudsperson and digital safety office, particularly since the findings in this week's damning NSICOP report starkly outlined the opportunity cost of the government's failure to allocate much-needed resources to the RCMP.

Said differently, if the government cannot fund and maintain the critical operations of the RCMP, which already has the mandate to enforce laws related to public safety, then Parliament should have grave doubts about the efficacy of setting up three new bureaucracies to address issues that could likely be managed by existing regulatory bodies like the CRTC or through enforcement of the Criminal Code. Canadians should also have major qualms about creating new bureaucracies that would give well-funded and extremely powerful big tech companies the opportunity to lobby and manipulate regulations to their benefit behind the scenes, outside the purview of Parliament.

This approach would not necessarily protect Canadians and may create artificial barriers to entry for new innovative industry players. The far better approach would be to adapt and extend long-existing laws and regulatory systems, properly resource their enforcement arms, and require big tech companies and other actors in the digital space to comply with these laws, not the other way around. This approach would provide Canadians with real protections, not what amounts to a new, ineffectual complaints department with a high negative opportunity cost to Canadians.

In no scenario should Parliament allow the government to entrench in legislation a power for social media companies to be arbiters of speech, which Bill C-63 risks doing. If the government wishes to further impose restrictions on Canadians' rights to speech, that should be a debate for Parliament to consider, not for regulators and tech giants to decide behind closed doors and with limited accountability to the public.

In short, this bill is completely flawed and should be abandoned, particularly given the minister's announcement this morning that he is unwilling to proceed with any sort of change to it in scope.

However, there is a better way. There is an alternative, which would be a more effective and more quickly implementable plan to protect Canadians' safety in the digital age. It would modernize existing laws and processes to align with digital advancements. It would protect speech not already limited in the Criminal Code, and would foster an environment for innovation and investment in digital technologies. It would propose adequately resourcing agencies with existing responsibilities for enforcing the law, not creating extrajudicial bureaucracies that would amount to a complaints department.

To begin, the RCMP and many law enforcement agencies across the country are under-resourced after certain flavours of politicians have given much more than a wink and a nod to the “defund the police” movement for over a decade. This trend must immediately be reversed. Well-resourced and well-respected law enforcement is critical to a free and just society.

Second, the government must also reform its watered-down bail policies, which allow repeat offenders to commit crimes over and over again. Criminals in the digital space will never face justice, no matter what laws are passed, if the Liberal government's catch-and-release policies are not reversed. I think of a woman in my city of Calgary who was murdered in broad daylight in front of an elementary school because her spouse was subject to the catch-and-release Liberal bail policy, in spite of his online harassment of her for a very long time.

Third, the government must actually enforce—

Online Harms Act (Government Orders)

June 7th, 2024 / 10:20 a.m.



Bloc

Claude DeBellefeuille Bloc Salaberry—Suroît, QC

Mr. Speaker, the Bloc Québécois believes that Bill C-63 tackles two major online scourges and that it is time for us, as legislators, to take action to stamp them out.

The Bloc Québécois strongly supports part 1 of the bill, in other words, all provisions related to addressing child pornography and the communication of pornographic content without consent. As we see it, this part is self-evident. It has garnered such strong consensus that we told the minister, through our critic, the member for Rivière-du-Nord, that we not only support it, but we were also prepared to accept and pass part 1 quickly and facilitate its passage.

As for part 2, however, we have some reservations. We consider it reasonable to debate this part in committee. The minister can accuse other political parties of playing politics with part 2, but not the Bloc Québécois. We sincerely believe that part 2 needs to be debated. We have questions. We have doubts. I think our role calls on us to get to the bottom of things.

That is why we have asked the minister—and why we are asking him again today—to split Bill C‑63 in two, so that we can pass part 1 quickly and implement it, and set part 2 aside for legislative and debate-related purposes.

Online Harms Act (Government Orders)

June 7th, 2024 / 10 a.m.



Parkdale—High Park, Ontario

Liberal

Arif Virani, Liberal, Minister of Justice and Attorney General of Canada

moved that Bill C-63, An Act to enact the online harms act, to amend the Criminal Code, the Canadian Human Rights Act and An Act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service and to make consequential and related amendments to other Acts, be read the second time and referred to a committee.

Mr. Speaker, hon. colleagues, I am very pleased today to speak to Bill C-63, the online harms act. I speak today not only as a minister and as a fellow parliamentarian, but also as a father, as a South Asian and as a Muslim Canadian.

There are a few moments in this place when our work becomes very personal, and this is one such moment for me. Let me explain why. I ran for office for a number of reasons in 2015. Chief among them was to fight against discrimination and to fight for equality in what I viewed as an increasingly polarized world. In recent years, we have seen that polarization deepen and that hatred fester, including at home here in Canada.

I would never have fathomed that in 2024, Canada would actually lead the G7 in the number of deaths attributable to Islamophobia. Among our allies, it is Canada that has experienced the most fatal attacks against Muslims in the G7. There have been 11. Those were 11 preventable deaths. I say “preventable” because in the trials of both the Quebec mosque shooter, who murdered six men on January 29, 2017, and the man who murdered four members of the Afzaal family in London, Ontario, the attackers admitted, in open court, to having been radicalized online. They admitted what so many of us have always known to be the case: Online hatred has real-world consequences.

Yesterday was the third anniversary of the attack on the Afzaal family, an attack described by the presiding judge as “a terrorist act”. In memory of Talat, Salman, Yumna and Madiha, who lost their lives to an act of hatred on June 6, 2021, we are taking action.

Bill C-63, the online harms act, is a critical piece of that action. This bill is the product of years of work.

We held consultations for over four years. We talked to victims' groups, advocacy groups, international partners, people from the technology industry and the general public. We organized a nationwide consultation and held 19 national and regional round tables. We published a report about what we learned. We listened to the recommendations of our expert advisory group on online safety, a diverse think tank made up of experts who are respected across Canada. We were given valuable advice and gained a great deal of knowledge thanks to those consultations, and all of that informed the development of Bill C-63.

Many of our international partners, such as the United Kingdom, Australia, Germany, France and the European Union, have already done considerable legislative work to try to limit the risks of harmful content online. We learned from their experience and adapted the best parts of their most effective plans to the Canadian context.

We have also learned what did not work abroad, like the immediate takedown of all types of harmful content, originally done in Germany; or like the overbroad restriction on freedom of speech that was struck down as unconstitutional in France. We are not repeating those errors here. Our approach is much more measured and reflects the critical importance of constitutionally protected free expression in Canada's democracy. What we learned from this extensive consultation was that the Internet and social media platforms can be a force for good in Canada and around the world. They have been a tool for activists to defend democracy. They are platforms for critical expression and for critical civic discourse. They make learning more accessible to everyone.

The Internet has made people across our vast world feel more connected to one another, but the Internet also has a dark side. Last December, the RCMP warned of an alarming spike in online extremism among young people in Canada and the radicalization of youth online. We know that the online environment is especially dangerous for our most vulnerable. A recent study by Plan International found that 58% of girls have experienced harassment online.

Social media platforms are used to exploit and disseminate devastating messages with tragic consequences. This is because of one simple truth. For too long, the profits of platforms have come before the safety of users. Self-regulation has failed to keep our kids safe. Stories of tragedy have become far too common. There are tragic consequences, like the death of Amanda Todd, a 15-year-old Port Coquitlam student who died by suicide on October 10, 2012, after being exploited and extorted by more than 20 social media accounts. This relentless harassment started when Amanda was just 12 years old, in grade 7.

There was Carson Cleland last fall. He was the same age as my son at the time: 12 years old. Carson made a mistake. He shared an intimate image with someone whom he thought was a friend online, only to find himself caught up in a web of sextortion from which he could not extricate himself. Unable to turn to his parents, too ashamed to turn to his friends, Carson turned on himself. Carson is no longer with us, but he should be with us.

We need to do more to protect the Amanda Todds and the Carson Clelands of this country, and with this bill, we will. I met with the incredible people at the Canadian Centre for Child Protection earlier this year, and they told me that they receive 70 calls every single week from scared kids across Canada in situations like Amanda's and like Carson's.

As the father of two youngsters, this is very personal for me. As they grow up, my 10-year-old and 13-year-old boys spend more and more time on screens. I know that my wife and I are not alone in this parenting struggle. It is the same struggle that parents are facing around the country.

At this point, there is no turning back. Our children and teens are being exposed to literally everything online, and I feel a desperate need, Canadians feel a desperate need, to do a better job of protecting those kids online. That is precisely what we are going to do with this bill.

Bill C-63 is guided by four important objectives. First, it aims to reduce exposure to harmful content online and to empower and support users. Second, it would address and denounce the rise in hatred and hate crimes. Third, it would ensure that victims of hate have recourse to improved remedies. Fourth, it would strengthen the reporting of child sexual abuse material to enhance the criminal justice response to this heinous crime.

The online harms act will address seven types of harmful content based on categories established over more than four years of consultation.

Not all harms will be treated the same. Services will be required to quickly remove content that sexually victimizes a child or that revictimizes a survivor, as well as to remove what we call “revenge porn”, including sexual deepfakes. There is no place for this material on the Internet whatsoever.

For other types of content, like content that induces a child to self-harm or material that bullies a child, we are placing a duty on platforms to protect children. This means a new legislative and regulatory framework to ensure that social media platforms reduce exposure to harmful, exploitative content on their platforms. This means putting in place special protections for children. It also means that platforms will have to make sure that users have the tools and the resources they need to report harmful content.

To fulfill the duty to protect children, social media platforms will have to integrate age-appropriate design features to make their platforms safer for children to use. This could mean defaults for parental controls and warning labels for children. It could mean security settings for instant messaging for children, or it could mean safe-search settings.

Protecting our children is one of our most important duties that we undertake as lawmakers in this place. As a parent, it literally terrifies me that the most dangerous toys in my home, my children's screens, are not subject to any safety standards right now. This needs to change, and it would change with the passage of Bill C-63.

It is not only that children are subject to horrible sexual abuse and bullying online, but also that they are exposed to hate and hateful content, as are Internet users of all ages and all backgrounds, which is why Bill C-63 targets content that foments hatred and incitements to violence as well as incitements to terrorism. This bill would not require social media companies to take down this kind of harmful content; instead, the platforms would have to reduce exposure to it by creating a digital safety plan, disclosing to the digital safety commissioner what steps they are putting in place to reduce risk and reporting back on their progress.

The platforms would also be required to give users practical options for recourse, like tools to either flag or block certain harmful material from their own feeds. This is key to ensuring community safety, all the more so because they are backed by significant penalties for noncompliance. When I say “significant”, the penalties would be 6% of global revenue or $10 million, whichever is higher, and in the instance of a contravention of an order from the digital safety commission, those would rise to 8% of global revenue or $25 million, again, whichever is higher.

The online harms act is an important step towards a safer, more inclusive online environment, where social media platforms actively work to reduce the risk of user exposure to harmful content on their platforms and help to prevent its spread, and where, as a result, everyone in Canada can feel safer to express themselves openly. This is critical, because at its heart, this initiative is about promoting expression and participation in the civic discourse that occurs online. We can think of Carla Beauvais and the sentiments she expressed when she stood right beside me as we tabled this legislation in February, and of the amount of abuse she faced for voicing her concerns about the George Floyd incident in the United States, abuse that cowed her and prevented her from participating online. We want her voice added to the civic discourse. Right now, it has been removed.

The online harms act will regulate social media services, the primary purpose of which is to enable users to share publicly accessible content, services that pose the greatest risk of exposing the greatest number of people to harmful content.

This means that the act would apply to social media platforms, such as Facebook, X and Instagram; user-uploaded adult content services, such as Pornhub; and livestreaming services, such as Twitch. However, it would not apply to any private communications, meaning private texts or direct private messaging on social media apps, such as Instagram or Facebook Messenger. It is critical to underscore, again, that this is a measured approach that does not follow the overreach seen in other countries we have studied, in terms of how they embarked upon this endeavour. The goal is to target the largest social media platforms, the places where the most people in Canada are spending their time online.

Some ask why Bill C-63 addresses both online harms and hate crimes, which can happen both online and offline. I will explain this. Online dangers do not remain online. We are seeing a dramatic rise in hate crime across our country. According to Statistics Canada, the number of police-reported hate crimes increased by 83% between 2019 and 2022. B'nai Brith Canada reports an alarming 109% increase in anti-Semitic incidents from 2022 to 2023. In the wake of October 7, 2023, I have been hearing frequently from Jewish and Muslim groups, which are openly questioning whether it is safe to be openly Jewish or Muslim in Canada right now. This is not tenable. It should never be tolerated, yet hate-motivated violence keeps happening. People in Canada are telling us to act. It is up to us, as lawmakers, to do exactly that.

We must take concrete action to better protect all people in Canada from harms, both online and in our communities. We need better tools to deal with harmful content online that foments violence and destruction. Bill C-63 gives law enforcement these much-needed tools.

The Toronto Police Service has expressed its open support of Bill C-63 because it knows the bill will make our communities safer. Members of the Afzaal family have expressed their open support for Bill C-63 because they know the Islamophobic hate that causes someone to kill starts somewhere, and it is often online.

However, we know there is no single solution to the spread of hatred online and offline. That is why the bill proposes a number of different tools to help stop the hate. It starts with the Criminal Code of Canada. Bill C-63 would amend the Criminal Code to better target hate crime and hate propaganda. It would do this in four important ways.

First, it would create a new hate crime offence. Law enforcement has asked us for this tool, so they can call a hate crime a hate crime when laying a charge, rather than as an afterthought at sentencing. This new offence will also help law enforcement track the actual number of hate-motivated crimes in Canada. That is why they have appealed to me to create a free-standing hate crime offence in a manner that replicates what already exists in 47 of the 50 states south of the border. A hate-motivated assault is not just an assault. It is a hate crime and should be recognized as such on the front end of a prosecution.

Second, Bill C‑63 would increase sentences for the four existing hate speech offences. These are serious offences, and the sentences should reflect that.

Third, Bill C-63 would create a recognizance to keep the peace, which is specifically designed to prevent any of the four hate propaganda offences and the new hate crime offence from being committed.

This would be modelled on existing peace bonds, such as those used in domestic violence cases, and would require someone to have a reasonable fear that these offences would be committed. The threshold of “reasonable fear” is common to almost all peace bonds.

In addition, as some but not all peace bonds do, this would require the relevant attorney general to give consent before an application is made to a judge to impose a peace bond on a person. This ensures an extra layer of scrutiny in the process.

Finally, the bill would codify a definition of hatred for hate propaganda offences and for the new hate crime offence, based on the definition the Supreme Court of Canada created in its seminal decisions in R. v. Keegstra and in Saskatchewan Human Rights Commission v. Whatcott. The definition sets out not only what hatred is but also what it is not, thereby helping Canadians and law enforcement to better understand the scope of these offences.

The court has defined hate speech as content that expresses detestation or vilification of an individual or group on the basis of grounds such as race, national or ethnic origin, religion and sex. It only captures the most extreme and marginal type of expression, leaving the entirety of political and other discourse almost untouched. That is where one will find the category of content that some have called “awful but lawful”. This is the stuff that is offensive and ugly but is still permitted as constitutionally protected free expression under charter section 2(b). This category of content is not hate speech under the Supreme Court's definition.

I want to make clear what Bill C‑63 does not do. It does not undermine freedom of expression. It strengthens freedom of expression by allowing all people to participate safely in online discussions.

Bill C-63 would provide another tool as well. It would amend the Canadian Human Rights Act to define a new discriminatory practice of communicating hate speech online. The legislation makes clear that hate does not encompass content that merely discredits, humiliates, hurts or offends, but where hate speech does occur, there would be a mechanism through which an individual could ask that those expressions of hate be removed. The CHRA amendments are not designed to punish anyone. They would simply give Canadians a tool to get hate speech removed.

Finally, Bill C-63 would modernize and close loopholes in the mandatory reporting act. This would help law enforcement more effectively investigate child sex abuse and exploitation and bring perpetrators to justice by requiring that information be retained longer and that social media companies report CSAM to the RCMP.

There is broad support for the online harms act. When I introduced the legislation in February, I was proud to have at my side the Centre for Israel and Jewish Affairs and the National Council of Canadian Muslims. Those two groups have had vast differences in recent months, but on the need to fight hatred online, they are united. The same unity has been expressed by both Deborah Lyons, the special envoy on preserving Holocaust remembrance and combatting anti-Semitism, and Amira Elghawaby, the special representative on combatting Islamophobia.

The time to combat all forms of online hate is now. Hatred that festers online can result in real-world violence. I am always open to good-faith suggestions on how to improve the bill. I look forward to following along with the study of the legislation at the committee stage. I have a fundamental duty to uphold the charter protection of free expression and to protect all Canadians from harm. I take both duties very seriously.

Some have urged me to split Bill C-63 in two, dealing only with the provisions that stop sexually exploitative material from spreading and throwing away measures that combat hate. To these people, I say that I would not be doing my job as minister if I failed to address the rampant hatred on online platforms. It is my job to protect all Canadians from harm. That means kids and adults. People are pleading for relief from the spread of hate. It is time we acted.

Bill C-63 is a comprehensive response to online harms and the dangerous hate we are seeing spreading in our communities. We have a duty to protect our children in the real world. We must take decisive action to protect them online as well, where the dangers can be just as pernicious, if not more so. Such action starts with passing Bill C-63.

Business of the House (Oral Questions)

June 6th, 2024 / 3:20 p.m.



Gatineau, Québec

Liberal

Steven MacKinnon, Liberal, Leader of the Government in the House of Commons

Mr. Speaker, there is indeed a secret in the House, and that is the Conservative Party's true intentions when it comes to cuts. “Chop, chop, chop,” as my colleague from Gaspésie—Les Îles-de-la-Madeleine so aptly puts it. That party wants to cut social programs and the programs that are so dear to Quebeckers and Canadians: women's rights, the right to abortion, the right to contraception. The Conservatives want to scrap our government's dental care and pharmacare plans. The secret is the Conservative Party's hidden agenda, which will do great harm to all Canadians.

With our government's usual transparency, this evening we will proceed to report stage consideration of Bill C-20, an act establishing the public complaints and review commission and amending certain acts and statutory instruments, and Bill C-40, an act to amend the Criminal Code, to make consequential amendments to other acts and to repeal a regulation regarding miscarriage of justice reviews, also known as David and Joyce Milgaard's law.

Tomorrow, we will begin second reading of Bill C-63, an act to enact the online harms act, to amend the Criminal Code, the Canadian Human Rights Act and An Act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service and to make consequential and related amendments to other acts.

I would like to inform the House that next Monday and Thursday shall be allotted days. On Tuesday, we will start report stage of Bill C-69, the budget implementation act. On Wednesday, we will deal with Bill C-70, concerning foreign interference, as per the special order adopted last Thursday. I wish all members and the House staff a good weekend.
