House of Commons Hansard #327 of the 44th Parliament, 1st Session.

Online Harms Act (Government Orders)

12:25 p.m.

NDP

Peter Julian NDP New Westminster—Burnaby, BC

Mr. Speaker, as usual, I listened to my colleague from Salaberry—Suroît's speech with great interest. There is one aspect of the bill that I see as a major flaw, specifically the fact that children are often profoundly harmed by hateful content promoted by secret algorithms, yet there is nothing in this bill about algorithm transparency.

Does my colleague agree that the big digital platforms, the web giants, should be responsible for disclosing the algorithms they use? These algorithms amplify hate speech, which is often extremely harmful to children.

Online Harms Act (Government Orders)

12:25 p.m.

Bloc

Claude DeBellefeuille Bloc Salaberry—Suroît, QC

Mr. Speaker, the issue raised by my colleague is just one example of something that could be studied and debated in committee. For instance, experts could share their expertise on algorithm management. As legislators, our goal is to improve the bill. What my colleague is proposing is one of the things that will probably be discussed in committee. Depending on the nature of the deliberations, we might be able to amend the bill.

Quebec began exploring how we could reduce radicalization and hate speech on the Internet in 2015. This was even the subject of a bill studied in the Quebec National Assembly. However, it was not easy. We realized that what we were doing would not necessarily help the situation and could even do more damage.

I urge my colleagues to study parts 2, 3 and 4 of the bill in committee and to pass part 1 now.

Online Harms Act (Government Orders)

12:25 p.m.

Bloc

Xavier Barsalou-Duval Bloc Pierre-Boucher—Les Patriotes—Verchères, QC

Mr. Speaker, since I have very little time, I would just like to say something and perhaps ask my colleague a question. Not very long ago, the leader of the Bloc Québécois and member for Beloeil—Chambly introduced a bill to prevent people from using the religious exemption to engage in hate speech. I would like to know whether this bill addresses that very important matter.

Online Harms Act (Government Orders)

12:25 p.m.

Bloc

Claude DeBellefeuille Bloc Salaberry—Suroît, QC

Mr. Speaker, I hope we are going to discuss this and be able to amend the bill, because we do not understand why this aspect was not included.

I would also like to take this opportunity to acknowledge the schoolchildren from École Edgar-Hébert, who are here with us today to observe our work in the House and see what a good job the Speaker is doing.

Online Harms Act (Government Orders)

12:25 p.m.

Bloc

Andréanne Larouche Bloc Shefford, QC

Mr. Speaker, it is not easy to speak in front of the member for Salaberry—Suroît, who does outstanding work and who just gave a wonderful speech. I will see what I can add to it. I may get a little more technical than she did. She spoke from the heart, as usual, and I commend her for that. I also want to thank her for her shout-out to Bill C-319. People are still talking to me about Bill C‑319, because seniors between the ages of 65 and 74 feel forgotten. We will continue this debate over the summer. In anticipation of this bill's eventual return before the House, we will continue to try to raise public awareness of the important issue of increasing old age security by 10% for all seniors.

I have gotten a bit off today's topic. I am the critic for seniors, but I am also the critic for status of women, and it is more in that capacity that I am rising today to speak to Bill C-63. This is an issue that I hear a lot about. Many groups reach out to me about hate speech. They are saying that women are disproportionately affected. That was the theme that my colleague from Drummond and I chose on March 8 of last year. We are calling for better control over hate speech out of respect for women who are the victims of serious violence online. It is important that we have a bill on this subject. It took a while, but I will come back to that.

Today we are discussing the famous Bill C‑63, the online harms act, “whose purpose is to, among other things, promote the online safety of persons in Canada, reduce harms caused to persons in Canada as a result of harmful content online and ensure that the operators of social media services in respect of which that Act applies are transparent and accountable with respect to their duties under that Act”. This bill was introduced by the Minister of Justice. I will provide a bit of context. I will then talk a bit more about the bill. I will close with a few of the Bloc Québécois's proposals.

To begin, I would like to say that Bill C‑63 should have been introduced much sooner. The Liberals promised to legislate against online hate. As members know, in June 2021, during the second session of the 43rd Parliament, the Liberals tabled Bill C-36, which was a first draft that laid out their intentions. This bill faced criticism, so they chose to let it die on the Order Paper. In July 2021, the government launched consultations on a new regulatory framework for online safety. It then set up an expert advisory group to help it draft a new bill. We saw that things were dragging on, so in 2022 we again asked about bringing back the bill. We wanted the government to keep its promises. This bill comes at a time when tensions are high and discourse is strained, particularly because of the war between Israel and Hamas. Some activists fear that hate speech will be used to silence critics. The Minister of Justice defended himself by saying that the highest level of proof would have to be produced before a conviction could be handed down.

Second, I would like to go back over a few aspects of the bill. Under this bill, operators who refuse to comply with the law, or who refuse to comply with the commission's decisions, could face fines of up to 8% of their gross global revenue or $25 million, whichever is greater, depending on the nature of the offence. Bill C‑63 increases the maximum penalties for hate crimes. It even includes a definition of hate as the “emotion that involves detestation or vilification and that is stronger than disdain or dislike”. The bill addresses that. This legislation includes tough new provisions stipulating that a person who commits a hate-motivated crime under any federal law can be sentenced to life in prison. Even more surprising, people can file a complaint before a provincial court judge if they have reasonable grounds to suspect that someone is going to commit one of these offences.

Bill C-63 amends the Canadian Human Rights Act to allow the Canadian Human Rights Commission to receive complaints regarding the communication of hate speech. Individuals found guilty could be subject to an order. Private conversations are excluded from the communication of hate speech. There are all kinds of things like that to examine more closely. As my colleague explained, this bill contains several parts, each with its own elements. Certain aspects will need a closer look in committee.

Bill C-63 also updates the definition of “Internet service”. The law requires Internet service providers to “notify the law enforcement body designated by the regulations...as soon as feasible and in accordance with the regulations” if they have “reasonable grounds to believe that their Internet service is being or has been used to commit a child pornography offence”.

Bill C-63 tackles two major scourges of the digital world, which I have already discussed. The first is non-consensual pornographic material or child pornography, and the second is hate speech.

The provisions to combat child pornography and the distribution of non-consensual pornographic material are generally positive. The Bloc Québécois supports them. That is why the Bloc Québécois supports part 1 of the bill.

On the other hand, some provisions of Bill C‑63 to fight against hate are problematic. The Bloc Québécois fears, as my colleague from Salaberry—Suroît explained, that the provisions of Bill C‑63 might unnecessarily restrict freedom of expression. We want to remind the House that Quebec already debated the subject in 2015. Bill 59, which sought to counter radicalization, was intended to sanction hate speech. Ultimately, Quebec legislators concluded that giving powers to the Commission des droits de la personne et des droits de la jeunesse, as Bill C‑63 would have us do with the Canadian Human Rights Commission, would do more harm than good. The Bloc Québécois is going with the consensus in Quebec on this. It believes that the Criminal Code provisions are more than sufficient to fight against hate speech. Yes, the Bloc Québécois is representing the consensus in Quebec and reiterating it here in the House.

Third, the Bloc Québécois is proposing that Bill C‑63 be divided so that we can debate part 1 separately, as I explained. This is a critical issue. Internet pornography has a disproportionate effect on children, minors and women, and we need to protect them. This part targets sexual content. Online platforms are also targeted in the other parts.

We believe that the digital safety commission must be established as quickly as possible to provide support and recourse for those who are trying to have content about them removed from platforms. We have to help them. By dividing Bill C‑63, we would be able to debate and reach a consensus on part 1 more quickly.

Parts 2, 3 and 4 also contain provisions about hate speech. That is a bit more complex. Part 1 of the bill is well structured. It forces social media operators, including platforms that distribute pornographic material, such as Pornhub, to take measures to increase the security of digital environments. In order to do so, the bill requires social media operators to act responsibly. All of that is very positive.

Part 1 also talks about allowing users to report harmful content to operators based on seven categories defined by the law, so that it can be removed. We want Bill C-63 to be tougher on harmful content, meaning content that sexually victimizes a child or revictimizes a survivor and intimate content communicated without consent. As we have already seen, this content has serious consequences for victims, including post-traumatic stress. We need to take action.

However, part 2 of the bill is more problematic, because it amends the Criminal Code to increase the maximum sentences for hate crimes. The Bloc Québécois finds it hard to see how increasing maximum sentences for this type of crime will have any effect and how it is justified. Introducing a provision that allows life imprisonment for any hate-motivated federal offence is puzzling.

Furthermore, part 2 provides that a complaint can be made against someone when there is a fear they may commit a hate crime, and orders can be made against that person. However, as explained earlier, there are already sections of the Criminal Code that deal with these situations. This part is therefore problematic.

Part 3 allows an individual to file a complaint with the Canadian Human Rights Commission for speech that foments hate, including online speech. As mentioned, the Bloc Québécois has concerns that these provisions may be used to silence ideological opponents.

Part 4 states that Internet service providers must notify the appropriate authority if they suspect that their services are being used for child pornography purposes. In short, this part should also be studied.

In conclusion, the numbers are alarming. According to Statistics Canada, violent hate crimes have increased each year since 2015. Between 2015 and 2021, the total number of victims of violent hate crimes increased by 158%. The Internet is contributing to the surge in hate. However, if we want to take serious action, I think it is important to split Bill C‑63. The Bloc Québécois has been calling for this for a long time. Part 1 is important, but parts 2, 3 and 4 need to be studied separately in committee.

I would like to acknowledge all the work accomplished on this issue by my colleagues. Specifically, I am referring to the member for Drummond, the member for Rivière-du-Nord and the member for Avignon—La Mitis—Matane—Matapédia. We really must take action.

This is an important issue that the Bloc Québécois has been working on for a very long time.

Online Harms Act (Government Orders)

12:40 p.m.

Winnipeg North, Manitoba

Liberal

Kevin Lamoureux Liberal Parliamentary Secretary to the Leader of the Government in the House of Commons

Mr. Speaker, the minister, in introducing the legislation, made it very clear that amendments are something he is open to, as long as they give more strength to the legislation.

In recognition of the fine work that standing committees can do in giving strength to legislation, would it be fair to say that the Bloc's position would be that it is in favour of this legislation, as it currently is, at least at this stage, going to committee? In other words, will the member be voting in favour of the legislation going to committee?

Online Harms Act (Government Orders)

12:40 p.m.

Bloc

Andréanne Larouche Bloc Shefford, QC

Mr. Speaker, I would like to remind my colleague that the Bloc Québécois would have preferred to split the bill in two.

Right now, it is far too problematic to get a proper perspective. We certainly want to study this bill in committee, including parts 2, 3 and 4. The leader of the Bloc Québécois, the member for Beloeil—Chambly, introduced a bill to deal with hate speech. For example, there are two clauses from it that we would have liked to see included in this bill. We would have liked to work on the bill.

The Bloc Québécois made a perfectly reasonable proposal, specifically to split the bill in two in order to work on part 1, which has a much greater consensus. Urgent action is needed on part 1, which deals with sexual crimes involving children online. We have been calling for this for quite some time. We must act.

Some elements of the Criminal Code already apply to parts 2, 3 and 4 of the bill. The Bloc Québécois has also made other proposals. We would like to rework these parts in committee.

Above all, we reiterate the need to split the bill in two, because these are two completely separate issues.

Online Harms Act (Government Orders)

12:40 p.m.

NDP

Laurel Collins NDP Victoria, BC

Mr. Speaker, I want to focus on the part of the bill that addresses hate. In the past few weeks, we have seen horrific attacks on synagogues and Jewish schools, and I have met with community members and leaders from the Jewish community who are scared. They are scared about the rise in anti-Semitism, and a number of them have brought up how online platforms are fuelling this kind of hate. We must address the issues of civil liberties and free speech that are problematic in this bill.

New Democrats want to hold social media giants accountable for their algorithms. Can the member talk a bit about how we also need to strengthen accountability and transparency measures to hold social media platforms accountable?

Online Harms Act (Government Orders)

12:40 p.m.

Bloc

Andréanne Larouche Bloc Shefford, QC

Mr. Speaker, one thing is certain: When we talk about algorithms, it is not so simple.

In my presentation, I explained the issue of hate speech. When it comes to parts 2, 3 and 4 of the bill, we have questions that we want to work on.

It was in fact to deal with anti-Semitism and hate speech against the Jewish community that the Bloc Québécois introduced the member for Beloeil—Chambly's bill.

Then there is the whole issue of freedom of expression, which is critical but certainly not simple. There is a fine line between wanting to take action and knowing how to deal with algorithms without attacking freedom of expression. That is why I think that we need to hear from experts in committee. We need to hear suggestions from experts on these very serious issues. That is such a fine line that we truly need help to walk that line and strike a delicate balance between the two. It is critically important.

Online Harms Act (Government Orders)

June 7th, 2024 / 12:40 p.m.

Bloc

Martin Champoux Bloc Drummond, QC

Mr. Speaker, it is my turn to commend my colleague for her speech and for her work on this issue. I know that she really puts her heart into it. This is something that really concerns her. Like me, she was really looking forward to finally seeing some legislation put forward on this issue.

In her speech, my colleague mentioned an aspect of this bill that is of personal concern to me. I am talking about the increase in maximum sentences for crimes set out in the bill. However, Canada's corrections system is more focused on rehabilitation than on punishment.

I would like to hear my colleague's thoughts on how effective it will be to increase these maximum sentences.

Online Harms Act (Government Orders)

12:40 p.m.

Bloc

Andréanne Larouche Bloc Shefford, QC

Mr. Speaker, that is why we want to divide the bill in two. This is yet another example, in addition to the matter of algorithms that my colleague from Victoria raised. My esteemed colleague from Drummond, with whom I worked on this file, is right. Increasing maximum sentences is an issue of major concern. In fact, that is why we want to examine it in committee. Is that the best solution, or should we focus instead on restorative justice?

Online Harms Act (Government Orders)

12:45 p.m.

NDP

Peter Julian NDP New Westminster—Burnaby, BC

Mr. Speaker, first of all, as we mentioned earlier, the NDP believes that certain aspects of Bill C‑63 are important and will help address a situation that calls for measures to counter online harm. However, other elements of this bill are not as clear and raise important questions.

We feel it is really necessary to pass the bill, send it to committee and give that committee the opportunity to do a thorough review. Parts of this bill are well done, but other parts need clarification and still others raise concerns. We therefore have some reservations.

This bill has been needed for years. The Liberal government promised it within 100 days of the last election, but it took almost three years, as members know. Finally, it has been introduced and is being examined. As parliamentarians, we need to do the work necessary to get answers to the questions people are asking, improve the parts of the bill that need improving and pass those parts that are sorely needed.

If parts of the bill cannot be passed or seem not to be in the public interest after a thorough examination in committee, it is our responsibility to withdraw them. However, there is no question that we need this legislation.

The harm being done to children is definitely rising. The idea that people can approach children, without restriction, to encourage them to self-harm or commit suicide should be something that our society will not tolerate. The fact that we have these web giants or platforms that promote child pornography is unacceptable. It should not be happening in our society. We have to acknowledge the importance of implementing laws to prevent this from happening. Hate speech is another issue. We are seeing a disturbing rise in violence in society, which is often fomented online.

For all of these reasons, we are going to pass this bill at second reading. We are going to send it to committee. This part of the process is very important to us. All answers must be obtained and all necessary improvements to the bill must be made in committee.

I do not think that anyone in the Parliament of Canada would like to vote against the principle of having such legislation in place. In practice, the important role of parliamentarians is to do everything in their power to produce a bill that achieves consensus, with questions answered and the necessary improvements put in place.

There is no doubt about the need for the bill. The NDP has been calling for the bill for years. The government promised it after 100 days. Canadians had to wait over 800 days before we saw the bill actually being presented.

In the meantime, the reality is that we have seen more and more cases of children being induced to harm themselves. This is profoundly disturbing to us, as parents, parliamentarians and Canadians, to see how predators have been going after children in our society. When we are talking about child pornography or inducing children to harm themselves, it is something that should be a profound concern to all of us.

The sharing of intimate content online without permission, in a way that attacks victims, is also something we have been calling for action on. It is important for parliamentarians to take action.

We have seen a steady and disturbing rise in hate crimes. We have seen it in all aspects of racism and misogyny, homophobia and transphobia, anti-Semitism and Islamophobia. All of these toxic sources of hate are rising.

I would note two things. First, the rise in anti-Semitism is mirrored by the rise in Islamophobia. Something we have seen from the far right is that they are attacking all groups.

Second, as the ADL has pointed out, in 2022 and 2023, all the violent acts of mass murder that were ideologically motivated came from the far right in North America. These are profoundly disturbing acts. We have a responsibility to take action.

The fact that the government has delayed the bill for so long is something we are very critical of. The fact that it is before us now means that, as parliamentarians, we have the responsibility to take both the sections of the bill where there is consensus and parts of the bill where there are questions and concerns being raised that are legitimate, and we must ensure that the committee has all the resources necessary, once it is referred to the committee in principle.

That second reading vote is a vote in principle, supporting the idea of legislation in this area. However, it is at the committee stage that we will see all the witnesses who need to come forward to dissect the bill and make sure that it is the best possible legislation. From there, we determine which parts of the bill can be improved, which parts are adequate and which parts, if they raise legitimate concerns and simply do not do the job, need to be taken out.

Over the course of the next few minutes, let us go through where there is consensus and where there are legitimate questions being raised. I want to flag that the issue of resources, which has been raised by every speaker so far today, is something that the NDP takes very seriously as well.

In the Conservative government that preceded the current Liberal government, we saw the slashing of crime prevention funding. This basically meant the elimination of resources that play a valuable role in preventing crimes. In the current Liberal government, we have not seen the resources that need to go into countering online harms.

There are legitimate questions being raised about whether resources are going to be adequate for the bill to do the job that it needs to do. Those questions absolutely need to be answered in committee. If the resources are not adequate, the best bill in the world is not going to do the job to stop online harms. Therefore, the issue of resources is key for the NDP as we move forward.

With previous pieces of legislation, we have seen that the intent was good but that the resources were inadequate. The NDP, as the adults in the House, the worker bees of Parliament, as many people have attested, would then push the Liberal government hard to actually ensure adequate resources to meet the needs of the legislation.

Legislation should never be symbolic. It should accomplish a goal. If we are concerned about online harms, and so many Canadians are, then we need to ensure that the resources are adequate to do the job.

Part 1 of the bill responds to the long-delayed need to combat online harms, and a number of speakers have indicated a consensus on this approach. It is important to note the definitions, which we certainly support, in the intent of part 1 of the bill, which is also integrated into other parts of the bill. The definitions include raising concerns about “content that foments hatred”, “content that incites violence”, “content that incites violent extremism or terrorism”, “content that induces a child to harm themselves”, “content that sexually victimizes a child or revictimizes a survivor”, “content used to bully a child” and “intimate content communicated without consent”.

All of these are, I think it is fair to say, definitions that are detailed in how they address each of those categories. This is, I think, a goal all parliamentarians would share. No one wants to see the continued increase in sexual victimization of children and content that induces a child to harm themselves.

I have raised before in the House the sad and tragic story of Molly Russell. I met with her father and have spoken with the family. The tragic result of her having content forced upon her that led to her ending her own life is a tragedy that we have seen repeated many times, where the wild west of online platforms is promoting, often through secret algorithms, material that is profoundly damaging to children. This is something that is simply unacceptable in any society, yet that content proliferates online. It is often reinforced by secret algorithms.

I would suggest that, while the definitions in the bill are strong concerning the content we do not want to see, whether it is violent extremism or the victimization of children, the reality is that it is not tackling a key element of why this harmful online content expands so rapidly, and with such disturbing strength, and that is the secretive algorithms online platforms use. There is no obligation for these companies to come clean about their algorithms, yet these algorithms inflict profound damage on Canadians, victimize children and, often, encourage violence.

One of the pieces I believe needs to be addressed through the committee process of the bill is why these online platforms have no obligation at all to reveal the algorithms that produce, in such disturbing strength, this profoundly toxic content. The fact is that a child, Molly Russell, was, through the algorithms, constantly fed material that encouraged her to ultimately end her own life, and these companies, these massive corporations, are often making unbelievable profits.

I will flag one more time that Canada continues to indirectly subsidize both Meta and Google, to the tune of a billion dollars a year, even as these online platforms bear no responsibility at all, which is something I find extremely disturbing. These are massive amounts of money, matched by massive profits. We also need to get a handle on these significant subsidies. We see the fact that these algorithms are present, and not dealt with in the legislation, as a major problem.

Second, when we look at other aspects of the bill, the level of detail I have just run through in the definitions in part 1 is not mirrored in part 2 of the bill, which contains the Criminal Code amendments. The Criminal Code provisions have raised concerns because of their lack of definition. The concerns around part 2 firmly need to be dealt with at the committee stage. Answers need to be obtained, and amendments need to be brought to that section. I understand that as part of the committee process there will be rigorous questions asked on part 2. It is a concern that a number of people and organizations have raised. The committee stage of this legislation is going to be crucial to improving, and potentially deleting, parts of the bill, subject to the rigorous questioning that would occur there.

The third part of the bill addresses issues around the Canadian Human Rights Commission. We were opposed to the former Harper government's gutting of the ability of the Human Rights Commission to uphold the Charter of Rights and Freedoms. Under the Charter of Rights and Freedoms, the Constitution that governs our country, Canadians have a right to be free from discrimination. The reality of the Harper government's cuts to that portion of the Canadian Human Rights Commission is something that we found disturbing at the time. The reality is that part 3, the question of resources and whether the Canadian Human Rights Commission has the ability to actually respond to the responsibilities that would come from part 3 of the bill, is something that we want to rigorously question witnesses on. Whether we are talking about government witnesses or the Canadian Human Rights Commission, it is absolutely important that we get those answers before we think of the next steps for part 3.

Finally, there is part 4, an act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service. That section of the bill as well is something that, I think it is fair to say, should receive some level of consensus from parliamentarians.

In short, at second reading, as members well know, the intent of the debate and discussion is to determine whether we agree with the principle of the bill. New Democrats are in agreement with the principle of the bill, though we have broad concerns about certain parts of it. We support in principle the intent of part 1: forcing a greater level of responsibility on the web giants that have profited for so long with such a degree of irresponsibility, by tackling content that incites violence or violent extremism, content that induces a child to harm themselves or that sexually victimizes a child, content used to bully a child, and intimate content communicated without consent.

We look forward to a very rigorous examination at committee with the witnesses we need to bring forward. There is no doubt that there is a need for this bill and we need to proceed as quickly as possible, but only by hearing from the appropriate witnesses and making sure that we have gotten all the answers and made all the improvements necessary to this bill.

Online Harms Act (Government Orders)

1:05 p.m.

Liberal

Ken Hardie Liberal Fleetwood—Port Kells, BC

Mr. Speaker, it was very good to hear the word “quickly” in the hon. member's comments. When something gets posted, it gets propagated at the speed of light. We heard earlier today in the debate that there were questions about using existing mechanisms to deal with this, but existing mechanisms are notoriously slow.

What factors would need to be considered in this bill to, in essence, use the precautionary principle? If it looks awful, there should be a way of dealing with it very quickly and not just leaving it up there while some process works its way through.

Can the hon. member comment on that?

Online Harms Act (Government Orders)

1:05 p.m.

NDP

Peter Julian NDP New Westminster—Burnaby, BC

Mr. Speaker, certainly, but what I am saying is with regard to the rigorous examination of this at the committee stage. When I say “quickly”, I am not talking about, in any way, short-circuiting the important work of committee. That needs to happen.

One of the major concerns I have seen, as the member points out, is that we may have identified content that harms a child, but because algorithm transparency is not touched by the bill, we could well be closing the barn door after the horse has left: the despicable content that harms a child has already been promoted widely by algorithms before it is ultimately taken out of circulation.

However, with the algorithms, it is amplified so quickly and to such a huge extent that this is, I would suggest, a major shortfall in the bill. The U.S. Congress is considering legislation around algorithm transparency. I have a bill in front of the House on algorithm transparency. The reality is we cannot act quickly to save a child if the algorithms have already promoted that harmful content everywhere. That is a major concern and a major shortfall, I believe, in this legislation.

Online Harms ActGovernment Orders

1:05 p.m.

Conservative

Cheryl Gallant Conservative Renfrew—Nipissing—Pembroke, ON

Mr. Speaker, the hon. member mentioned the Human Rights Tribunal. Would calling online for the elimination of the State of Israel land someone before the Human Rights Tribunal? Would posting "from the river to the sea", which refers to the dismantling of Israel or the removal or extermination of its Jewish population, do the same?

Online Harms ActGovernment Orders

1:05 p.m.

NDP

Peter Julian NDP New Westminster—Burnaby, BC

Mr. Speaker, I think this is why we need to have the rigorous committee process. I know Conservatives will try to throw out lines and ask, “Does this matter? Does this matter?”

With regard to the important aspect of definitions, if we just look through part 1 of the bill, it is very clear which definitions apply; I am sure the member has read the bill and knows them. In terms of what happens around the Criminal Code, we have concerns about the definitions, and we need to be very clear about that.

Conservatives will take that issue of clarity and try to exploit it. I think it is important, as adults in the room, as legislators, as parliamentarians, that we go through that rigorous committee process and that we ensure that questions are answered. I do not believe that the kind of speculation that Conservatives do is helpful at all. Let us get the work done around the bill. It is definitely needed to combat online harms. Let us make sure the definitions are clear and concise.

Online Harms ActGovernment Orders

1:05 p.m.

Bloc

Martin Champoux Bloc Drummond, QC

Mr. Speaker, I know that my colleague from New Westminster—Burnaby also cares about regulating what happens on the web. We had the opportunity to work together at the Standing Committee on Canadian Heritage on various topics that have to do with this issue.

We have been waiting for Bill C‑63 for a long time. I think that there is consensus on part 1. As the Bloc Québécois has been saying all day, it is proposing that we split the bill in order to quickly pass part 1, which is one part we all agree on.

The trouble is with part 2 and the subsequent parts. There are a lot of things that deserve to be discussed. There is one in particular that raises a major red flag, as far as I am concerned. It is the idea that a person could file a complaint because they fear that at some point, someone might utter hate speech or commit a crime as described in the clauses of the bill. A complaint could be filed simply on the presumption that a person might commit this type of crime.

To me, that seems to promote a sort of climate of accusation that could lead to paranoia. It makes me think of the movie Minority Report. I am sure my colleague has heard of it. I would like his impressions of this type of thing that we find in Bill C‑63.

Online Harms ActGovernment Orders

1:10 p.m.

NDP

Peter Julian NDP New Westminster—Burnaby, BC

Mr. Speaker, that is why we would like the bill to go to committee for a thorough study, because it is important in the context of this bill.

That said, we know that hate crimes are on the rise. We are seeing more and more anti-Semitism, Islamophobia, racism, misogyny, homophobia, transphobia, and so on. That is why it is important to have clear definitions in the bill.

At this stage of the bill's consideration, we are being asked to vote on the principle of the bill. The bill seeks to reduce online harm, and we agree with that principle. However, there are still many questions and details to be studied. We will have the opportunity to amend the bill in committee to remove certain parts or add others. There is still a lot of work to be done. The NDP wants to refer the bill to committee so that we can begin that work.

Online Harms ActGovernment Orders

1:10 p.m.

NDP

Alexandre Boulerice NDP Rosemont—La Petite-Patrie, QC

Mr. Speaker, I thank my NDP colleague from New Westminster—Burnaby for his speech and his involvement in this serious issue.

Unfortunately, we have more proof that the Liberals are dragging their feet and waiting to take action. Online hate is a real problem. Many children and teenagers are experiencing social media in harmful, aggressive and damaging ways. These young people are often the victims of cyberbullying and cyber-attacks, which create very tense situations. The Liberals have not done anything about that.

My colleague is right in saying the Liberals missed something in this bill. The Minister of Justice does not see it. The algorithms are creating echo chambers where people with far-right perspectives, who are racist, homophobic, transphobic and sexist, feed off each other. For example, the phenomenon of fake news is on the rise. The Liberals do not dare touch the issue of secret algorithms.

Why does my colleague think that the Liberals do not dare take that fundamental step in the fight against online hate?

Online Harms ActGovernment Orders

1:10 p.m.

NDP

Peter Julian NDP New Westminster—Burnaby, BC

Mr. Speaker, that is a really great question from my colleague from Rosemont—La Petite-Patrie.

I know that he has done a lot of work to protect children. As a father, it is important for my colleague to ensure that children are not inundated with toxic content that encourages them to self-harm or to commit suicide. It is appalling to see what is out there.

My colleague is right to talk about the Liberals' abject failure. Everyone heard the Prime Minister say in 2021 that he was going to introduce a bill within 100 days to counter all the attacks, the hate crimes and the attacks on children that we are seeing. It took another two years.

Furthermore, the Liberals did not touch on the real profit maker for the web giants: the algorithms. Algorithms rake in incredible profits for these companies. They did not seem to want to look at this key element, and we can speculate as to why. However, we want to get answers to this question, and that is something we are going to do in committee.

Online Harms ActGovernment Orders

1:10 p.m.

Winnipeg North Manitoba

Liberal

Kevin Lamoureux Liberal Parliamentary Secretary to the Leader of the Government in the House of Commons

Mr. Speaker, it is a pleasure to be able to rise and speak to Bill C-63.

We often talk about the communities and neighbourhoods in which we live. We do this not only as parliamentarians but also as politicians in general, whether at the municipal, provincial, or federal level. We talk about how we want people to feel safe. People need to feel safe in their homes, in their communities and in the places where they live. That has always been a priority for the current government and, I would like to think, for all parliamentarians of all political stripes. However, sometimes we need to look at finding a better definition of what we mean when we talk about keeping people safe in our communities.

The Internet is a wonderful thing, and it plays a critical and important role in society today. In fact, I would argue that, nowadays, it is an essential service that is virtually required in all communities. We see provincial and national governments investing greatly to ensure that there is more access to the Internet. We have become more and more dependent on it in so many different ways. It is, for all intents and purposes, a part of the community.

I could go back to the days when I was a child, and my parents would tell me to go outside and play. Yes, I would include my children as having been encouraged to go outside and play. Then things such as Nintendo came out, and people started gravitating toward the TV and playing computer games. I have grandchildren now, and I get the opportunity to see my two grandsons quite a bit. I can tell members that, when I do, I am totally amazed at what they are participating in on the Internet and with respect to technology. There are incredible programs associated with it, from gaming to YouTube, that I would suggest are a part of the community. Therefore, when we say that we want to protect our children in our communities when they are outside, we also need to protect them when they are inside.

It is easy for mega platforms to say it is not their responsibility but that of the parent or guardian. From my perspective, that is a cop-out. We have a responsibility here, and we need to recognize that responsibility. That is what Bill C-63 is all about.

Some people will talk about freedom of speech and so forth. I am all for freedom of speech. In fact, I just got an email from a constituent who is quite upset about how the profanity and flags displayed by a particular vehicle driving around are promoting all sorts of nastiness in the community. I indicated to them that freedom of speech entitles that individual to do that.

I care deeply about the fact that we, as a political party, brought in the Charter of Rights and Freedoms, which guarantees freedom of speech and expression. At the end of the day, I will always advocate for freedom of speech, but there are limitations. I believe that, if we look at Bill C-63, we can get a better sense of the types of limitations the government is talking about. Not only that, but I believe they are a reflection of a lot of the work that has been put together in order to bring the legislation before us today.

I understand some of the comments that have been brought forward, depending on which political parties addressed the bill so far. However, the minister himself has reinforced that this is not something that was done on a napkin; it is something that has taken a great deal of time, effort and resources to make sure that we got it right. The minister was very clear about the consultations that were done, the research that took a look at what has been done in other countries, and what is being said here in our communities. There are a great number of people who have been engaged in the legislation. I suspect that once it gets to committee we will continue to hear a wide spectrum of opinions and thoughts on it.

I do not believe that, as legislators, we should be put off to such a degree that we do not take action. I am inclined to agree with the minister in saying that this is a holistic approach to dealing with an important issue. We should not be looking at ways to divide the legislation. Rather, we should be looking at ways it can be improved. The minister himself, earlier today, said that if members have ideas or amendments they believe will give more strength to the legislation, then let us hear them. Bring them forward.

Often there is a great deal of debate on something at second reading and not as much at third reading. I suggest that the legislation before us is the type of bill it would be beneficial to pass relatively quickly out of second reading, after some members have had the opportunity to provide their thoughts, in favour of more debate time at third reading and, more specifically, more time at the committee stage. That would allow, for example, members the opportunity to have discussions with constituents over the summer, knowing full well that the bill is at committee. I think there is a great deal of merit to that.

There was something that spoke volumes to me about keeping the community safe and about the impact the Internet has today on our children in particular. Platforms have a responsibility, and we have to ensure that they are living up to that responsibility.

I want to speak about Carol Todd, the mother of Amanda Todd, to whom reference has been made already. Ultimately, I believe, she is one of the primary reasons why the legislation is so critically important. Amanda Michelle Todd was born November 27, 1996, and passed away October 10, 2012. Colleagues can do the math. She was a 15-year-old Canadian student and a victim of cyber-bullying who hanged herself at her home in Port Coquitlam, British Columbia. There is a great deal of information on the Internet about Amanda. I thank her mother, Carol, for having the courage to share the story of her daughter, because it is quite tragic.

I think there is a lot of blame that can be passed around, whether it is to the government, the private sector or society, including individuals. Carol Todd made reference to the thought that her daughter Amanda might still actually be alive if, in fact, Bill C-63 had been law at the time. She said, “As a mom, and having gone through the story that I've gone through with Amanda, this needs to be bipartisan. All parties in the House of Commons need to look in their hearts and look at young Canadians. Our job is to protect them. And parents, we can't do it alone. The government has to step in and that's what we are calling for.”

That is a personal appeal, and it is not that often I will bring up a personal appeal of this nature. I thought it was warranted because I believe it really amplifies and humanizes why this legislation is so important. Some members, as we have seen in the debate already, have indicated that they disagree with certain aspects of the legislation, and that is fine. I can appreciate that there will be diverse opinions on this legislation. However, let us not use that as a way to ultimately prevent the legislation from moving forward.

Years of consultation and work have been put into the legislation to get it to where it is today. I would suggest, given we all have had discussions related to these types of issues, during private members' bills or with constituents, we understand the importance of freedom of speech. We know why we have the Charter of Rights. We understand the basics of hate crime and we all, I believe, acknowledge that freedom of speech does have some limitations to it.

I would like to talk about some of the things we should think about, in terms of responsibilities, when we think about platforms. I want to focus on platforms in my last three minutes. Platforms have a responsibility to be responsible. It is not all about profit. There is a societal responsibility that platforms have, and if they are not prepared to take it upon themselves to be responsible, then the government does need to take more actions.

Platforms need to understand and appreciate that there are certain aspects of society, and here we are talking about children, that need to be protected. Platforms cannot pass the buck on to parents and guardians. Yes, parents and guardians have the primary responsibility, but the Internet never shuts down. Even parents and guardians have limitations. Platforms need to recognize that they also have a responsibility to protect children.

Content that sexually victimizes children, and intimate content shared without consent, are the types of things on which platforms have to do due diligence. When the issue is raised with platforms, there is a moral and, with the passage of this legislation, a legal obligation for them to take action. I am surprised it has taken this type of legislation to hit that point home. At the end of the day, whether a life is lost, people are bullied, or depression and mental health issues are caused by content of that nature, platforms have to take responsibility.

There are other aspects that we need to be very much aware of. Content inciting violent extremism or terrorism needs to be flagged. Content that induces a child to harm themselves also needs to be flagged. As has been pointed out, this legislation would have a real, positive, profound impact without taking away anyone's freedom of speech. It does not apply to private conversations or communications.

I will leave it at that and continue at a later date.

The House proceeded to the consideration of Bill C-323, An Act to amend the Excise Tax Act (mental health services), as reported (without amendment) from the committee.

Excise Tax ActPrivate Members' Business

1:30 p.m.

Conservative

The Deputy Speaker Conservative Chris d'Entremont

There being no amendment motions at report stage, the House will now proceed, without debate, to the putting of the question on the motion to concur in the bill at report stage.

Excise Tax ActPrivate Members' Business

1:30 p.m.

Conservative

Stephen Ellis Conservative Cumberland—Colchester, NS

moved that the bill be concurred in.

Excise Tax ActPrivate Members' Business

1:30 p.m.

Conservative

The Deputy Speaker Conservative Chris d'Entremont

If a member participating in person wishes that the motion be carried or carried on division, or if a member of a recognized party participating in person wishes to request a recorded division, I would invite them to rise and indicate it to the Chair.