House of Commons Hansard #327 of the 44th Parliament, 1st Session.


Online Harms Act (Government Orders)

10 a.m.

Parkdale—High Park, Ontario

Liberal

Arif Virani (Liberal), Minister of Justice and Attorney General of Canada

moved that Bill C-63, An Act to enact the online harms act, to amend the Criminal Code, the Canadian Human Rights Act and An Act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service and to make consequential and related amendments to other Acts, be read the second time and referred to a committee.

Mr. Speaker, hon. colleagues, I am very pleased today to speak to Bill C-63, the online harms act. I speak today not only as a minister and as a fellow parliamentarian, but also as a father, as a South Asian and as a Muslim Canadian.

There are a few moments in this place when our work becomes very personal, and this is one such moment for me. Let me explain why. I ran for office for a number of reasons in 2015. Chief among them was to fight against discrimination and to fight for equality in what I viewed as an increasingly polarized world. In recent years, we have seen that polarization deepen and that hatred fester, including at home here in Canada.

I would never have fathomed that in 2024, Canada would actually lead the G7 in the number of deaths attributable to Islamophobia. Among our allies, it is Canada that has experienced the most fatal attacks against Muslims in the G7. There have been 11. Those were 11 preventable deaths. I say “preventable” because in the trials of both the Quebec mosque shooter, who murdered six men on January 29, 2017, and the man who murdered four members of the Afzaal family in London, Ontario, the attackers admitted, in open court, to having been radicalized online. They admitted what so many of us have always known to be the case: Online hatred has real-world consequences.

Yesterday was the third anniversary of the attack on the Afzaal family, an attack described by the presiding judge as “a terrorist act”. In memory of Talat, Salman, Yumna and Madiha, who lost their lives to an act of hatred on June 6, 2021, we are taking action.

Bill C-63, the online harms act, is a critical piece of that action. This bill is the product of years of work.

We held consultations for over four years. We talked to victims' groups, advocacy groups, international partners, people from the technology industry and the general public. We organized a nationwide consultation and held 19 national and regional round tables. We published a report about what we learned. We listened to the recommendations of our expert advisory group on online safety, a diverse think tank made up of experts who are respected across Canada. We were given valuable advice and gained a great deal of knowledge thanks to those consultations, and all of that informed the development of Bill C-63.

Many of our international partners, such as the United Kingdom, Australia, Germany, France and the European Union, have already done considerable legislative work to try to limit the risks of harmful content online. We learned from their experience and adapted the best parts of their most effective plans to the Canadian context.

We have also learned what did not work abroad, like the immediate takedown of all types of harmful content, originally done in Germany; or like the overbroad restriction on freedom of speech that was struck as unconstitutional in France. We are not repeating those errors here. Our approach is much more measured and reflects the critical importance of constitutionally protected free expression in Canada's democracy. What we learned from this extensive consultation was that the Internet and social media platforms can be a force for good in Canada and around the world. They have been a tool for activists to defend democracy. They are platforms for critical expression and for critical civic discourse. They make learning more accessible to everyone.

The Internet has made people across our vast world feel more connected to one another, but the Internet also has a dark side. Last December, the RCMP warned of an alarming spike in online extremism among young people in Canada and the radicalization of youth online. We know that the online environment is especially dangerous for our most vulnerable. A recent study by Plan International found that 58% of girls have experienced harassment online.

Social media platforms are used to exploit and disseminate devastating messages with tragic consequences. This is because of one simple truth. For too long, the profits of platforms have come before the safety of users. Self-regulation has failed to keep our kids safe. Stories of tragedy have become far too common. There are tragic consequences, like the death of Amanda Todd, a 15-year-old Port Coquitlam student who died by suicide on October 10, 2012, after being exploited and extorted by more than 20 social media accounts. This relentless harassment started when Amanda was just 12 years old, in grade 7.

There was Carson Cleland last fall. He was the same age as my son at the time: 12 years old. Carson made a mistake. He shared an intimate image with someone whom he thought was a friend online, only to find himself caught up in a web of sextortion from which he could not extricate himself. Unable to turn to his parents, too ashamed to turn to his friends, Carson turned on himself. Carson is no longer with us, but he should be with us.

We need to do more to protect the Amanda Todds and the Carson Clelands of this country, and with this bill, we will. I met with the incredible people at the Canadian Centre for Child Protection earlier this year, and they told me that they receive 70 calls every single week from scared kids across Canada in situations like Amanda's and like Carson's.

As the father of two youngsters, this is very personal for me. As they grow up, my 10-year-old and 13-year-old boys spend more and more time on screens. I know that my wife and I are not alone in this parenting struggle. It is the same struggle that parents are facing around the country.

At this point, there is no turning back. Our children and teens are being exposed to literally everything online, and I feel a desperate need, Canadians feel a desperate need, to do a better job of protecting those kids online. That is precisely what we are going to do with this bill.

Bill C-63 is guided by four important objectives. First, it aims to reduce exposure to harmful content online and to empower and support users. Second, it would address and denounce the rise in hatred and hate crimes. Third, it would ensure that victims of hate have recourse to improved remedies. Fourth, it would strengthen the reporting of child sexual abuse material to enhance the criminal justice response to this heinous crime.

The online harms act will address seven types of harmful content based on categories established over more than four years of consultation.

Not all harms will be treated the same. Services will be required to quickly remove content that sexually victimizes a child or that revictimizes a survivor, as well as to remove what we call “revenge porn”, including sexual deepfakes. There is no place for this material on the Internet whatsoever.

For other types of content, like content that induces a child to self-harm or material that bullies a child, we are placing a duty on platforms to protect children. This means a new legislative and regulatory framework to ensure that social media platforms reduce exposure to harmful, exploitative content on their platforms. This means putting in place special protections for children. It also means that platforms will have to make sure that users have the tools and the resources they need to report harmful content.

To fulfill the duty to protect children, social media platforms will have to integrate age-appropriate design features to make their platforms safer for children to use. This could mean defaults for parental controls and warning labels for children. It could mean security settings for instant messaging for children, or it could mean safe-search settings.

Protecting our children is one of our most important duties that we undertake as lawmakers in this place. As a parent, it literally terrifies me that the most dangerous toys in my home, my children's screens, are not subject to any safety standards right now. This needs to change, and it would change with the passage of Bill C-63.

It is not only that children are subject to horrible sexual abuse and bullying online, but also that they are exposed to hate and hateful content, as are Internet users of all ages and all backgrounds, which is why Bill C-63 targets content that foments hatred and incitements to violence as well as incitements to terrorism. This bill would not require social media companies to take down this kind of harmful content; instead, the platforms would have to reduce exposure to it by creating a digital safety plan, disclosing to the digital safety commissioner what steps they are putting in place to reduce risk and reporting back on their progress.

The platforms would also be required to give users practical options for recourse, like tools to either flag or block certain harmful material from their own feeds. This is key to ensuring community safety, all the more so because they are backed by significant penalties for noncompliance. When I say “significant”, the penalties would be 6% of global revenue or $10 million, whichever is higher, and in the instance of a contravention of an order from the digital safety commission, those would rise to 8% of global revenue or $25 million, again, whichever is higher.

The online harms act is an important step towards a safer, more inclusive online environment, where social media platforms actively work to reduce the risk of user exposure to harmful content on their platforms and help to prevent its spread, and where, as a result, everyone in Canada can feel safer to express themselves openly. This is critical because, at its heart, this initiative is about promoting expression and participation in the civic discourse that occurs online. We can think about Carla Beauvais and the sentiments she expressed when she stood right beside me when we tabled this legislation in February, and the amount of abuse she faced for voicing her concerns about the George Floyd incident in the United States, which cowed her and prevented her from participating online. We want her voice added to the civic discourse. Right now, it has been removed.

The online harms act will regulate social media services, the primary purpose of which is to enable users to share publicly accessible content, services that pose the greatest risk of exposing the greatest number of people to harmful content.

This means that the act would apply to social media platforms, such as Facebook, X and Instagram; user-uploaded adult content services, such as Pornhub; and livestreaming services, such as Twitch. However, it would not apply to any private communications, meaning private texts or direct private messaging on social media apps, such as Instagram or Facebook Messenger. It is critical to underscore, again, that this is a measured approach that does not follow the overreach seen in other countries we have studied, in terms of how they embarked upon this endeavour. The goal is to target the largest social media platforms, the places where the most people in Canada are spending their time online.

Some ask why Bill C-63 addresses both online harms and hate crimes, which can happen both on and off-line. I will explain this. Online dangers do not remain online. We are seeing a dramatic rise in hate crime across our country. According to Statistics Canada, the number of police-reported hate crimes increased by 83% between 2019 and 2022. B'nai Brith Canada reports an alarming 109% increase in anti-Semitic incidents from 2022 to 2023. In the wake of October 7, 2023, I have been hearing frequently from Jewish and Muslim groups, which are openly questioning whether it is safe to be openly Jewish or Muslim in Canada right now. This is not tenable. It should never be tolerated, yet hate-motivated violence keeps happening. People in Canada are telling us to act. It is up to us, as lawmakers, to do exactly that.

We must take concrete action to better protect all people in Canada from harms, both online and in our communities. We need better tools to deal with harmful content online that foments violence and destruction. Bill C-63 gives law enforcement these much-needed tools.

The Toronto Police Service has expressed its open support for Bill C-63 because it knows it will make our communities safer. Members of the Afzaal family have expressed their open support for Bill C-63 because they know the Islamophobic hate that causes someone to kill starts somewhere, and it is often online.

However, we know there is no single solution to the spread of hatred on and off-line. That is why the bill proposes a number of different tools to help stop the hate. It starts with the Criminal Code of Canada. Bill C-63 would amend the Criminal Code to better target hate crime and hate propaganda. It would do this in four important ways.

First, it would create a new hate crime offence. Law enforcement has asked us for this tool, so they can call a hate crime a hate crime when laying a charge, rather than as an afterthought at sentencing. This new offence will also help law enforcement track the actual number of hate-motivated crimes in Canada. That is why they have appealed to me to create a free-standing hate crime offence in a manner that replicates what already exists in 47 of the 50 states south of the border. A hate-motivated assault is not just an assault. It is a hate crime and should be recognized as such on the front end of a prosecution.

Second, Bill C‑63 would increase sentences for the four existing hate speech offences. These are serious offences, and the sentences should reflect that.

Third, Bill C-63 would create a recognizance to keep the peace, which is specifically designed to prevent any of the four hate propaganda offences and the new hate crime offence from being committed.

This would be modelled on existing peace bonds, such as those used in domestic violence cases, and would require someone to have a reasonable fear that these offences would be committed. The threshold of “reasonable fear” is common to almost all peace bonds.

In addition, as some but not all peace bonds do, this would require the relevant attorney general to give consent before an application is made to a judge to impose a peace bond on a person. This ensures an extra layer of scrutiny in the process.

Finally, the bill would codify a definition of hatred for hate propaganda offences and for the new hate crime offence, based on the definition the Supreme Court of Canada created in its seminal decisions in R. v. Keegstra and in Saskatchewan Human Rights Commission v. Whatcott. The definition sets out not only what hatred is but also what it is not, thereby helping Canadians and law enforcement to better understand the scope of these offences.

The court has defined hate speech as content that expresses detestation or vilification of an individual or group on the basis of grounds such as race, national or ethnic origin, religion and sex. It only captures the most extreme and marginal type of expression, leaving the entirety of political and other discourse almost untouched. That is where one will find the category of content that some have called “awful but lawful”. This is the stuff that is offensive and ugly but is still permitted as constitutionally protected free expression under charter section 2(b). This category of content is not hate speech under the Supreme Court's definition.

I want to make clear what Bill C‑63 does not do. It does not undermine freedom of expression. It strengthens freedom of expression by allowing all people to participate safely in online discussions.

Bill C-63 would provide another tool as well. It would amend the Canadian Human Rights Act to define a new discriminatory practice of communicating hate speech online. The legislation makes clear that hate does not encompass content that merely discredits, humiliates, hurts or offends, but where hate speech does occur, there would be a mechanism through which an individual could ask that those expressions of hate be removed. The CHRA amendments are not designed to punish anyone. They would simply give Canadians a tool to get hate speech removed.

Finally, Bill C-63 would modernize and close loopholes in the mandatory reporting act. This would help law enforcement investigate child sexual abuse and exploitation more effectively and bring perpetrators to justice by requiring that information be retained longer and that social media companies report child sexual abuse material (CSAM) to the RCMP.

There is broad support for the online harms act. When I introduced the legislation in February, I was proud to have at my side the Centre for Israel and Jewish Affairs and the National Council of Canadian Muslims. Those two groups have had vast differences in recent months, but on the need to fight hatred online, they are united. The same unity has been expressed by both Deborah Lyons, the special envoy on preserving Holocaust remembrance and combatting anti-Semitism, and Amira Elghawaby, the special representative on combatting Islamophobia.

The time to combat all forms of online hate is now. Hatred that festers online can result in real-world violence. I am always open to good-faith suggestions on how to improve the bill. I look forward to following along with the study of the legislation at the committee stage. I have a fundamental duty to uphold the charter protection of free expression and to protect all Canadians from harm. I take both duties very seriously.

Some have urged me to split Bill C-63 in two, dealing only with the provisions that stop sexually exploitative material from spreading and throwing away measures that combat hate. To these people, I say that I would not be doing my job as minister if I failed to address the rampant hatred on online platforms. It is my job to protect all Canadians from harm. That means kids and adults. People are pleading for relief from the spread of hate. It is time we acted.

Bill C-63 is a comprehensive response to online harms and the dangerous hate we are seeing spreading in our communities. We have a duty to protect our children in the real world. We must take decisive action to protect them online as well, where the dangers can be just as pernicious, if not more so. Such action starts with passing Bill C-63.

Online Harms Act (Government Orders)

June 7th, 2024 / 10:20 a.m.

Conservative

Michelle Rempel Conservative Calgary Nose Hill, AB

Mr. Speaker, the bill has received widespread condemnation from groups of all political stripes because it forces Canadians to make unnecessary trade-offs between their security and their charter rights. As well, the bill would force much-needed reforms into a long, onerous regulatory process with no clear end in sight. There are people watching this today who will fear deepfaked intimate images being used to harass and bully them in their high schools.

The government could have made a small amendment to the Criminal Code to update existing laws to protect Canadians in the digital age, but it has chosen this onerous, widely panned approach instead of protecting Canadians' rights. Why?

Online Harms Act (Government Orders)

10:20 a.m.

Liberal

Arif Virani Liberal Parkdale—High Park, ON

Mr. Speaker, I would say categorically that this is a misconstruction of the legislation and what it would do. This legislation would uphold freedom of expression. Freedom of speech in this country, as of right now, does not include hateful speech. That is protected against in the physical world. We are transposing that protection into the online world to directly address the needs of the very people that she just mentioned in those schools in Alberta.

With respect to deepfakes, we are taking an additional step by entrenching that language in the legislation. That was done intentionally because deepfakes are being used against children, adolescents and adults to silence them. I know the member is a strong advocate for women's empowerment and women's voices in civic discourse. Deepfakes are being used right now against Alexandria Ocasio-Cortez and Prime Minister Meloni in Italy.

Regardless of one's views of their political positions, etc., the point is that when the leader of a G7 country is being limited in terms of their ability to participate in civic and political discourse via deepfakes, we need to take action. We are taking that action in a comprehensive bill and a comprehensive measure that would address and empower freedom of expression rather than limiting it.

Online Harms Act (Government Orders)

10:20 a.m.

Bloc

Claude DeBellefeuille Bloc Salaberry—Suroît, QC

Mr. Speaker, the Bloc Québécois believes that Bill C-63 tackles two major online scourges and that it is time for us, as legislators, to take action to stamp them out.

The Bloc Québécois strongly supports part 1 of the bill, in other words, all provisions related to addressing child pornography and the communication of pornographic content without consent. As we see it, this part is self-evident. It has garnered such strong consensus that we told the minister, through our critic, the member for Rivière-du-Nord, that we not only support it, but we are also prepared to pass part 1 quickly and to facilitate its passage.

As for part 2, however, we have some reservations. We consider it reasonable to debate this part in committee. The minister can accuse other political parties of playing politics with part 2, but not the Bloc Québécois. We sincerely believe that part 2 needs to be debated. We have questions. We have doubts. I think our role calls on us to get to the bottom of things.

That is why we have asked the minister—and why we are asking him again today—to split Bill C‑63 in two, so that we can pass part 1 quickly and implement it, and set part 2 aside for legislative and debate-related purposes.

Online Harms Act (Government Orders)

10:25 a.m.

Liberal

Arif Virani Liberal Parkdale—High Park, ON

Mr. Speaker, I thank my colleague opposite for her question, and I appreciate the position of the Bloc Québécois.

I want to emphasize three points.

First, the aspect that affects children also affects teens and adults. In other words, hatred is a problem for children, teenagers and adults. Hatred is not exclusive to any particular age. That is the first thing.

Second, the member is suggesting that a comprehensive study is needed, with witnesses and consultations, to see if we can improve the bill. I could not agree more, but it is not just part 2 that needs to be thoroughly studied. We need a comprehensive study of all aspects of this bill. We need to examine the bill in its entirety.

Third, as I mentioned at the outset, Canada is not the first country to move in this direction. Australia took its first steps in 2015, beginning with protecting children only. Nine years later, in 2024, Australia is addressing the issue more broadly. In 2024, Canada needs to address all aspects. Harmful content is by no means limited to content directed at children.

Online Harms Act (Government Orders)

10:25 a.m.

NDP

Peter Julian NDP New Westminster—Burnaby, BC

Mr. Speaker, the NDP finds that the government delayed introduction of this bill for far too long. We want it to be referred to committee for a comprehensive study.

There are some parts that we fully support. There are others that deal with the Criminal Code, for example, that will truly require a comprehensive study in committee. We have to make sure we take the time that is needed.

That being said, the bill is missing certain aspects, which is a bit surprising. I am talking about transparency with respect to algorithms. As the minister knows, hate and other such things are often amplified by algorithms that promote the kind of content that adversely affects people. This is not being addressed in the bill.

I would like the minister to tell us why this important aspect of algorithms and transparency is not being addressed so that we can determine precisely why some hateful content or harmful content is promoted on certain platforms.

Online Harms Act (Government Orders)

10:25 a.m.

Liberal

Arif Virani Liberal Parkdale—High Park, ON

Mr. Speaker, I want to note that the time it took to develop this bill and bring it before the House for debate was directly related to the consultations we held around the world. That is why it took four years to prepare this bill.

Also, with respect to the transparency of social media and platforms, I would like to note three specific points.

First, the bill specifically seeks to enable the digital safety commissioner to authorize academic researchers to access data anonymously to verify what is happening on platforms with their own algorithms. Second, the digital safety commissioner will be responsible for ensuring that the platforms actually follow the digital safety plan. Third, every user will have tools to inform platforms that certain content is harmful and to prevent content from a specific author from appearing on their feed.

We are therefore broadening many aspects related to algorithm transparency. If other measures should be taken, I am quite willing to consider amendments that are presented in good faith in committee on how to improve transparency on this front.

Online Harms Act (Government Orders)

10:30 a.m.

Mississauga—Streetsville, Ontario

Liberal

Rechie Valdez (Liberal), Minister of Small Business

Mr. Speaker, a few weeks ago I had the opportunity to visit a school in my riding in response to letters that some nine-year-olds and 10-year-olds had written to me. In their classroom, I asked the kids whether they knew about cyber-bullying, and all of them raised their hands because all of them had experienced it or knew to some degree what cyber-bullying is like.

While I was talking about the topic of cyber-bullying, there was a young boy the age of my daughter, nine years old. He raised his hand and shared with me that on his birthday, he had received new VR glasses to use with his video game. He shared that while he was in his online space and was minding his own business, someone approached him online and did things to him repeatedly that were not nice. Needless to say, when I asked him what he did after this happened to him, the young man said he did not do anything and that he decided not to play video games ever again.

The reason I am sharing his testimony is that I would like to ask the hon. minister what the bill would do to help protect kids just like the one I spoke to at the elementary school.

Online Harms Act (Government Orders)

10:30 a.m.

Liberal

Arif Virani Liberal Parkdale—High Park, ON

Mr. Speaker, what I can say is that my heart breaks just listening to that. It is at the heart of the bill. The bill would entrench a duty to protect children, a duty to remove content that would target children. In terms of what the child who was mentioned experienced, one can rest assured that it is not an anomaly in Mississauga. Kids around Canada and around the world are facing this type of situation all the time.

We would never tolerate someone's lurking around a schoolyard or contacting our kids by telephone at midnight, yet that is what is occurring online all the time. The fact that the bill takes a hard look at child sex predators and at those who would spread revenge porn, and that it would entrench a duty to protect children, is in fact the exact step we need to take. That is what Canadian parents are demanding. I hope every parliamentarian in this chamber will get behind the important bill before us.

Online Harms Act (Government Orders)

10:30 a.m.

Conservative

Michelle Rempel Conservative Calgary Nose Hill, AB

Mr. Speaker, we must protect Canadians in the digital age, but Bill C-63 is not the way to do it. It would force Canadians to make unnecessary trade-offs between the guarantee of their security and their charter rights. Today I will explain why Bill C-63 is deeply flawed and why it would not protect Canadians' rights sufficiently. More importantly, I will present a comprehensive alternative plan that is more respectful of Canadians' charter rights and would provide immediate protections for Canadians facing online harms.

The core problem with Bill C-63 is how the government has chosen to frame the myriad harms that occur in the digital space as homogeneous and as capable of being solved with one approach or piece of legislation. In reality, harms that occur online are an incredibly heterogeneous set of problems requiring a multitude of tailored solutions. It may sound like the latter might be more difficult to achieve than the former, but this is not the case. It is relatively easy to inventory the multitude of problems that occur online and cause Canadians harm. From there, it should be easy to sort out how existing laws and regulatory processes that exist for the physical world could be extended to the digital world.

There are few, if any, examples of harms that are being caused in digital spaces that do not already have existing relatable laws or regulatory structures that could be extended or modified to cover them. Conversely, what the government has done for nearly a decade is try to create new, catch-all regulatory, bureaucratic and extrajudicial processes that would adapt to the needs of actors in the digital space instead of requiring them to adapt to our existing laws. All of these attempts have failed to become law, which is likely going to be the fate of Bill C-63.

This is a backward way of looking at things. It has caused nearly a decade of inaction on much-needed modernization of existing systems and has translated into law enforcement's not having the tools it needs to prevent crime, which in turn causes harm to Canadians. It has also led to a balkanization of laws and regulations across Canadian jurisdictions, a loss of investment due to the uncertainty, and a lack of coordination with the international community. Again, ultimately, it all harms Canadians.

Bill C-63 takes the same approach, listing only a few of the harms that happen in online spaces and creating a new, onerous and opaque extrajudicial bureaucracy, while creating deep problems for Canadian charter rights. For example, Bill C-63 would create a new "offence motivated by hatred" provision that could see a life sentence applied to minor infractions under any act of Parliament, a parasitic provision that would be unchecked in the scope of the legislation. This means that words alone could lead to life imprisonment.

While the government has attempted to argue that this is not the case, saying that a serious underlying act would have to occur for the provision to apply, that is simply not how the bill is written. I ask colleagues to look at it. The bill seeks to amend section 320 of the Criminal Code, and reads, “Everyone who commits an offence under this Act or any other Act of Parliament...is guilty of an indictable offence and liable to imprisonment for life.”

At the justice committee earlier this year, the minister stated:

...the new hate crime offence captures any existing offence if it was hate-motivated. That can run the gamut from a hate-motivated theft all the way to a hate-motivated attempted murder. The sentencing range entrenched in Bill C-63 was designed to mirror the existing...options for all of these potential underlying offences, from the most minor to the most serious offences on the books....

The minister continued, saying, "this does not mean that minor offences will suddenly receive...harsh sentences. However, sentencing judges are required to follow legal principles, and hate-motivated murder will result in a life sentence. A minor infraction will...not result in it."

In this statement, the minister admitted both that the new provision could be applied to any act of Parliament, as the bill states, and that the government would be relying upon the judiciary to ensure that maximum penalties were not levelled against a minor infraction. Parliament cannot afford to let the government be this lazy, and by that I mean not spelling out exactly what it intends a life sentence to apply to in law, as opposed to handing a highly imperfect judiciary an overbroad law that could have extreme, negative consequences.

Similarly, a massive amount of concern from across the political spectrum has been raised regarding Bill C-63's introduction of a so-called hate crime peace bond, calling it a pre-crime provision for speech. This is highly problematic because it would explicitly extend the power to issue peace bonds to crimes of speech, which the bill does not adequately define, nor does it provide any assurance that it would meet a criminal standard for hate.

Equally as concerning is that Bill C-63 would create a new process for individuals and groups to complain to the Canadian Human Rights Commission that online speech directed at them is discriminatory. This process would be extrajudicial, not subject to the same evidentiary standards of a criminal court, and could take years to resolve. Findings would be based on a mere balance of probabilities rather than on the criminal standard of proof beyond a reasonable doubt.

The subjectivity of defining hate speech would undoubtedly lead to punishments for protected speech. The mere threat of human rights complaints would chill large amounts of protected speech, and the system would undoubtedly be deluged with a landslide of vexatious complaints. There certainly are no provisions in the bill to prevent any of this from happening.

Nearly a decade ago, even the Toronto Star, hardly a bastion of Conservative thought, wrote a scathing opinion piece opposing these types of provisions. The same principle should apply today. When the highly problematic components of the bill are overlaid upon the fact that we are presently living under a government that unlawfully invoked the Emergencies Act and that routinely gaslights Canadians who legitimately question the efficacy or morality of its policies as spreading misinformation, as the Minister of Justice did in his response to my question, saying that I had mischaracterized the bill, it is not a far leap to surmise that the new provision has great potential for abuse. That could be true for any political stripe that is in government.

The government's charter compliance statement, which is long and vague and has only recently been issued, should raise concerns for parliamentarians in this regard, as it relies on this statement: "The effects of the Bill on freedom of expression are outweighed by the benefits of protecting members of vulnerable groups". The government has already been found to have violated the Charter in the case of Bill C-69 based on false presumptions about which benefits outweigh others. I suspect this would be the same case for Bill C-63 should it become law, which I hope it does not.

I believe in the capacity of Canadians to express themselves within the bounds of protected speech and to maintain the rule of law within our vibrant pluralism. Regardless of political stripe, we must value freedom of speech and due process, because they are what prevents violent conflict. Speech already has clearly defined limitations under Canadian law. The provisions in Bill C-63 that I have just described are anathema to these principles. To be clear, Canadians should not be expected to have their right to protected speech chilled or limited in order to be safe online, which is what Bill C-63 would ask of them.

Bill C-63 would also create a new three-headed, yet-to-exist bureaucracy. It would leave many of the actual rules the bill describes to be created and enforced under undefined regulations by said bureaucracy at some later date. We cannot wait to take action in many circumstances. As one expert described it to me, it is like vaguely creating an outline and expecting bureaucrats, not elected legislators, to colour in the picture behind closed doors without any accountability to the Canadian public.

The government should have learned from the costs associated with failing when it attempted the same approach with Bill C-11 and Bill C-18, but alas, here we are. The new bureaucratic process would be slow, onerous and uncertain. If the government proceeds with it, it means Canadians would be left without protection, and innovators and investors would be left without the regulatory certainty needed to grow their businesses.

It would also be costly. I have asked the Parliamentary Budget Officer to conduct an analysis of the costs associated with the creation of the bureaucracy, and he has agreed to undertake the task. No parliamentarian should even consider supporting the bill without understanding the resources the government intends to allocate to the creation of the new digital safety commission, digital safety ombudsman and digital safety office, particularly since the findings in this week's damning NSICOP report starkly outlined the opportunity cost of the government failing to allocate much needed resources to the RCMP.

Said differently, if the government cannot fund and maintain the critical operations of the RCMP, which already has the mandate to enforce laws related to public safety, then Parliament should have grave, serious doubts about the efficacy of its setting up three new bureaucracies to address issues that could likely be managed by existing regulatory bodies like the CRTC or in the enforcement of the Criminal Code. Also, Canadians should have major qualms about creating new bureaucracies which would give power to well-funded and extremely powerful big tech companies to lobby and manipulate regulations to their benefit behind the scenes and outside the purview of Parliament.

This approach would not necessarily protect Canadians and may create artificial barriers to entry for new innovative industry players. The far better approach would be to adapt and extend long-existing laws and regulatory systems, properly resource their enforcement arms, and require big tech companies and other actors in the digital space to comply with these laws, not the other way around. This approach would provide Canadians with real protections, not what amounts to a new, ineffectual complaints department with a high negative opportunity cost to Canadians.

In no scenario should Parliament allow the government to entrench in legislation a power for social media companies to be arbiters of speech, which Bill C-63 risks doing. If the government wishes to further impose restrictions on Canadians' rights to speech, that should be a debate for Parliament to consider, not for regulators and tech giants to decide behind closed doors and with limited accountability to the public.

In short, this bill is completely flawed and should be abandoned, particularly given the minister's announcement this morning that he is unwilling to proceed with any sort of change to its scope.

However, there is a better way. There is an alternative, which would be a more effective and more quickly implementable plan to protect Canadians' safety in the digital age. It would modernize existing laws and processes to align with digital advancements. It would protect speech not already limited in the Criminal Code, and would foster an environment for innovation and investment in digital technologies. It would propose adequately resourcing agencies with existing responsibilities for enforcing the law, not creating extrajudicial bureaucracies that would amount to a complaints department.

To begin, the RCMP and many law enforcement agencies across the country are under-resourced after certain flavours of politicians have given much more than a wink and a nod to the “defund the police” movement for over a decade. This trend must immediately be reversed. Well-resourced and well-respected law enforcement is critical to a free and just society.

Second, the government must also reform its watered-down bail policies, which allow repeat offenders to commit crimes over and over again. Criminals in the digital space will never face justice, no matter what laws are passed, if the Liberal government's catch-and-release policies are not reversed. I think of a woman in my city of Calgary who was murdered in broad daylight in front of an elementary school because her spouse was subject to the catch-and-release Liberal bail policy, in spite of his online harassment of her for a very long time.

Third, the government must actually enforce—

Online Harms Act (Government Orders)

10:45 a.m.

Conservative

The Deputy Speaker Conservative Chris d'Entremont

The hon. member for Drummond is rising on a point of order.

Online Harms Act (Government Orders)

10:45 a.m.

Bloc

Martin Champoux Bloc Drummond, QC

Mr. Speaker, I apologize to my colleague. I hate to interrupt her in the middle of a speech like this, but we can hear a telephone or device vibrating near a microphone and it must be very irritating for the interpreters.

Could you ask members to be mindful of that and to keep their devices away from the microphones, please?

Online Harms Act (Government Orders)

10:45 a.m.

Conservative

The Deputy Speaker Conservative Chris d'Entremont

I would ask the hon. member to move the cellphone away from the microphone so that it does not vibrate.

Online Harms Act (Government Orders)

10:45 a.m.

Conservative

Michelle Rempel Conservative Calgary Nose Hill, AB

Mr. Speaker, third, the government must actually enforce laws that are already on the books but have not been recently enforced due to an extreme lack of political will and disingenuous politics and leadership, particularly as they relate to hate speech. This is particularly important in light of the rise in dangers currently faced by vulnerable Canadian religious communities, such as, as the minister mentioned, Canada's Jewish community.

This could be done via actions such as ensuring the RCMP, including specialized integrated national security enforcement teams and national security enforcement sections, is providing resources and working directly with appropriate provincial and municipal police forces to share appropriate information and intelligence to provide protection to these communities, as well as making sure the security infrastructure program funding is accessible in an expedited manner so community institutions and centres can enhance security measures at their gathering places.

Fourth, for areas where modernization of existing regulations and the Criminal Code need immediate updating to reflect the digital age, and where there could be cross-partisan consensus, the government should undertake these changes in a manner that would allow for swift and non-partisan passage through Parliament.

These items could include some of the provisions discussed in Bill C-63. These include the duty to make content that sexually victimizes a child or revictimizes a survivor, or intimate content communicated without consent, inaccessible to persons in Canada in certain circumstances; imposing on online providers certain duties to keep all records related to sexual victimization; making provisions for persons in Canada to make a complaint to existing enforcement bodies, such as the CRTC or the police, not a new bureaucracy that would take years to materialize and could be costly and/or ineffective; ensuring, by authorization of a court making orders to the operators of those services, that content on a social media service that sexually victimizes a child or revictimizes a survivor, or is intimate content communicated without consent, is inaccessible to persons in Canada; and enforcing the proposed amendment to An Act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service.

Other provisions the government has chosen not to include in Bill C-63, but that should have been included and that Parliament should be considering in the context of harms being conducted online, must include updating Canada's existing laws on the non-consensual distribution of intimate images to ensure the distribution of intimate deepfakes is also criminalized, likely through a simple update to the Criminal Code. We could have done this by unanimous consent today had the government taken the initiative to do so. This is already a major problem in Canada, with girls in high schools in Winnipeg seeing intimate images of themselves, sometimes, as reports are saying, being sexually violated, without any ability for the law to intervene.

The government also needs to create a new criminal offence of online criminal harassment that would update the existing crime of criminal harassment to address the ease and anonymity of online criminal harassment. Specifically, this would apply to those who repeatedly send threatening and/or explicit messages or content to people across the Internet and social media when they know, or should know, it is not welcome. This could include aggravating factors for repeatedly sending such material anonymously and be accompanied by a so-called digital restraining order that would allow victims of online criminal harassment to apply to a judge, under strict circumstances, to identify the harasser and end the harassment.

This would protect privacy, remove the onus on social media platforms to guess when they should be giving identities to the police and prevent the escalation of online harassment into physical violence. This would give police and victims clear and easy-to-understand tools to prevent online harassment and associated escalation. This would address a major issue of intimate partner violence and make it easier to stop coercive control.

As well, I will note to the minister that members of the governing Liberal Party agreed to the need for these exact measures at a recent meeting of PROC related to online harassment of elected officials this past week.

Fifth, the government should consider a more effective and better way to regulate online platforms, likely under the authority of the CRTC and the Minister of Industry, to better protect children online while protecting charter rights.

This path could include improved measures to do this. This could include defining, through legislation, not backroom regulation but precisely through law, the duty of care required of online platforms. Some of these duties of care have already been mentioned in questions to the minister today. This is what Parliament should be seized with, not allowing some unnamed future regulatory body to decide this for us while big tech companies and their lobbying arms define it behind closed doors. That is our job, not theirs.

We could provide parents with safeguards, controls and transparency to prevent harm to their kids when they are online, which could be part of the duty of care. We could also require that online platforms put the interests of children first with appropriate safeguards, again, in a legislative duty of care.

There could also be measures to prevent and mitigate self-harm, mental health disorders, addictive behaviours, bullying and harassment, sexual violence and exploitation, and the promotion of marketing and products that are unlawful for minors. All of these things are instances of duty of care.

We could improve measures to implement privacy-preserving and trustworthy age verification methods, which many platforms already have the capacity to do, while prohibiting the use of a digital ID in any of these mechanisms.

This path could also include measures to ensure that the enforcement of these mechanisms, including a system of administrative penalties and consequences, is done through agencies that already exist. Additionally, we could ensure that there are perhaps other remedies, such as the ability to seek remedy for civil injury, when that duty of care is violated.

This is a non-comprehensive list of online harms, but the point is, we could come to consensus in this place on simple modernization issues that would update the laws now. I hope that the government will accept this plan.

I send out a shout-out to Sean Phelan and David Murray, two strong and mighty workers. We did not have an army of bureaucrats, but we came up with this. I hope that Parliament considers this alternative plan, instead of Bill C-63, because the safety of Canadians is at risk.

Online Harms Act (Government Orders)

10:50 a.m.

Parkdale—High Park, Ontario

Liberal

Arif Virani (Liberal), Minister of Justice and Attorney General of Canada

Mr. Speaker, I genuinely thank the member opposite for her contributions to today's debate because it is really important.

I will point out four things and then ask her a question.

The first is that, with respect to my position on amendments, what I said, and I want to make sure it is crystal clear to Canadians watching, is that I am open to amendments that would strengthen the bill that are made in good faith.

The second point is with respect to free-standing hate crime, which is a provision that exists in 47 out of 50 states in the United States. The nature of the penalty that would be applied in a given context of a hate crime would depend on the underlying offence. Uttering a threat that was motivated by hate would attract a lesser penalty than committing a murder that was motivated by hate. For the member's benefit, section 718.1 of the Criminal Code, which I do trust judges to interpret, specifically says that the penalty "must be proportionate to the gravity of the offence and the degree of responsibility of the offender."

With respect to the peace bond, what I would say to the member's point, quite simply, is that I do believe it is necessary to take a tool that is well known to criminal law and apply it to the context of a synagogue, which has already been targeted with vandalism and may be targeted again, where proof would need to be put before a judge and where the safeguard would exist for the attorney general of jurisdiction to give consent before such a peace bond was pursued.

The member talked about the fact that Criminal Code tools should be used in the context of ensuring that we can tackle this pernicious information. What I would say to her is that law enforcement has asked us for the same tool that Amanda Todd's mother has asked us for. The victimization of people, even after death, continues when the—

Online Harms Act (Government Orders)

10:55 a.m.

Conservative

The Deputy Speaker Conservative Chris d'Entremont

The hon. member for Calgary Nose Hill.

Online Harms Act (Government Orders)

10:55 a.m.

Conservative

Michelle Rempel Conservative Calgary Nose Hill, AB

Mr. Speaker, I have outlined in detail why the bill is irremediable. It is not fixable, and members do not have to take my word for it. The Atlantic magazine, hardly a bastion of conservative thought, has a huge exposé this morning on why the bill is so flawed. I suspect it is why the government has only allowed it to come up for debate now. I do not expect to see it in the fall.

Given that the bill is so flawed, it is incumbent upon the Minister of Justice to take the suggestions of the opposition seriously. I have outlined several suggestions, which are very easy to pick out of my speech, on how the minister could proceed. He could likely proceed on an expedited basis with those measures.

It sounds like my colleagues from the Bloc and the NDP have similar concerns. The bill cannot proceed in its current state. Frankly, Canadians should not be expected to trade their rights for safety online, and they should not have to expect a government, which has dragged its heels for nearly a decade, to continue with the facade that it actually cares about this issue or has a plan to address it. We have given it one, and the Liberals should take it.

Online Harms Act (Government Orders)

10:55 a.m.

Bloc

Martin Champoux Bloc Drummond, QC

Mr. Speaker, at the end of this parliamentary term, I am pleased to see that more and more school groups are coming to watch the business of the House. I think this is a strategy used by teachers to show that they are not as boring as they seem and that students should pay attention in class. Quite often, what happens here is a lot more interesting than sitting in class.

That said, I listened closely to my colleague's speech. I noted several interesting points, particularly the fact that she made proposals. We do not often hear proposals about regulating online content from the Conservatives. I heard proposals and I also detected some desire for consensus. There may well be certain points on which we could agree.

Does my colleague agree with the Bloc Québécois, which is proposing that we split the bill, that we should fast-track the study of part 1, given that we generally agree on its principles at least, and that we should take the time to study part 2 in the House and in committee? Part 2 contains aspects that require much more in-depth discussion, in our opinion.

Online Harms Act (Government Orders)

10:55 a.m.

Conservative

Michelle Rempel Conservative Calgary Nose Hill, AB

Mr. Speaker, the unfortunate thing is that the government is close to the end of its mandate and does not have a lot of public support across the country. The reality is that even if the government members said that they were going to split the bill, which they just said that they were not going to do, the bill would not likely become law. Certainly, the regulatory process is not going to happen prior to the next election, even if the bill is rammed through.

The problem facing Canadians is that the harms requiring solutions need to be addressed today. I would suggest that what is actually needed is a separate, completely different piece of legislation that incorporates the suggestions I have outlined. It is unfortunate that the government, with its army of bureaucrats, was not able to do it and that it is the opposition that has to do it. I am certainly willing to work with my opposition colleagues on another piece of legislation that could address these issues and find areas of commonality so that we can protect Canadians from online harms.

Online Harms Act (Government Orders)

10:55 a.m.

NDP

Peter Julian NDP New Westminster—Burnaby, BC

Mr. Speaker, I appreciate the member's hard work in terms of tackling issues like harassment and the distribution of non-consensual images; she is very sincere in this regard.

The member has flagged the issue of resources; the bill is unclear as to what the government would actually provide in terms of resources. I do note this has been an ongoing problem over the last 20 years with cutbacks to law enforcement.

The member notes as well the impact of big tech. I wanted her to comment on a substantial missing piece in the legislation around algorithm transparency, an issue that is currently before the U.S. Congress and that absolutely needs to be addressed. Big tech companies often promote non-consensual images and hate through their algorithms without any sort of oversight or responsibility. How does the member feel about that missing piece?

Online Harms Act (Government Orders)

10:55 a.m.

Conservative

Michelle Rempel Conservative Calgary Nose Hill, AB

Mr. Speaker, with regard to resources, I asked the Parliamentary Budget Officer to conduct an analysis of the resources that the government was anticipating for the creation of its bureaucracy, because I believe that those resources would likely be much better allocated to other places. My colleague can wait for that report and perhaps re-emphasize to the Parliamentary Budget Officer the need to speed that along.

The second thing is with regard to algorithmic transparency. This is why we need to have a legislated duty of care. If we proceeded on the principle of a legislated duty of care for social media operators, then we could discuss what needs to be in there. Certainly, algorithmic transparency and the biases used in AI systems, which could be potentially injurious in a variety of ways, are something—

Online Harms Act (Government Orders)

11 a.m.

Conservative

The Deputy Speaker Conservative Chris d'Entremont

It is time to go to Statements by Members.

The Economy (Statements by Members)

11 a.m.

Liberal

Kevin Lamoureux Liberal Winnipeg North, MB

Mr. Speaker, this is a government that truly does care. I think of pharmacare, the school food program, dental care, child care and the disability program that we put into place, and we are focused on building a stronger economy. I think of the investments that we are receiving. Did anyone know that when we talk about direct investments per capita, Canada is number one in the G7, and when I compare us to the rest of the world, we are number three?

This is because people know and understand that the Canadian economy is doing well. At the same time, we are providing supports to Canadians. Earlier this week, the action that the Government of Canada is taking was reaffirmed as being positive, as the Bank of Canada dropped our interest rate. Canada is the first of the G7 countries to see a drop in interest rates. That is good for all of Canada.

Lakeland (Statements by Members)

11 a.m.

Conservative

Shannon Stubbs Conservative Lakeland, AB

Mr. Speaker, two years after Putin's illegal attack on Ukraine, many Lakeland towns, groups and people have opened their hearts to displaced Ukrainians who now call Canada their home. These are families like the Krawecs from Athabasca, who started by filling out immigration forms and then found furnishings for multiple homes.

There are volunteer settlement committees, like Vegreville and Area Stands with Ukraine, and community efforts, like the Vyshyvanka Day fundraiser in Bonnyville to provide winter clothing or the Koinonia retreat outside Thorhild, the family camp, to connect displaced people for emotional support.

That is only a small glimpse, but all Lakeland's efforts share one common goal: to welcome and assist Ukrainian families. One of them, parents Tetiana and Kostiantyn and big brother Daniil, were blessed with a beautiful baby boy in May. Ernest is the first baby born to Ukrainian newcomers in the community and now also a baby Canadian citizen.

Conservatives will keep fighting to send weapons and Canadian LNG to help Ukrainians kick Putin's gas. That is real action to bring home peace, security and sovereignty for Ukrainians and Canadians.

Stefano Economopoulos (Statements by Members)

11 a.m.

Liberal

Peter Fragiskatos Liberal London North Centre, ON

Mr. Speaker, I rise to honour the extraordinary life of Stefano “Steve” Economopoulos who recently passed away, unfortunately, in his 100th year. He was the husband to Angeliki for 74 years; father of Gus, Tom, Vivian and Angelo; grandfather to seven; and a great-grandfather as well.

He came to Canada in 1951, but he grew up in the Kalavryta area in Greece. He fought in the Second World War. A proud veteran, he then became a police officer and served the Greek police before coming here.

When he came here, he came here humble. He came willing to work hard to make a contribution to his country. He began as a dishwasher, and eventually became a very successful entrepreneur, owning several restaurants and doing very well throughout. In fact, even in his later years, he worked at Richies Family Restaurant, helping his sons. Everybody knows Richies back home.

He was kind and humble; he showed compassion to everyone he knew. He always had good advice for me. We will miss him. All of us will miss him very much. I wish all the very best to the family. We are thinking of them.