An Act to enact the Online Harms Act, to amend the Criminal Code, the Canadian Human Rights Act and An Act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service and to make consequential and related amendments to other Acts

Sponsor

Arif Virani, Liberal

Status

Second reading (House), as of Feb. 26, 2024

Summary

This is from the published bill. The Library of Parliament often publishes better independent summaries.

Part 1 of this enactment enacts the Online Harms Act, whose purpose is to, among other things, promote the online safety of persons in Canada, reduce harms caused to persons in Canada as a result of harmful content online and ensure that the operators of social media services in respect of which that Act applies are transparent and accountable with respect to their duties under that Act.
That Act, among other things,
(a) establishes the Digital Safety Commission of Canada, whose mandate is to administer and enforce that Act, ensure that operators of social media services in respect of which that Act applies are transparent and accountable with respect to their duties under that Act and contribute to the development of standards with respect to online safety;
(b) creates the position of Digital Safety Ombudsperson of Canada, whose mandate is to provide support to users of social media services in respect of which that Act applies and advocate for the public interest in relation to online safety;
(c) establishes the Digital Safety Office of Canada, whose mandate is to support the Digital Safety Commission of Canada and the Digital Safety Ombudsperson of Canada in the fulfillment of their mandates;
(d) imposes on the operators of social media services in respect of which that Act applies
(i) a duty to act responsibly in respect of the services that they operate, including by implementing measures that are adequate to mitigate the risk that users will be exposed to harmful content on the services and submitting digital safety plans to the Digital Safety Commission of Canada,
(ii) a duty to protect children in respect of the services that they operate by integrating into the services design features that are provided for by regulations,
(iii) a duty to make content that sexually victimizes a child or revictimizes a survivor and intimate content communicated without consent inaccessible to persons in Canada in certain circumstances, and
(iv) a duty to keep all records that are necessary to determine whether they are complying with their duties under that Act;
(e) authorizes the Digital Safety Commission of Canada to accredit certain persons that conduct research or engage in education, advocacy or awareness activities that are related to that Act for the purposes of enabling those persons to have access to inventories of electronic data and to electronic data of the operators of social media services in respect of which that Act applies;
(f) provides that persons in Canada may make a complaint to the Digital Safety Commission of Canada that content on a social media service in respect of which that Act applies is content that sexually victimizes a child or revictimizes a survivor or intimate content communicated without consent and authorizes the Commission to make orders requiring the operators of those services to make that content inaccessible to persons in Canada;
(g) authorizes the Governor in Council to make regulations respecting the payment of charges by the operators of social media services in respect of which that Act applies, for the purpose of recovering costs incurred in relation to that Act.
Part 1 also makes consequential amendments to other Acts.
Part 2 amends the Criminal Code to, among other things,
(a) create a hate crime offence of committing an offence under that Act or any other Act of Parliament that is motivated by hatred based on certain factors;
(b) create a recognizance to keep the peace relating to hate propaganda and hate crime offences;
(c) define “hatred” for the purposes of the new offence and the hate propaganda offences; and
(d) increase the maximum sentences for the hate propaganda offences.
It also makes related amendments to other Acts.
Part 3 amends the Canadian Human Rights Act to provide that it is a discriminatory practice to communicate or cause to be communicated hate speech by means of the Internet or any other means of telecommunication in a context in which the hate speech is likely to foment detestation or vilification of an individual or group of individuals on the basis of a prohibited ground of discrimination. It authorizes the Canadian Human Rights Commission to deal with complaints alleging that discriminatory practice and authorizes the Canadian Human Rights Tribunal to inquire into such complaints and order remedies.
Part 4 amends An Act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service to, among other things,
(a) clarify the types of Internet services covered by that Act;
(b) simplify the mandatory notification process set out in section 3 by providing that all notifications be sent to a law enforcement body designated in the regulations;
(c) require that transmission data be provided with the mandatory notice in cases where the content is manifestly child pornography;
(d) extend the period of preservation of data related to an offence;
(e) extend the limitation period for the prosecution of an offence under that Act; and
(f) add certain regulation-making powers.
Part 5 contains a coordinating amendment.

Department of Justice—Main Estimates, 2024-25 / Business of Supply / Government Orders

May 23rd, 2024 / 7:55 p.m.

Liberal

Arif Virani Liberal Parkdale—High Park, ON

Madam Chair, I am pleased to see any efforts that deal with combatting hatred, which is unfortunately spiralling, with rising anti-Semitic and Islamophobic incidents. There has been a 130% rise in hate crimes in this country over the last five years. That informs the necessity for bills such as Bill C-63, the online harms bill, which will tackle things like hatred and its festering online, which has real-world consequences. It is very unfortunate that Canada ranks first in the G7 for the number of Muslims killed in acts of Islamophobic hate over the last seven years, 11 in total.

What I would say, with respect to this bill, is that we are looking at it closely. I would also reiterate for the member's edification that we amended the hate propaganda provisions to include Holocaust denialism and willful promotion of anti-Semitism within the fold of sections 318 and 319, the hate propaganda offences. That was done within the last two years, I believe.

Department of Justice—Main Estimates, 2024-25 / Business of Supply / Government Orders

May 23rd, 2024 / 7:45 p.m.

Liberal

Arif Virani Liberal Parkdale—High Park, ON

Madam Chair, I think that the suggestion about hate, the Bloc Québécois's private member's bill and our Bill C-63 all highlight the fact that we need to pass this bill at second reading and send it to the Standing Committee on Justice and Human Rights so that we can study it, hear from experts and witnesses and propose amendments, if any prove appropriate.

Department of Justice—Main Estimates, 2024-25 / Business of Supply / Government Orders

May 23rd, 2024 / 7:40 p.m.

Bloc

Kristina Michaud Bloc Avignon—La Mitis—Matane—Matapédia, QC

Madam Chair, the government is completely ignoring Bill S‑210. Bill C‑63 is a huge bill that has received some criticism. It is likely to take a long time to study.

However, we think the proposal to set up a digital safety commission is a good idea that should be implemented quickly. That is why we are proposing that the bill be split, quite simply, so that we can take the time to properly study all harmful content while still setting up the digital safety commission quickly. I understand that the proposal has not been accepted, but I still think it is a good idea.

The topic of harmful content brings me to hate speech. Will the minister commit to abolishing the Criminal Code exemption that allows hate speech in the name of religion? In fact, that would be a great addition to his Bill C‑63.

Department of Justice—Main Estimates, 2024-25 / Business of Supply / Government Orders

May 23rd, 2024 / 7:40 p.m.

Liberal

Arif Virani Liberal Parkdale—High Park, ON

Madam Chair, I have several answers to give on this matter. The big difference between the senator's bill and Bill C‑63 is that our bill had the benefit of a five-year consultation. That is the first thing.

The second thing is that, although we agree with some aspects, we want to work in close collaboration with the big digital companies to resolve the situation and protect the public and children from pornography. Taking down that information and content within a mandatory 24-hour period is a much stronger measure than what was proposed in the bill introduced by the senator.

The last thing is that we are targeting a situation where all harmful online content needs to be addressed. This concerns children, teenagers and adults. We want a big solution to a big problem. Australia started nine years ago with children only. Nine years later, protecting children only is no longer appropriate—

Department of Justice—Main Estimates, 2024-25 / Business of Supply / Government Orders

May 23rd, 2024 / 7:40 p.m.

Bloc

Kristina Michaud Bloc Avignon—La Mitis—Matane—Matapédia, QC

Madam Chair, I politely beg to differ. I feel that Bill C‑63 is extremely important, but it is not exactly the same thing. Yes, it contains elements that make it possible to regulate or, at least, be warned before consuming certain types of content, but there is nothing that really makes it possible to verify the consumer's age.

I would therefore advise the government to support a bill like Bill S‑210. Obviously, it is not easy to implement this type of safeguard, and other countries are currently looking at that. However, it is an extremely important bill.

To return to Bill C-63, would the minister agree that the first part of the bill could be split from the rest so that the digital safety commission could be created as quickly as possible? That would enable us to protect female victims of intimate content communicated without consent, including deepfakes.

Department of Justice—Main Estimates, 2024-25 / Business of Supply / Government Orders

May 23rd, 2024 / 7:40 p.m.

Liberal

Arif Virani Liberal Parkdale—High Park, ON

Madam Chair, with all due respect, I want to correct the member opposite.

First, Bill C-63 deals mainly with the types of content that are appropriate for children. Second, it addresses the obligation to protect children. There is also a provision of Bill C-63 that talks about age-appropriate design features.

We are targeting the same problem. We want to work with social media platforms to resolve this situation in a way that will enable us to protect people's privacy and personal information and protect children.

Department of Justice—Main Estimates, 2024-25 / Business of Supply / Government Orders

May 23rd, 2024 / 7:40 p.m.

Bloc

Kristina Michaud Bloc Avignon—La Mitis—Matane—Matapédia, QC

Madam Chair, I think that the minister is well aware that those are two completely different missions. Both are commendable.

Bill C‑63 has its good points, but Bill S‑210 really seeks to check the age of pornography users to limit young people's access to it. The Liberal Party seems to disagree with this bill, and yet other countries, like Germany, France and the United Kingdom, as well as some states in the U.S. are looking into this way of verifying the age of users.

Why does Canada not want to move forward in this way to limit the access of children under the age of 18 to pornography?

Department of Justice—Main Estimates, 2024-25 / Business of Supply / Government Orders

May 23rd, 2024 / 7:40 p.m.

Liberal

Arif Virani Liberal Parkdale—High Park, ON

Madam Chair, that is a great question, but I believe that the senator's bill, Bill S‑210, addresses only one aspect of our broader bill, C‑63.

Protecting children from pornography and sexual predators is a priority for both me and the senator. However, we have different ways of tackling the problem. We are dealing with a much bigger and broader problem in our own Bill C-63. We are also different when it comes to the mandates and the modus operandi that the senator proposes to use.

We are concerned about how to verify someone's age. Does it have to be a piece of government-issued ID? Will this cause other problems or lead to the possibility of other crimes, such as financial fraud, at the international level?

Department of Justice—Main Estimates, 2024-25 / Business of Supply / Government Orders

May 23rd, 2024 / 7:35 p.m.

Bloc

Kristina Michaud Bloc Avignon—La Mitis—Matane—Matapédia, QC

Madam Chair, I would point out to the minister that he does not want to give Quebec an exemption from the Criminal Code, but he is giving one to British Columbia. In my view, the same thing should be possible for people in this situation in Quebec.

Now, I would like to hear his comments on all the issues related to child pornography, children's access to pornography and the sharing of non-consensual content. To my eyes, the purpose of Bill S‑210, which was introduced by Senator Julie Miville‑Dechêne and which seeks to prevent minors from accessing pornography, is completely different from the purpose of Bill C‑63, which the minister introduced and which seeks to protect the public from harmful content streamed on social media, such as intimate content communicated without consent and content that sexually victimizes a child.

Does he agree with me that these two bills have completely different purposes?

Department of Justice—Main Estimates, 2024-25 / Business of Supply / Government Orders

May 23rd, 2024 / 7:15 p.m.

Parkdale—High Park Ontario

Liberal

Arif Virani Liberal Minister of Justice and Attorney General of Canada

Mr. Speaker, I will be providing 10 minutes of remarks, and I will be welcoming questions from my parliamentary secretary, the member for Etobicoke—Lakeshore. I will be using my time to discuss measures in the recent budget to combat crime, especially auto theft and money laundering. I will also touch on legal aid investments and provide an update on our work on online safety.

Auto theft is a serious problem that affects communities across the country. Not only does it affect people's wallets, it also causes them to feel unsafe. The number of these thefts has risen and, in some areas, they are growing more violent. These criminals are increasingly emboldened. Our government is committed to ensuring that police and prosecutors have the tools they need to respond to cases of auto theft, including thefts related to organized crime.

We also want to ensure that the legislation provides courts with the wherewithal to impose sentences commensurate with the seriousness of the crime. The Criminal Code already contains useful provisions for fighting auto theft, but we can do more.

This is why we are amending the Criminal Code to provide additional measures for law enforcement and for prosecutors to address auto theft. Bill C-69, the budget implementation act, sets out these proposed measures. These amendments would include new offences targeting auto theft and its links to violence and organized crime; new offences for possession and distribution of a device used for committing auto theft, such as key-programming machines; and a new offence for laundering proceeds of crime for the benefit of, at the direction of, or in association with, a criminal organization. We are proposing a new aggravating factor at sentencing, which would be applied to an adult offender who involves a young person in the commission of the crime. These changes are part of the larger federal action plan on combatting auto theft that was just released on May 20.

Auto theft is a complex crime, and fighting it involves many partners: the federal, provincial, territorial and municipal governments, industry leaders and law enforcement agencies.

I will now turn to the related issue of money laundering. Addressing money laundering will help us to combat organized crime, including its involvement in automobile theft. However, the challenges associated with money laundering and organized crime go beyond auto theft.

That is why we are continually reviewing our laws so that Canada can better combat money laundering, organized crime and terrorist activity financing.

Bill C-69 would give us more tools to combat money laundering and terrorist financing. These new measures would allow courts to issue an order that requires a person to keep an account open to assist in the investigation of a suspected criminal offence. Currently, financial service providers often unilaterally close accounts where they suspect criminal activity, which can actually hinder police investigations. This new proposed order would help in that regard.

I hope to see non-partisan support from all parties, including the official opposition, on these measures to address organized crime. It would be nice to see its members support something, rather than simply use empty slogans or block actual solutions. We see this as well in their efforts to block Bill C-59, the fall economic statement, which has been in this chamber for literally months. That also contains a range of measures to combat money laundering, which have been asked for by law enforcement. For a party that prides itself on having a close relationship with law enforcement, I find this obstruction puzzling.

What is more, under Bill C-69, the courts would also be authorized to make a repetitive production order requiring the production of documents on specific dates. That would enable law enforcement to require a person to provide specific information in support of a criminal investigation on several pre-determined dates over a defined period.

These two proposals resulted from the public consultations that our government held last summer. We are committed to getting Bill C-69 passed by Parliament in a timely manner so that the new measures can be put in place as quickly as possible and so that we can crack down on these serious crimes as soon as possible.

I would now like to discuss our investments in legal aid. Just as we need to protect Canadians from crime, we also need to ensure that people have equitable access to justice, which is an integral part of a fair and just society, and a strong legal aid system is a key aspect of this. It strengthens the overall justice system. Budget 2024 includes measures to increase funding to criminal legal aid as well as legal aid for immigrants and for refugees to Canada.

For criminal legal aid, budget 2024 provides $440 million over five years, starting in 2024-25. This would support access to justice for Canadians who are unable to pay for legal support, in particular Indigenous people, Black individuals and members of other racialized communities who are overrepresented in the criminal justice system. Indeed, legal representation also helps to clear backlogs and delays in our court system.

This essential work is only possible with continued collaboration between federal, provincial and territorial governments. The proposed increase to the federal contribution will help provinces and territories take further action to increase access to justice. This legal aid will help with the backlogs I just mentioned. Unrepresented and poorly represented litigants cause delays in our justice system. Making sure that these individuals have proper support and representation will help ensure access to a speedy trial. This, in combination with our unprecedented pace of judicial appointments, 106 appointments in my first nine months in office, will also address backlogs. In comparison, the previous Harper government appointed 65 judges per year on average. I exceeded that number in six months.

For immigration and refugee legal aid, budget 2024 would provide $273.7 million over five years, starting in 2024-25, and $43.5 million per year ongoing after that. This funding would help support access to justice for economically disadvantaged asylum seekers and others involved in immigration proceedings. This investment would help maintain the confidence of Canadians in the government's ability to manage immigration levels, and to resettle and integrate refugees into Canadian society. To do this very important work, Justice Canada continues to collaborate with provincial governments and with legal aid service providers, as well as Immigration, Refugees and Citizenship Canada. Together, we are exploring solutions to support sustainable access to immigration and refugee legal aid services.

Before I conclude, I would like to talk a little about Bill C-63, which was raised by the member for Fundy Royal. The bill addresses online harms and the safety of our communities online. Much has already been said about this very important legislation, which would create stronger protections for children online and better safeguards for everyone in Canada from online hate and other types of harmful content. What is critical about this bill is that it is dedicated to promoting people's participation online and not to limiting it.

This legislation is informed by what we have heard over five-plus years of consultations with diverse stakeholders, community groups, law enforcement and other Canadians. This bill focuses on the baseline responsibilities of social media platforms to manage the content they are hosting and their duty to keep children safe, which means removing certain types of harmful content and entrenching a duty to act responsibly.

This bill is about keeping Canadians safe, which is my fundamental priority and my fundamental duty as the Minister of Justice and Attorney General of this country. It is about ensuring that there is actually a takedown requirement on the two types of most harmful material: child pornography and the non-consensual sharing of intimate images, also known as revenge pornography.

There are five other categories of material that would be dealt with under this bill, including content that incites violence, incites terrorism, foments hatred as defined by the Supreme Court of Canada, bullies a child or induces a child to self-harm. I am speaking now not only as the Minister of Justice but also as a father. I think that there is nothing more basic in this country for any parent or parliamentarian than keeping our children safe.

I am thankful for the opportunity to speak about how we are making Canada safer and making our justice system stronger, more accessible and more inclusive for all people.

Department of Justice—Main Estimates, 2024-25 / Business of Supply / Government Orders

May 23rd, 2024 / 7:05 p.m.

Liberal

Arif Virani Liberal Parkdale—High Park, ON

Mr. Speaker, I find this line of questioning quite fascinating, given that the main charter issue raised by Bill C-63 concerns very sensitive questions about the protection of freedom of speech, which is protected under section 2(b).

What I will do is always maintain my oath under the Constitution to uphold the Constitution and people's charter rights. This individual works under a leader who has brandished the idea of using the notwithstanding clause to deprive people of their charter rights. Section 2(b) is subject to the notwithstanding clause.

If we are talking about who is actually committed to protecting people's freedoms, including freedom of speech, people on that side of the House should be looking at themselves in the mirror.

Department of Justice—Main Estimates, 2024-25 / Business of Supply / Government Orders

May 23rd, 2024 / 7:05 p.m.

Conservative

Rob Moore Conservative Fundy Royal, NB

Mr. Speaker, I notice once again that I have given the minister a lot of opportunities, and he has not answered any of my questions directly.

He knows the answer to this one, and he is not going to give it, so I will have to give it on his behalf. The Victoria Police Department statement says, “Bill C-75, which came into effect nationally in 2019, legislated a 'principle of restraint' that requires police to release an accused person at the earliest possible opportunity”.

The police laid the blame for this individual being released three times in a row to revictimize Canadians squarely at the feet of the minister. A woman was injured in the process of one of the thefts.

On the issue of the Liberals' draconian Bill C-63, which Margaret Atwood has described as “Orwellian”, has he completed a charter statement for this bill that clearly threatens the rights of Canadians?

May 9th, 2024 / 11:05 a.m.

Philippe Dufresne Privacy Commissioner of Canada, Offices of the Information and Privacy Commissioners of Canada

Thank you, Mr. Chair.

Members of the committee, I'm pleased to be here today to discuss the Office of the Privacy Commissioner of Canada's main estimates for fiscal year 2024-25 and to describe the work of my office to protect and promote the fundamental right to privacy of Canadians. I'm accompanied by Richard Roulx, deputy commissioner, corporate management sector.

In January I launched a strategic plan that lays out three key priorities that will guide the work of the OPC through 2027. The first is protecting and promoting privacy with maximum impact, by using business intelligence to identify trends that need attention, producing focused guidance and outreach, leveraging strategic partnerships and preparing for the implementation of potential new privacy legislation.

The second is addressing and advocating for privacy in this time of technological change, with a focus on artificial intelligence and generative AI, the proliferation of which brings both potential benefits and increased risks to privacy.

The third is championing children's privacy rights to ensure that their unique privacy needs are met and that they can exercise their rights.

I believe that these three priorities are where the Office of the Privacy Commissioner can have the greatest impact for Canadians, and that these are also where the greatest risks lie if the issues are not addressed.

Protecting privacy is one of the paramount challenges of our time. My office is poised to meet this challenge through strong advocacy, collaboration, partnerships, education, promotion, enforcement and capacity building, which includes doing more to identify and address privacy trends in a timely way.

Investigations under the Privacy Act, which covers the personal information-handling practices of federal government departments and agencies, and the Personal Information Protection and Electronic Documents Act, Canada’s federal private sector privacy law, are a key aspect of the Office of the Privacy Commissioner’s work on issues that significantly impact the lives of Canadians.

In February I made public the results of my investigation into Aylo, the operator of the website Pornhub and other pornographic websites. I found that the company had contravened Canada's federal private sector privacy law by enabling intimate images to be shared on its websites without the direct knowledge and consent of everyone who is depicted.

In releasing my report on this investigation, I reiterated that the non-consensual sharing of intimate images is a serious privacy violation that can cause severe harms to victims, and that organizations have an obligation under privacy law to prevent and remedy this.

This case is also relevant to the discussions that will be taking place on Bill C-63, and I will welcome the opportunity to share my views on the online harms act with parliamentarians.

I also look forward to sharing in the coming months the findings of two high-profile investigations that are closely tied to two of my strategic priorities—protecting children’s privacy and addressing the privacy impacts of emerging technology, including AI.

When I appeared before you last year on Main Estimates, I spoke about the launch of investigations into TikTok, as well as OpenAI, the company behind the AI-driven text generation ‘chat bot’ ChatGPT. Both investigations are being conducted jointly with my counterparts in Quebec, British Columbia and Alberta.

In the case of the TikTok investigation, the four offices are examining whether the practices of the company ByteDance comply with Canadian federal and provincial privacy legislation and, in particular, whether valid and meaningful consent is being obtained for the collection, use, and disclosure of personal information.

Given the importance of protecting children's privacy, the joint investigation has a particular focus on TikTok's privacy practices as they relate to younger users.

The investigation into OpenAI and its ChatGPT chat bot is examining whether the company is compliant with requirements under Canadian privacy law in relation to consent, openness, access, accuracy and accountability. It is also considering whether the collection, use and disclosure are done for an appropriate purpose.

Both investigations remain a high priority and we are working to complete them in a timely manner.

Protecting and promoting privacy with maximum impact remains integral to fulfilling my current mandate and preparing for potential changes to federal privacy law.

In the 2023 budget we received temporary funding to address pressures related to privacy breaches and a complaints backlog, as well as to prepare for the implementation of Bill C-27. While these temporary funds provide necessary and immediate support, it is essential that my office be properly resourced on a permanent basis to deal with the increasing complexity of today's privacy landscape and the associated demands on my office's resources.

To address this, we will continue to present fiscally responsible funding requests and will also aim to maximize agility and cost-effectiveness by assessing and streamlining program and service delivery.

With that, I would be happy to answer your questions. Thank you.

Stopping Internet Sexual Exploitation Act / Private Members' Business

May 7th, 2024 / 6:45 p.m.

Conservative

Arnold Viersen Conservative Peace River—Westlock, AB

Mr. Speaker, I am grateful for the opportunity to wrap up the debate on the SISE act at second reading.

I have appreciated listening to the members give their speeches. At the outset, I want to briefly urge members to use the term “child sexual abuse material”, or CSAM, rather than “child pornography”. As we heard from the member for Kamloops—Thompson—Cariboo, the latter term is being replaced with CSAM because pornography allows for the idea that this could be consensual. That is why the member for Kamloops—Thompson—Cariboo has put forward a bill that would change this in the Criminal Code as well.

During the first hour of debate, we heard from the member for Laurentides—Labelle, who gave a passionate speech outlining the many serious issues of the impact of the pornography industry on women and youth. I simply do not have the time to include all of that in my speech, but we both sat on the ethics committee during the Pornhub study and heard directly from the survivors who testified.

It was the speech, however, from the Parliamentary Secretary to the Leader of the Government in the House of Commons that left me scratching my head. I do not think he actually read Bill C-270 or even the Liberals' own bill, Bill C-63. The parliamentary secretary fixated on the 24-hour takedown requirement in Bill C-63 as the solution to this issue. However, I do not think anyone is opposed to a 24-hour takedown for exploitative intimate content shared without consent or for child sexual abuse material. In fact, a bill that was solely focused on the 24-hour takedown would pass very quickly through this House with the support of everyone, but that does not take into account what Bill C-270 is trying to do. It is completely missing the point.

The 24-hour takedown has effect only after harmful content has been put up, such as CSAM, deepfakes and intimate images that have been shared. Bill C-270 is a preventative upstream approach. While the takedown mechanism should be available to victims, the goal of Bill C-270 is to go upstream and stop this abusive content from ever ending up on the Internet in the first place.

As I shared at the beginning of the debate, many survivors do not know that their images are online for years. They do not know that this exploitative content has been uploaded. What good would a 24-hour takedown be if they do not even know the content is there? I will repeat the words of one survivor that I shared during the first hour of debate: “I was 17 when videos of me on Pornhub came to my knowledge, and I was only 15 in the videos they've been profiting from.” She did not know for two years that exploitative content of her was being circulated online and sold. That is why Bill C-270 requires age verification and consent of individuals in pornographic material before it is posted.

I would also point out that the primary focus of the government's bill is not to reduce harm to victims. The government's bill requires services “to mitigate the risk that users of the regulated service will be exposed to harmful content”. It talks about users of the platform, not the folks depicted in it. The focus of Bill C-270 is the other side of the screen. Bill C-270 seeks to protect survivors and vulnerable populations from being the harmful content. The two goals could not be more different, and I hope the government is supportive of preventing victims of exploitation from further exploitation online.

My colleague from Esquimalt—Saanich—Sooke also noted that the narrow focus of the SISE act is targeted at people and companies that profit from sexually exploitative content. This is, indeed, one of the primary aims of this bill. I hope, as with many things, that the spread of this exploitative content online will be diminished, as it is driven by profit. The Privacy Commissioner's investigation into Canada's MindGeek found that “MindGeek surely benefits commercially from these non-compliant privacy practices, which result in a larger content volume/stream and library of intimate content on its websites.”

For years, pornography companies have been just turning a blind eye, and it is time to end that. Bill C-270 is a fulfillment of a key recommendation made by the ethics committee three years ago and supported by all parties, including the government. I hope to have the support from all of my colleagues in this place for Bill C-270, and I hope to see it at committee, where we can hear from survivors and experts.

Stopping Internet Sexual Exploitation Act / Private Members' Business

May 7th, 2024 / 6:40 p.m.

Conservative

Garnett Genuis Conservative Sherwood Park—Fort Saskatchewan, AB

Mr. Speaker, I appreciate the opportunity to say a few words in support of Bill C-270, which is an excellent bill from my colleague from Peace River—Westlock, who has been working so hard over his nine years in Parliament to defend the interests of his constituents on important issues like firearms, forestry and fracking, but also to stand up for justice and the recognition of the universal human dignity of all people, including and especially the most vulnerable.

Bill C-270 seeks to create mechanisms for the effective enforcement of substantively already existing legal provisions that prohibit non-consensual distribution of intimate images and child pornography. Right now, as the law stands, it is a criminal offence to produce this type of horrific material, but there are not the appropriate legal mechanisms to prevent the distribution of this material by, for instance, large pornography websites.

It has come to light that Pornhub, which is headquartered in Canada, has completely failed to prevent the presence on its platform of non-consensual and child-depicting pornographic images. This matter has been studied in great detail at parliamentary committees. My colleague from Peace River—Westlock has played a central role, as have members from other parties, in identifying the fact that Pornhub and other websites have not only failed but have shown no interest in meaningfully protecting potential victims of non-consensual and child pornographic images.

It is already illegal to produce these images. Why, therefore, should it not also be clearly illegal to distribute those images without having the necessary proof of consent? This bill would require that there be verification of age and consent associated with images that are distributed. It is a common-sense legal change that would require and effect greater compliance with existing criminal prohibitions on the creation of these images. It is based on the evidence heard at committee and on the reality that major pornography websites, many of which are headquartered in Canada, are continuing to allow this material to exist. To clarify, the fact that those images are on those websites means that we desperately need stronger legal tools to protect children and stronger legal tools to protect people who are victims of the non-consensual sharing of their images.

Further, in response to the recognition of the potential harms on children associated with exposure to pornography or associated with having images taken of them and published online, there has been discussion in Parliament and a number of different bills put forward designed to protect children in vulnerable situations. These bills are, most notably, Bill C-270 and Bill S-210.

Bill S-210 would protect children by requiring meaningful age verification for those who are viewing pornography. It is recognized that exposing children to sexual images is a form of child abuse. If an adult were to show videos or pictures of a sexual nature to a child, that would be considered child abuse. However, when websites fail to have meaningful age verification and, therefore, very young children are accessing pornography, there are not currently the legal tools to hold them accountable for that. We need to recognize that exposing young children to sexual images is a form of child abuse, and therefore it is an urgent matter that we pass legislation requiring meaningful age verification. That is Bill S-210.

Then we have Bill C-270, which would protect children in a different context. It would protect children from having their images depicted as part of child pornography. Bill C-270 takes those existing prohibitions further by requiring that those distributing images also have proof of age and consent.

This is common sense; the use of criminal law is appropriate here because we are talking about instances of child sexual abuse. Both Bill S-210 and Bill C-270 deal with child sexual abuse. It should be clear that the criminal law, not some complicated nebulous regulatory regime, is the appropriate mechanism for dealing with child abuse.

In that context, we also have a government bill that has been put forward, Bill C-63, which it calls the online harms act. The proposed bill is kind of a bizarre combination of talking about issues of radically different natures; there are some issues around speech, changes to human rights law and, potentially, attempts to protect children, as we have talked about.

The freedom of speech issues raised by the bill have been well discussed. The government has been denounced by a broad range of quarters, including some of its traditional supporters, for the failures of Bill C-63 on speech.

However, Bill C-63 also profoundly fails to be effective when it comes to child protection and the removal of non-consensual images. It would create a new bureaucratic structure, and it is based on a 24-hour takedown model; it says that if something is identified, it should be taken down within 24 hours. Anybody involved in this area will tell us that 24-hour takedown is totally ineffective, because once something is on the Internet, it is likely to be downloaded and reshared over and over again. The traumatization, the revictimization that happens, continues to happen in the face of a 24-hour takedown model.

This is why we need strong Criminal Code measures to protect children. The Conservative bills, Bill S-210 and Bill C-270, would provide the strong criminal tools to protect children without all the additional problems associated with Bill C-63. I encourage the House to pass these strong child protection bills that amend the Criminal Code, Bill S-210 and Bill C-270. They would protect children from child abuse, and given the legal vacuums that exist in this area, there can be no greater, more important objective than protecting children from the kind of violence and sexualization they are currently exposed to.