An Act to enact the Online Harms Act, to amend the Criminal Code, the Canadian Human Rights Act and An Act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service and to make consequential and related amendments to other Acts

Sponsor

Arif Virani (Liberal)

Status

Second reading (House), as of Sept. 23, 2024


Summary

This is from the published bill. The Library of Parliament has also written a full legislative summary of the bill.

Part 1 of this enactment enacts the Online Harms Act, whose purpose is to, among other things, promote the online safety of persons in Canada, reduce harms caused to persons in Canada as a result of harmful content online and ensure that the operators of social media services in respect of which that Act applies are transparent and accountable with respect to their duties under that Act.
That Act, among other things,
(a) establishes the Digital Safety Commission of Canada, whose mandate is to administer and enforce that Act, ensure that operators of social media services in respect of which that Act applies are transparent and accountable with respect to their duties under that Act and contribute to the development of standards with respect to online safety;
(b) creates the position of Digital Safety Ombudsperson of Canada, whose mandate is to provide support to users of social media services in respect of which that Act applies and advocate for the public interest in relation to online safety;
(c) establishes the Digital Safety Office of Canada, whose mandate is to support the Digital Safety Commission of Canada and the Digital Safety Ombudsperson of Canada in the fulfillment of their mandates;
(d) imposes on the operators of social media services in respect of which that Act applies
(i) a duty to act responsibly in respect of the services that they operate, including by implementing measures that are adequate to mitigate the risk that users will be exposed to harmful content on the services and submitting digital safety plans to the Digital Safety Commission of Canada,
(ii) a duty to protect children in respect of the services that they operate by integrating into the services design features that are provided for by regulations,
(iii) a duty to make content that sexually victimizes a child or revictimizes a survivor and intimate content communicated without consent inaccessible to persons in Canada in certain circumstances, and
(iv) a duty to keep all records that are necessary to determine whether they are complying with their duties under that Act;
(e) authorizes the Digital Safety Commission of Canada to accredit certain persons that conduct research or engage in education, advocacy or awareness activities that are related to that Act for the purposes of enabling those persons to have access to inventories of electronic data and to electronic data of the operators of social media services in respect of which that Act applies;
(f) provides that persons in Canada may make a complaint to the Digital Safety Commission of Canada that content on a social media service in respect of which that Act applies is content that sexually victimizes a child or revictimizes a survivor or intimate content communicated without consent and authorizes the Commission to make orders requiring the operators of those services to make that content inaccessible to persons in Canada;
(g) authorizes the Governor in Council to make regulations respecting the payment of charges by the operators of social media services in respect of which that Act applies, for the purpose of recovering costs incurred in relation to that Act.
Part 1 also makes consequential amendments to other Acts.
Part 2 amends the Criminal Code to, among other things,
(a) create a hate crime offence of committing an offence under that Act or any other Act of Parliament that is motivated by hatred based on certain factors;
(b) create a recognizance to keep the peace relating to hate propaganda and hate crime offences;
(c) define “hatred” for the purposes of the new offence and the hate propaganda offences; and
(d) increase the maximum sentences for the hate propaganda offences.
It also makes related amendments to other Acts.
Part 3 amends the Canadian Human Rights Act to provide that it is a discriminatory practice to communicate or cause to be communicated hate speech by means of the Internet or any other means of telecommunication in a context in which the hate speech is likely to foment detestation or vilification of an individual or group of individuals on the basis of a prohibited ground of discrimination. It authorizes the Canadian Human Rights Commission to deal with complaints alleging that discriminatory practice and authorizes the Canadian Human Rights Tribunal to inquire into such complaints and order remedies.
Part 4 amends An Act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service to, among other things,
(a) clarify the types of Internet services covered by that Act;
(b) simplify the mandatory notification process set out in section 3 by providing that all notifications be sent to a law enforcement body designated in the regulations;
(c) require that transmission data be provided with the mandatory notice in cases where the content is manifestly child pornography;
(d) extend the period of preservation of data related to an offence;
(e) extend the limitation period for the prosecution of an offence under that Act; and
(f) add certain regulation-making powers.
Part 5 contains a coordinating amendment.

Elsewhere

All sorts of information on this bill is available at LEGISinfo, an excellent resource from the Library of Parliament. You can also read the full text of the bill.

Department of Justice—Main Estimates, 2024-25 (Business of Supply, Government Orders)

May 23rd, 2024 / 7:40 p.m.



Arif Virani Liberal Parkdale—High Park, ON

Madam Chair, I have several answers to give on this matter. The big difference between the senator's bill and Bill C‑63 is that our bill had the benefit of a five-year consultation. That is the first thing.

The second thing is that, although we agree with some aspects, we want to work in close collaboration with the big digital companies to resolve the situation and protect the public and children from pornography. Taking down that information and content within a mandatory 24-hour period is a much stronger measure than what was proposed in the bill introduced by the senator.

The last thing is that we are targeting a situation where all harmful online content needs to be addressed. This concerns children, teenagers and adults. We want a big solution to a big problem. Australia started nine years ago with children only. Nine years later, protecting children only is no longer appropriate—


May 23rd, 2024 / 7:40 p.m.



Kristina Michaud Bloc Avignon—La Mitis—Matane—Matapédia, QC

Madam Chair, I politely beg to differ. I feel that Bill C‑63 is extremely important, but it is not exactly the same thing. Yes, it contains elements that make it possible to regulate certain types of content or, at least, to warn users before they consume it, but there is nothing that really makes it possible to verify the consumer's age.

I would therefore advise the government to support a bill like Bill S‑210. Obviously, it is not easy to implement this type of safeguard, and other countries are currently looking at that. However, it is an extremely important bill.

To return to Bill C‑63, would the minister agree that the first part of the bill could be split from the rest so that the Digital Safety Commission could be created as quickly as possible? That would enable us to protect female victims of intimate content communicated without consent, including deepfakes.


May 23rd, 2024 / 7:40 p.m.



Arif Virani Liberal Parkdale—High Park, ON

Madam Chair, with all due respect, I want to correct the member opposite.

First, Bill C‑63 deals mainly with types of content that are appropriate for children. Second, it addresses the obligation to protect children. There is also a provision of Bill C‑63 that talks about age-appropriate design features.

We are targeting the same problem. We want to work with social media platforms to resolve this situation in a way that will enable us to protect people's privacy and personal information and protect children.


May 23rd, 2024 / 7:40 p.m.



Kristina Michaud Bloc Avignon—La Mitis—Matane—Matapédia, QC

Madam Chair, I think that the minister is well aware that those are two completely different missions. Both are commendable.

Bill C‑63 has its good points, but Bill S‑210 really seeks to check the age of pornography users to limit young people's access to it. The Liberal Party seems to disagree with this bill, and yet other countries, like Germany, France and the United Kingdom, as well as some states in the U.S., are looking into this way of verifying the age of users.

Why does Canada not want to move forward in this way to limit the access of children under the age of 18 to pornography?


May 23rd, 2024 / 7:40 p.m.



Arif Virani Liberal Parkdale—High Park, ON

Madam Chair, that is a great question, but I believe that the senator's bill, Bill S‑210, addresses only one aspect of our broader bill, C‑63.

Protecting children from pornography and sexual predators is a priority for both me and the senator. However, we have different ways of tackling the problem. We are dealing with a much bigger and broader problem in our own Bill C-63. We are also different when it comes to the mandates and the modus operandi that the senator proposes to use.

We are concerned about how to verify someone's age. Does it have to be a piece of government-issued ID? Will this cause other problems or lead to the possibility of other crimes, such as financial fraud, at the international level?


May 23rd, 2024 / 7:35 p.m.



Kristina Michaud Bloc Avignon—La Mitis—Matane—Matapédia, QC

Madam Chair, I would point out to the minister that he does not want to give Quebec an exemption from the Criminal Code, but he is giving one to British Columbia. In my view, the same should be possible for the people in this situation in Quebec.

Now, I would like to hear his comments on all the issues related to child pornography, children's access to pornography and the sharing of non-consensual content. To my eyes, the purpose of Bill S‑210, which was introduced by Senator Julie Miville‑Dechêne and which seeks to prevent minors from accessing pornography, is completely different from the purpose of Bill C‑63, which the minister introduced and which seeks to protect the public from harmful content streamed on social media, such as intimate content communicated without consent and content that sexually victimizes a child.

Does he agree with me that these two bills have completely different purposes?


May 23rd, 2024 / 7:15 p.m.


Arif Virani Liberal Parkdale—High Park, ON, Minister of Justice and Attorney General of Canada

Mr. Speaker, I will be providing 10 minutes of remarks, and I will be welcoming questions from my parliamentary secretary, the member for Etobicoke—Lakeshore. I will be using my time to discuss measures in the recent budget to combat crime, especially auto theft and money laundering. I will also touch on legal aid investments and provide an update on our work on online safety.

Auto theft is a serious problem that affects communities across the country. Not only does it affect people's wallets, it also causes them to feel unsafe. The number of these thefts has risen and, in some areas, they are growing more violent. These criminals are increasingly emboldened. Our government is committed to ensuring that police and prosecutors have the tools they need to respond to cases of auto theft, including thefts related to organized crime.

We also want to ensure that the legislation provides courts with the wherewithal to impose sentences commensurate with the seriousness of the crime. The Criminal Code already contains useful provisions for fighting auto theft, but we can do more.

This is why we are amending the Criminal Code to provide additional measures for law enforcement and for prosecutors to address auto theft. Bill C-69, the budget implementation act, sets out these proposed measures. These amendments would include new offences targeting auto theft and its links to violence and organized crime; new offences for possession and distribution of a device used for committing auto theft, such as key-programming machines; and a new offence for laundering proceeds of crime for the benefit of, at the direction of, or in association with, a criminal organization. We are proposing a new aggravating factor at sentencing, which would be applied to an adult offender who involves a young person in the commission of the crime. These changes are part of the larger federal action plan on combatting auto theft that was just released on May 20.

Auto theft is a complex crime, and fighting it involves many partners: the federal, provincial, territorial and municipal governments, industry leaders and law enforcement agencies.

I will now turn to the related issue of money laundering. Addressing money laundering will help us to combat organized crime, including its involvement in automobile theft. However, the challenges associated with money laundering and organized crime go beyond auto theft.

That is why we are continually reviewing our laws so that Canada can better combat money laundering, organized crime and terrorist activity financing.

Bill C-69 would give us more tools to combat money laundering and terrorist financing. These new measures would allow courts to issue an order that requires a person to keep an account open to assist in the investigation of a suspected criminal offence. Currently, financial service providers often unilaterally close accounts where they suspect criminal activity, which can actually hinder police investigations. This new proposed order would help in that regard.

I hope to see non-partisan support from all parties, including the official opposition, on these measures to address organized crime. It would be nice to see its members support something, rather than simply use empty slogans or block actual solutions. We see this as well in their efforts to block Bill C-59, the fall economic statement, which has been in this chamber for literally months. That also contains a range of measures to combat money laundering, which have been asked for by law enforcement. For a party that prides itself on having a close relationship with law enforcement, I find this obstruction puzzling.

What is more, under Bill C-69, the courts will also be authorized to make an order for the production of documents for specific dates thanks to a repetitive production order. That will enable law enforcement to require a person to provide specific information to support a criminal investigation on several pre-determined dates over a defined period.

These two proposals resulted from the public consultations that our government held last summer. We are committed to getting Bill C-69 passed by Parliament in a timely manner so that the new measures can be put in place as quickly as possible and so that we can crack down on these serious crimes as soon as possible.

I would now like to discuss our investments in legal aid. Just as we need to protect Canadians from crime, we also need to ensure that people have equitable access to justice, which is an integral part of a fair and just society, and a strong legal aid system is a key aspect of this. It strengthens the overall justice system. Budget 2024 includes measures to increase funding to criminal legal aid as well as legal aid for immigrants and for refugees to Canada.

For criminal legal aid, budget 2024 provides $440 million over five years, starting in 2024-25. This would support access to justice for Canadians who are unable to pay for legal support, in particular, indigenous people, individuals who are Black and other racialized communities who are overrepresented in the criminal justice system. Indeed, legal representation helps to clear backlogs and delays in our court system as well.

This essential work is only possible with continued collaboration between federal, provincial and territorial governments. The proposed increase to the federal contribution will assist provinces and territories to take further actions to increase access to justice. This legal aid will help with the backlogs I just mentioned. Unrepresented and poorly represented litigants cause delays in our justice system. Making sure that these individuals have proper support and representation will help ensure access to a speedy trial. This, in combination with our unprecedented pace of judicial appointments, 106 appointments in my first nine months in office, will also address backlogs. In comparison, the previous Harper government would appoint 65 judges per year on average. I exceeded that amount in six months.

For immigration and refugee legal aid, budget 2024 would provide $273.7 million over five years, starting in 2024-25, and $43.5 million per year ongoing after that. This funding would help support access to justice for economically disadvantaged asylum seekers and others involved in immigration proceedings. This investment would help maintain the confidence of Canadians in the government's ability to manage immigration levels, and to resettle and integrate refugees into Canadian society. To do this very important work, Justice Canada continues to collaborate with provincial governments and with legal aid service providers, as well as Immigration, Refugees and Citizenship Canada. Together, we are exploring solutions to support sustainable access to immigration and refugee legal aid services.

Before I conclude, I would like to talk a little about Bill C-63, which was raised by the member for Fundy Royal. The bill addresses online harms and the safety of our communities online. Much has already been said about this very important legislation, which would create stronger protections for children online and better safeguards for everyone in Canada from online hate and other types of harmful content. What is critical about this bill is that it is dedicated to promoting people's participation online and not to limiting it.

This legislation is informed by what we have heard over five-plus years of consultations with diverse stakeholders, community groups, law enforcement and other Canadians. This bill focuses on the baseline responsibilities of social media platforms to manage the content they are hosting and their duty to keep children safe, which means removing certain types of harmful content and entrenching a duty to act responsibly.

This bill is about keeping Canadians safe, which is my fundamental priority and my fundamental duty as the Minister of Justice and Attorney General of this country. It is about ensuring that there is actually a takedown requirement on the two most harmful types of material: child pornography and the non-consensual sharing of intimate images, also known as revenge pornography.

There are five other categories of material that would be dealt with under this bill, including material that incites violence, incitement to terrorism, hatred as defined by the Supreme Court of Canada, bullying a child and inducing a child to self-harm. I am speaking now not only as the Minister of Justice but also as a father. I think that there is nothing more basic in this country for any parent or parliamentarian than keeping our children safe.

I am thankful for the opportunity to speak about how we are making Canada safer and making our justice system stronger, more accessible and more inclusive for all people.


May 23rd, 2024 / 7:05 p.m.



Arif Virani Liberal Parkdale—High Park, ON

Mr. Speaker, I find this line of questioning quite fascinating, given that the main charter issue in Bill C-63 deals with very sensitive questions about freedom of speech, which is protected under section 2(b).

What I will do is always maintain my oath under the Constitution to uphold the Constitution and people's charter rights. This individual works under a leader who has brandished the idea of using the notwithstanding clause to deprive people of their charter rights. Section 2(b) is subject to the notwithstanding clause.

If we are talking about who is actually committed to protecting people's freedoms, including freedom of speech, people on that side of the House should be looking at themselves in the mirror.


May 23rd, 2024 / 7:05 p.m.



Rob Moore Conservative Fundy Royal, NB

Mr. Speaker, I notice once again that I have given the minister a lot of opportunities, and he has not answered any of my questions directly.

He knows the answer to this one, and he is not going to give it, so I will have to give it on his behalf. The Victoria Police Department statement says, “Bill C-75, which came into effect nationally in 2019, legislated a 'principle of restraint' that requires police to release an accused person at the earliest possible opportunity”.

The police laid the blame for this individual being released three times in a row to revictimize Canadians squarely at the feet of the minister. A woman was injured in the process of one of the thefts.

On the issue of the Liberals' draconian Bill C-63, which Margaret Atwood has described as “Orwellian”, has he completed a charter statement for this bill that clearly threatens the rights of Canadians?

Philippe Dufresne Privacy Commissioner of Canada, Offices of the Information and Privacy Commissioners of Canada

Thank you, Mr. Chair.

Members of the committee, I'm pleased to be here today to discuss the Office of the Privacy Commissioner of Canada's main estimates for fiscal year 2024-25 and to describe the work of my office to protect and promote the fundamental right to privacy of Canadians. I'm accompanied by Richard Roulx, deputy commissioner, corporate management sector.

In January I launched a strategic plan that lays out three key priorities that will guide the work of the OPC through 2027. The first is protecting and promoting privacy with maximum impact, by using business intelligence to identify trends that need attention, producing focused guidance and outreach, leveraging strategic partnerships and preparing for the implementation of potentially new privacy legislation.

The second is addressing and advocating for privacy in this time of technological change, with a focus on artificial intelligence and generative AI, the proliferation of which brings both potential benefits and increased risks to privacy.

The third is championing children's privacy rights to ensure that their unique privacy needs are met and that they can exercise their rights.

I believe that these three priorities are where the Office of the Privacy Commissioner can have the greatest impact for Canadians, and that these are also where the greatest risks lie if the issues are not addressed.

Protecting privacy is one of the paramount challenges of our time. My office is poised to meet this challenge through strong advocacy, collaboration, partnerships, education, promotion, enforcement and capacity building, which includes doing more to identify and address privacy trends in a timely way.

Investigations under the Privacy Act, which covers the personal information-handling practices of federal government departments and agencies, and the Personal Information Protection and Electronic Documents Act, Canada’s federal private sector privacy law, are a key aspect of the Office of the Privacy Commissioner’s work on issues that significantly impact the lives of Canadians.

In February I made public the results of my investigation into Aylo, the operator of the website Pornhub and other pornographic websites. I found that the company had contravened Canada's federal private sector privacy law by enabling intimate images to be shared on its websites without the direct knowledge and consent of everyone who is depicted.

In releasing my report on this investigation, I reiterated that the non-consensual sharing of intimate images is a serious privacy violation that can cause severe harms to victims, and that organizations have an obligation under privacy law to prevent and remedy this.

This case is also relevant to the discussions that will be taking place on Bill C-63, and I will welcome the opportunity to share my views on the online harms act with parliamentarians.

I also look forward to sharing in the coming months the findings of two high-profile investigations that are closely tied to two of my strategic priorities—protecting children’s privacy and addressing the privacy impacts of emerging technology, including AI.

When I appeared before you last year on Main Estimates, I spoke about the launch of investigations into TikTok, as well as OpenAI, the company behind the AI-driven text-generation chatbot ChatGPT. Both investigations are being conducted jointly with my counterparts in Quebec, British Columbia and Alberta.

In the case of the TikTok investigation, the four offices are examining whether the practices of the company ByteDance comply with Canadian federal and provincial privacy legislation and, in particular, whether valid and meaningful consent is being obtained for the collection, use, and disclosure of personal information.

Given the importance of protecting children's privacy, the joint investigation has a particular focus on TikTok's privacy practices as they relate to younger users.

The investigation into OpenAI and its ChatGPT chat bot is examining whether the company is compliant with requirements under Canadian privacy law in relation to consent, openness, access, accuracy and accountability. It is also considering whether the collection, use and disclosure are done for an appropriate purpose.

Both investigations remain a high priority and we are working to complete them in a timely manner.

Protecting and promoting privacy with maximum impact remains integral to fulfilling my current mandate and preparing for potential changes to federal privacy law.

In the 2023 budget we received temporary funding to address pressures related to privacy breaches and a complaints backlog, as well as to prepare for the implementation of Bill C-27. While these temporary funds provide necessary and immediate support, it is essential that my office be properly resourced on a permanent basis to deal with the increasing complexity of today's privacy landscape and the associated demands on my office's resources.

To address this, we will continue to present fiscally responsible funding requests and will also aim to maximize agility and cost-effectiveness by assessing and streamlining program and service delivery.

With that, I would be happy to answer your questions. Thank you.

Stopping Internet Sexual Exploitation Act (Private Members' Business)

May 7th, 2024 / 6:45 p.m.



Arnold Viersen Conservative Peace River—Westlock, AB

Mr. Speaker, I am grateful for the opportunity to wrap up the debate on the SISE act at second reading.

I have appreciated listening to the members give their speeches. At the outset, I want to briefly urge members to use the term “child sexual abuse material”, or CSAM, rather than “child pornography”. As we heard from the member for Kamloops—Thompson—Cariboo, the latter term is being replaced with CSAM because pornography allows for the idea that this could be consensual. That is why the member for Kamloops—Thompson—Cariboo has put forward a bill that would change this in the Criminal Code as well.

During the first hour of debate, we heard from the member for Laurentides—Labelle, who gave a passionate speech outlining the many serious issues of the impact of the pornography industry on women and youth. I simply do not have the time to include all of that in my speech, but we both sat on the ethics committee during the Pornhub study and heard directly from the survivors who testified.

It was the speech, however, from the Parliamentary Secretary to the Leader of the Government in the House of Commons that left me scratching my head. I do not think he actually read Bill C-270 or even the Liberals' own bill, Bill C-63. The parliamentary secretary fixated on the 24-hour takedown requirement in Bill C-63 as the solution to this issue. However, I do not think anyone is opposed to a 24-hour takedown for intimate content shared without consent or for child sexual abuse material. In fact, a bill that was solely focused on the 24-hour takedown would pass very quickly through this House with the support of everyone, but that does not take into account what Bill C-270 is trying to do. It is completely missing the point.

The 24-hour takedown has effect only after harmful content has been put up, such as CSAM, deepfakes and intimate images that have been shared. Bill C-270 is a preventative upstream approach. While the takedown mechanism should be available to victims, the goal of Bill C-270 is to go upstream and stop this abusive content from ever ending up on the Internet in the first place.

As I shared at the beginning of the debate, many survivors do not know that their images are online for years. They do not know that this exploitative content has been uploaded. What good would a 24-hour takedown be if they do not even know the content is there? I will repeat the words of one survivor that I shared during the first hour of debate: “I was 17 when videos of me on Pornhub came to my knowledge, and I was only 15 in the videos they've been profiting from.” She did not know for two years that exploitative content of her was being circulated online and sold. That is why Bill C-270 requires age verification and consent of individuals in pornographic material before it is posted.

I would also point out that the primary focus of the government's bill is not to reduce harm to victims. The government's bill requires services “to mitigate the risk that users of the regulated service will be exposed to harmful content”. It talks about users of the platform, not the folks depicted in it. The focus of Bill C-270 is the other side of the screen. Bill C-270 seeks to protect survivors and vulnerable populations from being the harmful content. The two goals could not be more different, and I hope the government is supportive of preventing victims of exploitation from further exploitation online.

My colleague from Esquimalt—Saanich—Sooke also noted that the narrow focus of the SISE act is targeted at people and companies that profit from sexual exploitative content. This is, indeed, one of the primary aims of this bill. I hope, as with many things, that the spread of this exploitative content online will be diminished, as it is driven by profit. The Privacy Commissioner's investigation into Canada's MindGeek found that “MindGeek surely benefits commercially from these non-compliant privacy practices, which result in a larger content volume/stream and library of intimate content on its websites.”

For years, pornography companies have simply turned a blind eye, and it is time to end that. Bill C-270 fulfills a key recommendation made by the ethics committee three years ago and supported by all parties, including the government. I hope to have the support of all of my colleagues in this place for Bill C-270, and I hope to see it at committee, where we can hear from survivors and experts.

Stopping Internet Sexual Exploitation Act (Private Members' Business)

May 7th, 2024 / 6:40 p.m.



Conservative

Garnett Genuis Conservative Sherwood Park—Fort Saskatchewan, AB

Mr. Speaker, I appreciate the opportunity to say a few words in support of Bill C-270, an excellent bill from my colleague from Peace River—Westlock. Over his nine years in Parliament, he has worked hard to defend the interests of his constituents on important issues like firearms, forestry and fracking, and to stand up for justice and the recognition of the universal human dignity of all people, including and especially the most vulnerable.

Bill C-270 seeks to create mechanisms for the effective enforcement of legal provisions, already existing in substance, that prohibit the non-consensual distribution of intimate images and child pornography. Right now, as the law stands, it is a criminal offence to produce this type of horrific material, but the appropriate legal mechanisms do not exist to prevent its distribution by, for instance, large pornography websites.

It has come to light that Pornhub, which is headquartered in Canada, has completely failed to prevent non-consensual and child-depicting pornographic images from appearing on its platform. This matter has been studied in great detail at parliamentary committees. My colleague from Peace River—Westlock has played a central role, as have members from other parties, in identifying that Pornhub and other websites have not only failed to meaningfully protect potential victims of non-consensual and child pornographic images but have shown no interest in doing so.

It is already illegal to produce these images. Why, therefore, should it not also be clearly illegal to distribute those images without having the necessary proof of consent? This bill would require verification of age and consent for images that are distributed. It is a common-sense legal change that would effect greater compliance with existing criminal prohibitions on the creation of these images. It is based on the evidence heard at committee and on the reality that major pornography websites, many of which are headquartered in Canada, continue to allow this material to exist. To be clear, the fact that those images are on those websites means that we desperately need stronger legal tools to protect children and to protect people who are victims of the non-consensual sharing of their images.

Further, in response to the recognition of the potential harms on children associated with exposure to pornography or associated with having images taken of them and published online, there has been discussion in Parliament and a number of different bills put forward designed to protect children in vulnerable situations. These bills are, most notably, Bill C-270 and Bill S-210.

Bill S-210 would protect children by requiring meaningful age verification for those who are viewing pornography. It is recognized that exposing children to sexual images is a form of child abuse. If an adult were to show videos or pictures to a child of a sexual nature, that would be considered child abuse. However, when websites fail to have meaningful age verification and, therefore, very young children are accessing pornography, there are not currently the legal tools to hold them accountable for that. We need to recognize that exposing young children to sexual images is a form of child abuse, and therefore it is an urgent matter that we pass legislation requiring meaningful age verification. That is Bill S-210.

Then we have Bill C-270, which would protect children in a different context. It would protect children from having their images depicted as part of child pornography. Bill C-270 takes those existing prohibitions further by requiring that those distributing images also have proof of age and consent.

This is common sense; the use of criminal law is appropriate here because we are talking about instances of child sexual abuse. Both Bill S-210 and Bill C-270 deal with child sexual abuse. It should be clear that the criminal law, not some complicated nebulous regulatory regime, is the appropriate mechanism for dealing with child abuse.

In that context, we also have a government bill that has been put forward, Bill C-63, which it calls the online harms act. The proposed bill is a bizarre combination of issues of radically different natures: some issues around speech, changes to human rights law and, potentially, attempts to protect children, as we have talked about.

The freedom of speech issues raised by the bill have been well discussed. The government has been denounced from a broad range of quarters, including some of its traditional supporters, for the failures of Bill C-63 on speech.

However, Bill C-63 also profoundly fails to be effective when it comes to child protection and the removal of non-consensual images. It would create a new bureaucratic structure, and it is based on a 24-hour takedown model; it says that if something is identified, it should be taken down within 24 hours. Anybody involved in this area will tell us that 24-hour takedown is totally ineffective, because once something is on the Internet, it is likely to be downloaded and reshared over and over again. The traumatization, the revictimization that happens, continues to happen in the face of a 24-hour takedown model.

This is why we need strong Criminal Code measures to protect children. The Conservative bills, Bill S-210 and Bill C-270, would provide the strong criminal tools to protect children without all the additional problems associated with Bill C-63. I encourage the House to pass these proposed strong child protection Criminal Code-amending bills, Bill S-210 and Bill C-270. They would protect children from child abuse, and given the legal vacuums that exist in this area, there can be no greater, more important objective than protecting children from the kind of violence and sexualization they are currently exposed to.

Stopping Internet Sexual Exploitation Act (Private Members' Business)

May 7th, 2024 / 6:30 p.m.



Bloc

Julie Vignola Bloc Beauport—Limoilou, QC

Mr. Speaker, the subject that we are dealing with this evening is a sensitive one. My colleagues have clearly demonstrated that in the last couple of minutes.

We all have access to the Internet and we basically use it for three reasons: for personal reasons, for professional reasons and for leisure, which can sometimes overlap with personal reasons. Pornography is one of those uses that is both for leisure and for personal reasons. To each their own.

The use of pornography is a personal choice that is not illegal. Some people might question that. We might agree or disagree, but it is a personal decision. However, the choice that one person makes for their own pleasure may be the cause of another person's, or many other people's, nightmare. That, essentially, is what Bill C-270 seeks to prevent and to sanction. The purpose of the bill is to ensure that people do not have to go through hell because of pornography. It seeks to criminalize cases in which, under the guise of legality, images being viewed were taken or are being used illegally.

I want to talk briefly about the problem this bill addresses and the solutions that it proposes. Then, to wrap up, I will share some of my own thoughts about it.

For context, this bill and the two others being studied, Bill S‑210 and Bill C‑63, trace back to a newspaper article that sounded the alarm. After the article came out, a House of Commons committee that my esteemed colleague from Laurentides—Labelle sits on looked at the issue. At that time, the media informed the public that videos of women and children were available on websites even though these women and, naturally, these children had never consented to being filmed or to having their videos shared. We also learned that this included youths under 18. As I said, a committee looked at the issue. The images and testimony the committee members received were so shocking that the several bills I mentioned earlier were introduced to try to tackle the issue in whole or in part.

I want to be clear: watching pornography is not the problem—to each their own. If someone likes watching others have sex, that is none of my concern or anyone else's. However, the problem is the lack of consent of the people involved in the video and the use of children, as I have already said.

I am sure that the vast majority of consumers of pornography were horrified to find out that some of the videos they watched may have involved young people under the age of 18. These children sometimes wear makeup to look older. Women could be filmed without their knowledge by a partner or former partner, who then released the video. These are intimate interactions. People have forgotten what intimacy means. If a person agrees to be filmed in an intimate situation because it is kind of exciting or whatever, that is fine, but intimacy, as the word itself implies, does not mean public.

When a young person or an adult decides to show the video to friends to prove how cool it is that they got someone else to do something, that is degrading. It is beyond the pale. It gets to me because I saw that kind of thing in schools. Kids were so pleased with themselves. I am sorry, but it is rarely the girls who are so pleased with themselves. They are the ones who suffer the negative consequences. At the end of the day, they are the ones who get dragged through the mud. Porn sites were no better. They tried to absolve themselves by saying that they just broadcast the stuff and it is not up to them to find out if the person consented or was at least 18. Broadcasting is just as bad as producing without consent. It encourages these illegal, degrading, utterly dehumanizing acts.

I am going back to my notes now. The problem is that everyone is blaming everyone else. The producer says it is fine. The platform says it is fine. Ultimately, governments say the same thing. This is 2024. The Internet is not new. Man being man—and I am talking about humankind, humans in general—we were bound to find ourselves in degrading situations. The government waited far too long to legislate on this issue.

In fact, the committee that looked into the matter could only observe the failure of content moderation practices, as well as the failure to protect people's privacy. Even if the video was taken down, it would resurface because a consumer had downloaded it and thought it was a good idea to upload it again and watch it again. This is unspeakable. It seems to me that people need to use some brain cells. If a video can no longer be found, perhaps there is a reason for that, and the video should not be uploaded again. Thinking and using one's head is not something governments can control, but we have to do everything we can.

What is the purpose of this bill and the other two bills? We want to fight against all forms of sexual exploitation and violence online, end the streaming and marketing of all pornographic material involving minors, prevent and prohibit the streaming of non-consensual explicit content, force adult content companies and streaming services to control the streaming of this content and make them accountable and criminally responsible for the presence of this content on their online sites. Enough with shirking responsibility. Enough with saying: it is not my fault if she feels degraded, if her reputation is ruined and if, at the end of the day, she feels like throwing herself off a bridge. Yes, the person who distributes pornographic material and the person who makes it are equally responsible.

Bill C‑270 defines the word “consent” and the expression “pornographic material”, which is good. It adds two new penalties. Essentially, a person who makes or distributes the material must ensure that the person involved in the video is 18 and has given their express consent. If the distributor does not ask for it and does not require it, they are at fault.

We must also think about some of the terms, such as “privacy”, “education”, but also the definition of “distributor” because Bill C-270 focuses primarily on distributors for commercial purposes. However, there are other distributors who are not in this for commercial purposes. That is not nearly as pretty. I believe we need to think about that aspect. Perhaps legal consumers of pornography would like to see their rights protected.

I will end with just one sentence: A real statesperson protects the dignity of the weak. That is our role.

Stopping Internet Sexual Exploitation Act (Private Members' Business)

May 7th, 2024 / 6:20 p.m.



Etobicoke—Lakeshore Ontario

Liberal

James Maloney Liberal, Parliamentary Secretary to the Minister of Justice and Attorney General of Canada

Mr. Speaker, I am very pleased to speak to Bill C-270, an act to amend the Criminal Code (pornographic material), at second reading.

I would like to begin my remarks by stressing the bill's important objective. It is to ensure that those who make, distribute or advertise pornographic material verify that those depicted in that material are at least 18 years of age and have consented to its production and distribution.

As the sponsor has explained, the bill's objective is to implement recommendation number two of the 2021 report of the House of Commons Standing Committee on Access to Information, Privacy and Ethics. Specifically, that report recommends that the government “mandate that content-hosting platforms operating in Canada require affirmation from all persons depicted in pornographic content, before it can be uploaded, that they are 18 years old or older and that they consent to its distribution”.

This recommendation responds to ongoing concerns that corporations like Pornhub have made available pornographic images of persons who did not consent or were underage. I want to recognize and acknowledge that this conduct has caused those depicted in that material extreme suffering. I agree that we must do everything we can to protect those who have been subjected to this trauma and to prevent it from occurring in the first place. I fully support the objective of the committee's recommendation.

I want to say at the outset that the government will be supporting this bill, Bill C-270, at second reading, but with some serious reservations. I have concerns about the bill's ability to achieve the objective of the committee's recommendation. I look forward to committee, where we can hear from experts on whether this bill would be useful in combatting child pornography.

The bill proposes Criminal Code offences that would prohibit making, distributing or advertising pornographic material, without first verifying the age and consent of those depicted by examining legal documentation and securing formal written consent. These offences would not just apply to corporations. They would also apply to individuals who make or distribute pornographic material of themselves and others to generate income, a practice that is legal and that we know has increased in recent years due to financial hardship, including that caused by the pandemic.

Individuals who informally make or distribute pornographic material of themselves and of people they know are unlikely to verify age by examining legal documentation, especially if they already know the age of those participating in the creation of the material. They are also unlikely to secure formal written consent. It concerns me that such people would be criminalized by the bill's proposed offences, where they knew that everyone implicated was consenting and of age, merely because they did not comply with the bill's proposed regulatory regime governing how age and consent must be verified.

Who is most likely to engage in this conduct? The marginalized people who have been most impacted by the pandemic, in particular sex workers, who are disproportionately women and members of the 2SLGBTQI+ communities. Notably, the privacy and ethics committee clearly stated that its goal was “in no way to challenge the legality of pornography involving consenting adults or to negatively impact sex workers.” However, I fear that the bill's proposed reforms could very well have this effect.

I am also concerned that this approach is not consistent with the basic principles of criminal law. Such principles require criminal offences to have a fault or a mental element, for example, that the accused knew or was reckless as to whether those depicted in the pornographic material did not consent or were not of age. This concern is exacerbated by the fact that the bill would place the burden on the accused to establish that they took the necessary steps to verify age and consent to avoid criminal liability. However, basic principles of criminal law specify that persons accused of criminal offences need only raise a reasonable doubt as to whether they committed the offence to avoid criminal liability.

I would also note that the committee did not specifically contemplate a criminal law response to its concerns. In fact, a regulatory response that applies to corporations that make, distribute or advertise pornographic material may be better positioned to achieve the objectives of the bill. For example, our government's bill, Bill C-63, which would enact the online harms act, would achieve many of Bill C-270's objectives. In particular, the online harms act would target seven types of harmful content, including content that sexually victimizes a child or revictimizes a survivor, and intimate content communicated without consent.

Social media services would be subjected to three duties: to act responsibly, to protect children and to make content inaccessible that sexually victimizes a child or revictimizes a survivor, as well as intimate images posted without consent.

These duties would apply to social media services, including livestreaming and user-uploaded adult content services. They would require social media services to actively reduce the risk of exposure to harmful content on their services; provide clear and accessible ways to flag harmful content and block users; put in place special protections for children; take action to address child sexual exploitation and the non-consensual posting of intimate content, including deepfake sexual images; and publish transparency reports.

Bill C-63 would also create a new digital safety commission to administer this regulatory framework and to improve the investigation of child pornography cases through amendments to the Mandatory Reporting Act. That act requires Internet service providers to report to police when they have reasonable grounds to believe their service is being used to commit a child pornography offence. Failure to comply with this obligation can result in severe penalties.

As I know we are all aware, the Criminal Code also covers a range of offences that address aspects of the concerns animating the proposed bill. Of course, making and distributing child pornography are both already offences under the Criminal Code. As well, making pornography without the depicted person's knowledge can constitute voyeurism, and filming or distributing a recording of a sexual assault constitutes obscenity. Also, distributing intimate images without the consent of the person depicted in those images constitutes non-consensual distribution of intimate images, and the Criminal Code authorizes courts to order the takedown or removal of non-consensual intimate images and child pornography.

All these offences apply to both individuals and organizations, including corporations, as set out in section 2 of the Criminal Code. Should parliamentarians choose to pursue a criminal response to the concerns the proposed bill seeks to address, we may want to reflect upon whether the bill's objectives should be construed differently and its provisions amended accordingly.

I look forward to further studying such an important bill at committee.

Stopping Internet Sexual Exploitation Act (Private Members' Business)

May 7th, 2024 / 5:50 p.m.



Bloc

Andréanne Larouche Bloc Shefford, QC

Mr. Speaker, as the member for Shefford and the Bloc Québécois critic for the status of women, I want to say that we support Bill C-270 in principle. We would like to examine this bill in committee. The Bloc Québécois fully supports the bill's stated objective, which is to combat child pornography and the distribution and commercialization of non-consensual pornography.

Since the first warning about the tragedy of women and girls whose sexual exploitation is the source of profits for major online porn companies, the Bloc Québécois has been involved at every stage and at all times in the public process to expose the extent of this public problem, which goes to our core values, including the right to dignity, safety and equality.

On this subject of online sexual exploitation, as on all facets and forms of the sexual exploitation of women, we want to stand as allies not only of the victims, but also of all the women who are taking action to combat violence and exploitation. I will begin by giving a little background on the topic, then I will explain the bill and, in closing, I will expand on some of the other problems that exist in Canada.

First, let us not forget that the public was alerted to the presence of non-consensual and child pornography by an article published in the New York Times on December 4, 2020. The article reported the poignant story of 14-year-old Serena K. Fleites. Explicit videos of her were posted on the website Pornhub without her consent.

This Parliament has already heard the devastating, distressing and appalling testimony of young Serena, which helped us understand the sensitive nature and gravity of the issue, but also the perverse mechanisms that porn streaming platforms use to get rich by exploiting the flaws of a technological system that, far from successfully controlling the content that is broadcast, is built and designed to promote and yet conceal the criminal practices of sexual exploitation.

Reports regarding the presence of child sexual abuse material and other non-consensual content on the adult platform Pornhub led the Standing Committee on Access to Information, Privacy and Ethics to undertake a study on the protection of privacy and reputation on online platforms such as Pornhub. My colleague from Laurentides—Labelle has followed this issue closely.

The committee noted that these platforms' content moderation practices had failed to protect privacy and reputation and had failed to prevent child sexual abuse material from being uploaded, despite statements by representatives of MindGeek and Pornhub who testified before the committee.

That same committee looked at regulating adult sites and online pornography, without challenging the legality. The committee heard testimony from survivors, critics of MindGeek's practices, child protection organizations, members of law enforcement, the federal government, academics, experts and support organizations, and it received many briefs.

The Standing Committee on Access to Information, Privacy and Ethics made 14 recommendations regarding the problems it had studied. The committee's 2021 report was clear and it recommended that the government introduce a bill to create a new regulator to ensure that online platforms remove harmful content, including depictions of child sexual exploitation and non-consensual images.

We know that sexually explicit content is being uploaded to Pornhub without the consent of the individuals involved, including minors, and that these individuals have tried and failed to get Pornhub to remove that content. We know that these survivors have been traumatized and harassed and that most of them have thought about suicide. That is the type of testimony that we heard at the Standing Committee on the Status of Women with regard to cases of sexual exploitation.

We know that even if content is finally removed, users just re-upload it shortly afterward. We know that the corporate structure of MindGeek, which was renamed Aylo last August, is the quintessential model for avoiding accountability, transparency and liability. We know that investigations are under way and that there has been a surge in online child sexual exploitation reports.

We must now legislate to respond to these crimes and deal with these problems. We also need to keep in mind the magnitude of the criminal allegations and the misconduct of which these companies are accused. Just recently, a new class action lawsuit was filed in the United States against MindGeek and many of the sites it owns, including Pornhub, over allegations of sex trafficking involving tens of thousands of children.

Let us not forget that these companies are headquartered right in Montreal. The fact that our country is home to mafia-style companies that profit from sexual exploitation is nothing to be proud of. The international community is well aware of this, and it reflects poorly on us. For these reasons, we have an additional obligation to take action, to find solutions that will put an end to sexual exploitation, and to implement those solutions through legislation.

With that in mind, we must use the following questions to guide our thinking. Are legislative proposals on this subject putting forward the right solutions? Will they be effective at controlling online sexual exploitation and, specifically, preventing the distribution of non-consensual content and pornographic content involving minors?

Second, let us talk a little more about Bill C‑270. This bill forces producers of pornographic material to obtain the consent of individuals and to ensure that they are of age. In addition, distributors will have to obtain written confirmation from producers that the individuals' consent has been obtained and that they are of age before the material is distributed. These new Criminal Code provisions will require large platforms and producers to have a process for verifying individuals' age and consent, without which they will be subject to fines or imprisonment.

The House will be considering two bills simultaneously. The first is Bill C-270, from the member for Peace River—Westlock, with whom I co-chair the All-Party Parliamentary Group to End Modern Slavery and Human Trafficking. The second is Bill C-63, introduced by the Minister of Justice, which also enacts new online harms legislation and aims to combat the sexual victimization of children and to make intimate content communicated without consent inaccessible.

We will need to achieve our goals, which are to combat all forms of online sexual exploitation and violence, stop the distribution and marketing of all pornographic material involving minors, prevent and prohibit the distribution of explicit non-consensual content, force adult content companies and platforms to control the distribution of such content, and make them accountable and criminally responsible for the presence of such content on their online platforms.

There is a debate about the law's ability to make platforms accountable for hosted content. The bill also raises questions about the relevance of self-regulation in the pornography industry.

Third, let us talk about what we can do here. Due to the high volume of complaints it receives, the RCMP often reacts to matters relating to child sexual abuse material, or CSAM, rather than acting proactively to prevent them. Canada's criminal legislation prohibits child pornography, but also other behaviours aimed at facilitating the commission of a sexual offence against a minor. It prohibits voyeurism and the non-consensual distribution of intimate images. Other offences of general application such as criminal harassment and human trafficking may also apply depending on the circumstances.

In closing, I will provide a few figures to illustrate the scope of this problem. Between 2014 and 2022, there were 15,630 incidents of police-reported online sexual offences against children and 45,816 incidents of online child pornography. The overall rate of police-reported online child sexual exploitation incidents has also risen since 2014. The rate of online child pornography increased 290% between 2014 and 2022. Girls were overrepresented as victims for all offence types over that nine-year period. The majority of victims of police-reported online sexual offences against children were girls, particularly girls between the ages of 12 and 17, who accounted for 71% of victims.

Incidents of non-consensual distribution of intimate images most often involved a youth victim and a youth accused. Nearly all child and youth victims, 97% to be exact, between 2015 and 2022 were aged 12 to 17 years, with a median age of 15 years for girls and 14 years for boys. Overall, nine in 10 accused persons, or 90%, were youth aged 12 to 17. For one-third of youth victims, or 33%, a casual acquaintance had shared the victim's intimate images with others.

Here is a quote from the Montreal Council of Women: “On behalf of the members of the Montreal Council of Women, I wish to confirm our profound concern for those whose lives have been turned upside down by the involuntary and/or non-consensual sharing of their images on websites and other platforms such as the Montreal-based Pornhub. The ‘stopping Internet sexual exploitation act’ will make much-needed amendments to the Criminal Code to protect children and those who have not given consent for their images and other content to be shared and commercialized.”

We must act. It is a question of safety for our women and girls. Young women and girls are depending on it.