An Act to enact the Online Harms Act, to amend the Criminal Code, the Canadian Human Rights Act and An Act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service and to make consequential and related amendments to other Acts

Sponsor

Arif Virani  Liberal

Status

Second reading (House), as of Sept. 23, 2024


Summary

This is from the published bill. The Library of Parliament has also written a full legislative summary of the bill.

Part 1 of this enactment enacts the Online Harms Act, whose purpose is to, among other things, promote the online safety of persons in Canada, reduce harms caused to persons in Canada as a result of harmful content online and ensure that the operators of social media services in respect of which that Act applies are transparent and accountable with respect to their duties under that Act.
That Act, among other things,
(a) establishes the Digital Safety Commission of Canada, whose mandate is to administer and enforce that Act, ensure that operators of social media services in respect of which that Act applies are transparent and accountable with respect to their duties under that Act and contribute to the development of standards with respect to online safety;
(b) creates the position of Digital Safety Ombudsperson of Canada, whose mandate is to provide support to users of social media services in respect of which that Act applies and advocate for the public interest in relation to online safety;
(c) establishes the Digital Safety Office of Canada, whose mandate is to support the Digital Safety Commission of Canada and the Digital Safety Ombudsperson of Canada in the fulfillment of their mandates;
(d) imposes on the operators of social media services in respect of which that Act applies
(i) a duty to act responsibly in respect of the services that they operate, including by implementing measures that are adequate to mitigate the risk that users will be exposed to harmful content on the services and submitting digital safety plans to the Digital Safety Commission of Canada,
(ii) a duty to protect children in respect of the services that they operate by integrating into the services design features that are provided for by regulations,
(iii) a duty to make content that sexually victimizes a child or revictimizes a survivor and intimate content communicated without consent inaccessible to persons in Canada in certain circumstances, and
(iv) a duty to keep all records that are necessary to determine whether they are complying with their duties under that Act;
(e) authorizes the Digital Safety Commission of Canada to accredit certain persons that conduct research or engage in education, advocacy or awareness activities that are related to that Act for the purposes of enabling those persons to have access to inventories of electronic data and to electronic data of the operators of social media services in respect of which that Act applies;
(f) provides that persons in Canada may make a complaint to the Digital Safety Commission of Canada that content on a social media service in respect of which that Act applies is content that sexually victimizes a child or revictimizes a survivor or intimate content communicated without consent and authorizes the Commission to make orders requiring the operators of those services to make that content inaccessible to persons in Canada;
(g) authorizes the Governor in Council to make regulations respecting the payment of charges by the operators of social media services in respect of which that Act applies, for the purpose of recovering costs incurred in relation to that Act.
Part 1 also makes consequential amendments to other Acts.
Part 2 amends the Criminal Code to, among other things,
(a) create a hate crime offence of committing an offence under that Act or any other Act of Parliament that is motivated by hatred based on certain factors;
(b) create a recognizance to keep the peace relating to hate propaganda and hate crime offences;
(c) define “hatred” for the purposes of the new offence and the hate propaganda offences; and
(d) increase the maximum sentences for the hate propaganda offences.
It also makes related amendments to other Acts.
Part 3 amends the Canadian Human Rights Act to provide that it is a discriminatory practice to communicate or cause to be communicated hate speech by means of the Internet or any other means of telecommunication in a context in which the hate speech is likely to foment detestation or vilification of an individual or group of individuals on the basis of a prohibited ground of discrimination. It authorizes the Canadian Human Rights Commission to deal with complaints alleging that discriminatory practice and authorizes the Canadian Human Rights Tribunal to inquire into such complaints and order remedies.
Part 4 amends An Act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service to, among other things,
(a) clarify the types of Internet services covered by that Act;
(b) simplify the mandatory notification process set out in section 3 by providing that all notifications be sent to a law enforcement body designated in the regulations;
(c) require that transmission data be provided with the mandatory notice in cases where the content is manifestly child pornography;
(d) extend the period of preservation of data related to an offence;
(e) extend the limitation period for the prosecution of an offence under that Act; and
(f) add certain regulation-making powers.
Part 5 contains a coordinating amendment.


Rachael Thomas Conservative Lethbridge, AB

Thank you.

Bill C-63 does something very interesting. One might expect it to update the Criminal Code to include deepfakes, but it doesn't do that at all. Deepfakes aren't mentioned in the bill, and folks are not protected from them—not in any criminal way. I believe that's an issue.

Secondly, platforms will be subject to perhaps an assessment and a ruling by an extrajudicial, bureaucratic arm that will be created, but again, there's nothing criminal attached to platforms allowing for the perpetuation of things like deepfakes or other non-consensual images.

Does that not concern you?

Keita Szemok-Uto Lawyer, As an Individual

Madam Chair, committee members, thank you for the opportunity to speak before you this afternoon.

By way of brief background, my name is Keita Szemok-Uto. I'm from Vancouver, and I was called to the bar just last month. I've been practising primarily family law, with a mix of privacy and workplace law. I attended law school at Dalhousie in Halifax, and while there I took a privacy law course. I chose to write my term paper on deepfake videos, which we've been discussing today. I was interested in how a person could create a deepfake video, specifically a sexual or pornographic one, and how that could violate a person's privacy rights. In writing that paper, I discovered the clear gendered dynamic to the creation and dissemination of these kinds of deepfake videos.

As a case in point, around January of this year somebody online made and publicly distributed sexually explicit AI deepfake images of Taylor Swift. They were quickly shared on Twitter and viewed repeatedly—I think one photo was seen as many as 50 million times. In an Associated Press article, a professor at George Washington University in the United States referred to women as “canaries in the coal mine” when it comes to the abuse of artificial intelligence. She was quoted as saying, “It's not just going to be the 14-year-old girl or Taylor Swift. It's going to be politicians. It's going to be world leaders. It's going to be elections.”

Even before this, in April 2022, it was striking to see the capacity for essentially anybody to take photos from somebody's social media, turn them into deepfakes and distribute them widely without, really, any regulation. Again, while the targets of these deepfakes can be celebrities or world leaders, oftentimes they are people without the finances or protections of a well-known celebrity. Worst of all, in writing this paper I discovered that there is really no adequate system of law yet that protects victims from this kind of privacy invasion. That is something that is only now being addressed, somewhat, by the online harms bill.

I did look at the Criminal Code, section 162.1, which prohibits the publication, distribution or sale of an intimate image, but the definition of “intimate image” in that section is a video or photo in which a person is nude and in which the person had a reasonable expectation of privacy when it was made or when the offence was committed. I think the “reasonable expectation of privacy” element will come up a lot in legal conversations about deepfakes. When you take somebody's social media photo, which was taken and posted publicly, it's questionable whether they had a reasonable expectation of privacy.

In the paper, I looked at a variety of torts. I thought that if the criminal law can't protect victims, perhaps there is a private cause of action through which victims can sue and perhaps recover damages. I looked at public disclosure of private facts, intrusion upon seclusion and other torts as well, and I didn't find that anything really satisfied the circumstances of a pornographic deepfake scenario—again, with the focus on reasonable expectation of privacy not really fitting the bill.

As I understand it, there have been recent legislative proposals, as well as legislation that has come into force. In British Columbia, there is the Intimate Images Protection Act, from March 2023. The definition of “intimate image” in that act means a visual recording or visual simultaneous representation of an individual, whether or not they're identifiable and whether or not the image has been altered in any way, in which they're engaging in a sexual act.

That act thus broadens the definition of “intimate image” to cover not just an image of someone who was engaged in a sexual act when the photo was taken, but also an image altered to make that representation. Its drawback is that, while it does provide a private right of action, damages are limited to $5,000, which seems negligible in the grand scheme of things.

I suppose we'll talk more about Bill C-63 in this discussion. I do think it goes in the right direction in some regards: it puts a duty on operators to police and regulate what kind of material is online, and it expands the definitions of the kinds of material that should be taken down.

That act, once passed, will require the operator to take down material that sexually victimizes a child or revictimizes a survivor—

Monique St. Germain General Counsel, Canadian Centre for Child Protection Inc.

Thank you for the opportunity today.

My name is Monique St. Germain, and I am general counsel for the Canadian Centre for Child Protection, which is a national charity with the goal of reducing the incidence of missing and sexually exploited children.

We operate cybertip.ca, Canada's national tip line for reporting the online sexual exploitation of children. Cybertip.ca receives and analyzes tips from the public and refers relevant information to police and child welfare as needed. Cybertip averages over 2,500 reports a month. Since inception, over 400,000 reports have been processed.

When cybertip.ca launched in 2002, the Internet was pretty basic, and the rise of social media was still to come. Over the years, technology has rapidly evolved without guardrails and without meaningful government intervention. The faulty construction of the Internet has enabled online predators to not only meet and abuse children online but to do so under the cover of anonymity. It has also enabled the proliferation of child sexual abuse material, CSAM, at a rate not seen before. Victims are caught in an endless cycle of abuse.

Things are getting worse. We have communities of offenders operating openly on the Tor network, also known as the dark web. They share tips and tricks about how to abuse children and how to avoid getting caught. They share deeply personal information about victims. CSAM is openly shared, not only in the dark recesses of the Internet but on websites, file-sharing platforms, forums and chats accessible to anyone with an Internet connection.

Countries have prioritized the arrest and conviction of individual offenders. While that absolutely has to happen, we've not tackled a crucial player: the companies whose products facilitate and amplify the harm. For example, Canada has only one known conviction and sentencing of a company for making CSAM available on the Internet. That prosecution took eight years and cost thousands of dollars. Criminal law cannot be the only tool; the problem is just too big.

Recognizing how rapidly CSAM was proliferating on the Internet, we launched Project Arachnid in 2017. This innovative tool detects where CSAM is being made publicly available on the Internet and then sends a notice requesting its removal. Operating at scale, it issues roughly 10,000 removal requests each day, and on some days over 20,000. To date, over 40 million notices have been issued to over 1,000 service providers.

Through operating Project Arachnid, we've learned a lot about CSAM distribution, and, through cybertip.ca, we know how children are being targeted, victimized and sextorted on the platforms they use every day. The scale of harm is enormous.

Over the years, the CSAM circulating online has become increasingly disturbing, including elements of sadism, bondage, torture and bestiality. Victims are getting younger, and the abuse is more graphic. CSAM of adolescents is ending up on pornography sites, where it is difficult to remove unless the child comes forward and proves their age. The barriers to removal are endless, yet the upload of this material can happen in a flash, and children are paying the price.

It's no surprise that sexually explicit content harms children. For years, our laws in the off-line world protected them, but we abandoned that with the Internet. We know that everyone is harmed when exposed to CSAM. It can normalize harmful sexual acts, lead to distorted beliefs about the sexual availability of children and increase aggressive behaviour. CSAM fuels fantasies and can result in harm to other children.

In our review of Canadian case law regarding the production of CSAM in this country, 61% of offenders who produced CSAM also collected it.

CSAM is also used to groom children. Nearly half of the respondents to our survey of survivors of CSAM identified this tactic. Children are unknowingly being recorded by predators during online interactions, and many are sextorted afterwards. More sexual violence is occurring among children, and more children are mimicking adult predatory behaviour, bringing them into the criminal justice system.

CSAM is a record of a crime against a child, and its continued availability is ruining lives. Survivors tell us time and time again that the endless trading in their CSAM is a barrier to moving forward. They are living in constant fear of recognition and harassment. This is not right.

The burden of managing Internet harms has fallen largely to parents. This is unrealistic and unfair. We are thrilled to see legislative proposals like Bill C-63 to finally hold industry to account.

Prioritizing the removal of CSAM and intimate imagery is critical to protecting citizens. We welcome measures to mandate safety by design, and tools like age verification or age assurance technology to keep pornography away from children. We would also like to see increased use of tools like Project Arachnid to enhance removal and prevent the re-uploading of CSAM. Also, as others have said, public education is critical. We need all the tools in the tool box.

Thank you.

June 13th, 2024 / 3:55 p.m.



Associate Professor, University of British Columbia, As an Individual

Dr. Heidi Tworek

Bill C-63 is a step in the right direction to address a problem that, tragically, is swiftly worsening.

I'm looking forward to your questions.

Dr. Heidi Tworek Associate Professor, University of British Columbia, As an Individual

Thank you, Madam Chair, for the opportunity to appear virtually before you to discuss this important topic.

I'm a professor and Canada Research Chair at the University of British Columbia in Vancouver. I direct the Centre for the Study of Democratic Institutions, where we research platforms and media. Two years ago, I served as a member of the expert advisory group to the heritage ministry on online safety.

Today, I will focus on three aspects of harms related to illegal sexually explicit material online, before discussing briefly how Bill C-63 may address some of these harms.

First, the issue of illegal sexually explicit material online overlaps significantly with the broader question of online harm and harassment, which disproportionately affects women. A 2021 survey found that female journalists in Canada were nearly twice as likely to receive sexualized messages or images, and six times as likely to receive online threats of rape or sexual assault. Queer, racialized, Jewish, Muslim and Indigenous female journalists received the most harassment.

Alongside provoking mental health issues and fears for physical safety, this harassment leads many either to look to leave their roles or to turn down more public-facing positions. Others have been discouraged from pursuing journalism at all. My work over the last five years on other professional groups, including political candidates and health communicators, suggests very similar dynamics. This online harassment creates a chilling effect for society as a whole when professionals do not represent the diversity of Canadian society.

Second, generative AI is accelerating the problem of illegal sexually explicit material. Take the example of deepfakes: artificially generated images or videos that swap a person's face onto somebody else's naked body to depict acts that neither person committed. Recent high-profile targets include Taylor Swift and U.S. Congresswoman Alexandria Ocasio-Cortez. These are not isolated examples. As journalist Sam Cole has put it, “sexually explicit deepfakes meant to harass, blackmail, threaten, or simply disregard women's consent have always been the primary use of the technology”.

Although deepfakes have existed for a few years, generative AI has significantly lowered the barrier to entry. The number of deepfake videos increased by 550% from 2019 to 2023. Such videos are easy to create: about one-third of deepfake tools enable a user to create pornography, which comprises over 95% of all deepfake videos. One last statistic: 99% of those featured in deepfake pornography are female.

Third, while illegal sexually explicit material is mostly easy to define on its face, we should be wary of online platforms offering solely automated solutions. For example, what if a lactation consultant is providing online guidance about breastfeeding? Wholly automated content moderation systems might delete such material, particularly if trained simply to search for certain body parts, like nipples. Given that provincial human rights legislation protects breastfeeding in much of Canada, deletion of this type of content would actually raise questions about freedom of expression. If parents have the right to breastfeed in public in real life, why not the right to discuss it online? What this example suggests is that human content moderators remain necessary. They must be trained to understand Canadian law and cultural context, and they must receive support for the very difficult work they do.

Finally, let me explain how Bill C-63 might address some of these issues.

There are very legitimate questions about Bill C-63's proposed amendments to the Criminal Code and Canadian Human Rights Act, but as regards today's topic, I'll focus briefly on the online harms portion of the bill.

Bill C-63 draws inspiration from excellent legislation in the European Union, the United Kingdom and Australia. This makes Canada a fourth or fifth mover, if not increasingly an outlier for not yet regulating online safety.

However, Bill C-63 proposes three types of duties for platforms. The first two are a duty to protect children and a duty to act responsibly in mitigating the risks of seven types of harmful content. The third, the most stringent and the most relevant for today, is a duty to make two types of content inaccessible: content that sexually victimizes a child and intimate content communicated without consent, including deepfakes. This should theoretically protect the owners of both the face and the body used in a deepfake. A newly created digital safety commission would have the power to require removal of this content within 24 hours, as well as to impose fines and other measures for non-compliance.

Bill C-63 also foresees the creation of a digital safety ombudsperson to provide a forum for stakeholders and to hear user complaints if platforms are not upholding their legal duties. This ombudsperson might also enable users to complain about takedowns of legitimate content.

Now, Bill C-63 will certainly not resolve all issues around illegal sexually explicit material, for example, how to deal with copies of material stored on servers outside Canada—

Anna Gainey Liberal Notre-Dame-de-Grâce—Westmount, QC

Thank you very much.

Thank you, as well, to all the witnesses for being here.

Mr. Vachon, I also have a question for you.

Ideally, images of the sexual abuse of children would never be published online, of course. Bill C-63 includes provisions requiring removal of material within 24 hours.

I'd like to know what you think of that tool proposed in the act. Further, are there other tools that could improve this bill or that we should consider?

Kevin Waugh Conservative Saskatoon—Grasswood, SK

Thank you, Ms. Thomas.

I sat and listened to the debate on Bill C-63 on Friday. There was, I think, a high school class watching from the gallery. It was kind of interesting, because as Bill C-63 was debated—and I give the teacher a lot of credit—the government had its statement and the opposition had theirs, and at issue was a trade-off between a guarantee of security and the Charter of Rights. We have seen that in many of these bills.

Ms. Selby, what would your recommendation be to those high school students? Many of them are just coming into the adult world. What would your recommendation be on the Charter of Rights and their security around sexual exploitation?

June 11th, 2024 / 6:30 p.m.



Founder and Mother, Amanda Todd Legacy Society

Carol Todd

Thank you for your kind words.

I'm going to be frank. Amanda died in 2012. We are now in 2024. We're almost at 12 years. I've stood up, I've used my voice and I've been an advocate. I've watched what happened in her life and I've talked to many people and organizations around the world. What you do as politicians and legislators is wonderful, but you put up so many roadblocks.

I'm going to be frank, and I'm not saying this to anyone specifically; I'm saying this generally.

So many roadblocks get put up by one political party versus another. I have sat on six standing committees since 2012: on technology-facilitated violence, on gender-based violence, on exploitation of children and young people, on others dealing with intimate images, and now this one.

I could copy and paste facts that I talk about: more funding, more legislation, more education, more awareness. Standing committees then come out with a report. We see those reports, but we never know what happens at the end: Do these things really happen? Is there more funding in law enforcement for training officers and for their knowledge? Are there changes in legislation?

Right now we are looking at Bill C-63. I read the news and I look at the points of view. I have someone from the justice minister's office contacting me regularly, because I understand that second reading of Bill C-63 came up last Friday.

Then you go back to the comments, and all it amounts to is infighting and arguing. Will this bill be passed? Other parties say no, it shouldn't be passed.

We are harming Canadians, our children and our citizens when things don't get passed. If you look and do your research, you see that other countries have passed legislation similar to Bill C-63. Australia is already on its third or fourth revision of what it passed years ago. I was in Australia last year, and I met the eSafety Commissioner. I met law enforcement. I was a keynote speaker at one of their major exploitation conferences. I felt sad because Canada was represented by just two officers from Ontario. Canada was so far behind.

We are a first world country, and Canadians deserve to be protected. We need to make sure that everyone works on the legislation and on the details. It's not just about passing laws: there are different silos. There's education. There are the kids. There's the community. We all need to get involved. It's not about putting someone in jail because of.... It's about finding solutions that work. As a country, we are not finding those solutions right now. We aren't going to catch every predator in the world. Globally today, 750,000 predators are online looking for our children.

In my case, Amanda's predator came from the Netherlands. It's not just about one country, because the Internet is invisible fibres. We know that Nigeria has exploitation—

Martin Champoux Bloc Drummond, QC

So there is a lot of awareness raising and education that we need to do as parents, but also as a society and as legislators.

Since we're talking about legislation, I can't help but mention Bill C-63, which was recently introduced and which I hope will be considered as quickly as possible.

Have you had a chance to look at it? If so, what are your impressions of this bill, which may be intended to help you do your job?

Patricia Lattanzio Liberal Saint-Léonard—Saint-Michel, QC

My next question is addressed to Ms. Laidlaw.

Bill C-63 was developed to ensure compliance with all existing privacy laws and global best practices. Do you have any concerns related to the privacy implications of Bill S-210? Also, how do we ensure privacy is upheld in the development of online safety regulations?

Rachael Thomas Conservative Lethbridge, AB

Thank you.

Thank you to all of the witnesses for your patience today.

My first question goes to Ms. Lalonde.

In an article that you recently wrote with regard to Bill C-63, you said it “contains...glaring gaps that risk leaving women and girls in Canada unfairly exposed.”

I'm wondering if you can expand on those gaps that you see within the legislation that would perhaps leave women and children vulnerable.

Criminal Code / Private Members' Business

June 11th, 2024 / 6 p.m.




Michelle Rempel Conservative Calgary Nose Hill, AB

Madam Speaker, I rise to debate this bill today, and I would like to focus my comments on a specific aspect of coercive control, for which there remain very few easy-to-access and easy-to-deploy de-escalation tools for victims. It is my hope that parliamentarians in the other place will consider adding these components to this bill, particularly specific tools to help law enforcement officials stop coercive control from happening.

To set the context for this issue, I would like to refer to the Women's Legal Education & Access Fund, or LEAF, which developed a position paper on the criminalization of coercive control in response to this bill. The paper defines “coercive control” as follows:

Coercive control is a concept used to describe a pattern of abusive behaviors in intimate partner relationships, based on tactics of intimidation, subordination, and control. This can include, among others, behaviors such as isolation, stalking, threats, surveillance, psychological abuse, online harassment, and sexual violence.

Other sources discussed threats of extortion, including so-called revenge porn, as one of the abusive behaviours also used to exert coercive control.

In its paper, LEAF raises the concern that criminalizing coercive control may encounter significant challenges to legal success and may be “difficult to translate clearly into actionable criminal law.” One of its recommendations to at least partially address this issue reads as follows: “Federal, provincial and territorial governments should take a proactive approach in focusing on the prevention of intimate partner violence.”

I would like to focus on two actionable, concrete ways to prevent two specific behaviours or components of coercive control: online harassment and revenge porn. In nearly nine years in power, the Liberal government has not taken material action to address the growing threat and breadth of online harassment, particularly as it relates to coercive control. The government's recently introduced and widely criticized Bill C-63, which many experts say would force Canadians to make trade-offs between their Charter rights and their safety, does not adequately address the issue of women who are subjected to a pattern of abusive behaviour online. Even if it did, the minister admitted today in the Toronto Star that the bill's provisions, which rely on the creation of an onerous new three-headed bureaucracy, would take years to functionally come into force.

Canadian women do not have time to wait for the minister's foot-dragging. Online harassment has been an issue for years, and the government has not ensured that our laws have kept pace with this issue. For evidence of this, I encourage colleagues to read the Canadian Resource Centre for Victims of Crime's guide to cyberstalking, which admits as much, saying that, when victims seek to report incidents of cyberstalking, “individual officers may be unfamiliar with the crimes or technology in question and may be uncertain about how to proceed.”

Indeed, last month, an article was released headlined “RCMP boss calls for new politician anti-threats law”. It cited the need for more provisions to protect politicians from online harassment. I asked myself: if the RCMP cannot protect me, how is it going to protect anyone in my community from the same threat? We should all reflect upon this issue, because across Canada, at this very moment, women are receiving repeated, unwanted, harassing digital communications, and the best that many victim services groups can do to help, because of government inaction, is offer advice on how they can attempt to be less of a victim.

Women should not have to alter their behaviour. Potential harassers should be held to account, and their behaviour should be de-escalated before it escalates into physical violence. To do this, I encourage parliamentarians in the other place to consider the following in their review of this bill. They should ask the government to create a new criminal offence of online harassment that would update the existing crime of criminal harassment to address the ease and anonymity of online criminal harassment, which groups have noted, in deliberations on this bill, is a component of coercive control.

Specifically, this new provision would apply to those who repeatedly send threatening or sexually explicit messages or content to people across the Internet and social media when they know, or should know, that it is not welcome. This could include aggravating factors for repeatedly sending such material anonymously and be accompanied by a so-called digital restraining order, which would allow victims of online criminal harassment to apply to a judge to identify the harasser and end the harassment. This would give police and victims clear and easy-to-understand tools to prevent online harassment and also prevent the escalation of this abuse to physical violence.

It would also allow for national awareness and education campaigns to be developed on what happens when someone criminally harasses somebody online. This would address a major issue of intimate partner violence and make it easier to materially and concretely stop coercive control. Members of the governing Liberal Party agreed to the need for these measures in a recent meeting of PROC related to the online harassment of elected officials.

In addition, the government must do more to address so-called revenge porn as a component of coercive control. An academic article entitled “Image-Based Sexual Abuse as a Means of Coercive Control: Victim-Survivor Experiences” states:

Victim-support advocates and domestic violence sector workers have increasingly acknowledged the role that image-based sexual abuse plays in the perpetuation of intimate partner abuse.... Image-based sexual abuse refers to the non-consensual taking or sharing of nude or sexual images (photos or videos), including making threats to share intimate images.... In the context of an intimate relationship, image-based sexual abuse can include any of the following acts: taking or sharing nude or sexual images without consent; threats to share intimate images to coerce a partner into sharing more intimate images or engage them in an unwanted act; and/or recording and or disseminating of sexual assault imagery.

However, colleagues, this has become even more of a concern given the advent of deepfake intimate images. I have been raising this issue in the House for over a year, and the government has still not moved to update the definition of “intimate images” in Canada's Criminal Code to specifically include deepfake intimate images. This component is not in Bill C-63.

This inaction is already harming women. A Winnipeg high school student had deepfaked intimate images of her circulated; no charges were filed, likely because of the gap in our law. As it relates to coercive control, can members imagine how easy it would be for an abuser to create so-called revenge porn to use against their victim using online technology? The government must act now, but if it will not, we parliamentarians must. Therefore, I ask members of the other place to consider the following in their review of the bill.

They should consider updating Canada's existing laws on the non-consensual distribution of intimate images to ensure that the distribution of intimate deepfakes is also criminalized via a simple definition update in the Criminal Code. This could be done easily and likely with all-party support in this place. It is shameful that the government has not moved to do that to date. In addition, the government admitted today in the Toronto Star that it is committed to dogmatically sticking with Bill C-63 as its only way to address online harms. This is despite widespread criticism and despite admitting that even the few supportable provisions in the bill would not come into force for years. Therefore, we in the opposition must look for ways to address these issues outside the government, particularly since online harm is a growing component of coercive control.

In addition to what I have already suggested, as parliamentarians, we should address the broader issue of online harms by doing things such as precisely specifying the duty of care required by online platforms. This should be done through legislation and not backroom regulation. The duty of care could include mechanisms to provide parents with the safeguards, controls and transparency to prevent harm to their kids when they are online; mechanisms to prevent and mitigate self-harm, mental health disorders, addictive behaviour, bullying and harassment, sexual violence and exploitation, and the promotion and marketing of products or services that are unlawful for minors; and mechanisms to implement privacy-preserving and trustworthy age verification methods, which many platforms have already built, to restrict access to any content that is inappropriate for minors while prohibiting the use of a digital ID in any of these mechanisms.

As well, we require mechanisms to give adults a clear and easy-to-use way to opt out of any default parental controls that a duty of care might provide for. Then, through legislation, we should ensure the appropriate enforcement of such measures through a system of administrative penalties and consequences by government agencies and bodies that already exist. In addition, the enforcement mechanisms could provide for the allowance of civil action when duties of care are violated in an injurious way.

To address coercive control, we need to address online harassment. I hope that colleagues in the other place will consider the suggestions I have made to do just that.

June 11th, 2024 / 5:20 p.m.



Founder and Mother, Amanda Todd Legacy Society

Carol Todd

Thank you.

The prevalence of sexually explicit material has increased due to the widespread use of the Internet. It manifests in various forms, including visual representations, photos, videos, films, written content, audio recordings and print material. The volume grows exponentially day by day. The protection our children and adults need just isn't there on the Internet. Big tech companies need to take responsibility. Throughout the world now, there are more and more lawsuits in which big tech companies are being held responsible.

Some of the challenges we face with sexually explicit material include access to violent and explicit content that can shape sexual attitudes and behaviours; harm to children through the creation, sharing and viewing of sexual abuse material; and increased violence against women and girls, as well as sex trafficking. It can also influence men's views on women and relationships.

In my notes, I comment that we often stereotype that it is men who are violating others, but offenders can be men and they can be women. They can also be other children—peer-on-peer violence. There is no one set rule on who is creating and causing the harm, but we know that those who become traumatized and victimized can be anyone.

What more needs to be done? I'll just go through this quickly.

As an educator, I feel strongly that increasing education is crucial. The awareness and education need to reach our children, our young adults and our families.

We need stronger regulations and laws. Bill C-63 is one of them. I know that in the province of B.C., more legislation has been passed and is in force.

We need to improve our online platforms and make them accountable. We need to increase parental controls and monitoring, and we need to encourage reporting.

We also need to promote positive online behaviours. Social-emotional learning and social responsibility are part of the awareness and education that need to follow.

We need to be a voice. We need to stand up, and we also need to do more.

Thank you for the time, and I encourage questions so that I can finish reading my notes.

Thank you.

Carol Todd Founder and Mother, Amanda Todd Legacy Society

I'd like to thank the committee for inviting me to speak. It's an honour to be able to share knowledge.

I'm not coming as a researcher or someone who has studied this. I'm coming as a mom, and I'm coming as a parent and as an educator with lived experience, so confining my conversation to five minutes was difficult. I've written some notes that I will read until my time is up, and I do welcome questions at the end.

I have spent the last 12 years learning about sexual exploitation and online behaviours, and it is really hard to imagine the horrid things that are happening out there to our children. As a side note, I believe that Bill C-63 needs to be passed, with some tweaks, because it is the safety net for our children and for Canadians online.

This subject holds significant importance and warrants ongoing dialogue to tackle not just the ease of access to such material but also the profound harm that can be inflicted upon those who encounter sexually explicit content every day.

I am Carol Todd, widely known as Amanda Todd's mother. In addition, I am an educator in a British Columbia school district with my work primarily centred on digital literacy, online safety and child abuse prevention with a focus on exploitation and sextortion.

Empowering students, teachers and families with the knowledge and skills to navigate the digital world safely is essential, important and now a passion of mine. I will continue to talk forever about how we can keep families and children safe, because this is what we needed for my daughter, and it came a bit too late.

Amanda tragically took her life on October 10, 2012, following extensive online exploitation, tormenting harassment and cyber-abuse. Her story very much relates to what happens when there is creation, possession and distribution of sexually explicit material online and how easily others can access it as it becomes embedded online forever.

Amanda's story garnered global attention after her tragic death. To reclaim her voice while she was alive, Amanda created a video that she shared on YouTube five weeks before her passing. It has been viewed 50 million times worldwide and is now used as a learning tool for others to start the discussion and for students to learn more about what happened to her and why it's so important that we continue to talk about online safety, exploitation and sextortion.

As another side note, it has taken forever for us to catch up on the conversation about exploitation and sextortion. It was something that no one was able to talk about 12 years ago, in 2012. The conversation has evolved because of the increase in exploitation and sextortion online, happening not only to young girls, young boys and young adults but also to men and women. Nefarious offenders online have grown in number, because they have gotten away with it at so many levels of the Internet, and they have caused much trauma and much harm, as this is a form of abuse and violence.

Over the past decade, we've observed rapid changes in the technology landscape. Technology used to be primarily a communication tool, for things like email; since then we have seen the evolution of applications for fun. They were described as safe, but now we know differently, because they have increased the chaos, concern and undesirable behaviours online for Canadians and for everyone.

This isn't just a Canadian problem. It's a global problem, and I have watched other countries create legislation, laws and safety commissions, just as Canada, with Bill C-63, now proposes a digital safety commission, which I think is a brilliant idea. For anyone here who gets to vote, I hope that it does pass.

The prevalence of sexually explicit material has markedly increased—

Dr. Emily Laidlaw Associate Professor and Canada Research Chair in Cybersecurity Law, University of Calgary, As an Individual

Thank you for inviting me.

With my time, I'm going to focus on social media regulation and on Bills C-63 and S-210.

Social media has historically been lightly regulated. Online safety has been addressed only when companies felt like it or were pressured by the market. There have been some innovative solutions, and we need companies to continue to innovate, but safety has generally taken a back seat to other interests.

Social media companies have also privately set the rules for freedom of expression, privacy and children's rights. There are no minimum standards and no ways to hold companies accountable. That is changing globally: many jurisdictions have passed online harms legislation. The Online Harms Act, which is part of Bill C-63, aligns with these global approaches. In my view, with tweaks, Bill C-63 is the number one avenue to address illegal sexually explicit content and sexual exploitation.

Bill S-210 would mandate age verification to access sites with sexually explicit material. It is a flawed bill, yes, but more importantly, it is unnecessary for two reasons.

First, age verification is the crucial next frontier of online safety, but it is about more than sexually explicit material; it is about child safety broadly. The technology is evolving, and if we are committed to freedom of expression, privacy and cybersecurity, how this technology is used must be scrutinized closely.

Second, age verification is only one tool in the tool box. A holistic approach is needed, whereby safety is considered in product design, content moderation systems and algorithms. Let me give you a few examples of safety by design that do not involve age verification.

Child luring and sextortion rates are rising. What steps could social media take? Flag unusual friend requests from strangers and people in distant locations. Remove network expansion prompts whereby friends are recommended based on location and interest. Provide easy-to-use complaints mechanisms. Provide user empowerment tools, like blocking accounts.

The non-consensual disclosure of intimate images and child sexual abuse material requires immediate action. Does the social media service offer quick takedown mechanisms? Does it follow through with them? Does it flag synthetic media like deepfakes? How usable are the complaints mechanisms?

For example, Discord has been used to livestream child sexual exploitation content. The Australian eSafety Commissioner reported that Discord does not enable in-service reporting of livestreamed abuse. This is an easy fix.

The last example is that the Canadian Centre for Child Protection offers a tool to industry, called Project Arachnid, to proactively detect child sexual abuse material. Should social media companies be using this to detect and remove content?

In my view, Bill C-63, again with tweaks, is the best avenue to address sexual exploitation generally. I think the focus should be on how to improve that bill. There are many reasons for that. I'll give two here.

First, the bill imposes three different types of responsibility, as Vivek discussed. Notably, the strongest obligation is the commissioner's power to order the removal of child sexual abuse content and intimate images disclosed without consent. This recognizes the need for the swift removal of the worst kinds of content.

Second, all of this would be overseen by a digital safety commission, an ombudsperson and an office. Courts are never going to be fast at resolving the kinds of disputes at issue here, and they're costly. The commissioner's power to order the removal of the worst forms of content is crucial to providing access to justice.

Courts are just ill-suited to oversee safety by design as well, which is necessarily an iterative process between the commission and companies. The tech evolves, and so do the harm and the solutions.

With my remaining time, I want to flag one challenge before I close, which Vivek mentioned as well. That is private messaging. Bill C-63 does not tackle private messaging. This is a logical decision; otherwise, it opens a can of worms.

Many of the harms explored here happen in private messaging. The key is not to undermine privacy and cybersecurity protections. One way to bring private messaging into the bill without undermining those protections is to impose safety obligations on the things that surround private messaging. I've mentioned many, such as complaints mechanisms, flagging suspicious friend requests and so on.

Thank you for your time. I welcome questions.