Evidence of meeting #127 for Justice and Human Rights in the 44th Parliament, 1st Session. (The original version is on Parliament's site, as are the minutes.)

A recording is available from Parliament.

Witnesses

Frances Haugen  Advocate, Social Platforms Transparency and Accountability, As an Individual
Marni Panas  Canadian Certified Inclusion Professional, As an Individual
Jocelyn Monsma Selby  Chair, Clinical Therapist and Forensic Evaluator, Connecting to Protect
Andrew Clement  Professor Emeritus, Faculty of Information, University of Toronto, As an Individual
Guillaume Rousseau  Full Professor and Director, Graduate Applied State Law and Policy Programs, Université de Sherbrooke, As an Individual
Joanna Baron  Executive Director, Canadian Constitution Foundation

11:55 a.m.

Advocate, Social Platforms Transparency and Accountability, As an Individual

Frances Haugen

I think people should have the right to encrypted, secure private messages, but that doesn't mean platforms get carte blanche to do whatever they want in how they design these services or how people behave once they're on them.

I'll give you an example. You want to exclude things that say you must take down individual pieces of content from encrypted messaging, because that requires breaking encryption. But if you say, “You need to articulate what you believe the risks are of how your product is designed today and have a plan to address them”, that leads to things like what Instagram did maybe two months ago. They said, “We're going to make all under-16 accounts private, because we found that adults were contacting these children.”

That's an example of a behaviour and design intervention, not a content intervention, involving private messaging.

Rhéal Fortin Bloc Rivière-du-Nord, QC

Thank you, Ms. Haugen.

I have a few seconds left, Dr. Selby. Quickly, what do you think?

11:55 a.m.

Chair, Clinical Therapist and Forensic Evaluator, Connecting to Protect

Dr. Jocelyn Monsma Selby

I think all platforms need to have, as Ms. Haugen so articulately said, a duty to articulate what they're going to do as far as their “safety by design” is concerned. I think that's the term we need to use here.

Rhéal Fortin Bloc Rivière-du-Nord, QC

Thank you to all the witnesses.

The Chair Liberal Lena Metlege Diab

Thank you.

Mr. MacGregor, go ahead, please.

Alistair MacGregor NDP Cowichan—Malahat—Langford, BC

Thank you, Madam Chair.

Ms. Haugen and Ms. Selby, I'd actually like to continue on that same subject. One of our previous witnesses, the Canadian Centre for Child Protection, is expressly calling for private messaging services and certain aspects of private messaging features to be subject to regulation.

It's hard. To give a personal example, I have 12-year-old twins. We have them on Messenger Kids. We started them off with iPads. We're not prepared to go to the cellphone yet. I'm sure I'm going through what a lot of parents are going through. This is the new frontier. When they get their own cellphones, how can I be sure that those messaging services will be protecting them?

Ms. Haugen, you cited Instagram, but are social media companies doing enough? Do we need to take this regulatory approach?

I'd just like to hear both of you—Ms. Haugen first, and then Ms. Selby—offer a little bit of context.

Noon

Advocate, Social Platforms Transparency and Accountability, As an Individual

Frances Haugen

The only reason Instagram took those actions—they knew they could have taken those actions a decade ago—was that they were afraid of laws like this one. They were afraid of Australia banning access to social media for under-16s. They were afraid of the lawsuits that are happening in the United States.

You have to put them in situations where they are afraid of consequences, because of the amount of money to be made from cutting corners and from maximizing the number of connections, no matter the risk for these kids and no matter how addictive this is, for advertising dollars. They have to face consequences if you want them to behave well.

To the second question, on how we keep encrypted messaging secure, we need to think a little more expansively. For example, if I am a child on an encrypted messenger and an adult sends me a lewd image—I did not ask for it and I do not want it—I should have the ability to report that adult. No encrypted messaging has been violated by me reporting that adult. Platforms should have an obligation to take people off their platforms who contact children in that way.

Alistair MacGregor NDP Cowichan—Malahat—Langford, BC

Thank you.

Noon

The Chair Liberal Lena Metlege Diab

You have 30 seconds.

Alistair MacGregor NDP Cowichan—Malahat—Langford, BC

I just want to give Ms. Selby the final 30 seconds to comment.

Noon

Chair, Clinical Therapist and Forensic Evaluator, Connecting to Protect

Dr. Jocelyn Monsma Selby

I agree 100% with Ms. Haugen—absolutely—but I think we need to go bigger. The umbrella needs to extend, as I mentioned in my discussion, to say that all Internet sites need regulation, because this duty to take responsibility early on has not been fulfilled. As Frances says, it only happened because they were worried about regulation like this coming forward.

Noon

The Chair Liberal Lena Metlege Diab

Thank you.

Thank you very much to all of the witnesses for appearing today, in person and on the screen. I will simply add that if there's anything else you'd like to send to us—anything you wanted to say but didn't get an opportunity to say today—please do so through the clerk in writing.

With those words, I'll suspend for two minutes while we get the next panellists ready.

Thank you so much.

The Chair Liberal Lena Metlege Diab

We will now resume for our second panel.

Appearing as an individual, we have Andrew Clement, professor emeritus, Faculty of Information, University of Toronto, by video conference.

I hope that everybody is able to understand both languages and that you've selected the language of your choice at the bottom of your screen.

We also have Guillaume Rousseau, full professor and director of Applied State Law and Policy Programs at the Université de Sherbrooke. He is participating in the meeting by videoconference.

From the Canadian Constitution Foundation, we have Joanna Baron, executive director. She is here in person.

Please wait until I recognize you by name before speaking.

Each panellist will be allowed up to five minutes for opening remarks.

Mr. Clement, please commence with your opening remarks. You have up to five minutes.

Andrew Clement Professor Emeritus, Faculty of Information, University of Toronto, As an Individual

Thank you, Madam Chair and committee members, for the opportunity to contribute to your important prestudy of Bill C-63, the online harms act.

I'm Andrew Clement, a professor emeritus in the faculty of information at the University of Toronto, speaking on my own behalf. I'm a computer scientist by training and have long studied the social and policy implications of computerization. I'm also a grandfather of two young girls, so I bring both a professional and a personal interest to the complex issues you're having to grapple with.

I will confine my remarks to redressing a glaring absence in part 1 of the bill—a bill I generally support—which is the need for algorithmic transparency. Several witnesses have made a point about this. The work of Frances Haugen is particularly important in this respect.

Social media operators, broadly defined, provide their users with access to large quantities of various kinds of content, but they're not simply passive purveyors of information. They actively curate this content, making some content inaccessible while amplifying other content, based primarily on calculations of what users are most likely to respond to by clicking, liking, sharing, commenting on, etc.

An overriding priority for operators is to keep people on their site and exposed to revenue-producing advertising. In the blink of an eye, they select the specific content to display to an individual following precise instructions, based on a combination of the individual's characteristics—for example, demographics, behaviour and social network—and features of the content, such as keywords, income potential and assigned labels. This is referred to as an “algorithmic content curation practice”, or “algorithmic practice” for short.

These algorithmic practices determine what appears most prominently in the tiny display space of personal devices and thereby guide users through the vast array of content possibilities. In conjunction with carefully designed interactive features, such curation practices have become so compelling, or even addictive, that they hold the attention of U.S. teens, among others, for nearly five hours a day. Disturbingly, their time spent on social media is strongly correlated with adverse mental health outcomes and with a rapid rise in suicide rates starting around 2012. We've heard vivid testimony about this from your other witnesses. Leading operators are aware of the adverse effects of their practices but resist reform, because it undermines their business models.

While we need multiple approaches to promote safety online, a much better understanding of algorithmic curation practices is surely one of the most important.

Canadians have begun calling for operators to be more transparent about their curation practices. The Citizens' Assembly on Democratic Expression recommended that digital service providers “be required to disclose...the...inner workings of their algorithms”. Respondents to the online consultation regarding this proposed online harms legislation noted “the importance of...algorithmic transparency when setting out a regulatory regime.” Your sister standing committee, the Standing Committee on Public Safety and National Security, has made a similar recommendation: “That the Government of Canada work with platforms to encourage algorithmic transparency...for better content moderation decisions.”

Internationally, the U.S., the EU and others have developed or are developing regulatory regimes that address online platforms' algorithmic practices. Most large social media services or online operators in Canada also operate in the EU, where they are already subject to algorithmic transparency requirements found in several laws, including the Digital Services Act. It requires that “online platforms...consistently ensure that recipients of their service are appropriately informed about how recommender systems impact the way information is displayed, and can influence how information is presented to them.”

While Bill C-63 requires operators to provide detailed information about the harmful content accessible on the service, it is surprisingly silent on the algorithmic practices that are vital for determining the accessibility, the reach and the effects of such content. This lapse is easily remedied through amendments—first, by adding a definition of “algorithmic content curation practice”, and second, by adding requirements for the inclusion of algorithmic content curation practices in the digital safety plans in clause 62 and in the electronic data accessible to accredited persons in clauses 73 and 74. I will offer specific amendment wording in a written submission.

Thank you for your attention, and I welcome your questions.

The Chair Liberal Lena Metlege Diab

Thank you very much.

Mr. Rousseau, you now have the floor.

Guillaume Rousseau Full Professor and Director, Graduate Applied State Law and Policy Programs, Université de Sherbrooke, As an Individual

Good morning, everyone. Thank you for inviting me to speak to Bill C‑63.

I apologize for my appearance. I had surgery yesterday, which is why I'm wearing a bandage. Although I have a few scars on my head, my mind is working fine. I should be able to make this presentation and answer your questions.

As a constitutional lawyer, I mainly want to draw your attention to the issue of freedom of expression and, since I'm from Quebec, I also want to draw your attention to the fact that Bill C‑63 is very similar to Bill 59, which was studied in Quebec in 2015 and 2016.

For those who, like me, fought against Bill 59, it's a bit like Groundhog Day, since Bill C‑63 contains extremely similar elements, including the prohibition on hate speech. This reminds us that Quebec and federal jurisdictions are not always sufficiently exclusive and that there is a lot of overlap. I will stop my digression on Canadian federalism here, but I would like to point out in passing that I have just tabled a report with the Quebec advisory committee on constitutional issues within the Canadian federation. If you're interested in this issue, you should know that a report has just been submitted to the Government of Quebec.

Bill 59, which was studied in 2015 and 2016, banned hate speech, and it was considered very problematic in terms of freedom of expression. In the end, the government of the day decided to set aside part of the bill and not adopt the hate speech component of the bill in order to keep the other part of the bill, which was much more consensual and dealt in particular with the regulation of underage marriages. With respect to Bill C‑63, I hope we are preparing for a similar outcome.

I think the bill contains a lot of interesting things about sexual victimization and “revenge porn”. I believe the equivalent term in French is “pornodivulgation”. I think this whole area of protecting minors and protecting them from sexual victimization is very important. However, everything to do with hate seems much more problematic to me.

Sometimes, people talk about splitting the bill, saying that part 1 isn't a problem, and that parts 2 and 3 are more problematic. For my part, I draw your attention to the fact that, even in part 1, the definition of harmful content includes content that promotes hatred. Even in part 1, there's this mix between the issue of protecting minors from certain elements of pornography and the issue of hate. In my opinion, if we want to rework the bill properly, we must not only not adopt parts 2 and 3, but also eliminate hate from part 1.

The problem with everything to do with hate in the bill is that the definition is very vague and very broad. Hatred is defined in terms of detestation and vilification, but the definitions of detestation and vilification often include a reference to hate. It's all a bit circular. It's very vague and, for that reason, it's very difficult for litigants to know what their obligations are, to know what they can and cannot say.

I understand that this definition is inspired by the Supreme Court's Whatcott case, but there are two problems in this regard.

First, this definition was given in a human rights case, but here we want to use it as a model in criminal law. In terms of evidence, in particular, these two areas are very distinct. Second, I understand why we are taking our cues from the Supreme Court when it comes to definitions, because that means that the provision of the act is less likely to be struck down. I understand it on a technical level, but on the substance, a definition that isn't clear and isn't good isn't clear and isn't good, even if it comes from the Supreme Court.

I want to repeat this famous sentence: The Supreme Court is not final because it is infallible; it is infallible because it is final.

As legislators, you really have to ask yourself whether the definition is clear rather than just whether it is the Supreme Court's definition. Ultimately, if you absolutely want to have a definition inspired by the Supreme Court, I would recommend the definition in the Keegstra decision, which is more of a criminal law decision. It's a little clearer and a little less problematic than the Whatcott-inspired definition.

That said, if you go along with what I'm proposing and remove the hate component from the bill, it will raise the following question: If we create a bill that is more targeted on sexual victimization and the protection of minors, will we need a commission, an ombudsperson, an office and all the bureaucracy that is planned when the purpose of the act is more limited? We will therefore have to rethink the bill so that it is less bureaucratic.

Finally, I draw your attention to the fact that the bill should include the abolition of exemptions that allow hate speech in the name of religion. We were talking earlier about Bill C‑63 and Bill C‑412, but there's also Bill C‑367, which I invite you to study.

Thank you.

The Chair Liberal Lena Metlege Diab

Thank you.

Now we go to Ms. Baron, please.

Joanna Baron Executive Director, Canadian Constitution Foundation

Good afternoon. Thank you for the opportunity to present before this committee.

I represent the Canadian Constitution Foundation, a national legal charity that defends fundamental freedoms. We have participated in Whatcott, Fleming, Ward and other seminal Supreme Court of Canada decisions on freedom of expression. We view this bill, Bill C-63, as posing a grave threat to all Canadians' right to free speech and a flourishing democracy.

We welcome the minister's announcement that he intends to split the bill with regard to parts 1 and 4, but we remain concerned about the constitutionality of aspects of part 1, as well as parts 2 and 3 in their entirety.

First I'll address portions of the bill that expand sanctions for offences related to hate speech, including “harmful content” and “content that foments hatred”. I am referring to both the mandate of the new digital safety commissioner, created in part 1 of the bill, and the expanded penalties for hate crimes in part 2.

Part 1 of the bill imposes obligations on an operator to “implement measures that are adequate to mitigate the risk that users...will be exposed to harmful content”. This includes “content that foments hatred”. This office will cost around $200 million over five years and will be able to impose fines in the millions of dollars on platforms.

Part 2 of the bill, meanwhile, increases penalties for existing hate crimes, including promoting genocide, which would now be punishable by up to life imprisonment. It also creates a new stand-alone offence, in proposed section 320.1001, for any federal offence motivated by hatred, also punishable by up to life imprisonment.

As the previous witness mentioned, and I agree with many of his comments, hate speech is an inherently subjective concept. These expanded penalties and regulatory obligations pose a risk of gross disproportionality and excessive chill of protected expression. In Whatcott, the Supreme Court of Canada said that hatred encompasses only the most “extreme manifestations [captured] by the words 'detestation' and 'vilification'”. Only that type of speech can be penalized without violating the charter.

Bill C-63 adopts this language in proposed subsection 319(7): “hatred means the emotion that involves detestation or vilification”. But “detestation” is really just a synonym for “hate”, and vilification is a highly subjective concept. We are in a present moment of passionate and often fraught disagreement in our society, where a lot of claims are made that are understood differently depending on context.

For example, calling someone a Zionist currently may land as vilification or, more dubiously, promotion of genocide, or as praise, depending on the speaker and the audience. Just a few days ago, a former CBC producer, Shenaz Kermalli, accused MP Kevin Vuong of hateful expression for posing with an individual wearing an “F Hamas” sweatshirt on social media. That's the problem with criminalizing language. It's subjective. It shifts depending on context.

These concerns become pressing with the expanded sanctions proposed in part 2. Even if our judges can be relied upon to respect the principles of proportionality when sentencing an offender under section 320, for example, the range of available sentences in the law will now include life imprisonment. It's not a frivolous possibility that prosecutors can refer judges to a range of sentencing up to life imprisonment for a crime such as vandalism if it is alleged that the crime was motivated by hate.

The reality is that it's virtually impossible to identify in advance, predictably, a line that separates the merely “awful but lawful” from criminal hate speech. This lack of clarity poses an urgent threat to online discourse, which is our current town square and should brook this type of passionate and adversarial disagreement. When these types of sanctions are in play, everyone has an incentive to err on the side of caution. Platforms will flag and remove content that is actually protected expression, and individuals will self-censor.

Finally, I will briefly address part 3 of the bill. It brings back a civil remedy for online hate speech, which allows members of the public to bring complaints before the Canadian Human Rights Commission. This would be disastrous. You should not go forward with this proposal. Even if most alleged instances are dismissed for not meeting the threshold of hate speech, the penalties for individuals found liable—up to $50,000 paid to the government plus $20,000 to the victim—are severe enough that we can infer that the new regime will lead to large amounts of soft-pedalling of expression for fear of skirting the line. It will interfere severely with press freedom to publish controversial opinions, which are necessary for a flourishing civil society. Finally, process is punishment, even if the case does not proceed. We will see more people punished for protected expression.

Thank you. I welcome your questions.

The Chair Liberal Lena Metlege Diab

The time is 12:26. I will guard time effectively to get us to one o'clock.

We will start with the first round, and we'll leave it at six minutes each.

MP Rempel Garner, go ahead for six minutes, please.

12:25 p.m.

Michelle Rempel Garner Conservative Calgary Nose Hill, AB

Thank you, Madam Chair.

Ms. Baron, I'd like to focus my line of questioning on clause 140 in part 1 of the bill, which lists the different types of powers that the regulator has to make and enforce regulations. I note there are over 25 different areas that the regulatory body would have the power to regulate. Are you concerned, given the broad terms used in this bill, like “harmful content”, that Parliament is ceding both rule-making and enforcement capacity to this regulator in such a broad way that it could have serious implications for things that you mentioned, like press freedom and freedom of speech?

12:25 p.m.

Executive Director, Canadian Constitution Foundation

Joanna Baron

Yes, absolutely. My understanding is that the platforms are not clear on how this is actually going to work. They've had some conversations, and all of this is said to be worked out later. From what we know, a lot of these decisions will be about speech that is, perhaps, close to the line and highly subjective. Much of it, we know, will end up being protected expression, even if it offends certain people or hurts certain people's feelings. Those decisions will come down to government-appointed bureaucrats or to platforms mindful of the severe financial consequences of running afoul of the bill.

12:25 p.m.

Michelle Rempel Garner Conservative Calgary Nose Hill, AB

Thank you.

It seems to me that having a regulator without a legislated duty of care that includes clearly defined terms on what online platforms would be responsible for is putting the cart before the horse in a potentially dangerous way. That's from the perspective of both delaying action that could protect victims and allowing opportunities for an unelected regulator to place significant restrictions on speech without legislative oversight.

Would you characterize that as an accurate fear in this situation?

12:25 p.m.

Executive Director, Canadian Constitution Foundation

Joanna Baron

Yes, I think that the goals the government has spoken about in protecting children and victims of revenge porn are pressing. It's unconscionable to create a new, $200-million regulator to combat those very specific harms.

12:25 p.m.

Michelle Rempel Garner Conservative Calgary Nose Hill, AB

Would you say that it would be more effective for the government, and perhaps for all parties, to spend time debating a legislated list of responsibilities for online platforms prior to abdicating responsibility? The better first step, prior to looking at a regulator, would be for Parliament to define what that responsibility is.

12:25 p.m.

Executive Director, Canadian Constitution Foundation

Joanna Baron

I think that's fair.