Evidence of meeting #126 for Justice and Human Rights in the 44th Parliament, 1st Session. (The original version is on Parliament’s site, as are the minutes.)

A recording is available from Parliament.


Also speaking

Anaïs Bussières McNicoll  Director, Fundamental Freedoms Program, Canadian Civil Liberties Association
Catherine Claveau  Bâtonnière du Québec, Barreau du Québec
Nicolas Le Grand Alary  Lawyer, Secretariat of the Order and Legal Affairs, Barreau du Québec
Michel Marchand  Member, Criminal Law Expert Group, Barreau du Québec
Emily Laidlaw  Associate Professor and Canada Research Chair in Cybersecurity Law, University of Calgary, As an Individual
Étienne-Alexis Boucher  President, Droits collectifs Québec
Matthew Hatfield  Executive Director, OpenMedia

Étienne-Alexis Boucher President, Droits collectifs Québec

Good evening, parliamentarians, honourable members of the House of Commons Standing Committee on Justice and Human Rights.

Thank you for this opportunity to speak as part of the pre‑study on Bill C‑63, which concerns online hate speech.

My name is Étienne‑Alexis Boucher. I'm the president of Droits collectifs Québec. I was supposed to be joined by François Côté, senior legal officer at Droits collectifs Québec. Unfortunately, he can't join us on account of the brand of his microphone.

Droits collectifs Québec is a non‑profit organization governed by an independent board of directors. It identifies as an agent of social transformation and operates throughout Quebec. Our mission is to help advocate for collective rights in Quebec, particularly with regard to people's language and constitutional rights. Our approach is non‑partisan. The organization's work encompasses many areas of action, including public education, social mobilization, political representation and legal action.

I've just given a brief overview of the organization. I would now like to focus on the Quebec consensus, which covers two aspects. We've already addressed the first, which the witnesses in the first panel touched on earlier: we heard particularly poignant testimony from the mother of a young woman whose intimate images were shared.

While Ottawa refused to budge on this issue, Quebec ended up taking the lead and became a pioneer in the field. The National Assembly adopted measures in an area that would normally fall under the Criminal Code, even though Quebec has no power over the Criminal Code, at least in the current situation. Using its constitutional prerogatives, Quebec adopted measures concerning the sharing of intimate content without consent. In other words, since the federal government wasn't addressing the issue, Quebec responded to the Quebec consensus with this initiative.

Another example of the Quebec consensus is the National Assembly's unanimous adoption of the request to repeal subsections 319(3)(b) and 319(3.1)(b) of the Criminal Code. These subsections state that “no person shall be convicted of an offence” of wilfully promoting hatred against an identifiable group “if, in good faith, the person expressed or attempted to establish by an argument an opinion on a religious subject or an opinion based on a belief in a religious text.”

This exception in the name of religious freedom has no place in a modern state such as Canada. We know that the Constitution of 1867 states that power in Canada is granted by divine right. Even the head of state is chosen not democratically by the citizens of Canada, but by God. However, it's now the 21st century. I don't think that freedom of religion should rank higher than freedom of conscience, for example, or freedom of political opinion, when everyone acknowledges that certain limits on these freedoms are valid. For example, teachers may not, in the course of their duties, express opinions on the political status of Quebec or Canada. These limits on a basic freedom are perfectly justifiable.

However, we find it completely unacceptable to make something normally considered a crime into a non‑crime in the name of freedom of religion. As a result, we're ultimately encouraging the parliamentarians to heed the call of Quebec's justice minister. Once again, the vast majority of Quebeckers are in agreement. The justice minister expressed a widely‑held consensus that hate speech based on religion is simply unacceptable.

There have been concrete examples of the abuses and effects of this exception. In a fully public manner, in front of hundreds of thousands of individuals, if we count the people who viewed the images widely available on social media, a call to genocide was made in the name of a religion.

Unfortunately, this call could not be criminally prosecuted, probably because of the exception. Again, we find this unacceptable. This position is shared by the Quebec government and by organizations such as the Rassemblement pour la laïcité, of which I am the vice-president. Ours is an umbrella organization for dozens of organizations representing thousands of people.

5:20 p.m.

Conservative

The Vice-Chair (Mr. Larry Brock)

Thank you, Mr. Boucher.

You may have additional time if the members decide to give it to you. However, your time is up for now.

Now we're going to move on to Mr. Hatfield.

You have five minutes, sir.

Matthew Hatfield Executive Director, OpenMedia

Good evening. I'm Matt Hatfield, the executive director of OpenMedia, a non-partisan, grassroots community of over 250,000 people in Canada working for an open, affordable and surveillance-free Internet.

I'm joining you from the unceded territory of the Stó:lō, Tsleil-Waututh, Squamish and Musqueam nations.

It's a pretty remarkable thing to be here today to talk about the online harms bill. When Canadians first saw what this bill might look like as a white paper back in 2021, we didn't much like what we saw. OpenMedia called it a blueprint for making Canada's Internet one of the most censored and surveilled in the democratic world, and we were far from alone in being concerned.

For once, our government listened. The rush to legislate stopped. National consultations were organized across the country on how to get regulation right with a wide range of stakeholders and experts on harms and speech. The resulting part 1 of Bill C-63 is an enormous, night-and-day improvement. Simple-minded punitive approaches that would have done more harm than good are gone, and nuances and distinctions made throughout show real sophistication about how the Internet works and how different harms should be managed. Packaging part 1—the online harms act itself—with changes to the Criminal Code and Human Rights Act proposed alongside it badly obscured that good work. That's why, alongside our peers, we called for these parts to be separated and why we warmly welcome the government's decision to separate those parts out.

I'll focus here on part 1 and part 4.

OpenMedia has said for years that Canadians do not have to sacrifice our fundamental freedoms to make very meaningful improvements to our online safety. The refocused Bill C-63 is the proof. Instead of trying to solve everything unpleasant on the Internet at once, Bill C-63 focuses on seven types of already-illegal content in Canada, and treats the worst and most easily identifiable content—child abuse material and adult material shared without consent—most severely. That's the right call. Instead of criminalizing platforms for the ugly actions of a small number of users, which would predictably make them wildly overcorrect to surveil and censor all of us, Bill C-63 asks them to write their own assessments of the risks posed by these seven types of content and document how they try to mitigate that risk. That's the right call again. It will put the vast engineering talent of platforms to work for the Canadian public, thinking creatively about ways to reduce these specific illegal harms. It will also make them explain what they are doing as they do it, so we can assess whether it makes sense and correct it if it does not.

However, I want to be very clear: It is not the time to pass Bill C-63 and call it quits. It's just the opposite. Because the parts that are now being separated raise so many concerns, there has not been nearly enough attention paid to refining part 1. I know you'll be hearing from a range of legal and policy experts about concerns they have with some of the part 1 wording and recommended fixes. I hope you will listen very carefully to all of them and pass on many of the fixes they suggest to you.

This is not the time to be a rubber stamp. The new digital safety commission is granted extraordinary power to review, guide and make binding decisions on how platforms moderate the public expression of Canadians in the online spaces we use the most. That's appropriate if, and only if, you make sure they carefully consider and minimize impacts on our freedom of expression and privacy. It isn't good enough for the commission to think about our rights only in its explicit decisions. A badly designed platform safety plan could reduce an online harm but have a wildly disproportionate impact on our privacy or freedom of expression. You need to make sure platforms and the regulator make written assessments of the impact of their plans on our rights and ensure that any impact is small and proportionate to the harm mitigated. Bill C-63's protections of private, encrypted communication, and against platforms surveilling their users, need to be strengthened further and made airtight.

OpenMedia has a unique role in this discussion because we are both a rights-defending community that will always stand up for our fundamental freedoms and a community of consumer advocates who fight for common-sense regulation that empowers us and improves our daily lives. If you do your work at this committee, you can make Bill C-63 a win on both these counts. Since 2021, members of our community have sent nearly 22,000 messages to government asking you to get online harms right. Taking your time to study Bill C-63 carefully and make appropriate fixes before passing it would fulfill years of our activism and make our Internet a better, healthier place for many years to come.

Thank you, and I look forward to your questions.

The Vice-Chair (Mr. Rhéal Éloi Fortin (Rivière-du-Nord, BQ))

Thank you, Mr. Hatfield.

You're not seeing things. I'm replacing Mr. Brock, but the process remains the same.

I'd like to thank the witnesses for their presentations.

Mr. Jivani, you have the floor for six minutes.

5:25 p.m.

Conservative

Jamil Jivani Conservative Durham, ON

Thank you, Mr. Chair.

My first question is for Mr. Hatfield.

Thank you for your presentation.

I'm curious, given the very clear concerns you've expressed relating to parts 2 and 3 of Bill C-63, why you're not more concerned about some sections of part 1, particularly those related to the digital safety commission, the digital safety office and the digital safety ombudsperson, which would lay some of the bureaucratic groundwork that makes parts 2 and 3 possible.

Are you concerned about those sections of part 1? Would you care to give us some specific concerns you have related to part 1, which we're focused on today?

5:25 p.m.

Executive Director, OpenMedia

Matthew Hatfield

I don't think that part 1 does require parts 2 and 3. I think we can fully separate these, and I think the government should fully separate these.

Regarding the concerns we have with part 1, it's a huge amount of power we're putting into the hands of the regulator. We believe it is important that Canada have a regulator here, in the same way that we have a privacy regulator and a competition regulator. Digital safety is a complex and nuanced enough issue that having a source of government expertise to help make good decisions is valuable, but that doesn't mean you should just hand a blank cheque to this regulator. You need to put some really careful assessments and required limits on that regulator before this bill leaves committee.

5:25 p.m.

Conservative

Jamil Jivani Conservative Durham, ON

Would you care to elaborate on what some of those limitations would be, from your point of view, that would need to be considered?

5:30 p.m.

Executive Director, OpenMedia

Matthew Hatfield

The single greatest limitation, in our view, is that its decisions need to have mandatory assessments of their impact on freedom of expression and privacy, both for the decisions directly made by the commission and for its approval of these safety plans.

If they approve safety plans submitted by the platforms, the platforms need to write down what they think the privacy and free expression impacts are. The regulator needs to assess and determine that those impacts are proportionate, with the opportunity, frankly, for a case to be taken against them if the plans are not proportionate, so that they make impactful decisions but within bounds.

5:30 p.m.

Conservative

Jamil Jivani Conservative Durham, ON

Mr. Hatfield, there are other bills on the table for consideration that would be more focused, for example, on updating existing laws and making it easier for the existing criminal justice system to be responsive to victims and to hold platforms more accountable. It sounds like—and correct me if I'm wrong—you prefer the creation of what I would consider to be new bureaucracy as opposed to strengthening the system that we currently have. Why?

5:30 p.m.

Executive Director, OpenMedia

Matthew Hatfield

I think there's a lot to appreciate in Bill C-412. We do think that bill is worthy of study if this bill does not pass, but I think that Bill C-63 would accomplish more for Canadians over a longer period of time than Bill C-412, which is a narrow, perhaps too narrow, bill. When it comes to the harms that both of them treat, I think having a regulator involved is really beneficial.

Now, if you look at privacy law, we don't just say, “Here are your privacy laws on paper, and here's a private right of action, go to it. Our privacy is defended.” We found it extraordinarily valuable to have a Privacy Commissioner who can assist Canadians in asserting their privacy rights. Our hope for this digital safety commission is that they will function similarly.

5:30 p.m.

Conservative

Jamil Jivani Conservative Durham, ON

Mr. Hatfield, you mentioned the long-term effects of part 1. I would put forward that those long-term effects are the very parts 2 and 3 that you're concerned about, which is why I think a lot of Canadians who agree with a lot of the objectives the government has expressed relating to part 1 are concerned about the ripple effects of what that will mean down the road, especially when the current government has already stated its intentions with a part 2, a part 3 and a part 4. They've already said that there will be sequels. If you are concerned about the sequels, maybe the original is worth reconsidering.

I appreciate your contributions. Thank you.

How much time do I have?

The Vice-Chair (Mr. Rhéal Éloi Fortin)

You have a minute and a half.

5:30 p.m.

Conservative

Jamil Jivani Conservative Durham, ON

With the remainder of our time, Ms. Laidlaw, I'd like to come back to you to ask you to elaborate a bit more on the amendments that you referenced in your opening statement. If you'd like to give us a bit more detail and more context, I'd appreciate it.

5:30 p.m.

Associate Professor and Canada Research Chair in Cybersecurity Law, University of Calgary, As an Individual

Dr. Emily Laidlaw

Yes, thanks so much for that opportunity.

What I think is critical—and this builds on what Matt was just talking about—is that there is always a risk of overcorrection if the focus is purely on harms. That's why it's important to recognize that one of the key harms can be to freedom of expression, and to privacy in particular. It's important for the companies to file digital safety plans that explain how they make decisions, bespoke to their services, that address the scope of harms while also thinking through a way of doing it that is most protective of privacy and freedom of expression. The digital safety commission would have a duty to consider that in what it does, but the onus needs to also be on the company.

I think, concerning the child protection measures, that the best interest of the child is protected under international law. I think that is the blueprint here. Detailing specifically what it is about child protection that we're looking for when we talk about safety by design is incredibly important.

Of course, there's algorithmic accountability. I can discuss this further with you, but I'm conscious of time.

5:30 p.m.

Conservative

Jamil Jivani Conservative Durham, ON

Thank you.

The Vice-Chair (Mr. Rhéal Éloi Fortin)

Mrs. Brière, you have the floor for six minutes.

Élisabeth Brière Liberal Sherbrooke, QC

Thank you, Mr. Chair.

It's great to see you in the chair, Mr. Chair.

I would like to say hello to Étienne-Alexis Boucher, who hails from my region.

I'm going to direct my questions to Ms. Laidlaw and Mr. Hatfield.

Ms. Laidlaw, it was said earlier that the bill was the result of several years of consultation. You've been part of this process.

On what basis was the list of seven categories of harmful content drawn up?

The bill sets out an obligation for platforms to act responsibly. What does that mean in concrete terms? I imagine it implies the obligation to identify the risk of harm and mitigate its effects.

I'd like to hear your comments on that.

5:35 p.m.

Associate Professor and Canada Research Chair in Cybersecurity Law, University of Calgary, As an Individual

Dr. Emily Laidlaw

Thank you.

Having had many discussions with other governments, I can say that the duty to act responsibly is generally the same as the duty of care in the U.K. or the due diligence and risk management obligations in Europe broadly. It's all about this due diligence approach of companies at a systemic level.

The duty to act responsibly came out of the Commission on Democratic Expression, and it's really practical. Their recommendation was that we shouldn't use the language “duty of care”, which is the language of tort law and might be confusing if this goes to court. We want this to be a stand-alone statutory duty that is just set out in the legislation, and “duty to act responsibly” captured that better.

When it comes to the harms that are included, I think that is a point for debate. In discussions with colleagues, one that could be added to the list is the crime of identity fraud. That is a major issue, and I think it would be appropriate to include that.

It's notable what's not on the list. A point I have had multiple discussions about is the inclusion of mis- and disinformation, which generally fall into the category of “lawful but awful”. That is included in the EU legislation. The decision was not to include it in Canada because of what I would describe as the problematic risks to freedom of expression. That is, we would be biting off more than should be taken on by a regulator.

One last point is that we have to think in terms of what a regulator can practically take on. We know this will cost money, so some of this is a discussion about which high-risk issues we should practically include, ones that a regulator can investigate and make a difference on now.

I'll leave it there.

Thank you.

Élisabeth Brière Liberal Sherbrooke, QC

Thank you very much.

You are of the opinion that the list of categories of harmful content should be reviewed.

Did I understand you correctly?

5:35 p.m.

Associate Professor and Canada Research Chair in Cybersecurity Law, University of Calgary, As an Individual

Dr. Emily Laidlaw

I'm acknowledging that it's a point of debate. There could be some reasonable arguments either way. I am comfortable with proceeding with the list as-is. The only one I would add would potentially be identity fraud, but I would not be moving far away from the list as it is, considering what's in federal jurisdiction to address as well.

Élisabeth Brière Liberal Sherbrooke, QC

Thank you.

In one of your publications, you said that this bill covered the basics, but certain amendments could be made.

In one article, which was published in English, you say: “This bill gets the big things right.”

Are you still of that opinion?

5:35 p.m.

Associate Professor and Canada Research Chair in Cybersecurity Law, University of Calgary, As an Individual

Dr. Emily Laidlaw

Yes, 100%. The biggest point of debate was how to actually structure a body to address these issues that balances harms and freedom of expression, and this does that. That is because of the years of consultations, and because it will fit in with the other global regulators.

Élisabeth Brière Liberal Sherbrooke, QC

Since the beginning of the study, a number of parents, particularly mothers, have told us horrible stories about what their children had experienced. Some young people have even committed suicide.

Do you believe that Bill C‑63 will really allow us to achieve the goals as they are set out?

5:35 p.m.

Associate Professor and Canada Research Chair in Cybersecurity Law, University of Calgary, As an Individual

Dr. Emily Laidlaw

Yes. The caveat is that we're never going to rid the Internet of harm, and it's never going to be a perfect piece of legislation. This is about making things better.

This cannot happen with existing law or by improving existing laws, and it cannot happen just through the courts, although the courts are an important process. This requires a regulator that can work with industry, that is more flexible, that can work with impacted communities and civil society groups, and that has the power for quick content removal for the worst of the worst.

This is an ongoing project and, in the long term, I think this will make things better, but certainly not perfect.

The Vice-Chair (Mr. Rhéal Éloi Fortin)

You have 10 seconds left, Mrs. Brière.