Evidence of meeting #95 for Access to Information, Privacy and Ethics in the 44th Parliament, 1st Session. (The original version is on Parliament’s site, as are the minutes.)

A recording is available from Parliament.

Also speaking

Brett Caraway  Associate Professor of Media Economics, University of Toronto, As an Individual
Emily Laidlaw  Associate Professor and Canada Research Chair in Cybersecurity Law, University of Calgary, As an Individual
Matt Malone  Assistant Professor, Thompson Rivers University, As an Individual
Sam Andrey  Managing Director, The Dais
Joe Masoodi  Senior Policy Analyst, The Dais

3:35 p.m.

Conservative

The Chair Conservative John Brassard

I'm going to call this meeting to order.

I want to welcome everyone to meeting number 95 of the House of Commons Standing Committee on Access to Information, Privacy and Ethics.

Pursuant to Standing Order 108(3)(h) and the motion adopted by the committee on Tuesday, January 31, 2023, the committee is resuming its study of the use of social media platforms for data harvesting and unethical or illicit sharing of personal information with foreign entities.

Today's meeting is taking place in a hybrid format, pursuant to the Standing Orders of the House. Members are participating in person, in the room, and virtually using the Zoom application.

I would like to remind all members not to put their earpieces near the microphones, because it could cause injury to our interpreters.

I would now like to welcome our witnesses for the first hour. As individuals, we have Brett Caraway, associate professor of media economics, University of Toronto; and Emily Laidlaw, associate professor and Canada research chair in cybersecurity law, University of Calgary.

Before we begin, the bells are ringing. I received unanimous consent from the committee to begin this meeting with the opening statements in advance of the votes. I appreciate the indulgence of committee members for allowing that to happen, so that we can listen to our witnesses.

Mr. Caraway, you have five minutes, followed by Ms. Laidlaw.

3:35 p.m.

Brett Caraway Associate Professor of Media Economics, University of Toronto, As an Individual

I would like to thank the members of the committee for the opportunity to speak today.

I'm an associate professor of media economics at the University of Toronto. I appear here today in a personal capacity, so the views expressed are mine and mine alone.

I want to speak about the risks posed by the underlying incentive structure of social media platforms. In doing so, I hope to convey some sense of the changes that have transpired in our media landscape and why there are, too often, divergences between public and private interests.

Digital platforms are major features of the information economy because of their capacity to reduce market frictions and lower transaction costs. To understand what I'm talking about, imagine how a social media app like Instagram might make a particular group of users, such as amateur photographers or travel enthusiasts, accessible to advertisers who want to target them with commercial messages.

In this scenario, there are actually three market actors. There are the users, the advertisers and the platform operator, and they each have their own set of incentives. Instagram has a financial incentive to maximize the number of users and their level of engagement. This makes the platform more attractive to advertisers. Advertisers want as much information as possible about the platform's users so they can minimize uncertainty, and users just want to enjoy the functionality of the platform with as little disruption as possible.

Multisided markets like these are nothing new. They've been a feature of mass communication systems since the earliest newsletters began selling advertisements in the 1600s. However, terms like “niche marketing” and “targeted advertising” only begin to scratch the surface of what actually transpires every time you enter a search query on Google, watch a video on TikTok, like someone's post on Facebook or retweet something. Information is gathered, auctions take place and commercial messages are delivered.

My concerns are not driven primarily by escalating geopolitical tensions or foreign threat actors, though foreign interference, misinformation, disinformation and radicalization are all genuine concerns. My concern is that these platforms, even when functioning exactly as intended, have adverse impacts on the public sphere. My concern is that the economics of platforms all but guarantees the propagation of disinformation, efforts to influence behaviour and the erosion of individual privacy.

My concern is born out of the realization that, in the economics of platforms, there is no effective upper limit to the exploitation of human attention. “Attention” might refer to the ability to concentrate on something, but from the perspective of society, it speaks to our collective ability to recognize problems and opportunities, to the horizon of our imagination and creativity, and to our ability to rise to the occasion to meet the world's most pressing problems. Attention is a renewable resource, but it isn't like any other resource. You can't hoard it like a precious metal. You can only direct it at something. In fact, that's the economic function of advertising: the allocation of scarce attention among its competing uses. How we choose to allocate our attention is important, both for individuals and for society. Our attention shapes who we are, who we might be and where we might go.

Economists often speak of “the tragedy of the commons”. The origin of the concept is problematic. As a metaphor, however, it can be quite useful. It alerts us to the propensity for overuse and exploitation of finite resources when we allow unfettered access to them. Digital platforms don't merely attempt to measure attention. They seek to modify it—to make it conform to commercial imperatives. Today's attention economy looks less like AMC's Mad Men and more like the speed-of-light trading that takes place in financial markets. The fundamental economics of this system is inconsistent with robust privacy protections.

The overriding economic imperative is to maximize data collection. It's not just the PRC or Russia. It's U.S. firms like Alphabet, Meta, Amazon and a host of data brokers you have never even heard of. As a consequence, our attention is exhausted. Its quality is diminished.

We have protections to safeguard other resources, such as water, air and habitat. We must manage this renewable resource in a similarly sustainable manner.

We are at an inflection point in Canada. It's my hope that we can take concrete steps to empower Canadians by creating a comprehensive regulatory framework for all digital platforms.

Thank you.

3:40 p.m.

Conservative

The Chair Conservative John Brassard

Thank you, Mr. Caraway.

Ms. Laidlaw, you have five minutes to address the committee. Go ahead, please.

3:40 p.m.

Dr. Emily Laidlaw Associate Professor and Canada Research Chair in Cybersecurity Law, University of Calgary, As an Individual

Thank you, Mr. Chair, for the invitation to address this committee.

I am honoured to speak to you from Calgary and the traditional territory of the people of the Treaty 7 region and the Métis Nation of Alberta.

I've had the opportunity to listen to some of the witnesses and the discussion leading up to my appearance. With my time, I would like to pull us back to look at the broader legal issues at play.

My key message is that this is not just about privacy. Privacy is one piece of the pie. For example, Discord does not use tools to detect child sexual abuse content, and it does not monitor or offer a tool for reporting livestreamed content. That's a recipe for disaster. This is a safety design problem, not only a privacy one.

This is about platform regulation. The health of our information ecosystem depends on privately owned platforms and the choices they make in the design of their products, corporate governance, culture and content moderation systems. In short, platforms have tremendous power.

Canada is currently a laggard in regulating platforms. Much of what this committee has discussed would be addressed by online harms legislation, which we do not yet have in Canada. Europe, the U.K. and Australia all have laws to address these issues. In some cases, they are on their second-generation or third-generation law. Canada has zero federal laws that apply generally to platform regulation. We can learn from the good and the bad of these other laws, but it is time to act now.

What do we need, and what are the areas we must be careful about?

First, platform regulation is a field like protecting the environment, and multiple areas of law must work in concert to protect our safety and rights. In particular, privacy law and online harms legislation are mutually reinforcing, so we need both. For example, algorithms that push harmful content do so by harvesting personally identifiable information, which is covered by privacy law. However, the algorithm can also draw from anonymized aggregate data, which falls outside of privacy law.

Online harms legislation can better target the choices that platforms make about their product designs and content moderation systems. Social media mines data to determine likes and interests, but it is what it does with this that online harms laws can address—such as Meta amplifying emotive and toxic content on Facebook by treating angry and love reactions as five times more valuable than likes. This fuelled the spread of misinformation and disinformation.

Second, platforms are part of the solution. They can be important collaborators and innovators in solving problems. There is, however, a friction when they are almost state-like in their role. Some have their own national security teams, essentially setting national security policy.

We also depend on platforms to go above and beyond the law in addressing hateful content, disinformation and violent extremism, all of which are not necessarily illegal. However, that is not a substitute for law to set industry standards. Standards are needed. The examples I gave were platforms with relatively sophisticated governance structures. There are many popular platforms that minimally govern the risks of their products.

Third, when we talk about the risks of harm, we should be clear that not all risks are the same. Child protection, hate and terrorist propaganda, disinformation, and violence all have different dynamics and should not be distilled to one legal rule, except for the basic idea of corporate due diligence.

Further, when we talk about the risks of harm, these include risks to fundamental rights: the rights to freedom of expression, to privacy and to equality. Any analysis of solutions in law or governance must be through the lens of protection and promotion of rights. This is particularly challenging when it comes to addressing misinformation and disinformation because, except in narrow circumstances, it is lawful to believe and share false information.

I will leave you with this: What are the basic components needed in online harms legislation?

Platforms should have a duty to manage the risks of harm of their products and a duty to protect fundamental rights. There should be transparency obligations matched with a way to vet transparency through audits and access to data by vetted researchers. There should be the creation of a regulator to investigate companies and educate the public, and there should be access to recourse for victims, because this is a collective harm but also an individual one.

Thank you, and I welcome questions.

3:45 p.m.

Conservative

The Chair Conservative John Brassard

Thank you, Ms. Laidlaw.

We are going to ask you to be patient because we have roughly 11 minutes left before the vote. That should take another 10 minutes or so, so we should be back in 25 minutes with our first round of questioning, if that's okay—if you can hang on.

I am going to suspend the meeting for the vote. We'll be back right after.

Thank you.

4:10 p.m.

Conservative

The Chair Conservative John Brassard

I'm going to call the meeting back to order. I do note that there is closure, and a vote is imminent. We have roughly 45-50 minutes here. We've had the opening statements from the witnesses, and we appreciate their patience.

We're going to start our first round of six-minute questioning with Mr. Kurek.

4:10 p.m.

Conservative

Damien Kurek Conservative Battle River—Crowfoot, AB

Thank you very much, Chair, and thank you to our witnesses.

I appreciate, as well, that in your statements you provided a number of practical suggestions. As always, feel free to follow up if there's anything additional.

Mr. Caraway, I found some of your work very interesting, because the economics of social media is certainly a fascinating subject. It's that balance between a service that is perceived to be free and the cost of something that is quite expensive to run, like a social media platform. When we talk about regulating and managing that, how do we balance the consumer's desire not to pay for a service against the demands of running a massive web operation?

4:15 p.m.

Associate Professor of Media Economics, University of Toronto, As an Individual

Brett Caraway

That's a great question.

The thing I always tell my students is that there's no such thing as a free lunch. Even though it appears that you're getting these services for free, if it's an advertising-supported model of some kind, you're paying for that when you purchase goods or services later on.

Most of the platforms we're talking about in the social media realm run as multisided marketplaces. It is quite difficult to keep everybody happy. As I alluded to, you're trying to keep the advertisers happy, but the advertisers want as much information as possible about the users. The users just want to be left alone to use the platform, but they also don't necessarily want to pay for it. That's never a popular thing, except maybe in some online streaming contexts when you're looking at services like Spotify or Netflix. However, even in the subscription-based models, a company like Netflix, which isn't necessarily doing the same sort of data harvesting that companies like Meta or Alphabet are doing, is also gathering data on how the users use the platform and deploying AI for recommendation systems, etc.

There's always an economic imperative for the advertisers to demand more data; therefore, the platform operators will harvest more data. I think that speaks to the need for the government to step in and say, “Well, here are the enumerated rights that we recognize for citizens, such as privacy.” I don't just mean including them in a preamble, but actually putting them into legal tests, so—

4:15 p.m.

Conservative

Damien Kurek Conservative Battle River—Crowfoot, AB

I apologize, but we only have very limited time here.

I would like you to follow up—in about 30 seconds, because I would like to ask some more questions—on the specific context of protecting kids. Could you expand a bit on how that plays into protecting kids from online harms, and what that looks like in the context of young people?

4:15 p.m.

Associate Professor of Media Economics, University of Toronto, As an Individual

Brett Caraway

That's supposed to be part of Bill C-27: the formulation of some sort of protection for minors. If you listen to representatives from TikTok, they will tell you that they have self-regulation and that they are the vanguard of it. They would say that people 18 and under can't livestream, that certain privacy settings are on by default for people 16 and under, or that people 16 and under are limited to 60 minutes. However, these are settings that can simply be changed.

It is important for the government to step in and help parents out, because they are literally overwhelmed by all the different social media platforms, and, of course, teens are on these platforms, depending on whom you ask, four to five hours a day.

4:15 p.m.

Conservative

Damien Kurek Conservative Battle River—Crowfoot, AB

Thank you very much for that. I would just emphasize that “to help parents out” is a great line there.

Ms. Laidlaw, we're talking about protecting young people. There's a range of harms on social media, from things like bullying all the way up to the most heinous types of exploitation, things associated with human trafficking, child exploitative material and that sort of thing.

In the context of social media and young people, what's the government's role in terms of developing regulations? What is the role of social media platforms in terms of trying to create frameworks that deal with the massive range of possible challenges that we face here?

There's about a minute and a bit left, and I know it's a big question. Hopefully, that's enough time for you to give some feedback to the committee.

4:15 p.m.

Associate Professor and Canada Research Chair in Cybersecurity Law, University of Calgary, As an Individual

Dr. Emily Laidlaw

Thank you.

It's an enormous question, but it's the money question.

I will keep it brief and state that it's crucial that government play a role, because thus far we've mostly relied on corporate self-governance and it hasn't worked. I mean, we're seeing all kinds of harms happen online.

What we do need is a regulator, because a regulator can be more agile in dealing with this. It's too cumbersome for some of these concerns to work through the courts. We need help to sort of set practices. Each platform is different, so the platforms really do need to come up with solutions for their spaces. It's just that there needs to be a method to hold them accountable for it. They need to demonstrate to some regulator the steps they're taking to protect children.

I think we need to divvy up the harms. If you're talking about specific child protection measures—looking at child sexual abuse images, intimate image abuse, trafficking—these are crimes, and the primary actors, to the extent they can be found and prosecuted, should be the targets, but separate responsibilities and special duties should exist for platforms.

When it comes to child—

4:20 p.m.

Conservative

The Chair Conservative John Brassard

Thank you, Ms. Laidlaw.

I'm sorry. The worst part of my job is cutting people off.

Ms. Khalid, you have six minutes. Go ahead, please.

4:20 p.m.

Liberal

Iqra Khalid Liberal Mississauga—Erin Mills, ON

Thank you very much, Chair.

Ms. Laidlaw, you talked about solutions with respect to social media companies. Perhaps you can finish your thoughts on Mr. Kurek's question and also comment on what you think are good solutions that social media companies can look into and adopt as best practices.

4:20 p.m.

Associate Professor and Canada Research Chair in Cybersecurity Law, University of Calgary, As an Individual

Dr. Emily Laidlaw

Just to clarify, are you asking from a corporate governance perspective or what laws we should pass?

4:20 p.m.

Liberal

Iqra Khalid Liberal Mississauga—Erin Mills, ON

I meant both, actually.

4:20 p.m.

Associate Professor and Canada Research Chair in Cybersecurity Law, University of Calgary, As an Individual

Dr. Emily Laidlaw

Okay.

First, this is a legal question, and we do need the government to pass online harms legislation, because it needs to set the duties for companies. Basically, it needs to set minimum standards. The companies themselves, though, can start taking the protection of children from harm more seriously.

I think one of the issues is that a lot of the transparency we're seeing now tends to be more of a marketing exercise. I think it's not as upfront about what some of these practices are. This is a key aspect, of course, that Dr. Caraway has talked about: the attention economy.

Specifically for children, I think we need to think about this as mind manipulation. Historically, there were interventions in areas of advertising to protect children from mind manipulation. You didn't have certain ads at certain times of day and with children's shows. This is the same thing that's happening on social media: pushing suicide content, eating disorder how-to content and so on.

It is critical that these platforms, from a design basis.... How are we designing this platform? How are algorithms pushing content? How are we nudging certain behaviours? They need to address that and account for that, so I think there should be special duties for children.

4:20 p.m.

Liberal

Iqra Khalid Liberal Mississauga—Erin Mills, ON

Thank you very much.

Dr. Caraway, do you have any comments on Ms. Laidlaw's remarks with respect to where the economy sits versus mind manipulation, or changing the behaviour of consumers? What role does government have to play in that with respect to regulation? I realize and understand that we can't simply create regulations, since bad actors don't always follow them. What role do you think government has to play in that balance of economy, efficiency and mind manipulation?

4:20 p.m.

Associate Professor of Media Economics, University of Toronto, As an Individual

Brett Caraway

One thing that I think is really important...well, there are kind of two things.

I think we need to pay very careful attention to what constitutes informed consent. What is problematic to me is the way in which not just children but everyday users are confronted with end-user licence agreements so convoluted that it takes someone like Dr. Laidlaw to make sense of them. They require so much expertise and are subject to change almost daily. I think it's important to revisit what actually counts as consent here.

Then there is transparency and the way in which the data is used. This is something where I do think that you need to be able to have something like a Privacy Commissioner, who can send in a third party auditor to see what's actually happening behind the scenes.

Lastly, I would say that the penalties have to have bite to them. Yes, $25 million sounds like a lot, but maybe not to Meta or Alphabet, while 5% of global revenues sounds a little more serious. I like that sort of approach too.

4:25 p.m.

Liberal

Iqra Khalid Liberal Mississauga—Erin Mills, ON

Thank you.

We've seen, over the past number of years, social media being used as a platform for advocacy, for speaking out and for expressing yourself, not only across Canada but around the world. A couple of years ago, on TikTok, we saw a young woman putting on makeup while talking about issues, in order to circumvent the algorithms that filter what is said or displayed.

Where is the interplay between social media and freedom of expression, on the one hand, and making sure that kids in Canada have the safety and security they need as they navigate this space, on the other?

4:25 p.m.

Associate Professor of Media Economics, University of Toronto, As an Individual

Brett Caraway

Who would you like to respond to that?

4:25 p.m.

Liberal

Iqra Khalid Liberal Mississauga—Erin Mills, ON

I would like your comments first, Dr. Caraway, and then I'll go to Ms. Laidlaw.

4:25 p.m.

Conservative

The Chair Conservative John Brassard

We have roughly a minute. Go ahead, Mr. Caraway.

4:25 p.m.

Associate Professor of Media Economics, University of Toronto, As an Individual

Brett Caraway

Since that bears on freedom of expression, Dr. Laidlaw would maybe want to take that one, instead of me.

4:25 p.m.

Associate Professor and Canada Research Chair in Cybersecurity Law, University of Calgary, As an Individual

Dr. Emily Laidlaw

Thanks, Dr. Caraway.

I will say that freedom of expression is foundational. If you pass a law that just incentivizes a focus on harms, you incentivize companies to put in rudimentary solutions that, in fact, backfire. There's been a lot of evidence of backfiring, where racialized and other marginalized voices are what end up being silenced.

As for the requirement on companies: if we care about harms, we care about harms to rights, so social media companies need a dual focus. They need to focus on how they protect and promote freedom of expression and show that to a regulator. They need to demonstrate the steps they are taking that are contextual and bespoke to their services.