Evidence of meeting #85 for Access to Information, Privacy and Ethics in the 44th Parliament, 1st Session. (The original version is on Parliament’s site, as are the minutes.)

A recording is available from Parliament.

Also speaking

Steve de Eyre  Director, Public Policy and Government Affairs, Canada, TikTok
David Lieber  Head, Privacy Public Policy for the Americas, TikTok
Clerk of the Committee  Ms. Nancy Vohl

5:25 p.m.

Conservative

The Chair Conservative John Brassard

Good afternoon, everyone. I am going to call the meeting to order.

I have a long preamble here, but I want to get right to this so I'm going to shorten it up.

Pursuant to Standing Order 108(3)(h) and the motion adopted by the committee on Tuesday, January 31, 2023, the committee is commencing its study of the use of social media platforms for data harvesting and unethical or illicit sharing of personal information with foreign entities.

This is a reminder, for the sake of the committee, that the connections test has been completed with our witness. I know he is coming to us from Washington, D.C., today, so let's start the meeting.

First of all, I apologize for the delay, but I have my own 30-minute declaration on decorum at committee that I'd like to read right now.

5:25 p.m.

Voices

Oh, oh!

5:25 p.m.

Conservative

The Chair Conservative John Brassard

I also want to recognize that we have the parliamentary officers' study program here today. This is why we have a large crowd in the back. I want to say “welcome to committee” to them as well.

We have our witnesses until 6:30. From TikTok, we have Mr. Steve de Eyre, director of public policy and government affairs in Canada, and Mr. David Lieber, head of privacy public policy for the Americas. I want to thank both gentlemen for being here today.

We're going to start with you, Steve. You have five minutes to address the committee. Go ahead, please.

5:25 p.m.

Steve de Eyre Director, Public Policy and Government Affairs, Canada, TikTok

Thank you, Mr. Chair and members of the committee.

As you said, my name's Steve de Eyre. I'm the director of public policy and government affairs for TikTok Canada. I'm joined by my colleague, David Lieber, who's our head of privacy public policy for the Americas. He's on Zoom from Washington.

While I do work for TikTok, first and foremost, I am a father of two wonderful children. I care deeply about them being safe and secure online—as I know all parents would—so I'm really happy to have this opportunity to be here today to discuss how we're protecting the data of all Canadians, particularly teenagers.

Millions of Canadians and over a billion people around the world come to TikTok to be entertained, to learn and to build community. TikTok is where an incredibly diverse range of Canadian creators and artists are finding unprecedented success with global audiences. It's where indigenous creators are telling their own stories in their own voices. It's where small business owners are finding new customers, not just across Canada but around the world.

For example, just last week I was able to listen to Jenn Harper. She's the CEO of Cheekbone Beauty, which is an indigenous-owned B corp-certified beauty brand. She spoke at the Toronto Global Forum about the role that TikTok has played in helping her to grow a global customer base and build a community that's invested in learning more about sustainability and indigenous peoples.

We know that with this rapid growth comes scrutiny. We welcome conversations around how we protect Canadians' data.

Let me start by addressing a few misconceptions about TikTok.

First, TikTok is a subsidiary of its parent company, ByteDance. ByteDance is not owned or controlled by the Chinese government. It's a private company. Nearly 60% of ByteDance is owned by global institutional investors, such as General Atlantic and Susquehanna International Group; 20% is owned by its founders; and the other 20% is owned by employees like me. Of ByteDance's five board members, three are Americans.

TikTok has thousands of employees around the world, with head offices in Los Angeles and Singapore. We have a Canadian office in Toronto's Liberty Village with over 150 employees who work closely with Canadian creators, artists and businesses to help them achieve success on the platform.

The second misconception I'd like to address is around data collection. TikTok's handling of Canadians' user data is governed by Canadian laws like PIPEDA and provincial privacy laws. The way TikTok collects and uses data is similar to the way in which other platforms do. In fact, when someone signs up for TikTok, all they're required to share is their email or phone number, their date of birth and their country location. These are used to verify that they're eligible to create an account and to create an age-appropriate experience. We do not require users to provide things like their real names or to enter personal details about themselves.

We take security concerns about our platform very seriously, and we are working globally to be responsive and put forth constructive, industry-leading solutions to address any concerns. If the Canadian government has concerns about the safety and security of our platform, we want to understand them so we can address them.

As TikTok has grown, we've tried to learn the lessons of companies that have come before us, especially when it comes to the safety of teenagers. While the vast majority of people on TikTok are over 18, we've spent a lot of time on measures to protect teenagers. Many of these are a first for the social media industry.

For example, when a teen under 16 joins TikTok, their account will be set to private by default and they'll have direct messaging disabled. Teens under 18 are unable to livestream and have a default 60-minute screen time limit turned on. We also provide a suite of family-pairing tools so parents and guardians can participate in their teen's experience and make choices that are right for their families.

We're proud to partner with leading Canadian non-profits like MediaSmarts, Kids Help Phone, Tel-jeunes and Digital Moment to support their work to educate Canadians and to create resources for things like online safety, well-being and digital literacy.

We're proud to have also built a constructive relationship across the federal government over the past few years and to have partnered to support key public policy initiatives. For example, during the 2021 federal election, we worked with Elections Canada to build a bilingual, in-app election centre that provided authoritative information to Canadians on when, where and how to vote. We were also proud that year to sign on to the government's declaration on election integrity online.

TikTok is committed to the safety and security of our community and to maintaining the integrity of our platform.

I look forward to sharing more with you about how we accomplish this.

Thank you.

5:30 p.m.

Conservative

The Chair Conservative John Brassard

Thank you, Mr. de Eyre.

I'm going to remind committee members before we start the round of questioning that we're kind of old school around here. If you have a question, go directly to the witness, either in person or online, just to be clear, and let Mr. Lieber know that you're asking a question.

We're going to start with six minutes for Mr. Kurek.

Go ahead, please.

5:30 p.m.

Conservative

Damien Kurek Conservative Battle River—Crowfoot, AB

Thank you very much.

Thank you to our witnesses for coming.

I'm curious, Mr. de Eyre. You mentioned in your opening statement that you have kids. Do they have TikTok?

5:30 p.m.

Director, Public Policy and Government Affairs, Canada, TikTok

Steve de Eyre

They're too young. In Canada, you have to be 13 to use TikTok, and it's actually 14 in Quebec.

No, they don't use it. I will scroll it with them sometimes and we'll find videos that we want to watch together, but they're too young to have their own accounts.

5:30 p.m.

Conservative

Damien Kurek Conservative Battle River—Crowfoot, AB

Okay. I appreciate that.

Now I'm curious about Mr. Lieber and if he has kids. If so, are they on TikTok?

5:30 p.m.

David Lieber Head, Privacy Public Policy for the Americas, TikTok

Thank you for the question.

Yes, I do have two children. One is 11 and is too young for TikTok. The other is 15, and he is on TikTok.

5:30 p.m.

Conservative

Damien Kurek Conservative Battle River—Crowfoot, AB

I appreciate that.

Speaking of age verification, even among the social media platforms that have it, there seem to be ways around some of these things.

Specifically with TikTok, Mr. de Eyre, you're asked to put in your age, and there are really no questions asked. You talked about that. It's simple, but have the concerns ever been highlighted about how simple that is, in that as a result a young person may be able to access content that would not be age appropriate?

5:30 p.m.

Head, Privacy Public Policy for the Americas, TikTok

David Lieber

I'm happy to take that question.

We do have a neutral age-gating function for users. We don't clue them in about what the eligibility age is. We don't indicate to them that providing an age that's either under 13 or over 13 is what they need to do, but I think you're raising a valid point. Age assurance strategies are something that the industry is talking about.

We're talking about things like age detection technology and age verification. Those can be helpful in providing more accuracy about users' ages, but they also have privacy implications. Therefore, there's a broader conversation about the types of age assurance strategies that we can deploy as an industry to increase the likelihood and confidence that we know what age users are when they disclose that to us when they create an account.
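To illustrate the neutral age gate described above, here is a minimal sketch, assuming a simple date-of-birth form: the prompt never reveals the eligibility threshold, so the form itself does not coach the user toward a passing answer. The function names, region codes and minimum ages below are hypothetical placeholders, not TikTok's actual implementation or configuration.

```python
# Minimal illustrative sketch of a "neutral" age gate: the user is asked only
# for a date of birth; the eligibility threshold is never shown in the prompt.
# Region codes and minimum ages are hypothetical placeholders, not TikTok's
# actual configuration.
from datetime import date
from typing import Optional

MINIMUM_AGE_BY_REGION = {"CA": 13, "CA-QC": 14}  # hypothetical values for demonstration


def age_on(birth_date: date, today: date) -> int:
    """Age in whole years as of `today`."""
    years = today.year - birth_date.year
    if (today.month, today.day) < (birth_date.month, birth_date.day):
        years -= 1
    return years


def neutral_age_gate(birth_date: date, region: str, today: Optional[date] = None) -> bool:
    """Return True if an account may be created; the threshold is applied
    server-side and never disclosed to the person filling in the form."""
    today = today or date.today()
    minimum = MINIMUM_AGE_BY_REGION.get(region, 13)
    return age_on(birth_date, today) >= minimum


# Example: a 12-year-old in Canada is declined without the form ever stating
# what age would have been accepted.
print(neutral_age_gate(date(2012, 6, 1), "CA", today=date(2023, 10, 18)))  # False
```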

5:35 p.m.

Conservative

Damien Kurek Conservative Battle River—Crowfoot, AB

Thank you for that.

I do apologize if this is a little rapid fire, but we have limited time here.

Governments south of the border in the United States, and here as well, along with a number of provincial governments as I understand it, have limited TikTok use on government devices, citing security and privacy concerns. Some of that comes back to ownership and where the data is stored. I have a few questions here that I'm hoping you can provide some answers to.

ByteDance's HQ, I understand, is located in Beijing, but the company is registered in the Cayman Islands. Can you provide, in as short a time as possible, an explanation as to why that would be the case?

5:35 p.m.

Director, Public Policy and Government Affairs, Canada, TikTok

Steve de Eyre

Sure. ByteDance is actually a global company. It was founded in China but has offices around the world. As I mentioned in my opening statement, of the board, which is ultimately responsible for the company, three of the five members are Americans.

5:35 p.m.

Conservative

Damien Kurek Conservative Battle River—Crowfoot, AB

Thank you for that.

One of the challenges with data is where data is stored. This is not unique to TikTok, but certainly it's been highlighted with TikTok.

You're a global company. Can you highlight—and I'm not asking for an address—the servers? Where are they located? Where does this data flow through? Specifically, as Canadians, for a Canadian on TikTok, where does that data end up? Where is it accessed, both when they're in Canada and if they happen to travel to other jurisdictions?

Go ahead, Mr. Lieber.

5:35 p.m.

Head, Privacy Public Policy for the Americas, TikTok

David Lieber

Canadian data is stored in the United States, in Singapore and in Malaysia. That's where the servers are located.

5:35 p.m.

Conservative

Damien Kurek Conservative Battle River—Crowfoot, AB

Who has access to those servers? Ensuring they are compliant with Canadian privacy laws is obviously a big part of it. I'm wondering if you can highlight how a Canadian can be certain that Canadian privacy laws and their information online are protected when servers are located around the world, in jurisdictions that have very different privacy laws.

5:35 p.m.

Head, Privacy Public Policy for the Americas, TikTok

David Lieber

Thank you for the question.

As I think my colleague Steve noted in his opening statement, we do have a Canadian operating entity. We have Canadian users so we are subject to Canadian law, but I want to emphasize, too, that we have data access approval policies. If any employee wants to access user data, they need to make a request.

We operate by the principle of least privilege, which means that employees only have access to the minimum amount of data necessary to perform their job functions. In some cases, that may mean they don't have access to personally identifiable information at all. We also have data classification policies with increasing levels of sensitivity of data. User data is the most sensitive. If an employee makes a request for user data, that will require increasing and higher levels of approval and more rigorous review.

We think a combination of these protocols addresses some of the risks and concerns you're alluding to.
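As a purely illustrative aid to the approval workflow Mr. Lieber describes, here is a minimal sketch of how a least-privilege check against data classification tiers might be modelled. The tier names, roles, approval counts and function are hypothetical, invented for this sketch, and are not TikTok's actual policies or systems.

```python
# Minimal illustrative sketch of a least-privilege access check against data
# classification tiers. Tier names, roles and approval thresholds are
# hypothetical and are not TikTok's actual policies or systems.
from dataclasses import dataclass

# Classification tiers ordered from least to most sensitive; user data sits at the top.
CLASSIFICATION_TIERS = ["public", "internal", "confidential", "user_data"]

# Hypothetical number of sign-offs a request at each tier requires.
REQUIRED_APPROVALS = {"public": 0, "internal": 1, "confidential": 2, "user_data": 3}

# Hypothetical role policy: the most sensitive tier each role may ever request.
ROLE_CEILING = {"analytics": "internal", "trust_and_safety": "user_data"}


@dataclass
class AccessRequest:
    employee_role: str       # e.g. "analytics"
    data_tier: str           # one of CLASSIFICATION_TIERS
    justification: str       # business reason recorded with the request
    approvals_obtained: int  # sign-offs already collected


def is_access_granted(req: AccessRequest) -> bool:
    """Grant access only if the role's ceiling covers the requested tier
    (least privilege) and enough approvals have been collected for that tier."""
    ceiling = ROLE_CEILING.get(req.employee_role, "public")
    within_ceiling = (
        CLASSIFICATION_TIERS.index(req.data_tier) <= CLASSIFICATION_TIERS.index(ceiling)
    )
    enough_approvals = req.approvals_obtained >= REQUIRED_APPROVALS[req.data_tier]
    return within_ceiling and enough_approvals


# Example: an analytics employee requesting user data is refused regardless of
# approvals, because that role's ceiling stops at "internal".
print(is_access_granted(AccessRequest("analytics", "user_data", "debugging", 3)))  # False
```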

5:35 p.m.

Conservative

Damien Kurek Conservative Battle River—Crowfoot, AB

Okay.

I have a couple of quick questions. I'm hoping the answers can be yes or no.

Does TikTok use biometric data at all, whether that's for logging in purposes and whatnot on a device or for tracking an individual's facial structures and that sort of thing?

5:35 p.m.

Head, Privacy Public Policy for the Americas, TikTok

David Lieber

We do not use biometric data to identify individuals. In terms of the data that you may be alluding to, we have filters and effects, so we identify where eyes may be located on a face in order for funny glasses to be put on, for example, or for voice effects. That data is not personally identifiable.

5:35 p.m.

Conservative

The Chair Conservative John Brassard

Thank you, Mr. Kurek.

Ms. Khalid, you have six minutes. Go ahead, please.

5:35 p.m.

Liberal

Iqra Khalid Liberal Mississauga—Erin Mills, ON

Thank you very much, Mr. Chair.

Thank you to our witnesses for appearing today.

First and foremost, Mr. de Eyre, in your opening remarks you talked about access to TikTok. I will say for the record that TikTok allowed 1.4 million children under the age of 13 to use the app in 2020, despite your own rules that you identified.

I want to talk about a specific thing that you mentioned in your statement. You talked about providing “an age-appropriate experience” to people. What does that mean? How do you identify what an age-appropriate experience is if you're not collecting data on people and what their interests are?

As you're going through that, I'd love to know more about your algorithms and how the content you're displaying to people is for each individual to have an age-appropriate experience or to have an entertainment experience, as you identified in your opening remarks.

5:40 p.m.

Director, Public Policy and Government Affairs, Canada, TikTok

Steve de Eyre

Great. I'd be happy to start on those few questions. I appreciate it and the opportunity to be here.

First, I'm not familiar with the figure you cited. What I can tell you is that we issued our most recent community guidelines enforcement report. It's like a transparency report that we issue quarterly. We issued the report for the second quarter of 2023. It just came out last week. In that report, we identified that we removed over 18 million accounts globally of users who were suspected of being under 13.

I think my colleague David identified to Mr. Kurek some of the tools and some of the challenges but also some of the ways in which we work to identify users who may not be old enough to use the platform. When we do find those accounts, they are removed.

Your second question, about an age-appropriate experience, is an excellent question. We work with non-profits around the world. In Canada those are groups like MediaSmarts, Digital Moment and Kids Help Phone, who are doing leading research on the experience of youth online and how we should approach that. We take that feedback and build policies. In the past few years, we have introduced such things as age-appropriate content labelling. There are types of videos that will be labelled and that will not be recommended to a user who is under 18. Take cosmetic surgery, for example. If somebody posts a video about cosmetic surgery, that's ineligible to be recommended to users under the age of 18.

Does that answer your question?

5:40 p.m.

Liberal

Iqra Khalid Liberal Mississauga—Erin Mills, ON

I obviously would like for you to follow up, perhaps, with a little bit more data on how those algorithms are actually operating in terms of providing an appropriate experience to people. Specifically, how do algorithms work? When you're saying that you're not collecting biometric data, what kind of data are you collecting to be able to determine what is an appropriate reel or TikTok video for a certain person?

5:40 p.m.

Director, Public Policy and Government Affairs, Canada, TikTok

Steve de Eyre

I'm happy to respond to that as well.

Essentially, the way the TikTok algorithm works is that it looks at signals on how you interact with videos. There are positive signals: Do you like it, comment on it or share it? Do you watch the whole video? Do you watch it again? There are also negative signals: Do you swipe away from it within a couple of seconds? Based on that, we can identify what types of videos you like, look at other users who have interacted similarly with those videos, and then recommend additional content to you. That really allows Canadians to find content and to be recommended content that they think they're going to love. That's what I hear all the time when I talk to people about TikTok. They've learned new things and found these niches that are really special to them.
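As a reading aid for the signal-based ranking Mr. de Eyre describes, here is a minimal sketch that turns the positive and negative signals he lists into a single interest score. The field names and weights are hypothetical, invented for illustration; TikTok's actual ranking model is not public in this form and would in practice be learned from data rather than hand-weighted.

```python
# Minimal illustrative sketch: scoring the engagement signals described above
# (likes, comments, shares, full watches and rewatches as positive; a quick
# swipe-away as negative). Field names and weights are hypothetical, not
# TikTok's actual ranking model.
from dataclasses import dataclass


@dataclass
class VideoInteraction:
    liked: bool
    commented: bool
    shared: bool
    watched_to_end: bool
    rewatched: bool
    swiped_away_quickly: bool


# Hypothetical hand-picked weights; a production recommender would learn these.
WEIGHTS = {
    "liked": 1.0,
    "commented": 1.5,
    "shared": 2.0,
    "watched_to_end": 1.0,
    "rewatched": 1.5,
    "swiped_away_quickly": -2.0,
}


def interest_score(interaction: VideoInteraction) -> float:
    """Combine the signals into one score; similar scores from other users on the
    same video could then drive 'people like you also watched' recommendations."""
    return sum(weight for name, weight in WEIGHTS.items() if getattr(interaction, name))


# Examples: a share plus a full watch scores positively; an immediate swipe-away
# scores negatively.
print(interest_score(VideoInteraction(False, False, True, True, False, False)))   # 3.0
print(interest_score(VideoInteraction(False, False, False, False, False, True)))  # -2.0
```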

Perhaps I can give you an example. There's actually a creator in your riding in Mississauga. Her name is Danielle Johnson. She has a company called Realm Candles. I got to meet her earlier this year at one of our events. She specializes in vegan, eco-friendly candles that she's had great success on—

5:40 p.m.

Liberal

Iqra Khalid Liberal Mississauga—Erin Mills, ON

I'm so sorry. I am quite limited on time.

That's really not what I asked. My question is around the framework of how algorithms work. How does the data you're collecting impact who sees what in their TikTok?

As a spin-off, you mentioned that the data for Canadians is stored in the U.S. It is stored, I believe you said, in Malaysia and in Singapore. I'm just wondering how the legal aspect of it works as well for Canadians.

How do they protect their data, when we don't have any offices for your company here in Canada that are looking over Canadian data?