Evidence of meeting #95 for Access to Information, Privacy and Ethics in the 44th Parliament, 1st Session. (The original version is on Parliament’s site, as are the minutes.)

A recording is available from Parliament.

Witnesses

Brett Caraway, Associate Professor of Media Economics, University of Toronto, As an Individual
Emily Laidlaw, Associate Professor and Canada Research Chair in Cybersecurity Law, University of Calgary, As an Individual
Matt Malone, Assistant Professor, Thompson Rivers University, As an Individual
Sam Andrey, Managing Director, The Dais
Joe Masoodi, Senior Policy Analyst, The Dais

5 p.m.

Conservative

The Chair Conservative John Brassard

Thank you.

Thank you, Mr. Gourde.

Next, we have Ms. Khalid, for six minutes.

5 p.m.

Liberal

Iqra Khalid Liberal Mississauga—Erin Mills, ON

Thank you very much, Chair.

Thank you to our witnesses for coming before us today and shedding light on this very important issue.

Mr. Malone, I'll start with you. We had representatives of TikTok come to our committee, and we learned from them that the majority of Canadian data is actually not stored in Canada. It is stored elsewhere across the world, including Malaysia, Singapore, etc. What are the legal implications of that with respect to Canadians' privacy rights?

5:05 p.m.

Assistant Professor, Thompson Rivers University, As an Individual

Matt Malone

Thanks for the question.

I think it's really important to identify the TikTok representatives who spoke as lobbyists. They're registered lobbyists, and they do lobbyist work. I think it's important to talk about how a lot of the claims they made were very disingenuous. There are easy bypasses around a lot of the safety controls for children that they vaunted.

TikTok has been caught—to respond more directly to your question—engaging in all kinds of worrying conduct with respect to user data. There is public reporting that talks about TikTok accessing physical locations of journalists who are using the app, in order to track down their sources. That's in the public domain. There is public reporting about TikTok directing user data from the United States through China despite assurances otherwise, and there's a raft of other reporting.

There's internal government reporting from Canadian government actors like the Privy Council Office's Intelligence Assessment Secretariat that identifies all kinds of other problems around the type of data and the persistent collection of data that occurs through the app. There are also materials that I've seen from the cyber-threat intelligence unit at Canadian Forces Intelligence Command at the Department of National Defence that identify a series of concerning problems around censorship and so forth.

One of the really difficult issues here is that Canadian law is very permissive when it comes to data transfers. Even if you look at the proposed privacy legislation, Bill C-27, there's essentially nothing that would stop data transfers outside of Canada. Certainly, TikTok's privacy notice states that by using TikTok you accept the terms and conditions, which provide that the TikTok subsidiary can share that data with its corporate parent, ByteDance, and Canadian law lets that happen. Even the proposed Canadian law would let that happen. Proposed section 19 and proposed subsection 11(1) of Bill C-27 specifically permit this type of data transfer.

Canadian data transfer law is essentially premised on the idea that organizations can send data to other organizations if they deem the protections to be sufficient or adequate, comparable to what they would be in Canada. That approach is really different from the European approach, which is jurisdictionally grounded, country to country: you can't transfer data outside of a country unless you're satisfied that the protections would be essentially equivalent. There's a really big difference between Canadian data transfer law and European data transfer law. Once data gets out of Canada, there's really no telling what happens to it. Companies like TikTok don't take the basic safeguards that you do.

For this meeting, I asked the chief information officer of the House of Commons where the data for Zoom, which I would be using, was being localized and processed, and I was told—and I was very happy and impressed by this—that the data would be processed in Canada. Your in camera meetings are even more secure, so good on you. The same cannot be said for the users of TikTok.

5:05 p.m.

Liberal

Iqra Khalid Liberal Mississauga—Erin Mills, ON

Thank you. I appreciate that.

In previous interviews, you've talked about power imbalances with users and the collection of vast amounts of data. What did you mean by that? Can you expand on that a bit for us?

5:05 p.m.

Assistant Professor, Thompson Rivers University, As an Individual

Matt Malone

Like many folks who have appeared before this committee and committees dealing with related topics, I have a lot of concerns about how these power imbalances affect users' ability to offer consent that is meaningful and informed. When you click “accept” on a very lengthy privacy notice, your ability to offer or provide consent is really challenged when the power imbalances are such that you are an individual user and the company collecting the data might have a market valuation that exceeds the size of a G7 country's economy.

5:05 p.m.

Liberal

Iqra Khalid Liberal Mississauga—Erin Mills, ON

Thank you very much. I really appreciate that.

Mr. Andrey and Mr. Masoodi, you've authored or contributed to a report surveying online harms in Canada as they relate to social media platforms and the public square. Can you tell us a bit more about the key findings of that report and speak specifically about the way vulnerable and marginalized communities and groups are targeted online?

5:05 p.m.

Managing Director, The Dais

Sam Andrey

Sure. I'm happy to.

Joe, feel free to jump in here.

We do an annual survey of a representative group of Canadians to track, basically, Canadians' experiences online with harmful or illegal content. At a high level, we start with hate speech: 40% of Canadians say they see hate speech at least monthly, and about 10% of Canadians say they have personally been targeted by online hate speech. Those rates are about double or triple for a variety of marginalized and racialized communities. For 2SLGBTQ Canadians, it would be about double that rate, and three times that rate, or 30%, say they have personally been targeted with hate speech. We track that over time.

We also track exposure to, and belief in, misinformation and disinformation. We have Canadians do a quiz, basically, of a series of true and false statements. We assess about 15% of Canadians as having a high degree of belief in misinformation. Those Canadians are more likely to say they consume their news on social media and are less trusting of mainstream media sources.

5:10 p.m.

Conservative

The Chair Conservative John Brassard

Mr. Andrey, I'm going to have to stop you there, sir. I apologize for that.

We are going to suspend for voting. When we get back, it will be Mr. Villemure's turn for six minutes.

It should take about 15 minutes or so before we're back, so I appreciate your patience.

Thank you.

The meeting is suspended.

5:30 p.m.

Conservative

The Chair Conservative John Brassard

Welcome back to the meeting.

We will now begin the round.

Mr. Villemure, you have six minutes. Please go ahead.

5:30 p.m.

Bloc

René Villemure Bloc Trois-Rivières, QC

Thank you to the witnesses for being here.

Mr. Malone, it's a pleasure to see you back here.

Is informed consent impossible, in your view? Is it pointless? Is it a mirage in today's world?

December 4th, 2023 / 5:30 p.m.

Assistant Professor, Thompson Rivers University, As an Individual

Matt Malone

I think the word “mirage” accurately captures the current state of affairs.

I think informed consent, which is what all Canadian privacy laws are currently based on, doesn't serve the ends that we really need data protection and privacy law in this country to serve. Bill C-27 perpetuates the idea that this instrument will still work and still serve its ends, even with the legitimate business exceptions and even with the rules around implied consent, but that really won't take us to a place where we have robust privacy and data protection law in this country.

I think you need to fundamentally shift the paradigm so that possessing, retaining, using or disclosing personal information becomes a liability, as opposed to a profitable way to run a business, which is what we have let these ad exchanges/social media companies do.

5:30 p.m.

Bloc

René Villemure Bloc Trois-Rivières, QC

That's a very good way to look at it. As I see it, free and informed consent, as they say in medicine, is never free if you want to access whatever it is. Informed consent is a fiction, or even a mirage.

You also said that Canada is a middle power in this area. That's particularly true vis-à-vis the European Union, the U.S. and China.

What hope does Canada have of playing a role and carving out a credible place for itself through its legislation?

5:30 p.m.

Assistant Professor, Thompson Rivers University, As an Individual

Matt Malone

I think Canada has an opportunity to reclaim a bit of the traditional role that we like to see Canada have, which is serving as a middle power with allied states.

Several ideas have been floated around creating safe dataflow zones that map onto the security alliances that already exist, like NATO for example. We already have a commitment to mutual defence with our NATO allies. It would seem logical that we might feel comfortable sharing our data, our personal information, with these allies in a free cross-border dataflow zone. There are opportunities for Canada to certainly create a niche role when it comes to regulation and the creation of regulatory frameworks for cross-border dataflows.

I think the more appalling concern that I have is with the state of the current law. The fact is that a lot of Canadian law, and certainly the priority of legislators right now, is privacy law that applies only to the private sector. I think one of the real problems we've seen—and we saw this through the pandemic as well—is that we need robust privacy and data protection laws that also apply to government. I've been really upset by the fact that the artificial intelligence and data act does not apply to government actions, which is really concerning when you think about the deployment of AI-fueled and AI-driven technologies such as the ArriveCAN app.

I've also been really concerned about the fact that the priorities with Bill C-27 have not focused on government. To me, it's disturbing that this effort has been led by the industry portfolio and Bill C-27 would create new regulatory instruments that would be answerable to the Minister of Industry. It's really hard to say that we're approaching privacy from a human rights or law enforcement or national security perspective when the bodies we're creating are not truly independent. Not only are they not truly independent, but they're subservient to an industry portfolio whose mandate is to grow the economy.

5:30 p.m.

Bloc

René Villemure Bloc Trois-Rivières, QC

I share your concerns, believe me.

Mr. Andrey, I have the same question for you.

Canada is a middle power, between the European Union and the U.S. or China.

What could Canada propose that would be seen as acceptable?

5:35 p.m.

Managing Director, The Dais

Sam Andrey

I honestly echo a lot of what was just said. I think there's an ability to build on...and maybe I'll speak specifically with respect to online harm legislation. Germany was the first mover and basically created a 24-hour takedown regime. The outcome was an over-censorship response from many of the large platforms. They didn't want to deal with the liability, so they removed too much lawful expression.

We have an opportunity to learn from mistakes like that.

5:35 p.m.

Bloc

René Villemure Bloc Trois-Rivières, QC

Since time is running out, I'm going to interrupt to ask you to please send the committee some information on what happened in Germany. That would be very appreciated.

I have one last question before I'm out of time.

Do you see a role for an independent regulator, along the lines of the Conflict of Interest and Ethics Commissioner, the Privacy Commissioner or the Commissioner of Lobbying? Conversely, do you think it should fall under the scope of a department like Innovation, Science and Economic Development Canada, as is currently proposed? Where do you see that regulator? What powers should it have?

5:35 p.m.

Managing Director, The Dais

Sam Andrey

I see a role for a digital regulator.

Currently, there's the idea in Bill C-27 of having an AI and data regulator, but that role would be filled by an ISED department official. This, I think, is unacceptable, especially given that the minister will have the competing roles of championing the economic benefits of AI and regulating its risks. At a minimum, the regulator should be appointed by the Governor in Council. Ideally, it would be a separate parliamentary appointment.

I think you could task the same regulator with the online harms portfolio. It could be two, but that's a lot of digital regulators. That regulator would have the power to do audits and to carry out ombudsman-type functions to support individuals. They would also have a transparency function.

5:35 p.m.

Bloc

René Villemure Bloc Trois-Rivières, QC

It would be an independent regulator, then.

5:35 p.m.

Conservative

The Chair Conservative John Brassard

Very good.

Thank you, Mr. Villemure.

Mr. Andrey, thank you.

Mr. Green, you have six minutes. Go ahead, sir.

5:35 p.m.

NDP

Matthew Green NDP Hamilton Centre, ON

Thank you very much.

I know there was a shout-out to start the round of testimonies. In the spirit of shout-outs, I want to give one to Christelle Tessono. I understand she is now in policy and research at The Dais. I know her work has been reflected in previous committees, as well as in some of the deep dives I have taken into this field. The technology is often far ahead of the scope of our subject matter expertise, so having subject matter experts like yourselves is incredibly important. I appreciate your being here today. I appreciate any contributions that she may have made, as well.

I want to begin with Mr. Malone.

In a September 2023 article, you mentioned you reviewed a federal government document entitled “Economic Security and Technology: TikTok Takeover”. Are you able to highlight the concerns raised in that report, and do you share those concerns?

5:35 p.m.

Assistant Professor, Thompson Rivers University, As an Individual

Matt Malone

I'm not sure what document you're referring to.

Are you referring to the document that informed the piece for my recommendation to ban all social media applications on government-issued devices?

5:35 p.m.

NDP

Matthew Green NDP Hamilton Centre, ON

That's correct.

If you've been following the study, you will have noted I have been very adamant about expanding the scope of regulation, oversight and scrutiny to all platforms, not just TikTok.

If you care to comment on that, it would be helpful.

5:35 p.m.

Assistant Professor, Thompson Rivers University, As an Individual

Matt Malone

The document comes from National Defence and their cyber-threat intelligence unit. It identifies a series of concerns with respect to TikTok that include surveillance and intelligence operations, privacy violations, data harvesting, political interference, narrative control and Communist Party of China censorship exports. In that brief, there are also a series of concerns expressed with respect to many other social media companies, such as Snapchat and LinkedIn.

I would be very happy to share this brief with the committee, if you wish to have it.

5:35 p.m.

NDP

Matthew Green NDP Hamilton Centre, ON

Yes, that would be very helpful.

In your opinion, does the risk of having social media apps on federal government phones differ from that of employees having the same apps on their personal phones?

5:35 p.m.

Assistant Professor, Thompson Rivers University, As an Individual

Matt Malone

With respect to the type of information being shared on government-issued devices, it would seem unquestionable that there's probably greater sensitivity, especially when this information.... Even if it's harmless on an individual level, it could potentially be useful in the aggregate. You have to think about things like location data, which might reveal things like the location of politicians or members of the Canadian Armed Forces. There was a story a few years ago, from public reporting, about how a leak of location data from a Fitbit-style company led to an ability to map, essentially, an American military base in Helmand province. This data is, obviously, very sensitive in the aggregate.

If you permit, I would go beyond this and say there should be a ban on social media applications on government-issued devices, unless there's a strong business justification.

However, there's also a very strong indication of what the priorities of government are. Earlier, I talked about a lack of funds for the RCMP's cybercrime investigative team. However, if you look at the arsenal of folks who work for the government in social media or communications, it's exponentially larger than the resources and personnel we're devoting to fighting online harms as they are actually experienced by some of the most vulnerable Canadians.