Evidence of meeting #94 for Access to Information, Privacy and Ethics in the 44th Parliament, 1st Session. (The original version is on Parliament’s site, as are the minutes.)

A recording is available from Parliament.

Also speaking

Anatoliy Gruzd  Professor and Canada Research Chair in Privacy-Preserving Digital Technologies, Toronto Metropolitan University, As an Individual
Catherine Luelo  Deputy Minister and Chief Information Officer of Canada, Treasury Board Secretariat
Bryan Larkin  Deputy Commissioner, Specialized Policing Services, Royal Canadian Mounted Police
Brigitte Gauvin  Acting Assistant Commissioner, Federal Policing, National Security, Royal Canadian Mounted Police
Clerk of the Committee  Ms. Nancy Vohl
Alexandra Savoie  Committee Researcher

3:35 p.m.

Conservative

The Chair Conservative John Brassard

Good afternoon, everyone.

I'm going to call the meeting to order.

Welcome to meeting No. 94 of the House of Commons Standing Committee on Access to Information, Privacy and Ethics.

Pursuant to Standing Order 108(3)(h) and the motion adopted by the committee on Tuesday, January 31, 2023, the committee is resuming its study of the use of social media platforms for data harvesting and unethical or illicit sharing of personal information with foreign entities.

Today's meeting is taking place in hybrid format, pursuant to the Standing Orders. Members are attending in person in the room and remotely using the Zoom application.

I just want to remind all members today that care must be taken with regard to the earpieces for interpretation. Please be mindful to not place your earpiece near the microphone, as this can result in feedback for the interpreters and may cause acoustic shock, which could in turn cause injury to our interpreters.

We have a witness in the first hour on Zoom. I will remind the committee that the witness has been tested and has the appropriate headset.

I'd now like to welcome our first witness today. We have, as an individual, Dr. Anatoliy Gruzd, professor and Canada research chair in privacy-preserving digital technologies from Toronto Metropolitan University.

Dr. Gruzd, you have up to five minutes for your opening statement.

Welcome, sir. Go ahead, please.

3:35 p.m.

Dr. Anatoliy Gruzd Professor and Canada Research Chair in Privacy-Preserving Digital Technologies, Toronto Metropolitan University, As an Individual

Thank you, Mr. Chair and committee members, for this opportunity to discuss the potential threat of foreign interference and the risks associated with the misuse of social media data.

I'm Anatoliy Gruzd, a Canada research chair and professor at Toronto Metropolitan University. I'm also a co-director of the social media lab, where I study social media's impact on society, information privacy and the spread of misinformation around conflicts such as the Russia-Ukraine war.

While my comments today are my own, they are grounded in research conducted at the social media lab and are informed by 15 years of working with various types of social media data.

As previous witnesses have testified, there are concerns that TikTok could be vulnerable to foreign interference, leading to major implications for our national security and individual privacy. However, I would like to point out that a loaded gun is different from a smoking gun. Despite its being framed as a national security threat, to date, there's still no public evidence that the Chinese government has spied on Canadians using a back door, or privileged access, to the TikTok app.

That is not to say there is nothing to worry about. There are valid concerns regarding the potential for TikTok and other platforms to be exploited by malicious actors for propaganda and radicalization. For example, Osama bin Laden's 2002 “Letter to America” recently resurfaced on TikTok and was seen by millions. However, these concerns are not limited to any one platform. Rather, they represent broader challenges to the integrity and security of our information environment.

As such, we must take a comprehensive approach to addressing these issues by compelling platforms to commit to the following: adopting the principles of privacy by design and by default, investing in expanding their trust and safety teams, and sharing data with researchers and journalists.

I'll expand on each of these points.

Teaching digital literacy is important, but it's unfair to place all the responsibility on individuals. Social media platforms are complex, and the algorithms that decide what users see and don't see remain black boxes. The only true choice we have is to disconnect from social media, but that's not realistic or practical; as our own research has shown, most Canadians have at least one social media account.

It's important to shift the focus from individual responsibility to developing strategies that compel companies to implement privacy by design and by default. Currently, it's all too common for platforms to collect more data by default than necessary.

However, even with privacy protection settings enabled, Canadians may still be vulnerable to malicious and state actors. According to a national survey that our lab released last year, half of Canadians reported encountering pro-Kremlin narratives on social media. This highlights concerns about the reach of foreign propaganda and disinformation in Canada, extending beyond a single platform.

In another example, earlier this year, Meta reported a sophisticated influence operation from China that spanned multiple platforms, including Facebook, Twitter, Telegram and YouTube. The operation tried to impersonate EU and U.S. companies, public figures and institutions, posting content that would match their identity before shifting to negative comments about Uyghur activists and critics of China.

To fight disinformation, platforms should expand their trust and safety teams, partner with fact-checking organizations and provide access to credible news content. Unfortunately, some platforms, like Meta and X, are doing the exact opposite.

To evaluate how well platforms are combatting disinformation, Canada should create an EU-style code of practice on disinformation and a transparency repository that would require large platforms to report regularly on their trust and safety activities in Canada.

To further increase transparency and oversight, Canada should mandate data access for researchers and journalists, which is essential to independently detect harmful trends. In the EU, this is achieved through the new Digital Services Act.

Currently, TikTok doesn’t provide data access to Canadian researchers, but it does so for those who reside in the U.S. and EU. Sadly, TikTok is not alone in this regard. Recently, X shut down its free data access for researchers.

In summary, while it's important to acknowledge the impact of foreign interference on social media, banning a single app may not be effective. It could also undermine trust in government, legitimize censorship and create an environment for misinformation to thrive.

A more nuanced approach should consider the various forms of interference and develop strategies to address them directly, whether on TikTok or other platforms. This may involve a wider adoption of privacy by design and by default, expanding trust and safety teams in Canada and compelling platforms to share data with researchers and journalists for greater transparency and independent audit.

Thank you.

3:40 p.m.

Conservative

The Chair Conservative John Brassard

Thank you, Dr. Gruzd.

We will start our six-minute round of questioning with Mr. Kurek.

Go ahead, sir. You have six minutes.

3:40 p.m.

Conservative

Damien Kurek Conservative Battle River—Crowfoot, AB

Thank you very much, Chair.

Dr. Gruzd, thanks for being here with us today and sharing your insights with the committee. I would just mention that we have a short amount of time, the way these committees are structured, so please feel free, specifically when it comes to recommendations, to follow up with this committee if there are specific action items that you would recommend in your expertise.

You talked about TikTok and the loaded gun versus the smoking gun. I'm curious to know whether your research has included anything surrounding WeChat. I know there have been reports of a very close association between the ownership structure of WeChat and the communist state in Beijing. Has your research looked into that?

3:40 p.m.

Professor and Canada Research Chair in Privacy-Preserving Digital Technologies, Toronto Metropolitan University, As an Individual

Dr. Anatoliy Gruzd

Unfortunately, WeChat, like many other messaging apps, is invisible to most researchers. There are good reasons for that: these are usually private conversations, and social media researchers look at public discourse on public social media platforms. There are ways platforms could provide more evidence and data to researchers for the public groups within those apps. Unfortunately, we don't have that ability.

That goes to one of my recommendations: Canada should mandate that platforms provide data access to independent researchers.

3:40 p.m.

Conservative

Damien Kurek Conservative Battle River—Crowfoot, AB

Thank you for that.

With the conflict in Israel and Palestine and the targeting of Israelis and Gazans by the terrorist group Hamas, we're seeing misinformation and disinformation play out before us in real time. I'm wondering if you've had a chance to follow this and if you could provide some comments to the committee about the impact that would have. We've seen how information shared online has contributed to protests that have taken place on the streets of our country.

Could you add anything to that conversation in relation to social media and the larger experience that Canadians find themselves in?

3:40 p.m.

Professor and Canada Research Chair in Privacy-Preserving Digital Technologies, Toronto Metropolitan University, As an Individual

Dr. Anatoliy Gruzd

Yes. Unfortunately, social media tools, as many previous witnesses have reported, have been weaponized by various state actors and other interest groups trying to shape public opinion; these tools are all too accessible for that purpose. In some cases we hear reports about large, automated bot networks. Sometimes it's questionable, though, how effective they might be, simply because it's very hard to gain credibility on social media platforms. In some cases, like that of the Internet Research Agency, where Twitter actually provided data to researchers to dissect, investigate and do a post-mortem of the dataset, we noticed how those bot accounts would build their credibility by posting innocuous content on sites like X before later switching to different narratives.

This is to say that state actors are using social media platforms across the board to shape our narratives and how we view them, but they also tap into our divisions and polarization. That can be done covertly or overtly. Last year, for example, the Twitter account of the Russian embassy in Ottawa was tweeting anti-LGBTQ messages. That was not hidden. It was explicit. They were speaking to the group of individuals in this country who might already have subscribed to some of those views.

That's a bit of a long answer, but I think we do see impact. Whether it's direct or indirect, a state actor is trying to impact narratives and influence opinions. Also—

3:45 p.m.

Conservative

Damien Kurek Conservative Battle River—Crowfoot, AB

Thanks. I hate to cut you off, but we have limited time here.

It's interesting that you would bring that up. I know that we and a number of other committees addressed foreign election interference. The use of social media was a key part of that. Certainly, if you have further comments, I would invite you to send them to the committee.

I want to go to a bit of a grey area. We had TikTok before this committee, and they said, oh, privacy is great; all they require is basic information, and their settings are set up for kids. I'm paraphrasing, obviously, but very few people read the entirety of terms and conditions. Very few people understand what information is explicitly being provided. Even fewer, I would suggest, understand how impactful the information they provide is, whether it be pictures of the front of their homes or themselves on holiday.

I'm wondering if you could provide guidance to this committee, in the minute you have left, on how to balance freedom of expression, the advancement that's taken place in the social sphere, and ensuring that Canadians' privacy and safety is safeguarded.

3:45 p.m.

Professor and Canada Research Chair in Privacy-Preserving Digital Technologies, Toronto Metropolitan University, As an Individual

Dr. Anatoliy Gruzd

It goes to my point about privacy not just by design, but by default. When I installed the TikTok app on my phone just the other day, I did not even create an account and it already started tracking and sent 102 requests for information like my battery life, my device ID and such. I don't even have an account, so why do they need that information?

One way to address it is to go through the platforms and marketplaces that host these types of applications, because they are the ones that approve them.

Going back to your point about long terms of service, it is a problem. One initiative that I really like is called Terms of Service; Didn't Read. It's a community-driven initiative that has been around for 10 years. They rate the terms of service of each provider, including social media platforms. They gave an E to all the major social media platforms, not just TikTok. That is the lowest grade: A is the highest and E is the lowest—

3:45 p.m.

Conservative

The Chair Conservative John Brassard

Thank you, Dr. Gruzd and Mr. Kurek.

Ms. Khalid, you have six minutes. Go ahead.

3:45 p.m.

Liberal

Iqra Khalid Liberal Mississauga—Erin Mills, ON

Thank you very much, Chair.

Thank you, Mr. Gruzd, for coming in today. We really appreciate your time.

I'll start by continuing where Mr. Kurek was leading.

In the context of the Israel-Palestine war, we've seen Canadians, especially young people, being targeted for posting their views online, to the point where their employment and education are impacted. There is a kind of grouping culture online, regardless of which side of the issue they're on, and online targeting of individuals for expressing their views.

Do you think that social media companies have a responsibility to provide protection and maintenance of freedom of expression, especially for young people online?

3:45 p.m.

Professor and Canada Research Chair in Privacy-Preserving Digital Technologies, Toronto Metropolitan University, As an Individual

Dr. Anatoliy Gruzd

The reason I pause is that it goes hand in hand with the type of influencer content individuals are consuming on these platforms, which can trigger or lead them to certain expressions.

One concern we've observed over the years when conducting surveys with Canadians is that more of us are turning to social media for information about conflicts like the war in Ukraine or the war in Palestine.

What if there are no credible news organizations that provide that content? The reactions that you see quite often on social media platforms are driven by the influencer content that provides the news.

When we asked TikTok users in Canada, half of them said they use the platform for news about the war between Russia and Ukraine. This is concerning, because when you go to the platform and search for trusted news sources, the most popular ones will be CTV, Global News and CBC, according to the digital trust rating. Their follower counts are around 150,000 or 160,000. They cannot compete with influencer content.

Freedom of speech is important, but it's just as important to make sure that when our citizens—Canadians—are participating in those platforms, they have access to credible information when they react to it online.

3:50 p.m.

Liberal

Iqra Khalid Liberal Mississauga—Erin Mills, ON

Thank you for that.

You mentioned also that it's unfair to place responsibility on individuals to do their due diligence in the context of misinformation, disinformation and their own personal information that they're providing to these social media platforms.

What do you recommend? Are we talking about government regulation? Are we talking about regulation of social media platforms?

If not, is it placing or removing some of that individual responsibility from people who have to oftentimes read pages and pages of privacy agreements that they may or may not understand?

3:50 p.m.

Professor and Canada Research Chair in Privacy-Preserving Digital Technologies, Toronto Metropolitan University, As an Individual

Dr. Anatoliy Gruzd

My point about not putting all the responsibility on the individuals comes from several directions. First of all, even if individuals know how to change privacy settings, many platforms will have access to their private messages. While they feel they're protected, they're actually not.

Education is important, but it doesn't necessarily mean training individuals. It's hard to change individual behaviour, but platforms can incorporate tools that can make them more efficient and effective in terms of protecting themselves.

Here are a couple of simple examples. In many browsers now, when you mouse over a picture, there's a button you can use to search for related images. It's a simple tool that I am happy to train people on, but it's already an embedded part of the platform.

We haven't talked about generative AI, but that's the next stage of this evolutionary process. How do we make sure individual users have tools to detect what is real and what is authentic? Those tools are not yet part of these platforms. It could be through digital certification or through other means, but they should be part of the platforms.

The other quick point about education is that it's much more effective to institutionalize the training.

I'll give you another example. When I was preparing for this meeting, there was a Zoom test, and the instructions told me to go to incognito mode in my browser. Providing instructions is part of the process; it's part of the institution. It's much more systematic and effective.

3:50 p.m.

Liberal

Iqra Khalid Liberal Mississauga—Erin Mills, ON

Thank you.

Can you perhaps walk us through how social media companies like TikTok use the information they gather? What's the role of artificial intelligence and algorithms in the use of that data as well?

3:50 p.m.

Professor and Canada Research Chair in Privacy-Preserving Digital Technologies, Toronto Metropolitan University, As an Individual

Dr. Anatoliy Gruzd

The use varies widely. They are private companies making money; most of their revenue is driven by ads, clearly, and most of the data harvesting is happening for that purpose. How do they deliver eyeballs to companies and individuals who are willing to pay for those eyeballs?

A lot of this will be about collecting your interests—what you like and what you don't like—so that when the time comes, they will show you a particular ad that is attractive, and you will be a ready buyer for that. One of the concerns I have is that this type of data is being linked across platforms and through your browser history. The linkage of data is quite concerning.

You asked about artificial intelligence. Can you repeat that part?

3:50 p.m.

Liberal

Iqra Khalid Liberal Mississauga—Erin Mills, ON

What's the role of artificial intelligence in that collection of data that social media platforms use, and with respect to algorithms as well?

I'll expand on that a bit. Also, how does that impact Canadians' charter rights and freedoms in how they're able to mobilize, organize or express themselves online?

3:50 p.m.

Conservative

The Chair Conservative John Brassard

Thank you, Ms. Khalid.

You're over your time, but I am going to give Dr. Gruzd a chance to answer that.

Answer very quickly, please, if you don't mind, Dr. Gruzd.

3:50 p.m.

Professor and Canada Research Chair in Privacy-Preserving Digital Technologies, Toronto Metropolitan University, As an Individual

Dr. Anatoliy Gruzd

There is huge use of machine learning and AI to deliver content to eyeballs. Related to your point, it's concerning when you get into echo chambers on a particular topic and that's all you see. If that content is full of misinformation, driven by a recommender system, that's even more concerning. I probably don't have time, but I can expand on that later.

3:50 p.m.

Conservative

The Chair Conservative John Brassard

Thank you, Dr. Gruzd.

Mr. Villemure, you have the floor for six minutes.

Dr. Gruzd, I want to make sure that you have your interpretation channel on.

You can go ahead, Mr. Villemure.

3:50 p.m.

Bloc

René Villemure Bloc Trois-Rivières, QC

Thank you, Mr. Chair.

Thank you, Dr. Gruzd. I'm very pleased to be able to get some insight from someone who has as impressive a resumé as yours on this subject.

I'm going to start with a very simple question. You talked to us about the digital trust rating.

What do you think is the ethical concern of social media platforms? Is it large or small?

3:55 p.m.

Professor and Canada Research Chair in Privacy-Preserving Digital Technologies, Toronto Metropolitan University, As an Individual

Dr. Anatoliy Gruzd

When you say “digital trust rating”, are you referring to that being assigned to people—users—or the platforms themselves?

3:55 p.m.

Bloc

René Villemure Bloc Trois-Rivières, QC

I'll rephrase my question instead.

Do you think social media platforms have ethical concerns? To what extent? Is it a little, maybe a lot? Is it important to them?

3:55 p.m.

Professor and Canada Research Chair in Privacy-Preserving Digital Technologies, Toronto Metropolitan University, As an Individual

Dr. Anatoliy Gruzd

That goes directly to my point about expanding their trust and safety departments, not reducing them. Essentially, that's the branch of the large social media companies that oversees content moderation, so that harmful, problematic content will not get the audience it's seeking. Unfortunately, we hear in the news that these departments have been shrinking. Trust and safety staff are being let go, and some of the initiatives that were started a while back are being discontinued.

It is a concern. It signals that maybe that area is not seen as important, because it can be cut so easily when it's deemed no longer needed.

3:55 p.m.

Bloc

René Villemure Bloc Trois-Rivières, QC

When we look at the policies of the companies, it seems to us that they're doing the minimum required, nothing more.

What will be the impact of generative artificial intelligence on social media? What can we expect?