Evidence of meeting #19 for Public Safety and National Security in the 44th Parliament, 1st Session. (The original version is on Parliament’s site, as are the minutes.)

A recording is available from Parliament.

Also speaking

Evan Balgord  Executive Director, Canadian Anti-Hate Network
Barbara Perry  Director, Centre on Hate, Bias and Extremism, Ontario Tech University
Wendy Via  Co-Founder, Global Project Against Hate and Extremism
Ilan Kogan  Data Scientist, Klackle, As an Individual
Rachel Curran  Public Policy Manager, Meta Canada, Meta Platforms
David Tessler  Public Policy Manager, Meta Platforms
Michele Austin  Director, Public Policy (US & Canada), Twitter Inc.

12:25 p.m.

Director, Public Policy (US & Canada), Twitter Inc.

Michele Austin

Twitter actually has much less algorithmic content than our competitors. The main indicator that we use with regard to our algorithm is who the user chooses to follow. I would also remind you that you can turn off the algorithm on your home timeline on Twitter. You can choose to see tweets in reverse chronological order, or you can turn the algorithm back on and ask us to surface tweets that we think you would be interested in.
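The two timeline modes Ms. Austin describes can be illustrated with a short sketch. This is purely illustrative: the `Tweet` class, the `relevance` field, and the `timeline` function are hypothetical stand-ins, not Twitter's actual implementation.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class Tweet:
    author: str
    text: str
    posted_at: datetime
    relevance: float  # hypothetical score from a ranking model

def timeline(tweets, algorithmic: bool):
    """Return the home timeline in one of the two orderings described:
    relevance-ranked when the algorithm is on, newest-first when it is off."""
    if algorithmic:
        return sorted(tweets, key=lambda t: t.relevance, reverse=True)
    return sorted(tweets, key=lambda t: t.posted_at, reverse=True)
```

Turning the algorithm "off" amounts to falling back to a sort by timestamp, which is all reverse chronological order is.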

Open AI, open machine learning—I think that is the future of this policy discussion, and we're very much looking forward to it.

12:25 p.m.

Liberal

Pam Damoff Liberal Oakville North—Burlington, ON

Thank you.

I'm going to turn to Facebook and Meta. Last year your revenue was $117 billion U.S. The year before that, it was $86 billion U.S. The company has been quite successful in increasing its revenue. I understand that's mostly through running advertisements. How do you decide what advertisements I see when I go on your platforms?

12:25 p.m.

Public Policy Manager, Meta Canada, Meta Platforms

Rachel Curran

Thank you for that question. It's actually a very good question.

On this question of algorithms, what you see in your newsfeed, including advertising, depends on a number of what we call “signals”. Those signals include what you have liked before, what kinds of accounts you follow, what you have indicated your particular interests are, and any information that you have given us about your location, who you are and your demographic information. Those all act to prioritize, or not, particular information in your newsfeed. That will determine what you see when you open it up. It's personalized for each user.
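The signal-based prioritization Ms. Curran describes can be sketched in a few lines. The signal names and weights below are invented for illustration; Meta's actual ranking system is far more complex and is not public.

```python
# Hypothetical signal weights; the real system and its weights are not public.
WEIGHTS = {
    "liked_similar": 2.0,    # user liked similar content before
    "follows_account": 1.5,  # user follows the posting account
    "matches_interest": 1.0, # post topic matches a declared interest
    "near_location": 0.5,    # post is relevant to the user's location
}

def score(post_signals: dict) -> float:
    """Combine the signals present on a post into one priority score."""
    return sum(WEIGHTS[s] for s, present in post_signals.items() if present)

def rank_feed(posts):
    """Order posts so higher-scoring ('more relevant') items appear first."""
    return sorted(posts, key=lambda p: score(p["signals"]), reverse=True)
```

The key point the testimony makes is that the same pool of posts produces a different ordering for each user, because each user's signals differ.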

12:25 p.m.

Liberal

Pam Damoff Liberal Oakville North—Burlington, ON

My understanding from other witnesses who have come forward, though, is that, for example, if I search for coronavirus or COVID-19, I very quickly end up on conspiracy sites. A lot of those conspiracy sites were also linked with the far right. Do your algorithms work quite quickly to be able to direct me to those sites?

12:25 p.m.

Public Policy Manager, Meta Canada, Meta Platforms

Rachel Curran

No, that's untrue. If you search for anything about COVID-19 or coronavirus, part of what you will be directed to is our COVID-19 hub, which contains credible information, including from the Public Health Agency of Canada, on the coronavirus and vaccines. We're really thrilled about the fact, actually, that 90% plus of Facebook users in Canada have indicated that they are supportive of vaccination and wish to find out more information about vaccines.

12:30 p.m.

Liberal

Pam Damoff Liberal Oakville North—Burlington, ON

I have only 45 seconds left.

One of the issues with the convoy that happened in Ottawa was these Facebook groups that started up—and remained up, quite honestly. How did you monitor those during the convoy?

12:30 p.m.

Public Policy Manager, Meta Canada, Meta Platforms

Rachel Curran

Yes, that's a really good question.

We had a 24-7 monitoring effort during the convoy protest, which we set up almost immediately. We were looking at groups, accounts and discussions on the platform to monitor them for any breach of our community standards. We removed material that was in violation of our community standards. Again, that was an around-the-clock effort on our part.

12:30 p.m.

Liberal

The Chair Liberal Jim Carr

Thank you very much.

I would now like to invite Ms. Michaud for her six-minute block.

The floor is yours, Ms. Michaud.

12:30 p.m.

Bloc

Kristina Michaud Bloc Avignon—La Mitis—Matane—Matapédia, QC

Thank you, Mr. Chair.

I thank the witnesses for joining us.

I will first go to Ms. Austin, from Twitter.

A little earlier, we discussed Mr. Musk's purchase of Twitter with the previous panel. In March, he ran two polls asking users whether they felt that Twitter's algorithm should be open source and whether freedom of expression was respected on the platform. Those surveyed answered yes to the first question and no to the second. Of course, Mr. Musk accused the platform of applying censorship.

Do you think Mr. Musk's takeover of Twitter may lead to changes in some of the platform's policies and ways of operating? The fact that people could speak out more freely may unfortunately encourage the spread of disinformation and hate speech.

12:30 p.m.

Director, Public Policy (US & Canada), Twitter Inc.

Michele Austin

I can't speculate on what Mr. Musk will or will not do until that deal closes, which could take months. I can only comment on our current approach, which will continue.

With regard to open-source code, Mr. Dorsey, the former CEO of Twitter, tweeted extensively yesterday with regard to open-source code and algorithms and his support of those. Twitter has traditionally supported the open Internet and efforts to open-source code. We have a number of experiments under way with regard to that, but I wouldn't be in a position to speculate any more than that.

12:30 p.m.

Bloc

Kristina Michaud Bloc Avignon—La Mitis—Matane—Matapédia, QC

Thank you.

For the benefit of the committee and the people listening to us, could you tell us in more detail what the impact on the dissemination of harmful content would be if Twitter's algorithm were open source? I am not an expert on algorithms. As many people have probably never heard of open source code, I would like you to tell us what would happen, in concrete terms, if Twitter made that change.

12:30 p.m.

Director, Public Policy (US & Canada), Twitter Inc.

Michele Austin

Just so that people watching and listening understand, as you said, algorithms are used for some of the most basic services by companies across Canada. I would suggest to the committee that, when you speak about open algorithms, you want to think specifically about what the algorithm is trying to solve for, rather than just saying generally, “please open up your algorithms”.

We also rely on human curation, not algorithms, to produce Twitter Moments. Let me give you an example. We are partnering with OpenMined, an open-source non-profit organization, to look at machine learning and privacy-enhancing technologies, or PETs, to pioneer new methods of public accountability and access to data in a manner that respects and protects the privacy of people who use our service.

12:30 p.m.

Bloc

Kristina Michaud Bloc Avignon—La Mitis—Matane—Matapédia, QC

Thank you very much.

I will now turn to the Meta Platforms representative.

In October 2021, a former Facebook data scientist told members of the U.S. Congress that Facebook knows the algorithms its platforms use are causing harm, but refuses to change them because eliciting negative emotions in people encourages them to spend more time on its sites or to visit them more often, which helps sell advertising. To reduce that harm without hurting Facebook's profits, she suggested that posts be displayed in chronological order instead of allowing the algorithm to anticipate what will engage the reader. She also suggested that an additional step be added before people can share content.

What do you think of those accusations?

What would be the consequences of removing the engagement prediction function from a platform like Facebook?

12:35 p.m.

Public Policy Manager, Meta Canada, Meta Platforms

Rachel Curran

The assertion that we algorithmically prioritize hateful and false content because it increases our profits is just plain wrong. As a company, we have every commercial and moral incentive to try to give the maximum number of people as much of a positive experience as possible on the platform, and that includes advertisers. Advertisers do not want their brands linked to or next to hateful content.

Our view is that the growth of people or advertisers using our platforms means nothing if our services aren't being used in ways that bring people closer together. That's why we take steps to keep people safe, even if it impacts our bottom line and even if it reduces their time spent on the platform. We made a change to News Feed in 2018, for instance, which significantly reduced the amount of time that people were spending on our platforms.

Since 2016, we've invested $13 billion in safety and security on Facebook, and we've got 40,000 people working on safety and security alone at the company.

12:35 p.m.

Bloc

Kristina Michaud Bloc Avignon—La Mitis—Matane—Matapédia, QC

Thank you.

I have a bit of time left to put a brief question to you.

Once you have detected potentially problematic content or activities on your platform, approximately how much time do you need to decide to block or hide that content?

12:35 p.m.

Public Policy Manager, Meta Canada, Meta Platforms

Rachel Curran

That's a great question.

Normally, it takes a matter of hours. If a more nuanced review is required and if it needs to go to one of our human reviewers, it might take a little bit longer, but we normally have material that's in breach of our community standards down within 24 to 48 hours at a maximum.

12:35 p.m.

Liberal

The Chair Liberal Jim Carr

Thank you very much.

Mr. MacGregor, we'll go over to you, sir, for your six-minute block of questioning.

12:35 p.m.

NDP

Alistair MacGregor NDP Cowichan—Malahat—Langford, BC

Thank you very much, Mr. Chair.

I'll start my line of questioning with Meta.

Ms. Damoff, my colleague on the Liberal side, already identified the significant profits that your company has made, the majority of which come from advertising revenue. With respect to what you've already said about your algorithms, is it also true that they are designed with a profit motive in mind?

12:35 p.m.

Public Policy Manager, Meta Canada, Meta Platforms

Rachel Curran

No. That's incorrect. They're designed to give our users and our community the most value possible, the best possible experience. We want them to see things that are useful to them and that are relevant to them. We want them to enjoy their experience on our platform. Otherwise, they're not going to come back and spend time there.

That's really our priority. It's to make sure that our users—

12:35 p.m.

NDP

Alistair MacGregor NDP Cowichan—Malahat—Langford, BC

I'd like to reclaim my time—

12:35 p.m.

Public Policy Manager, Meta Canada, Meta Platforms

Rachel Curran

—are enjoying their time spent on our platform.

12:35 p.m.

NDP

Alistair MacGregor NDP Cowichan—Malahat—Langford, BC

With respect, though, those algorithms, while promoting all of these positive things that you've said, have also had the added benefit of raising an obscene amount of money for your company. I guess what I'm trying to figure out here is how much that profit motive and the incredible sums of money that your company is able to make off these algorithms.... We know, from the research that is out there and from what this committee has already heard, that emotionally provocative content that reinforces what we already believe works better than factual information.

When we as a committee are looking at the increasing ad revenues that your company is making, when we know that emotionally provocative content can trump factual information and when we see the very obvious role that social media has played in increasing misinformation and disinformation out there, with very real-world consequences, how can we have assurances that your company is actually taking this seriously when there are all of these competing priorities grabbing your attention?

12:35 p.m.

Public Policy Manager, Meta Canada, Meta Platforms

Rachel Curran

Yes, I understand that. We do make money from advertising. That's true. However, a lot of that money gets reinvested into securing the safety of our community. As I've talked about, we've invested over $13 billion in this area since 2016 alone.

The other thing I would say is that I know it's sort of superficially attractive to say that social media is kind of the reason for division or polarization or some of these things we've seen. The latest research actually doesn't indicate that. In many countries where polarization is increasing, that started long before the advent of social media. In other countries with really significant or heavy social media use, polarization is lower and actually decreasing. Research doesn't back up the contention that social media is actually the cause of increased polarization or increasing divisiveness.

That said, all of our work is to amplify the good that comes from these platforms and try to minimize the bad. Maybe my colleague David can weigh in on this a little bit more—

12:40 p.m.

NDP

Alistair MacGregor NDP Cowichan—Malahat—Langford, BC

My time is limited.

12:40 p.m.

Public Policy Manager, Meta Canada, Meta Platforms

Rachel Curran

A lot of the work we do is to minimize the harmful stuff that you've talked about.