Evidence of meeting #105 for Canadian Heritage in the 44th Parliament, 1st Session. (The original version is on Parliament’s site, as are the minutes.)

A recording is available from Parliament.

Also speaking

Joan Donovan  Online Disinformation and Misinformation Expert, Boston University College of Communication, As an Individual
Bram Vranken  Researcher, Corporate Europe Observatory
Georg Riekeles  Associate Director, European Policy Centre, As an Individual
Matthew Hatfield  Executive Director, OpenMedia
Jeff Elgie  Chief Executive Officer, Village Media Inc.
Philip Palmer  President, Internet Society Canada Chapter

9:15 a.m.

Online Disinformation and Misinformation Expert, Boston University College of Communication, As an Individual

Dr. Joan Donovan

I think it's really important that we hold these companies to account and make sure they understand that the consequences of deploying technology don't just come with the PR of innovation, but that the public and government are going to look deeply at these algorithms or malgorithms and try to make sense of them. That doesn't mean putting the algorithms out into the world so that the 200 data scientists and computer programmers who understand them can look at them. It means we have to invest deeply in transparency and auditing systems that allow us to really understand what it means to serve information online and why platforms prefer different kinds of information at different times.

What we know about virality online is that novel and outrageous things go furthest and fastest. That tends to be conspiracies—

9:15 a.m.

Liberal

The Chair Liberal Hedy Fry

Excuse me, Ms. Donovan. We're about a minute over time. You can expand on that with another question later on. Thank you.

9:15 a.m.

Liberal

The Chair Liberal Hedy Fry

We'll now go to the second round of questions, and it's a five-minute round. Once again, that's five minutes for questions and answers.

We'll begin with the Conservatives and Kevin Waugh.

Kevin, you have five minutes, please.

9:20 a.m.

Conservative

Kevin Waugh Conservative Saskatoon—Grasswood, SK

Thank you, Madam Chair.

For disclosure, I was part of legacy media for over 40 years, and I sat here for Bill C-18 listening to their hardships and to them bashing Meta. Many of them had agreements behind the scenes that they said little about, non-disclosure agreements, and then when Bill C-18 was passed in the House, one of the biggest media giants in this country, Bell, decided to lay off 1,300 of their employees. Again, there was nothing said. The CRTC, with its lax regulations, said little, and it was just kind of swept under the carpet. Then when CBC, a broadcaster and digital network, made cuts, everybody was up in arms, yet it's the taxpayer who pays most if not all of the bill for CBC.

Mr. Palmer, you mentioned before that there was hardship when Meta withdrew, but I sat around this table listening to these companies and they had certain agreements. Then, of course, when Meta withdrew, they said nobody was going to their websites and this and that. You can't have it both ways. These companies, when they sat here, were in hardship complaining about Meta, so Meta withdrew and they're still complaining today.

You don't think this is constitutional. I did hear you a year ago around this table. What are your thoughts today? Is the Online News Act constitutional or not?

9:20 a.m.

President, Internet Society Canada Chapter

Philip Palmer

Frankly, no, it's not constitutional. There's no provision in the Canadian Constitution that gives the federal Parliament the authority to regulate online platforms, and none of the legislation is constructed in a manner that would grant that power.

9:20 a.m.

Conservative

Kevin Waugh Conservative Saskatoon—Grasswood, SK

Mr. Elgie, I give you credit for your 10 years building Village Media. You've done very well over the last 10 years. However, when we say “level playing field”.... When I was in the media, it was never a level playing field against the CBC. I see that their digital news network is now destroying local news in this country everywhere because, let's face it, they have the resources that other media don't in this country.

I'm just wondering what “level playing field” looks like in your view.

9:20 a.m.

Chief Executive Officer, Village Media Inc.

Jeff Elgie

Thank you for the question.

With respect to the CBC, while there are some roles that are important for the CBC to play, certainly having a commercial role is not one. When they're primarily funded by the government, the loophole they found to compete in particular in the digital advertising space seems unreasonable.

With respect to a level playing field, if the goal was to support journalism, whether through the government or through the platforms, the hope was always that support would be focused proportionally and that all players, big or small, could participate. We find ourselves in a world where, since the passing of the Online News Act, Meta's abandoning the industry does in fact disadvantage small players and start-up publishers in particular. There is no longer a level playing field, and mature publications, including us to some extent, are now advantaged in their markets over anyone who might seek to build new businesses in what I still consider a very entrepreneurial pursuit today, which is digital publishing.

9:20 a.m.

Conservative

Kevin Waugh Conservative Saskatoon—Grasswood, SK

I don't know where the budget officer got the figure of $329 million. They pulled that out as the number Google and Meta were going to give news organizations, and now we're down to $100 million.

For that $100 million from Google, who is going to determine where it goes? I think a big concern we have around this table is, with only $100 million of Google's skin in the game, who gets the money and who determines who gets the money? That's the big question here.

9:20 a.m.

Liberal

The Chair Liberal Hedy Fry

You have 30 seconds left.

9:20 a.m.

Conservative

Kevin Waugh Conservative Saskatoon—Grasswood, SK

I will give that one to Jeff.

Jeff, you're an entrepreneur. Is it a concern to you where the $100 million from Google goes and who determines where it goes?

9:20 a.m.

Chief Executive Officer, Village Media Inc.

Jeff Elgie

We're highly interested in the outcome. We expect the final regulations may prescribe some division of that. Certainly, if the full scope of the CBC and the full scope of private broadcasting is in, including Bell and Rogers, then it obviously dilutes what is already an undervalued pot of funds, which originally, at least I believe, was intended to support the traditional digital and print news industry. It's hard to say, but certainly if the full scope is in, then that pool will be highly diluted.

9:25 a.m.

Liberal

The Chair Liberal Hedy Fry

Thank you very much, Kevin. Your time is up.

I will now go to the Liberals, with Mr. Noormohamed for five minutes, please.

9:25 a.m.

Liberal

Taleeb Noormohamed Liberal Vancouver Granville, BC

Thank you, Madam Chair.

I want to thank our witnesses for being here.

I would like to start with Ms. Donovan.

One of the prevailing concerns in many communities, particularly among Muslims, Jews and communities of colour, has been the way online platforms, particularly X, Facebook, Instagram and others, have been used as breeding grounds or an amplification force for extreme hateful views. Given the size of these platforms and the limited places online where folks go to access created content—I'm not talking about news here—and access engagement, how do you see the risk profile growing over the course of the next little while for these communities, particularly given the way we've seen bots, foreign governments and others try to foment discord and hate on these large platforms?

9:25 a.m.

Online Disinformation and Misinformation Expert, Boston University College of Communication, As an Individual

Dr. Joan Donovan

I began my research on the Internet, networks and social movements looking at the Occupy movement primarily. As my attention turned to white supremacist groups online, I was able to use the same methods I used to look at online social movements in order to think about the formation of movements and of what later became known as the alt-right: the networked social movement of certain charismatic individuals and the money players who were funding it. It culminated in what my research looks at particularly, which is the “wires to the weeds” effect: what gets said online then ends up in public spaces.

I know that in Canada, numerous organizations like the Oath Keepers and the Proud Boys were active, which formed through their own inertia and were also aided by platform companies allowing them a place to germinate and grow. Since then, a lot of the research in this field has been about removing these bad actors from main-stage platforms. I'm particularly unnerved that Musk returned Alex Jones, who I think has a nearly $2-billion fine ahead of him for having maligned and harassed the families of the victims of Sandy Hook. This is scary because this person, along with many others, organized the January 6 riots at the Capitol. What we understand is that platforms aren't just a space for speech. They're also a networking and organizing space for action. That includes surfacing resources for far-right extremist groups.

I've been very pleased with groups like the American organization Color of Change, which launched a blood money campaign in an effort to get places like Mastercard and PayPal not to serve payment to extremist groups and known white supremacist groups. What we know about platform companies is that for a long time they ignored the problem. Then, when we got them to take responsibility for it, they hired people to do that work. However, now, as public opinion of these platforms has shifted, they're not getting any rewards for putting out information about their transparency related to extremist groups on their platform, so they stopped investigating.

This is what's at stake for 2024. If we can't depend on platforms to understand and moderate their own territories, governments like Canada's will have to step up, step in and say, “This is a serious problem.”

9:30 a.m.

Liberal

Taleeb Noormohamed Liberal Vancouver Granville, BC

I'd like to pick up on what you've been talking about here. I'd like to now talk about YouTube.

We have had political leaders in this country use hashtags in their own search criteria for their videos. One of them used a hashtag that was misogynistic. It was intended to build out a certain viewer base. This was used by the current Leader of the Opposition.

I'm curious as to whether or not you can talk to us a little about the implications of what algorithms start to do and how. When you start to use those hashtags, what kind of rabbit holes do they take viewers down? What are some of the consequences of them?

9:30 a.m.

Online Disinformation and Misinformation Expert, Boston University College of Communication, As an Individual

Dr. Joan Donovan

I have been researching YouTube for a decade now.

9:30 a.m.

Liberal

The Chair Liberal Hedy Fry

Thank you for the question, but we've run out of time. You can answer it in another round.

We will now go to the Bloc Québécois for two and a half minutes, please.

Martin, go ahead.

9:30 a.m.

Bloc

Martin Champoux Bloc Drummond, QC

Thank you, Madam Chair.

Mr. Riekeles, I'm going to go to you. Two and a half minutes goes by very fast, so I'll try to ask my question quickly.

Countries saw long ago the damage that social media can cause through content that is so easily discoverable on platforms, including hateful content, which fuels various hateful trends.

Why is it taking so long for countries to put laws in place? Where the European Union is concerned, the Digital Services Act will come into force in a few weeks, in January, but it took time to put in place. In the United Kingdom, the Online Safety Act was passed in October. Here in Canada, no bill has been introduced yet, even though we've been told for years that one is in the works.

In your opinion, why is it taking so long to develop legislation on a subject that is so critical, urgent and necessary?

9:30 a.m.

Associate Director, European Policy Centre, As an Individual

Georg Riekeles

Thank you very much.

I think it's an excellent question. In reality, what you're pointing to is that, over the two decades during which we should have been regulating these platforms, and the biggest corporations and monopolies in particular, public action has consistently been too little and too late.

I think the fundamental explanation is what I was pointing to: big outside vested interests create whole ecosystems of influence and subversion that, in the end, manipulate society and policy-making. One realizes what is happening too late, and when one comes to take action, very important counterforces are at play, undermining the capacity to legislate and regulate.

These actors are, as was said earlier in the hearing, of such a size today that they are bigger than the GDPs of many G20 countries. Of course, when money is not a limiting factor, you buy, or try to buy, everything. That is what we are seeing in terms of lobbying and framing the narrative, but also, as I was pointing to, creating alliances, setting up front groups and astroturfing campaigns. To put it very frankly, they are influencing or buying think tanks and academics.

We have a very big problem across the western world, not only in Europe and Canada but also in the U.S., in Australia and elsewhere, in dealing adequately with the scale of the challenge we are facing.

One point was referred to earlier about the independence of academia. I think we are facing a big challenge in terms of having independent academic scrutiny of this. This is a point that has been raised by somebody called Meredith Whittaker, amongst others. She worked for 13 years at Google as head of their open research efforts, and then she left to work on AI ethics. When she was pushed out of the centre where she was working, essentially what she said was that there is virtually no independent academic research on AI ethics across the world.

I think these are examples of the scale of the difficulties we are facing. These companies are systematically, effectively and extensively using their power and their leverage across the policy debate, and that renders regulatory action very difficult.

9:35 a.m.

Liberal

The Chair Liberal Hedy Fry

Thank you, Mr. Riekeles.

Martin, you have gone over, but that's okay.

Peter, you have two and a half minutes, please.

December 14th, 2023 / 9:35 a.m.

NDP

Peter Julian NDP New Westminster—Burnaby, BC

Thank you very much, Madam Chair.

I wanted to come back to Mr. Hatfield and Ms. Donovan. You were both kind enough to mention and support Bill C-292 on algorithm transparency, which is before the House of Commons under my name. There is similar legislation before the U.S. Congress under the sponsorship of Senator Ed Markey.

How important is it to have that algorithm transparency? Do you feel that the push-back that we're getting from these massive big tech companies is because they realize that if the algorithms are transparent, liability then comes for some of the malgorithms that have led people to commit real-world acts of violence?

I will start with Mr. Hatfield.

9:35 a.m.

Executive Director, OpenMedia

Matthew Hatfield

It's critical to get more transparency into how algorithms are working. I don't know if they would face legal liability, but certainly they would face bad press in some cases. Frankly, we're regulating in the dark on a lot of these issues. We truly don't always understand what is occurring on platforms and why. We need that researcher access to understand it better.

9:35 a.m.

NDP

Peter Julian NDP New Westminster—Burnaby, BC

Thank you.

Ms. Donovan.

9:35 a.m.

Online Disinformation and Misinformation Expert, Boston University College of Communication, As an Individual

Dr. Joan Donovan

I think it's really important that we get these bills passed. Senator Ed Markey lives in my hometown, so I'm happy to be aligned in that way.

What matters about transparency and algorithms isn't just that companies can dump a bunch of code that you could parse for years. It's really that we set up a transparency and auditing agency whose role is to look at these algorithms, take in changes to them, ask questions and query these large companies about what is being served.

We also need panel data, which means that we need data that is not about users but more about the links and the kind of information that is circulating online. This would be a way to audit algorithms in terms of what kind of news the algorithms are making popular.

Again, it goes back to this finding from MIT many years ago about how lies travel online, which is that novel and outrageous content moves further and faster online, not just because of what the content is but because of the way algorithms mediate our experience with the information we're seeking.

Google ranking matters if you want to understand a certain issue. We need to know how those things work, and how it decides something very banal like whether, when I type in “salsa” on Google, it's going to give me recipes or dance classes. We need to know why it's making these decisions and how.

When it comes to people—and our names are all we have—it's really important that we have a way of auditing how our own names and identities are shaped online.