Online Algorithm Transparency Act

An Act respecting transparency for online algorithms


Peter Julian  NDP

Introduced as a private member’s bill. (These don’t often become law.)


Outside the Order of Precedence (a private member's bill that hasn't yet won the draw that determines which private member's bills can be debated), as of June 17, 2022




The purpose of this enactment is to ensure that online communication service providers do not use algorithms that use personal information in a manner that results in the adverse differential treatment of any individual or group of individuals based on one or more prohibited grounds of discrimination or on any other grounds.



December 14th, 2023 / 9:35 a.m.


Peter Julian NDP New Westminster—Burnaby, BC

Thank you very much, Madam Chair.

I wanted to come back to Mr. Hatfield and Ms. Donovan. You were both kind enough to mention and support Bill C-292 on algorithm transparency, which is before the House of Commons under my name. Similar legislation is before the U.S. Congress under the sponsorship of Senator Ed Markey.

How important is it to have that algorithm transparency? Do you feel that the pushback we're getting from these massive big tech companies is because they realize that, if the algorithms are transparent, liability will follow for some of the "malgorithms" that have led people to commit real-world acts of violence?

I will start with Mr. Hatfield.

December 14th, 2023 / 8:35 a.m.

Matthew Hatfield Executive Director, OpenMedia

Certainly. My apologies.

To me, this hearing's topic seems to be pinning down what's wrong with tech platforms and what our government can do about it. I'll try to answer that question very precisely for you.

What's wrong with tech platforms and their influence on society? It's three things: their size, their vast asymmetrical data compared to regulators and citizens, and the engagement algorithms that drive their business model.

Let's talk size. Platforms like Amazon and Google have a stranglehold on a huge share of Internet commerce, app purchases, advertising and more. They often use that power to set unfair terms vis-à-vis smaller businesses and consumers. I'll note, though, that Bill C-18 misunderstood the specific dynamic around news: it assumed that news has inherent value to platforms, which, for Meta at least, it does not.

The good news about the size problem is that Canada is opening new possibilities to do something about it through competition reform in Bill C-56 and Bill C-59. In the U.S., several bills were proposed last year aimed at regulating how tech giants treat small businesses and consumers. They include the American Innovation and Choice Online Act and the Open App Markets Act, both of which OpenMedia campaigned for. In Canada, the Competition Bureau has never had the legal basis to study platform power effectively, let alone change it. Soon it will.

My second point is about data asymmetry and privacy. Platforms like Meta and YouTube have an endless volume of sensitive data about each and every one of us. They use it for advertising and to feed recommendations, but not for much else. Partly that's to respect our privacy, which is a very good thing. Their data in the hands of a spy agency or law enforcement would be a dystopian surveillance nightmare, and one that we must guard against. However, that lack of curiosity on the platforms' part is also self-serving. It makes it easy to bury accurate study of what may be going wrong for some of their users and what may, in the worst case, lead that minority to harm themselves or others. The limited research that exists on how platform models may sometimes amplify harms is done with very incomplete data or with crumbs of researcher data access, which platforms are quick to withdraw if their interests are threatened.

Here we need both an individual and structural remedy. The strongest possible privacy bill, Bill C-27, giving Canadians meaningful and unalienable control of our personal data, is one solution, but another must be a very strong provision for both regulator and approved academic researcher access to perform studies on platform data in our upcoming online harms bill. We can't intelligently regulate platforms if we don't understand how any harms they help produce actually occur.

Last but not least, let's talk about the algorithm. Without even noticing it, we've become a society in which most information we get is delivered because it keeps us scrolling and clicking, not because it is nuanced, well researched or true. For music or hobbies, that can be a wonderful tool of self-exploration. We are not passive consumers of our feeds; we curate them heavily, pruning the algorithm to serve us what we like most. However, for facts and reporting, that same process is making us a less-informed, angrier and more polarized society. We all feel the impact, and very few of us like it. That doesn't make solutions easy, although I would say that Bill C-292, Peter Julian's bill, is something worth considering here.

I'll give a couple of signposts for what might help. We welcome this committee's interest in a dedicated study of how to create a viable news sector in Canada that continues producing vetted information. There's a case that Canadian news needs permanent government support. However, the more involved government becomes, the more urgent it is that funds move through a system that is fully transparent to the public, has clear and fair criteria for who gets what support, and prioritizes funds where they're most needed: in local news deserts and public accountability journalism, not shovelled indifferently toward Bell or the CBC. The alternative, stacking complex funding band-aids one on top of the other until they represent the majority of news funding, is not going to build public trust in truthful journalism.

We would also welcome a Canadian study of how social media algorithms are impacting society. However, regulating the algorithm, if it comes, must be aimed at expanding transparency and personal control over how it works for Canadian Internet users, not manipulating it for what the government thinks is best for us.

December 14th, 2023 / 8:15 a.m.

Dr. Joan Donovan Online Disinformation and Misinformation Expert, Boston University College of Communication, As an Individual

Thank you so much for being here and thank you for the invitation to testify at this hearing.

I'm Dr. Joan Donovan, and I've spent my career studying harmful online campaigns, including misinformation, disinformation and media manipulation. I'm an assistant professor at Boston University's College of Communication.

Until recently, I worked for the Harvard Kennedy School of Government as the research director of the Shorenstein Center and the director of the technology and social change research project, also known as TaSC. TaSC focused on online media manipulation campaigns and influence operations by bad actors, including adversarial nations running misinformation and disinformation campaigns, skewing public discourse, seeding hate, violence and incitement online, and, of course, undercutting democracy's free and fair elections.

Before Harvard, I led my research at Data & Society, a non-profit where my team and I mapped how social institutions were intentionally disrupted through online campaigns. I chose to join Harvard after a lengthy recruitment period because they convinced me that they would support this work at scale.

Governments around the world and the public have come to rely on my work, as well as that of many other researchers in this field. From that work, they have learned who is behind COVID misinformation, especially the calls for hydroxychloroquine. We have also learned what domestic and foreign operatives are doing to create division in communities, documenting the behaviour of 81 countries that deploy cyber-troops to manipulate public opinion online. I have worked with the WHO and the CDC on strategies to mitigate medical misinformation, and most recently I've worked with the Canadian election misinformation project at McGill University.

As detailed in my whistle-blower disclosure, submitted on my behalf by Whistleblower Aid, my team's groundbreaking research in this field was ground to a halt in obeisance to Facebook by the dean of Harvard Kennedy School, a man now known for his deference to donor interests.

In short, in October 2021, a well-known Facebook fixer became enraged in a donor meeting when I told the group that I had Frances Haugen's entire cache of internal Facebook documents and that I planned to create a public collaborative archive of them. I said they were the most important documents in Internet history. This donor and Facebook PR executive attacked everything I said at that meeting. He and Facebook-affiliated donors have powerful influence at Harvard, so that was the start of the Kennedy School's campaign to stop my work and create unceasing misery for my research team. When Harvard received a donation of half a billion dollars from the Chan Zuckerberg Initiative, the fate of my research was sealed. HKS killed the TaSC project and fired me after silencing me and my team for two years.

Courtney Radsch testified here that tech giant intimidation extends to researchers and academics, a further weaponization of the big tobacco and big oil playbooks: silencing and skewing research to protect their profits and their lies to the public. However, unlike the censorship campaigns of those before them, tech giants have more tools at their disposal, because they control the information landscape and the data about it. For instance, Meta's actions in Canada to fight Bill C-18 have deprived Canadians of more than five million news interactions a day, according to McGill's Media Ecosystem Observatory.

You see the damage of their for-profit motivation acutely in Canada. As Imran Ahmed from the Center for Countering Digital Hate testified here, we know that bad actors fill the vacuum when credible news and information disappear, leaving little else to look at. When a school like Harvard is complicit in the corporate direction of research, what can protect those of us who work to document, analyze and share the truth? As others have noted, Facebook's actions to avoid accountability have targeted legislators and regulators in the U.S. and Canada.

I want to close by saying this. I support the online algorithm transparency act, known as Bill C-292 here in Canada, and the similar legislation introduced in New Zealand, the U.K. and the European Union. I was raised with the deepest conviction that I'm responsible for the consequences of my actions, and tech giants must be too. As an academic, I have a moral obligation to tell the truth—then and now.

Thank you very much.

December 7th, 2023 / 10:20 a.m.


Peter Julian NDP New Westminster—Burnaby, BC

I would like to move on to Madam Benavidez.

I appreciated your comments about algorithm transparency. This is something that has come up in the U.S. Congress and in Canada. I have a private member's bill, Bill C-292, which would force algorithm transparency for platforms. Senator Ed Markey in the U.S. Congress is presenting similar legislation. In fact, his legislation inspired our legislation here.

The platforms are opposed because they are concerned about the possibility of liability once those algorithms are exposed. They're concerned they might be held liable for the kinds of algorithms, the "malgorithms", they're promoting that have led to so many incidents of violence.

What is your feeling on algorithm transparency in legislation? Do you feel it's important that legislators move forward with this type of legislation?

November 28th, 2023 / 12:45 p.m.


Peter Julian NDP New Westminster—Burnaby, BC

Thank you very much, Mr. Chair.

Of course, that decision was on the line. The committee has the perfect right to make that decision. I would agree with Mr. Champoux that committee work means working often by unanimous consent, and I'm hoping that we get back to that.

Mr. Ahmed, I want to come back to you.

I'm stunned to learn that you have 25 members on your team. Please pass on our sincere appreciation for 25 people doing such remarkable work in the face of the big tech juggernauts and the massive increase in hate and disinformation we are seeing.

One of the things that can help to push back against this hate and disinformation is having transparency around online algorithms. Bill C-292, before the Canadian Parliament, seeks to do that.

In the United States, before the U.S. Congress, as I know you are aware, Senator Ed Markey has put forward similar legislation to oblige big tech companies to actually expose the algorithms they use to force-feed, in this case, hate and disinformation to so many people.

The concern in big tech, of course, is that they'll be liable if there is a direct link between the massive terrorist attacks that we've seen linked to hatred, whether anti-Semitic, homophobic or racist, and their algorithms. A legal liability would be established.

How important is it for Parliament and the U.S. Congress to adopt this kind of legislation to hold big tech liable for the egregious practices they have?

Online Algorithm Transparency Act (Routine Proceedings)

June 17th, 2022 / 12:15 p.m.


Peter Julian NDP New Westminster—Burnaby, BC

moved for leave to introduce Bill C-292, An Act respecting transparency for online algorithms.

Madam Speaker, with thanks to my seconder, the member for Hamilton Centre, today I am tabling an important bill, Bill C-292, an act respecting transparency for online algorithms.

The purpose of this bill is to ensure that online platforms do not use algorithms and personal information to discriminate against anyone. This legislation is particularly timely, because as we have seen during this pandemic, there has been an unprecedented rise in online hate, disinformation and right-wing extremism.

For years, online platforms have been using algorithms to discriminate, to make predictions or decisions about a user and to direct information by amplifying or promoting content to that user. The online algorithm transparency act would require transparency and accountability in all algorithms that are used.

Other jurisdictions, such as the United Kingdom, the European Union and New Zealand, are looking at implementing similar legislation. Of course, Senator Ed Markey has sponsored a landmark bill in the U.S. Senate. Anti-hate organizations are also calling for algorithm transparency.

I urge all members of Parliament to support this important legislation.

(Motions deemed adopted, bill read the first time and printed)