Thank you so much for inviting me.
I'm Nora Benavidez, senior counsel at Free Press—not The Free Press, which is a different entity. I just want to clarify that.
We are a U.S.-based NGO, where I run public accountability campaigns and federal policy reform efforts to ensure that tech is protecting human and civil rights and upholding democracy.
Following years of work by civil society, academics and lawmakers documenting social media harms and urging more accountability, the largest tech companies have responded with indifference. What's worse, they have increasingly used dangerous tactics to evade accountability. I'll talk a little about that today.
Since the global pandemic, crises like the January 6 insurrection at the U.S. Capitol, the attempted coup in Brazil in January of this year and the current conflict in the Middle East have illustrated the critical role social media platforms play in shaping rapidly unfolding events.
Their failure to vet and remove content that violates their own stated terms of service harms and alienates users. Failure to moderate content also inevitably leads to the migration of lies and toxicity from online platforms to mainstream media.
Just today, our organization, Free Press, released new research on the backsliding of big tech companies. In the last year alone, Meta, Twitter and YouTube have weakened their political ads policies, creating room for lies in ads ahead of next year's elections around the world. They have weakened their privacy policies to give AI tools access to user data, and they've collectively laid off nearly 40,000 employees.
Massive cuts have occurred across trust and safety teams, ethical engineering, responsible innovation and content moderation. Those are the teams tasked with maintaining a platform's general health and protecting users from harm.
This dangerous backslide has come under scrutiny. Evidence comes from whistle-blowers and from researchers studying algorithmic discrimination. Pressure also comes from organizations like ours, which have urged advertisers to leave Twitter because of Elon Musk's decisions, which have made the platform more hateful and violent.
All of this points to the fact that tech companies cannot be trusted to govern themselves. Their response has been far from collaborative. There have been several new tactics that companies have adopted to shut down inquiry and accountability.
The first is cutting off researcher and API access to platform data. Researchers are now suffering various limitations. In 2021, Facebook cut off the NYU Ad Observatory's access to its platform after months of the project's analysis of its ad library tools. Twitter has made its API almost impossible for researchers to access because of its high price tag. All of the major platforms require advance notice from researchers, who must be affiliated with universities, to get access to their APIs. This sets up a de facto process whereby the platforms can approve or reject research access if they don't like how the ultimate product might be used.
The second major threat we are now seeing is litigation to silence researchers and critics. Elon Musk has adopted this tactic and has gone after several research entities and NGOs studying the extent to which hate persists and grows on Twitter. Musk has sued several organizations: the Center for Countering Digital Hate—I know their CEO spoke before you as well—the State of California, and Media Matters for America. He has also threatened other organizations.
These suits are dangerous to researchers, but they're also dangerous to the public, who will be kept in the dark about tech companies' unethical practices.
Finally, the third major concern now is cross-sector attacks, abusing official power to go after researchers studying disinformation. This past summer, U.S. House Judiciary Committee chairman Jim Jordan led an effort demanding documents from leading academics, accusing them of suppressing speech, in particular conservative speech. These attacks have absolutely led researchers to retreat from the necessary work they had been doing.
Big tech doesn't have to go after every tech accountability researcher and campaigner, because these actions are already having a chilling effect. We've witnessed, in plain sight, tech companies run nearly every play in the book to avoid regulation and accountability. Their platforms are undermining democracy, civil and human rights, privacy and public safety. That's why I'm really excited to be here today to talk with you.
We have called on our U.S. government to compel more transparency; to minimize the data that companies collect, use and retain; to outlaw discriminatory algorithms; and to tax online advertising and redistribute those funds to support local, independent, non-commercial journalism.
Thank you so much for your time. I look forward to your questions.