I'll give you some background.
On July 29, 2024, there was a mass stabbing at a children's dance class in Southport in the United Kingdom, and three children died.
Immediately following news of the attack, false information about the attacker's identity spread on social media, alongside calls for action and violence. The next day, hundreds of people gathered outside a Southport mosque and hurled petrol bombs, bricks and anti-Muslim abuse, motivated by false information spread online that named the attacker, a name I won't repeat because I don't want to amplify it any further, and claimed he was both a Muslim and an asylum seeker.
Acts of violence and public disorder, much of it featuring anti-Muslim and anti-migrant sentiment, soon spread around the country. Posts containing the fake name were amplified by platform algorithms and recommendation features. The Institute for Strategic Dialogue found that X featured the false name in its "trending in the U.K." promotions, suggesting it to users in the "what's happening" sidebar.
Far-right figures with millions of followers capitalized on false claims that the attacker was an asylum seeker, spreading the falsehood further to their massive follower bases.
One platform stood out. It was yours. It was X, and its owner, whom we have identified already, Mr. Elon Musk, shared false information about the situation with his 195 million followers and made a show of attacking the U.K. government's response to the outbreak of violence. Rather than ensuring that risks and illegal content were mitigated on his platform, Mr. Musk recklessly promoted the notion of an impending civil war in the U.K., Mr. Fernández, and yet your company, X, refuses to sign on to a code of practice on disinformation.
What do you have to say about that, Mr. Fernández?