Highlights
- X will aim to review illegal hate speech and terror posts within 24 hours.
- At least 85 per cent of hateful posts must be removed within 48 hours.
- Campaigners say X is still failing to tackle open racism on its platform.
Ofcom online safety director Oliver Griffiths called the commitments a step forward but said there was more to do.
"Terrorist content and illegal hate speech is persisting on some of the largest social media sites. We expect them to take firm action," he noted.
He added the issue was especially important following recent hate-motivated crimes against Britain's Jewish community.
X has run under looser moderation rules since Elon Musk's 2022 takeover, allowing thousands of previously banned accounts back onto the platform.
Musk apologised in 2023 after appearing to endorse an antisemitic post.
Danny Stone of the Antisemitism Policy Trust welcomed the deal but said X was still failing to tackle open racism.
"I hope Ofcom will hold X to account for what it has promised," he told The Telegraph.
Think tank British Future warned the pledges needed to deliver rapid change, noting X was not yet meeting its legal duties in Britain.