TL;DR
X has pledged to speed up its review process for hate and terrorist content in the UK, aiming to assess reports within 24-48 hours. The move follows Ofcom’s criticism and ongoing investigations into online safety issues.
X has committed to significantly improving its response to hate and terrorist content in the UK, promising to review and assess such reports within 24 hours, with at least 85 percent of reports assessed within 48 hours, according to Ofcom. This development comes amid ongoing concerns over the platform’s role in hosting harmful content and Ofcom’s regulatory actions.
Following pressure from the UK regulator Ofcom, X has announced plans to accelerate its review process for hate and terrorist content posted in the UK. The platform states it will aim to review and assess such content within 24 hours of being reported, with a target that at least 85 percent of hate content will be reviewed within 48 hours.
In addition to speeding up reviews, X plans to collaborate with UK-based experts on hate and terror content and intends to ban offending accounts. Ofcom will monitor X’s performance quarterly over the next year to ensure compliance. The regulator is also investigating Elon Musk’s Grok AI chatbot over allegations related to illegal content generation, and continues to scrutinize other platforms, such as 4chan, for violations of online safety laws.
Why It Matters
This development is significant because it represents a potential shift in how social media platforms handle harmful content amid increasing concerns over hate speech and online safety in the UK. If implemented effectively, it could reduce the exposure of vulnerable communities to hate and terror content. However, skepticism remains given X’s history of controversial content moderation and Elon Musk’s own activity on the platform, which often includes racist and inflammatory posts.

Background
Since Elon Musk acquired Twitter and rebranded it as X, reports indicate a rise in hate speech on the platform; a UC Berkeley study noted a 50 percent increase in weekly hate speech, partly driven by bots. The UK government and Ofcom have grown increasingly vocal about the need for social media companies to combat online hate, especially following recent hate-motivated crimes against the Jewish community. Ofcom’s investigations into X and other platforms reflect ongoing regulatory efforts to enforce online safety laws more strictly.
“We have evidence that terrorist content and illegal hate speech is persisting on some of the largest social media sites. We are challenging them to tackle the problem and expect them to take firm action.”
— Oliver Griffiths, Ofcom’s Online Safety Group Director
“We are committed to reviewing and assessing terrorist and hate content in the UK within 24 hours of it being reported, and working with experts to improve our moderation efforts.”
— An X spokesperson

What Remains Unclear
It remains unclear whether X will be able to meet these review targets consistently, given its past performance and the platform’s overall approach to moderation. Additionally, skepticism persists over whether Elon Musk’s own activity on the platform aligns with these commitments, and whether the measures will effectively reduce hate speech in practice.
What’s Next
Over the coming months, Ofcom will review X’s performance data quarterly to assess compliance with its commitments. The regulator’s ongoing investigation into X’s handling of illegal content and the effectiveness of its moderation strategies will also influence future regulatory actions. Further updates are expected as these assessments unfold.

Key Questions
Will X be able to effectively reduce hate content in the UK?
It is currently uncertain. While X has committed to faster review times and collaboration with experts, skepticism remains due to past issues and Elon Musk’s activity on the platform.
What penalties could X face if it fails to meet regulatory requirements?
Potential penalties include fines, increased regulatory scrutiny, or restrictions on platform operations in the UK, depending on Ofcom’s findings and enforcement actions.
Other platforms, such as 4chan, have already faced fines and regulatory action for content violations, but X’s commitment to rapid review and expert collaboration marks a notable shift in its approach to online safety in the UK.