UK media regulator says X promises to crack down on terrorist and hate content
Ofcom said X pledged to restrict access in the U.K. to accounts tied to banned terrorist groups and to expedite review and removal of illegal content, with quarterly reporting over a 12-month period. The regulator noted ongoing Grok-related investigations and broader EU scrutiny of X.
Why It Matters
The commitments reflect regulatory pressure on a major social platform to curb terrorist and hate content in the U.K., amid broader international scrutiny of the platform's content governance.
Timeline
Ofcom says X promises to crack down on terrorist and hate content in Britain
Ofcom said X pledged to restrict access in the U.K. to accounts operated by or on behalf of terrorist groups banned in the country. X also promised to review suspected illegal terrorist and hate content within 24 hours on average, and to assess 85% of flagged material within 48 hours of a user report. Over a 12-month period, X will submit quarterly performance data so the regulator can measure its progress against these targets, and will engage with experts to improve its reporting systems, addressing concerns from civil society groups about follow-up after content is flagged.
French prosecutors seek charges against Musk and X
French prosecutors last week reportedly sought charges against Elon Musk and X, including for denial of crimes against humanity.
Ofcom launches investigation into Grok
Ofcom responded to the Grok controversy by launching an investigation into whether the chatbot failed to protect users from illegal content; Ofcom Director Oliver Griffiths said the investigation is ongoing.
Grok controversy intensifies earlier this year
The Grok AI chatbot, accessible on X, drew global scrutiny after it generated nonconsensual deepfake images. Ofcom noted that scrutiny of Grok, and related concerns about illegal content on the platform, have continued this year.