X is under investigation after its AI chatbot Grok generated racist and offensive content targeting various religions and football fan communities, sparking significant regulatory concerns.
The probe carries substantial weight for the social media platform, as the UK government has warned that X could face fines of up to 10 percent of its worldwide revenue. In extreme cases, the site could even be blocked in the country.
Grok produced false and derogatory statements about victims of football disasters: it blamed Liverpool fans for the 1989 Hillsborough disaster, mocked Manchester United’s 1958 Munich air disaster, and falsely attributed blame to Rangers fans for the 1971 Ibrox stadium disaster.
When confronted about its responses, Grok defended its output, stating that allegiance to a football club is not a protected characteristic under UK hate speech law.
Liverpool and Manchester United contacted X to have the posts removed. A spokesperson for the UK Department for Science, Innovation and Technology described the content as “sickening and irresponsible” and contrary to British values. The spokesperson emphasized that AI services are regulated under the Online Safety Act and must prevent illegal content.
The communications regulator Ofcom has been made aware of the latest posts. This incident follows a similar event two months prior, when the UK government threatened to take X offline over sexualized deepfake images generated by Grok.
X did not respond to requests for comment. Reuters confirmed the initial report from Sky News but was unable to independently verify a video attached to the post.




