By Martin Coulter
LONDON (Reuters) –
Elon Musk has been accused of exacerbating tensions after a week of far-right rioting in Britain, prompting calls for the government to accelerate the rollout of laws managing harmful online content.
Misinformation and calls to violence have proliferated on social media over the past week, following the fatal stabbing of three young girls in Southport, England, which far-right and anti-Muslim groups have exploited.
As rioters clashed with police in several towns and cities, Musk contributed to the discourse on his X platform, stating that civil war was “inevitable” in Britain. Prime Minister Keir Starmer’s spokesperson commented that there was “no justification” for such remarks.
Starmer further warned social media companies about the violent disorder incited online, labeling it a crime “on your premises,” while acknowledging the need for a careful approach in dealing with these firms.
The official responses illustrate the government's challenge. The Online Safety Act was passed into law in October but has yet to take effect. The legislation empowers media regulator Ofcom to fine social media companies up to 10% of their global revenue if they fail to tackle content that incites violence or terrorism.
However, Ofcom is still formulating guidelines on how the law will be executed, and enforcement isn’t anticipated until early next year. In light of the recent violence, there are increasing calls for the regulations to be expedited.
On Wednesday, Ofcom released an open letter emphasizing the responsibility of social media firms to safeguard users against harmful content, even in the absence of the Online Safety Act.
Director Gill Whitehead stated: “In a few months, new safety duties under the Online Safety Act will be in place, but you can act now – there is no need to wait to make your sites and apps safer for users.”
Adam Leon Smith, a fellow at BCS, the Chartered Institute for IT, urged for prompt enforcement of the Online Safety Act, noting, “There must be a tipping point where a foreign billionaire platform owner has to take some responsibility for running a toxic bot network that has become one of the main sources of fake news and misinformation in the UK.”
Kirsty Blackman, an MP for the Scottish National Party, stated that laws governing online safety are long overdue and expressed support for accelerating the timetable for requirements, particularly for high-risk platforms.
An Ofcom spokesperson assured that they are moving swiftly to implement the Online Safety Act, needing to consult on codes of practice and guidance before new safety duties become enforceable.
Musk did not immediately respond to a request for comment.
ENFORCEMENT
While individuals inciting violence online can face prosecution, the government cannot compel social media companies to police their platforms until the Online Safety Act takes effect.
On Tuesday, Britain’s technology minister Peter Kyle reported meeting with TikTok, Meta, Google, and X to stress their duty to curb harmful online content. The companies did not immediately respond to requests for comment.
Nevertheless, numerous posts on X encouraging violence and racism remain active and have been viewed tens of thousands of times.
As of this writing, Musk’s posts on the subject had reached millions of users; one post suggesting that Muslim communities receive undue police protection had been viewed 54 million times. While such comments may not breach rules on illegal content, hosting explicit calls for violence could.
Iman Atta, director of the advocacy group Tell MAMA, which monitors anti-Muslim activity in Britain, stated, “We would encourage Ofcom to speed up its work on the guidelines, so that X and other social media platforms face financial penalties if they do not remove harmful content.”
She added, “There is a need to force platforms to take more drastic action against extremism and hate speech.”