Calls for social media regulation after ‘infodemic of misinformation’ on COVID-19

An influential parliamentary committee says the government should legislate to introduce a duty of care on social media companies, as well as appoint an independent online harms regulator

The online spread of COVID-19 related misinformation has led to renewed calls for the government to take action on social media regulation.

A report by the cross-party Digital, Culture, Media and Sport (DCMS) Committee warns of a range of potentially harmful falsities, from dangerous hoax treatments to conspiracy theories provoking attacks on 5G engineers.

“Online misinformation about COVID-19 was allowed to spread virulently across social media without the protections offered by legislation, promised by the government 15 months ago,” says the Misinformation in the COVID-19 Infodemic report, published 21 July.

The mooted legislation stems from the Online Harms White Paper, published in April 2019, which backed DCMS Committee recommendations to introduce both a duty of care on tech companies and an independent online harms regulator.

“We are calling on the government to name the regulator now and get on with the ‘world-leading’ legislation on social media that we’ve long been promised,” said DCMS Committee chair, Julian Knight MP.

“The proliferation of dangerous claims about COVID-19 has been unstoppable. The leaders of social media companies have failed to tackle the infodemic of misinformation.”


State actors – including Russia, China and Iran – as well as the Islamic State group, far-right groups in the US and the UK, and scammers, were among those blamed for spreading false claims.

Concerns were also raised that anti-vaccine conspiracy theories could jeopardise attempts to tackle COVID-19 once a vaccine became available.

“Evidence that tech companies were able to benefit from the monetisation of false information and allowed others to do so is shocking,” added Knight. “We need robust regulation to hold these companies to account.

“The coronavirus crisis has demonstrated that without due weight of the law, social media companies have no incentive to consider a duty of care to those who use their services.”

In response, Facebook insisted that it doesn’t allow the posting of harmful misinformation, and claimed that it had added warning labels to approximately 90 million pieces of pandemic-related content during March and April.

Twitter told the BBC its chief goal was “protecting the health of the public conversation – this means surfacing authoritative public health information and the highest quality and most relevant content and context first”.

YouTube, meanwhile, said that its policies were in accord with guidance from the World Health Organisation (WHO).


The DCMS Committee’s calls for legislative action appear unlikely to be met soon.

Last month, Lord Puttnam, Chair of the Lords Democracy and Digital Committee, said he feared that the Online Harms Bill may not come into effect until 2024.

After DCMS minister Caroline Dinenage told his committee that the government would not commit to bringing a draft bill to parliament before the end of 2021, Lord Puttnam told the BBC’s Today programme: “It’s finished.

“Here’s a bill that the government paraded as being very important – and it is – which they’ve managed to lose somehow.”

On the bill’s possible adoption in 2024, he said that would be “seven years from conception – in the technology world that’s two lifetimes”.


