The Society of Editors (SoE) says the government should resist the urge to rush towards censorship of the internet in a bid to combat disinformation and fake news.
The warning comes after MPs called on the government to rush through legislation aimed at combating misinformation by threatening digital platforms with fines, bans and even imprisonment if they do not do more to prevent the spread of harmful opinions.
The DCMS Committee urged the government to make haste in revealing details of its proposed Online Harms Bill and to include steps against the spread of fake news.
In its report, Misinformation in the COVID-19 Infodemic, the committee details evidence on a range of harms, from dangerous hoax treatments to conspiracy theories that led to attacks on 5G engineers. It added that online misinformation about Covid-19 was allowed to spread virulently across social media without the protections offered by legislation promised by the government 15 months ago.
But the SoE has repeated its warnings that attempts to clamp down on misinformation would be likely to lead to censorship, attacks on the UK’s free media, a loss of freedom of expression and the inevitable creation of an Orwellian ‘Ministry of Truth.’
“It would be far better if the government took steps to support the role of the mainstream media in counteracting fake news rather than attempting Canute-like to hold back the tide of misinformation through threats and censorship,” commented the SoE’s executive director Ian Murray.
“There is a huge amount of work the government can do in tackling such online harms as threats to children, abusive behaviour, the promotion of self-harm and other areas where direct action could be taken. No one doubts there is misinformation and fake news on the web, but any attempt to tackle it is fraught with the dangers posed by censorship.
“Once it is decided that some opinions are not acceptable or true then inevitably someone or some body must be chosen to decide what is truth and what is not. In a world where opinions can change even at the highest level – the debate over the wearing of masks during the pandemic is an obvious example – this is an impossible task. No solution can be free from bias, prejudice of thought and political interference. It is a road fraught with peril, which is why the DCMS Committee appears to have decided to concentrate on other areas of online harms where results can be achieved.”
Murray added that, if threatened with severe penalties, the digital platforms were likely to resort to analytics and AI to monitor for possible problems and remove content with little or no analysis. As a result, large areas of legitimate debate would be silenced.
“The government has said that the new bill is not designed to threaten the media and journalistic content, but that will count for little if the machines are deciding what can be published online. And the creation of a body of any form that will decide what are acceptable truths will set a dangerous precedent,” said Murray.
“What is needed to combat false information and rumour is knowledge of the facts as they are available, presented in a manner that retains the trust of the public. The mainstream media with its regulations and adherence to the rule of law is the best antidote to misinformation.”
In its report the DCMS Committee called for the Government to make a final decision on the appointment of the Regulator now.
Julian Knight MP, Chair of the DCMS Committee, said: “We are calling on the Government to name the Regulator now and get on with the ‘world-leading’ legislation on social media that we’ve long been promised.
“The proliferation of dangerous claims about Covid-19 has been unstoppable. The leaders of social media companies have failed to tackle the infodemic of misinformation. Evidence that tech companies were able to benefit from the monetisation of false information and allowed others to do so is shocking. We need robust regulation to hold these companies to account.
“The coronavirus crisis has demonstrated that without due weight of the law, social media companies have no incentive to consider a duty of care to those who use their services.”
The report concluded:
- Online misinformation about Covid-19 was allowed to spread virulently across social media as a result of delays to online harms legislation
- Evidence that tech companies benefited from the monetisation of misinformation and allowed others to do so
- Light-touch approach advocated by Government ‘insufficient’ to tackle tide of misinformation
- Legislation must go beyond requiring platforms to enforce policies that are not fit for purpose
- Efforts by tech companies to tackle misinformation through warning labels or tools to correct the record have fallen short
The committee’s key recommendations:
- Government should publish draft legislation – in part or in full – alongside the full consultation response to the White Paper this autumn if a finalised Bill is not ready
- Urges the Government to finalise the Regulator now, noting Ofcom’s expedited work on misinformation in other areas of its remit during this time of crisis as an argument in its favour
- New regulator should be empowered to examine the role of user verification in the spread of misinformation and other online harms
- Ministers should set out a comprehensive list of harms in scope for online harms legislation, rather than allowing companies to do so themselves or to set what they deem acceptable through their terms and conditions. The Regulator should instead have the power to judge where these policies are inadequate and make recommendations accordingly
- Government must empower the Regulator not only to ensure that tech companies enforce their own policies, community standards and terms of service, but also to ensure that these policies are themselves adequate in addressing the harms faced by society
- The Regulator should be empowered to hand out significant fines for non-compliance. It should also have the ability to disrupt the activities of businesses that are not complying, and ultimately to ensure that custodial sentences are available as a sanction where required
- Government, not the Regulator, should bring forward an evidence-led process to decide which harms would be covered by legislation. Clearly differentiated expectations of tech companies for illegal content and ‘legal but harmful’ content should also be established
- Call for Government to urgently develop a voluntary code of practice to protect citizens from the harmful impacts of misinformation and disinformation prior to legislation
A full list of recommendations can be found in the report.