News

DCMS Chair calls for tougher measures to hold platforms carrying misinformation to account

Posted on: April 30, 2020 by Mariella Brown

As the DCMS hearing on misinformation and fake news took place today (April 30, 2020), Julian Knight MP called for sufficient parliamentary oversight of any independent regulator.

In a post on Politics Home today, Julian Knight MP, Chair of the Digital, Culture, Media and Sport (DCMS) Committee, said the committee ought to have a statutory power of veto over the regulator’s chief executive.

Citing the misinformation perpetuated during the current Covid-19 crisis, Knight said the ‘infodemic’ is particularly acute because communities are dependent on social media to stay connected with friends and loved ones.

Knight added: “To save lives and protect the NHS, we must make sure we can trust the information we receive. It’s time for Silicon Valley to play its part.”

This sentiment was echoed today in the questioning of academic researchers and representatives of social media giants in the DCMS Sub-committee hearing on Online Harms and Disinformation regarding Covid-19.

Stacie Hoffman, Digital Policy and Cyber Security Consultant at Oxford Information Labs, said governments should require social media companies to be more transparent about the fake news they are taking down so that researchers can analyse it further. According to Hoffman’s research, although social media companies are taking action against Covid-19 fake news, there is a lack of oversight of the removed content.

Dr Claire Wardle, Co-Founder and Director of First Draft News, explained that during the pandemic people are physically disconnected and therefore connect by sharing information. In some cases, “people are inadvertently sharing false information believing that they’re doing the right thing.”

Wardle said her organisation’s research shows the most effective thing platforms can do is build in friction.

She pointed to WhatsApp’s decision to limit the forwarding of chain messages, a mechanism that forces people to think about what they are sharing. As of yesterday, Wardle said, the change had cut virality on WhatsApp by 70%.

Following the evidence set out by the researchers, the second part of the session turned to ways of tackling misinformation.

When asked what proportion of ‘bot’ accounts were supplying fake Covid-19 news, Twitter’s UK Head of Government, Public Policy and Philanthropy, Katy Minshall, said it would be misleading to give a definitive answer given the complexity of identifying fake accounts.

However, when asked whether Twitter would consider using passports or driving licences to verify the identities behind social media accounts to combat this, Minshall was adamant this was not a step Twitter would wish to take.

Facebook was also questioned on the defences it has put in place to combat fake news during the coronavirus crisis.

Facebook’s UK Public Policy Manager, Richard Earley, was keen to point out the difficulty of censoring fake news. “Who is the correct authority to decide what is disinformation and what isn’t?” he said.

But Earley added that Facebook was taking action against fake news through third-party fact-checkers, and said a major focus for the company was connecting people to sources of authoritative information.

Earley said that when a fake post appearing in a user’s news feed is covered by a Facebook ‘content cover’ warning of potentially false content, 95% of people do not go on to click through it.

Earlier this month, Facebook introduced a notification system that alerts users who have seen harmful misinformation since removed from the platform, redirecting them to accurate coronavirus information and myth-busting Q&As supplied by the World Health Organisation.