The problem of disinformation and misinformation available on social media platforms must be addressed in forthcoming online harms legislation, MPs have urged.
Speaking as part of a debate on Online Harms in Westminster Hall this week, Labour MP Holly Lynch (pictured) said that, alongside online harms such as terrorism, child sexual exploitation and abuse, bullying and inciting or assisting suicide, the government must do more to tackle the spread of misinformation online, including the role of paid ads in promoting false narratives.
She said: “Online social media platforms have said far too often that they only provide the platform and can only do so much in relation to the content that is shared upon it. Where that holds no water at all is where paid ads are concerned. It is a glaring omission from the White Paper that it does not consider misinformation and disinformation which can not only be shared widely for free but can be promoted through online advertising.”
Leading the debate, Lynch pointed out that social media giants such as Google and Facebook continue to dominate the digital advertising market, and argued that, while advertising in print and on broadcast platforms is regulated through Ofcom and the Advertising Standards Authority, it is essential that regulation is extended to online advertising as a matter of urgency.
Lynch went on to warn that the prevalence of false and misleading information on social media platforms, including supposed causes of the Covid-19 pandemic and false narratives around related vaccines, meant that accurate news and information published by the mainstream media was being crowded out of the marketplace.
She said: “So-called clickbait advertising and the monetisation of items dressed up as news – with the most outrageous and sensational teasers inevitably receiving the most clicks and income – means that credible news from real journalists, with integrity to both their conduct and content, is being driven out of this space. The online business model doesn’t work for those that play by the rules.”
Lynch’s concerns were also echoed by Labour MP Chris Elmore, who said that legitimate concerns existed in relation to misinformation around false cures for Covid alongside vaccines and treatments.
He said: “I have raised anti-vax issues right across the summer, and as the pandemic started. In the last year an additional 7.8 million people have visited anti-vax Facebook pages or followed the Twitter, Instagram or YouTube accounts of organisations that are trying to make a quick buck out of people’s fears by selling false vaccines and treatments, asking them not to consult a doctor if they have any symptoms: ‘Don’t get tests because you can cure these things with different types of herbal tea’.”
Pointing to the Yorkshire Evening Post’s Call It Out campaign, Lynch also warned that better takedown measures were needed to tackle abuse online – including abusive and threatening messages sent to journalists.
She said: “One of the reasons the Yorkshire Evening Post was so motivated to launch the Call It Out campaign was realising the impact of the barrage of online abuse directed predominantly, but not exclusively, towards its female journalists. Editor Laura Collins, who I commend for her leadership on this issue, told me this week that the sentiment of one comment on Facebook responding to an article about the local restrictions in Leeds was not uncommon: it said, ‘Whoever is publishing these articles needs executing by firing squad’. The newspaper reported it to Facebook on 28 September and, nine days on, has yet to receive a response.”
Responding to questions during the debate, Caroline Dinenage MP, Minister of State for Digital and Culture with responsibility for online harms, confirmed that the government’s full response to the Online Harms White Paper would be published in the coming weeks.
The Society has previously called for specific safeguards to be included in Online Harms legislation to ensure news publishers are not damaged by the proposals. Foremost among the Society’s concerns is that a clear and absolute exemption for mainstream media content be included in any legislation. The Society has urged caution around attempts to tackle misinformation and disinformation online as part of Online Harms legislation and remains wary of any attempt by officials to create a body that decides what is truth and what is misinformation. The Society has suggested that it would be far better to concentrate efforts on supporting and promoting the work of the UK news media.