MPs on the Digital, Culture, Media and Sport (DCMS) Online Harms sub-committee have reiterated the need for enhanced safeguards to protect freedom of expression in the Online Safety Bill.
In a committee hearing earlier today (Tuesday 1 February 2022), MPs questioned the Minister for Tech and the Digital Economy, Chris Philp MP, on the bill and called for greater safeguards to ensure that algorithms used to comply with any new legislation do not “stifle freedom of expression”.
The Society of Editors has previously expressed concern that the draft bill contains insufficient safeguards to ensure that valid news content is not swept away by the use of algorithms. A report by the DCMS sub-committee published last week raised a similar concern that the use of algorithms to monitor content could result in “excessive take-down” of legitimate content.
Responding to a question from Jane Stevenson MP, who highlighted concerns that the “legal but harmful” content requirements in the draft bill could see a reliance on algorithms that “stifle freedom of expression and freedom of speech”, Philp said that protections for journalistic content would be “baked into the bill”.
He said: “In the Bill, as drafted, there are some quite strong provisions that relate to freedom of expression which is obviously very important. All service providers, big and small, will have to have regard to freedom of expression when implementing their safety duties…in addition to that, “category 1” services also have express duties to protect democratic and journalistic content. In considering content of democratic and journalistic importance they must consider the public interest when they are weighing that against any harm that may be caused. Those protections for free speech are baked into the bill already.”
In addition, Ofcom, in its role as regulator, would be given powers to ensure that platforms are meeting their duties effectively, he said.
He added: “Ofcom are going to get resourced-up. There is a funding package over the first three years amounting to £110m, which is partly for our internal resources but mostly for Ofcom. They will be requiring the social media firms, the large ones in particular, to deliver their new duties and the social media firms will have to resource accordingly or they will fall foul of those duties. If they don’t meet those duties, they will be subject to regulatory action. The homework of the social media firms will be getting marked by Ofcom and they will be getting fined if they don’t meet it.”
Asked whether the government had considered the committee’s suggestion that providers be required to appoint designated compliance officers, similar to those required in financial services regulation, the Minister said that, alongside proposals to make named individuals liable for non-compliance, Ofcom would have the power to request information from platforms, including information about their use of algorithms.
Philp said: “The named individual, who will be an employee of the social media firm, will be responsible for delivering up information and Ofcom will have powers of inspection and audit. Ofcom can go in themselves and demand information and poke around and get reports and get hold of copies of the algorithms – whatever is required.”
Alongside this, the bill would place a duty on platforms to assess the impact of their policies on freedom of expression, said Kate Morris, Head of the Online Harms Regulatory Framework.
She said: “The category 1 services, who are the only ones that have legal but harmful duties, have to undertake an assessment of the impact of their policies on freedom of expression and then publish it. That provides much greater transparency around what they are doing to protect free speech.”