The UK should force social media companies to assess and report the harm caused by their algorithms, a parliamentary committee has recommended ahead of new laws to improve online safety.
The recommendations came days after the photo-sharing app Instagram said it would allow its users to switch to a chronological feed of posts rather than one ordered by its algorithms, which have recently been criticised for promoting harmful content.
The committee was responding to a draft of the Online Safety Bill, due to become law in spring 2022, which aims to protect children and adults from harmful and illegal internet content, including hate speech and material that promotes child sexual abuse, violence or terrorism.
After hearing from 50 witnesses and receiving more than 200 written submissions, the committee said that social media companies had exacerbated the “presence, spread and effect of harms” by designing algorithms that prioritised above all user engagement, such as commenting, sharing or liking posts.
The cross-party joint committee, set up to scrutinise the bill, also recommended that content promoting self-harm, as well as cyberflashing (the sending of explicit sexual images without consent), be made illegal. It added that the definition of hate speech should be updated.
Damian Collins, chair of the committee, said that internet companies should name the decision makers accountable for online safety.
“The company has to say this is the group of people that is responsible,” he said. “There’s no excuse. We still don’t know what Mark Zuckerberg [chief executive of Meta] knows. It’s his company but what is he basing those decisions on? What is he prepared to do on this?”
Under the draft bill, Ofcom, the media regulator, will be given responsibility for regulating tech giants and will be able to sanction them with fines of up to £18m or 10 per cent of annual global turnover.
Earlier this year, prime minister Boris Johnson signalled the UK government’s desire to take a tougher stance on tech regulation.
“It’s time the online giants realised that they simply cannot think of themselves as neutral pieces of infrastructure,” he told the liaison select committee. “They are publishers and they have responsibility for what appears on their systems, and the online harms bill is designed to give effect to that distinction.”
Online advertising scams should also fall under the legislation, the report said, but in October the digital and culture secretary, Nadine Dorries, said she had been prevented from including them by “legal advice received”.