We must clamp down on intolerable online behaviour

Article written for the New Statesman - published 7 February 2022

The last time I wrote for the New Statesman on Big Tech and internet safety, the joint committee's inquiry into the draft Online Safety Bill was in full swing. Over the course of five months, the committee received more than 200 written evidence submissions and took more than 50 hours of oral testimony from companies, ministers, whistleblowers, campaigners, lawyers and professors on how exactly the UK could become “the safest place in the world to be online”, as the bill promises.

We published our 60,000-word report on 14 December 2021, and I’m proud to say that all of our recommendations were unanimous: a clear example of how willing not only the Lords and the Commons, but all political parties, are to work together to finally hold social media companies to account.

We had a simple yet daunting task: scrutinise the draft Online Safety Bill as published by the government last summer, and make sure it was fit for purpose. For me, as for many others, this bill has sometimes felt like a long time coming; the Digital, Culture, Media and Sport (DCMS) committee inquiry into disinformation and “fake news” in 2018 kick-started the process, recommending that platforms be held accountable to a UK-based regulator. But it’s also vital that we get the legal framework right, as the UK will be the first country in the world to legislate so comprehensively to tame the digital wild west.

Throughout the inquiry we heard concerns from all sides: that the bill was difficult to interpret for businesses and users alike; that it would give too much power to the government, or social media platforms themselves, to police free speech; and that it was unclear how it would tackle some of the most egregious harms seen in recent years, such as the disgusting racial abuse after the Euros final, terrorists and human traffickers using social media to build networks, the incitement to violence at the US Capitol, or even the promotion of self-harm among teenagers.

At the heart of our recommendations are two core principles: that online platforms and search engines should be held accountable for the design of their systems and the way they promote content; and that regulation should be governed by democratic principles established by parliament, not just by terms of service written in Silicon Valley.

We think the best way to do this is to give the independent regulator, Ofcom, the power to set mandatory codes of practice, based on existing British laws, governing how social media companies should make sure their systems and processes don’t promote content and activity that would never be acceptable offline.

These would provide clear guidance to social media platforms on how to deal with content and activity that promotes and glamourises terrorism, facilitates child abuse, fuels online fraud, or amplifies discrimination based on the characteristics protected in equalities legislation. Other codes of practice would make sure platforms promote digital literacy, freedom of expression, and above all, safety by design – ensuring devices and software are developed and designed with user safety in mind.

Nobody will benefit from a one-size-fits-all approach. This is why the committee agreed that Ofcom should also conduct a general assessment of all platform features that we heard were risk factors: live location, infinite scrolling, one-click sharing, artificial intelligence (AI) moderation, end-to-end encryption, unmoderated groups, and anonymity, to name a few.

Based on these, Ofcom will come up with different risk profiles, and match individual platforms to them. Social media companies will have to manage the specific risks that have been identified on their platforms and search engines, following minimum standards set by the regulator. This will guarantee that those businesses that have few risk factors don’t have to shoulder the same burden as those with much higher risk factors.

Platforms will have to show how they are following the codes of practice and managing their own specific risks in regular transparency reports to Ofcom. If the regulator has doubts, it should be able to audit the companies, calling on external experts if needed. And if companies resist or refuse to engage, financial sanctions of up to 10 per cent of global turnover will come into play, and there could also be prosecutions. We endorsed the government’s proposal to bring forward criminal sanctions, and we think they should be directed towards a named “safety controller” within each tech company, responsible for compliance with the Online Safety Act.

The committee also agreed with the Law Commission that new offences need to be created. While many of the harms we want to act against can be resolved by applying existing laws online, practices have also evolved; exposing someone to sexual images without their consent, sending people with epilepsy flashing images, and glamourising self-harm are all new, but extremely damaging, phenomena.

We recommended that the Online Safety Bill be amended to directly include these as new crimes, again with platforms responsible for making sure they don’t amplify them. In an update on 4 February 2022, the government committed to adopting some of these recommendations and to seriously considering others. It also announced that it will state clearly on the face of the bill all existing offences that platforms will have to mitigate – another win for the joint committee.

Other measures we recommended include mandatory age assurance on all websites likely to be visited by children, automatic exemption for recognised news publishers and for content that’s in the public interest, and the establishment of a permanent joint committee to ensure democratic oversight of the new regime.

Together, as a package, we think these changes would significantly improve the bill, and ensure it strikes the fine balance between protecting freedom of speech and clamping down on behaviours that parliament has decided are intolerable in a free, democratic society. I hope the government will listen and adopt changes that we think will make the UK a shining example of smart regulation in the digital age.
