New Statesman: What’s illegal offline must be illegal online, says Damian Collins

Article by Sarah Dawood for the New Statesman - published 15 December 2021

More online behaviour should be made illegal and fall under the scope of internet safety law, says Damian Collins, chair of the joint committee on the draft Online Safety Bill.

The joint committee is made up of MPs and peers and has now published its report of recommendations following evidence from the victims of online abuse, journalists, academics, tech companies, Ofcom, whistleblowers and the government. Once it becomes law, the Online Safety Act aims to increase internet safety by tackling illegal content such as terrorist propaganda and child abuse, and “legal but harmful” content, such as cyberbullying, while protecting free speech.

Social media companies and other websites where users interact will face fines, and could have their sites blocked, if they do not adhere to the rules, which will be overseen by the regulator Ofcom.

The committee’s recommendations aim to strengthen the powers of the bill and call for an end to the “Wild West online”, says Collins.

“What’s illegal offline should be regulated online,” he says. “For too long, Big Tech has gotten away with being the land of the lawless. A lack of regulation online has left too many people vulnerable to abuse, fraud, violence and, in some cases, even loss of life. The era of self-regulation for Big Tech has come to an end.”

Making more online behaviour illegal

The report suggests making more individual acts punishable by law, as recommended by the Law Commission earlier this year. These include cyberflashing; content promoting self-harm; sharing content with the intent of causing physical or severe psychological harm; and knowingly promoting false content with the intent to cause harm.

It also suggests that fraud and “scam advertising” should fall under the scope of the bill, while there should be an “automatic exception” for recognised news publishers to protect journalism.

The bill needs to be more specific about particular harms, Collins tells Spotlight; it has previously been criticised for being too vague and treating all online behaviour, from bullying to misinformation, the same. The committee recommends that Ofcom draw up mandatory “codes of practice” for websites to follow in different areas, with the ability to introduce additional codes in future to ensure the legislation stays up-to-date.

Expanding Ofcom’s powers

Currently, Ofcom has the power to fine companies and block websites. The committee is recommending criminal prosecution for company directors who fail to comply with the new laws, and strengthening Ofcom’s powers so it can access tech companies’ information and data. This would allow the regulator to investigate and audit companies more thoroughly. “[Currently], we don’t see this [data], unless it’s leaked by whistleblowers,” says Collins.

Tech companies should also have to conduct internal risk assessments to check for threats to user safety, including the potential harmful impact of algorithms. The Facebook whistleblower Frances Haugen recently spoke out about the dangerous impact of the tech giant’s algorithms, which she said prioritised hateful and false content in people’s news feeds.

Transparency over anonymous accounts

Banning anonymous online accounts has been supported by pro-safety critics and rejected by pro-privacy campaigners. The committee is calling for more “traceability” over anonymous accounts that abuse others or do something illegal, says Collins, meaning tech companies would have an obligation to give details of these users to the police. However, the committee is not recommending a ban on anonymity entirely.

“There are certain circumstances where it’s perfectly understandable that somebody wouldn’t want to post in their own name,” says Collins. “Someone who was themselves a victim of abuse and who wants to talk about their experiences, for instance. People who need anonymity to protect themselves should be able to do so, but those who use it as a shield to attack others should be identified if they break the law.”

Promoting user awareness

Websites should create a mandatory online safety policy for users to sign up to, similar to terms and conditions, says Collins: “This would be a reminder of what the platform’s policies are, and [will improve transparency]. It will tell users what they should expect, what they have a right to complain about and what the law requires.”

The report also suggests creating an ombudsman to deal with individuals’ complaints, and allowing people to bring court cases against social media companies. “This would give them another route to seek redress if they feel they’ve suffered as a consequence of the platform failing to meet their duties,” says Collins.

It is not yet known when the bill will become law, but a Department for Digital, Culture, Media and Sport spokesperson previously told Spotlight that the government is “committed to introducing the bill as soon as possible” following the report.
