Facebook must not be allowed to dictate how it gets regulated

My piece for WIRED UK, published 20 February 2020

What happens on Facebook matters. The social network connects billions of users around the world, and is for many of them their principal way of receiving news and engaging with their community. When Facebook’s systems are exploited by Russia to interfere in elections, by extremists to live-stream their atrocities, or by armed gangs to carry out attacks on their rivals, we are right to ask what more could have been done to prevent this.

Facebook is an advertising business that gathers huge amounts of data about everything its users do, so we are right to challenge the company about why more can’t be done to prevent people sharing images of child abuse, videos that seek to recruit people into terror organisations, or content that might encourage people to self-harm.

Just two years ago, the Cambridge Analytica scandal led to widespread concern about how users’ personal Facebook data could end up in the hands of a company whose chief executive boasted in an undercover film about the effectiveness of hiring Ukrainian sex workers to entrap politicians.

At the time, Facebook said it was carrying out an audit into other developers who might also have acquired data in breach of the company’s rules and without users’ informed consent. We have never seen any detailed reporting on the progress of this important investigation.

All these issues merely scratch the surface of the general level of concern about the standards and ethics of companies such as Facebook, and provide the background to Mark Zuckerberg’s visit to Europe this week.

Zuckerberg – who has on multiple occasions avoided open scrutiny by parliaments, instead favouring conversations with policy officials behind closed doors – has come to Europe claiming to be the advocate of greater regulation for big tech companies. It will have surprised no-one that he doesn’t believe that any of these new rules need apply to Facebook.

In his article in the Financial Times on February 16, Zuckerberg called for “more oversight and accountability” on decisions around content moderation on social media. However, in Facebook’s case this accountability would be guaranteed by a new “independent” Oversight Board, to be launched in the summer. This Board is being recruited by Facebook, and its remit is limited to considering whether the company has implemented its own policies correctly.

As Article 7 of the Oversight Board’s charter makes clear: “The board will not purport to enforce local law” when reviewing Facebook’s content-moderation decisions. What’s more, the Oversight Board will not be allowed to rule on “content on WhatsApp, Messenger, [or] Instagram Direct”. This comes as Facebook prepares to introduce end-to-end encryption across its messaging tools, further limiting any responsibility it might have to moderate them. This policy has been described by chief constable Simon Bailey, the UK National Police Chiefs’ Council lead for child protection, as one that “will knowingly put the safety of children at risk – ignoring the warnings of police, charities and experts across the world.”

A policy paper published by Facebook to coincide with Zuckerberg’s trip to Europe, entitled “Charting a way forward: online content regulation”, makes it even clearer that the direction he wants to take us in is a road to nowhere. The paper argues against imposing liability on social media companies for failing to remove illegal and harmful content. It also warns against creating mandatory standards for content removal and against tougher enforcement of citizens’ data rights. Instead it proposes that there should be “periodic public reporting of enforcement data”, based of course on self-declaration by the company, without any independent external audit.

Facebook’s paper calls for global standards on internet regulation, knowing of course that reaching a consensus between the USA, China and Europe on these issues will be impossible. It is another clear tactic for kicking all of this into the long grass.

Facebook’s basic pitch on regulation is that social media companies should be required to have “systems in place” to deal with illegal and harmful content, but that these policies should be written and administered by them, the social media companies. In Zuckerberg’s ideal world, the only role for government regulation would be to ask Facebook about the existence of these policies without the power to fully audit them. If that’s what Zuckerberg has come to sell, he might as well have stayed in California.

At the heart of Facebook’s defence against the threat of regulation is the protection of freedom of speech. Yet, in his lecture at Georgetown University last year, Mark Zuckerberg acknowledged that “even American tradition recognises that some speech infringes on others’ rights.” Facebook’s preferred method for dealing with content that might cause harm is to downgrade its prevalence, making it harder to find rather than removing it. This seems to imply that people largely see such content because they search for it.

It’s an approach that ignores the impact of co-ordinated networks of Facebook groups with millions of users pushing content through the system, or the underground economy of Facebook Page administrators who promote other people’s content for lower fees than the company itself charges through its advertising tools. Nor do I believe that freedom of speech is the same as freedom of reach. Someone might have the right to share their opinions, but they shouldn’t have the automatic right to hijack social media in order to broadcast them to millions of people, multiple times a day, at the click of a button.

These issues require a more proactive response from Facebook, and it shouldn’t just be left to them. It should be for governments to establish clear legal liabilities for social media companies to act against harmful and illegal content when it is being curated through their systems, by their own algorithms.

These new regulations need to be overseen by truly independent regulators with legal powers to inspect and audit the companies and ensure they are complying effectively. The UK government has set out its approach to this in the Online Harms White Paper, and other countries are now looking at similar measures. Once again, Facebook’s response has been to offer too little too late.
