We cannot trust Nick Clegg and Facebook to police their own content

Article written for the Daily Telegraph - published 1 February 2021

Sir Nick Clegg has a bridge to sell America. Having spent most of his career, first as a British politician and now as a spokesman for Facebook, suggesting he offers a unique middle way between two false alternatives, he is now pretending that his pet project, the Facebook oversight board, is all that stands between us and either the complete destruction of citizens' free speech rights or the final descent of Western democracy into chaos and anarchy. The promise he makes most often to his American audience is "just trust us" - but why should we?

In 2019, as the United States presidential election approached, Clegg argued that Facebook had no choice but to set its own policies on political messaging and election campaigning, "in the absence of government regulation". Just trust us.

In the aftermath of the 2020 election, as disinformation spread like wildfire on Facebook's platforms, Clegg told an audience in India that the best enforcer regarding the harms created by Facebook was … Facebook itself. Regulation shouldn't try to "micro-manage" content but governments should instead insist on transparency for the systems that tech companies already have in place. Just trust us.

Writing in The Washington Post last month, Clegg shared his plan for then president-elect Joe Biden on how to "save what's left of the global internet". Together, Clegg argued, we can stop repressive governments from cracking down on free speech through "bilateralism" and international co-operation - co-operation with big tech. Just trust us.

For Clegg and Facebook, the watchdog best positioned to mitigate Facebook's damage on political discourse is the company itself. Facebook is always its own best monitor.

That's why we should be very wary of Clegg's latest promises that Facebook's self-appointed "oversight board" will hold Donald Trump accountable for his actions in inciting the Jan 6 Capitol riot, or that the Facebook oversight board can play any meaningful role at all in stemming hate speech, promotion of violence and disinformation on the site.

First, a step back: in 2020, Mark Zuckerberg and Facebook formally launched their oversight board, which is meant to adjudicate content disputes on the company's platforms. Facebook self-refers cases to the oversight board, which has 90 days to rule and make policy. The oversight board members, we learnt from The New York Times, are paid six-figure sums by Facebook for deliberations held behind closed doors.

Last month, Facebook announced that the oversight board would rule on a permanent ban of Mr Trump from the platforms. In a major PR blitz, Facebook has touted the oversight board as a transformational vehicle to address content moderation issues on the site. But you could drive a truck through the loopholes and gaps in accountability.

The oversight board can only rule on content bans after a user has exhausted appeals to Facebook. It cannot initiate its own investigations and is not duty-bound to hear a user's appeal. Indeed, of the 20,000 submissions by Facebook users, the oversight board took up only six cases. Too bad for the other 19,994 people who feel they were treated wrongly by Facebook. No day in court for you. The review takes up to 90 days, or 30 if it is "expedited". That means immediate harms on the platform - such as a US president inciting a violent coup - are outside the purview of the oversight board.

The oversight board doesn't adjudicate content on Facebook Live, or in Facebook groups. This renders swathes of content - often the most immediately harmful - impervious to its review. Even when the oversight board rules that it is right for a piece of content to be removed, Facebook is under no obligation to take down copies of the same material shared in the exact same context. Instead, it only needs to consider whether such an action would be "technically and operationally feasible".

Clegg disingenuously told Washington-based National Public Radio that "the touchstone principle for us, and people can agree or disagree with it, is this - is that where we feel there is speech on our platform where there is a link to an impending risk of real-world violence, then we act."

But Facebook didn't act for years, notoriously refusing to ban Trump even after he used the platform to warn Black Lives Matter protesters that "when the looting starts, the shooting starts" last May. The drumbeat of the "Stop the Steal" campaign, which asserted without any evidence that there had been mass fraud at the polls last November, ran extensively through Facebook from election night onwards, but it took a literal attempted insurrection at the US Capitol before Facebook acted. Its oversight board was powerless to do anything in that case, by virtue of its own founding mandate.

Even today, as the oversight board takes 90 days to decide the fate of Mr Trump on the platform, authoritarians and white supremacists are continuing to foment hate, misinformation and violence. Clegg says Facebook would "act" when it felt its policies on hate speech were broken; meanwhile, Steve Bannon continues to rile up his supporters on Facebook over false claims of a stolen election.

Clegg told The Washington Post that Facebook could be trusted to stop repressive speech, yet for years Facebook has met the speech requirements of nearly every repressive regime in the world while simultaneously failing to control harmful speech in democracies. That brings us back to Trump.

Clegg tells us that Facebook's oversight board is on the case - he is "very confident" it will affirm the company's decision to suspend Trump. Even if it does, what will be the long-term consequences of Facebook's social media networks having been pumped full of poison for so long?

Nick Clegg likes to lecture Americans, arguing often that the country was deeply divided long before Facebook's algorithms took hold. It's not Facebook's fault - it's yours. The truth is, Facebook can't self-regulate out of the mess it has made. If we want to stop division, misinformation and hate, we need to demand independent oversight and meaningful regulation of Facebook. We can't trust it or Nick Clegg again.

Copyright 2021 Damian Collins. All rights reserved
