Social media failed in duty of care over Capitol Hill riot

Article written for the Times - published 8 January 2021

We should be shocked by the scenes in Washington DC of protesters storming the Capitol building to try to stop Congress from certifying the election of Joe Biden as the next president. These are appalling scenes and an attack on a fair election in one of the world's great democracies. However, we should not be surprised that this has happened.

Donald Trump, and former aides of his such as Steve Bannon, had been calling for demonstrations and protests deliberately targeting 6 January in Washington DC. Bannon, on his War Room podcast, previously called for the heads of public officials like Dr Fauci to be placed on spikes, as they were in Tudor England. He cited how, in the American Revolutionary War, people were executed for collaborating with the British. He called that war a civil war, and many people think he has been inciting an uprising in America today. Donald Trump clearly refuses to accept the result of the Presidential election. He has brought forward no evidence to support his claims, and indeed every attempt to do so has failed, but nevertheless what has taken place in Washington over the last 48 hours is the result of him inciting people to rise up against that result, and in doing so to rise up against American democracy itself.

All of these things are serious in their own right, but they've also been made possible by social media. These rallies and demonstrations couldn't have been organized without it. Since November we've seen increasing numbers of Americans, particularly those who voted Republican, come to believe that the election was fraudulent and that Joe Biden didn't win. These baseless allegations and conspiracy theories have been spread through social media, deliberately targeting the people most likely to believe them. This presents a challenge for the social media companies. Is it okay to plan an insurrection on Facebook, and to invite people to join it on Facebook, with the platform only having to get involved once people start committing violent acts? It's time for wider recognition that harmful misinformation can be used as a weapon against democracy, and we've seen that played out over the last couple of days.

Facebook, Twitter and YouTube have now suspended Donald Trump from their platforms. This is the right action, but it has been too late in coming. If they had acted sooner, perhaps the deaths and violent behaviour that came from the 6 January demonstration could have been avoided. We are also yet to fully understand the long-term damage that has been done by allowing things to reach such a fever pitch before these companies decided to act.

The people who took part in this revolt on Capitol Hill, and broke into the buildings there, including the chamber of the Senate, had been persuaded that their actions were justified and that they were acting in the interests of President Trump. Their opinions will not change when Joe Biden is inaugurated as President on 20 January. No, they will carry on believing that the election was stolen, that there is a great conspiracy against America, and that they need to resist it. These problems will get worse if companies like Facebook don't act more responsibly against harmful disinformation. What we've seen is an event that was long talked about, long planned and largely organized through social media channels that these companies control, and therefore more should have been done sooner to intervene.

I believe this should be part of the action we want companies to take against disinformation on their platforms. It should not be left to the tech companies themselves to determine what counts as harmful disinformation, nor what action they should take against it. We need proper regulatory structures that set standards, insist they are followed, and audit the companies to check that they have done all they can to comply.

The UK’s online harms proposals can, I believe, do that. I think it's important for the legislation, due to come before Parliament this year, to include powers giving a regulator the authority it needs to set standards and to inspect whether they are being met. In America, I hope the new administration will look to reform the section 230 provisions of the 1996 Communications Decency Act, which currently give the social media companies protection from liability for any decisions they make about moderating content, and instead establish that if they are hosting harmful content, or if their systems are directing people towards it, then they have a liability for it. I believe we should look at what's happened in Washington this week and recognize that the social media companies failed in their duty of care to democracy by allowing this to happen, largely unchecked. 2021 should be the year when we start to put this right.

Copyright 2021 Damian Collins. All rights reserved
