Social media firms must take responsibility for harmful Covid-19 disinformation

Published in the Press Gazette on 22 June 2020

We are living through the first major public health emergency in the age of social media disinformation, and the disinformation spreading online is not just about politics and elections; it is about things that can affect people's health. As we have heard from doctors who have spoken out on this issue, there are real-world consequences when fake news spreads.

There has always been a tension between the things that people say and the harm those things can do to others, and medical disinformation is a particularly clear example of this.

People are entitled to their opinions. But when they are knowingly and maliciously spreading false information that can lead to real-world harms, including people losing their lives, then I think we should see that as offensive. And we should expect social media companies to act on it.

I think at the heart of this is a challenge to the business models of these companies, and the need for them to be held responsible and accountable for the things that happen on their platforms.

Ultimately, they are responsible for what happens on their platforms. And the reason for that is that their tools help to curate and push content to the people who use them.

I was very interested to see the Wall Street Journal reporting on an internal report that Facebook prepared in 2016, which highlighted the fact that “64% of all extremist group joins are due to our recommendation tools”, and on a further report two years later which said that “our algorithms exploit the human brain’s attraction to divisiveness”. So these tools are at the heart of it, and therefore the companies need to take some responsibility.

That’s why I think it’s wrong for the companies to be given safe harbour – immunity from liability for the harmful content on their platforms.

And therefore you need to make sure the power does not just rest with them. It should not be for them alone to decide, and we should not be left without the ability to audit what they’re doing and to take action if we feel they’re getting it wrong.

This is so important because social media has become the principal way in which many people get their news and information. People no longer rely on newspapers or news broadcasts for their facts. And if what they see online mixes truthful, harmful and dishonest content together in such a way that users simply don’t know what to believe, that is not good enough. I don’t think we should accept it.

I fear that the companies will not voluntarily subject themselves to oversight, and that it will therefore fall to legislators in parliaments around the world to develop these codes. And I’ve been very encouraging and supportive of the work being done in the UK to legislate against online harms – to designate categories of content that are harmful, and to create independent oversight of tech companies to make sure they act against harmful content by removing and downgrading it.

I think starting with medical disinformation is going to be so important. Because as we deal with coronavirus, as we eliminate it from our lives, one of the key tools for that is going to be having a vaccine. And one of the problems that we know we’ll have to fight will be anti-vaccine campaigners – exploiting social media’s algorithms, trying to dissuade people from taking a vaccine that could save their lives.

And I don’t think it could get any more important than that.
