Article written for Bright Blue's spring 2020 edition of Centre Write
The media most popular with some of the most vulnerable people in society, our children, are the least regulated. Most social media services require users to be 13 to register, yet there are no effective age verification tools when someone creates an account on Facebook, Instagram or Snapchat. They rely entirely on self-certification, which means it is as easy for a ten-year-old girl to pretend she is 18 as it is for a 50-year-old man to claim he is 15. Whilst there is a ‘YouTube Kids’ service, the media regulator Ofcom reports that, for children over the age of five, the main YouTube platform is their favourite video streaming service. Young adults aged between 18 and 34 watch more YouTube on average each day than all of the traditional free-to-air broadcast channels combined, and even among all adults YouTube is the third most popular service, behind only BBC1 and ITV1.
Over the years we have developed codes of practice for broadcasters to ensure good standards are met, and introduced the 9pm watershed to try to keep younger audiences away from harmful content. For most people today, these rules are about as relevant as the Corn Laws. Yet why should we accept that, even though media habits are changing, our oversight and regulation of the content people consume every day should stay the same? This has created a world where a small community radio station with a few thousand listeners requires a licence from Ofcom, but a social media channel with millions of individual subscribers does not.
That’s why I want us to act now to make the big tech companies more responsible, in law, for the content that is served to users on their platforms. They should have a legal duty of care, overseen by a regulator with the power to investigate and act against them when things go wrong.
In response to our Digital, Culture, Media and Sport Select Committee inquiry into disinformation and fake news, the Government published its Online Harms White Paper last April and has since released its response to the subsequent public consultation. In our Conservative manifesto, the Prime Minister, Boris Johnson, committed to “make the UK the safest place in the world to be online”, protecting children and the most vulnerable in our society from abuse, whilst also going after terrorist content. We will always need to balance the need for regulation against the imperative of freedom of speech, which is a pillar of our democracy. But freedom of speech is not the same as freedom of reach. People have the right to express their opinions, but I don’t believe that gives them the same unchecked right to use the tools of social media to broadcast those views proactively to millions of people, multiple times a day, at the click of a button.
Boris Johnson rightly says that we can make the UK the safest place in the world to be online. Sensible principles that strike the balance between protecting users and preserving freedom of speech, determined and overseen by an independent regulator such as Ofcom, could allow us to do just that.