The inner workings of Facebook are set to be laid bare by a whistleblower once again on Monday as MPs in the UK prepare to take evidence.
Former Facebook employee Frances Haugen has made numerous blistering claims about the tech giant since releasing thousands of pages of internal research documents she secretly copied before leaving her job in the company’s civic integrity unit.
She has already spoken out about the social network across the pond on television and before politicians, alleging Facebook’s platforms ‘harm children, stoke division and weaken our democracy’, and that it refuses to change its products because executives elevate profits over safety.
Ms Haugen has also accused the tech giant of being aware of the harm Instagram could cause to some teenagers and their body image, and said the firm had been dishonest in its public fight against hate content and misinformation, hiding research showing its platforms amplify such content.
This afternoon, she will face questions from a UK parliamentary committee scrutinising the draft Online Safety Bill, as the Government works out how to go about regulating tech firms and social media.
Facebook founder Mark Zuckerberg has rejected the claims made by Ms Haugen, saying her attacks on the company were ‘misrepresenting’ the work it does.
He said the company ‘cares deeply about issues like safety, well-being and mental health’ and that Ms Haugen’s recent evidence to a US congressional committee ‘just doesn’t reflect the company we know’.
‘At the heart of these accusations is this idea that we prioritise profit over safety and well-being. That’s just not true,’ he added.
Facebook is reportedly planning to change its corporate name in an apparent bid to distance its wider business from the slew of controversies of recent years.
One of its latest big ideas is the so-called metaverse, a 3D online world the firm wants to lead the way on building, in which people can meet, play and work virtually, often using virtual reality headsets.
Speaking to BBC Radio 4, Damian Collins, chairman of the Joint Committee on the Draft Online Safety Bill, said: ‘Her [Ms Haugen’s] central argument is that when given the choice between harmful content that sometimes drives engagement and keeps people on the platform, or protecting people, Facebook favours engagement, and that is part of the problem here.’
Writing in the Sunday Telegraph, Monika Bickert, Facebook’s vice president of content policy, said the firm has a commercial incentive to remove harmful content from its sites because ‘people don’t want to see it when they use our apps and advertisers don’t want their ads next to it’.
‘Contrary to recent claims, our research doesn’t conclude that Instagram is inherently bad for teenagers,’ she wrote.
‘While some teens told us Instagram made them feel worse when they were struggling with issues like loneliness, anxiety and sadness, more teens told us that Instagram made them feel better when experiencing these same issues.
‘But if even one young person feels worse, that’s one too many, so we use our research to understand bad experiences and prevent them.’