Article by Sophie Barnett for LBC - published 25 October 2021
The committee is scrutinising plans for an Online Safety Bill and is also considering new laws to better regulate technology companies.
Frances Haugen, a former data scientist at Facebook, left the business earlier this year after releasing thousands of sensitive documents about the company's practices.
She told MPs on Monday that failures at the social media giant are like an "oil spill".
She also said Facebook "unquestionably" makes hate in society worse, adding that the social media platform is dangerous for teenagers.
Ms Haugen claimed Instagram - which is owned by Facebook - allows "bullying to follow children home".
"Facebook's own reports say that it is not just that Instagram is dangerous for teenagers, it is actually more dangerous than other forms of social media," she said.
"Instagram is about social comparison and about bodies. It is about people's lifestyles and that is what ends up being worse for kids."
She said that before the advent of social media, children "used to be able to disconnect from abuse".
"Facebook's own research says now the bullying follows them home. It goes into their bedrooms," she said.
"The last thing they see at night is someone being cruel to them, the first thing they see in the morning is a hateful statement.
"They don't get a moment's peace."
As she gave evidence about the company - owned by Mark Zuckerberg - she said she was "extremely worried" about the condition of our societies.
She said the platform was "hurting the most vulnerable among us" and leading people down "rabbit holes".
Facebook has denied the accusations.
"Facebook has studied who has been most exposed to misinformation and it is ... people who are socially isolated," she told the select committee.
"I am deeply concerned that they have made a product that can lead people away from their real communities and isolate them in these rabbit holes and these filter bubbles.
"What you find is that when people are sent targeted misinformation to a community it can make it hard to reintegrate into wider society because now you don't have shared facts."
The former employee said Facebook "never set out to prioritise polarising, divisive content", but claimed it was a side-effect of choices the company had made.
She has insisted social networks need more oversight.
"I think regulation could actually be good for Facebook's long-term success," she told the select committee.
"It would force Facebook back into a place where it was more pleasant to be on Facebook. And that could be good for the long-term growth of the company."
She also met Home Secretary Priti Patel, who tweeted that "tech companies have a moral duty to keep their users safe".
On Thursday, senior representatives from big tech companies including Facebook, Twitter, Google, YouTube and TikTok will give evidence to MPs and peers on the Joint Committee on the draft Online Safety Bill.
They will be asked about their current approaches to online safety and how they may be affected by the draft legislation.
Chair of the Joint Committee, Damian Collins MP, said the Online Safety Bill will establish a "new era of regulation for tech platforms".
He said this will make them accountable to an independent body for the content they host and for the active role their recommendation systems play in promoting it to other users.
"We want to make online products like search and social media safer and ensure that tech companies have effective systems in place to mitigate the spread of harmful and illegal content," he said.
"We look forward to questioning Google, Facebook, Twitter and TikTok on how they intend to comply with the requirements of the Online Safety Bill."