Facebook whistleblower Frances Haugen took aim at her former employer, accusing the social media giant of “tearing our societies apart” in an interview with CBS’ 60 Minutes.
“The version of Facebook that exists today is tearing our societies apart and causing ethnic violence around the world,” Haugen told 60 Minutes Sunday.
Haugen accused the company of placing profit above the public good, despite assurances from Facebook leadership that the company was working to make the platform safe.
Here are some of the damning accusations from Haugen’s interview:
Facebook knew of Instagram harm to teens
The document leak that had the greatest impact was a series of research slides showing that Facebook’s Instagram app was damaging the mental health and wellbeing of some teenage users: 30% of teenage girls said the app made their dissatisfaction with their bodies worse.
She said: “And what’s super tragic is Facebook’s own research says, as these young women begin to consume this eating disorder content, they get more and more depressed. And it actually makes them use the app more. And so, they end up in this feedback cycle where they hate their bodies more and more. Facebook’s own research says it is not just that Instagram is dangerous for teenagers, that it harms teenagers, it’s that it is distinctly worse than other forms of social media.”
Facebook has described the Wall Street Journal’s reporting on the slides as a “mischaracterization” of its research.
Facebook promotes hate and division
Haugen said the company had contributed to ethnic violence, a reference to Burma. In 2018, following the massacre of Rohingya Muslims by the military, Facebook admitted that its platform had been used to “foment division and incite offline violence” relating to the country. Speaking on 60 Minutes, Haugen said: “When we live in an information environment that is full of angry, hateful, polarising content it erodes our civic trust, it erodes our faith in each other, it erodes our ability to want to care for each other. The version of Facebook that exists today is tearing our societies apart and causing ethnic violence around the world.”
Facebook dissolved its Civic Integrity unit after the 2020 election and before the Jan. 6 Capitol insurrection
The 6 January riot, when crowds of rightwing protesters stormed the Capitol, came after Facebook disbanded the Civic Integrity team of which Haugen was a member. The team, which focused on issues linked to elections around the world, was dispersed to other Facebook units following the US presidential election. “They told us: ‘We’re dissolving Civic Integrity.’ Like, they basically said: ‘Oh good, we made it through the election. There wasn’t riots. We can get rid of Civic Integrity now.’ Fast-forward a couple of months, we got the insurrection. And when they got rid of Civic Integrity, it was the moment where I was like, ‘I don’t trust that they’re willing to actually invest what needs to be invested to keep Facebook from being dangerous.’”
Facebook prioritized profits over the public good
Facebook changed the algorithm on its news feed – Facebook’s central feature, which supplies users with a customized feed of content such as friends’ photos and news stories – to prioritize content that increased user engagement. Haugen said this made divisive content more prominent.
“One of the consequences of how Facebook is picking out that content today is it is optimizing for content that gets engagement, or reaction. But its own research is showing that content that is hateful, that is divisive, that is polarising – it’s easier to inspire people to anger than it is to other emotions.” She added: “Facebook has realized that if they change the algorithm to be safer, people will spend less time on the site, they’ll click on fewer ads, they’ll make less money.”
Facebook is worse than most other social media companies
In a 15-year career as a tech professional, Haugen, 37, has worked for companies including Google and Pinterest but she said Facebook had the worst approach to restricting harmful content. She said: “I’ve seen a bunch of social networks and it was substantially worse at Facebook than anything I’d seen before.” Referring to Mark Zuckerberg, Facebook’s founder and chief executive, she said: “I have a lot of empathy for Mark. And Mark has never set out to make a hateful platform. But he has allowed choices to be made where the side-effects of those choices are that hateful, polarising content gets more distribution and more reach.”