A Facebook whistleblower revealed her identity in a Sunday night interview while trashing the social media giant for prioritizing divisive content over safety to garner higher profits.
Frances Haugen, 37, spoke out publicly for the first time since quitting Facebook in May when the company dismantled her unit that attempted to address misinformation on the popular platform.
Before leaving the company, Haugen copied thousands of pages of internal documents — some of which had already been reported on — to back up her claims.
“The thing I saw at Facebook over and over again was there were conflicts of interest between what was good for the public and what was good for Facebook,” Haugen said on CBS’s “60 Minutes.”
“Facebook, over and over again, chose to optimize for its own interests, like making more money,” said Haugen.
Haugen, a data scientist from Iowa, linked what she characterized as Facebook’s inaction in squashing misinformation to the Jan. 6 US Capitol riot.
After the polarizing 2020 election, Haugen said the company got rid of the Civic Integrity unit and disabled some safety features they had put in place to reduce misinformation.
“They told us, ‘We’re dissolving Civic Integrity.’ Like, they basically said, ‘Oh good, we made it through the election. There wasn’t riots. We can get rid of Civic Integrity now,’” said Haugen.
“Fast forward a couple months, we got the insurrection.”
“As soon as the election was over, they turned them back off or they changed the settings back to what they were before, to prioritize growth over safety,” Haugen said of the features.
“And that really feels like a betrayal of democracy to me.”
Facebook told CBS that work undertaken by the dissolved department was allocated internally to other units.
Haugen told host Scott Pelley that Facebook enables divisive content to flourish because of changes it made in 2018 to its algorithms that prioritize content for individual accounts based on their past engagement.
“One of the consequences of how Facebook is picking out that content today is it is optimizing for content that gets engagement, or reaction,” said Haugen.
“But its own research is showing that content that is hateful, that is divisive, that is polarizing, it’s easier to inspire people to anger than it is to other emotions,” said Haugen.
“Facebook has realized that if they change the algorithm to be safer, people will spend less time on the site, they’ll click on less ads, they’ll make less money,” Haugen charged.
Haugen is set to testify before Congress this week. She has already filed reams of anonymous complaints against the company with federal authorities.
In the interview that aired Sunday, Haugen said she acquired a 2019 internal report detailing complaints from European political parties about the kind of content the algorithm allowed to dominate the platform.
Haugen said the parties “feel strongly that the change to the algorithm has forced them to skew negative in their communications on Facebook … leading them into more extreme policy positions,” according to Pelley.
In a statement to “60 Minutes,” Facebook denied the allegations that the company encourages harmful content.
“We continue to make significant improvements to tackle the spread of misinformation and harmful content. To suggest we encourage bad content and do nothing is just not true,” the company said.
“If any research had identified an exact solution to these complex challenges, the tech industry, governments, and society would have solved them a long time ago.”