Facebook's season of bad-press storms never really seems to end, and the latest came via a September exposé series in the Wall Street Journal, based on internal documents showing that Facebook conducted its own research into the platform's ill effects but did nothing to fix the problems.

On Sunday, the whistleblower behind the bombshell pieces — which already prompted a new round of congressional hearings that kicked off last week — revealed herself in an interview on 60 Minutes, and discussed her reasons for coming forward. She is former Facebook data scientist Frances Haugen, and she says she joined the company in 2019 and asked to be on the team that fights misinformation — after seeing a friend get drawn in and brainwashed by conspiracy theories.

"Facebook, over and over again, has shown it chooses profit over safety," Haugen said, per the Associated Press, and re-explained the issue that has been ever-present and clear with the company for years.

"The thing I saw at Facebook over and over again was there were conflicts of interest between what was good for the public and what was good for Facebook," Haugen said. And she explained that a 2018 change in how Facebook's newsfeed algorithm works contributed greatly to the amplification of divisive and anger-inducing content — which in turn heightened engagement and positively impacted the company's bottom line.

In her role at the company, Haugen was tasked with conducting some of the research that Facebook would never be eager for the public to see, research confirming critics' longstanding charge that the platform manipulates human behavior and emotion through its algorithm. She worked in the company's Civic Integrity division, focused specifically on election misinformation, and one of her biggest bombshells is that Facebook dissolved that division as soon as the election was over, a premature move that she believes may have contributed directly to the January 6th riot. She also says that controls switched on before the election to tamp down rage and the most divisive content were almost immediately turned back off so that engagement numbers would not be depressed.

Haugen, 37, previously worked at Google and Pinterest, and she left Facebook in May, taking with her a trove of these internal documents.

"I've seen a bunch social networks," Haugen said in the interview, "and things are substantially worse at Facebook than anything else I've seen."

One of the first revelations reported in the Journal's series, titled "The Facebook Files," was that Facebook's own researchers understood the negative impacts Instagram was having, especially on teenage girls. But the company was nonetheless forging ahead with its highly controversial plan to release an Instagram Kids app.

What Haugen called most "tragic" about the Instagram research is the finding that when teen girls are depressed, they use the app more, getting stuck in a cycle in which the app keeps making them hate their bodies more.

The company has responded by saying that the Journal cherry-picked documents to cast the company in the worst possible light, but the PR storm has nonetheless caused Facebook to indefinitely halt the Instagram Kids project as of last week.

But Haugen said she was determined to blow the whistle on Facebook's internal research because of what she sees as the company's role in tearing the fabric of society apart, and in helping spur atrocities across the globe.

Haugen and her attorneys have used the documents to file eight complaints against Facebook with the Securities and Exchange Commission. The SEC may launch its own investigation, which could present trouble for Facebook down the line.

Facebook's head comms guy, Nick Clegg, was on CNN on Sunday doing damage control ahead of the 60 Minutes interview.

"Even with the most sophisticated technology, which I believe we deploy, even with the tens of thousands of people that we employ to try and maintain safety and integrity on our platform, we’re never going to be absolutely on top of this 100% of the time,” Clegg said on CNN. Still, he added, "I think we do more than any reasonable person can expect to."

One of the documents Haugen shared showed the company being aware that it was only catching about 3% to 5% of hate speech on the platform, and only a fraction of a percent of content that incites violence.

Facebook has tried to downplay the research itself, calling it "limited and imprecise," but that has caused an uproar among Facebook's own internal researchers, as the New York Times reports. "They are making a mockery of the research," wrote one of them on an internal message board.

Haugen is also scheduled to testify on Tuesday before the same Senate committee that heard from Facebook's global head of safety, Antigone Davis.

"Social media has had a big impact on society in recent years, and Facebook is often a place where much of this debate plays out," wrote Clegg in a memo to all company employees on Friday, per the AP. "But what evidence there is simply does not support the idea that Facebook, or social media more generally, is the primary cause of polarization."

Related: Facebook Seemingly Still Monkeying With Algorithm to Limit Right-Wing Agitprop