Frances Haugen

Facebook whistleblower reveals her identity

Frances Haugen says the social media giant prioritizes profits over the public good.

Claire Goforth

Tech

Published Oct 4, 2021   Updated Oct 13, 2021, 1:16 pm CDT

The Facebook whistleblower has revealed her identity.

Former product manager Frances Haugen, who first spoke to the Wall Street Journal, publicly came forward this weekend. Haugen has provided internal documents to media, lawmakers, and regulators that she says demonstrate that Facebook is aware of the harms it causes. She believes it is unwilling to effectively address these problems because doing so would potentially reduce engagement and profits.

In a statement to the Journal, Facebook spokesman Andy Stone categorically denied Haugen’s accusations.

“Every day our teams have to balance protecting the right of billions of people to express themselves openly with the need to keep our platform a safe and positive place,” Stone reportedly said.

“We continue to make significant improvements to tackle the spread of misinformation and harmful content. To suggest we encourage bad content and do nothing is just not true.”

Haugen worked at Facebook for two years before leaving in May. She told the Journal that her initial goal was to help the company fix its weaknesses. People have long complained that Facebook and Instagram negatively impact mental health, enable election interference, and facilitate the spread of misinformation and conspiracy theories. As the world’s largest social media company, Facebook tends to receive more criticism, but other platforms have faced similar accusations.

Haugen’s initial optimism waned, she reportedly said, as she began to suspect the company wasn’t serious about minimizing the harms it causes. She was part of a roughly 200-person civic integrity team that focused on election issues worldwide. Her original task was to build tools to detect targeting of specific communities. She said Facebook gave her team of five—all new hires—three months to accomplish this, which she considered implausible.

They failed. Haugen told the Journal that she started noticing that theirs was among many small teams given large, important tasks. For example, the team responsible for detecting and eradicating slavery, organ sales, and forced prostitution was composed of just a few people, she said.

“I would ask why more people weren’t being hired,” Haugen told the Journal. “Facebook acted like it was powerless to staff these teams.”

Facebook’s revenue increased nearly 50% in the first quarter of this year, to $26 billion, according to the New York Times. Its profits nearly doubled to $9.5 billion.

Over time, Haugen became increasingly concerned about Facebook’s role in contributing to real-world harm. She pointed to the Myanmar military using Facebook to plot the genocide of the Rohingya people.

“The version of Facebook that exists today is tearing our societies apart and causing ethnic violence around the world,” she told 60 Minutes in an interview that aired Sunday night.

After the Capitol riot, she resolved to dig more into the matter. Haugen said that Facebook prematurely deactivated tools that prevented the spread of misinformation and downplayed its role in fostering the conditions that led to the violent assault on Congress.

Facebook spokesman Stone told the Journal that it was absurd to blame the company. “We have a long track record of effective cooperation with law enforcement, including the agencies responsible for addressing threats of domestic terrorism,” Stone reportedly said.

Using her employee credentials, Haugen was able to download and view thousands of internal documents, including many unrelated to her job duties. The Journal reports that she was able to access research notes, privileged attorney-client communications, and presentations given to Facebook CEO and founder Mark Zuckerberg.

In those documents, she told 60 Minutes, she found evidence that Facebook knowingly prioritized profit over safety.

Again and again, she found internal research demonstrating Facebook’s ill effects. The researchers’ findings often culminated in frustrated “badge posts,” i.e., goodbye letters left in the company’s system lambasting it for failing to act on or take responsibility for the harms it caused.

Haugen told 60 Minutes that, based on her experience and review of these documents, Facebook’s negligence is intentional.

“Facebook has realized that if they change the algorithm to be safer, people will spend less time on the site, they’ll click on less ads, they’ll make less money,” she said.

She later provided the documents to Congress, the Securities and Exchange Commission, and the Journal. She testifies before Congress on Tuesday.

Haugen, who says she bears Facebook no ill will, has said she’s willing to cooperate with state attorneys general and European regulators.

She believes that Facebook should allow outsiders to examine its research and operations, greatly simplify its systems, and limit promoting content based on engagement.

“The company’s own research has found that ‘misinformation, toxicity, and violent content are inordinately prevalent’ in material reshared by users and promoted by the company’s own mechanics,” the Journal reports.

“As long as your goal is creating more engagement, optimizing for likes, reshares, and comments, you’re going to continue prioritizing polarizing, hateful content,” Haugen said.

In spite of the seriousness of her criticisms, Haugen insists she doesn’t want Facebook to be broken up or lose its liability protection guaranteed under Section 230 of the Communications Decency Act.

In her final message on Facebook’s internal social network, she wrote, “I don’t hate Facebook. I love Facebook. I want to save it.”

*First Published: Oct 4, 2021, 10:40 am CDT