Mark Zuckerberg is facing further headaches as a former employee turned whistleblower, who is due to testify before Congress, could reveal more damning allegations about how the social network operates.

Facebook was already facing criticism in the wake of a series of articles published by The Wall Street Journal alleging that the company was aware of how potentially harmful Instagram, the photo-sharing app it owns, can be for teenagers.

Despite denying the claims, Facebook announced in late September that it would be pausing the rollout of its planned “Instagram Kids” platform, which would have been aimed at children under 13.

The allegations that Facebook was aware that Instagram could have harmful impacts on children were made after a whistleblower leaked tens of thousands of internal documents to The Wall Street Journal, as well as to law enforcement.

The whistleblower has now identified herself in a 60 Minutes interview as 37-year-old Frances Haugen, a data scientist who joined Facebook in 2019.

During Sunday night’s broadcast, Haugen accused Facebook of not being truthful about the progress it has made in tackling hate speech, violence and the spread of misinformation on the platform, and of prioritizing company growth over public safety.

“There were conflicts of interest between what was good for the public and what was good for Facebook,” Haugen said. “And Facebook, over and over again, chose to optimize for its own interests, like making more money.”

Haugen alleged that the company made a number of decisions it knew could cause real-life harm in order to increase its profits, including changing its algorithm so that posts with lots of engagement are pushed into people’s feeds.

Haugen claimed Facebook made this decision in 2018 despite its own research showing that “content that is hateful, that is divisive, that is polarizing” gets the most engagement.

“When we live in an information environment that is full of angry, hateful, polarizing content it erodes our civic trust, it erodes our faith in each other, it erodes our ability to want to care for each other,” Haugen said. “The version of Facebook that exists today is tearing our societies apart and causing ethnic violence around the world.”

Haugen said that while Facebook did take steps to control the spread of misinformation around the 2020 election, these were rolled back shortly after the results were declared in November, and the engagement-driven algorithm returned.

Haugen said the knock-on effect was that Facebook helped contribute to the January 6 attack on the Capitol.

“Facebook has realized that if it changes the algorithm to be safer, people will spend less time on the site,” she told 60 Minutes. “They’ll click on less ads, they’ll make less money.”

Elsewhere, Haugen said that she does “have a lot of empathy” for Zuckerberg because he “never set out to make a hateful platform.”

She added: “But he has allowed choices to be made where the side effects of those choices are that hateful, polarizing content gets more distribution and more reach.”

Haugen is set to testify to Congress this week, where she will argue that the federal government needs to help impose restrictions on Facebook.

“Facebook has demonstrated it cannot act independently. Facebook, over and over again, has shown it chooses profit over safety,” Haugen said.

“It is subsidizing, it is paying for its profits with our safety. I’m hoping that this will have had a big enough impact on the world that they get the fortitude and the motivation to actually go put those regulations into place.”

On September 30, Antigone Davis, Facebook’s global head of safety, was questioned by senators during a hearing of the Senate Commerce, Science, and Transportation subcommittee over claims that the company knew Instagram could cause harm to children.

Ahead of the hearing, Facebook released annotated slideshows for two internal research reports—“Teen Mental Health Deep Dive,” published in October 2019, and “Hard Life Moments,” published in November 2019—which it argued show “It is simply not accurate that this research demonstrates Instagram is ‘toxic’ for teen girls.”

The annotations even suggested more accurate headlines that should have been used in the internal documents, instead of the ones cited by The Journal.

According to The New York Times, a number of people who worked on the reports were unhappy that the company, in an attempt to save face, criticized The Journal for basing its reporting on what it described as the documents’ limited and imprecise findings.

“They are making a mockery of the research,” one employee posted on a company message board.

Pratiti Raychoudhury, Facebook’s vice president and head of research, said in an earlier statement: “In addition to putting specific findings in context, it is also critical to make the nature of this research clear.

“This research, some of which relied on input from only 40 teens, was designed to inform internal conversations about teens’ most negative perceptions of Instagram. It did not measure causal relationships between Instagram and real-world issues.

“These documents were also created for and used by people who understood the limitations of the research, which is why they occasionally used shorthand language, particularly in the headlines, and do not explain the caveats on every slide.”

Original Article: Mark Zuckerberg Left Reeling From Facebook Whistleblower Allegations