LONDON - The Facebook whistleblower Frances Haugen has said that events such as the 6 January US Capitol riot and the genocides in Myanmar and Ethiopia are only the “opening chapters” of worse to come if action is not taken against the social media company.
Ms Haugen gave the warning while giving evidence to Parliament ahead of the government’s development of an Online Harms Bill.
“Engagement-based ranking prioritises and amplifies divisive, polarising content”, Ms Haugen said, adding that the company could make non-content-based choices that would sliver off half-percentage points of growth but “Facebook is unwilling to give up those slivers for our safety”.
The “opening chapters” of this “novel”, Ms Haugen said, will be “horrific” to read in both the global south and in western societies.
Her evidence comes as a number of files were shared with a variety of media publications about internal research Facebook conducted.
It has been revealed that Facebook lacked misinformation classifiers in Myanmar, Pakistan, and Ethiopia – countries designated at highest risk last year.
Countries such as Brazil, India and the United States were placed in “tier zero”, with “war rooms” monitoring those regions continuously.
“Facebook never set out to prioritise polarising content, it just rose as a side effect of priorities it did take”, Ms Haugen said. She also emphasised the need for local languages and dialects to be supported.
“UK English is sufficiently different that I would be unsurprised if the safety systems that they developed, primarily for American English, would be underenforced in the UK”, she said.
The difference between the systems in the United States and those in Ethiopia is stark. In the US, Facebook offers a wide range of services designed to protect public discourse, such as artificial intelligence systems that detect hate speech in memes and respond to hoaxes and incitement to violence in real time.
Ms Haugen said that inside Facebook “there is a culture that lionises a start-up ethic that, in my opinion, is irresponsible”, and that she had “no idea” whom to flag her concerns to within the company because of the risk doing so could pose to growth.
Facebook said in 2018 that it agreed with an independent report it commissioned that said it had failed to prevent its platform being used to “incite offline violence” in Myanmar.
The report said Facebook’s platform had created an “enabling environment” for the proliferation of human rights abuses, culminating in violence against the Rohingya people, a stateless Muslim minority, that the UN said may amount to genocide.
Even in the United States, during the 6 January riot, many of the interventions Facebook could have taken against content used to coordinate the storming of the Capitol were “still off at 5pm Eastern Time”, Ms Haugen said, underlining what she described as the company’s lack of action even when democracy is at risk in its home country.
Moving to human-scaled systems, rather than having algorithms tell people where to focus, is the safest course of action, she claimed.