Revelations about Facebook's knowledge of its enormous and fundamental problems, and about the company's ineptitude in handling them (or its lack of real interest in trying, because doing so would hurt the bottom line), continue to roll in.
We thought that the Wall Street Journal had been gifted the trove of internal documents taken by former Facebook employee and whistleblower Frances Haugen, after it published its series of exposés titled "The Facebook Files." But now it seems that the Associated Press, the New York Times, and 14 other news organizations are sharing in the trove, which keeps on giving in the form of damning evidence that company executives can be squarely blamed for knowingly allowing the platform and its algorithm to do great harm.
The Times has a piece today about two of the Facebook News Feed's most fundamental features, the Like and Share buttons, and how internal research found that the very features that drive user engagement on the platform are the ones that allow misinformation and division to flourish.
As the researchers concluded, "If [the Integrity team] takes a hands-off stance for these problems, whether for technical (precision) or philosophical reasons, the net result is that Facebook, taken as a whole, will be actively (if not necessarily consciously) promoting these types of activities. The mechanics of our platform are not neutral."
Facebook continues to push back on the "false picture" that's being painted by these internal documents, and spokesperson Andy Stone tells the Times that it's a "false premise" to think Facebook puts profits ahead of user safety, when the company has clearly invested many millions in integrity and safety on the platform.
But the evidence seems to point to the fact that plenty of discussion and research has gone into all these difficult problems, and that actually solving them has proven difficult if not impossible without breaking how the whole News Feed functions.
CBS News has a piece about three dummy accounts that internal Facebook researchers created in 2019 to demonstrate how quickly the algorithm pushed them toward politically charged or outright demonic content. One account, Karen Jones, was meant to be a liberal American; one, Carol Smith, was meant to be a conservative American; and the third was a user in India, which is Facebook's biggest market.
Within days, the researchers said, Carol was being shown QAnon conspiracy content, and Karen was being bombarded with "Moscow Mitch" memes.
And the user in India was being shown extremely graphic content relating to violence on the country's border with Pakistan.
"I’ve seen more images of dead people in the past 3 weeks than I’ve seen in my entire life total," said the researcher running the Indian test account.
Meanwhile, the AP today clues us in to another piece of the yearslong fight between Apple and Facebook — a story that played out in the background of the conflict in 2019, the same year that Apple suddenly revoked Facebook's security certificates and disabled all of its internal iOS apps for a day.
Apparently, Apple nearly pulled Facebook and Instagram completely from the App Store in 2019 over a specific human trafficking issue that had previously been reported by the BBC. Facebook was implicated in helping facilitate the sale and trading of foreign maids in the Middle East — an egregious practice that is apparently rampant in multiple countries, with recruitment agencies turning a blind eye to complaints of people being locked in rooms, starved, and forced to work without pay, among other abuses.
Facebook claims to have cracked down on the use of the platform (as well as Instagram) to advertise domestic workers for sale, but the AP found that "Even today, a quick search for 'khadima,' or 'maids' in Arabic, will bring up accounts featuring posed photographs of Africans and South Asians with ages and prices listed next to their images."
These abuses, and the trafficking of domestic workers, are tied up in Middle Eastern laws that allow for the import of cheap labor from foreign countries, with the legal residencies of these immigrants tied to their employers. And many workers from the Philippines and other parts of Asia and Africa willingly enter into these agreements in order to make a living they cannot make in their own countries — but they are sometimes victimized by bad actors who essentially treat them like slaves.
Facebook says that it removed posts and accounts tied to human exploitation — to which it assigned the abbreviation HEx — and this apparently placated Apple, which dropped the threat to remove the apps from the App Store. But the problem of Instagram ads selling maids has persisted in the three years since Facebook appears to have acknowledged it.
As Mustafa Qadri, the executive director of Equidem Research, tells the AP, "While Facebook is a private company, when you have billions of users, you are effectively like a state and therefore you have social responsibilities de facto, whether you like it or not."
Therein lies the trouble: Facebook is a de facto state, as well as a de facto media company and de facto news organization, and yet as a company it has done everything it can to avoid taking on the responsibilities that come with being any of these things. And while it facilitated the death of print journalism in the U.S. over the last decade and a half, it's mostly just paid lip service — and some occasional PR-focused funding — to undoing that damage. And meanwhile, misinformation has continued to proliferate, leading to disasters like January 6th.
Politico has a piece specifically about Facebook's failure to stem that insurrectionist tide.
Also, Haugen testified before the U.K. Parliament today, three weeks after testifying before a U.S. Senate committee.
And there are likely to be more stories popping up this week as we approach the company's big — and sure to be hilarious — renaming announcement, perhaps arriving Thursday.
Top image: Facebook whistleblower Frances Haugen appears before the Senate Commerce, Science, and Transportation Subcommittee during a hearing entitled 'Protecting Kids Online: Testimony from a Facebook Whistleblower' at the Russell Senate Office Building on October 5, 2021 in Washington, DC. (Photo by Matt McClain-Pool/Getty Images)