How Facebook Troll Farms Reached 140 Million Americans

In the run-up to the 2020 US presidential election, arguably the most contested in US history, the most popular Facebook pages for Christian and African-American content were run by Eastern European troll farms. These pages were part of a larger network that, according to an internal company report, reached nearly half of all Americans. Particularly alarming: this reach was not the result of user choices, but primarily of Facebook’s own platform design and its algorithms, which are hungry for so-called engagement – liking, sharing and commenting.

The report, written back in October 2019 and made available to MIT Technology Review by a former Facebook employee who was not involved in its creation, states that after the 2016 election – in which Donald Trump became president – the social network failed to prioritize fundamental changes to the way its platform processes and disseminates information. Instead, the company pursued a so-called whack-a-mole strategy: it monitored the activities of problematic actors and stepped in only when they took part in political discourse, while introducing a few virtual guard rails that kept out “the worst of the worst”.

But this approach did little to contain the real problem, the report said. Troll farms continued to build huge audiences by networking Facebook pages, reaching 140 million US users a month with their content – 75 percent of whom, astonishingly, had never followed any of the pages. They saw the content because Facebook’s recommendation system pushed it into their news feeds.

“Rather than users choosing to get content from these actors, it is our platform that is choosing to give [these troll farms] a tremendous reach,” writes the report’s author, Jeff Allen, a former senior data scientist at Facebook. Joe Osborne, a spokesman for Facebook, said in a statement that at the time of Allen’s report the company “had already investigated these issues.” “Since then, we’ve formed teams, developed new guidelines, and worked with industry peers to address these problematic networks.” “Aggressive enforcement measures” have been taken against these types of domestic and foreign “inauthentic groups,” with results reported regularly in quarterly reports.

But it’s not that easy. In reviewing these statements shortly before publication, MIT Technology Review found that five of the troll farm pages mentioned in the report are still active. The largest troll farm page targeting African Americans in October 2019 also remains on Facebook. The report found that the “problematic actors” reach the same demographic groups targeted by the Kremlin-backed Internet Research Agency (IRA) during the 2016 US election – namely Christians, Black Americans and members of Indigenous groups. A 2018 investigation by BuzzFeed News revealed that at least one member of the Russian IRA charged with alleged meddling in the 2016 US election had also visited North Macedonia – a country known for its troll farms – although no concrete evidence of a connection was found. (Facebook said its own investigations had likewise found no link between the IRA and the North Macedonian troll farms.)

“This is not normal. This is not healthy,” wrote Allen. “We have empowered inauthentic actors to accumulate huge followings for largely unknown purposes.” Allen compiled the report as the fourth and final installment of a year-and-a-half effort to understand troll farms. He left the company later that month, in part out of frustration that management “effectively ignored” his research, the former Facebook employee said. Allen declined to comment.

The report reveals the alarming state in which Facebook’s leadership has left the platform for years. The US edition of MIT Technology Review is making the full report available as a PDF, with employee names redacted, because it is in the public interest.