Joe Osborne, a spokesperson for Facebook, said in a statement that the company “had already investigated these matters” at the time of Allen’s report. “Since that time, we have built teams, developed new policies and collaborated with industry peers to address these networks. We have taken aggressive enforcement action against these kinds of foreign and domestic inauthentic groups and have shared the results publicly on a quarterly basis.”
In a fact check shortly before publication, MIT Technology Review found that five of the troll-farm pages mentioned in the report were still active.
The report found that the troll farms were reaching the same demographic groups targeted by the Kremlin-backed Internet Research Agency (IRA) during the 2016 election: Christians, Black Americans, and Native Americans. A 2018 BuzzFeed News investigation found that at least one Russian IRA member, charged with alleged interference in the 2016 US election, had also visited Macedonia around the emergence of its first troll farms, although it found no concrete evidence of a connection. (Facebook said its own investigations likewise failed to reveal a link between the IRA and the Macedonian troll farms.)
“This is not normal. This is not healthy,” Allen wrote. “We have allowed inauthentic actors to accumulate huge followings for largely unknown purposes … The fact that actors with possible ties to the IRA have access to huge audience numbers in the same demographic groups targeted by the IRA poses an enormous risk to the US 2020 election.”
As long as the troll farms were succeeding with these tactics, any other bad actor could as well, he continued: the IRA could well have large audiences there too.
Allen wrote the report as the fourth and final installment of a year-and-a-half-long effort to understand troll farms. He left the company that same month, in part out of frustration that leadership had “effectively ignored” his research, according to the former Facebook employee who provided the report. Allen declined to comment.
The report reveals the alarming state in which Facebook executives left the platform for years, despite repeated public promises to aggressively tackle foreign election interference. MIT Technology Review is making the full report available, with employee names redacted, because it is in the public interest.
Its revelations include:
- As of October 2019, around 15,000 Facebook pages with a predominantly American audience were managed from Kosovo and Macedonia, known bad actors in the 2016 elections.
- Collectively, these troll farm pages, which the report treats as a single entity for comparison purposes, reached 140 million US users each month and 360 million global users each week. Walmart’s page reached the second-largest US audience, at 100 million people.
- The troll farm pages also combined to form:
- the largest Christian American page on Facebook, 20 times the size of the next largest, reaching 75 million US users per month, 95% of whom had never followed any of the pages.
- the largest African American page on Facebook, three times the size of the next largest, reaching 30 million US users per month, 85% of whom had never followed any of the pages.
- the second largest Native American page on Facebook, reaching 400,000 users per month, 90% of whom had never followed any of the pages.
- the fifth-largest women’s page on Facebook, reaching 60 million US users each month, 90% of whom had never followed any of the pages.
- The troll farms primarily targeted the United States, but also targeted the United Kingdom, Australia, India, and countries in Central and South America.
- Facebook has conducted multiple internal studies confirming that content more likely to receive user engagement (likes, comments, and shares) is more likely to be of a type known to be bad. Still, the company continued to rank content in users’ newsfeeds according to what would generate the highest engagement.
- Facebook prohibits pages from posting content simply copied and pasted from other parts of the platform, but it does not enforce the policy against known bad actors. This allows foreign actors who do not speak the local language to post entirely copied content and still reach a large audience. At one point, up to 40% of page views on US pages went to pages featuring primarily unoriginal content or material of limited originality.
- Troll farms had already made their way into Facebook’s Instant Articles and Ad Breaks partner programs, which are designed to help news organizations and other publishers monetize their articles and videos. At one point, owing to a lack of basic quality checks, up to 60% of Instant Article reads went to content that had been plagiarized from elsewhere. This made it easy for troll farms to blend in and even receive payments from Facebook.
How Facebook empowers troll farms and increases their audience
The report looks specifically at troll farms based in Kosovo and Macedonia, which are run by people who do not necessarily understand US politics. Yet, because of the way Facebook’s news feed reward systems are designed, they can still have a significant impact on political discourse.