A midterm elections study of political disinformation spread on Facebook’s platform supports the company’s assertion that a clutch of mostly right-leaning and politically fringe Pages it removed in October for “inauthentic activity” were pulled for gaming its engagement metrics.
It remains unclear, though, why it took Facebook so long to act against such prolific fakers — Pages that, the research suggests, had been doping their metrics unchallenged on Facebook for up to five years.
The three-month research project carried out by Jonathan Albright of the Tow Center for Digital Journalism has largely focused on domestic political disinformation.
In a third and final blog detailing his findings he says some of the removed Pages had put up Facebook interaction numbers in the billions, and many of their videos consistently showed engagement in the tens of millions.
“I found that at least three of the Pages — removed less than a month ago — reported near-astronomical engagement numbers over the past five years,” he writes. “These are the kind of numbers that would be difficult to justify in almost any scenario — even in the case of a very large and sustained advertising spend on Facebook.”
One of the Pages with suspiciously high engagement flagged by Albright is a Page called Right Wing News.
“Less than a month before the 2018 midterm elections, when the Page was removed, Right Wing News had reported more engagement on Facebook over the past five years than the New York Times, The Washington Post, and Breitbart…combined,” he writes.
He also flags two other Pages that were removed by Facebook which had suspiciously high video views, called Silence is Consent and Daily Vine.
We’ve reached out to Facebook for a response.
The company is currently facing legal action from an unrelated group of advertisers who allege that Facebook knowingly misreported video metrics, inflating views for more than a year, and who accuse the company of ad fraud. Facebook disputes the advertisers’ allegations.
In his blog, Albright also details how Facebook has seemingly failed to properly enforce a ban on conspiracy theorist and hate speech purveyor Alex Jones, whose personal Facebook Page and disinformation outlet, InfoWars, it pulled from its platform in August — writing: “Jones’ show and much of the removed InfoWars news content appears to have moved swiftly back onto the Facebook platform.”
How has Jones circumvented the ban on his main pages? By creating lots of similarly branded alternative Pages…
Albright writes that Facebook’s algorithms pushed Jones’ livestream show into search results when he was looking for Soros conspiracies: “And what did I get? The live high-definition stream of Jones’ show on Facebook — broadcast on one of the many InfoWars-branded Pages that is inconspicuously named ‘News Wars.’”
According to his analysis, Jones’ InfoWars broadcasts appear to be almost back to where they were — in terms of views/engagement — before the Facebook ‘ban’ took down his two largest Pages. Albright describes the “censorship” case as “a gross enforcement failure by Facebook”.
“Granular enforcement isn’t just reactive takedowns; it’s about proactive measures. This involves considering the factors — even the simple guerrilla marketing tactics — that play into how things like banned InfoWars live streams get further propagated,” he writes, summing up his findings.
“From what I’ve seen in this extensive look into Facebook’s platform, especially in regards to the company’s capacity to deal with the misuse of its platform as shown in the cases above — exactly two years after the end of the last election — I will argue that common sense approaches to platform integrity and manipulation still appear to be less of a priority for Facebook than automated detection and removal publicity.”
“The infinite gray area of information-sharing poses the real challenge: it’s the slippery soft conspiracy questions, the repetition of messages seen on shocking memes and statements like the “Soros Beto” caption [cited in the post], and the emotional clickbait that’s regularly shown in Jones’ InfoWars video cover stills. Without granular enforcement, the non-foreign bad actors will only get better, and refine their tactics to increase Americans’ exposure to [hyperpartisan junk news],” he adds.
“Information integrity is more than the scrutiny of provable statements or the linking of some data to shared content with an “i.” Transparency involves more than verifying one Page manager, putting it alongside a date and voluntary disclosure for a paid political campaign, and adding to a political “ad archive.”
Albright has posted additional findings from his three-month trawl through the Facebook fake-o-sphere this week — including raising concerns about political Pages running ads targeting the US midterms which have changed moderator structure and included foreign-based administrators, as well as finding some running political ads that lacked a ‘Paid for’ disclosure label.
He also identifies a shift in tactics by political disinformation operators toward sharing content in closed Facebook Groups, where it’s less visible to outsiders trying to track junk news — yet can still be shared across Facebook’s platform to skew voters’ opinions.
Written by Natasha Lomas
This news first appeared on https://techcrunch.com/2018/11/06/study-of-political-junk-on-facebook-raises-fresh-questions-about-its-metrics/?utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+Techcrunch+%28TechCrunch%29 under the title “Study of political junk on Facebook raises fresh questions about its metrics”.