A revealing cluster of emails reviewed by Business Insider and Channel 4 News offers a glimpse at the fairly chaotic process of how Facebook decides what content crosses the line. In this instance, a group of executives at Facebook went hands-on in determining if an Instagram post by the conspiracy theorist Alex Jones violated the platform’s community standards.
To make that determination, 20 Facebook and Instagram executives hashed it out over the Jones post, which depicted a mural known as “False Profits” by the artist Mear One. Facebook began debating the post on Wednesday after Business Insider flagged it for drawing antisemitic comments.
The company removed 23 of the post’s 500 comments, which it interpreted to be in clear violation of Facebook policy. Later in the conversation, some of the UK-based Instagram and Facebook executives on the email thread provided more context for their US-based peers.
Last year, a controversy over the same painting erupted when British politician Jeremy Corbyn argued in support of the mural’s creator after the art was removed from a wall in East London due to what many believed to be antisemitic overtones. Because of that, the image and its context are likely better known in the UK, a fact that came up in Facebook’s discussion over how to handle the Jones post.
“This image is widely acknowledged to be anti-Semitic and is a famous image in the UK due to public controversy around it,” one executive said. “If we go back and say it does not violate we will be in for a lot criticism.”
Ultimately, after some back and forth, the post was removed.
According to the emails, Alex Jones’ Instagram account “does not currently violate [the rules]” as “an IG account has to have at least 30% of content violating at any given time as per our regular guidelines.” That fact might prove puzzling once you know that Alex Jones got his main account booted off Facebook itself in 2018 — and the company did another sweep for Jones-linked pages last month.
Whether you agree with Facebook’s content moderation decisions or not, it’s impossible to argue that they are consistently enforced. In the latest example, the company argued over a single depiction of a controversial image even as the same image is literally for sale by the artist elsewhere on both Instagram and Facebook. (As any Facebook reporter can attest, these inconsistencies will probably be resolved shortly after this story goes live.)
The artist himself sells the mural’s likeness on a T-shirt on both Instagram and Facebook, and numerous depictions of the same image appear under various hashtags. And even after the post was taken down, Jones displayed it prominently in his Instagram story, declaring that the image “is just about monopoly men and the class struggle” and decrying Facebook’s “crazy-level censorship.”
It’s clear that even as Facebook attempts to make strides, its approach to content moderation remains reactive, haphazard and probably too deeply preoccupied with public perception. Some cases of controversial content are escalated all the way to the top while others languish, undetected. Where the line is drawn isn’t particularly clear. And even when high-profile violations are determined, it’s not apparent that those case studies meaningfully trickle down to clarify smaller, everyday decisions by content moderators on Facebook’s lower rungs.
As always, the squeaky wheel gets the grease — but two billion users and reactive rather than proactive policy enforcement mean that there’s an endless sea of ungreased wheels drifting around. This problem isn’t unique to Facebook, but given its scope, the company makes for the biggest case study in what can go wrong when a platform scales wildly with little regard for the consequences.
Unfortunately for Facebook, it’s yet another lose-lose situation of its own making. During its intense, extended growth spurt, Facebook allowed all kinds of potentially controversial and dangerous content to flourish for years. Now, when the company abruptly cracks down on accounts that violate its longstanding policies forbidding hate speech, divisive figures like Alex Jones can cry censorship, roiling hundreds of thousands of followers in the process.
Like other tech companies, Facebook is now paying mightily for the worry-free years it enjoyed before coming under intense scrutiny for the toxic side effects of all that growth. And until Facebook develops a more uniform interpretation of its own community standards — one the company enforces from the bottom up rather than the top down — it’s going to keep taking heat on all sides.
Source: Facebook’s handling of Alex Jones is a microcosm of its content policy problem