Friday, September 9, 2016

Here's Why Facebook Removing That Vietnam War Photo Is So Important


The social network needs to admit its responsibilities as a media entity.
Facebook is more than just a site where people share photos of their children or pets. It has become a crucial way in which hundreds of millions of people get information about the world around them.

And the tension between those two things is becoming difficult to ignore.

In the latest controversy involving the giant social network’s news judgement, Facebook removed an iconic photo from the Vietnam War: a picture of a young Kim Phuc running naked down a road after her village was hit by napalm.

When a Norwegian newspaper editor—who posted the photo as part of a series on war photography—tried to re-post it, along with a response from Phuc herself, his account was suspended.

The editor-in-chief of Aftenposten, Espen Egil Hansen, then wrote an open letter to Facebook CEO Mark Zuckerberg criticizing him for doing so, entitled, “Dear Mark. I am writing this to inform you that I shall not comply with your requirement to remove this picture.”

“First you create rules that don’t distinguish between child pornography and famous war photographs. Then you practice these rules without allowing space for good judgement,” Hansen wrote. “Finally you even censor criticism against and a discussion about the decision – and you punish the person who dares to voice criticism.”

After the open letter was published, a number of prominent Norwegians posted the Phuc photo in support of the newspaper, including the conservative prime minister of the country, Erna Solberg. Her post, which was also critical of Facebook’s decision, was deleted.

“I appreciate the work Facebook and other media do to stop content and pictures showing abuse and violence,” the prime minister wrote. “But Facebook is wrong when they censor such images.” Removing such photos, the Norwegian PM said, is a curb on freedom of expression and amounts to the social network “editing our common history.”

In his open letter, Hansen described Zuckerberg as “the world’s most powerful editor,” and that is exactly what Facebook has become.

The social network’s size and influence—particularly for younger users who increasingly get their news there—means it plays a huge role in determining what people see or read about the world around them.

In effect, it has taken over the role that newspaper editors used to play in deciding what photos to show and which headlines to include.

The problem is that Facebook isn’t guided by the kind of news judgement or journalistic principles that drive most newspapers and other traditional media outlets. So far, it refuses to admit that it has any such responsibilities.

Because it sees itself as mostly a place where people share photos with their friends, Facebook removes violent or disturbing images as part of its “community standards.” But what if those images are newsworthy? In most cases, the site’s desire to maintain a friendly and non-threatening atmosphere takes precedence.

In a response to a Norwegian news outlet, a Facebook spokesman said: “While we recognize that this photo is iconic, it’s difficult to create a distinction between allowing a photograph of a nude child in one instance and not others.”

Why the Napalm Girl issue is relevant today: Imagine it were a Syrian girl running from chlorine bombs in Aleppo.

This problem didn’t start with a Norwegian newspaper editor posting a Vietnam War photo. Facebook has been removing disturbing or violent images of the wars in Syria and Iraq for some time now, as well as imagery from Turkey and elsewhere.

Investigative journalist Eliot Higgins called out the social network in 2014 for deleting pages posted by Syrian rebel groups, which removed crucial details about the Syrian government’s attacks on its own people.

Facebook controls what users see in two fundamental ways: Its news-feed filtering algorithm decides how to rank various kinds of content to make the feed more appealing, and a team of human beings flags and/or removes posts when they appear to be offensive or disturbing.

Both of these methods can break down, or have adverse effects on how we see the world. For example, the algorithm can hide or down-rank important content because of a built-in bias, as some accused the network of doing in 2014, after a black man who had attacked a police officer was shot in Ferguson, Mo.


The human approach can also be gamed if users repeatedly flag something they disagree with as offensive. Facebook also removes content when asked to do so by governments.

Of course, the social network is a private corporation controlled by its shareholders (primarily Mark Zuckerberg), so it isn’t bound by the free-speech requirements of the First Amendment, which apply to governments rather than private companies. But it arguably plays a larger role in how information is disseminated and consumed than any outlet in the history of modern media.

There’s an assumption when reading a newspaper that the editors in charge are interested in informing people about what’s happening in the world, even if that information is disturbing or offensive to some. But there’s no such assumption with Facebook because it denies that it is a media entity or that it has any duty to inform.

That position is becoming increasingly untenable, however, as the impact of its removal of certain kinds of content—or even the way it ranks information in its trending topics section, which has also been the source of controversy—continues to escalate.

Some have suggested that Facebook should have a “public editor,” the way media outlets such as the New York Times do, or an advisory board of journalists who can help it make such decisions. But that would require the company to admit that it has some responsibilities as a media company, and so far it seems reluctant to do so.

By Mathew Ingram, @mathewi

