Facebook’s Mission Statement states that your objective is to “make the world more open and connected”. In reality you are doing this in a totally superficial sense.
If you will not distinguish between child pornography and documentary photographs from a war, this will simply promote stupidity and fail to bring human beings closer to each other.
To pretend that it is possible to create common, global rules for what may and what may not be published only throws dust into people’s eyes.
– Espen Egil Hansen (Editor-in-chief and CEO, Aftenposten)
Building and maintaining an n-to-n communications platform for over a billion *daily* active users across multiple access platforms *is* difficult and *is* hard, and you’ve done it, and congratulations, that was lots of work and effort. You - and your Valley compatriots - talk excitedly and breathlessly about solving Hard Problems and Disrupting Things, but other areas - areas that are *also* legitimate hard problems, like content moderation, community moderation, and abuse (which isn’t even a new thing!) - do not appear to interest you. They appear to interest you so little that, *compared to* the effort put into your other hard problems, it looks like you’ve given up.
You can’t have it both ways. You can’t use rhetoric to say that your people - not just engineers - are the best and the brightest working to solve humanity’s problems without also including the asterisk that says “Actually, *not all hard problems*. Not all difficult problems. Just some. Just the engineering ones, for example.” What you’re doing right now - with your inflexible process that’s designed to be efficient and work at scale without critically being able to deal *at scale* with nuance and context (which, I’d say, is your difficult problem and a challenge you should *relish* - how do you deal with nuance at scale in a positive manner?!) - smacks of algorithmic and system-reductionism.
– Dan Hon, s3e27: It’s Difficult
It is tempting to make every fiasco at Facebook about the power (and the abuse of power) of the algorithm. The “napalm girl” controversy does not neatly fit that storyline. A little-known team of humans at Facebook decided to remove the iconic photo from the site this week.
That move revealed, in a klutzy way, just how much the company is struggling internally to exercise the most basic editorial judgment, despite claims by senior leadership that the system is working.
– Aarti Shahani, With ‘Napalm Girl,’ Facebook Humans (Not Algorithms) Struggle To Be Editor
The same week Nick Ut’s picture didn’t make it, the small town of East Liverpool, Ohio, posted two photographs of a couple who had overdosed in their car, with a small child sitting right behind them. Addiction experts were quick to point out that public shaming would very likely be counterproductive. In this case, it was reported, “a Facebook spokesperson said the photos did not violate the company’s community standards.”
As in the case of Ut’s picture, the decision whether to publicly share photographs like the two East Liverpool ones ought to be in the hands of highly trained photo editors, people who not only have the knowledge to understand the “news value” of the photographs, but who have also wrestled with the different underlying ethical problems.
However flawed any editor’s decisions might be at times, at the very least we can be certain that they have thought about the underlying problems - that, in other words, we’re looking at the end result of an educated process (regardless of whether we end up agreeing with it). The world of Facebook does away with this.
– Jörg M. Colberg, The Facebook Problem