s3e27: It’s Difficult 

by danhon

0.0 Station Ident

11:18am on Friday, 9 September 2016. I am not at the XOXO Festival introduction because I have spent the last hour in a conference call going through (important) minutiae in the statement of work of one of my contracts. One point of view would be that going through this minutiae in deliverables and tasks is micro-managing and getting in the way of Accomplishing The Goal, but in this case, Accomplishing The Goal is as much about getting people to focus on Accomplishing The Goal as it is about getting them away from focussing on (in the end) non-productive, unuseful minutiae. So, it’s kind of worth it, in a way? Not particularly enjoyable though, and there’s something about contract negotiation that ultimately ends up feeling adversarial rather than collaborative. After all, there is a rather large power relationship at work.

Anyway. I’m at the XOXO Outpost this morning. It’s a nice, sunny, not-too-warm, not-too-cold, not-rainy, nice fluffy clouds kind of day. Exactly the kind of day that Portland puts on to impress visitors, which is NOT WHAT WE WANTED today, when we have lots of interesting people coming into town for the XOXO Festival and who might get tempted to move here because hey does anyone have any good recommendations for city planning and housing policy that satisfies both the traditional families who want a house and a yard and a basement *and* all the bright young things moving in wanting urban loft living *and* keeping the city walkable and navigable and all of those things.

I’m looking at you, wonks from Arup and elsewhere.

Also, I’m more caffeinated than usual today, so that might explain the following.

1.0 It’s Difficult

Trigger warnings: Facebook, “silicon valley”, applying algorithms to people without an understanding of people and sociology, a public relations response, The Terror Of War, aka That Photograph From The Vietnam War By Nick Ut, living in a pre-Jackpot, pre-Event society where you’re constantly thinking that we’re getting closer to The Other Shoe Dropping Where Everything Changes For Ever, Only Everything Is Already Changing Forever It’s Just Your Dumb Human Perception Of Time And Your Inability To Deal Rationally With Risk And Probability You Poor Homo Sapiens

(This newsletter episode is an expanded version of this thread of tweets[0])

OK, backstory. Facebook did a completely normal Facebook thing if you’ve been following Facebook for the last few years. They have rules – like most online services – about what’s acceptable content to publish. Sometimes (constantly), content that is probably fine and entirely socially acceptable (e.g. breastfeeding mothers) is either removed or requested to be removed from the service because it contravenes (Facebook’s arbitrary) community standards. Facebook is a private platform. They’re allowed to do this. The fact that they’re not very good at it is kind of beside the point. There is, as yet, no law requiring them to be non-terrible at this type of content moderation.

This time, the content in question was a photograph from Nick Ut’s iconic Vietnam war series, The Terror of War[1]. This particular photograph is of a naked 9-year-old-girl running toward the camera, fleeing a South Vietnamese napalm attack. By all accounts, it is an incredibly important part of the historical record and something that, in our best ways, reflects our humanity back to us.

All you really need to know is that Norway’s largest newspaper posted that photograph to Facebook and was asked by Facebook to remove it. So far, business as usual. Then, as they say, things “escalated quickly” when the editor of Aftenposten posted a front-page editorial essentially saying “No, fuck you, Mark Zuckerberg, are you fucking kidding me, this is an iconic, historical photograph, I’m not going to fucking remove it”[2].

Again, business as usual: sometimes when Facebook asks for things to be taken down for reasons that civilized society thinks are somewhat spurious, there’s a protest organized to generate enough signal to puncture through the noise of the Internet and provoke Facebook into a non-hard-coded reaction (ie: the equivalent of launching an appeal to the Supreme Court of Public Opinion and forcing the issue for a one-time get-out-of-content-jail pass).

And again, as usual, significant signal is generated and this morning I saw that Facebook had, in The Guardian’s words, “backed down from ‘napalm girl’ censorship and [reinstated the] photo”[3].

Which, you know, I guess is okay if your algorithmic method of exception handling for content moderation is “wait until something is serious enough that it’s a public relations issue, then have a human come in to fiddle with the box and say something reassuring to the users”.

No, what *really* pissed me off this morning and set me off on that Twitter rant was the *way* Facebook talked about what had happened – because if you take that response and extrapolate it and apply it not only to all of Facebook but to Silicon Valley in general, you just get SO TIRED about the world. So, what did Facebook’s PR say?

This is what they said:

“While we recognize that this photo is iconic, it’s difficult to create a distinction between allowing a photograph of a nude child in one instance and not others.” [my emphasis]

“After hearing from our community, we looked again at how our Community Standards were applied in this case. An image of a naked child would normally be presumed to violate our Community Standards, and in some countries might even qualify as child pornography. In this case, we recognize the history and global importance of this image in documenting a particular moment in time.”

“Because of its status as an iconic image of historical importance, the value of permitting sharing outweighs the value of protecting the community by removal, so we have decided to reinstate the image on Facebook where we are aware it has been removed.”

“[Facebook will] adjust our review mechanisms to permit sharing of the image going forward” and that it is “always looking to improve our policies to make sure they both promote free expression and keep our community safe”[4]

“It’s difficult,” says Facebook (again – not a named quote, but I believe a written, prepared statement from the company) “to create a distinction between allowing a photograph of a nude child in one instance and not others.”

To which I say: NO SHIT, SHERLOCK.

Look, here’s the problem (and really, it’s not just one problem. It’s a giant problem and this is but one symptom, and of those myriad symptoms, merely one of the most recent that’s happened *today* in terms of the entire Internet-attention-complex):

Facebook – and, more or less, Silicon Valley, in terms of the way that the Valley talks about itself, presents itself and so on – is built on, and prides itself on, solving Difficult Problems. At least, they are now. Facebook is a multi-billion dollar public company where *some* things are difficult and worth doing (e.g. Internet access to 1bn people using custom-built drones[5]), but other things are, by implication, *TOO HARD* and don’t warrant the effort.

An example of a difficult thing that is worth doing: bringing Internet access to everyone on the planet. What are some of the difficult things involved in that?[5]

* design a new, long-flight-time, energy efficient drone
* add solar panels to the prototype
* design and build new high-density batteries to power the drone
* make sure the drones are strong and light to reduce maintenance costs
* and so on

Look, Facebook (and by extension, every other company in the Internet-attention-economy-complex), I completely agree that you do solve *some* hard problems but, to coin a phrase, *not all hard problems*. And yes, there’s a weasel way out in the statement quoted above that says of course Facebook is always looking to improve policies. But here’s the rub: are you actually improving them? Are you attacking them with the same zeal and vigour as other “difficult” problems? Because from here, it looks like you’re not. Apologies to Mr. Clarke, but a sufficiently depressing lack of progress is indistinguishable from a complete lack of caring.

Building and maintaining an n-to-n communications platform for over a billion *daily* active users across multiple access platforms *is* difficult and *is* hard and you’ve done it and congratulations, that was lots of work and effort. You – and your Valley compatriots – talk excitedly and breathlessly about solving Hard Problems and Disrupting Things, but other areas – areas that are *also* legitimate hard problems, like content moderation and community moderation and abuse (which isn’t even a new thing!) – do not appear to interest you. They appear to interest you so little that it looks like you’ve given up, *compared to* the effort that’s put into other hard problems.

You can’t have it both ways. You can’t use rhetoric to say that your people – not just engineers – are the best and the brightest working to solve humanity’s problems without also including the asterisk that says “Actually, *not all hard problems*. Not all difficult problems. Just some. Just the engineering ones, for example.”

Because it *looks* like this: it *looks* like you’re entirely happy solving hard engineering problems. Designing and building new server architectures! “Saving” the planet by reducing your carbon footprint and designing more energy-efficient systems! Making it super easy for me to take a video of my newborn baby and effortlessly share it with my entire Dunbar’s worth of family and friends! (AND I AM NOT BEING SARCASTIC ABOUT THAT ONE, I GENUINELY AM ACKNOWLEDGING THAT IT’S A DIFFICULT PROBLEM AND WORK WENT INTO IT AND SOLVED THE SHARING A BABY PHOTO PROBLEM.)

But. These other problems. These soft, social, anthropological problems. These problems that do not fit in boxes. It looks like they are just too hard for you, as an industry, as a community, to deal with. So you ignore them and scratch the engineering itches (which, again! Valid itches! But again, NOT ALL ITCHES!).

We – homo sapiens – and the countless civilizations and cultures that we have created over literally *thousands* of years – have been working on some of these problems for a long time. We have figured out some ways to deal with them. They aren’t perfect (nothing is! Apart from, maybe, in our pure world of mathematics and set theory, right?) but they’re what we’ve got and we’ve also figured out ways for those methods to be *malleable* and to change and to accommodate what is fundamentally a continuous system – an analogue, human one – that doesn’t really quantize, that is variable and most definitely, completely, not binary.

It is, of course, always easier to do the easier work than it is to do the hard work. It’s easier to do the attractive work, to solve the difficult problems that interest you rather than the difficult ones that don’t. Or perhaps, even to be tempted to apply one toolset to a different domain. But, the *evidence* is that the difficult problem of bringing Internet access to people (and thus, let’s all acknowledge here, being able to capture some cold hard cash value by being part of the access layer and an intermediary) is way more exciting and interesting and worthy of investment than the difficult problem of figuring out what to do when you can’t tell the difference between an allowable nude photo and an unallowable nude photo.

Building the infrastructure is a hard, difficult problem. Dealing with the fact that *people* will use that infrastructure is also a hard, difficult problem, but one that *also* needs dealing with. Otherwise, things like this will keep happening. The side that the Valley presents to us is of smart people figuring out problems and making the world a better place. So fucking step up. Think about these problems of abuse and content moderation and speech. Try. Harder. Because it looks like you’re not trying at all. And lack of trying looks like lack of caring.

I mean, in some instances, we know how to deal with this as a society. Various countries have ways of determining what kind of content is “allowed” and what kind of content “isn’t allowed”. And yes, those are at the nation-state level, and you don’t *have* to do that, Facebook. But hey, it turns out that maybe you do. Maybe you do when you have a billion active daily users. Maybe you’ll piss some people off if you make a decision or take a point of view!

Apparently that’s okay when you redesign newsfeed, or you tweak recommendation algorithms for placement of newsfeed photos.

But attempting to *improve* upon and be clear about speech on your platform? Too hard?

I mean, I guess that’s too hard when your model of looking at things is fundamentally from a software engineering point of view. When things fit into neat boxes. It would be wonderful if we could apply some sort of map/reduce, scalable engine to the difficult issue of looking at nude photos. And I understand that, in a way, this is what your A.I. group under Yann LeCun is doing: admirable, hard problem stuff like improving computer vision, object classification and segmentation[6] so that Facebook can automatically provide scene descriptions of images to those who can’t see (and those who can!). But sometimes – maybe, just maybe – things don’t always fit in algorithmic boxes. Maybe there’s a sort of centaur hybrid of human and algorithmic process working together.
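To be concrete about what I mean by a centaur process – and this is purely my own sketch, with made-up names and thresholds, not anything Facebook has said it does – the idea is that the algorithm only decides the cases it’s confident about, and everything ambiguous (plus anything a human editor has already ruled on, like an iconic historical photograph) gets routed to an actual person:

```python
# A toy sketch of "centaur" content moderation: the model handles the
# confident cases, humans handle nuance. The thresholds, the function
# name and the exception list are all invented for illustration.

# Images a human editor has already ruled on (e.g. iconic historical
# photographs) bypass the classifier entirely.
HISTORICAL_EXCEPTIONS = {"the_terror_of_war_1972"}

def moderate(image_id: str, nudity_score: float) -> str:
    """Return 'allow', 'remove' or 'human_review'.

    nudity_score is a hypothetical classifier confidence in [0, 1].
    """
    if image_id in HISTORICAL_EXCEPTIONS:
        return "allow"
    if nudity_score < 0.2:    # model is confident the image is fine
        return "allow"
    if nudity_score > 0.95:   # model is confident it's a violation
        return "remove"
    # The middle band is exactly the nuance-at-scale problem: send it
    # to a person with context, and feed their decision back in.
    return "human_review"
```

The point isn’t the thresholds (I made those up); the point is that the ambiguous middle band goes to people *by design*, rather than being handled as a PR-triggered, one-time exception.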

What you’re doing right now – with your inflexible process that’s designed to be efficient and work at scale without critically being able to deal *at scale* with nuance and context (which, I’d say, is your difficult problem and a challenge you should *relish* – how do you deal with nuance at scale in a positive manner?!) – smacks of algorithmic and system-reductionism.

This is, in other words, what happens when your society has to be describable by an SQL schema[7] and you have a barely-developed, half-competent, unscalable approach to being able to deal with exceptions that makes it look like you don’t care, even if you do.
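By which I mean something like this – a toy illustration, obviously, with invented columns; I have no idea what Facebook’s actual data model looks like. If your schema only has room for a yes/no flag, there is literally nowhere to put the context that makes Nick Ut’s photograph different from the content the flag was designed to catch:

```python
import sqlite3

# Schema-reductionism in miniature: the columns are hypothetical, the
# failure mode is not. Context has nowhere to live in this table.
db = sqlite3.connect(":memory:")
db.execute("""
    CREATE TABLE images (
        id              TEXT PRIMARY KEY,
        contains_nudity INTEGER NOT NULL  -- 0 or 1. That's it. That's the model.
    )
""")
db.execute("INSERT INTO images VALUES ('the_terror_of_war_1972', 1)")

# The removal job only ever sees the flag; historical importance,
# newsworthiness and every other human consideration were never columns.
to_remove = db.execute(
    "SELECT id FROM images WHERE contains_nudity = 1"
).fetchall()
```

And so the iconic photograph comes out of that query looking exactly like everything else the flag exists to remove.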

Used to be a time when companies would hire anthropologists to help them understand human behaviour and to incorporate that into designs. Perhaps less so, now. Perhaps now, we think we can do it all with the measurable data we get. Well, guess what: you only get so much, and you can only infer so much from web instrumentation. I’m *not* saying that you don’t get a lot!

Here’s the deal. This *is* a hard problem. It *needs* better approaches. There *are* people out there who can help, who desperately want to help and bring their expertise to bear. Some of them are probably even open to happily, collaboratively and productively working with software engineers despite the rhetoric of the liberal arts/sciences divide!

But don’t say that you can’t do it because it’s too difficult. You do difficult things every day.

Just show us that you care about these difficult things, too.

(And not just you, Facebook. But the entire industry.)

[0] Dan Hon is at XOXO on Twitter: ““it’s difficult to create a distinction between allowing a photograph of a nude child in one instance and not others” NO SHIT SHERLOCK”
[1] Nick Ut – Wikipedia, the free encyclopedia
[2] Dear Mark. I am writing this to inform you that I shall not comply with your requirement to remove this picture. – Aftenposten
[3] Facebook backs down from ‘napalm girl’ censorship and reinstates photo | Technology | The Guardian
[4] Facebook backs down from ‘napalm girl’ censorship and reinstates photo | Technology | The Guardian (this is in direct quotes in The Guardian’s article and also shows up in other press coverage. I didn’t find a URL for it, but it’s presented as a prepared statement that Facebook issued to interested media that asked for comment)
[5] Facebook takes flight
[6] Learning to Segment – Research at Facebook

This very special episode was brought to you by two 20fl oz bottles of Diet Coke and a regular coffee.

I don’t know when I’ll be back, but in the meantime, I always love hearing from you.

If you’re at XOXO, then feel free to come up and find me and say hi.

Don’t ask me to sign a newsletter or anything because that’s weird and also would feed my ego in unhealthy ways. Also how would you even sign a newsletter. What are you thinking. Stop that.