Episode One Hundred and Fifteen: Manifest Destiny; A Little Bit Desensitized; The Material (2)
0.0 Station Ident
It's hot in Portland and we don't have a/c. So I'm melting.
1.0 Manifest Destiny
There are some people who, in reaction to the Facebook emotion contagion study, have asked "how could Facebook misread the situation so badly?"
At some point, you wonder: was this a misreading, or was this just in fact a conscious choice?
I was talking with a friend the other night about this and wondering if this is a particularly British or even European reaction. There's a lot that's been written about the British national psyche, the psychosocial damage that occurs to a nation state after having the specific experience that Britain had before, during and then after World War 2. As an outsider, there's a lot that's admirable about America, or rather, the ideals of America. These are elements of Brand America, as it were: optimism and a sense of moving forward, not necessarily requiring individualism. I think I've written before about how, in 2002, I visited my now wife in Washington, DC and we went to the Jefferson Memorial. There are some stirring, powerful words locked away there.
Anyway. The point was: at some point, you can decide to *not care* what other people think. And anyone who's grown up on the internet and had some role in community management, or rolling out product changes, knows that if you do *anything* on the internet, people will want to know why they weren't consulted. The connective tissue of the internet forms, after all, a customer service medium[1].
2014 was the year that marked Facebook's transition from Move Fast And Break Things to Move Fast With Stable Infra[2]. Facebook, like a lot of physical protrusions of the Californian Ideology, doesn't believe that you make progress without breaking a few things. That applies - from the outside, looking in - to product-as-engineering-material and also product-as-user-experience.
Or, put another way: the future is unmade, and needs to be made - at least in a fashion - for us to see it. You could argue that empathy gets in the way. Empathy prevents us from doing the things that need to be done, from doing - as they say - the unpleasant thing, the necessary thing.[3]
There has long been a sense that the internet is a frontier. That, as the critics of the Californian Ideology wrote, the internet was seen as a new place where "all individuals will be able to express themselves freely". To build this new place required breaking from the old; and where else but in the American spirit do we see this forging ahead?
So no, this is nothing new. This breaking of a few eggs, this doing of the necessary thing: because that is the price of progress.
So when you look at the more libertarian West Coast ideologues, the ones who want to enact regulation-free zones - they want the frontier back. There's a recognition that the internet - now that it has been tamed and fought back and made usable, now that the rest of the population has moved in and erected their shanty towns in Geocities, filled up the directories of Facebook and been spidered by Google, now that all of that has happened - isn't a frontier anymore.
So they need a new one.
[1] The Web Is a Customer Service Medium
[2] Facebook kills off its ‘move fast, break things’ mantra
[3] The opening scene of “House of Cards” was a triumph of creativity over data and better judgment
2.0 A Little Bit Desensitized
And so we come to the Wall Street Journal's piece[1], which goes into more detail about Facebook's Data Science Team; Sheryl Sandberg's sort-of apology, in which she says "we never meant to upset you"; and Slate's sort-of-not-quite-on-the-money-but-near-it piece titled “We Never Meant to Upset You,” Facebook Says of Study That Was Meant to Upset You[2].
But, of course, the Facebook study wasn't necessarily *meant* to upset you, more to see if you'd be upset if you saw upset things. Sounds like splitting hairs, I suppose.
Here we are again, though, back at the empathy wagon and asking: do motivations matter? The Hacker News discussion[3] on this is fascinating, mainly because it includes throwaway lines about the nature of ethics and what it takes to do business these days (is a gym that attempts to maximise the number of inactive members unethical? Yes, probably. Is it a good business? Yes, probably).
danah boyd's piece on Medium[4] cuts to the heart of the matter, I feel:
"Information companies aren’t the same as pharmaceuticals. They don’t need to do clinical trials before they put a product on the market. They can psychologically manipulate their users all they want without being remotely public about exactly what they’re doing. And as the public, we can only guess what the black box is doing."
This is what I mean by understanding the algorithm: whether it's published in public or at least reviewed, as boyd suggests, by a corporate institutional review board.
The thing is, part of this is simply a by-product of having data. It's just *there*. The society and culture that we built - hopefully by accident, and not intentionally - happens to retain all of this information just because it's the easiest, *laziest* thing to do. Moore's law has given us a bounty of communication and computing capability, and the unrelenting progress in storage technology has given us the ability to never have to worry about forgetting. It will all be there, every single little bit or byte, and - what's worse - we might not even know if it was subject to bit rot. It's all built on the same edifice of bits passing through pipes and being scribbled down, whether it's a corporation, a government or a non-profit. It's the architecture we built computing upon.
But.
But what confuses me is that a radical display of transparency (holacracy, for example, or the startup trend of salary transparency) doesn't always carry through to the user experience. And then there's that phrase again: "user experience". In aid of what? For what?
Attention-based companies may well have to deal with a reckoning one day. Time and attention are the one thing that they can't give back. And what does unease at the value exchange, at even *thinking* that interior states are being manipulated for a vague end ("a better user experience"), actually mean? Do your users trust that you know what a better user experience *is*?
[1] Facebook Experiments Had Few Limits
[2] “We Never Meant to Upset You,” Facebook Says of Study That Was Meant to Upset You
[3] https://news.ycombinator.com/item?id=7980743
[4] What does the Facebook experiment teach us?
3.0 The Material (2)
I had a good note back from a reader about one of their favourite no-screen smart objects, one that I can't believe I forgot in my enthusing over the Withings Activite.
The iPod Shuffle[1] - the original one, the gumstick one, the one I wrote about back in episode five[2] - was one of those fantastic smart objects without a screen. And it wasn't even that smart! A tiny USB stick that music came out of, one that didn't make any sense at the time because - as ever - we were preoccupied with the other items on the feature list for the MP3 player category.
This is part of the annoying thing about Apple and, by extension, the annoying thing about Steve Jobs. I wrote back in episode five that that iPod Shuffle had sold ten million units (probably about ten million more than the Galaxy Gear would sell), that it didn't even have a screen, and that the beauty of it was that it was selling *music*.
A number of the chats that I had down in the Bay Area were basically plaintive wailing, gnashing of teeth and the slamming of wearable-adorned wrists down on tables, asking: why are these all so bad? Why is it so hard to get them right?
The Shuffle is an early example of this new kind of material, almost a proto-material. It's so *dumb*. It's not networked - it relies on a USB mass storage connection and talking to one piece of software, iTunes. At launch, it could store hardly any music. But it worked out what people wanted, it worked out what it was selling, and it did it at a price that was practically stupendous. So yes, lots of things worked together in that particular product. But it was a *thin* material.
So I'm sat with a bunch of people talking about all of this business around wearables, and I'm still convinced that no-one has worked out why you might want one of these smart devices. I mean really, *really* want, in a way that's going to go mass. And certainly no-one's worked out how to communicate them. Shopping at Target today, I noticed that Fitbit and other health wearables have shelf space in *two* places in the store: in both the fitness section *and* the electronics section. And if my gut is right that shelf space is the most valuable commodity inside a store, then something must be up for the same product to appear in two different places. And to be clear, we're not talking an end-cap or an FSU or anything - just two pieces of regular shelf space down an aisle.
You want mass appeal? Then work out what these wearables are *for*. Work out how to make wearables make you *feel safe*.
[1] iPod Shuffle
[2] Episode 5: Dreaming Invisible Dreams
--
Best,
Dan