Episode One Hundred and Fourteen: Calm Down. Breathe; Making Visible; Wearables
0.0 Station Ident
71 degrees American, slightly breezy, and a slow, quiet and thoughtful day after a packed yesterday and a conversationally stimulating morning. As well as the most recent episode of Tom Coates Explains Comics To You So You Don't Have To Read Them All But Now You're Going To Anyway.
1.0 Calm Down. Breathe.
That was how Mark Zuckerberg talked to Facebook users back in 2006 in response to the introduction of the News Feed and Mini Feed products[1].
There's been a lot of talk about the Facebook emotional contagion study. There are a few simple issues here, and it's clear that:
- there's an established protocol in terms of informed consent for running scientific experiments, and Facebook did not appear to follow that protocol
- instead, Facebook chose to interpret their terms of use broadly such that the term "research" applied not only to product, but to scientific research
- whilst many companies (if not all) are constantly conducting research and segmenting user groups, there is a continuum where one end of the spectrum is clear product benefit, and the other end is, shall we say, more manipulative
- Facebook have - charitably, I think, as a result of being a large organisation seeking to empower their employees - communicated the study, and their response to it, in a less than well-managed manner
- ultimately, they are still providing a service that a large number of people find incredibly valuable, and those people will still keep using that service.
What the study and the context around it do highlight, though - and this will not be a surprise to anyone who's a regular reader of this newsletter - is the manner in which Facebook treats and regards its users. There is the rhetoric from some, of course, that "you're what's being sold" or that "you are the product". That is, in some respects, beside the point, and a symptom of another issue that we have yet to deal with (if we choose to) as a society, or collection of societies: we like free things, and we value things in the present more than we value things in the future. Free-at-the-point-of-consumption services like Facebook offer value propositions that, if you want to look at it that way, are cognitive hacks. They take advantage of the cognitive architecture that we have, in the same way that pretty much every business has done.
So, first: this is a wider debate about attention-based companies. Facebook, because they are the gorilla in the room, are bearing the brunt of this attention.
Secondly, this *is* bigger than just A/B testing of a particular product feature. To put it a different way, this is more of an atavistic reaction to having your "self" controlled or directed by a third party. To say that corporations have been manipulating people for years is fine - but you must recognise that *because* corporations and others have been manipulating us for such a long time, we have achieved some sort of detente, or understanding, as to that process. When that process is underhand, or opaque, that's when society raises a red flag.
Friends have joked with me: well, that's all well and good, but you've just gone and worked in advertising for the last three years. And yes, that's true. But we were doing it in ad breaks. And advertising is - in some ways - pretty highly regulated. You might think that it's not regulated *enough*, or not well enough, but there's regulation nonetheless.
But here we have a platform that is controlled, more than television or print, from top to bottom. Governments gave broadcasters the airwaves and in return asked them to act a certain way. There has not been such a bargain with the internet.
The size of the experiment does not matter, either. Nor does the fact that everything you see on a computer has been governed by the machinations of an algorithm - however complex or simple. Neither point defuses the reaction.
At its core, this is, I think, an emotional reaction (in those who are having and displaying it) to learning of potential, unknown, manipulation of their emotional state. Advertising may well be *unwanted* manipulation of internal state, but at least it is overt manipulation. This may well seem like a needless distinction ("you're being manipulated anyway!") especially when that manipulation works, but at least you know about it. Cold comfort for some, but comfort nonetheless.
And this is where we come full circle to the opaqueness of the algorithm.
Sure, newspaper editors do this. But, in a way, they are open about the fact that they are doing it.
Facebook, on the other hand, has attempted at times to be a conduit. A pipe - an unbranded one at that, until recently - that has strived to deliver utility and to get out of the way of the user's content. But an algorithm that seeks to show you "the most relevant content to you" is significantly more opinionated than one that is merely "reverse chronological". Further, an algorithm that seeks to show you "the most relevant content to you that is likely to induce state x" is not only opinionated, but willing a change into the world.
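To make that distinction concrete, here's a minimal, entirely hypothetical sketch in Python - every name below is invented for illustration, and none of it is Facebook's actual ranking code. The three orderings differ only in what they optimise for, and that difference is the whole argument:

    from dataclasses import dataclass

    @dataclass
    class Story:
        timestamp: float        # when the story was posted
        relevance: float        # predicted relevance to this user
        induces_state_x: float  # predicted likelihood of inducing state x

    def reverse_chronological(stories):
        # The "mere conduit": no opinion beyond recency.
        return sorted(stories, key=lambda s: s.timestamp, reverse=True)

    def most_relevant(stories):
        # Opinionated: the feed embodies a judgement about what matters to you.
        return sorted(stories, key=lambda s: s.relevance, reverse=True)

    def most_relevant_inducing_state_x(stories):
        # Opinionated *and* willing a change into the world: relevance is
        # now weighted by how likely a story is to push you toward state x.
        return sorted(stories, key=lambda s: s.relevance * s.induces_state_x,
                      reverse=True)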
That is not, I'd say, the agreement that people would understand they were entering into with a communications utility.
In any event, the state of the art moves forwards. More people are now familiar with sentiment analysis. I'm not aware of Facebook's ad products team working closely with data scientists in research, but everyone else in marketing will now be eyeing the possibility of sentiment-targeted reach advertising, never mind the already existing promise of Facebook's reach plus targeting offering.
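And if "sentiment analysis" sounds exotic, at its crudest it's just a word count. A toy sketch of the lexicon-based approach - nothing like the trained classifiers a data science team would actually deploy:

    POSITIVE = {"love", "great", "happy", "wonderful"}
    NEGATIVE = {"hate", "awful", "sad", "terrible"}

    def sentiment(text):
        # Crude lexicon scoring: +1 per positive word, -1 per negative word.
        words = [w.strip(".,!?") for w in text.lower().split()]
        return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

    sentiment("What a wonderful, happy day")  # 2
    sentiment("I hate this awful weather")    # -2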
[1] Calm Down. Breathe. We hear you.
2.0 Making Visible
In the end, part of this is around literacy. The Algorithm is the News Feed, and a number of people are learning that they want to know what the Algorithm is and how it works, because it turns out the Algorithm has a role in shaping their perception of the universe.
Let me put it this way: the new Plato's Cave is algorithmic fire casting shadows on our screens. This is, and always has been, about control. And you can't understand how that control is being exerted unless it is visible.
Silicon Valley - and the tech industry in general - has in some respects a responsibility with the technology that it is deploying into our world (and that's that word again, connotative of military action: *deploying* into a theatre, a region of conflict). It can empower us and help us understand the technology so that we welcome it and know what we can do with it. Or it can chain us and entrance us with the shapes that it throws at a distance.
I am romanticising, of course, but in the era of electrification we used to have roadshows. Technology is in danger, I think, *because* it has strived to make certain processes hidden and incapable of interrogation or understanding (this isn't the same as "easy to use", note), with the result that one of our other cognitive biases - reaching for folklore and mythology to explain how the world works - steps into the gap and fills that breach.
And thus: we start imputing motives to the algorithm - how it works, what it wants, what it's doing - when algorithms (or at least most of them) are, necessarily, human creations.
This is what I think is important: people who don't understand how the world works are not empowered to remake the world according to their needs or desires. In a world where a sizeable proportion of people in even first world countries lack basic literacy and numeracy skills, it feels almost Sisyphean to talk of helping people understand the machines that mediate our existence.
In a way, I am not as concerned about resistors and capacitors and circuits, nor necessarily LCDs or LEDs. Perhaps I'm more interested in broader areas, like "wi-fi" and software radios, and how information gets communicated from one place to another. Almost like a levelling up of infrastructure. But the process - that of gathering information, operating upon it, *deciding what to do with it* and then releasing it back into the world - that's a human thing. That's a thing that we are choosing to do, and we're not blameless in the way that we do it. We do these things intentionally, and the tools with which we do it should be tools that people understand.
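That process can be written down in a handful of schematic lines - every name here invented, purely to show where the human choice lives:

    def gather():
        return ["signal a", "signal b"]    # sensing the world

    def operate(data):
        return [d.upper() for d in data]   # transforming what was sensed

    def decide(data):
        # The step that is never neutral: a person chose this rule.
        return [d for d in data if d.endswith("A")]

    def release(data):
        for d in data:
            print(d)                       # acting back on the world

    release(decide(operate(gather())))     # prints: SIGNAL A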
(An aside: view-source for the world is simultaneously a good idea - yes, it would be good to understand the world - and a terrible one, because view-source is such a narrow way of presenting such information. Like Bret Victor, I want better ways of interrogating and understanding the world than ASCII text.)
3.0 Wearables
So I'm down in the Bay Area, at cloud city zero, and I am talking to a lot of people about wearables. We're talking about how I think they're terrible right now, a downright failure; how I'm not confident that Apple will have an iPod moment; and how, still, nearly a week later, I am so excited about something like the Withings Activite.
Why am I excited? It's genuinely the first watch I'm excited about buying, and mainly because it *doesn't have a screen*. It's a smart object that has eschewed the lazy type of Valley and tech-culture thinking that smart equals display. But smart also means sensing, computation and communication. Display is interface. A full-colour, not-completely-round screen is simply one way to show information and, sure, it can also be a control surface.
But on the Activite, the (non-display) glass acts as a control surface too. Double-tap it and the hands on the watch move to show you the time your alarm's set for.
I love the idea of non-needy smartness. Smartness just in everyday things, without the lure of a screen. And also, because screens are *lazy*. You put a screen in something and suddenly, hey, now you have 24-bit problems with an alpha channel at 200dpi+ resolution. Congratulations.
And gosh, it turns out I have so many opinions about what's *wrong* about wearables right now. About the distinction between sensing and interface. And the interface that exists after the sensing. And, in the end, how to make the damn things humane.
I like the Culture's terminals. I like the idea of adornments, of things that are smart because they are connected or because they can sense. Or things that speak through other things, or communicate through other things. And I like being in conversations with people who can at least imagine a non-siloed world. At least, until the suits get into the room.
So my hope for Apple is this: please, not a screen. You've sold us a thing with a screen and we love that already, perhaps too much.
--
Best,
Dan