Episode Two Hundred and Nine: Sending Signals; So We Solved Publishing; 2015 (1)
0.0 Sitrep
"Siri, play 'Captain America' from the album 'Captain America: The Winter Soldier' and put thirty minutes on the clock."
"I'm sorry, I can't show you your clock."
"Okay, fine. Siri, set a timer for thirty minutes."
1.0 Sending Signals
Lunch today with Ryan Gantz, with whom I've been internet-friend-circling for the past year or so. I'm happy to report back that Gantz is indeed quite tall and that we had a bunch of super-fun, super-interesting things to talk about at lunch, of which there's one thing in particular I want to write about tonight.
Here's some background: Gantz and I are both geeks (I feel happy to speak for him in this regard) and we were talking about things like whether or not we could justify getting the Apple Watch. For me, I have to be perfectly honest and admit that at least 50% of the purchase would be to have the Cool New Thing, but I feel more and more comfortable admitting that I'm genuinely curious about *How The New Thing Works* and that I want to have opinions about whether and how Dark Sky taps me on my wrist to tell me if it's going to rain. (Yes, you may make a "ha, when does it *not* rain in Portland" joke, for which I will simply point you in the direction of Seattle.) In other words: the Watch and how it works - not whether or not it's useful, as such, or even whether or not it works as a quantified-self bauble - is what feels interesting to me right now.
But then we both remembered the article in The Verge pointing out that (maybe, maybe not) the deal with the Apple Watch is that you won't need to look at or use your phone as much[1]. On one level, this is a bit bullshit because the Apple Watch won't even *work* without an iPhone, so it's not like they're cannibalising sales in the short term. Not until battery and radio technology advances to the point where you can get full-day LTE usage out of something watch-sized does that feel like it'll even make sense (and, arguably, Apple are planning for that day in the same way that they planned for the iPod to eventually be succeeded by something else that did more, better, than the iPod). All that's to say: the continuum of computing fabric will become more expansive. For some people, a thing on the wrist may be the only thing they need; others might need a phone-sized thing; others still, a Legacy Unmanaged Computing Device that runs full-on OS X. This isn't a big deal, as far as I'm concerned.
What we *did* talk about, though, was the whole idea of not having to *look* at your phone as much any more, and there were a few things that fell out of that. It's not like people didn't use to glance at watches to see what time it was - and such glancing could be used as an explicit or implicit signal (and be received along the same continuum) that "this person is trying to hurry me along", for example. A pointed look at a watch that is just a watch can be quite rude.
Of course, this presupposes that the thing you're looking at is a watch. My wife and I have a thing - and I suspect we're not alone in this, and it's something that Gantz and I talked about over lunch - which is the sheer opacity to a third-party observer as to what a person who is "computing" is doing. As Gantz put it, when he was growing up, he would see his father settle down at the weekend for most of the day with a newspaper and this was an implicit signal: his dad was paying attention to a newspaper (and the bundle of meaning and associations that come with what a 'newspaper' is). Ditto for settling down with a book, or talking on the phone with someone. These are reinforcing behaviours, showing young Gantz what's important: his father valued time reading a book, reading a newspaper and talking with relatives.
A version of this that my wife and I have experienced: when we're looking at our Black Mirrors or using our laptops, a third-party observer who can't see the screen has *no idea* what the computer operator is doing. They could be paying bills. They could be booking a doctor's appointment. They could be on Reddit.
The collapse of physical forms of media - forms that signified what activity you were engaged in - down to media-agnostic electronic devices that can perform *any* media function means we've suffered a massive loss of context. Gantz and I could be reading books - proper, literary books! - in the same way that our fathers did, but because we're doing it on our iPhone Black Mirrors, there's no signalling as to *what* we're doing.
Sure, we've gained the ability to shield what we're doing, plus we've gained the ability to read books in places we'd never take books. And we've gained the ability to carry with us a gazillion hours' worth of video footage and enough books for lifetimes over, and that's before you even get started with games or any of the other stuff that a general-purpose computing device with a high-resolution display and sound can do for you. But we've also lost the ability to show the world - and the people in it - what we're doing.
This reminded me of the YotaPhone[2], an Android smartphone that is a regular phone on one side and has an e-ink display where the back of the phone would normally be. As far as I can tell from reviews, the YotaPhone is a perfectly capable phone - in the sense that it should be technically hard to make a *bad* Android phone these days, given the reference designs available to your common-or-garden OEM - but the thing a YotaPhone *could* do is give you the choice to signal what you're doing.
A way to do this right now in a relationship is to explicitly and verbally tell the other person what you're doing: "Hey, I'm going to do this now." That's pretty much a good thing to do anyway - it's not like more communication is generally a bad thing for a relationship in the first place.
But - and this is where I thought it was a bit funny - this type of signalling, "hey, I'm listening to this thing now" or "hey, I'm reading this book now", that helps people decode context that might otherwise have been inferred without explicit verbal communication, sounds *awfully* like status messages from social networks of old, or even IM presence indicators. But of course, we don't really have IM presence indicators anymore. We just message each other. All. The. Time.
So what sort of messaging *did* physical media types do? They were local, for starters: more range than NFC, less than a Bluetooth beacon. They were typically line-of-sight. And they were vulnerable to being hacked, but that kind of thing wasn't done by most people: sure, you could hide your copy of Playboy inside The New Yorker, but not many people did. And how people *love* to signal, sometimes! We might not do it all the time, but we're conscious of what other people think of us - sometimes too much - and sometimes make a choice about what book we're seen to be reading or what music we're, er, heard listening to.
So, anyway. A phone or a laptop that lets me show people - if I choose to - what I'm doing. That would be interesting. Someone get on that.
[1] Apple doesn't want to talk about the real use for the Apple Watch | The Verge
[2] The YotaPhone 2 has two faces, zero gimmicks | The Verge
2.0 So We Solved Publishing
A brief thought, this: we "solved" publishing by making it easy for anyone, anywhere, to publish more or less anything they wanted on the internet, available to all n billion people in the world. Facebook made it easy for around 1.5bn people to "publish" to 1.5bn other people.
So, great. We can publish now. One of the things we're working on at Code for America isn't making publishing *easier* - it's already easy enough, in some ways, although it can certainly be more *usable* and more *accessible*. One of the next potential questions is this: how do we make it easy to publish the *right thing*?
A content management system is easy. It manages content.
But making sure that only useful, relevant and actionable content is published? Now that would be something.
3.0 2015 (1)
There's a 100Mbit line going straight into the basement, feeding a gigabit Ethernet switch. Piped into a 30in high-definition display with speakers and cameras to match, for a bunch of instantaneous video conferencing. And then, stuff a stupendous amount of computing power into an 11in form factor, with a battery that will last the better part of a day and wireless comms that can talk to the worldwide network at around 400 megabits a second.
Then there's this: a fleet of cars parked all around the city, hooked up to the same data network, their positions updated in realtime on a map accessible to anyone, anywhere, so they can be hailed and reserved. Until there's a jitter in the network, this one around a 1.5 on the Webb scale - a system outage on the car management system that means all of the cars are unresponsive. They've dropped off the network, dumb nodes now, not talking to anything. Sure, they can read the local NFC card, but the auth request tries to get out and goes... nowhere. GPRS or UMTS or whatever's down, and falling back to the voice line - routed over IP to a call center somewhere that's commercially opportunistic, no doubt, and definitely not located *near* this particular city, at any rate - just results in a no-service tone. Maybe make that a 1.7 on the Webb scale. Definitely not a 2.0 yet, because it's just a single service. Sure, that entire single service has fallen over, but it hasn't spread. Not yet, at least.
Ok. Cancel. Reroute. Find another car. Start walking. Follow the blue line. Look up - the clouds look like they're getting dark - and there it is: a push notification that it's going to start raining soon, and sure enough, fifteen seconds later, the drizzle starts. Keep walking. Ask the assistant to send a message home to say the car wasn't working, that you're walking to the next one. "Okay, send it." Keep walking. Another push notification, this time from a scripted personal coach. "You rarely walk for that long! Goooo Dan!"
This is a long walk. Longer than normal. What would it take, you wonder, for those services to collude: for the scripted personal coach to talk to the diabetes research platform to talk to the car service, decide that a daily goal hasn't been hit yet, and engineer a longer walk. Not yet, you think. Maybe this time next year.
You get to the car, tap the card and wait. No idea if the service is back up. Check the chatter: other people are complaining about it, too. The car opens and you ask for directions home as well as how long it'll take.
Drive.
It's 2015.
--
32 minutes, 50 seconds.
Send notes! I love to get notes.
Best,
Dan