Episode Fifty Six: a16z Strikes Again; Xbawks; Inside Baseball; The Continuity Field

by danhon

0.0 Station Ident

You know what, it’s actually always sunny in Portland. I don’t know why I said it wasn’t. Also I got to see The Lego Movie again last night and it was STILL GOOD if not SLIGHTLY BETTER and the ending still made me cry because it’s dad kryptonite.

1.0 a16z Strikes Again

So yesterday news broke that Andreessen Horowitz led an investment round in Omada Health[1], and, goddamnit, they’ve done it again. And by “again”, I mean pointing at a smart area where there’s a defined need and a massive opportunity. This time it’s in early-stage intervention with pre-diabetics through exercise and diet – pretty much the entire thing that I went through back in 2012, only I had to cobble it together via a bunch of proto-quantified self apps and devices[2].

What’s the easiest way to manage or reverse-out of being diagnosed pre-diabetic? Diet and exercise. And by diet, I mean a low-carb diet and proper consideration to portion control, and by exercise, I mean: doing some. More. Anything, really.

In 2012, like I said, I had to cobble together my response to a type-2 diabetes diagnosis with a Nike Fuelband[3], a couch-to-5K app, a blood glucose monitor issued to me by my primary healthcare provider (and software and hardware purchased from Glooko[4] so I could sync the data outside the proprietary Lifescan silo), a Withings smart scale[5] and tracking what I was eating with Eatery[6], an app from Massive Health (since acquired by Jawbone).

That’s a lot of stuff. And, when it comes down to it, a *relatively* simple integration challenge but one that has the chance to effect a disproportionate change in health outcome for a large number of people.

So: $23mm investment in a company that looks like it’s doing a good job, in a market that’s going to get bigger and that can expand horizontally (other chronic diseases require management, too) and with defined customers (insurance companies and primary healthcare providers). If this type of service can get *prescribed*, then an even bigger market, too. Because if therapy can be prescribed, then why not computer-aided chronic disease intervention?

[1] http://recode.net/2014/04/09/startup-taking-aim-at-chronic-disease-raises-23-million-in-andreessen-led-round/
[2] http://danhon.com/2012/04/28/myself-quantified/
[3] http://www.nike.com/us/en_us/c/nikeplus-fuelband
[4] https://www.glooko.com
[5] http://withings.com/en/bodyscale
[6] http://www.massivehealth.com/#eatery-page

2.0 Xbawks

As seen on the New Aesthetic tumblr[1], noting how people change their behaviour to fit the algorithm rather than getting the algorithm to fit the people – this time in a London Tube ad about Google Voice Search for mobile (and we all remember learning Graffiti so Palm Pilots could recognise our writing, right?). Nothing more than another data point and a me-too.

When we first moved to the US we left behind our UK Xbox 360 and picked up a brand new US one (and in the process re-created an Xbox Live identity). That meant that when the Kinect came out, our console was American, and I quickly found out that I couldn’t get the Xbox to recognise what I was saying. The prompt is “Xbox, <command>” and that just didn’t work in my English accent (my American wife had no problem whatsoever). I, however, had to parody an American accent, and could reliably get the console to recognise me if I shouted “XBAWKS! <command>” in my best brogrammer impression.

We do this all the time. Humans adjust our dialect and the language we use according to our audience, and all Brits who move to the States inevitably end up succumbing and saying warder-for-water. So it’s interesting that we do as humans do and adapt to our audience – even when our audience is an algorithm.

[1] http://new-aesthetic.tumblr.com/post/82085140413/algopop-as-algorithmic-systems-become-more

3.0 Inside Baseball

Ken Segall’s got a good blog that covers Apple and advertising, which makes sense because he’s an ad guy who worked on Apple campaigns during the Second Jobs Era. For those who don’t work in the tech and advertising industries, he’s good at covering the inside baseball on *how* this stuff gets made (and, for anyone who’s ever made anything, it’s like all sausage-making: messy, difficult, protracted and rarely easy, especially for the good stuff). His most recent post[1] covers the emails disclosed during Apple and Samsung’s latest go-around on the iPhone patents spat, and it’s rare that people outside the industry get to see interactions between account directors and clients. And Segall’s point is a good one: clients get the advertising they deserve, because ultimately, they’re the client and they’re the ones paying the bills.

The best relationships are obviously those where the client and agency are challenged by each other and produce better work as a result. My take is that the news that Apple has taken on more agencies like AKQA and Huge to do digital work is a bit damning of the internal capabilities that they’ve been building up. And, the big question is this: who’s the person at Apple who has the taste reins for their advertising?

[1] http://kensegall.com/2014/04/apples-little-advertising-crisis/

4.0 The Continuity Field

And finally, instead of a cutesy story, how about this: we’ve known for quite a while that what we think we see isn’t what’s actually out there in the world. There’s a whole bunch of perceptual and cognitive filtering that goes on to present us with a *representation* of the world, not the literal world. This can be a bit hard to accept, because it feels like it removes our primacy and our ability to think of ourselves as evolution’s hottest shit. Well, it turns out that evolution’s hottest shit is remarkably easy to fool.

The latest news is that this perceptual representation of what’s out there isn’t just spatial, it’s also temporal. The “current view” out of your eyes is one that’s informed by the last fifteen seconds of activity (lossily-compressed MPEG delta frames!). This writeup from Quartz[1] goes into a bit of detail, and if you’re paying, or you’re at an .edu/.ac.uk domain, you can look up the paper at Nature Neuroscience[2]. *Cough* implications for VR *cough*. This kind of makes sense. The raw stream of information coming in through your eyes is pretty dense and high-bandwidth, so it makes sense to do a bunch of pre-processing (which is what we see happen in the visual cortex and successive layers of neurons that do things like edge detection, contrast and so on), and I guess it *also* makes sense if that pre-processing has a temporal axis as well as a spatial one.
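To make the temporal bit concrete – and this is my toy analogy, not the actual model from the paper – you can think of the continuity field as something like an exponentially weighted running average: what you “see” right now is the new input blended with a running estimate of the recent past, so a sudden change in the world only gradually shows up in the percept.

```python
def smoothed_percept(frames, alpha=0.2):
    """Toy sketch of temporal smoothing: each new input frame is
    blended with the running estimate, so the recent past dominates
    the percept rather than the instantaneous input alone.

    This is an illustrative analogy only; the paper's continuity
    field is a serial-dependence effect measured psychophysically,
    not a literal moving average.
    """
    percept = frames[0]
    history = [percept]
    for frame in frames[1:]:
        percept = alpha * frame + (1 - alpha) * percept
        history.append(percept)
    return history

# An abrupt jump in the input (say, an object's orientation going
# from 0 to 10 units) leaks into the percept only gradually:
signal = [0.0] * 5 + [10.0] * 5
percepts = smoothed_percept(signal)
print([round(p, 2) for p in percepts])
```

Run it and the percept crawls up towards 10 over several frames rather than snapping there instantly – the “now” you experience is dragged around by the last few seconds of history.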

[1] http://qz.com/193708/your-reality-is-15-seconds-in-the-making/
[2] http://www.nature.com/neuro/journal/vaop/ncurrent/full/nn.3689.html?_ga=1.68876625.1979423772.1395259641

Phew. OK! Not even 11am West Coast time yet. You’ll be glad to know I have absolutely no reckons, I’m not even being sarcastic, about the whole Dropbox/Rice thing. But I might have a bit more about Heartbleed, that’s still niggling away in my head.

More notes! And, if you feel like it and you know people who might like it, feel free to forward on to them.