Episode Twenty Seven: Shorts; Code is Law; The Real Revolution; The Flip
1.0 Shorts
Some short thoughts.
- It is quicker and easier for computers to guess what you can remember than it is for you to come up with something memorable and difficult to guess, which is just another way of computers trying to make us bags of meat feel inadequate: http://boingboing.net/2014/02/25/choosing-a-secure-password.html
- Kepler is hockey-sticking its planet-finding capability, which is such a wonderfully optimistic sentence I could never have hoped to type, but one that simultaneously makes me sad about what I feel is feasible within my lifetime. But perhaps not my son's. http://www.washingtonpost.com/national/health-science/nasa-kepler-telescope-doubles-number-of-known-planets-outside-solar-system/2014/02/26/e83af186-9ee0-11e3-9ba6-800d1192d08b_story.html Also, holy shit, Washington Post URLs.
- Ars Technica takes a look at how moving images represent text messaging but doesn't include nearly enough screenshots, in my opinion. http://arstechnica.com/business/2014/02/the-pathos-of-the-text-message/
- Remember when Will Smith was all "I don't trust robots" because some robot saved his life instead of the other person and he was all guilty about it and needed therapy but phew he had Converse boots and an Audi? http://www.wired.com/opinion/2013/07/the-surprising-ethics-of-robot-cars/
Speaking of Will Smith, so I have this thing where I want to throw a film festival and the theme would be Haxploitation and we would show things like Hackers (obviously) and Swordfish and Sneakers (Sneakers is such a good movie, I spent at least six months of my teenage years wanting to grow up to be Robert Redford for entirely different reasons than most other people who wanted to grow up to be Robert Redford) but also Firewall and Stealth and The Net and Antitrust and I, Robot and maybe TV shows like Killer Net and Attachments (my British sensibility showing through) or even that stupendous supercut of the Mainframe[1], or even just watching episodes of NCIS[2].
[1] http://www.youtube.com/watch?v=Hcywf9mwF5U
[2] http://www.youtube.com/watch?v=u8qgehH3kEQ
2.0 Code is Law
When I raved, a few episodes back, about the potential of the Marvel Universe API, Kim Plowright rightly wrote back and pointed out the other side of the double-edged API sword: that entire fictional universes under the control of an API making binary decisions (albeit enforcing human will) could well severely stunt the creativity of fandom. The geek desire to catalogue and mechanise by necessity organises, categorises and sets up barriers. At the BBC, Plowright (along with others) spent an inordinate amount of time trying to create the one true data model that would handle comedy and drama storytelling: because, well, part of the beauty of the BBC is that it's able to think about things in such a large, systemic way. It is, of course, that very British institution.
With hindsight, systematising and coming up with a framework to describe All Storytelling might seem a little like a fool's errand: before you know it you're knee-deep in ontologies and RDF, and if you're unlucky someone might even have crowbarred some XML in there, if only because it was a buzzword that helped the project retain momentum. This being the BBC, though, where the design of any sufficiently large system is (was?) the driving goal of the place, you end up trying to produce a framework that can satisfy the edgiest of edge cases, and that means handling a multi-decade-running storyline like Doctor Who or EastEnders, the latter of which might be defensible but the former of which involves Time Travel, for crying out loud, and a singular consciousness inhabiting multiple physical instantiations. It's almost as if the series creators were trolling the requirements-capturing process.
The wonderful, wonderful point and parallel that Plowright made in a series of emails to me is about what fandom needs to do what it does. It needs dense text as source material, but it also needs an intimate knowledge of that text. It also needs some sort of innate, instinctual understanding of the interconnectedness of all things in terms of that text - how each piece relates to each other piece and in context with all the other pieces. That's an incredible amount of data - unstructured data, at that. And that's not what an ontological organising system like RDF is good at, really. It's not a coincidence that at *exactly* the time this research was going on at the BBC, a number of other ex-BBC ejecta were busy inventing folksonomies with the introduction of tagging over at Flickr, and the whole idea of bottom-up organisation. And at the same time, what do we see with things like Pinboard and Delicious and their adoption by the fandom and slash-writing community? Exactly.
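To make the contrast concrete - and this is a hypothetical sketch of mine, not anything the BBC or Flickr actually built - here's the difference between a top-down schema that has to anticipate every edge case and a bottom-up pile of tags that just accretes whatever the community cares about:

    # Hypothetical sketch: top-down schema vs bottom-up tagging.
    # The classes, episodes and tags below are made up for illustration.
    from dataclasses import dataclass, field

    @dataclass
    class Character:               # the ontology way: fields decided up front
        name: str
        incarnations: int = 1      # a field Doctor Who immediately breaks

    @dataclass
    class Storyline:
        title: str
        characters: list = field(default_factory=list)

    # the folksonomy way: no schema, just whatever tags people pile on
    tags = {
        "Blink": {"doctor-who", "weeping-angels", "timey-wimey"},
        "The Eleventh Hour": {"doctor-who", "eleven", "fish-custard"},
    }

    # querying the tags is just set arithmetic; no schema migration required
    print([ep for ep, t in tags.items() if "weeping-angels" in t])  # ['Blink']

The first model falls over the moment the source material does something the schema didn't anticipate; the second never falls over, but it also never does the organising for you.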
This is where I drop in Mike Rugnetta's talk from 2013's XOXO[1], because he explains so eloquently what it is for fandom to be fandom and what it is for fandom to create. The creative tension is between the looseness that still needs preserving and the power unlocked by a queryable API, which makes reaching for information about a story universe so much easier than holding it all in your brain.
Plowright's last point also sticks with me: that this post-event construction of a storyline - i.e. the linear narrative that emerges when you gasp for air coming up from an extended tvtropes dive - is rather like the way your brain constructs narrative out of the frankly disjointed yet occasionally awesome fragments you see whilst dreaming. Or, even: isn't this what we're being told consciousness is? A post-hoc rationalisation of a bewildering barrage of sensory data, commingled with unbidden retrieval of memory and sense, and only then, only afterwards, do we find a way to describe to ourselves what we've just experienced.
[1] http://www.youtube.com/watch?v=-D9Xq3Xr8aE
3.0 The Real Computer Revolution
So here's the thing. Mobile computing is capital-D Disrupting everything and our VCs are running around like the world is literally on fire. Marc Andreessen is Tweeting, for Christ's sake, so something serious must be going on.
One of my favourite stories to tell is about Tetris, Angry Birds, Game Boys and iPhones. I say: there was a time when Tetris could've been on the front cover of every magazine in the world, the latest fad that had caught the attention of children everywhere. Most people nod their heads at this point: yes, I can see that as a thing that could've happened. But Tetris was only so big, right? Because the people playing Tetris needed to have portable video game consoles. But this thing here - an iPhone, specifically - is an Adult Device, it's an important thing that I use to make phone calls and do business business business numbers[1]. It just happens that it's accidentally also powerful enough to be a games console, I mean, an Angry Birds-playing device. And so Moore's law inexorably drove down the cost of computing and made it more available so that eventually everything could read email and play Tetris. And the stigma attached to videogames, such as culture afforded it, vanished a bit, because it could be hidden behind a veil of adult productivity.
Computing, at some point, became this thing that was wrenched from the hands of women -- computers -- and thrust into the, well, thrusting hands of men. Realise that this is a generalisation, of course, but manly productive things are done with computers - like business business business numbers, sorry, I mean Word, Excel and PowerPoint - hence Microsoft's ambition to put a computer on every desk.
The liberation of computing by the inexorable application of Moore's law (in the efficiency direction, not in the hot-and-fast desktop direction) has meant that, as a direct result of needing to find more things to put processors in (or, more charitably, there now being things in which it is trivially, pennies-on-the-dollar cheap to put processors), the audience for things that can process things has, by definition, expanded beyond People Who Wear Shirts And Sit At Desks.
The wonderful thing about this is that it's going to happen anyway. Moore's law doesn't really care. The Bay Area doesn't even have to notice.
I mean, it'd be nice if it happened *faster*. But I think it's going to happen anyway. That's the optimistic version of me.
What's going to happen? Things like this: the Bay Area only sees the top of the iceberg. The Bay Area gets excited about a trend like wearables and quantified selves and is good at seeing things that it thinks it can sell to closely-aligned versions of itself. The smartest thing a company like Apple ever did was to decide to make things for 'the rest of us' and for Jobs to be such an outsider. So, with something like quantified self, the Bay Area can get excited about charts and dashboards and graphs and tracking and journalling whilst completely forgetting that Weight Watchers has been doing quite well at this tracking and measuring lark and that hey, maybe scrapbooking is totally a thing and do you remember Zazzle because they're totally still around.
The rest of the iceberg is *everyone else*.
There are clearly structural barriers to a more equitable gender balance in the field of "making apps", but the confluence of factors such as (relatively) easy digital distribution as well as pretty much approaching the point of "free Android smartphone with your cereal" mean that the chances of something for the rest of the iceberg being designed, developed and deployed are *so much* higher than they ever were before.
So that's the real revolution. I feel like it'll happen anyway. And yes, it should happen (or have happened) sooner, and the fact that it's going to happen anyway shouldn't preclude efforts to accelerate it. But. I am looking forward to the Bay Area and the traditional White Male Scratch Your Own Privileged Itch model having to compete with a wonderful diversity of ideas that can't be held back.
[1] Look, if you haven't seen The Lego Movie at least once by now, you're just going to be left behind.
4.0 The Flip
Abraham Loeb's paper got published over a month ago on arXiv[1] (which, from what I can make out, is where scientists go to publish their reckons before they get run through some sort of peer-review validation system, and where they can end up published alongside Markov-generated non-science, but I digress).
This week, the Kepler mission team at NASA announced that, with a new crop of 715 discovered planets, we'd practically doubled the haul of planets we know about in the universe. This is really exciting. I mean, really, really exciting: to actually know that there are other planets out there; I don't know. You should be shitting yourself with excitement. It shouldn't make you feel small. It should make you feel amazed to be part of something so wonderfully complex that we're only just scratching the surface of understanding it.
Of those 715 discovered planets, four are in the habitable zone of their system's star. The habitable zone is also called the goldilocks zone because it's not too hot, not too cold: it's just right, in terms of planetary surface temperature, to allow for the existence of liquid water, one of the things (in our admittedly limited knowledge) that allows life to take place. All the life that we know about requires liquid water as a solvent - and while we can imagine different types of life supported by different solvents, like ammonia-based or methane-based life, we obviously haven't discovered those yet.
That's why we get inordinately excited about planets which are in a star's goldilocks zone. They're the closest we've gotten to even the most tantalising evidence that we might not be alone, and that life on Earth isn't a fluke.
But what if there were another way of thinking about the goldilocks zone? What if there were other habitable zones?
That's what Abraham Loeb figured out, in such a stupendously smart, inverted way of looking at the problem of finding habitable zones. See, we were preoccupied with finding habitable zones as a function of physical space. A habitable zone, we thought, was a narrow strip, a spherical envelope around a heat source capable of sustaining the only life we know about.
Abraham Loeb didn't look for a habitable zone in any of the physical dimensions. He looked in the temporal one.
See, one of the most amazing things about science and how it works is its power of prediction. This is a digression, but an important one, I feel. We had a theory about the Big Bang: that the universe originated at a single point and inflated everywhere, at the same time. If it did that, then it meant that everywhere would've been the same temperature at the same time. That was the theory. When the Cosmic Background Explorer (COBE) mission[2] flew, it observed the cosmic microwave background radiation - essentially, the heat and echo left over from the birth of our universe. And it turned out that the data from COBE fit the theory perfectly. Absolutely god-damned perfectly. Science. It works, bitches[3].
Our universe started small and hot. Like, really, really hot. And it inflated and cooled down.
Abraham Loeb realised that there would've been a point - in his paper, he refers to it as a redshift of 100<(1+z)<137, which essentially means around 10-17 million years after the Big Bang - when the temperature of the universe would've cooled to somewhere between zero and a hundred degrees Celsius.
In other words, the temperature range that allows for liquid water.
In other words, the goldilocks zone.
In other words: the entire universe was a goldilocks zone.
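To put rough numbers on that (this is my own back-of-the-envelope arithmetic, not a figure lifted from the paper): the temperature of the cosmic background scales with redshift as T(z) = T0 x (1 + z), where T0 is about 2.725 kelvin today, so you can just solve for the redshifts at which the whole universe sat between the freezing and boiling points of water.

    # Back-of-the-envelope check of the habitable epoch (my arithmetic, not the paper's).
    # The cosmic background temperature scales as T(z) = T0 * (1 + z).
    T0 = 2.725                       # present-day CMB temperature, in kelvin
    freeze, boil = 273.15, 373.15    # liquid-water range, in kelvin (0-100 Celsius)

    # solve T0 * (1 + z) = T for each end of the range
    lower = freeze / T0              # below this (1+z), the universe is already too cold
    upper = boil / T0                # above this (1+z), it's still too hot

    print(f"liquid water everywhere for {lower:.0f} < (1+z) < {upper:.0f}")
    # prints roughly 100 < (1+z) < 137, the range Loeb quotes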
Small, rocky planets, if they existed (and there are theories that they could have), could then have had liquid water on their surface.
If there were a perfect moment for life - at least, as we know it - to have existed, it would have been at that time.
The entire universe would have been habitable.
A giant, all-encompassing soup of gassy habitableness.
I love this on so many levels. At least a few are just because of what feels like the poetry of the science involved. But really, it's because Loeb made such a stupendous, orthogonal intuitive leap. The whole premise of looking for a habitable zone necessarily precluded considering the entire universe as such a zone: the search parameters for the zone, its very definition, *require* a star as a heat source. The leap was that one act - well, we know of a time when the entire universe was hot - of removing the requirement for a *star* as the heat source and looking for the warmth elsewhere.
It's amazing. Literally inspiring of awe.
[1] http://arxiv.org/abs/1312.0613
[2] http://en.wikipedia.org/wiki/Cosmic_Background_Explorer
[3] http://store-xkcd-com.myshopify.com/products/science-works
--
That's it for this week. I hope you've enjoyed today's episode. And, as an experiment: if you did enjoy it, please suggest subscribing to anyone you know who might be interested in what I write about (and I realise that might be quite hard to categorise).
As ever, I appreciate your notes. Whether they're just to say that you got something positive out of today's newsletter, or if you have a point or reaction to anything I've written. Or even if you disagree and it's made you inordinately angry and I've induced some sort of rage in you. If it has, I'm sorry about that. I didn't mean to.
If you're a recent subscriber, there's a list of archival links over at http://danhon.com/newsletter.
Best regards,
Dan