Episode Sixty Nine: Transcendence; Broom-Shaped Objects; Odds
0.0 Station Ident
I'm sat in MPK 17 at the Facebook campus with a few minutes of downtime. I'm in the Bay Area for work, sitting with the Big Thumb for the week, and also in town for the FBF8 developer conference.
Some housekeeping: I've now added iTunes affiliate links as well as Amazon affiliate links for my son's college fund. Get clicking, etc.
1.0 Transcendence
So I took one for the team and saw Transcendence[1] on Sunday afternoon. The quick version is this: it's like a bad, unfunny remake of Lawnmower Man[2], the only difference being that there isn't a cringeworthy virtual/cybersex scene. It turns out, obviously, that Moore's law applies to the CGI portion of filmmaking but perhaps not to the scriptwriting part.
I had a bet with myself that, when it came down to it, Spike Jonze's Her[3] would be a "better" portrayal of the singularity than Transcendence would be, and it turns out that I'm stupendously smart because I was totally right. Sure, it's easy to criticise the way that Hollywood portrays something of which you have more than a passing knowledge (this is a reminder of why a number of people were freaked out by Cuarón's Gravity, which, having seen it in IMAX 3D as well as on a screener, is really only worth seeing in Big Three D Vision), but the other egregious sin that Transcendence commits is that it's just freaking boring. Literally boring.
So, spoilers, but you shouldn't care, because you weren't going to see this movie anyway, and if you were going to see it, I hope someone was paying you to do it and that you literally had nothing better to do with your time.
Things that are a) funny, b) wrong, c) lazy, or d) stupid about Transcendence:
- the hilarious bit where Johnny Depp's character is putting chicken wire up in the garden to create a sanctuary (foreshadowing!) because the copper chicken wire forms a Faraday cage, and his wife's all: you know, you could just turn off your phone.
- the languid slow-motion cinematography of water droplets (foreshadowing!)
- the TED-style conference with fake WIRED magazine covers, which we can all agree were done a lot better when Tony Stark was on the cover of every magazine in Iron Man[4]. It's all so lazily done because it assumes no knowledge on the part of the audience, who probably *have* seen a TED talk or at least heard of one, but no: the speakers at the Electrical Engineering Or Whatever conference have to hold mics and use a combination of really distracting full-motion background video in their presentations and stupid equations that connote "science!"
- the kind-of reference to some sort of dated pop-culture zeitgeist with a luddite group that's *really fucking obvious* about the assassination attempt on Johnny Depp, and then shoots him anyway, and hey, have you heard of polonium?
- all scientists involved in AI research - which has been making lots of progress, by the way, and the government is totes interested in it and funding lots of it (ha!) - suffer some sort of crisis of conscience apart from Johnny Depp and his wife, Rebecca Hall. And Morgan Freeman's in it, to lend some gravitas and humanity and reasonableness. And Paul Bettany, who is also in it, and a traitor to the AI cause
- you feel like you need to take a drink when someone casually mentions "strong AI" because, hey, you only need to say that two thirds of the way through the movie, and only as a side reference
- and you really feel like you need to take a drink when Rebecca Hall, trying to, er, decipher the uploaded data that was Johnny Depp reading out words from the OED while having electrodes stuck in his brain, announces that she's "tried everything: cryptography... coding... nothing's working"
- all the monitors display scrolling code, because hey, er, many eyes make bugs shallow?
- the luddites never really make a convincing argument against what a strong AI might do, instead only saying that machines should be servants of humans, and not the other way around. It's Depp's actions as a Physically Independent Neural Network that scare them; not that they ever try to talk or reason with him anyway.
- It's a physically independent neural network that relies upon physically located, specialized "quantum processors" that look pretty bad-ass because they glow and aren't black
- seriously, they deal with the whole Turing test by Morgan Freeman asking an AI "Can you persuade me that you're self-aware?" and the AI says "Well, that's a difficult question, can you?" and everyone's all high-fives, yay, we crushed it, strong AI woo!
- nanotechnology and grey goo
- computers are evil, but honestly, the one good quote they could've used to tell the story was Eliezer Yudkowsky's, on how an optimising algorithm is pretty indifferent to the long-chain carbons that compose you, me and every other living thing on this planet until it wants/needs to use those long-chain carbons for something else, like, er, computing power
- I wasn't kidding about the Lawnmower Man thing. Depp literally says "I need more power!" and then you cut to them building a data center in the desert. And there's high-frequency algo trading. And Depp is all "I need access to the internet!"
- Internet access montage! That montage is going to be awesome on Blu-Ray. There is practically no new-aesthetic style imagery of how Depp perceives the world.
- ultimately, this is all Rebecca Hall's fault, because she is a woman and she loves her man and she doesn't listen to any of the reasonable men in her life who are trying to tell her: hey girl, you just woke up a strong AI and we're not really down with that. Because she loves Johnny Depp, see, and can't you see that? Can't you see his soul in the machine? He's real! Apart from when her newly uploaded husband is all: hey, I made these people stronger and fixed them, and also I installed GoToMyPC in them because I saw a banner ad for it, and I can remote-control them now, isn't that neat?
- And here I was thinking we'd see an interesting cinematic representation of Greg Egan's introdus[5], but no, not really, it's super boring
- I find it hard to believe that someone could build, like, a giant data center in the middle of nowhere and a) James Bridle wouldn't notice, b) Andreessen Horowitz wouldn't notice, or c) Ben Thompson wouldn't stick it in a newsletter somewhere
- "He's modifying his own code!"
- Depp's visual representation (and his later physical one) literally closes his eyes and concentrates when he's, er, coding. Or interacting with the real world. Or something.
- How do you take down a hostile computer system? Virus. Oh, and shutting down the entire internet.
- In the flashforward at the beginning of the film, Paul Bettany is all being Not Jarvis and saying how there are still pockets of phone service in parts of the country, betraying a complete lack of understanding as to how telecommunications systems work these days, in favour of your Standard Hollywood Technologyless Dystopia But Not Really Because Hey Look At This Imagery of People But Maybe Not Because There's A Soldier.
It's easy (really, it is) to throw peanuts from the sidelines about this movie and to say "well, Hollywood could've written a much better movie" without actually, you know, writing it. But they could have. And the reason is that everything you see in Transcendence is stuff you've seen before, in everything from Lawnmower Man to Ghost in the Machine to All Those X-Files episodes, to that Battlestar Galactica spinoff series, to Steven Spielberg's AI. There is *nothing* new here, other than just the idea of the singularity, which we don't even get to see that much of anyway, because it doesn't feel real, because we don't see the rest of the world reacting to it, because they're out in the middle of nowhere.
Bad film, Wally Pfister (and you too, executive producer Christopher Nolan). No biscuit for you.
[1] Transcendence (Amazon Blu-Ray pre-order: http://amzn.to/1hOdc4M)
[2] Lawnmower Man (Amazon: http://amzn.to/1rCOOEq, iTunes: https://itunes.apple.com/us/movie/lawnmower-man/id310466742?uo=4&at=11ly9m)
[3] Her (Amazon: http://amzn.to/1fuZp46, iTunes: https://itunes.apple.com/us/movie/her-2013/id810314926?uo=4&at=11ly9m)
[4] Iron Man (Amazon: http://amzn.to/1nEjdVa, iTunes: https://itunes.apple.com/us/movie/iron-man/id688163154?uo=4&at=11ly9m)
[5] Diaspora (Amazon: http://amzn.to/1mVLhDx, iTunes: https://itunes.apple.com/us/book/diaspora-multimedia-edition/id673540743?mt=11&uo=4&at=11ly9m, Powell's: http://www.powells.com/biblio/62-9781597805421-0, Abe Books: http://www.abebooks.com/servlet/SearchResults?isbn=9780752809250&cm_sp=mbc-_-9780752809250-_-all)
2.0 Broom-Shaped Objects
Or, things pretending to be other things.
Fred Scharmen sent a good note in reply to episode sixty two[1] about my throwaway phrase about evolutionarily unsecured back doors in the way our brains work (or: unconscious biases and all that kind of stuff, as in Thinking, Fast and Slow[2]). He told a story about needing to buy a broom for the first time, so he bought the cheapest thing the store had, which fell apart after about three uses: it wasn't a broom - it was a broom-shaped object, "designed to explicitly hack the pattern recognition faculty, and trick dumb college students into buying it."
And it feels like that's one of the fallacies here: things that look like other things, but have other jobs or agendas. In the context of physical objects, it's shoddily made stuff that's designed to trick you once, maybe, because it looks like a broom, but doesn't work the way the broom-object is supposed to work.
So Fred's worried about "an internet of broom-shaped things, designed by agents actively interested in hacking empathy."
My first response to that was, again, because this is a current preoccupation of mine, the empathy hacking that makes up both advertising and corporate social media policy. The former isn't that novel: sure, effective advertising works by attempting to form a low-level emotional bond with you. Corporate social media policy is even worse, in a way: the orthodoxy is that social media is a conversation, so what you've got to do, as a corporation, is have a conversation - or fake having a conversation - with your audience. The funny thing here is that you've essentially got stuff like Brand Guidelines and a Brand Voice going on, and a bunch of internal or outsourced social media staff, and, er, a sort of Searle's Chinese Room[3] for Twitter. Side note: start a social media company called Searle's Chinese Room.
The thing is, when you have a thing that's pretending to have a conversation but isn't actually capable of having one, then again, that's pretty sociopathic. A bit of the organism that, in order to "do better business", learns how to form connections with people in the hope of advancing the whole, without *actually understanding* the people it's dealing with.
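If you wanted to be really literal about it, you could boil the whole playbook down to a toy sketch like the one below (every keyword and canned reply here is invented for illustration, not anyone's actual social media tooling): a lookup table of brand-voice responses keyed on whatever the pattern-matcher thinks it detected, with no understanding anywhere in the loop.

# A toy Chinese Room for Twitter: symbols in, symbols out, zero comprehension.
# All keywords and replies below are made up for illustration.
CANNED_REPLIES = {
    "angry": "We're so sorry to hear that! DM us and we'll make it right.",
    "happy": "Love this! Thanks for being part of the family!",
    "question": "Great question! Check out our FAQ: example.com/help",
    "default": "Thanks for reaching out!",
}

ANGRY_WORDS = {"broken", "terrible", "refund", "worst"}
HAPPY_WORDS = {"love", "great", "awesome", "thanks"}

def brand_voice_reply(tweet: str) -> str:
    """Map an incoming tweet to an on-brand canned response via keyword lookup."""
    words = set(tweet.lower().split())
    if words & ANGRY_WORDS:
        return CANNED_REPLIES["angry"]
    if "?" in tweet:
        return CANNED_REPLIES["question"]
    if words & HAPPY_WORDS:
        return CANNED_REPLIES["happy"]
    return CANNED_REPLIES["default"]

print(brand_voice_reply("My order arrived broken, this is the worst"))
print(brand_voice_reply("Do you ship to the UK?"))

The room passes notes back and forth all day, the brand sounds friendly and responsive, and nothing inside it has the faintest idea what a refund actually is.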
You could make the argument that humans are pretty much Chinese Rooms anyway (and, er, if you're prone to introspection, rumination and generally depressive thoughts it's probably best not to think about that too much) and of course corporations are Chinese Rooms, but there's something about the bit when the Chinese Rooms start pretending to be people that feels a bit squicky.
Of course, that raises the question: how do you transparently build things that *do* leverage our ability to empathise with things and recognise faces and all that stuff, while at the same time not being trapped in an uncanny valley, or, actually, just deceiving or manipulating people? It feels a little like the more we learn about how our brains work, the more we need to be dealt a user's manual, or a sort of security/vulnerability disclosure: Congratulations! You are a newly instantiated human being! If you are able to read and understand this video, then it means you are susceptible to the following attacks on your consciousness...
[1] Episode 62: http://newsletter.danhon.com/episode-sixty-two-wearables-unworn-look-at-what-they-want/
[2] Thinking, Fast and Slow (Amazon: http://amzn.to/1iyblgm, Powell's: http://www.powells.com/biblio/9780374275631)
3.0 Odds
Bud Caddell has kindly written a response[1] to my critique[2] of the Responsive OS manifesto. We're going to grab a friendly coffee some time, but he's got good points that I'm going to think about and respond to.
[1] http://responsive.org/2014/04/on-criticism/
[2] http://newsletter.danhon.com/episode-sixty-three-disbanded-the-responsive-os/
I was lucky enough to grab lunch with Robin Sloan today and my brain is predictably fizzing post-conversation, so I'm excited about the next few episodes. Let's just say that he converted me with his love of libraries and books.
The lineup at Theorizing The Web[3] looked amazing, and one of the outputs was Princeton University assistant professor of sociology Janet Vertesi's attempts at hiding her pregnancy from, well, the web. If anything, it's interesting what it would take, from a regular person's point of view, to practice informational operational security.
[3] http://theorizingtheweb.tumblr.com/2014/program
I had a good note from Liz Henry in response to the episode about NGO design for products and services, rightly asking what the baseline should be for "social good" objects, and whether there's a responsibility for them to be dealt with in a copyleft, "for the good of humanity" fashion. Which stuck with me because it felt very Long Now-ish[4], and at least a bit SFnal.
I wrote in episode 55[5] about the good job I thought If This Then That had done communicating the Heartbleed bug and what they had done about it. In a similar vein, I think GitHub's *follow-up*[6] post to their frankly terrible original[7] post regarding Julie Horvath's departure was another good example of empathy-led communication. Note: I'm not saying that by communicating in this way GitHub have changed their ways, but I'm happy to treat it as a weak signal of improvement.
[5] http://newsletter.danhon.com/episode-fifty-five-sharing-heartbleed-a-weakness-of-the-heart-zero-day/
--
OK, big day tomorrow at FBF8. Expect Facebook-related reckonings.
As ever, I welcome your notes, and pinky-swear promise that I reply to them. So send me some!
Best,
Dan