Episode One Hundred And Thirty Seven: Human Extinction; Let There Be Jobs
0.0 Station Ident
Feeling: weird. Empty. Hiding from a house that has cleaners in it. My wife and son left this morning on a family vacation back to the farm in Missouri - there's a steam fair(!) on, and my son, being nearly a year and a half old, is already displaying the predictable signs of being fascinated by machinery like diggers, trucks and a cement mixer toy that he plays with. So a steam fair is going to be like crack to his little growing brain.
Reading: Transmetropolitan, Supreme: Blue Rose and Hawkeye #19.
Watching: Guardians of the Galaxy, finally. I am Groot!
1.0 Human Extinction
Well, not new words as such, more new phrases that have stuck in my mind as being typically five-seconds-into-the-future ways of looking at things that snap your brain just so. One of them was in the comments feed of a friend's Facebook post, making the point that they didn't so much talk about "Climate Change" anymore, because it's pretty hard to have empathy for a climate (zing!), and instead talked about human extinction.
This kind of stuck around in my head, a weak firing pattern of neurons that was just hanging around, slowly fading away until it received reinforcement when I met up with Ken Eklund[1] of alternate reality game World Without Oil[2] and climate change playable design fiction FutureCoast[3] for a chat and a catch-up. We were talking about how to make the future sound real because, at least from the point of view of our somewhat vague lunchtime chat, it can be hard to get your head around a five degree Celsius change by the year 2100[4]. No, you need a more fundamental understanding of how environmental change is going to affect you, whether it's chalk lines on the ground[5] or projections onto buildings[6].
This is the science communication bit - Peter Watts talks about this in a recent interview[7] over at Clarkesworld, where he has a thoroughly depressing view of whose fault it is (not that it's particularly productive to be talking about fault this late in the game) that the public doesn't understand serious scientific issues like climate change. He blames neither scientists (who are trying harder, and have a responsibility to try harder) nor their public audience, because the latter, at least, have been studied in peer-reviewed fashion:
"[it's] been pretty firmly established that Human Nature is so rife with Confirmation Biases and Backfire Effects that even if you present someone with ironclad, irrefutable, expert evidence that their cherished beliefs are wrong, they’ll just dig in their heels and clutch those beliefs even closer to their bosoms (bosa? bosii?), while at the same time vilifying the expert who contradicted them. It’s not that they don’t understand the arguments; it’s just that they’ll reject anything that’s inconsistent with their preferred worldview."
So, you know: facts (or even Strongly Held Theories) don't help people change their minds.
Eklund reminded me that you treat your future (and past) self as literally a different person. The me-in-30-years thing is interesting, because you acknowledge that me-in-30-years *is* a different person. And yet you don't care as much about them as you probably should, which is why you keep making all those short-term decisions. You hardly have enough empathy for yourself, never mind yourself in a future that you find hard to imagine.
And so: techniques and strategies for people to engage more with the future, never mind the present. The present is doing just fine. The present is something that we can't do very much about. But the future, that's something that we can change, if we want it badly enough. And it's an unfamiliar place that we stumble into, half-blindly, without necessarily knowing that there are futures that we can imagine, create and then also choose.
And from that same Peter Watts interview, the phrase about one of his novel's characters being "traumatically rehumanized" - *re*humanized, not de-humanized. And in response to a specific threat, too. Almost like a MIL-SPEC injection of weaponised empathy for the safety and security of the human race. (If that isn't a blatant clue that you should go read Blindsight, here it is: you should go read Blindsight[8]. It's free online, too.)
So. Those are your two activation phrases for today: "traumatically rehumanized" and "human extinction".
[1] Ken Eklund
[2] World Without Oil (might be down, otherwise try World Without Oil at Wikipedia)
[3] FutureCoast
[4] We're fucked
[5] High Water Line
[6] Watermarks
[7] Human Nature: A Conversation with Peter Watts
[8] Blindsight
2.0 Let There Be Jobs
The accusation is that the robots are stealing our jobs, and the reply is usually that new jobs will be created. That's institutional thinking, right? I mean, we're creating robots that can do things that we can do, and we're worried about a) them taking away our jobs, and b) us not being able to come up with new jobs to do, when instead you could neatly sidestep the entire issue and just say: well, those jobs are gone! Awesome! Let's all go to the beach.
But we can't go to the beach. We have to invent new jobs.
This morning (I think?) a link to this New York Times Upshot article popped up in my stream[1], itself based upon a Pew Research "report" (more "we got in touch with a bunch of people to ask what they reckon") titled AI, Robotics and the Future of Jobs[2].
I mean, if we've got all this fantastic productivity lying around, then do we *really* have to invent jobs? It seems like all this technology is coming about and what we're inventing is a terribly productive and yet capitalist culture where we're working all the time - there's an appreciable subset of the reckoners polled who genuinely wondered about a) what people were going to do with all of their time; b) all of the replaceable meatpuppet jobs at the low end; and c) the complete disappearance of "jobs" in the middle.
One of the quotes that I did like was from Tim Bray:
“It seems inevitable to me that the proportion of the population that needs to engage in traditional full-time employment, in order to keep us fed, supplied, healthy, and safe, will decrease. I hope this leads to a humane restructuring of the general social contract around employment.”
Where, of course, the emphasis is on "hope" and "humane".
I'm not sure there have been that many positive examples of wholesale restructuring of "the general social contract around employment", but I suppose the mark of civilisation and progress is that this time, there's a chance that people won't have to die.
So, here's your challenge: boot up a post-capitalist but still firmly-rooted-in-scarcity society - only, you know, not Thiel's seasteading institute. The other kind.
[1] Will You Lose Your Job To A Robot? Silicon Valley Is Split
[2] AI, Robotics and the Future of Jobs
--
An early start tomorrow: a 6:30am flight and then another one, ultimately ending up somewhere in Texas.
End-of-file,
Dan