s4e09: Earl Grey, Hot (again, probably) 

by danhon

1.0 Station Ident

Wednesday, March 22, 2017 in the afternoon. Despite everything that’s going on in the world, I’m feeling pretty good today after taking time to look after myself. I’m (re)learning that the principle behind being told to put your own mask on before you put on anyone else’s applies to more things than just catastrophic airline accident safety.

A dumb one, today. Or maybe not. It was just something that made me think.

2.0 Earl Grey, Hot

I was thinking, as you do, about Starfleet’s User Research division the other day[0].

Ok, so really: how would the computers in Star Trek: The Next Generation really work? And in particular, how do the replicators work? Take this typical exchange:

PICARD walks up to a replicator.

PICARD: Computer. Tea. Earl Grey, Hot.

The computer replicates tea. PICARD drinks the tea, and it is good.

How does this work? Has Picard previously spent time with Computer, in something like the following fashion?

PICARD: Computer. Tea. Earl Grey, Hot.

COMPUTER: There are over two thousand known varieties of tea called “Earl Grey”. Which “Earl Grey” did you mean?

PICARD sighs.

I mean, in this case, we have a replicator, so the rough idea is that it can replicate *anything* that it has a pattern for. Let’s just assume that the computer and the replicator have patterns for anything. Any tea that has ever existed, for example. So the problem isn’t just telling the computer what you want. The problem, when you have a computer that can replicate anything, is how you tell the computer the *right* thing to replicate. And, in the universe of Star Trek, this looks like it’s mainly accomplished through a voice user interface with the occasional tapping at an LCARS display.

So, do we try again? Does Picard ask for Tea, the computer says “What Kind?” and presents him with… what, a taxonomy of all known hot drinks made from boiled leaves? Picard did say “Earl Grey”, so the computer knows to narrow it down to, well, all the Earl Greys that it knows about. Maybe Picard likes Twinings? Is there only One True Earl Grey in the Star Trek universe?

And how hot is hot, anyway? At some point did Picard try some tea and it was *too hot* or *not hot enough*? Did he throw it away and tell the computer, “Computer, make the same tea, but hotter”? How much hotter?

I can imagine that at the end of this, Picard and the Computer came to some sort of tacit understanding or arrangement and that he now has a personal preference saved (under, say, Picard Tea Preference Alpha Four Seven) and that Picard either explicitly told Computer to save these settings, or that Computer implicitly did it on his behalf in an effort to meet his tea needs.
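If you squint, that arrangement might look something like the toy Python sketch below. To be clear, this is pure speculation: every name, field, and temperature in it is made up, and nothing here claims to be how a fictional 24th-century computer actually works.

```python
# An entirely hypothetical sketch of the computer saving a named preference
# once Picard and it converge on what "Tea. Earl Grey. Hot." means.
replicator_presets = {}

def save_preset(name, pattern, **overrides):
    """Store a replication pattern plus the user's tweaks under a name."""
    replicator_presets[name] = {"pattern": pattern, **overrides}

def replicate(request):
    """Resolve a spoken request to a saved preset, if one exists."""
    preset = replicator_presets.get(request)
    if preset is None:
        raise LookupError(f"Unable to comply: no preset for {request!r}")
    return f"Replicating {preset['pattern']} at {preset['temperature_c']}C"

# Whether Picard said "save these settings" or the computer did it
# implicitly on his behalf, the end state is the same:
save_preset("Picard Tea Preference Alpha Four Seven",
            pattern="tea, Earl Grey", temperature_c=85)
```

The interesting design question is the implicit-versus-explicit save: a preset Picard never asked for is exactly the kind of built-in assumption that surprises you later.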

And that’s just for tea! Get this:

BARCLAY is in his quarters and feels like a snack. He walks over to the replicator.

BARCLAY: Computer. Salt and vinegar crisps, please.

COMPUTER replicates some salt and vinegar crisps. BARCLAY tries one.

BARCLAY: Ugh! Computer! Too salty! Less salt, please.

COMPUTER replicates some salt and vinegar crisps. BARCLAY tries one.

BARCLAY: Ugh! Computer! These crisps are not vinegary enough. More vinegar, please.

COMPUTER replicates some salt and vinegar crisps. BARCLAY tries one.

BARCLAY: Ugh! Computer! While these crisps are sufficiently salty *and* vinegary, they are not crunchy enough! Crunchier, please.

COMPUTER replicates some salt and vinegar crisps. BARCLAY tries one.

BARCLAY: Ugh! Computer! While these crisps are very crunchy now, they are too heavy! Lighter, please.

COMPUTER replicates some salt and vinegar crisps. BARCLAY tries one.

BARCLAY: Ugh! Computer! These are not entirely unlike salt and vinegar crisps and now I am not hungry. Also I am late for my meeting with Commander La Forge. We’ll continue this later. Save our progress as Barclay Salt and Vinegar Crisps Preference Alpha Two.

At this point, it’s pretty clear that Arthur encountered this exact problem when he tried to have some tea on the Heart of Gold.
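Barclay’s ordeal is really an iterative-refinement loop: each complaint nudges one parameter, and the half-finished state gets saved under a name. Here’s a hedged sketch of that loop; the parameters, scale, and step sizes are all invented for illustration.

```python
# Barclay's feedback loop as a toy refinement process. Each complaint maps
# to a (parameter, delta) nudge on a completely made-up 0-10 scale.
crisps = {"salt": 8, "vinegar": 3, "crunch": 5, "weight": 7}

adjustments = {
    "less salt":    ("salt",    -2),
    "more vinegar": ("vinegar", +2),
    "crunchier":    ("crunch",  +2),
    "lighter":      ("weight",  -2),
}

def refine(recipe, complaint):
    """Return a new recipe with one parameter nudged per the complaint."""
    key, delta = adjustments[complaint]
    recipe = dict(recipe)
    recipe[key] += delta
    return recipe

for complaint in ["less salt", "more vinegar", "crunchier", "lighter"]:
    crisps = refine(crisps, complaint)

# "Save our progress" before running off to meet Commander La Forge:
presets = {"Barclay Salt and Vinegar Crisps Preference Alpha Two": crisps}
```

Notice the loop never converges on its own: the only stopping condition is the user giving up, which is more or less what happens to Barclay.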

There are more issues, though! Someone on Twitter suggested that maybe the Federation gets around this by modeling the taste receptors of the requesting entity (which, again, is what the Nutri-matic Drinks Synthesizer said it did), at which point you wonder – are Federation computers running high-fidelity, non-sentient simulations of people *just so that they can replicate the right kind of food*?

I’d like to think that this silly thought experiment at least highlights some of the issues (although they’re probably glaringly obvious to researchers and practitioners in the field) with voice user interfaces that aren’t domain restricted. Or, in other words: this is exactly why talking to Alexa is as infuriating as it is helpful.

It’s infuriating when Alexa clearly understands the request (in our kitchen, it’s “Alexa, play music from Daniel Tiger’s Neighborhood”), and then replies that she can’t find any music by “Daniel. Tiger’s Neighborhood” – but if you say “Alexa, play music from Daniel Tiger’s Neighborhood from Amazon Music” she’ll happily do it.

The old way of talking about all of the above is the problem of computers doing what I *say* instead of what I *mean*. When we imagine computer interfaces and how they might be helpful – and especially when we get inspired by ones we see in fiction – we don’t see all of the built-in assumptions about how these things might fall apart.

The dumb insight here is that voice computing works when preferences are known (maybe?). It’s one thing to use a computer to ask for tea if the computer knows *exactly how to make your tea*, but if you’re asking the computer to replicate a new chair for your quarters, it might be easier to use a PADD to scroll and swipe through the *literally gazillion* chair options available to you, and after that you can say “replicate that chair you replicated for me that other Stardate”.
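That dumb insight fits in a few lines of toy Python. Again, everything here is invented – the preference key, the pattern name, the catalog size – it’s just the shape of the decision: voice resolves a request only when a stored preference pins it down, and anything ambiguous gets handed off to browsing.

```python
# Voice works when a saved preference makes the request unambiguous;
# otherwise, punt to the PADD. All names and numbers are made up.
preferences = {"tea. earl grey. hot.": "earl-grey-pattern-47"}

CHAIR_CATALOG_SIZE = 1_000_000  # the *literally gazillion* chair options

def handle_voice_request(utterance):
    pattern = preferences.get(utterance.strip().lower())
    if pattern is not None:
        # Known preference: the utterance maps to exactly one pattern.
        return f"replicating {pattern}"
    # No preference on file: scroll-and-swipe is the better interface.
    return f"{CHAIR_CATALOG_SIZE} matches; please browse on a PADD"
```

The asymmetry is the point: the first branch is a lookup, the second is a search problem, and a voice interface is only pleasant for the first one.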

A related thought about Starfleet and its vision of enterprise computing[1] is that in the NCC-1701-D’s anthropocentric view, the computer replies audibly to everyone’s request. The assumption here is that everyone on the bridge can hear what the computer says, and that it’s useful for everyone to hear it. This breaks when you get Ensign Folami, a new non-human ensign on the bridge fresh from Starfleet Academy whose hearing is only in the infrasound range (but is perfectly normal from xi’s point of view), and suddenly the computer has to not only vocalize in response to humans but also produce a surprising bass thrum so that Ensign Folami can hear what’s going on.

[0] https://twitter.com/hondanhon/status/844275764149170176
[1] Sorry

OK. Lunchtime over. Have a good day, and thank you to those who have been sending notes!