s16e01: I too aspire for my memory to be safe
0.0 Context setting
It's Friday, 22 September, 2023 in Portland, Oregon and I am sitting in my study on a sunny day, the sun making a valiant effort to struggle through and past the window, the window screen, and what I would like to call an almost artisanal and hand-curated layer of dust.
I can see the shadows of leaves moving in the winds, and Taylor Swift is angry and swearing in the background. Siri, earlier, was as useless as usual when I wanted it to turn up the music in the living room.
Three things today, let's go see what they are.
1.0 Some things that caught my attention
1.1 Roofshots
Luiz André Barroso passed away recently. Barroso was known for lots of things -- he's credited with reinventing the data center at Google -- but one thing of his that caught my attention was his essay on roofshots[1] as a counter to the bigger, splashier moonshots that Google was particularly good at.
Caught my attention because: something to pull out of a pocket whenever you need an appeal-to-authority about incremental improvement -- even the giants do it.
1.2 I too aspire for my memory to be safe
CISA, the U.S. government's Cybersecurity and Infrastructure Security Agency, published something like a manifesto about the urgent need for memory safety in software[2].
One of the most common vulnerabilities in software, the buffer overflow, happens when you're dealing with some sort of input (say someone's sent you an email with an image, and that image is encoded in, oh, I don't know, webp) and, while you're busy reading in that webp image data (gobble gobble gobble), you accidentally eat too much. Before you know it you're spilling your guts and everyone can see what you had for lunch.
(I do not think anybody has ever made this analogy before. I'm not sure if I'm proud of it.)
It's worse than seeing what you had for lunch, though! I mean in this extended and now quite disgusting analogy, not only do you throw up all your food, but you also keep eating, and one of the things that you eat is some very complicated designer microbe that takes advantage of your gut-flora-brain system and before you know it, you're even-easier-than-cordycepsing your way to forking over your social security number and bank details (although quite why the former matters anymore I don't know -- at this point I wonder if you just assume that type of data has already been leaked).
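Stripped of the digestive metaphor, the classic mistake looks something like this -- a minimal sketch, written in Rust's `unsafe` dialect because safe Rust refuses to compile it, with every name invented for illustration:

```rust
// Hypothetical image parser: a fixed-size buffer, and a length taken
// from the input itself.
fn parse_image(input: &[u8]) {
    let mut pixels = [0u8; 64];    // we budgeted 64 bytes for the image...
    let claimed_len = input.len(); // ...but let the input say how much to copy
    unsafe {
        // If claimed_len > 64 this writes past the end of `pixels`,
        // scribbling over whatever lives next to it -- other variables,
        // saved return addresses. That's the spilled lunch.
        std::ptr::copy_nonoverlapping(input.as_ptr(), pixels.as_mut_ptr(), claimed_len);
    }
    // And because an attacker controls `input`, they control what gets
    // written where: that's the designer microbe.
}
```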
Anyway.
CISA is quite sick and tired of all of these zero-day vulnerabilities in software, now that it's clear that "software" is, well, infrastructure, and would like the people who make it to a) quit it, and b) perhaps use different languages that deal with memory in less unsafe and unsanitary ways.
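(For flavor, here's roughly what option (b) buys you -- my sketch, not CISA's, same invented parser as above: the mistake turns into either a recoverable "no" or a loud, controlled crash, never a silent scribble over neighbouring memory.

```rust
// The same hypothetical parser in safe Rust: the bounds checks are built in.
fn parse_image(input: &[u8]) -> Option<[u8; 64]> {
    let mut pixels = [0u8; 64];
    let chunk = input.get(..64)?;   // input too short? you get None, not corruption
    pixels.copy_from_slice(chunk);  // lengths are checked; a mismatch panics loudly
    Some(pixels)
}
```

No over-read, no spilled guts; the worst case is a refusal.)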
It's easy -- I think partly because the way we describe this stuff is by calling it infrastructure -- to come back to building/engineering/architecture analogies, so I wonder what the analogy is in this case. Is it as if we've been building with a certain kind of concrete for a long, long time (Unix was rewritten in C in 1972, so it is older than me and I feel good about that), and oops, we just found out it's riddled with holes and super easy to poke with a paperclip, and then, oopsy, we all fall down?
Caught my attention because: I have a (wry?) joke that the answer to the Fermi paradox is that writing safe and secure software is too hard, so most civilizations get wiped out before they're able to get sufficiently off-planet. This report also coincided in my head with the wonderful work that Yael Grauer's doing at, of all places, and most admirably, Consumer Reports(!), which published a report about memory safety[3][4] earlier this year, arguing specifically for the usage of memory safe languages like Rust -- which Microsoft is using to rewrite parts of Windows[5].
Now that I'm thinking of it, I'm going to pull on this thread about memory safety and the use of languages that aren't memory safe (and their prevalence) to look for other analogies that might be helpful or illuminating. What about microplastics, or PFAS/'forever chemicals'? Both of these share the quality of being teeny tiny and invisible. They're both, in their own ways, ubiquitous, and keep turning up, alarmingly, wherever we look.
Microplastics and PFAS accumulate -- does software accumulate? I suppose so, in the sense that stacks get taller and more complex, and abstractions increase in the name of, say, productivity and velocity and time-to-release. I suppose software also accumulates in the sense that it eats the world, and it's not like there's a uniform thin layer of software coating the world -- it feels lazy yet accurate to say there is a software layer across the world, and its depth/thickness is unevenly distributed.
While there are lab tests and statistical methods (what the hell do I know, I assume these exist) to understand and model the prevalence and distribution of materials like microplastics and PFAS in different environments, how might you go about understanding the prevalence of memory-unsafe software?
The CISA manifesto gives some sort of estimate by citing Microsoft's and Google's reports on the ubiquity of memory safety issues (around 70% of CVEs in Microsoft's case, and similarly around 70% in the Chromium project). Ars Technica has been doing some great reporting on this; a recent article went into great detail about how a memory safety issue in libwebp is likely behind a spate of zero-day vulnerabilities, and how incomplete disclosure means we (and developers and publishers) have no idea how far the libwebp issue goes[6].
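If you wanted a microplastics-style reading of your own, one crude approach -- entirely my assumption, not a methodology CISA or anyone else publishes in this form -- would be to take a feed of CVE records tagged with CWE identifiers and count the share that falls into the memory-safety classes:

```rust
// Back-of-envelope prevalence estimate. The CWE ids are real classes
// (out-of-bounds write/read, use-after-free, classic unchecked buffer copy);
// the data and the methodology are hypothetical.
fn memory_unsafety_share(cve_cwes: &[&str]) -> f64 {
    const MEMORY_SAFETY_CWES: [&str; 4] = ["CWE-787", "CWE-125", "CWE-416", "CWE-120"];
    let hits = cve_cwes
        .iter()
        .filter(|cwe| MEMORY_SAFETY_CWES.contains(*cwe))
        .count();
    hits as f64 / cve_cwes.len() as f64 // assumes a non-empty feed
}

fn main() {
    // Toy data standing in for a year's worth of CVEs.
    let sample = ["CWE-787", "CWE-79", "CWE-416", "CWE-89", "CWE-125"];
    println!("{:.0}% memory-safety", 100.0 * memory_unsafety_share(&sample));
}
```

The number you'd get on toy data is meaningless; the catch, per the Ars reporting, is that the tagging is self-reported and incomplete.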
And then there's the issue of "how much is safe"? When would you be done? It's not like we can talk about how unsafe software is or is not metabolized.
The other thing about this comparison is that the threat -- or danger[7] -- is diffuse. The issue here isn't like the mass-coordinated, time-limited engineering effort of the Y2K problem (which has now passed, unsurprisingly, into the realm of "no, really, it was real, and it was a lot of work, and things could have gone terribly, it's not just a joke"); instead it's, well, gestures everywhere. How do you update everything? How do you update enough? Do you issue a recall?
And even that notion, of a recall, brings to mind another connection. I was going through my bookmarks, and back in 2017 I noted a tweet[8] about an airbag having a CVE, and of course an airbag would have a CVE[9]. But this is the thin layer of software, and I suppose it only matters more when things are connected. (You ruined a perfectly good computer. Look, it's got significantly more easily exploitable zero-days up the wazoo.)
Ah, of course.
But how else could you possibly update things if you want to make sure they're patched?
(If you're talking about national security, this type of hard, networked software vulnerability is only one kind you might worry about. You might also worry about an asymmetric social cyber-attack[10] that plays off not just traditional hacking (emptying bank accounts) but the psychological aspects, too.)
We could pull on the pollution analogy more. Is the creation and distribution of unsafe software onto networks like an unsafe emission that affects our environment? Arguably, yes? Would it make sense to measure, like I was wondering above, the levels of such emissions of unsafe software? We're now fairly comfortable using metaphors like leaks and trailing clouds of data behind us, or even that data is the new oil, and not in the golden-age sense of being a fuel for all mankind.
If you do think about the distribution and use of unsafe software and see it as a threat to safety and security, then is the next step after CISA's call something like the equivalent of a Net Zero target for secure software? (What does net zero for software security and safety mean anyway, and does it even make sense as a concept, never mind a useful one?) Big audacious goals in security have been set before -- see Bill Gates' 2002 memo on Trustworthy Computing[11][12].
Actually, you know what, reading that, it's kind of funny. I mean, the 2002 memo also called out .NET (which was a big part of Microsoft's strategy at the time) for its approach to producing secure code. It's 21 years later! Microsoft last year celebrated 20 years of progress, and as I linked above, in 2019 70% of the CVEs it assigned were still memory safety issues.
One thing I think about, as an outsider and Definitely Not Someone Who Has Done Any Infosec And Is, Upfront, Just Some Guy Having Opinions And Thoughts, is that my recollection of "computers can be dangerous! be careful!" is stuffed full of things like "remember when that rocket blew up" or "hey, don't give people too much radiation, it's not good for them and then they die", but not necessarily things like "hey, guess what, Apple has 1.8 billion active devices now" and "turns out, if there's a memory safety problem in one library, surprise! there's a zero-day for 1.8 billion active devices! now!"
But, you know. In today's economy, you've got to ship, and shipping can't cost too much. After all, maybe your casino just got hacked and you can only afford $100 an hour to get it up and running again.
1.3 Apps
My internet friend Brad Barrish wrote about EV charging infrastructure and how terrible it is[13], and it is exactly as terrible as you would expect of something sitting at the intersection of the following:
- the post zero-interest-rate era
- "apps"
- Tesla
- "now you have n+1 standards"
Caught my attention because: there's a part where, to use a fast charger, he had to install an app, which presumably also requires you to create an account. Now this is just an assumption, but my assumption is that the onboarding process is a flaming pile of shit (sorry, I suppose, to the people involved in the EVgo app) and the reason it's a flaming pile of shit is, primarily, because of "money" and probably "data", closely followed by "a curious definition of privacy" and, now that I think of it, yet another slight thickening in the layer of software gunked onto the world. Now I'm just angry about "surprise and delight".
That's it. I've been away. I'm back. This was very hard to write, and even harder to finish. You wouldn't believe the number of unfinished drafts I've got lying around of newsletters that, for some inexplicable reason, now feel stale and pointless. But hey.
How are you? I've been terrible. But like I say, I'm back.
Best,
Dan
1. The Roofshot Manifesto, July 13, 2016, Luiz André Barroso, Google Fellow and VP of Engineering
2. The Urgent Need for Memory Safety in Software Products, September 20, 2023, Bob Lord, Senior Technical Advisor
3. New Report: Future of Memory Safety, January 23, 2023, Yael Grauer, Consumer Reports
4. Future of Memory Safety, Challenges and Recommendations (PDF), January 2023, Yael Grauer
5. Microsoft is busy rewriting core Windows code in memory-safe Rust, 27 April, 2023, Thomas Claburn, The Register
6. Incomplete disclosures by Apple and Google create "huge blindspot" for 0-day hunters, 21 September, 2023, Dan Goodin, Ars Technica
7. such danger which is still present in your time, as it was in ours
8. "CVE-2017-14937 shows airbags will deploy when told to deploy. They point out problems with security access." 23 October, 2017, Charlie Miller / @0xcharlie at https://twitter.com/0xcharlie/status/922653806537658369
9. See: The bogus CVE problem, 13 September, 2023, Jake Edge, LWN.net, and CVE-2020-19909 is Everything that is Wrong with CVEs, 26 August, 2023, Daniel Stenberg
10. What Cyber-War Will Look Like, 6 July, 2018, Tanner Greer, The Scholar's Stage
11. Memo from Bill Gates, https://news.microsoft.com/2012/01/11/memo-from-bill-gates/, January 11, 2002
12. Celebrating 20 Years of Trustworthy Computing, January 21, 2022, Aanchal Gupta, Corporate Vice President and Deputy CISO, Microsoft
13. EV charging infrastructure is a joke, 18 September, 2023, Brad Barrish