The slab and the permacomputer
This is a note intended to lay out something that’s lately clicked for me. Here are three glimpses of the future of computing that all seem to “rhyme”:
1. Cloud functions
I wrote about my experience with Google Cloud Functions back in the spring; for me, these represent the “perfection” of the AWS/GCP model. Their utility snuck up on me! At this point, I have about a dozen cloud functions running —
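To make the appeal concrete: a cloud function can be startlingly small. Here's a hedged sketch in the style of Google Cloud Functions' Python runtime — the real runtime hands your function a Flask request object; to keep this runnable standalone, a plain dict of query parameters stands in, and the function name is my own invention.

```python
def word_count(params):
    """A toy HTTP-style cloud function: count the words in a `text` parameter.

    In the real Cloud Functions runtime this would receive a Flask request
    and read `request.args`; a dict stands in for that here.
    """
    text = params.get("text", "")
    return {"words": len(text.split())}
```

That's the whole program. The platform supplies the server, the scaling, and the URL; you supply a function.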
2. Colab notebooks
I’d heard about these forever, but it’s only in the past year that I’ve used them, and they have in that time become indispensable. The fusion of “document” with “program” AND “environment” is frankly dizzying; using Colab feels more futuristic than just about anything else I do in a browser. (I should add that I’m terrible at Python, but part of the appeal is that you can be terrible at Python and still get a lot done in these notebooks.)
You might reply, “surely, Robin, you’re just saying that you admire IPython and Jupyter notebooks generally” — but I’m not, really.
The astonishment of the Colab notebook is that it comes with a powerful computer attached, instantly, for free! You can buy a subscription to make that computer even more powerful, which I do, happily.
Of course, I do recognize Colab’s lineage, and I’m grateful for all the labor behind IPython and Jupyter. This recent interview with IPython’s creator, on the occasion of its 20th anniversary, is a wonderful story of invention. Fernando Pérez says:
[My mentor] had the patience to let me “productively procrastinate” by building IPython, something that I could somewhat justify as a tool for finishing that dissertation. I regained some much needed confidence, I got attracted to building something, and it turned out to be really important.
For me, the magic is in the specific combination of Jupyter’s affordances with Google’s largesse: you open a new tab, and poof, it’s a document with a powerful computer attached.
3. World computers
All of the blockchains, particularly those that depend on costly proof-of-work algorithms, strike me as deeply aesthetically ugly; these are systems with thrashing waste at their core, by design. Clever, maybe, but not elegant.
Even so, I can’t deny that Ethereum’s “world computer” is interesting and, even more than that, evocative. The Ethereum blockchain is one entity, shared globally, agreed upon by all its participants: that’s what makes it useful as a ledger. The Ethereum Virtual Machine, a kind of computer —
As with a lot of things in crypto, the feeling is as much mystical as it is technical. I understand why people get excited when they deploy an Ethereum contract: it feels like you are programming not just a computer, but THE computer. That feeling is technically wrong; it is definitely just a computer; but since when did the technical wrongness of feelings prevent them from being motivating?
I think these are glimpses of an accelerating reformulation of “computers”—the individual machines like my laptop, or your phone, or the server whirring in the corner of my office —
I like “slab” better than “cloud”, both for its sense of a smooth, opaque surface and its suggestion of real mass and weight. That’s the twist, of course: cloud functions and Colab notebooks and Ethereum contracts DO run on “computers”, vast armadas of individual machines taking up real physical space, venting real hot air. A responsible user of these systems ought to remember that, but … only sometimes. Power outlets also conceal gnarly infrastructural realities, real mass and weight, and a person ought to be aware of those, too —
The idea that “computers” might melt into “compute”, a utility as unremarkable as electricity or water, isn’t new. But I do feel like it’s suddenly melting faster!
For me, a more useful analogy than electricity is textile manufacturing, which was, a couple centuries ago, THE high-tech industry; innovations in mechanical weaving were close to the core of the industrial revolution. Today, aside from the weird technical fabrics that are like, bullet-proof and opaque to cosmic rays, textile manufacturing isn’t considered high-tech: it’s just … industry, I suppose. Textiles are produced with extreme efficiency in huge, matter-of-fact facilities. Move along! Nothing to see here.
I recently read David Macaulay’s book Mill, illustrating the construction and growth of a textile mill in Rhode Island in the early 1800s, and, I’ve got to tell you: Macaulay’s mill looks and feels like a data center.
They put data centers near rivers, too!
For me, this raises the analogical question:
Textiles in 1800 : textiles in 2020 :: computers in 2020 : ???
I mean, I am betting the ??? is a slab —
The dutifully critical part of me wants to shout: you shouldn’t trust these slabs! Their operators, G —
There are other endings, too: even now, the slabs occasionally flicker offline, and it’s not difficult to imagine a seriously hard crash, one that lasts a long time, caused by either an accident or an attack. So much for my terra cotta warriors.
Then again … internet trunk lines run alongside railroad tracks. Won’t the slab operators and their infrastructure still be with us in a hundred years, in SOME form, just as the railroads are today? I would guess yes, probably.
So, I think maybe we —
First, if somebody offers you a seamless slab of compute and says, here, take a bite: sure, go for it. See what you can make. Solve problems for yourself and for others. Explore, invent, play.
At the same time, think further and more pointedly ahead. There’s an idea simmering out there, still fringe, coaxed forward by a network of artists and hobbyists: it’s called “permacomputing” and it asks the question, what would computers look like if they were really engineered to last, on serious time scales?
You already know the answers! They’d use less power; they’d be hardy against the elements; they’d be repairable —
Plenty of computers were like that, up until the 1980s or so; but permacomputing doesn’t mean we have to go backwards. The permacomputers of the future could be totally sophisticated, super fast; they could use all the tricks that engineers and programmers have learned in the decades since the Altair 8800. They would just deploy them toward different ends.
As a concrete-ish example, I think this project from Alexander Mordvintsev is lovely, and totally permacomputing:
Alexander is the discoverer, in 2015, of the “DeepDream” technique, an early —
Earlier this year, Alexander released a stripped-down implementation of DeepDream written in a vintage dialect of C, his code carefully commented. This version runs on a CPU, not a GPU. It does so very slowly. Who cares? It whorls its eyeballs eventually, even on the humblest hardware. You could run Alexander’s deepdream.c on a Raspberry Pi. You could probably run it on a smart refrigerator.
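The core trick of DeepDream is simple enough to sketch in a few lines: treat some layer's activation as a score, and nudge the image's pixels uphill along the gradient of that score. This is a toy illustration, not Alexander's actual code — the "activation" here is just horizontal pixel differences standing in for a real network layer.

```python
import numpy as np

def activation(img):
    """Stand-in for a network layer's response: horizontal differences."""
    return img[:, 1:] - img[:, :-1]

def score(img):
    """DeepDream-style objective: sum of squared activations."""
    return float((activation(img) ** 2).sum())

def dream_step(img, lr=0.1):
    """One step of gradient ASCENT on the pixels, as DeepDream does."""
    a = activation(img)
    # Analytic gradient of score() with respect to each pixel.
    g = np.zeros_like(img)
    g[:, 1:] += 2 * a
    g[:, :-1] -= 2 * a
    # Normalize the gradient before stepping (a standard DeepDream trick).
    return img + lr * g / (np.abs(g).mean() + 1e-8)
```

Run `dream_step` in a loop and the score climbs every iteration; in the real thing, that climb is what hallucinates the eyeballs.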
The implementation does depend on a single pre-trained model file, produced at (then-)great expense by many computers with very fast GPUs. I find this totally evocative: it’s easy to imagine future permacomputers that rely, for some of their functions, on artifacts from a time before permacomputing. It would be impossible, or at least forbiddingly difficult, to produce new model files, so the old ones would be ferried around like precious grimoires …
(For the record, I already feel this way about some ML model files: whenever I find one that’s interesting or useful, I diligently save my own copy.)
Even if it turns out you never need a permacomputer, you’ll be glad you thought about them. Powerful forces are pushing computing toward vast, brittle, energy-hungry systems that are incomprehensible even to their own makers; I should know, because I am a small constituent part of these forces. Given such pressure, even a faint countervailing wind is precious.
The sailing/computing duo Hundred Rabbits are pilgrim-poets of permacomputing. Their Uxn project is a clever 8-bit computer design that can be built or emulated in a variety of ways, including on old, recycled hardware.
Of Uxn, they write:
With only 64kb of memory, it will never run Chrome, TensorFlow or a blockchain. It sucks at doing most modern computing, but it’s also sort of the point. It’s more about finding what new things could be made in such a small system.
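To get a feel for how little machinery a small computer needs, here is a toy stack machine in the spirit of Uxn — emphatically NOT Uxn's actual instruction set (its real opcodes and encoding differ; see the Uxntal documentation), just a sketch of the genre: a handful of opcodes, 8-bit values that wrap at 256, and a loop.

```python
# Toy 8-bit stack machine. Opcode numbering is invented for this sketch.
PUSH, ADD, MUL, HALT = 0, 1, 2, 3

def run(program):
    """Execute a list of opcodes/operands; return the final stack."""
    stack, pc = [], 0
    while True:
        op = program[pc]
        pc += 1
        if op == PUSH:
            stack.append(program[pc] & 0xFF)  # clamp operand to one byte
            pc += 1
        elif op == ADD:
            b, a = stack.pop(), stack.pop()
            stack.append((a + b) & 0xFF)      # 8-bit wraparound
        elif op == MUL:
            b, a = stack.pop(), stack.pop()
            stack.append((a * b) & 0xFF)
        elif op == HALT:
            return stack
```

A "program" is just a list of bytes: `[PUSH, 7, PUSH, 6, MUL, HALT]` leaves 42 on the stack. The constraint is the point — everything the machine can ever do fits in your head.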
Where does this leave us? I’m perfectly comfortable in the both/and. I accept the invitation of the slab; I benefit daily from the leverage it grants me. I am, at the same time, certain my functions and notebooks will be blown away before the decade is out; maybe just by the leviathan’s restlessness, or maybe by something more dire.
I’d like a permacomputer of my own.
Sent to the Media Lab committee in October 2021