Can software platforms reverse enshittification?
Alex Komoroske and Sam Arbesman on the next ten years in tech
Software kind of sucks these days, doesn’t it? Cory Doctorow coined the word “enshittification” to describe a pattern he repeatedly observed across software platforms. They start generous and flexible, but over time, they ramp up their value capture to maximize profits at the expense of their users. Software ends up feeling over-optimized and hostile, constantly fighting our desires. But software is ultimately for us, and there must be a better way.
Well, there is, at least in theory. A coalition of software and tech luminaries, joined by hundreds of supporters, recently launched the Resonant Computing Manifesto. They want software that is private, dedicated, plural, adaptable and pro-social — the antipode of the offerings available to us today. It’s a fresh vision, and a desperately needed one as LLMs rapidly open software engineering to everyone.
Sam Arbesman, host of The Orthogonal Bet, joins me to co-host this special episode with Alex Komoroske, founder of Common Tools, which dubs itself a new fabric for computing.
The three of us talk about the manifesto, how LLMs are changing software design, the same-origin paradigm, fully homomorphic encryption, remote attestation, and whether it is possible for software to be good and also be profitable.
This interview has been edited for length and clarity. For the full episode, please visit our podcast.
Danny Crichton:
You have a new manifesto out on resonant computing. So for those who haven’t read it, I would love a primer on what resonant computing is and how the manifesto came together.
Alex Komoroske:
The core idea is that hollow things leave us feeling regret, and resonant things leave us feeling nourished. Superficially, they look very similar, but they’re fundamentally different. The tech industry is really good at creating things that look good but are hollow inside — and large language models will take that an order of magnitude beyond what it’s ever been before. So it’s more important than ever before for us to make sure that the computing experiences we’re using are resonant. There are five specific qualities of resonant computing.
One is that they’re private: you are the steward of your own data, and it’s used in ways that align with your intentions and expectations.
Two, that it’s dedicated. It has no conflict of interest. It acts as an extension of your agency as opposed to somebody else’s agency.
Three, it’s plural. There isn’t one centralized power. It’s a lot of diffuse federated power structures.
Four, it’s adaptable. It’s something that lifts you up, it doesn’t box you in. It’s not like, oh, well, some product manager said I can only have these five options. No, it should be something that’s open-ended.
And five, it’s pro-social. It’s something that helps bring you in harmony with the world around you. It brings you into the world as opposed to allowing you to retreat from the world.
Danny Crichton:
When we think about where the technology industry’s been over the last 20 years, I can see a lot of platforms, experiences and apps that are in opposition to resonant computing. One of the challenges when I think about pro-social tech is, well, social media. People would say, “Well, social media was pro-social. It was designed to connect people.” It just didn’t really work out that way.
So how do you create a distinction between the past — at least the things people have been trying to build — and where you want the industry to go now?
Alex Komoroske:
I don’t know about you, but I joined the tech industry because of the hacker ethic. In the last decade or so, though, as the tech industry has consolidated, it’s gotten to this late-stage thing where all consumer minutes are sliced up between five or so aggregators. We’ve gotten to this overly optimized phase, and everything is hollowed out.
It’s not just about tech. I think it is also true in politics. And in business. We’re so focused on optimization that we’re inadvertently hollowing out the thing we’re optimizing. And so, to me, resonant computing feels less like a new thing and more of a reminder of when we did technology differently.
And now is the right time. Large language models undermine a lot of assumptions that are baked into the current tech industry. Much of the tech industry assumes that software is expensive to write and cheap to run. LLMs undermine both of those: it’s now possible to write shitty software basically for free, but running it isn’t cheap.
And that is very destabilizing for the tech industry. So, if we’re going to destabilize, what are the principles we want to come back to? What are the seedlings we want to grow?
Danny Crichton:
Tim Wu just launched his new book, The Age of Extraction, talking about how platforms are increasingly taking more of what we might dub in economic terms, “consumer surplus,” and trying to put it on the production side of the equation. There’s this very broad intellectual banner that says like, “Look, we’ve lost control over our computers. We’ve lost control of the customization.” We used to have HyperCard and you could do really fun, interesting things. You had this open sandbox where even as a nine-year-old, you could make little games and it was super exciting.
Now it’s locked down. I can’t even build my own computer, so to speak, as the parts aren’t even available. RAM prices have tripled. I’m fighting with Nvidia to get access to a chip. The forces arrayed against you are the richest, most powerful, most influential, most popular companies in the world. What do we do?
Alex Komoroske:
There’s a dozen or so of us who have been working on this, Sam here is one of them. But Danny, two things to your point. One, I’m so grateful we have large language models available as APIs, and we have multiple options. You could imagine a world where ChatGPT went big before OpenAI ever made one of their completion APIs public. You can imagine a world where they go, “Oh, shit, I’m not going to give away my primary thing.” And then, you can imagine Google and Anthropic being like, “Oh, it’s too dangerous to expose this to anybody else or whatever.” And that would be a very different world from the one we’re in today.
But we’ve got multiple providers, and they are all competing on cost and quality. There’s a number of amazing open-source models that keep on nipping at their heels, and at some point will surpass them. And that’s great, because it is destabilizing current power structures.
The second thing I’d say is this: I’m obsessed with the “same-origin paradigm,” the laws of physics that describe how our software, especially the web and apps, works. It slices different origins up into silos and then gives the owner of each origin — the creator of that app, that software — full control over that data. That was a very fateful decision back in 1994.
It helped lead to significant centralization and aggregation. A lot of things we’ve seen in the tech industry are all downstream of that decision. To me, if we were able to transcend that model, add in new or alternate models that work for different use cases that otherwise aren’t supported, combined with LLMs — wow, that could really catalyze something big.
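The rule at the heart of that paradigm is easy to state: two URLs belong to the same origin only if their scheme, host, and port all match, and software from one origin can't freely touch another origin's data. A minimal sketch of that check in Python (a simplification for illustration; real browsers layer many more rules on top of this tuple comparison):

```python
from urllib.parse import urlsplit

def origin(url: str) -> tuple:
    """Return the (scheme, host, port) tuple that defines a URL's origin."""
    parts = urlsplit(url)
    # Fall back to the default port when none is given explicitly
    default = {"http": 80, "https": 443}.get(parts.scheme)
    return (parts.scheme, parts.hostname, parts.port or default)

def same_origin(a: str, b: str) -> bool:
    """Two URLs share an origin only if all three components match."""
    return origin(a) == origin(b)

print(same_origin("https://example.com/app", "https://example.com/api"))  # True
print(same_origin("https://example.com", "https://sub.example.com"))      # False
print(same_origin("https://example.com", "http://example.com"))           # False
```

Everything that matches the tuple is "your silo"; everything else is walled off — which is exactly the slicing-into-silos Alex describes.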
Danny Crichton:
One of the big things people talk about is no one wants to run their own server. No one wants to run their own VPN. Not just because it’s complicated, but also because of the knowledge required to be your own sysadmin.
Isn’t that why a lot of these centralized services start? You have an app, it is home-baked. It is literally running on ramen for the first week or two. There are eight users. And then, it scales up because it’s successful. It gets up to hundreds of millions and a billion people.
In his book Enshittification, Cory Doctorow really emphasizes this evolutionary challenge: platforms always start by trying to give as much to the users as possible. Then, there’s some sort of pivot point where they either want to get to profitability or they just want to extract more money.
Alex Komoroske:
Yeah, I wrote a note to myself that said the SaaS business model is downstream of convincing you to accumulate your data on someone else’s turf, and then renting access to your own data back to you in perpetuity.
You could argue no one wants to be their own sysadmin, but we’ve combined not being a sysadmin with not owning your data. And that is what leads to this downside. Have we talked about confidential computing? I forget.
Danny Crichton:
Not on this podcast.
Alex Komoroske:
Nobody knows about confidential computing, and it is wild because of what it can do. It changes fundamental assumptions about where control can live — and who owns what. Confidential computing is hardware support baked into basically all the chips that have been deployed to public clouds in the last few years, including H100s. Now, if you run it in confidential compute mode, no one — not even someone with physical access to the machine — can peek inside, and that’s awesome.
The reason you haven’t heard about this before is because it’s really a tail need for the vast majority of things. If I’m running some dinky web service, and I’m running it in Google Cloud or whatever, I already assume that the Google SREs are not peeking into my thing. It’s against their terms of service, and also it’s actually quite difficult to do. The people who use confidential compute are primarily defense contractors running secret workloads or people in the finance industry doing highly sensitive calculations.
But one of confidential compute’s party tricks is actually its most interesting component: remote attestation. There’s functionality built in where each chip has a private key, derived from Intel’s root key, that’s burned into the silicon and impossible to remove unless you destroy the silicon. That key can then be used to sign an attestation that says the bearer of this attestation was indeed running in confidential compute mode. And here is the fingerprint, here’s the snapshot of the VM that was running.
You can pass this to somebody else across the network, and they can verify, “Oh, this is definitely signed by Intel. It’s definitely running in confidential compute mode. And it’s definitely running this software, bit-for-bit.” I’m simplifying, but this allows you to have a very high degree of confidence that your cloud provider is indeed running precisely the software they say they’re running.
This, in turn, allows you to construct a system you might call a “private cloud enclave” that allows you to have your turf in somebody else’s area. It’s like having an embassy. Technically, someone could break in, but that would be an act of war.
So this allows you to get the benefits of cloud computing with somebody else as sysadmin, but it’s still on your turf.
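The attestation flow Alex describes can be sketched in miniature. This toy substitutes HMAC over a shared "vendor root" secret for the real mechanism, which uses asymmetric signatures chained to Intel's certificate authority; all names, keys, and images here are illustrative. The shape is the same, though: derive a per-chip key from the vendor root, sign a measurement of the running software, and let a remote party verify both the signature and the exact bits.

```python
import hashlib
import hmac

VENDOR_ROOT = b"vendor-root-secret"  # stand-in for Intel's root key

def chip_key(chip_id: bytes) -> bytes:
    """Per-chip key derived from the vendor root (burned in at manufacture)."""
    return hmac.new(VENDOR_ROOT, chip_id, hashlib.sha256).digest()

def attest(chip_id: bytes, vm_image: bytes) -> dict:
    """The chip measures the running software and signs that measurement."""
    measurement = hashlib.sha256(vm_image).hexdigest()
    tag = hmac.new(chip_key(chip_id), measurement.encode(),
                   hashlib.sha256).hexdigest()
    return {"chip_id": chip_id, "measurement": measurement, "signature": tag}

def verify(report: dict, expected_image: bytes) -> bool:
    """A remote party checks that the signature chains to the vendor root
    and that the measurement matches the software they expect, bit for bit."""
    expected_tag = hmac.new(chip_key(report["chip_id"]),
                            report["measurement"].encode(),
                            hashlib.sha256).hexdigest()
    return (hmac.compare_digest(expected_tag, report["signature"])
            and report["measurement"] == hashlib.sha256(expected_image).hexdigest())

image = b"my-enclave-vm-image"
report = attest(b"chip-42", image)
print(verify(report, image))        # True: right software, valid signature
print(verify(report, b"tampered"))  # False: measurement doesn't match
```

The "embassy" property falls out of the verify step: you accept the enclave only if the attested software is bit-for-bit what you expect, regardless of whose data center it runs in.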
Danny Crichton:
I want to get to another piece of resonant computing: the qualitative factor. What makes something feel like I have expressive control over it, like I am the master of the software, as opposed to the software disciplining me? How do you define what that would feel like? What kind of software inspires you that way?
Alex Komoroske:
I think the key test is, as you look more closely, does it work? Does it impress you? Is it something you like more the closer you look, or do you like it less? A resonant thing is fractally aligned. Every layer, the more you peel back, the more you go, “Ooh, I didn’t even think to care about that, but I like that.” Whereas hollow things, you pull back and you’re like … “Oh, wait, they can send arbitrary firmware updates, and it’s got a microphone and the terms of service say they can send it to any of their partners?”
Samuel Arbesman:
Danny, you were asking earlier about the pro-social nature of social media. Yeah, they were designed to be pro-social, but that’s very much on a single level. It’s like the first order of thinking. Once you get to the second, third order of thinking, you realize these things are not pro-social at all. They’re actively against empowering and improving social interactions. That goes along with this fractal thing.
Danny Crichton:
Apple is famously very expressive. But Apple would tell you, “Look, the only way to create a really expressive device that feels like it’s yours, that is private, et cetera, is for us to own it, vertically integrated, top to bottom. We own it all the way down to the chip at this point.”
Yet you have this value in the manifesto of pluralism, which sounds really great, but competition is in many ways what triggers some of the negative effects where people are trying to grab more of the surplus value of these products and platforms.
I’m curious: When you think about competition and vertical integration, is it possible to allow all these different layers to be plural, or does it need to be vertically integrated?
Alex Komoroske:
I don’t think it has to be vertically integrated. You make a really good point about pluralism requiring competition in some form. I would argue that Apple fails the test precisely because they are not plural on these things. If you had told people when the iPhone was announced that this was going to become the single most important computing device of all time, and that it was going to host the vast majority of your computing, the idea that one company — which, by the way, has consistently put a pretty heavy finger on what it allows on that device — owns it would be insane. That’s an insane thing for us to allow as a society. iPhones are great. And also, the idea that one company, a very jealous one, gets to decide what goes on the iPhone is wild to me.
Samuel Arbesman:
Just taking a step back in terms of thinking about pluralism, we talk about this idea that a healthy ecosystem requires lots of different choices and things like that. Putting on my lapsed evolutionary biologist hat, healthy ecosystems are not just imposed or designed. They evolve over time into a thing that becomes healthy.
And I feel like that’s the argument being made: Yes, on the one hand, maybe some company can design a perfectly resonant experience in this vertically integrated way, but more likely than not, the resonant experiences we want in the tech world are going to evolve from the bottom up. And that’s why this plural principle is so important, because you need to almost cobble it together.
Danny Crichton:
Let’s project forward five to ten years. Your resonant computing manifesto has gone absolutely bonkers. Everyone has signed onto it. What does that world look like? What would companies be doing differently in terms of designing software, in terms of how they’re giving it to folks, in terms of the collaboration that’s built around it? What does it feel like?
Alex Komoroske:
My mental model is this: software today feels like a thing you go get at the big box store. You choose which of three basically similar options to buy, and they all suck. What if software felt instead like a thing that grew in your personal garden, something nourishing and specific to you? Your data comes alive in ways that help you accomplish the things you care about.
In that world, software doesn’t look like it looks today. It does not look like silos that each do one particular task. Clay Shirky has this essay from 2004 called “Situated Software.” Situated software is software that’s highly situated to a specific context.
A spreadsheet you’ve modified for your own purposes is situated software. It’s highly situated to your particular need. If you show it to anybody else, they go, “That’s a piece of shit. It barely works. It’s ugly, it’s insecure.” But to the person who made it, it’s perfect; it’s exactly what they need.
I think situated software, that’s what it will feel like in the future.
Samuel Arbesman:
Just adding on to what Alex was saying: there’s this blending of consumer and creator. There doesn’t need to be a single place where there are people producing software, and then people just consuming it. In some ways, it’s a return to the earlier days of computing, which unfortunately was only available for a very small set of people.
My family’s first computer was the Commodore VIC-20. Back then, one of the ways you got software was from a magazine: type-in programs, literally pages of text that you typed in yourself. You could see a very clear relationship between the text you were typing and the result on the machine. Sometimes it worked, sometimes it didn’t.
But that blending and blurring of creator and consumer was only open to a very small subset of people. And hopefully, that kind of thing will now be available to everyone.
Danny Crichton:
I think this works within capitalism. The biggest challenge right now is people are worried they can’t make money. They have to own more. They have to control more.
We all know the complaint that there’s no sustainability to open-source software because there’s no money in it. But I do think there’s a world where you can align the value that’s created in a way that still makes plenty of money for all the companies. It’s very much in line with the resonant computing mission. You’re also seeing this in the fatigue a lot of folks have with the systems and apps they use every day right now. They’re not being nourished. They’re getting tired of it. You see how many people are quitting various social-media platforms.
And so, I do think that there is a world here. It requires imagination and vision from a lot of leaders, and you’ve brought quite a few of them around the table.
Alex Komoroske:
Yeah, I agree. To me, this is totally compatible with business goals — products that are resonant are ones people love using, and they evangelize them, and they feel good about them. When you take a long enough time horizon, and have a broad enough perspective, resonant products are simply better products. They’re better from a business perspective, for shareholders, and what have you.
I have another essay about the optimization ratchet. We didn’t necessarily choose to get so hyper-focused on optimization. It’s just that as you get more efficient, the benefits of optimizing are concrete, short-term and very visible. The downside of optimization is that you’re losing adaptive capacity, you’re losing resonance. But what you’re losing is not nearly as obvious. And so, with each of these micro decisions, the universe tilts towards optimization, and you do more of it.
But you have to know that there’s something on the other side of the balance scale.
Samuel Arbesman:
As people see all these things in modern society, especially in the tech world, they’re very unhappy, they’re burned out with everything in big tech. In the absence of realizing that there’s another path forward, you just see technology as the problem. But it turns out it’s not. There are just certain ways of building technological systems and software that are very problematic. But, if you can provide a name and a framework for this other path, it shows that it needn’t be the default.