This week, Laurence and I sat down with Gideon Lichfield, journalist and author of the Substack Futurepolis. We talk about the future of democracy, public AI, and tech governance. This Q&A has been edited for length and clarity.
For more of the conversation, please subscribe to the Riskgaming podcast.
Danny Crichton:
So Gideon, you've joined us in real life at two of our Riskgaming events: Powering Up, which is focused on the Chinese electric vehicle market, and DeepFaked and DeepSixed, our game focused on AI and election security. How did those two experiences compare?
Gideon Lichfield:
With DeepFaked and DeepSixed, what I mainly remember was the feeling of the fog of war, where all this information is coming at you from different directions and it's time critical, and who's doing what? Is this piece of information true? So for me, even though it was just a game, it was actually a really effective way of making clear how confusing life can be when you're trying to deal with these critical decisions.
Laurence Pevsner:
One of the things I point out to folks is that the game’s the easy version. In real life, you don't have everyone in a room together. You're all in separate places, you're all talking at different times. And so we've done you a favor by bringing you all together, and yet people still feel like it's total chaos.
Gideon Lichfield:
Yeah, exactly.
The China EV game was obviously a very different experience. In some sense, it felt a lot more like a board game. You're sitting around the table and you're trying to trade and gain investments from the automakers and so forth. But what became clear by the end of the game is just how much it's rigged against the mayors of the cities.
Danny Crichton:
That's exactly right. As a mayor, you get squeezed by the population, by the central government, by the car makers and the consultants. And so there's really no way out of the system.
Gideon Lichfield:
Yeah, as a mayor, I did not do particularly well in the game. I realized by the end that I was trying to be too much of a good mayor to the central government. I should have just ignored them and pursued my own interest more.
Danny Crichton:
Yes. And the other thing is, essentially, the car companies try to pit the two mayors against each other. So you're both offering tax incentives, you start to get into an arms race. Suddenly there are no revenues, and the whole thing kind of collapses. But if you hold the line and essentially act as a cartel, that's where you have a lot more leverage.
Gideon Lichfield:
Yeah, that’s so interesting, because it's a strategy that did not occur to me at all during the game.
Laurence Pevsner:
Yeah, it almost never occurs to the players to work together, because you see, "Oh, here's this other city. They are, in fact, my most direct competitor." That is how we tend to think in the game, which I think reflects human thinking in general.
Danny Crichton:
With Riskgaming, what we try to do is emphasize how people who are competing with each other under uncertainty — under risk — find moments of collaboration, find moments of competition, and make the decisions that they do.
When you are writing on your Substack, Futurepolis, you're doing the same thing. You have this very bold mission statement of updating our systems of democratic governance to make them fit the 21st century, and you emphasize institutions like participatory budgeting, citizens' assemblies, digital public spaces, and so on as means of building that kind of consensus. I'm curious how you see the world from this perspective.
Gideon Lichfield:
Yeah, I mean the problem with all of these systems of collaboration is getting people to use them, and getting established lawmakers, policymakers, governments to pay attention to them. One of the most interesting experiments, or certainly one of the biggest experiments in participatory budgeting in the United States, was in Seattle, where after the George Floyd protests, there emerged pressure for the city council to give local communities, particularly communities of color, some say in how money was spent. And that led to a total of, I think, something like 25–30 million dollars from the city budget being allocated to citizens to participate in a budgeting process. That’s a tiny sliver of the city budget, and I think there are also mixed feelings about how successful the experiment was. So this is one of the issues: a lot of these things are very small.
Another example that I wrote about in Futurepolis was a small experiment with a mini citizens' assembly in Tennessee, where they brought together 11 people to talk about gun violence. They had a full range of people, from someone who's a firearms instructor and a combat veteran to a teacher who has seen several of her students shot dead, and they came together to discuss ways of dealing with it. Certain things ended up being off the table; nobody on the pro-gun side was really willing to talk about gun control legislation. But nonetheless, the group managed to reach consensus on a handful of proposals around gun safety and educating people better about gun use. The trouble is that no one really laid any groundwork with legislators in Tennessee to get any of it adopted.
And this is the thing that you come up against with any of these deliberative bodies: unless you're getting lawmakers' buy-in from the get-go to take the process seriously, there's a large chance it won't make an impact.
Laurence Pevsner:
When it comes to participatory budgeting versus a citizens' assembly, one difference that really strikes me is that, in a citizens' assembly there's an active effort to try to get people from all walks of life.
Participatory budgeting, though, is done by whoever volunteers. It's whoever wants to show up to the meeting. And often the people who show up to the meeting might have much more extreme views than the regular populace, and this can create perverse outcomes.
Gideon Lichfield:
Right. It's always the people with time on their hands and with pots to stir who are most motivated to show up.
I was at a discussion a couple of months ago where a lot of people came together to talk about democracy innovation. The point that the speaker made was that every system can be gamed, because special interests will always want to find a weak point.
Citizens' assemblies are at least notionally more resistant because you have a process that is meant to ensure that you reach out to people from across a wide range of the political spectrum and socioeconomic backgrounds. But if somebody is determined enough, they could probably find a way to subvert that process, too.
Danny Crichton:
When I think about Riskgaming, we’re not necessarily trying to persuade. It's actually an education of saying, "Well look, this is the trade-off. It is between money and profit, it is between long-term investment and short-term investment. And it's hard."
And so you may disagree with the decision. You may not even like the decision you made. But you learn something in the process; there's some intellectual humility that comes with trying to balance a budget. There is the challenge of confronting different points of view, and the realization that you can come to understand someone else’s perspective.
Gideon Lichfield:
I think that there's definitely an opportunity presented by things like citizens' assemblies, too, because what you have is a fairly small but representative number of people, who get to spend a long time talking about an issue and listening to experts from all different fields, and coming to grips with the complexity and the difficulty of it.
They're also given a framework and a structure that allows them to reach consensus rather than end up in an adversarial position. So for those people, it can be a really useful exercise. But the question is how do you translate that awareness, that understanding, to a broader public?
You can do interviews with the participants, maybe record the discussions, publish stories about the deliberations and how they got from disagreement to consensus on a particular issue. But again, all of that does kind of depend on having a government that cares about this stuff and wants to do it, and I don't think we have that right now in the United States. Not at the federal level.
Laurence Pevsner:
It strikes me that you just have a fundamental numbers problem here. This is the whole reason that republicanism exists to begin with. At a certain point, you just can't have an Athenian democracy, where everyone gets into the room.
Gideon Lichfield:
I don’t think citizens' assemblies taking over the running of the government is what we should expect. But I do think that in an ideal world, you’d have assemblies that are convened for very specific questions and with different sets of people, and then those inform the work of an elected full-time congress or parliament.
Laurence Pevsner:
This is a podcast hosted by a venture capital firm, so it is only natural to ask: is there a tech solution here? If it's a numbers problem, well, we do have technologies that can gather a lot of data and get a lot of people into a virtual room, even if we can't get them into a physical one.
Gideon Lichfield:
I hesitate to use the words “tech solutions.” But there are some interesting uses here, and AI inevitably gets a shoutout.
One of the largest-scale participatory processes we have right now, certainly in this country, is open comment on rulemaking and on laws. And so thousands, perhaps millions of people will sometimes write in on a proposed rule or piece of legislation and express their opinion. Now, this too is very prone to gaming. I think it was the net neutrality rulemaking that got something like 12 or 14 million comments, most of which turned out to be from bots.
But you at least have the potential for a large number of people to express an opinion, and technology tools can help with things like filtering out bots and spam, collating comments, finding common themes, identifying points of agreement or fault lines, and providing that information to lawmakers and to policymakers in a way that simply wasn't really available when someone just had to sit and read through everything.
Danny Crichton:
Let's go to another subject you've been writing about very recently, which is artificial intelligence models in general, and specifically the issue that all the frontier models today — OpenAI, Anthropic, DeepSeek, Mistral — are owned by private companies. You have identified this as a major problem from the perspective of the public.
Gideon Lichfield:
Yeah, so there's this idea that has emerged of public interest AI, or public AI. And as you say, it's the idea that there should exist public versions of AI models and data centers and training data — all of the layers of the AI stack — that serve the public good, meaning anyone can have access to them at reasonable fees. Training data should be made available to people who want to develop models for scientific research or for certain social ends.
One parallel that gets used is the BBC. The UK has various broadcasters that are in private hands, but it also has this public broadcaster, and part of its mission is to make sure it's educating the public and creating a healthy public discourse. Yes, it's funded by taxpayer money, but it provides good programming and good content, and it increases Britain's soft power.
And so by the same token, if you have AI that is not trained to spout misinformation, but instead to espouse democratic values and bridge gaps and so forth, then that would be a public good.
So there are all these reasons for having it. The question then becomes, well, who should pay for it? It's not necessarily entirely taxpayers.
Danny Crichton:
I think the BBC model is really interesting, because it's not limited to UK citizens. When you mentioned soft power, I think of how everywhere from Africa to Asia to Latin America has access to the BBC, and you both get an impression of the UK and a source of information you wouldn't otherwise have in your own information commons.
Let me pivot, though, to one final question, because obviously we've been talking at a very high level. But in terms of practicalities, we're in 2025. I'm curious what you think about the balance between, "God, the system's so broken. I want a complete second system," versus "God, we just need to make micro improvements to pieces of the system as we can, because government is like a ship of Theseus where you're constantly replacing each part, and yes, maybe in 30 years it's an entirely different government, but as of right now you can only fix and mend one piece at a time."
Gideon Lichfield:
I mean, I think the gradual approach is the one that inevitably is going to happen. It's not like we're going to tear up the system. If I try to imagine what great government looks like 30 or 40 years from now, I think it contains a lot of the parts that we already have. I think the stuff that I write about and that is interesting is, how do you make all of that work way better?
And you could divide this into a handful of big buckets. One is civic participation. How do you go beyond just having people vote every four years to having them have a say and have that say be taken seriously and have it influence policymaking? So that's where all the participatory stuff goes.
Then there's making the government itself run better. It's about cleaning up procedures. There's huge amounts of bureaucracy and really, really complicated procedures, and if Elon Musk were actually trying to make government work more efficiently, he would go after those procedures instead of firing a bunch of people.
And then I think the final part, which is really necessary and which nobody has really figured out yet, is just how to make government, lawmaking, and policymaking move at the speed of technological development.
But none of this precludes keeping existing institutions.
Danny Crichton:
If you look around the world today, there's a huge spectrum, from people who see the entire world collapsing and blowing up to people who are extremely optimistic about the future and see technology as extraordinarily empowering. I'm curious where you stand on that spectrum. Are you very optimistic, very pessimistic?
Gideon Lichfield:
I was thinking this morning that maybe I would describe myself as a cautious pessimist, but no, I think long term I'm relatively optimistic.
Short term, I'm pretty pessimistic. I think the institutions that we have right now are being sorely tested. I don't know how much damage the Trump administration is going to do to the institutions of the country, but it could be considerable. And in saying that, we also have to recognize that, for a very large number of Americans, it wasn't really a great example of democracy before this either. It was creaking, it was elitist, and it was out of touch with people's needs.
Longer term, no regime lasts forever. I think people will start to apply some of these techniques that I've been talking about. They'll start to see the need for a rebalancing of power between very large companies and civil society. But I think about what happened during the industrial revolution, in the late 19th and early 20th century. The impacts of industrialization were absolutely terrible: urbanization, disease, slave-like conditions in factories, child labor, pollution, all these terrible things. And it took a while for society to rally round and say, "We need laws to protect us from these bad effects." That's where we got social safety nets and labor laws and unionization and the like.
And we're at that stage now when it comes to digital technology.
Danny Crichton:
Gideon, on that note, always great to have you here, and hope to see you again in person at a risk game soon.