Welcome to the Post-Truth Drug Trial
Introducing Gray Matter, our newest Riskgaming scenario
Last week, FDA Commissioner Marty Makary told CNBC “everything should be over the counter” unless a drug is unsafe, addictive, or requires monitoring. “We have to trust people to make their decisions,” he said. “We’ve got to get away from this paternalistic mindset.” In other words, let the market decide what medicines work.
Days later, TIME published its new cover story, “The People vs. AI,” profiling nine Americans — nurses, pastors, filmmakers, state legislators — united by a shared conviction that the technologies reshaping their lives are being forced on them by institutions they don’t trust. The newsletter Transformer, meanwhile, published a long, damning analysis of why the political left has essentially refused to engage with artificial intelligence at all, instead dismissing it as “spicy autocomplete” while ceding the entire policy debate to the right.
So: The government is saying trust the people, the people are saying they don’t trust anyone, and the political movement you’d expect to care most about corporate power, public welfare and inequality has decided the most transformative technology of our time is a mirage.
Debates over the role of trust and truth in policy have followed a striking pattern, from Ozempic to mRNA vaccines, psychedelics and even crypto. A potential breakthrough technology arrives, and then the attention economy engulfs it. The resulting information market — not peer review, clinical trials or careful regulatory deliberation — determines what the public believes, what gets funded and what gets approved. Over time, the infrastructure for figuring out what’s true about a new technology has collapsed. Everyone can feel it…and no one knows what to do about it.
I’ve spent the past year building something to try to help people understand this problem better by letting them feel what the collapse is like from the inside.
Gray Matter: Selling Science in the Age of Attention is our newest Riskgaming scenario, and it’s available to download and play today. It’s a live trading card game for 18 to 54 players, playable in about ninety minutes. It is the first game I’ve designed, and the most personal project I’ve worked on at Lux.
The premise: It’s 2027, and a postdoc at NYU has accidentally created a “pattern recognition” drug — essentially, a drug that purports to make you smarter. Within months, three variants hit the market: AlphaAxon, the Wall Street favorite; BrainBatter, the artist’s choice that also attracts Chinese influence; and ClariCore, the “responsible” European alternative.
The FDA skips Phase III efficacy trials under the new administration’s “innovation acceleration” framework (sound familiar?). But a mandatory one-year review is fast approaching, Congress is in chaos and the fate of human cognition will be decided not by peer review or clinical trials, but by the three forces that actually shape our reality: the people who know, the people who are loud, and the people who are rich.
When I started designing the game, as I later told Business Insider, I wanted a way to talk about AI without actually talking about AI. I’d been watching how Ozempic exploded through influencer culture decades after the underlying science emerged, how mRNA vaccines became hopelessly politicized as rumors outran safety data, how crypto inspired hot takes from people who couldn’t explain the technology. In each case, the pattern was identical: a breakthrough hit the attention economy, and the resulting information market determined outcomes just as much as the science did. I was absolutely sure the exact same thing was about to happen with AI.
So I turned information, attention, and money into the three currencies of the game. Information Agents — scientists, regulators, professors — hold exclusive Clue Cards but need attention to make their data matter. Attention Agents — journalists, influencers, podcasters — can amplify anything but need to get paid. Money Agents — investors, lobbyists, hedge funders — can buy information and attention, but have to decide whether to bet on credibility or virality. Players trade all three currencies freely. There are no rules governing deals. No enforcement of promises. The room determines the market prices for attention, information, and money. The only thing that matters is the cards in your hand.
Here’s where the game gets at something many simulations miss: every Clue Card has a credibility score and a virality score — and critically, those two metrics don’t always move in the same direction. High-credibility information is often low-virality, and the juiciest rumor is often the least true — though there are plenty of exceptions, because sometimes truth is stranger than fiction.
And the act of making information public (publishing it) is itself a hard strategic choice with real and permanent consequences. Any player can publish a Clue to a public board at any time, which means it counts toward the FDA’s decision, since the agency is now aware of the information. And once information is out there, it can’t be retracted, although it can be republished. But the board displays only the last three Clues published, because even public information is ephemeral. Want to know whether something was published earlier? That costs a Money Card, because in the game world, as in the real one, digging up public data is its own economy.
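The board’s rules — permanent publication, a three-Clue display window, and a Money Card to research the archive — can be sketched as a small model. The class and field names here (`PublicBoard`, `Clue`, `dig_up_history`) are my own illustration, not the published rulebook:

```python
from dataclasses import dataclass, field


@dataclass
class Clue:
    text: str
    credibility: int  # how well-sourced the information is
    virality: int     # how likely it is to spread


@dataclass
class PublicBoard:
    # Permanent record: publication can never be retracted.
    history: list = field(default_factory=list)

    def publish(self, clue: Clue) -> None:
        # Appending is the only operation; nothing is ever removed,
        # though the same Clue can be appended (republished) again.
        self.history.append(clue)

    def visible(self) -> list:
        # Only the last three published Clues stay on display.
        return self.history[-3:]

    def dig_up_history(self, money_cards: int) -> list:
        # Researching older public information costs a Money Card.
        if money_cards < 1:
            raise ValueError("Digging up old Clues costs a Money Card")
        return list(self.history)
```

Publish four Clues and only the last three remain visible; the first is still on the record, but seeing it again costs money — which is exactly the ephemerality the game is after.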
The game thus creates a tension meant to mirror our reality. Attention Agents want to publish, because being first earns them a virality bonus. But Money Agents often want to suppress information, because their scoring rewards holding Clues that were never published — insider knowledge that the market hasn’t priced in. So the same piece of true, important information might be worth more to one player kept secret, and more to another blasted across the public square. That’s how information actually moves through the world — or doesn’t.
One addition, which changed the game’s character, came after beta testing. Something was nagging at me: the game was modeling information, attention and money well, but it wasn’t modeling the full spectrum of motivations. Yes, most of the characters cared about seeing some combination of drugs either stay approved or get pulled by the FDA. But for every one of them, the drugs’ true efficacy and safety was neither a positive nor a negative. It simply wasn’t relevant. So I added a new layer: eight of the players in a full game don’t care which drugs the FDA approves. They care only about the truth environment itself.
The Truth Seekers score highest when accurate information dominates the public record. Conspiracy Theorists score better when false narratives spread. And Chaos Monkeys (my favorite addition) win only when no consensus forms at all (think media broadcasters who thrive on constant news churn). The point, though, is that none of these people are villains. They all think they have good reasons for wanting what they want.
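The three meta-roles can be thought of as three scoring functions over the same public record. This is a toy sketch under my own assumptions — the shares and the 50/50 consensus measure are illustrative, not the scenario’s actual scoring tables:

```python
def score_meta_roles(published):
    """Score the truth-environment roles at game end.

    published: list of (is_true, reach) pairs, one per published
    Clue, where reach stands in for how far the Clue spread.
    """
    total_reach = sum(reach for _, reach in published) or 1
    # Share of total attention captured by accurate information.
    true_share = sum(reach for is_true, reach in published if is_true) / total_reach

    truth_seeker = true_share            # rewarded when accuracy dominates
    conspiracy = 1.0 - true_share        # rewarded when falsehood spreads
    # Chaos Monkeys win when no consensus forms: their score peaks
    # at a 50/50 split and falls to zero when either side dominates.
    chaos_monkey = 1.0 - abs(true_share - 0.5) * 2

    return {
        "truth_seeker": truth_seeker,
        "conspiracy_theorist": conspiracy,
        "chaos_monkey": chaos_monkey,
    }
```

A perfectly accurate public record maximizes the Truth Seeker and zeroes out the Chaos Monkey; a deadlocked record does the reverse — which is why, in play, these players pull the room in genuinely incompatible directions.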
When we ran the game at The Ned in Manhattan last fall, that design choice transformed the room. Midway through, a player posing as a neuroethics professor silenced the crowd by tapping his wine glass. I suspect most people thought he was a facilitator making a game announcement. Instead, he declared (in-character) that the drugs were being peddled by evil corporations that only wanted to make the rich smarter. “He’s right!” chimed in another player. “He’s an enemy of capitalism and American innovation!” shouted another. It was beautiful pandemonium, and exactly how debates play out in real life. Instead of thoughtful deliberation we got narrative combat, where volume and timing decide who gets heard.
By the end of that runthrough, the information players had published, traded, amplified, and buried over ninety minutes had taken on a life of its own. The public record bore only a passing resemblance to the underlying science I’d written into the “secret canon” of the game. Players who held true information had sometimes found it more profitable to stay quiet, while players who held garbage had sometimes found it easy to go viral. The system had worked as intended, and the result was a reality that no single player had intended but that all of them had built.
When I started building this game, the premise was speculative: a fictional administration skipping Phase III trials, an FDA under political pressure to move fast, and a public desperate for answers but unable to agree on where to find them.
Today, the real FDA commissioner is proposing to invert the standard for which drugs require a prescription. DOGE has gutted FDA staffing to the point that the agency’s reviewers say they can’t meet congressionally mandated deadlines. And 99% of Americans use a technology weekly — AI — that over 70% of them view negatively, often without realizing they’re using it at all.
Gray Matter is still fiction. There are no super cognition-enhancing drugs yet — unless you think of AI as one. Perhaps it is. Regardless, the whole scenario is less fictional now than when I wrote it. The post-truth drug trial is not a thought experiment but the world we’re living in: a world where the science is never settled fast enough, where the people with the data aren’t the people with the megaphones, and where the line between selling and settling science has become impossibly blurred.
The question is whether we’re going to understand the forces that got us here, or keep pretending they don’t exist. Gray Matter is my attempt to help you feel what it’s like to be inside the machine. My hope is that you’ll never look at a drug approval headline or a tech tweet the same way again.