Will Trump’s funding cuts actually save science?
No one wants American science to fail, but the old system wasn’t working. A path forward is finally opening
Science is the single most powerful engine of American prosperity, and yet, much like the pilots described in the preliminary report on the Air India crash, those at the controls of our country's airframe seem to be cutting fuel to the engines just as we soar skyward. The most recent analysis of the administration's proposed appropriations shows a roughly one-third cut for basic research, the fundamental bedrock of knowledge production.
We launched the Lux Science Helpline in response to the prospect of these grievous slashes, but that's merely a down payment. Our goal runs much deeper: to rebuild America's engines of innovation for another century of excellence. To that end, my Lux partner Michelle Fang, who just joined us from the AI chip company Cerebras, and our scientist-in-residence Sam Arbesman recently organized a dinner with a prominent cross-section of the research establishment to debate the future of science funding and universities in the United States.
I was charged with kicking off the discussion with a provocation, and I said that President Donald Trump’s intended science cuts were, opportunistically speaking, a blessing.
Let’s rewind a bit. For those walking the halls of science policy conferences, the common refrain had been of a research enterprise in the throes of stagnation and decline. A publish-or-perish culture led to the replication crisis and p-hacking, drawing scientists away from brilliant discovery on the frontiers toward the nihilism of unearthing small but publishable facts to hit quotas and keep careers intact.
Thanks to a surfeit of scientists and a lack of real budget growth, the average age of a first-time recipient of a National Institutes of Health R01 life sciences grant had been forced into the mid-40s. In fact, most scientists starting out with their own labs are closer to retirement than graduation. That led to the rise of the “permadoc” problem, where budding scientists spend their most creative and productive professional years migrating from institution to institution in search of postdoc positions while waiting for their chance to experiment independently.
The list of crises truly goes on and on. Administrative overhead has ballooned so much that the average scientist on a federal grant spends roughly half of their time filling out paperwork. Pathbreaking researchers innovating in the interstices between fields are often overlooked, sometimes for decades, eking out precarious existences. Wary of the fallout from negative publicity over science the public can't understand, NIH and National Science Foundation funding panels are often deeply conservative, selecting for proven but safe work rather than compelling new breakthroughs. Add in university politics, research fraud, a toxic work culture and more, and the balance sheet for current science institutions is remarkably bleak.
All of that was true before this administration entered office, and it all remains true today. I first started studying these issues back in 2009, and in the intervening 16 years, nothing has changed other than the small addition of the Directorate for Technology, Innovation and Partnerships at the NSF as part of the CHIPS and Science Act. Incremental change has proven impossible, and in fact, any changes that do sneak through have nearly exclusively gone in the wrong direction.
In short, there is almost universal agreement among scientists and university leaders that the current system is sclerotic and untenable. No one wanted someone with a sledgehammer to come in and wallop the system, but that's what the Trump administration seems hell-bent on doing. Rather than seeing this solely as a disastrous attack on science, though, we can treat it as a crisis that opens the door to improving science in ways that make the community and its outputs far stronger in the decades to come.
One of the interesting lessons from the dinner conversation was that many higher education leaders understand that this is a unique opportunity to question fundamental assumptions about the university and the country's approach to science. What should tenure and promotion look like when every scientist is ultimately on their own to secure a paycheck through extramural grants? What does the rise of AI portend for the future of science and scientists themselves? Is publishing endless research papers on arXiv and other preprint servers the model that best helps scientists make progress on challenging problems? How can we encourage more reproducible science when new discoveries are the only thing that counts as performance?
Many of these questions have been festering for decades, but they are now front and center, offering an opportunity for ambitious leaders to reposition their institutions for the century ahead. It was heartening to see folks consider that the whole system could (and in some cases, even should) radically change in the next year or two.
That early discussion led to a broader and even more fundamental question on the role of the university in America this coming century. Most institutions still run on what was dubbed the “multiversity” model suggested by Clark Kerr, the first chancellor of the University of California, Berkeley. He described universities as communities bringing together many different types of constituencies in one place, from students seeking an education and researchers pursuing discovery to industrialists and mayors looking for regional economic growth and sports fans looking to cheer their favorite teams. The argument was that all these independent constituencies ultimately benefit one another, with the university collecting the positive spillover effects from all their different activities.
At our gathering, there was real debate over whether the multiversity model can survive this century. The NCAA and the courts have completely upended the economics of college sports with the House v. NCAA settlement, moving away from a student-amateur model to one that will offer athletes vast new sums in revenue, particularly around their name, image and likeness. Outside the biggest university franchises, sports will increasingly be a massive money sink. Do they still have a place on most campuses?
There has been a teaching crisis at universities for years now, where overworked adjuncts cover classes for tenured faculty, who are more concerned with research than student success. Will that tradeoff still make sense when students are increasingly eschewing the high cost and poor outcomes of a university education?
There’s an obvious superstar effect in research, which means that even many flagship state schools fail to make fundamental advances that can transform their regional economies. Most technology transfer offices are ultimately unprofitable in their pursuit of patents and venture capital once all costs and employee time are accounted for. Is it time to reconsider whether the university is really an engine for innovation? In short, do we have too many research universities, and should we rebalance between different types of institutions?
This was a wide-ranging debate, and there was no consensus. I asked whether any university would consider eschewing federal funds in order to have complete independence from the federal government and its policies (Hillsdale College in Michigan is one extant example). No one felt that was viable given the hefty costs of modern research.
In the denouement over a delightful chocolate lava cake, one higher education leader argued that it was time for universities to stop being so reliant on the United States. Instead, they said, institutions should think of themselves more as multinational corporations with campuses around the globe, sustaining their performance by offering flexible arrangements so that scientists can better collaborate with their peers. Given the Trump administration's shifting policies on immigration, a global footprint is also increasingly attractive for international researchers who may suddenly find themselves without a U.S. visa.
Beyond a multiversity and its many constituents, this comment raised a simpler question around the ultimate constituent: the American public. What do American universities owe the public for their past, present and future success? These institutions have been and are heavily subsidized by taxes — money that has helped them establish the country’s higher education system as nonpareil. On the other hand, the brains walking around these campuses increasingly come from overseas, and science has never been so cosmopolitan. There’s thus a fundamental tension between the goals of competitive intellectual sovereignty driven by populists and the more cooperative mode of scientists seeking discoveries for the benefit of all of humanity.
No one walked out of the dinner with answers (admittedly, much to the chagrin of some). As with most of our Riskgaming scenarios, though, even getting these challenging questions taken seriously is a crucial first step. Incumbency breeds stasis, and that means ambitious questions are easily redirected and ignored. Trump's funding cuts (whether or not fully enacted by Congress in the months ahead) mean universities can no longer pretend that they are immune to change. They too must embrace the future, one that will look very different, and hopefully, one far better optimized for pathbreaking science.
That all seems about right. While burning it all down to start over is rarely the best strategy, I don't know what else you do here. One of my biggest disappointments was showing up (admittedly ill-prepared) for a graduate program in chemistry and finding out that we, as future and current scientists, weren't getting together to study and solve cutting-edge problems, but rather practicing defensive research designed to check boxes and acquire funding, because it was just too risky to let it ride on a wild idea. At the time, the buzzwords for just about every grant application were "lab on a chip" and the ubiquitous "potential cancer target." You couldn't not include them. Not if you wanted funding.
Maybe Heather Cox Richardson is right and we will emerge from the current disruptions stronger, able to use the chaos for change. I hope so. It's a nice counter-narrative to Trump = bad/wrong/terrible for the future.