I'll be honest with you—three ideas have been haunting me lately, and they all sound utterly mad at first.
First: science deniers aren't the real threat to scientific progress. Second: scientific consensus is overrated, potentially even dangerous. Third: what if some of history's most valuable scientific contributions came from people who were completely, spectacularly wrong?
I know, I know. These sound like the ramblings of someone who's spent too much time scrolling through conspiracy theory forums. But stick with me here, because these provocations don't lead where you might think they do. And I'm going to show you why through a single story: one involving a brilliant physicist, a naturalist with doubts, and an Earth that turned out to be far older than anyone imagined.
This is Gerd from Free AstroScience, where we take complex scientific principles and strip them down to their essence. Today, I'm wrestling with something that's been keeping me up at night: the strange, beautiful paradox of how being wrong can be more valuable than being right.
The Physicist Who Got It Wrong (And Changed Everything)
Picture this: it's 1863, and William Thomson, the physicist who would later become Lord Kelvin (yes, that Kelvin, the absolute zero guy), just dropped a bombshell. Earth is only 100 million years old, he declared. The maths checked out. The thermodynamics were sound. And Charles Darwin's still-new theory of natural selection? Well, it needed Earth to be at least twice that old to work.
I imagine Darwin sitting there, reading Kelvin's calculations. The weight of doubt pressing down. The rustle of papers in his study. Maybe the smell of tea gone cold as he contemplated this challenge that could unravel everything he'd proposed.
Darwin didn't lash out. He didn't dismiss Kelvin as some thermodynamics denier. Instead, he wrote something rather remarkable: we simply don't know enough about "the constitution of the Universe and of the interior of our globe" to be certain. There's epistemic humility for you—wrapped in Victorian prose.
Here's where it gets interesting. Kelvin was wrong. Dramatically, comprehensively wrong. Earth turned out to be about 4.5 billion years old, roughly 45 times his estimate. But his wrongness? It became one of the most valuable mistakes in scientific history.
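A quick aside for the curious before we move on: here's a rough sketch of the kind of calculation Kelvin ran. The numbers below are my own illustrative round figures, not his exact ones. Treat Earth as rock cooling by conduction from a molten start, measure how quickly temperature rises as you dig down, and ask how long the cooling must have been running to leave that gradient.

```python
import math

# A rough sketch of Kelvin's cooling-Earth argument, with illustrative round numbers
# (my own assumptions, not Kelvin's exact figures).
# Model: rock cooling by conduction from a molten start. The surface temperature
# gradient after time t is G = T0 / sqrt(pi * kappa * t), so t = T0**2 / (pi * kappa * G**2).

T0 = 3900.0      # assumed initial temperature of molten rock, in kelvin
kappa = 1.2e-6   # assumed thermal diffusivity of rock, in m^2 per second
G = 1.0 / 28.0   # observed geothermal gradient: roughly 1 kelvin of warming per 28 m of depth

t_seconds = T0**2 / (math.pi * kappa * G**2)
t_years = t_seconds / 3.156e7  # seconds in a year

print(f"Estimated age of Earth: {t_years:.1e} years")  # roughly 1e8, i.e. about 100 million
```

The physics is sound as far as it goes. What the model leaves out is radioactive heating inside the Earth and the slow churning of the mantle, and that's exactly why the answer lands tens of times too young.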
When Wrong Becomes Right
You see, Kelvin's misunderstanding sparked something extraordinary. Physicists started testing hypotheses about Earth's heat and rigidity. Geologists developed entirely new methods using radioactive decay to date rocks. Palaeontologists got creative about evolutionary mechanisms. Even Darwin himself began exploring sexual selection as a way to speed up evolutionary change.
None of this would have happened—at least not in the same way, not with the same urgency—if Kelvin had just gotten it right the first time.
This is what I've come to think of as a "valuable misunderstanding." It's not valuable because Kelvin was some misunderstood genius (spoiler: he stubbornly clung to his wrong answer even after the evidence mounted against him). It's valuable because of what other scientists did with his mistake. They transformed confusion into clarity. They built entire methodologies from the rubble of his error.
The texture of this process matters. It's not smooth or elegant. It's gritty, frustrating work—the kind that requires you to take someone's wrong answer seriously enough to understand why it's wrong, and in doing so, discover truths you'd never have found otherwise.
The Anatomy of Being Usefully Wrong
Let me break down what makes a misunderstanding valuable, because this distinction matters more than ever right now. A valuable misunderstanding has a few key characteristics.
First, somebody has to actually correct it—not just disagree with it, but do the hard work of building new theories, collecting new data, developing new methods. Correction requires construction, not just contradiction.
Second, this corrective process has to be intentional and reliable. Scientists need to be aiming at truth, using methods that consistently turn confusion into comprehension. It can't be accidental or fluky.
Third, and this is crucial, the process has to remain open-ended. Today's understanding might become tomorrow's valuable misunderstanding. Science doesn't deal in final answers—it deals in progressively less wrong answers.
Here's where I need you to follow me carefully, because the implications get interesting. Understanding isn't just about knowing what's true—it's about grasping what's possible. If I know a fire started from a match, that's understanding. But if I also know it could have started from lightning, embers, or chemical reactions, my understanding deepens. I'm not just cataloguing facts; I'm mapping the landscape of possibility.
This is why science isn't really in the business of collecting facts. Science is in the business of understanding—of explaining why and how things work, and what else might work instead.
The Gravity of Being Wrong (But Still Useful)
Now, before you think all misunderstandings are secretly valuable, let me tell you about Roger Babson. This early-20th-century entrepreneur made a fortune applying Newton's laws to the stock market. But Babson had this bizarre fixation—he hated gravity. Blamed it for his sister's drowning. Thought it was responsible for accidents, diseases, even problems in people over 60.
So naturally, he founded the Gravity Research Foundation in 1949. Colleges that took his money had to erect monuments declaring gravity "Enemy Number One". The foundation still exists today, 75 years later, funding gravity research.
Did Babson's weird misunderstanding about gravity's malevolence lead to scientific advances? Sure. But here's the thing—researchers didn't have to engage with his bizarre ideas to do good work. They could just ignore him and pursue their research. There was no corrective process transforming his specific misunderstanding into understanding.
That's not a valuable misunderstanding in the sense I'm describing. That's just a confused rich guy accidentally funding useful research. The difference matters.
When Misunderstandings Expire
This brings us to the uncomfortable present. Right now, we're watching science denialism rise—vaccine sceptics in positions of power, climate change deniers appointed to scientific posts, crucial research programs defunded for political reasons.
The vaccine-autism hypothesis offers a perfect case study. When Andrew Wakefield published his fraudulent paper in 1998, it sparked a misunderstanding. But here's what happened next: scientists took it seriously. They tested the hypothesis. They explored how vaccines interact with the immune system. They collected data. They generated alternative explanations.
In other words, the scientific community did exactly what it's supposed to do. They transformed that misunderstanding into understanding. For a time, it was valuable.
But valuable misunderstandings have expiration dates. Once you've wrung all the insight from an error—once you've tested it thoroughly, collected the data, explored the alternatives—continuing to push that same misunderstanding stops being productive. It becomes obstruction.
This is the key sin of science deniers today. They're clinging to misunderstandings that are no longer valuable. The hard work of correction has been done. The alternative hypotheses have been explored. The data has been collected. Yet they demand we start over, as if no learning has occurred.
It's like Kelvin refusing to update his estimate of Earth's age even after radioactive dating proved him wrong. At that point, his dissent stopped being valuable and started being stubborn.
The Consensus Trap
But here's where I'm going to make you uncomfortable again. The problem with science deniers isn't that they disagree with scientific consensus. Because if disagreeing with consensus were the sin, then Galileo, Newton, Einstein—they'd all be in the same category as flat-earthers and anti-vaxxers.
Scientific consensus isn't some sacred end-state that we should worship. Some of the healthiest scientific communities are those brimming with valuable misunderstandings but lacking consensus. They're exploring multiple lines of research, critically engaging each other, building new models and methods.
Consensus can be the result of groupthink. It can be the product of laziness. It can reflect resistance to alternative viewpoints. A community that abhors valuable misunderstandings but maintains consensus is probably doing something wrong.
What matters isn't consensus itself—it's how that consensus was achieved. Was it reached through robust mechanisms for cultivating valuable misunderstandings? Did the community effectively respond to dissent, disagreement, and alternative explanations? If yes, then that consensus is trustworthy. If no, it's just agreement.
The smell of good science isn't conformity. It's the metallic tang of ideas being stress-tested, the friction of theories rubbing against evidence, the texture of old certainties being worn smooth by new discoveries.
The Defunding Danger
This is why defunding science is so much more dangerous than denying science. Science denial is annoying, frustrating, sometimes harmful. But as long as the corrective processes remain intact—as long as scientific institutions have the resources to transform misunderstandings into understanding—denialism can be handled.
When you defund research programs, though, you're not just disagreeing with scientists. You're dismantling the very mechanisms that turn confusion into clarity. You're removing the infrastructure that makes valuable misunderstandings possible.
Take the recent announcement that the US Department of Health and Human Services is cancelling nearly half a billion dollars in mRNA vaccine research. That's not just about one technology or one line of enquiry. It's about gutting the capacity to respond to future misunderstandings, to correct new errors, to transform tomorrow's confusion into understanding.
Without funding, scientists can't build new models. They can't collect new data. They can't develop new methods. The corrective processes grind to a halt. And when those processes stop, so does scientific progress.
The Humility We Need
All of this demands something uncomfortable from everyone involved in science—epistemic humility. And I mean everyone: scientists, politicians, educators, journalists, and yes, the public.
If you're a scientist, you need to hold your understanding lightly. You might be the next Kelvin—brilliant but wrong in ways you can't yet see. Or you might be one of Kelvin's critics, in which case you need to be generous enough to find value in someone else's error.
If you're a politician making decisions about scientific funding, you need to understand that your role isn't just to support research you agree with. It's to maintain the corrective processes that make scientific progress possible—even when they produce results that make you uncomfortable.
If you're an educator or journalist, your job isn't just to report what scientists currently understand or to present "both sides" of every disagreement. It's to help people understand how scientists grapple with misunderstanding, how they transform dissent into discovery.
And if you're part of the public trying to decide whether to trust scientific experts? Look for evidence of corrective processes. Look for communities that take dissent seriously but don't let misunderstandings linger past their value. Look for scientists who update their views when evidence demands it.
What I've Learned From Being Wrong
I started this essay with three provocative ideas that sounded mad. Let me tell you what I've learned about each.
Science deniers aren't the real threat—the dismantling of corrective processes is. Scientific consensus isn't overrated; we've just been measuring it wrong. And yes, some of history's most valuable contributions came from people who were spectacularly wrong.
But the deeper lesson is this: science isn't really about being right. It's about building reliable processes for becoming less wrong over time. It's about communities that can transform misunderstanding into understanding, confusion into clarity, and dissent into discovery.
The next time someone challenges a scientific consensus, your first question shouldn't be "Are they wrong?" Of course they might be wrong. Your first question should be "Can their challenge spark valuable corrections?" And your second question should be "Do we still have the institutional capacity to perform those corrections?"
Because that's where the real fight is happening. Not in the disagreement itself, but in our collective ability to do something productive with it.
From my corner in Bologna, watching the world wrestle with these questions, I keep coming back to Darwin's response to Kelvin. He didn't claim certainty. He didn't dismiss the challenge. He acknowledged the limits of what they knew and kept working.
Maybe that's the model we need right now. Not unshakeable confidence in current understanding, but unshakeable commitment to the processes that improve understanding. Not worship of consensus, but trust in correction.
Being wrong isn't the enemy of science. Staying wrong is.
Written for you by Gerd Dani of Free AstroScience, where we believe the best way to understand complex ideas is to watch smart people work through being wrong.
