Is AI Destroying Our Trust in Everything?


Hello, and welcome. It's Gerd, and I'm glad you're here at FreeAstroScience, where we try to make sense of a complex world together. Recently, I’ve been grappling with a profound sense of unease. It’s that feeling you get when you see a stunning piece of art, a shocking news headline, or a groundbreaking scientific claim online, and your first thought isn’t awe or curiosity, but a quiet, nagging question: Is this even real? This isn't just healthy scepticism; it’s a symptom of a much deeper problem—a new crisis of cultural credibility fuelled by Artificial Intelligence.

Now, you'll hear some pretty bold claims about this new era. Some will tell you that AI is the ultimate democratiser of creativity, finally making everyone an artist. Others will argue that with AI, we can achieve perfectly objective, data-driven news. And you'll definitely hear the corporate line that AI will simply automate tedious tasks, freeing us humans for higher-level thinking. But I think these views are dangerously naive. When a machine can mimic a master's brushstrokes without the years of struggle, does it elevate us, or does it devalue the very idea of craft? When the same technology that could report facts can also generate hyper-realistic deepfakes, are we closer to truth or a more sophisticated form of deception? And when our most revered cultural institutions start censoring human art out of fear of what an algorithm-fuelled mob might do, are we really being freed to think, or are we being quietly corralled into safer, blander pastures?

Let's face the facts about what's happening.


+++


The Authenticity Gap: When Clicks Replace Craft

I recently read a fascinating, and quite sharp, critique of André 3000’s latest piano project. The musical icon released a series of improvisations, openly admitting he can’t name the chords he’s playing but was inspired by titans like Thelonious Monk. The author, Sean Murphy, contrasted this with the obsessive dedication of masters like the sushi chef Jiro, who makes apprentices work on rice for three years before they can even touch fish, or the jazz drummer Max Roach, who spent a lifetime perfecting his rhythm. Their genius wasn't a happy accident; it was earned through thousands of hours of repetition, respect, and a relentless pursuit of perfection.

This, right here, is the heart of the matter. AI is the ultimate shortcut. It allows anyone to generate content that looks the part without the lived experience, the struggle, and the soul that makes great art resonate. It’s the digital equivalent of being a dilettante, and while there's nothing wrong with dabbling, the problem arises when the counterfeit is indistinguishable from the authentic. We're already seeing this rot in other fields. In biomedical research, for instance, AI-driven "paper mills" are churning out hundreds of formulaic, low-quality studies that look legitimate but are often just superficial correlations, flooding science with misleading information.

When the process is devalued—when the "10,000 hours" of dedication can be simulated in ten seconds—we lose more than just a metric for effort. We lose the story, the humanity, and the very soul of the work. And as consumers of this content, we are left adrift in a sea of artifice, unable to tell if we’re experiencing a moment of human genius or a clever algorithmic echo.

The New Gatekeepers: Fear, Algorithms, and Context Collapse

You might think our cultural institutions—our museums, galleries, and presses—would be the bulwark against this tide of inauthenticity. But something strange and frankly chilling is happening. They're becoming afraid. Not of government censors, but of you, me, and the unpredictable power of the algorithmic social sphere.

Researchers call this a "wicked problem". It works like this: social media platforms, governed by machine-learning algorithms, create what’s known as algorithmic sociality. They feed us content designed to provoke a reaction, creating echo chambers and filter bubbles. This environment is ripe for what’s called "context collapse"—where an image or idea is stripped of its original meaning and goes viral for all the wrong reasons. The fear of a "digital wildfire", a sudden and uncontrollable online mob, is now a major factor in cultural decision-making.

Take the case of the Philip Guston art exhibition. In 2020, four major museums, including the Tate Modern, postponed a retrospective of his work. Why? Because some of his most powerful anti-racist paintings feature cartoonish, hooded Ku Klux Klan figures, which Guston used to explore the "banality of evil" and his own complicity in a racist society. The museum directors feared that in the "volatile climate" of the time, these images, taken out of context and spread online, could cause "harm and pain". They chose pre-emptive censorship over trusting their audience's intelligence. They were afraid of a controversy that hadn't even happened yet.

This isn’t an isolated incident. In Lithuania, a public art project that involved wrapping a controversial Soviet-era statue in moss—a metaphor for time and healing—was forcibly dismantled by city authorities after a single misinformed Facebook post sparked fear of a backlash. The city officials were literally afraid of what might happen if someone saw the art. These institutions are no longer just curating culture; they're managing risk in a digital world they can't control. The result is a quiet, insidious censorship that robs us of the chance to engage with difficult, challenging, and important work.

So, What Do We Do? The Human Antidote to an Algorithmic World

It’s easy to feel hopeless, isn’t it? If our experts are running scared and our creative fields are being flooded with fakes, where do we turn? Some suggest we should fight AI with AI—using algorithms to spot suspicious patterns in scientific papers or to identify deepfakes. That’s part of the solution, for sure. But it’s not the whole story.

The truest antidote to impersonal technology is, and always has been, more authentic humanity. As Sean Murphy so brilliantly put it, "The best way to resist bad stories is telling better ones." This crisis calls for a renewed commitment from all of us. For creators, it means continuing to do the hard work—to find originality and soul where a machine can only find patterns. For institutions, it means finding the courage to trust the public again, to facilitate difficult conversations rather than running from them.

And for us, the audience, it means becoming more discerning consumers. It means championing authentic voices and demanding transparency. It is a collective responsibility to safeguard the truth and protect the integrity of our shared culture. We must shift our internal value system from rewarding what is merely slick and fast to what is thoughtful, genuine, and deeply human. The truth, after all, will not stand if we do not defend it.

This journey won't be easy. The digital transformation has created a governability gap, a complex problem without a simple solution. But the first step is recognising the challenge for what it is—not a war against machines, but a fight for the credibility of our own culture. It's a fight for the idea that some things, like mastery, authenticity, and soul, are worth the effort.

Thanks for thinking through this with me.

All the best,
Gerd Dani
FreeAstroScience

