Who Decides You're Beautiful? The Shocking Truth About AI Beauty Apps
How algorithms are secretly programming our self-worth and what we can do to fight back
I'll be honest with you—I never thought I'd be writing about beauty apps at 2 AM, but here we are. Sometimes the most important conversations start in the most unexpected places, and this one began when I stumbled across a fascinating piece by Italian writer Barbara Frandino that absolutely floored me.
You see, I've long been drawn to the intersection of technology and human psychology, but what Frandino uncovered about digital beauty standards made me realise we're facing something far more sinister than mere vanity apps. We're witnessing the systematic programming of human self-worth, and frankly, it's time we talked about it.
The Beauty Algorithm That Judges Us All
Picture this: you're scrolling through Instagram when an ad pops up promising to rate your attractiveness. Seems harmless enough, right? Just a bit of fun. But what if I told you that single click is feeding into a massive system that's quietly reshaping how entire generations view themselves?
That's exactly what happened to Frandino when she decided to test one of these beauty-rating apps. The algorithm gave her a brutal 50% attractiveness score but generously awarded her points for "accessibility"—a term that felt, in her words, "vaguely disrespectful." The app then predicted her age as 31, essentially telling her she wasn't beautiful but at least appeared youthful.
What happened next is where things get really troubling. Within hours, her social media feeds were flooded with beauty "solutions"—facial exercises, nose-slimming techniques, miracle creams, and yoga routines promising to make her husband "grateful" for her improved physique. The algorithm had identified her insecurity and immediately began exploiting it.
This isn't just about one woman's experience with a beauty app. This is about how artificial intelligence is systematically categorising, scoring, and commodifying human appearance on a global scale.
Meet the Man Behind Your Beauty Standards
Here's something that might surprise you: the person deciding whether you're attractive enough probably looks nothing like the standards they're programming. Frandino takes us on a virtual journey to meet "Ethan," a composite character representing the young male programmers who create these beauty algorithms.
Ethan lives in Austin (though he could just as easily be in San Francisco or Seattle), surrounded by the sterile minimalism that seems to define tech culture—smooth white walls, polished concrete floors, geometric furniture. His apartment reflects his algorithmic approach to beauty: "smooth, polished, shiny, devoid of any negativity, any wound," as philosopher Byung-Chul Han describes our dominant aesthetic.
The irony? Ethan himself wouldn't score well on the beauty standards he's programming. But that doesn't matter to him because he's not the target market—we are.
Working from his MacBook Pro with multiple monitors, Ethan isn't asking philosophical questions about what makes something beautiful. Instead, he's quantifying it, measuring it, automating it. He feeds his AI millions of photos of faces and bodies that received the most likes and views on social platforms, creating a mathematical average of digital popularity that becomes our beauty standard.
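That "mathematical average of digital popularity" can be sketched in a few lines. This is a toy model, not Frandino's description of any real app's code: every feature name, number, and like count below is invented for illustration. Represent each face as a feature vector, weight it by the engagement its photo received, and call the weighted mean the "ideal":

```python
# Toy sketch of an engagement-weighted "beauty ideal".
# Hypothetical data: each face is a feature vector
# (e.g. jawline sharpness, eye size, skin lightness, on a 0-1 scale)
# paired with the likes its photo received.
faces = [
    ([0.9, 0.8, 0.95], 12_000),  # heavily liked by the feed
    ([0.4, 0.5, 0.30], 150),     # barely surfaced at all
    ([0.8, 0.9, 0.90], 9_500),
]

def engagement_weighted_ideal(faces):
    """Average the feature vectors, weighted by like counts.
    Faces the platform already amplified dominate the result."""
    total_likes = sum(likes for _, likes in faces)
    dims = len(faces[0][0])
    return [
        sum(vec[i] * likes for vec, likes in faces) / total_likes
        for i in range(dims)
    ]

ideal = engagement_weighted_ideal(faces)
# The low-engagement face barely moves the average: the "standard"
# simply reproduces whatever the feed already rewarded.
print([round(x, 2) for x in ideal])
```

The point of the sketch is the circularity: the input is popularity on the platform, so the output can only ever be more of what the platform already promoted.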
The result? A homogenised ideal featuring predominantly white or vaguely Asian features: plump lips, sharp jawlines, high cheekbones, very light skin, large eyes, and straight, thick hair. It's a beauty standard born from the intersection of Hollywood and Seoul, designed to perform well in Zoom calls and LinkedIn profiles.
The Hidden Bias in Beauty Algorithms
Now, here's where things get particularly disturbing. These algorithms aren't neutral, despite what their creators might claim. Every programmer unconsciously embeds their biases—their preferences, aversions, and worldview—into the code they write.
Consider this shocking example from 2009: Joz Wang, a Taiwanese-American woman, bought a new Nikon camera for Mother's Day. When she photographed her family, the camera repeatedly flagged their portraits with the warning "Did someone blink?" Its blink-detection software, designed to catch closed eyes, kept misreading her family's open eyes as blinks: it simply couldn't recognise eye shapes that differed from the faces it had been built around.
The programmers didn't intentionally create a racist camera—they simply programmed it based on their own experience and appearance. But the result was technology that literally couldn't see certain faces as valid.
This same bias permeates beauty algorithms today, but it's far more subtle and therefore more dangerous. As technology writer James Bridle warns in his book "New Dark Age," these biases are now so well-hidden that they're nearly impossible to detect. We're unconsciously adopting the aesthetic preferences of young, white, heterosexual American men without even realising it.
The Commercial Machine Behind Our Insecurities
But here's the thing—this isn't just about biased programmers or flawed algorithms. There's a massive commercial apparatus working to make you feel inadequate, and it's more sophisticated than you might imagine.
The process works like this: social media platforms use AI to establish beauty standards, then bombard you with content that makes you feel insufficient compared to those standards. Once you've internalised this inadequacy, the market swoops in with solutions. Suddenly, every part of your body becomes a separate problem to solve—your face needs contouring, your nose needs slimming, your body needs sculpting.
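That pipeline, score the user, flag whatever scores lowest as an "insecurity," then serve a product for exactly that feature, fits in a few lines of toy code. Every score, threshold, and ad name below is invented for illustration; no real platform publishes its targeting logic:

```python
# Toy sketch of the insecurity-to-ad pipeline described above.
# All product names and thresholds here are hypothetical.
AD_CATALOGUE = {
    "nose": "nose-slimming technique",
    "jawline": "facial contouring exercises",
    "skin": "miracle cream",
}

def target_ads(feature_scores, threshold=0.6):
    """Flag every feature scoring below the threshold as an
    'insecurity' and return the matching product pitches."""
    flagged = [f for f, score in feature_scores.items() if score < threshold]
    return [AD_CATALOGUE[f] for f in flagged if f in AD_CATALOGUE]

# A 50%-style overall rating, decomposed into per-feature scores:
user = {"nose": 0.45, "jawline": 0.7, "skin": 0.5}
print(target_ads(user))  # the feed now has a "solution" for each low score
```

Notice what the design choice implies: the system has no concept of "good enough." Any feature under the threshold becomes a sales opportunity, which is why Frandino's single 50% score produced a flood of unrelated fixes within hours.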
The advertising isn't framed as vanity, though. Oh no, it's about your "right" to self-care, your "right" to look younger, your "right" to be sexually desirable. It's positioned as empowerment while systematically dismantling your self-worth.
Italian philosopher Maura Gancitano, in her book "Mirror Mirror on the Wall," explains how this objectification serves a political purpose: "It keeps us in a situation of mirroring, makes us identical to ourselves, doesn't free us from ourselves, doesn't shake us up, and doesn't give us time to linger." In other words, when we're obsessed with our appearance, we don't have mental energy left for bigger questions—like who's profiting from our insecurities.
When Algorithms Control What We See
Today's definition of beauty isn't based on artistic tradition, cultural heritage, or personal preference. It's defined by what creates "engagement, traffic, likes, shares"—what's capable of generating attention and, ultimately, profit.
Even when we try to reject these narrow standards, the algorithm adapts. Refuse to engage with content about extreme thinness? The system will show you "body positive" influencers who still happen to be conventionally attractive and selling products. The message changes, but the commercial intent remains the same: associate beauty with consumption.
This is what Frandino experienced when she decided to disengage from beauty content. After just two days of avoiding sponsored posts and influencer content, Facebook offered her a dating app that promised to connect her with someone who would "appreciate her for who she is." The algorithm had simply switched tactics, finding a new angle to exploit her desire for acceptance.
The Global Impact We Can't Ignore
The implications of algorithmic beauty standards extend far beyond individual self-esteem. We're witnessing the homogenisation of human aesthetic ideals on a global scale, with tech companies based primarily in Silicon Valley and Seoul determining what billions of people should aspire to look like.
This isn't just about beauty—it's about cultural imperialism through technology. Local beauty traditions, diverse aesthetic values, and unique features that reflect our heritage are being systematically devalued in favour of a manufactured standard designed to maximise engagement and profit.
Young people, in particular, are growing up in a world where their worth is literally being calculated by algorithms they don't understand, created by people they'll never meet, for purposes that have nothing to do with their wellbeing.
Fighting Back Against Algorithmic Control
So what can we do about this? First, we need to understand that awareness is power. Every time you encounter a beauty filter, sponsored post, or "rate my attractiveness" app, remember that you're not seeing neutral technology—you're seeing the monetisation of human insecurity.
We need to actively diversify our inputs. Follow creators who look different from algorithmic beauty standards. Engage with content that celebrates uniqueness rather than conformity. Support platforms and apps that don't rely on appearance-based engagement.
Most importantly, we need to have these conversations. Talk to young people about how these algorithms work. Discuss the commercial interests behind beauty standards. Share stories about real beauty—the kind that comes from character, creativity, kindness, and authenticity.
Reclaiming Our Right to Define Beauty
As someone who spends considerable time thinking about how technology shapes our world, I believe we're at a crucial crossroads. We can either accept algorithmic beauty standards as inevitable, or we can actively resist the commodification of human worth.
The choice isn't just personal—it's political. Every time we reject artificial beauty standards, we're asserting our right to define ourselves rather than be defined by algorithms designed to profit from our insecurities.
Beauty has always been complex, subjective, and deeply personal. It reflects our cultures, our experiences, our humanity. We can't let it be reduced to a percentage score calculated by an app designed by someone who's never met us and doesn't care about our wellbeing.
The technology exists to serve us, not the other way around. It's time we started acting like it.
At Free AstroScience, we believe in making complex technological concepts accessible to everyone. If this article made you think differently about the intersection of technology and human psychology, that's exactly what we're here for. The future belongs to those who understand how these systems work—and more importantly, how to resist their more harmful effects.