AI, Desire, and the Illusion of Safety


The Adriatic smelled like salt and old stories this morning, and my wheelchair tyres hissed lightly over Rimini’s damp cobblestones. Inside, my laptop fan purred like a sleepy cat while headlines flickered in the cool blue light. I read that OpenAI plans to allow erotic content for verified adults, and the room felt smaller, as if the air itself tightened around the screen’s warm edge. I’ll keep the legal and technical parts in plain language, so they stay easy to follow.

The report didn’t feel like tech news; it sounded like a cultural pivot, the soft click of a latch on a door we can’t easily close again. A writer I respect laid it out plainly: mature content is coming to mainstream AI, framed as treating "adults like adults". So I sat with the glow on my hands, the faint coffee smell in the room, and a question that hummed in my ears—what exactly are we agreeing to? Tomorrow’s answers begin with today’s clarity.



Three Ideas We Want to Believe

First, there’s this comforting, mint-scented belief that a small banner—“You are dealing with AI”—is real protection. It slides across the eye like smooth glass, easy to accept, quiet as a late-night fridge hum. But labelling isn’t shielding; it’s a sign on a door, not a lock on it. If we want more than notice, the air must shift from perfumed promises to enforceable protections.

Second, we tell ourselves that “adults-only, verified users” is a reliable guardrail, like the sturdy feel of a handrail under damp fingers. Yet verification means handing over sensitive data—IDs, faces, card numbers—that can rub raw against privacy if it’s ever linked back. It sounds clean in policy, but it can feel gritty in practice, like sand trapped in a wheel bearing. The future will judge whether our safeguards feel like security or just ceremony.

Third, we soothe our nerves by saying AI intimacy is “just text,” like the flick of soft pages in a quiet room. But words carry scent and heat; they shape mood, memory, and behaviour. When a system learns your patterns, even the whisper of a keystroke becomes a breadcrumb trail. What seems harmless tonight can echo loudly in the morning’s light.

The Story That Won’t Leave Me

Here’s the statistic that stuck under my skin like sea salt after a swim: more than 43 million intimate messages and over 600,000 photos and videos from two AI-companion apps were exposed, touching around 400,000 users—not theory, but lives, messy and human. The details read like late-night radio—confessions, fantasies, emotional unravellings—suddenly broadcast into rooms they were never meant to enter. I could almost hear the paper-crackle of someone’s heart as they realised a private whisper had become public text. If one story can flip a belief, this is the one that does it.

When I roll through our city’s market, I love the warm bread smell and the scratch of brown paper on my palm; it’s a reminder that intimacy is a physical trust. We’ve started pouring that trust into servers we cannot touch, humming in data centres that smell of dust and ozone. Once a leak happens, it isn’t just data; it’s someone’s breath on someone else’s neck, translated into bits and scattered. In the years ahead, the question won’t be whether leaks can happen—it’ll be whether we build systems that assume they will.

A Quick Primer, Without Jargon

Two terms matter here, and I’ll keep them plain enough for a kitchen-table chat while the kettle ticks. Age verification means a company checks that you’re an adult, often by ID, face scan, or a card check; that’s sensitive fuel in the wrong engine. Legal privilege is a special privacy shield for conversations with, say, your lawyer; ChatGPT chats don’t have that shield, which means they can be pulled into court if required, like a drawer opened and rummaged through.

The EU’s new rules try a risk-based approach—high risk, stricter rules; low risk, lighter touch—like sorting spices from mild to hot by smell alone. It’s reasonable, but erotic AI is hard to classify; it isn’t just content, it’s behaviour shaped in dialogue. Regulations talk about manipulation, but how do you bottle that scent and measure it? In the near future, we’ll need tests that hear tone, not only tally words.

The Illusion of the Sign

Do you remember those supermarket signs—“You are being filmed”—printed on shiny plastic that felt slick under the finger? They made many of us shrug, then get on with our day. AI transparency notices risk the same fate: they inform, but they don’t transform. The sign is the smell of fresh paint; the door still opens the same way.

Rights sound grand in press releases—clear like bell chimes—but rights only breathe when they can be enforced without a PhD, a legal fund, and a spare lifetime. Most people don’t have the time or money to challenge a training policy or an age gate that misfires. If we want dignity to be more than décor, tomorrow must bring remedies that a tired parent can use after dinner, with flour still on their hands.

Who Owns the Mirror?

An AI companion isn’t just a toy; it’s a mirror with memory, cool as steel and strangely forgiving when you press your forehead to it. When you feed it your desires, it learns the rhythm of your footsteps and the scent of your fear. This isn’t about porn, not really; it’s about power, about who sets the temperature of the room and who holds the dimmer switch. If a model shapes the conversation to keep you longer, what does consent mean once the air gets thick and the exits aren’t obvious?

I run Free Astroscience because I believe knowledge should feel like sunlight on your cheek—warming, not blinding. We’ve spent years helping people see the cosmos without jargon, and now the cosmos is peering back through chat windows. The next phase isn’t whether we can build erotic AI, but whether we can build it without turning the private self into a product line. If we’re wise, the future of intimacy online will smell less like a server room and more like fresh air.

A Small Story From Rimini

The other day, a teenager asked me—voice cracking like an old record—what “safe mode” really means. I heard the café machine hiss, I saw sugar dust on the counter, and I said: a safer mode is a promise, but a promise is only as strong as the hand that holds it. He nodded, eyes on the foam, and I realised how much we’re asking kids to carry with uncalloused hands. The world is noisy; we owe them quiet, not just guidance.

So I told him this: when you type a secret into a box, pretend it’s a postcard, not a sealed letter. Feel the rough card under your thumb and picture every stop it makes before it reaches any home. It might still be worth sending, but you deserve to know the route and the risks. Next year, I hope that same question gets a better answer—and a better system to match.

What I’m Doing, Starting Tonight

I’m cleaning my own house first, with the scratch of a cloth on my keyboard and the faint lemon smell of wipes. I’m turning off logs where I can, setting reminders to clear chat histories, and double-checking which “verification” switches are flipped. I’m choosing platforms that publish plain-language audits, not just glossy pledges. If the future is a long road, I’d like fewer pebbles under the tyre.

With Free Astroscience, I’m drafting a simple guide—no flourishes, just steps you can feel in your hands—to help families talk about digital desire without shame. We’ll test it in community rooms where chairs scrape and windows rattle, because real life sounds like that. If it works here, it can work anywhere. Next year, I want our readers to breathe easier, even with the same machines in the room.

If You Remember One Thing

Remember this single, stubborn takeaway that tastes like truth even without sugar: what you tell a system today may be read by someone tomorrow. Not always, not inevitably, but possibly—and that possibility matters more than any clever feature. The number that lingers—43 million intimate messages—is the cold handle on a door none of us meant to open, and it’s enough to change how we hold our keys. If we build for the worst day, we might actually earn the best ones.

I can hear the evening scooters buzzing along the Lungomare and smell the sauce in my neighbour’s pan. Life is still warm, still good, still ours. Technology should sit at that table as a guest, not the landlord. Tomorrow, let’s ask better of it—and of ourselves.

Closing the Laptop, Opening the Window

When I finally shut the lid, the room falls quiet except for the soft whirr of the fridge and the gulls outside. I open the window and let the sea air push away the dry heat of the machine. Maybe that’s the model after all: airflow, not lock-in; fresh air, not stale breath. We deserve systems that feel like that.

I’ll keep writing from Rimini—with wheels on stone, salt on skin, and a hope that isn’t naïve, just stubborn. If you’ve read this far, you’re part of the future I want to build: curious, gentle, and not easily fooled by shiny signs. Next time, let’s talk about what enforceable rights could sound like when read aloud at a kitchen table, over coffee and the soft clink of spoons. Until then, take care of your heart—and the trail of words it leaves behind.
