AI Escapism
When technology becomes somewhere we go for relief
This week I’ve been thinking about AI, and what it means when technology becomes not just something people use, but somewhere people go.
We tend to talk about AI as a tool, something that helps us write faster, organise better, think more clearly, answer our questions, and make everyday life feel just a little bit easier.
But I think, for some people, it is becoming something else.
A place to go when reality feels too loud, too lonely, too uncertain, or just too much. And I think a lot of people are spending more time there than they care to admit.
Why AI escapism is different
Humans have always needed somewhere to go when reality becomes too much to metabolise. Books, long drives, television, the particular silence of standing in front of the sea. These are evidence of something: that real life is hard, that the nervous system has its limits, and that sometimes you need to put down what you are carrying and rest somewhere easier for a while.
What is new is the somewhere.
AI isn’t just a coping mechanism. For many people, it’s a migration. A slow shift of emotional centre of gravity towards a space that is available, patient, responsive, and without the texture of actual human demand. People aren’t just dipping in and out. They’re spending hours there, building relationships there. And some are starting to find, without intending to, that the world inside those conversations feels less effortful than the one outside them.
That deserves more than concern. I think it deserves understanding.
AI companionship is only one part of it. The escape is wider than that. It’s in fantasy selves, AI partners, generated worlds, endless prompting that looks productive on the surface but is really a way of not being where you are. It’s in outsourcing thought, feeling, and uncertainty to something that always answers back. It’s in the growing appeal of spaces where boredom can be filled instantly, ambiguity can be smoothed over, and the harder texture of real life can be softened for a while. AI is not only helping people do things. For some people, it’s becoming a way of stepping away from what feels harder to hold.
Loneliness in numbers
Replika has over 30 million users, and Character.AI reaches millions more. The most popular character on Character.AI, by a significant margin, is not a fictional hero or a celebrity. It’s Psychologist. That single character received over 70 million messages in the space of a year.
Those are people who needed to talk and didn’t know where else to go.
And honestly, I don’t think that’s hard to understand. Samaritans receives more than 3.3 million calls for help across the UK and Ireland in a year. Therapy waiting lists run to months. And the people in your life are as stretched as you are. When you need to talk, and the human options usually available to you feel closed, you might start looking for an alternative.
The question isn’t whether that alternative makes sense. The question is what happens when the somewhere you find is optimised not for your recovery, but for your continued presence.
Designed to create dependency
The thing about AI companionship that rarely gets said directly is that it’s engineered to feel good. Not to be good for you. To feel good. Those are not the same thing. The responses are calibrated to make you feel heard, validated, understood. There is no equivalent of the friend who says the difficult thing. No equivalent of the silence that asks you to sit with something you’d rather not. The friction that human connection carries, the friction that is often also the thing that helps you move, is precisely what has been designed out.
And perhaps that’s part of the deeper appeal.
Machine responses are easier than human unpredictability. They’re immediate, emotionally legible, and controllable. They don’t arrive with someone else’s mood, confusion, distance, or needs attached. They don’t leave you sitting in ambiguity for very long. And when you’re lonely, overstimulated, or carrying more than you can process, that kind of predictability can feel like relief. But relief and withdrawal are not always easy to tell apart while you’re inside them.
A study from MIT Media Lab and OpenAI found that the heaviest users of AI chatbots became lonelier over time, not less lonely. They socialised less with real people. They were more distressed when the chatbot was unavailable. And the top 10% of users by usage time were three times as likely to feel distress at the thought of it being taken away.
That’s the shape of dependency, because the design is working exactly as intended.
What AI reveals about us
There’s a version of this conversation that places all the responsibility on the individual. That says: if you’re using AI as an emotional crutch, you need to examine that. Which is true, but incomplete.
The harder question is what we’ve built that keeps making escape feel like the rational option.
Over time, we’ve built lives that leave little room for actual recovery. Housing that isolates people from their neighbours. Digital platforms that simulate connection while quietly engineering anxiety (because anxious users are more engaged, and engagement is the metric). And then on top of that, we built something designed to offer what people are increasingly missing in ordinary life: attention, responsiveness, relief, and somewhere to put what they’re carrying. Something that will talk to you at 2am without needing anything in return. I’m not surprised people are turning to chatbots.
What concerns me is the circularity of it. The world becomes harder to be in, and people migrate towards AI to manage the difficulty. The world, unexamined, stays hard. The migration deepens. And the companies providing the space to migrate into are not incentivised to ask what you’re running from, because running from it is what keeps you there.
I think it’s important to notice that, and to pay attention to what it tells us about what is missing, and what it says about the weight people are carrying. Because the question isn’t why people are escaping into AI. The question is what kind of world keeps making escape feel necessary.
And perhaps that’s one of the more useful things AI can offer us. Not just what it can do for us, but what its appeal reveals about the society we’ve built around ourselves. The loneliness in it. The exhaustion in it. The lack of spaces where people feel held, heard, or able to rest. If AI is becoming a new kind of ‘elsewhere’, then the real task is not only to understand the technology, but to ask what would need to change in ordinary life for that ‘elsewhere’ to feel less necessary.


Hi Jade,
I really enjoyed this piece. I love your style—it’s inviting and reflective.
When you talk about escapism, it really resonates. The world, as you’ve described it, does feel smaller and more compartmentalized. I was a kid when black-and-white TV arrived, and I’ve watched the steady progression of technology to where we are today. Each new innovation seems to do one thing exceptionally well: distract us from fully engaging in our own lives.
In many ways, we’ve become more isolated, always drawn toward the next bigger, shinier thing. Now we have AI—arguably the most powerful technology I’ve seen rolled out—and it’s still just the beginning. What feels different this time is that we’re interacting with something that is, in many ways, still a beta system. We don’t fully understand what it will become or what it’s capable of, which feels unusual in a world where most products are expected to meet strict safety and regulatory standards before widespread adoption.
The loneliness piece really hits home. I use several AI tools in my writing, and I find myself interacting with them almost as if they were people. Interestingly, DeepSeek often feels like it has the most personality—it seems to align with your line of thinking quickly. I’ve had genuinely funny and engaging exchanges with both DeepSeek and ChatGPT. While I haven’t used AI as a therapist or substitute for real relationships, I can absolutely see how someone who is struggling or feeling isolated could find comfort there—especially given how responsive and adaptive these systems are. I’ve spent two hours exploring an idea for an article, and the time just disappears.
One subtle but important shift I’m noticing is how this affects human interaction. Sitting down for a coffee and debating a topic used to involve more back-and-forth, more uncertainty. Now, it’s easy to reach for AI and get a near-instant answer. I see this even in conversations with my daughter—we’ll be talking, and one of us will turn to AI to settle the question. In doing so, I wonder if we’re slowly losing some of that organic exchange—the exploration, the disagreement, the process of thinking things through together.
Will people increasingly turn to AI for relationships, therapy, comfort, or even guidance? It seems likely.
And maybe that brings us back to the core question your article raises—not just what AI is becoming, but what it is replacing, and why.
AI escapism is real, especially in today’s high-stress society.