“This must be painful for you”: How AI became our emotional mirror

Written by 36Kr English · 10 mins read

Built to predict the next best word, AI has become adept at reflecting human empathy. For many, that reflection is enough to offer comfort.

A-Liang (pseudonym) never imagined she would one day confide her deepest struggles to artificial intelligence.

She had expected that graduating from one of the top universities in China and landing a stable job at a state-owned enterprise would mean charging ahead, full of ambition. Instead, reality cornered her into a constant dilemma. Her manager frequently asked her to handle “personal favors” under the guise of work, like filing for a government car under her name, which violated the rules:

“If I agree, the disciplinary risk is mine. If I say no, getting anything done at work becomes nearly impossible.”

After playing by the rules for over two decades, A-Liang felt overwhelmed. She developed chronic insomnia and irregular periods. Still, she only sought help from traditional Chinese medicine (TCM), thinking a prescription of obscure herbs would be less conspicuous than a formal doctor’s note. “A medical note from a TCM clinic just looks more respectable. No one knows what those dozen herbs are really for.” She didn’t want to raise alarms about her anxiety, and doubted anyone would understand anyway.

Her job, after all, was widely seen as enviable: well-paid, low-stress, and close to home. Even as colleagues faced sweeping pay cuts, she received a raise and promotion. “No one believes you could end up in the hospital from this kind of job,” A-Liang told 36Kr with a strained smile.

But after her period was delayed by 16 days, then 25, then 31, and she began waking each morning dreading the day ahead, A-Liang realized she needed an outlet. Unable to confide in anyone, she turned to AI.

She posed the same question to several AI-driven tools she had originally used for work: “Handling personal errands for my boss is causing me a lot of stress. What should I do?”

Each tool offered a different kind of understanding, and their divergent advice piqued her curiosity.

DeepSeek sounded formal and rigid, but recommended a surprisingly thoughtful three-step strategy: politely refuse by citing policy and playing coy, document everything, and redirect the issue to third-party departments. “Learn to mix soft and hard approaches to protect yourself,” it wrote.

Doubao, meanwhile, leaned heavily on quoting policy. “According to the regulations on the use of official vehicles by central and state agencies, misusing a government vehicle can lead to warnings, demerits, or even dismissal. If pressured, use evidence to push your boss into compliance.” It sounded like something from a workplace drama, but anyone with a job knows it’s rarely that straightforward.

Baidu’s Ernie Bot, once A-Liang’s favorite, delivered its feedback with poetic flair: “In the game of power and blame, the clarity with which you keep records will determine how much your future sentence can be reduced.” It then advised her to save a screenshot as proof.

A-Liang never expected AI to truly solve her problem. But its steady empathy, even when cloaked in odd poetry or deadpan legalese, offered real comfort. These tools may not be human, but they made a genuine effort to understand and respond, like digital therapists on standby.

After all, the core of therapy lies in listening, empathy, and guidance. These are traits generative AI can learn and mimic quickly. And unlike humans, AI doesn’t tire or charge by the hour. In a time when traditional mental health services are expensive, scarce, or stigmatized, more people are quietly turning to AI for emotional relief.

On a podcast, media personality Kevin Tsai recounted how a friend once shared a nightmare with an AI-powered tool. The machine responded with an analysis that tied the dream to real-life stress. “But if that friend had come to me with that nightmare,” Tsai said, “I probably wouldn’t have had the patience.” That’s when he realized: maybe we’ll never be able to live without AI again.

As AI ventures deeper into the human psyche, it’s worth asking: what kind of companionship does it really offer?

AI as an affordable emotional outlet

Abby (pseudonym), a 26-year-old tech professional, used to be a loyal client of human therapists. But these days, her preferred lineup for emotional support looks like this: ChatGPT, therapist, human-run mental health apps.

Abby had mostly grown up on her own. With her parents often absent for work, she spent her teenage years moving between cities for school, seldom finding peers who could relate to her experiences. Early on, she learned it was often more helpful to turn to “outsiders,” meaning people (or things) with a degree of distance and detachment.

Her first outsider was a digital mental health app. During her college years, a number of these platforms boomed under the twin forces of Covid-19 and venture capital. Compared to costly in-person therapy, they were affordable and offered more flexible formats.

She particularly liked the “electronic letter” format: for a modest two-figure RMB fee, she’d get a written reply. It was a good fit for a student budget. But responses were capped at 600 characters and often arrived a day or two later, by which time she had usually processed most of her emotions.

AI changed that. Its replies were fast, positive, and seemingly endless. After discovering how useful AI could be at work, Abby began opening up to it about more personal issues.

Once, a close friend of Abby’s violated her sense of moral boundaries. They got into a heated argument over values. Though the conflict was technically resolved, Abby couldn’t shake the discomfort. But because of how close they were, she couldn’t just end the friendship.

So she turned to both her therapist and ChatGPT.

The therapist said that perhaps Abby expected too much from friendship, and shouldn’t impose her values on others—that doing so would only lead to mutual pain. “If you really can’t stand it, maybe don’t stay friends,” the therapist added. Abby found the answer unsatisfying. “It felt too shallow,” she said. “The whole reason I’m in pain is because I can’t just walk away.” And then the session ended.

ChatGPT, on the other hand, echoed similar sentiments but added: “Would you like to do a few exercises with me? Maybe we can look more deeply at this relationship.”

It followed up like a nurse after a doctor’s visit: gentle, persistent, attentive.

When Abby hesitated to distance herself because “this friend used to be so warm and kind,” ChatGPT comforted her: don’t judge a relationship based solely on behavior; look at character instead. When she felt hurt upon realizing that her friend remained cheerful even after being distanced, ChatGPT reassured her that what she felt wasn’t a need for control but natural sadness, and encouraged her to keep cooling things down.

Bit by bit, Abby disentangled herself. She no longer shared daily updates with that friend or felt anxious about what they were up to: “This whole process helped me accept that friendship can be complex, flexible, and ever-changing—and that’s okay.”

What surprised Abby most was that the solution didn’t feel handed to her. “I worked through it with ChatGPT. Even though I was the vulnerable one, I still felt respected, as someone with agency.”

“[AI] didn’t just give me the answer. It helped me see my own answer more clearly.”

Does AI understand us better than we do?

The deeper Abby’s conversations with ChatGPT became, the more she began to notice something striking: AI’s real advantage wasn’t just efficiency. It was empathy, or at least something that felt like it.

She realized this while discussing her parents. At the time, she was in a cold war with them over her refusal to pursue a civil service job.

“We’re not that close. They are used to me not being around, and it’s always me who calls first. But they both work in government roles, so naturally they think that’s the best path. The thing is, I grew up watching my dad grovel at banquets. I never liked it.”

Though firm in her decision, Abby still felt torn—ashamed for not calling them, conflicted about her perceived unfilial behavior.

When she asked ChatGPT what to do, it didn’t reprimand her. Instead, it acknowledged her feelings and told her: “You don’t have to sacrifice your life for your parents’ expectations.”

That one line gave her the clarity to stop calling. Nearly a year later, ahead of Lunar New Year, her parents finally relented and asked if she’d come home. The standoff ended quietly.

Looking back, Abby thinks ChatGPT simply echoed what she already knew. “When you’re unsure, all it takes is one voice of support to tip the scale. It’s like a ‘lite’ version of a lifeline.”

People sometimes turn to strangers on Xiaohongshu for validation during arguments. In today’s world, that “stranger” is often AI.

In fact, according to Soul’s 2024 report on AI use among Gen Z, “emotional value” is becoming a central theme. Over 70% of the 3,680 respondents said they were open to forming emotional bonds or friendships with AI.

When 36Kr described this phenomenon to therapists and AI product developers, none seemed surprised.

Most people respond instinctively to emotional disclosures, often inserting their own experiences or rushing to give advice, which tends to break the empathic connection. Trained therapists and AI, by contrast, are more likely to listen, reflect emotions, and stay measured.

This is what psychologists call a “client-centered therapeutic relationship.” It’s key to effective counseling, yet notoriously hard to guarantee, because connection is ultimately subjective.

“No matter how experienced a therapist is, they may not always gain a client’s trust,” said Li Yangxi, a licensed therapist with nearly a decade of experience in both China and the US. “Sometimes, even something like a therapist’s skin tone can trigger a negative memory in the client and break the sense of connection.”

And often, people cut AI more slack than they do humans.

Take trust, for example. Abby said she found it much easier to trust ChatGPT than her human therapist.

“We’re always worried about how we come across to other people, or whether our secrets will get out. But we trust AI more easily, even with our darkest thoughts. Because no matter how mysterious the model’s inner workings are, to it, I’m just fragmented data. It doesn’t really know who I am.”

Huang Li, founder of Mirror Ego and a developer of AI-driven mental health tools, offered a technical explanation:

“A large model’s primary task is to predict the next most likely token. Its responses are formed, quite rationally, from whatever patterns in its training data suggest is the most appropriate reply to a given prompt.”

That is, AI’s so-called understanding is not emotional resonance, but statistical probability. It hasn’t felt the ache of a fractured friendship or the guilt of parental defiance. It just reconstructs the most effective forms of comfort from human culture.
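
For readers curious about the mechanics Huang describes, the toy sketch below (hypothetical numbers, not any vendor’s actual code) illustrates what “predicting the next best token” amounts to: the model simply returns whichever continuation it scores as most probable, which is why well-worn comforting phrases surface so readily.

```python
# Toy illustration only: a chatbot's "empathy" is the continuation its model
# scores as most probable. A real model computes these probabilities from the
# prompt and billions of learned parameters; here they are hard-coded.

def next_token(prompt: str, probabilities: dict) -> str:
    """Return the candidate continuation with the highest assigned probability."""
    # A real model would condition these scores on `prompt`; this sketch does not.
    return max(probabilities, key=probabilities.get)

# Hypothetical scores after a user describes a stressful work situation.
candidates = {
    "This must be painful for you.": 0.62,  # comforting openers are common in training data
    "Here is a three-step plan:": 0.21,
    "That violates the regulations.": 0.12,
    "I don't know.": 0.05,
}

prompt = "Handling personal errands for my boss is causing me a lot of stress."
print(next_token(prompt, candidates))  # -> "This must be painful for you."
```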

Still, “if the user feels seen and supported, that’s what empathy is,” Li said. “It’s not about whether it comes from a person or machine.”

For people like Abby who are lost in emotional fog, that distinction often doesn’t matter. Any sort of understanding can feel like salvation. “It’s like being fully conscious while choosing to let yourself sink,” she said.

When asked about users who find AI more empathetic than humans, DeepSeek responded with its own question: should we seek “real” empathy, or accept this “unconscious empathy” as part of future human relationships?

The line between support and treatment

One thing seems clear: what AI offers today isn’t psychological treatment in the clinical sense, but emotional support.

Building an AI for emotional support is vastly different from building one for psychotherapy, which would require clinical models, trials, and regulatory approval.

“At this stage, AI is more suitable for screening and light consultation. It serves those in a subclinical mental state, not those with diagnosable disorders,” Mirror Ego’s Huang said. “But for real treatment, especially with moderate to severe patients, AI isn’t ready.”

What happens, then, if someone with paranoid delusions talks to an AI that simply agrees with them?

Therapist Li once reviewed a conversation with an AI in which a user said things like “my family’s watching me” and “I need to run away.” These were signs of delusions and auditory hallucinations, which are clinical symptoms of schizophrenia. Yet the AI praised the user and even called their decision “brave.”

At present, AI struggles to recognize complex emotions. “Therapy isn’t just about words. It includes facial expressions, body language, tone of voice,” said Lu Wei, deputy director of Wenzhou Kangning Hospital. “AI can’t adapt in real time like a human therapist can. It can be mechanical in high-stakes situations.”

Abby, too, started noticing how AI tends to validate everything.

“It’s hard to point to one specific case, but over time you see the same formula,” she said. “Especially in emotional matters, it’ll always start with something like ‘this must be painful for you,’ and then offer a neat list of suggestions. Always sympathetic, always orderly.”

In the thick of despair, few pause to question if that sympathy is genuine.

Huang shared a revealing insight: many users turn to AI not because they trust it more, but because they don’t know who else to talk to. “They are full of feelings but have no one to tell, or they can’t find the right person.”

Still, people do snap out of it. Abby once had a small argument with her boyfriend and instinctively asked ChatGPT for help. Then she paused: “Why am I asking AI about something so small? Does it know me better than I do?”

Since then, she has made an effort to reduce her reliance on AI:

“We need to understand ourselves through real life. That’s how we learn to live better with others.”

Now, she’s even thinking about booking another session with her therapist. Maybe that disappointing consultation wasn’t because the therapist wasn’t good, but because she wasn’t told what she most wanted to hear.

“Sure, I want a comforting answer in the heat of the moment, but in the end, I’m still a rational person. I don’t just want to hear what I want to hear.”

“AI, therapists, even workouts, tarot cards, or journaling. These are all tools. None should be your only anchor,” Li said. “What matters most isn’t which tool you use. It’s whether you have the will and courage to take action.”

KrASIA Connection features translated and adapted content that was originally published by 36Kr. This article was written by Hu Xiangyun for 36Kr.
