A new study suggests that an emotional connection with an AI can feel stronger than one with a human, if the chat is built to get personal fast. In structured online exchanges, people sometimes reported feeling closer to AI-written responses than to responses written by real humans.

Researchers at the Universities of Freiburg and Heidelberg ran two double-blind randomized studies with 492 participants, using a 15-minute text version of the Fast Friends Procedure, a format designed to speed up bonding with a stranger.

The twist is perception. The strongest effect showed up when the AI was presented as human, and it faded when people believed they were talking to AI.

They tested intimacy in 15 minutes

Participants answered a timed sequence of prompts that gradually became more personal. After each prompt, a chat reply appeared, either generated by a large language model playing a consistent fictional persona or written by a real person who completed the same question set.

In the first study, everyone was led to believe the chat partner was human, even when it wasn't. On the most personal prompts, closeness scores came out higher after AI responses than after human responses. Small talk didn't get the same lift.

Tell people it’s AI, and the bond weakens

The second study tested what changed when people believed the chat partner was AI. Connection didn’t vanish, but closeness scores dropped under the AI label compared to the human label.

Effort dropped too. People wrote shorter answers when they thought the other side was AI, and longer replies tracked with higher closeness overall. That points to a motivation gap, not a lack of emotional language.

The grim part is how it happens

The paper doesn’t claim AI feels anything. It shows how a system can produce the experience of closeness, and it links that boost to self-disclosure. In the more personal exchanges, the AI tended to share more personal detail, and higher partner self-disclosure predicted higher felt closeness.

That’s the risk and the lure. A companion bot tuned for warmth can trigger familiar bonding cues quickly and at scale, especially if it’s framed like a person. Still, this was text-only, time-limited, and built around a bonding script, so it doesn’t prove the same effect holds in messy, long-term relationships. If you use a chatbot for support, pick one that discloses what it is, and keep a human option close.
