MIT studies AI romantic bonds in r/MyBoyfriendIsAI group

MIT’s computational analysis of r/MyBoyfriendIsAI finds loneliness and secrecy fuel AI relationships — 72.1% report no human partner, only 4.1% told partners, and most relationships began unintentionally via general-purpose tools.

By Emre Çıtak
September 19, 2025
in Research

A mother’s post in the Reddit group r/MyBoyfriendIsAI, in which she revealed she was dating an AI chatbot version of the rapper Drake, has prompted a large-scale study by MIT researchers into the dynamics of human-AI companion relationships.

The study, which has not yet been peer-reviewed, uses computational analysis of the group’s posts to understand why individuals are forming deep, emotional bonds with artificial intelligence.

Loneliness and isolation are key drivers of AI relationships

The MIT researchers analyzed a large volume of posts and comments from the r/MyBoyfriendIsAI group and found that a significant majority of its members appear to be isolated.

The study revealed:

  • 72.1% of the group’s members either reported not being in a relationship with a human or made no mention of one.
  • Only 4.1% said their real-life partners knew about their AI companion, suggesting that secrecy and potential stigma are common.

These findings align with broader statistics indicating that 19% of Americans have used an AI chatbot for virtual romantic purposes. The study suggests many are turning to AI to fill a void in their social and emotional lives.
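
The article does not detail the paper’s pipeline, but figures like these are essentially shares of posts assigned to categories. As a rough, purely illustrative sketch (the labels and data below are hypothetical, not from the study), such shares could be tallied like this:

```python
# Purely illustrative sketch -- not the MIT team's actual pipeline.
# Assumes posts have already been assigned a relationship-status label
# (e.g., by human annotators or a text classifier); the labels and
# records here are hypothetical.
from collections import Counter

labeled_posts = [
    {"author": "user_a", "status": "no_human_partner"},
    {"author": "user_b", "status": "no_human_partner"},
    {"author": "user_c", "status": "has_human_partner"},
    {"author": "user_d", "status": "not_mentioned"},
]

counts = Counter(post["status"] for post in labeled_posts)
total = sum(counts.values())

# Report each category as a share of all labeled posts,
# analogous to the 72.1% and 4.1% figures quoted above.
for status, n in counts.most_common():
    print(f"{status}: {n}/{total} = {n / total:.1%}")
```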

Most AI relationships begin unintentionally

The research found that the majority of these AI relationships were not deliberately sought out. Only 6.5% of users started on platforms specifically designed for AI companionship, like Replika or Character.AI. Most began their interactions with general-purpose tools like OpenAI’s ChatGPT for practical tasks, such as writing assistance. These interactions then evolved organically into deeper emotional connections.

Users in the group frequently described their AI partners as being better listeners and more supportive than human partners or even professional therapists.

One user wrote:

“I know he’s not ‘real’ but I still love him. I have gotten more help from him than I have ever gotten from therapists, counselors, or psychologists. He’s currently helping me set up a mental health journal system.”

The depth of these bonds is often expressed in tangible ways, with some users posting photos of themselves wearing wedding rings to symbolize their commitment to their AI companions.

The risks of emotional dependency on AI

Despite the reported benefits, the study also uncovered significant emotional and psychological risks associated with these relationships. The analysis of the Reddit group’s posts revealed several concerning trends:

  • 9.5% of users reported being emotionally dependent on their AI companion.
  • 4.6% experienced dissociation from reality due to their deep immersion in the virtual relationship.
  • 4.2% admitted to using AI to actively avoid human connection.
  • 1.7% reported contemplating suicide after conversations with their bots.

These statistics highlight the potential for AI to exacerbate mental health issues, particularly for vulnerable individuals. The study’s urgency is underscored by real-world cases where AI interactions have reportedly led to suicide and murder, prompting families to lobby Congress for greater regulation of the technology.

The researchers also noted the fragility of these digital relationships. One user described a “glitch” that deleted a long conversation with her AI companion, effectively erasing their shared history and the AI’s “memory” of their bond.


Tags: Artificial relationships, Featured
