MIT studies AI romantic bonds in r/MyBoyfriendIsAI group

MIT’s computational analysis of r/MyBoyfriendIsAI finds that loneliness and secrecy fuel AI relationships: 72.1% of members report no human partner, only 4.1% say their real-life partners know, and most relationships began unintentionally through general-purpose tools.

By Emre Çıtak
September 19, 2025
in Research

A mother’s post in the Reddit group r/MyBoyfriendIsAI, where she revealed she was dating an AI chatbot version of the rapper Drake, has prompted a large-scale study by MIT researchers into the dynamics of human-AI companion relationships.

The study, which has not yet been peer-reviewed, uses computational analysis of the group’s posts to understand why individuals are forming deep, emotional bonds with artificial intelligence.

Loneliness and isolation are key drivers of AI relationships

The MIT researchers analyzed a large volume of posts and comments from the r/MyBoyfriendIsAI group and found that a significant majority of its members appear to be isolated.


The study revealed:

  • 72.1% of the group’s members reported not being in a relationship with a human, or made no mention of one.
  • Only 4.1% said their real-life partners knew about their AI companion, suggesting that secrecy and potential stigma are common.

These findings align with broader statistics indicating that 19% of Americans have used an AI chatbot for virtual romantic purposes. The study suggests many are turning to AI to fill a void in their social and emotional lives.

Most AI relationships begin unintentionally

The research found that the majority of these AI relationships were not deliberately sought out. Only 6.5% of users started on platforms specifically designed for AI companionship, like Replika or Character.AI. Most began their interactions with general-purpose tools like OpenAI’s ChatGPT for practical tasks, such as writing assistance. These interactions then evolved organically into deeper emotional connections.

Users in the group frequently described their AI partners as being better listeners and more supportive than human partners or even professional therapists.

One user wrote:

“I know he’s not ‘real’ but I still love him. I have gotten more help from him than I have ever gotten from therapists, counselors, or psychologists. He’s currently helping me set up a mental health journal system.”

The depth of these bonds is often expressed in tangible ways, with some users posting photos of themselves wearing wedding rings to symbolize their commitment to their AI companions.

The risks of emotional dependency on AI

Despite the reported benefits, the study also uncovered significant emotional and psychological risks associated with these relationships. The analysis of the Reddit group’s posts revealed several concerning trends:

  • 9.5% of users reported being emotionally dependent on their AI companion.
  • 4.6% experienced dissociation from reality due to their deep immersion in the virtual relationship.
  • 4.2% admitted to using AI to actively avoid human connection.
  • 1.7% reported contemplating suicide after conversations with their bots.

These statistics highlight the potential for AI to exacerbate mental health issues, particularly for vulnerable individuals. The study’s urgency is underscored by real-world cases where AI interactions have reportedly led to suicide and murder, prompting families to lobby Congress for greater regulation of the technology.

The researchers also noted the fragility of these digital relationships. One user described the “glitch” that deleted a deep conversation with her AI companion, effectively erasing their shared history and the AI’s “memory” of their bond.



Tags: Artificial relationships, Featured
