Dataconomy

AI reveals what doctors cannot see in coma patients

A new AI tool sees what doctors can't: tiny facial movements that reveal hidden consciousness in unresponsive coma patients.

By Emre Çıtak
September 1, 2025
in Healthcare, Artificial Intelligence

Imagine a patient who appears completely unresponsive after a severe brain injury. To doctors and family, it might seem that all awareness is lost. But what if subtle signs of consciousness are hiding just beneath the surface, invisible to the naked eye? A new breakthrough from researchers at Stony Brook University suggests that artificial intelligence may finally be able to detect those hidden signs.

The researchers developed SeeMe, a computer vision tool that tracks tiny facial movements made in response to voice commands. Unlike standard clinical exams, which rely on visible movement, SeeMe can detect low-amplitude, purposeful gestures—such as a slight opening of the eyes or a micro-smile—before clinicians can see them. In the study, SeeMe identified eye-opening movements on average four days earlier than doctors did, and it detected responses in more patients overall.

The study involved 37 comatose patients and 16 healthy volunteers, with AI analyzing thousands of short video clips to identify meaningful facial responses. A deep learning model even confirmed that the movements were specific to the commands given, suggesting that these patients were not just twitching randomly—they were responding deliberately.
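The article doesn't publish SeeMe's pipeline, but the core idea it describes — quantifying tiny facial motion after a voice command and comparing it against a resting baseline — can be sketched in a few lines of NumPy. Everything below (frame-difference "motion energy", the z-score threshold, the window size) is an illustrative assumption for intuition, not the team's actual method.

```python
import numpy as np

def motion_energy(frames):
    """Mean absolute frame-to-frame pixel difference: a crude proxy
    for the low-amplitude facial motion a tool like SeeMe quantifies.
    `frames` is a (num_frames, height, width) array; returns one
    energy value per frame transition."""
    diffs = np.abs(np.diff(frames.astype(float), axis=0))
    return diffs.mean(axis=(1, 2))

def command_locked_response(frames, command_frame, window=5, z_thresh=4.0):
    """Flag a purposeful response if motion energy shortly after the
    command exceeds the pre-command baseline by z_thresh standard
    deviations. All thresholds here are illustrative assumptions."""
    energy = motion_energy(frames)
    baseline = energy[:command_frame]          # resting period
    post = energy[command_frame:command_frame + window]  # post-command window
    mu, sigma = baseline.mean(), baseline.std() + 1e-9
    return (post.max() - mu) / sigma > z_thresh
```

On a synthetic clip where a small patch of pixels shifts just after the command frame, the post-command z-score spikes far above baseline noise, while a motionless clip stays below threshold — the same contrast, in miniature, that lets a model separate deliberate micro-movements from random twitching.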


Why does this matter?

Detecting covert consciousness early could transform patient care, giving doctors critical information about recovery potential and informing decisions about treatment and rehabilitation. Families could gain reassurance that their loved ones might be aware, even if conventional exams suggest otherwise.

SeeMe also opens doors for future therapeutic tools, such as brain-computer interfaces, that could enable communication with patients who were previously thought to be entirely unconscious. By quantifying these subtle movements, AI provides a reliable, objective measure of consciousness that complements, rather than replaces, clinical judgment.

In short, AI is revealing what was once invisible: that some coma patients are quietly aware, capable of interaction, and potentially on the path to recovery. With tools like SeeMe, the future of neurocritical care is not only smarter—it’s profoundly more hopeful.


Tags: AI, Featured, Healthcare


COPYRIGHT © DATACONOMY MEDIA GMBH, ALL RIGHTS RESERVED.