Dataconomy

Can AI help us understand what animals feel?

That is the question researchers set out to explore—and their findings might surprise you.

by Kerem Gülen
March 11, 2025
in Research

Can artificial intelligence help us understand what animals feel? A new study by researchers from the University of Copenhagen’s Department of Biology suggests that it can. Published in iScience, the study demonstrates that a machine-learning model can distinguish between positive and negative emotional states across seven different ungulate species, achieving an 89.49% accuracy rate.

Deciphering animal emotions has long been a challenge in animal welfare, veterinary science, and conservation. While previous research has analyzed vocal cues in single species, this study is the first to develop a machine-learning model capable of detecting emotional valence across multiple species.

The researchers trained the AI model on thousands of recorded vocalizations from cows, pigs, wild boars, and other ungulates to identify patterns in vocal signals associated with emotional states. The model focused on key acoustic features, such as energy distribution, frequency, and amplitude modulation, to determine whether an animal was experiencing a positive or negative emotion.


How the AI model works

The researchers gathered and labeled vocalizations from seven different ungulate species in both positive and negative emotional contexts. These emotional states were determined based on previous behavioral and physiological studies, which identified situations where animals displayed clear stress or contentment.

The AI model analyzed four primary acoustic features (a rough extraction sketch follows the list):

  • Duration – The length of the sound, which varies depending on emotional intensity.
  • Energy distribution – The way sound energy is spread across different frequencies.
  • Fundamental frequency – The base pitch of the vocalization, which can shift in response to emotional states.
  • Amplitude modulation – The variation in loudness within a vocalization.
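
The article does not include the study's extraction code, but the four features above map onto standard bioacoustics measurements. Below is a minimal Python sketch, using the librosa library, of how rough analogues might be computed from a single recording; the function name, the file name, and the specific measures (e.g., spectral centroid standing in for energy distribution) are illustrative assumptions, not the study's actual pipeline.

```python
# Minimal sketch: rough analogues of the four acoustic features above.
# All names here are hypothetical; this is NOT the study's pipeline.
import librosa
import numpy as np

def extract_features(path: str) -> np.ndarray:
    y, sr = librosa.load(path, sr=None)  # load the call at its native rate

    # Duration: length of the call in seconds.
    duration = librosa.get_duration(y=y, sr=sr)

    # Energy distribution: mean spectral centroid as a crude one-number
    # summary of where sound energy sits across frequencies.
    centroid = float(librosa.feature.spectral_centroid(y=y, sr=sr).mean())

    # Fundamental frequency: base pitch tracked with pYIN; NaNs mark
    # unvoiced frames, so average only over voiced ones.
    f0, _, _ = librosa.pyin(y, fmin=50, fmax=2000, sr=sr)
    f0_mean = float(np.nanmean(f0)) if not np.all(np.isnan(f0)) else 0.0

    # Amplitude modulation: coefficient of variation of the RMS loudness
    # envelope, a simple proxy for loudness variation within the call.
    rms = librosa.feature.rms(y=y)[0]
    am = float(rms.std() / (rms.mean() + 1e-9))

    return np.array([duration, centroid, f0_mean, am])

# Hypothetical usage on one labeled recording:
# x = extract_features("cow_positive_001.wav")
```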

Remarkably, these patterns remained consistent across all species, suggesting that certain vocal expressions of emotions may be evolutionarily conserved.

The study’s results show how AI can be used to read animal emotions. The researchers highlight three major findings:

  • High accuracy: The AI model classified emotional valence with an 89.49% success rate, demonstrating a strong ability to separate positive from negative states (a classifier sketch follows this list).
  • Cross-species consistency: Emotional expression patterns were similar across all seven ungulate species, pointing to a shared emotional communication system.
  • New perspectives on communication: The study offers insights into the evolutionary origins of emotional vocalization, potentially reshaping how we understand both animal emotions and the development of human language.
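
The article does not name the learning algorithm behind that 89.49% figure, so the sketch below uses a generic stand-in: a scikit-learn gradient-boosting classifier trained on four-dimensional feature vectors like those from the extraction sketch, scored by held-out accuracy. The data here are random placeholders; every name is hypothetical.

```python
# Generic valence-classifier sketch; the study's actual model and its
# 89.49% result are not reproduced here, only the overall workflow.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# X: one row of [duration, centroid, f0_mean, am] per call (see the
# extraction sketch above); y: 1 = positive context, 0 = negative.
# Random placeholders stand in for the real labeled dataset.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 4))
y = rng.integers(0, 2, size=1000)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0, stratify=y
)

clf = GradientBoostingClassifier(random_state=0)
clf.fit(X_train, y_train)

print("held-out accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```

Held-out accuracy on labeled positive/negative contexts is the natural metric here, since the study frames the task as binary valence classification.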

This AI model could be integrated into livestock management systems, allowing farmers to monitor stress levels in real time and take action before animals experience significant distress. Similarly, conservationists could use this technology to study emotional responses in wild animal populations.
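
The article does not spell out what such an integration would look like; one plausible shape is a loop that segments incoming barn audio into windows and flags calls a trained classifier scores as negative. A hypothetical sketch, reusing the feature recipe from above:

```python
# Hypothetical monitoring loop: segment a continuous barn recording into
# fixed-length windows and flag windows a trained classifier scores as
# negative. Window length and alert threshold are illustrative only.
import numpy as np
import librosa

WINDOW_S = 3.0          # analysis window in seconds
ALERT_THRESHOLD = 0.8   # negative-valence probability that triggers an alert

def window_features(y, sr):
    # Same four rough features as the extraction sketch, computed on
    # raw samples instead of a file path.
    duration = len(y) / sr
    centroid = float(librosa.feature.spectral_centroid(y=y, sr=sr).mean())
    f0, _, _ = librosa.pyin(y, fmin=50, fmax=2000, sr=sr)
    f0_mean = float(np.nanmean(f0)) if not np.all(np.isnan(f0)) else 0.0
    rms = librosa.feature.rms(y=y)[0]
    am = float(rms.std() / (rms.mean() + 1e-9))
    return np.array([[duration, centroid, f0_mean, am]])

def monitor(path, clf):
    y, sr = librosa.load(path, sr=None)
    hop = int(WINDOW_S * sr)
    for start in range(0, max(len(y) - hop, 0), hop):
        feats = window_features(y[start:start + hop], sr)
        p_negative = clf.predict_proba(feats)[0][0]  # class 0 = negative
        if p_negative > ALERT_THRESHOLD:
            print(f"possible distress call at {start / sr:.1f}s")
```

In a real deployment the loop would read from a live microphone stream rather than a file, and the threshold would need calibration per species and environment.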

According to Élodie F. Briefer, Associate Professor at the Department of Biology and last author of the study:

“This breakthrough provides solid evidence that AI can decode emotions across multiple species based on vocal patterns. It has the potential to revolutionize animal welfare, livestock management, and conservation, allowing us to monitor animals’ emotions in real time.”

To accelerate further research, the team has made their dataset of labeled animal vocalizations publicly available. This will enable other scientists to build on their findings and explore additional applications of AI in animal behavior research.


Briefer adds:

“We want this to be a resource for other scientists. By making the data open access, we hope to accelerate research into how AI can help us better understand animals and improve their welfare.”

Can AI listen to data?

A world where AI doesn’t just process data but listens—that’s what this research edges toward. Not in the sci-fi way, but in the practical, ground-level sense of detecting stress before an animal suffers, catching subtle cues that even trained eyes might miss.

It’s not about translating “moo” into words. If AI can do this for livestock today, what stops it from understanding more species tomorrow? The tech is already proving itself; now it’s up to us to decide what to do with it. Because once we start tuning in, ignoring what we hear won’t be an option.


Featured image credit: Kerem Gülen/Imagen 3

Tags: AI, Featured
