Dataconomy

Can AI help us understand what animals feel?

That is the question researchers set out to explore—and their findings might surprise you.

by Kerem Gülen
March 11, 2025
in Research

Can artificial intelligence help us understand what animals feel? A new study by researchers from the University of Copenhagen’s Department of Biology suggests that it can. Published in iScience, the study demonstrates that a machine-learning model can distinguish between positive and negative emotional states across seven different ungulate species, achieving an 89.49% accuracy rate.

Deciphering animal emotions has long been a challenge in animal welfare, veterinary science, and conservation. While previous research has analyzed vocal cues in single species, this study is the first to develop a machine-learning model capable of detecting emotional valence across multiple species.

Using thousands of recorded vocalizations from cows, pigs, wild boars, and other ungulates, the AI model was trained to identify patterns in vocal signals associated with emotional states. The model focused on key acoustic features, such as energy distribution, frequency, and amplitude modulation, to determine whether an animal was experiencing a positive or negative emotion.
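The study's actual model is not reproduced here, but the idea of separating positive from negative calls by their acoustic features can be illustrated with a toy sketch. Everything below is hypothetical: the feature values are synthetic, and a simple nearest-centroid rule stands in for whatever classifier the researchers trained.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 4-D feature vectors (duration, energy centroid, f0, AM index),
# synthesised so that "negative" calls are longer and higher-pitched on average.
neg = rng.normal([1.2, 900.0, 300.0, 0.5], [0.2, 100.0, 30.0, 0.1], size=(200, 4))
pos = rng.normal([0.8, 700.0, 220.0, 0.3], [0.2, 100.0, 30.0, 0.1], size=(200, 4))

X = np.vstack([neg, pos])
y = np.array([1] * 200 + [0] * 200)  # 1 = negative valence, 0 = positive

# Standardise each feature, then classify by nearest class centroid
mu, sd = X.mean(axis=0), X.std(axis=0)
Z = (X - mu) / sd
c_neg = Z[y == 1].mean(axis=0)
c_pos = Z[y == 0].mean(axis=0)
pred = (np.linalg.norm(Z - c_neg, axis=1) <
        np.linalg.norm(Z - c_pos, axis=1)).astype(int)
accuracy = (pred == y).mean()
```

On well-separated synthetic features even this crude rule scores high; the study's 89.49% figure comes from real vocalizations, where the classes overlap far more.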

How the AI model works

The researchers gathered and labeled vocalizations from seven different ungulate species in both positive and negative emotional contexts. These emotional states were determined based on previous behavioral and physiological studies, which identified situations where animals displayed clear stress or contentment.

The AI model analyzed four primary acoustic features:

  • Duration – The length of the sound, which varies depending on emotional intensity.
  • Energy distribution – The way sound energy is spread across different frequencies.
  • Fundamental frequency – The base pitch of the vocalization, which can shift in response to emotional states.
  • Amplitude modulation – The variation in loudness within a vocalization.
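The four features above can each be computed from a raw waveform with basic signal processing. The sketch below is an illustration, not the study's pipeline: it uses plain NumPy on a synthetic amplitude-modulated tone, and the estimators (spectral-peak pitch, frame-level RMS envelope) are deliberately simple stand-ins.

```python
import numpy as np

def acoustic_features(y, sr):
    """Rough estimates of duration, energy distribution (spectral centroid),
    fundamental frequency, and amplitude modulation for a mono signal y."""
    duration = len(y) / sr                          # seconds
    spectrum = np.abs(np.fft.rfft(y))
    freqs = np.fft.rfftfreq(len(y), d=1.0 / sr)
    energy = spectrum ** 2
    # Spectral centroid: a one-number summary of how energy is distributed
    centroid = float(np.sum(freqs * energy) / np.sum(energy))
    # Crude f0 estimate: strongest spectral peak above 50 Hz
    mask = freqs > 50
    f0 = float(freqs[mask][np.argmax(spectrum[mask])])
    # Amplitude modulation: relative variation of a 10 ms RMS envelope
    frame = sr // 100
    n = len(y) // frame
    env = np.sqrt(np.mean(y[: n * frame].reshape(n, frame) ** 2, axis=1))
    am = float(np.std(env) / (np.mean(env) + 1e-12))
    return {"duration_s": duration, "centroid_hz": centroid,
            "f0_hz": f0, "am_index": am}

# Synthetic 1-second "call": a 440 Hz tone with a slow loudness wobble
sr = 16000
t = np.linspace(0, 1.0, sr, endpoint=False)
call = (0.6 + 0.4 * np.sin(2 * np.pi * 3 * t)) * np.sin(2 * np.pi * 440 * t)
feats = acoustic_features(call, sr)
```

Real bioacoustic work would use more robust pitch trackers and spectral descriptors, but the quantities being measured are the same four the researchers name.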

Remarkably, these patterns remained consistent across all species, suggesting that certain vocal expressions of emotions may be evolutionarily conserved.

The study’s results provide insights into how AI can be used to understand animal emotions. The researchers identified three major findings:

  • High accuracy: The AI model classified emotional valence with an 89.49% success rate, demonstrating that it can reliably distinguish between positive and negative states.
  • Cross-species consistency: Emotional expression patterns were found to be similar across all seven ungulate species, suggesting a universal emotional communication system.
  • New perspectives on communication: The study offers insights into the evolutionary origins of emotional vocalization, potentially reshaping how we understand both animal emotions and the development of human language.

This AI model could be integrated into livestock management systems, allowing farmers to monitor stress levels in real time and take action before animals experience significant distress. Similarly, conservationists could use this technology to study emotional responses in wild animal populations.

According to Élodie F. Briefer, Associate Professor at the Department of Biology and last author of the study:

“This breakthrough provides solid evidence that AI can decode emotions across multiple species based on vocal patterns. It has the potential to revolutionize animal welfare, livestock management, and conservation, allowing us to monitor animals’ emotions in real time.”

To accelerate further research, the team has made their dataset of labeled animal vocalizations publicly available. This will enable other scientists to build on their findings and explore additional applications of AI in animal behavior research.

Briefer adds:

“We want this to be a resource for other scientists. By making the data open access, we hope to accelerate research into how AI can help us better understand animals and improve their welfare.”

Can AI listen to data?

This research edges toward a world where AI doesn't just process data but listens to it: not in the sci-fi sense, but in the practical, ground-level sense of detecting stress before an animal suffers and catching subtle cues that even trained eyes might miss.

It’s not about translating “moo” into words. If AI can do this for livestock today, what stops it from understanding more species tomorrow? The tech is already proving itself; now it’s up to us to decide what to do with it. Because once we start tuning in, ignoring what we hear won’t be an option.


Featured image credit: Kerem Gülen/Imagen 3

Tags: AI, Featured

COPYRIGHT © DATACONOMY MEDIA GMBH, ALL RIGHTS RESERVED.