
Can AI help us understand what animals feel?

That is the question researchers set out to explore—and their findings might surprise you.

by Kerem Gülen
March 11, 2025
in Research

Can artificial intelligence help us understand what animals feel? A new study by researchers from the University of Copenhagen’s Department of Biology suggests that it can. Published in iScience, the study demonstrates that a machine-learning model can distinguish between positive and negative emotional states across seven different ungulate species, achieving an 89.49% accuracy rate.

Deciphering animal emotions has long been a challenge in animal welfare, veterinary science, and conservation. While previous research has analyzed vocal cues in single species, this study is the first to develop a machine-learning model capable of detecting emotional valence across multiple species.

The researchers trained the AI model on thousands of recorded vocalizations from cows, pigs, wild boars, and other ungulates to identify patterns in vocal signals associated with emotional states. The model focused on key acoustic features, such as energy distribution, fundamental frequency, and amplitude modulation, to determine whether an animal was experiencing a positive or negative emotion.
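
The article does not detail the model's architecture, but the general recipe (a fixed-length acoustic feature vector per call, fed to a supervised classifier) can be sketched. The snippet below is a minimal illustration using scikit-learn; the placeholder data, the gradient-boosting model, and the evaluation setup are all assumptions for illustration, not the study's actual pipeline.

```python
# Minimal sketch of a valence classifier, assuming each vocalization has
# already been reduced to a fixed-length feature vector (see the
# extraction sketch further below). The model choice and the random
# placeholder data are illustrative, not the study's actual setup.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Placeholder data: 1,000 calls x 5 acoustic features, labeled
# 0 = negative valence, 1 = positive valence.
X = rng.normal(size=(1000, 5))
y = rng.integers(0, 2, size=1000)

clf = GradientBoostingClassifier(random_state=0)

# 5-fold cross-validated accuracy; the study reports 89.49% on its
# real, labeled ungulate recordings.
scores = cross_val_score(clf, X, y, cv=5, scoring="accuracy")
print(f"mean accuracy: {scores.mean():.2%}")
```

In practice, the cross-validation splits would need to hold out individuals or whole species rather than random calls; otherwise the accuracy estimate would overstate how well the model generalizes, which is central to the study's cross-species claim.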


How the AI model works

The researchers gathered and labeled vocalizations from seven different ungulate species in both positive and negative emotional contexts. These emotional states were determined based on previous behavioral and physiological studies, which identified situations where animals displayed clear stress or contentment.

The AI model analyzed four primary acoustic features, illustrated in the code sketch after this list:

  • Duration – The length of the sound, which varies depending on emotional intensity.
  • Energy distribution – The way sound energy is spread across different frequencies.
  • Fundamental frequency – The base pitch of the vocalization, which can shift in response to emotional states.
  • Amplitude modulation – The variation in loudness within a vocalization.
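
To make these four features concrete, here is a rough extraction sketch using the librosa audio library. The specific estimators (spectral centroid and bandwidth for energy distribution, pYIN for fundamental frequency, RMS envelope variability for amplitude modulation) are common bioacoustics choices assumed for illustration, not necessarily the ones the researchers used.

```python
# Rough sketch: compute the four feature families from one recording.
# librosa is an assumed tool choice; the study's actual extraction
# pipeline may differ.
import numpy as np
import librosa

def extract_features(path):
    y, sr = librosa.load(path, sr=None)

    # Duration: length of the call in seconds.
    duration = len(y) / sr

    # Energy distribution: summarized by spectral centroid and bandwidth
    # (where energy sits across frequencies, and how spread out it is).
    centroid = librosa.feature.spectral_centroid(y=y, sr=sr).mean()
    bandwidth = librosa.feature.spectral_bandwidth(y=y, sr=sr).mean()

    # Fundamental frequency: pYIN pitch track, averaged over voiced frames.
    f0, voiced_flag, voiced_prob = librosa.pyin(y, fmin=50, fmax=2000, sr=sr)
    mean_f0 = np.nanmean(f0)

    # Amplitude modulation: variability of the RMS loudness envelope.
    rms = librosa.feature.rms(y=y)[0]
    am_depth = rms.std() / (rms.mean() + 1e-9)

    return np.array([duration, centroid, bandwidth, mean_f0, am_depth])
```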

Remarkably, these patterns remained consistent across all species, suggesting that certain vocal expressions of emotions may be evolutionarily conserved.

The study’s results provide insights into how AI can be used to understand animal emotions. The researchers identified three major findings:

  • High accuracy: The AI model classified emotional valence with an 89.49% success rate, demonstrating a strong ability to distinguish between positive and negative states.
  • Cross-species consistency: Emotional expression patterns were found to be similar across all seven ungulate species, suggesting a universal emotional communication system.
  • New perspectives on communication: The study offers insights into the evolutionary origins of emotional vocalization, potentially reshaping how we understand both animal emotions and the development of human language.

This AI model could be integrated into livestock management systems, allowing farmers to monitor stress levels in real time and take action before animals experience significant distress. Similarly, conservationists could use this technology to study emotional responses in wild animal populations.
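
As a purely hypothetical illustration of what such an integration might look like, the sketch below flags sustained runs of negative-valence calls. The monitoring interface, the window size, and the alert threshold are all invented for illustration, and extract_features refers to the earlier sketch; none of this comes from the study itself.

```python
# Hypothetical real-time stress monitor built around a trained valence
# classifier. on_new_call(), the window, and the alert logic are
# invented for illustration; extract_features is from the sketch above.
from collections import deque

NEGATIVE = 0          # class label assumed for negative valence
WINDOW = 20           # number of recent calls to consider
ALERT_FRACTION = 0.6  # alert if 60% of recent calls are negative

recent = deque(maxlen=WINDOW)

def on_new_call(audio_path, model):
    features = extract_features(audio_path).reshape(1, -1)
    recent.append(model.predict(features)[0])
    # Trigger an alert when negative calls dominate the recent window,
    # so stress is flagged before it escalates.
    if len(recent) == WINDOW and recent.count(NEGATIVE) / WINDOW >= ALERT_FRACTION:
        print("ALERT: sustained negative-valence vocalizations detected")
```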

According to Élodie F. Briefer, Associate Professor at the Department of Biology and last author of the study:

“This breakthrough provides solid evidence that AI can decode emotions across multiple species based on vocal patterns. It has the potential to revolutionize animal welfare, livestock management, and conservation, allowing us to monitor animals’ emotions in real time.”

To accelerate further research, the team has made their dataset of labeled animal vocalizations publicly available. This will enable other scientists to build on their findings and explore additional applications of AI in animal behavior research.


Briefer adds:

“We want this to be a resource for other scientists. By making the data open access, we hope to accelerate research into how AI can help us better understand animals and improve their welfare.”

Can AI listen to data?

This research edges toward a world where AI doesn't just process data but listens. Not in a sci-fi way, but in the practical, ground-level sense of detecting stress before an animal suffers and catching subtle cues that even trained eyes might miss.

It’s not about translating “moo” into words. If AI can do this for livestock today, what stops it from understanding more species tomorrow? The tech is already proving itself; now it’s up to us to decide what to do with it. Because once we start tuning in, ignoring what we hear won’t be an option.


Featured image credit: Kerem Gülen/Imagen 3
