Can artificial intelligence help us understand what animals feel? A new study by researchers from the University of Copenhagen’s Department of Biology suggests that it can. Published in iScience, the study demonstrates that a machine-learning model can distinguish between positive and negative emotional states across seven different ungulate species, achieving an 89.49% accuracy rate.
Deciphering animal emotions has long been a challenge in animal welfare, veterinary science, and conservation. While previous research has analyzed vocal cues in single species, this study is the first to develop a machine-learning model capable of detecting emotional valence across multiple species.
The AI model was trained on thousands of recorded vocalizations from cows, pigs, wild boars, and other ungulates to identify patterns in vocal signals associated with emotional states. The model focused on key acoustic features, such as energy distribution, fundamental frequency, and amplitude modulation, to determine whether an animal was experiencing a positive or negative emotion.
How the AI model works
The researchers gathered and labeled vocalizations from seven different ungulate species in both positive and negative emotional contexts. These emotional states were determined based on previous behavioral and physiological studies, which identified situations where animals displayed clear stress or contentment.
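The paper's data schema isn't reproduced in this article, but a minimal sketch of what one labeled recording might look like in code can make the setup concrete. All field names below are illustrative assumptions, not the study's actual format:

```python
from dataclasses import dataclass

@dataclass
class Vocalization:
    """One labeled recording, as the study's setup implies.
    Field names are illustrative, not the authors' schema."""
    species: str    # e.g. "domestic pig"
    context: str    # e.g. "anticipation of food" or "social isolation"
    valence: int    # 1 = positive, 0 = negative (from prior behavioral studies)
    wav_path: str   # path to the audio file
```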
The AI model analyzed four primary acoustic features (a sketch of how such features might be extracted follows the list):
- Duration – The length of the sound, which varies depending on emotional intensity.
- Energy distribution – The way sound energy is spread across different frequencies.
- Fundamental frequency – The base pitch of the vocalization, which can shift in response to emotional states.
- Amplitude modulation – The variation in loudness within a vocalization.
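The article doesn't include the authors' extraction pipeline, but a rough sketch of how these four features could be computed with the open-source librosa library might look like this. The file name is hypothetical, and these are generic analogues of the features named above rather than the study's exact measures:

```python
import librosa
import numpy as np

def extract_features(path):
    """Compute rough analogues of the four acoustic features
    for a single vocalization recording."""
    y, sr = librosa.load(path, sr=None)  # keep the native sample rate

    # Duration: length of the sound in seconds.
    duration = librosa.get_duration(y=y, sr=sr)

    # Energy distribution: the spectral centroid summarizes where
    # energy concentrates across frequencies.
    centroid = librosa.feature.spectral_centroid(y=y, sr=sr).mean()

    # Fundamental frequency: base pitch estimated with pYIN
    # (NaN frames are unvoiced, so average only over voiced ones).
    f0, _, _ = librosa.pyin(
        y, fmin=librosa.note_to_hz("C2"), fmax=librosa.note_to_hz("C7")
    )
    mean_f0 = np.nanmean(f0)

    # Amplitude modulation: variability of the RMS loudness envelope.
    rms = librosa.feature.rms(y=y)[0]
    am = rms.std() / (rms.mean() + 1e-9)

    return np.array([duration, centroid, mean_f0, am])

features = extract_features("cow_call_001.wav")  # hypothetical file
```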
Remarkably, these patterns remained consistent across all species, suggesting that certain vocal expressions of emotions may be evolutionarily conserved.
The study’s results provide insights into how AI can be used to understand animal emotions. The researchers identified three major findings:
- High accuracy: The AI model classified emotional valence with an 89.49% success rate, demonstrating a strong ability to distinguish between positive and negative states (a brief evaluation sketch follows this list).
- Cross-species consistency: Emotional expression patterns were found to be similar across all seven ungulate species, suggesting a universal emotional communication system.
- New perspectives on communication: The study offers insights into the evolutionary origins of emotional vocalization, potentially reshaping how we understand both animal emotions and the development of human language.
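For context on what an accuracy figure like 89.49% involves, here is a generic sketch of how classification accuracy is typically estimated with cross-validation in scikit-learn. The random arrays stand in for real features and labels, and the random forest is a placeholder, not the authors' actual model or validation protocol:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Placeholder data: one row per vocalization with the four features
# sketched above; labels are 0 = negative, 1 = positive valence.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 4))
y = rng.integers(0, 2, size=500)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)  # 5-fold cross-validation
print(f"mean accuracy: {scores.mean():.2%}")
```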
This AI model could be integrated into livestock management systems, allowing farmers to monitor stress levels in real time and take action before animals experience significant distress. Similarly, conservationists could use this technology to study emotional responses in wild animal populations.
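As a thought experiment, a real-time monitoring loop built on such a model might look something like the sketch below. Every hook here (next_call, classify, alert) is a hypothetical placeholder for the microphone, model, and notification integration the article doesn't specify, and the alert threshold is arbitrary:

```python
import time

def monitor_stream(next_call, classify, alert, threshold=3):
    """Continuously classify detected calls and alert on sustained
    negative valence. All three callables are hypothetical hooks a
    real barn-monitoring pipeline would have to supply."""
    consecutive_negative = 0
    while True:
        clip = next_call()        # next detected vocalization, or None
        if clip is None:
            time.sleep(1.0)       # nothing yet; poll again shortly
            continue
        valence = classify(clip)  # 0 = negative, 1 = positive
        consecutive_negative = consecutive_negative + 1 if valence == 0 else 0
        if consecutive_negative >= threshold:
            alert("repeated negative-valence calls detected")
            consecutive_negative = 0
```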
According to Élodie F. Briefer, Associate Professor at the Department of Biology and last author of the study:
“This breakthrough provides solid evidence that AI can decode emotions across multiple species based on vocal patterns. It has the potential to revolutionize animal welfare, livestock management, and conservation, allowing us to monitor animals’ emotions in real time.”
To accelerate further research, the team has made their dataset of labeled animal vocalizations publicly available. This will enable other scientists to build on their findings and explore additional applications of AI in animal behavior research.
Briefer adds:
“We want this to be a resource for other scientists. By making the data open access, we hope to accelerate research into how AI can help us better understand animals and improve their welfare.”
Can AI listen to data?
This research edges toward a world where AI doesn’t just process data but listens. Not in the sci-fi way, but in the practical, ground-level sense of detecting stress before an animal suffers, catching subtle cues that even trained eyes might miss.
It’s not about translating “moo” into words. If AI can do this for livestock today, what stops it from understanding more species tomorrow? The tech is already proving itself; now it’s up to us to decide what to do with it. Because once we start tuning in, ignoring what we hear won’t be an option.
Featured image credit: Kerem Gülen/Imagen 3