A behaviour monitoring dataset of wild mammals in the Swiss Alps


17 July 2025



Two roe deer foraging, with manual annotations for each individual animal. Credit: A. Mathis (EPFL).

By Nik Papageorgiou

Have you ever wondered how wild animals behave when no one’s watching? Understanding these behaviors is vital for protecting ecosystems—especially as climate change and human expansion alter natural habitats. But collecting this kind of information without interfering has always been tricky.

Traditionally, researchers relied on direct observation or sensors strapped to animals—methods that are either disruptive or limited in scope. Camera traps offer a less invasive alternative, but they generate vast amounts of footage that’s hard to analyze.

AI could help, but there’s a catch: it needs annotated datasets to learn from. Most current video datasets are either scraped from the internet, missing the authenticity of real wild settings, or are small-scale field recordings lacking detail. And few include the kind of rich context—like multiple camera angles or audio—that’s needed to truly understand complex animal behavior.

Introducing MammAlps

To address this challenge, scientists at EPFL, in collaboration with the Swiss National Park, have collected and curated MammAlps, the first richly annotated, multi-view, multimodal wildlife behavior dataset. MammAlps is designed to train AI models for species and behavior recognition tasks, and ultimately to help researchers better understand animal behavior. This work could make conservation efforts faster, cheaper, and smarter.

MammAlps was developed by Valentin Gabeff, a PhD student at EPFL under the supervision of Professors Alexander Mathis and Devis Tuia, together with their respective research teams.

How MammAlps was developed

The researchers set up nine camera traps that recorded more than 43 hours of raw footage over the course of several weeks. The team then meticulously processed this footage, using AI tools to detect and track individual animals, resulting in 8.5 hours of material showing wildlife interactions.
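The article does not detail the exact processing pipeline, but the general pattern behind such tools—run a detector on each frame, then link detections across frames into tracks—can be sketched roughly as follows. This is a minimal, hypothetical illustration only; the Detection class, the input format, and the greedy IoU matching are assumptions for clarity, not the authors' actual software.

```python
from dataclasses import dataclass, field

@dataclass
class Detection:
    box: tuple      # (x1, y1, x2, y2) bounding box in pixels (assumed format)
    species: str

@dataclass
class Track:
    track_id: int
    species: str
    boxes: list = field(default_factory=list)   # one box per frame

def iou(a, b):
    """Intersection-over-union of two boxes."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter + 1e-9)

def track_animals(frames_of_detections, iou_threshold=0.3):
    """Greedy tracker: extend each open track with the best-overlapping detection."""
    tracks, next_id = [], 0
    for detections in frames_of_detections:
        unmatched = list(detections)
        for track in tracks:
            if not unmatched:
                break
            best = max(unmatched, key=lambda d: iou(track.boxes[-1], d.box))
            if iou(track.boxes[-1], best.box) >= iou_threshold:
                track.boxes.append(best.box)
                unmatched.remove(best)
        for det in unmatched:   # any leftover detection starts a new track
            tracks.append(Track(next_id, det.species, [det.box]))
            next_id += 1
    return tracks

# Toy usage: one roe deer detected in two consecutive frames, moving slightly.
frames = [
    [Detection((100, 80, 220, 200), "roe_deer")],
    [Detection((110, 82, 230, 202), "roe_deer")],
]
for t in track_animals(frames):
    print(t.track_id, t.species, len(t.boxes), "frames")
```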

They labeled behaviors using a hierarchical approach, categorizing each moment at two levels: high-level activities like foraging or playing, and finer actions like walking, grooming, or sniffing. This structure allows AI models to interpret behaviors more accurately by linking detailed movements to broader behavioral patterns.
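To make the two-level scheme concrete, here is a hypothetical sketch of how such hierarchical labels might be represented in code. The activity and action names come from the examples above, but the data structure itself is an assumption for illustration, not the dataset's published format.

```python
from dataclasses import dataclass

@dataclass
class BehaviorLabel:
    """One annotated moment, tagged at both levels of the hierarchy."""
    species: str
    activity: str   # high-level, e.g. "foraging" or "playing"
    action: str     # fine-grained, e.g. "walking", "grooming", "sniffing"
    start_s: float  # start time within the clip, in seconds
    end_s: float

# A high-level activity is typically composed of several finer actions.
labels = [
    BehaviorLabel("roe_deer", activity="foraging", action="walking",  start_s=0.0, end_s=3.5),
    BehaviorLabel("roe_deer", activity="foraging", action="sniffing", start_s=3.5, end_s=6.0),
]

# Models can be trained or evaluated at either level of granularity.
print({lbl.action for lbl in labels}, "->", {lbl.activity for lbl in labels})
```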

To provide AI models with richer context, the team supplemented video with audio recordings and captured “reference scene maps” that documented environmental features like water sources, bushes, and rocks. This additional data enables better interpretation of habitat-specific behaviors. They also cross-referenced weather conditions and counts of individuals per event to create more complete scene descriptions.
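As a rough illustration of how these modalities might be bundled together per recorded event, a sample record could look something like the following. The field names and file layout are invented for clarity and are not the dataset's actual schema.

```python
# Hypothetical per-event record combining the modalities described above.
event = {
    "event_id": "example_0001",
    "cameras": ["cam_3", "cam_7"],               # multiple views of the same scene
    "video": ["cam_3/clip.mp4", "cam_7/clip.mp4"],
    "audio": ["cam_3/clip.wav", "cam_7/clip.wav"],
    "reference_scene_map": "maps/site_A.png",    # water sources, bushes, rocks
    "weather": "light_rain",
    "individual_count": 2,
    "species": ["roe_deer"],
}
```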

“By incorporating other modalities alongside video, we’ve shown that AI models can better identify animal behavior,” explains Alexander Mathis. “This multi-modal approach gives us a more complete picture of wildlife behavior.”

A new standard for wildlife monitoring

MammAlps brings a new standard to wildlife monitoring: a full sensory snapshot of animal behavior across multiple angles, sounds, and contexts. It also introduces a “long-term event understanding” benchmark, meaning scientists can now study not just isolated behaviors from short clips, but broader ecological scenes over time—like a wolf stalking a deer across several camera views.

Research is still ongoing. The team is currently processing data collected in 2024 and is carrying out further fieldwork in 2025. These additional surveys will expand the set of recordings of rare species such as alpine hares and lynx, and will also support the development of methods for analyzing wildlife behavior across multiple seasons.

Building more datasets like MammAlps could radically scale up current wildlife monitoring efforts by enabling AI models to identify behaviors of interest from hundreds of hours of video. This would provide wildlife conservationists with timely, actionable insights. Over time, this could make it easier to track how climate change, human encroachment, or disease outbreaks impact wildlife behavior, and help protect vulnerable species.

For more information about MammAlps and access to the dataset, visit the project webpage.

Read the work in full

MammAlps: A multi-view video behavior monitoring dataset of wild mammals in the Swiss Alps, Valentin Gabeff, Haozhe Qi, Brendan Flaherty, Gencer Sumbül, Alexander Mathis, Devis Tuia.


