
AI-assisted camera system to monitor seabird behaviour


by Lucy Smith
06 November 2020




Researchers from Osaka University have combined bio-logging cameras with a machine learning algorithm to help them to shed light on hidden aspects of the lives of seabird species, including gulls and shearwaters.

Bio-logging is a technique involving the mounting of small lightweight video cameras and/or other data-gathering devices onto the bodies of wild animals. These systems allow researchers to observe various aspects of animals’ lives, such as behaviours and social interactions, with minimal disturbance.

However, the power demands of these high-cost bio-logging sensors, such as video cameras, have so far limited how long they can record. “Since bio-loggers attached to small animals have to be small and lightweight, they have short runtimes and it was therefore difficult to record interesting infrequent behaviours,” explains study corresponding author Takuya Maekawa.

With AI-assisted bio-loggers, researchers can instead use low-cost sensors to automatically detect behaviours of interest in real time, and conditionally activate the high-cost (i.e., resource-intensive) sensors only when those behaviours occur.
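As a rough illustration of this idea (not the authors' implementation), the sketch below shows what such a triggering loop could look like: a low-cost accelerometer is sampled continuously, a pre-trained classifier labels the current behaviour, and an energy-hungry camera is only powered up when the behaviour of interest is detected. The `accelerometer`, `camera` and `classifier` objects, the window size and the recording duration are all hypothetical placeholders.

```python
import time
from collections import deque

WINDOW_SIZE = 25        # hypothetical: ~1 second of accelerometer samples at 25 Hz
RECORD_SECONDS = 60     # hypothetical: how long to run the camera once triggered

def trigger_loop(accelerometer, camera, classifier, target="foraging"):
    """Continuously classify cheap accelerometer data and switch the
    expensive camera on only when the target behaviour is detected."""
    window = deque(maxlen=WINDOW_SIZE)
    while True:
        window.append(accelerometer.read())           # low-cost, always-on sensing
        if len(window) == WINDOW_SIZE:
            behaviour = classifier.predict_window(list(window))
            if behaviour == target:                   # behaviour of interest found
                camera.start()                        # activate the high-cost sensor
                time.sleep(RECORD_SECONDS)
                camera.stop()
                window.clear()                        # resume low-cost monitoring
```

The point of this split is that the accelerometer and the classifier can run continuously for very little energy, while the camera only draws power during the behaviours worth recording.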

The researchers have put together this video to explain how their system works:

The researchers used a random forest classifier to determine when to switch on the high-cost sensors. The classifier operates on accelerometer-based features, which allow the birds' body movements to be detected with only a small delay (e.g., one second) between when data collection begins and when a behaviour can first be detected. These accelerometer features were used to train the model to recognise whether the birds were flying, stationary or foraging.
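To make this concrete, here is a minimal sketch of how such a classifier could be trained with scikit-learn, assuming one-second windows of tri-axial accelerometer data summarised by simple per-axis statistics. The feature choices and label names are illustrative, not the exact ones used in the study.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def window_features(window):
    """Summarise one 1-second window of tri-axial accelerometer samples
    (shape: n_samples x 3) with simple per-axis statistics."""
    return np.concatenate([
        window.mean(axis=0),                       # mean acceleration per axis
        window.std(axis=0),                        # variability per axis
        window.max(axis=0) - window.min(axis=0),   # range per axis
    ])

def train_behaviour_classifier(windows, labels):
    """windows: list of (n_samples x 3) arrays;
    labels: e.g. "flying", "stationary" or "foraging" for each window."""
    X = np.array([window_features(w) for w in windows])
    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    clf.fit(X, labels)
    return clf
```

Random forests are a reasonable choice for this kind of on-device use because prediction only requires traversing a set of decision trees, which is cheap compared with running a camera.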

You can see three examples of the camera in action below:

Read the research in full

Machine learning enables improved runtime and precision for bio-loggers on seabirds
Joseph Korpela, Hirokazu Suzuki, Sakiko Matsumoto, Yuichi Mizutani, Masaki Samejima, Takuya Maekawa, Junichi Nakai & Ken Yoda




Lucy Smith is Senior Managing Editor for AIhub.



