AIhub.org
AI-powered BirdNET app makes citizen science easier


13 July 2022



Image: blue tit

By Pat Leonard

The BirdNET app, a free machine learning-powered tool that can identify more than 3,000 bird species by sound alone, generates reliable scientific data and makes it easier for people to contribute citizen-science data on birds simply by recording sounds, according to new Cornell research.

“The most exciting part of this work is how simple it is for people to participate in bird research and conservation,” said Connor Wood, research associate in the K. Lisa Yang Center for Conservation Bioacoustics at the Cornell Lab of Ornithology and lead author of “The Machine Learning-Powered BirdNET App Reduces Barriers to Global Bird Research by Enabling Citizen Science Participation”, published on 28 June 2022 in PLOS Biology.

“You don’t need to know anything about birds; you just need a smartphone, and the BirdNET app can then provide both you and the research team with a prediction for what bird you’ve heard,” Wood said. “This has led to tremendous participation worldwide, which translates to an incredible wealth of data. It’s really a testament to an enthusiasm for birds that unites people from all walks of life.”

The study suggests that the BirdNET app lowers the barrier to citizen science because it doesn’t require bird-identification skills. Users simply listen for birds and tap the app to record. BirdNET uses artificial intelligence to automatically identify the species by sound and captures the recording for use in research.
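
For readers curious about what that automatic identification looks like in practice, the BirdNET models can also be run on recordings outside the app. The sketch below uses birdnetlib, a third-party Python wrapper around the BirdNET-Analyzer models (installed separately, e.g. via pip); the file name, coordinates, and date are placeholder values, not part of the study.

```python
# Hedged sketch: running BirdNET species recognition on an audio file
# with the third-party birdnetlib wrapper (pip install birdnetlib).
from datetime import datetime

from birdnetlib import Recording
from birdnetlib.analyzer import Analyzer

# Load the pretrained BirdNET-Analyzer model.
analyzer = Analyzer()

# "sample.mp3" and the coordinates/date are placeholders; location and
# date let BirdNET narrow the list of plausible candidate species.
recording = Recording(
    analyzer,
    "sample.mp3",
    lat=42.47,              # example values (Ithaca, NY)
    lon=-76.45,
    date=datetime(2022, 6, 28),
    min_conf=0.25,          # discard low-confidence predictions
)
recording.analyze()

# Each detection includes the predicted species, a confidence score,
# and the start/end time of the audio snippet it came from.
for detection in recording.detections:
    print(detection)
```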

“Our guiding design principles were that we needed an accurate algorithm and a simple user interface,” said study co-author Stefan Kahl of the Yang Center at the Cornell Lab of Ornithology, who led the technical development. “Otherwise, users would not return to the app.”

The results exceeded expectations: Since its launch in 2018, more than 2.2 million people have contributed data.

To test whether the app could generate reliable scientific data, the authors selected four test cases in the United States and Europe in which conventional research had already provided robust answers. Their study shows, for example, that BirdNET app data successfully replicated the known distribution pattern of song-types among white-throated sparrows, and the seasonal and migratory ranges of the brown thrasher.
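
As a rough illustration of what such a comparison involves (this is not the authors’ actual analysis), detections submitted through the app could be binned by week and location, and the occupied area compared with a species’ known range. All file and column names below are invented for the example.

```python
# Hypothetical sketch of aggregating app detections into a weekly range
# map; the CSV file and its columns are placeholders for illustration.
import pandas as pd

# Each row: one app submission with its predicted species and metadata.
detections = pd.read_csv("birdnet_detections.csv")  # placeholder file

# Weekly detection counts per 1-degree grid cell for one species.
thrasher = detections[detections["species"] == "Brown Thrasher"].copy()
thrasher["lat_bin"] = thrasher["latitude"].round()
thrasher["lon_bin"] = thrasher["longitude"].round()

weekly_range = (
    thrasher.groupby(["week", "lat_bin", "lon_bin"])
    .size()
    .rename("n_detections")
    .reset_index()
)

# If the app data are reliable, the seasonal shift in occupied grid
# cells should mirror the species' known migratory range.
print(weekly_range.head())
```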

Validating the reliability of the app data for research purposes was the first step in what the authors hope will be a long-term, global research effort – not just for birds, but ultimately for all wildlife and even entire soundscapes. The app is available for both iOS and Android platforms.

The BirdNET app is part of the Cornell Lab of Ornithology’s suite of tools, including the educational Merlin Bird ID app and citizen-science apps eBird, NestWatch and Project FeederWatch, which together have generated more than 1 billion bird observations, sounds and photos from participants around the world for use in science and conservation.

This project was supported by Jake Holshuh, the Arthur Vining Davis Foundations, the European Union, the European Social Fund for Germany, and the German Federal Ministry of Education and Research.


Find out more about the app in this Q&A with BirdNET developer Stefan Kahl.





Cornell University




