AI learns the song of the reef

Summary: A new AI algorithm, trained on audio recordings of both healthy and degraded coral reefs, can correctly determine reef health 92% of the time.

Source: UCL

Coral reefs have a complex acoustic landscape, and even experts must perform careful analysis to gauge reef health from acoustic recordings.

In the new study, published in Ecological Indicators, the scientists trained a computer algorithm using multiple recordings of healthy and degraded reefs, allowing the machine to learn the difference.

The computer then analyzed a set of new recordings, successfully identifying the health of the corals 92% of the time.

The team used this to track the progress of coral reef restoration projects.

Lead author Ben Williams, a PhD candidate at the UCL Centre for Biodiversity and Environment Research, who started the study while at the University of Exeter, said: “Coral reefs face multiple threats including climate change, so monitoring their health and the success of conservation projects is vital.

“A major difficulty is that visual and acoustic surveys of coral reefs usually rely on labor-intensive methods.

“Visual surveys are also limited by the fact that many reef creatures conceal themselves, or are active at night, while the complexity of reef sounds makes it difficult to determine reef health from individual recordings.

“Our approach to this problem was to use machine learning, to see whether a computer could learn the song of the reef.

“Our findings show that a computer can pick up patterns that are undetectable to the human ear. It can tell us faster, and more accurately, how the reef is doing.”

Fish and other creatures that live on coral reefs make a wide range of sounds.

Fish and other creatures that live on coral reefs make a wide range of sounds. Credit: Tim Lamont

The meaning of many of these calls remains unknown, but the new AI method can distinguish between the overall sounds of healthy and degraded reefs.

The recordings used in the study were taken by the Mars Coral Reef Restoration Project, which works to restore severely damaged coral reefs in Indonesia.

Co-author Dr Tim Lamont, from Lancaster University, said the AI method creates significant opportunities to improve coral reef monitoring.

“This is a really exciting development. Sound recorders and artificial intelligence could be used around the world to monitor the health of coral reefs, and discover whether attempts to protect and restore them are succeeding.”

“In many cases, it is easier and cheaper to deploy an underwater hydrophone on a reef and leave it there than to have expert divers visit the reef repeatedly to survey it, especially in remote locations.”

Funding: The study was funded by the Natural Environment Research Council and the Swiss National Science Foundation.

About this artificial intelligence research news

Author: Henry Killworth
Source: UCL
Contact: Henry Killworth – UCL
Image: The image is credited to Tim Lamont

Original Research: Open access.
“Enhancing automated analysis of marine soundscapes using ecoacoustic indices and machine learning” by Ben Williams et al. Ecological Indicators


Abstract

Enhancing automated analysis of marine soundscapes using ecoacoustic indices and machine learning


Historically, ecological monitoring of marine habitats has relied primarily on labor-intensive, non-automated survey methods. The field of passive acoustic monitoring (PAM) has demonstrated the potential to automate such surveys of marine habitats, primarily through the use of “ecoacoustic indices” to identify features of natural soundscapes.

However, investigations using single indices have had mixed success.

Using PAM recordings collected in one of the world’s largest coral reef restoration programs, we instead apply a machine-learning approach across a suite of ecoacoustic indices to improve their predictive power for ecosystem health. Healthy and degraded reef sites were identified through live coral cover surveys, with 90–95% and 0–20% coverage, respectively.

A library of one-minute recordings was extracted from each. Twelve ecoacoustic indices were calculated for each recording, in up to three different frequency bands (low: 0.05–0.8 kHz, medium: 2–7 kHz and broad: 0.05–20 kHz). Twelve of these 33 index and frequency band combinations differed significantly between healthy and degraded habitats.
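To make the index-calculation step concrete, the following is a minimal sketch in Python, not the study's code: the abstract does not list the twelve indices, so the two shown here (band-limited RMS and spectral entropy) and all function names are illustrative assumptions; only the three frequency bands come from the abstract.

import numpy as np
from scipy import signal

# Frequency bands from the abstract, in Hz: low, medium and broad.
BANDS = {"low": (50, 800), "medium": (2000, 7000), "broad": (50, 20000)}

def bandpass(x, sr, lo, hi, order=4):
    # Zero-phase Butterworth band-pass filter.
    sos = signal.butter(order, [lo, hi], btype="bandpass", fs=sr, output="sos")
    return signal.sosfiltfilt(sos, x)

def band_rms(x):
    # Root-mean-square amplitude: a simple proxy for acoustic energy.
    return float(np.sqrt(np.mean(x ** 2)))

def spectral_entropy(x, sr):
    # Shannon entropy of the normalised Welch power spectrum.
    _, pxx = signal.welch(x, fs=sr, nperseg=4096)
    p = pxx / pxx.sum()
    return float(-(p * np.log2(p + 1e-12)).sum() / np.log2(len(p)))

def index_vector(x, sr):
    # One feature vector per one-minute recording: each index in each band.
    feats = {}
    for name, (lo, hi) in BANDS.items():
        hi = min(hi, sr / 2 - 1)  # keep the band below the Nyquist frequency
        xb = bandpass(x, sr, lo, hi)
        feats[f"rms_{name}"] = band_rms(xb)
        feats[f"entropy_{name}"] = spectral_entropy(xb, sr)
    return feats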

However, the best-performing single index could only correctly classify 47% of recordings, which would require extensive sampling from each site to be useful.

We therefore trained a machine-learning algorithm, based on discriminant analysis, to distinguish between healthy and degraded sites using an optimized combination of ecoacoustic indices.

This multi-index approach distinguished these two habitat classes with improved accuracy compared to any single index in isolation. The pooled classification rate across the 1000 validation iterations of the model was a 91.7% (±0.8, mean ± SE) success rate in correctly classifying individual recordings.
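The classification step could be sketched along the following lines, again only as an illustration under stated assumptions: scikit-learn's regularized quadratic discriminant analysis is used as a stand-in for the study's discriminant-analysis model, and the feature matrix, labels and 75/25 split ratio are placeholders rather than the study's data.

import numpy as np
from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis
from sklearn.model_selection import StratifiedShuffleSplit
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Placeholder data: one row of 33 index/band features per recording,
# labelled 1 (healthy site) or 0 (degraded site).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 33))
y = rng.integers(0, 2, size=200)

model = make_pipeline(StandardScaler(),
                      QuadraticDiscriminantAnalysis(reg_param=0.5))

# Repeated random train/test splits, loosely mirroring the 1000
# validation iterations reported above.
splitter = StratifiedShuffleSplit(n_splits=1000, test_size=0.25, random_state=0)
scores = []
for train_idx, test_idx in splitter.split(X, y):
    model.fit(X[train_idx], y[train_idx])
    scores.append(model.score(X[test_idx], y[test_idx]))

print(f"pooled accuracy: {np.mean(scores):.3f} "
      f"(+/- {np.std(scores) / np.sqrt(len(scores)):.3f} SE)")

A model fitted in this way could then label new, unlabelled recordings with model.predict, which mirrors how the recordings from restored sites described below are classified.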

The model was subsequently used to classify recordings from two actively restored sites, restored >24 months prior to the recordings, with coral cover values of 79.1% (±3.9) and 66.5% (±3.8). Of these recordings, 37/38 and 33/39 were classified as healthy, respectively.

The model was also used to classify recordings from a newly restored site, restored less than 12 months prior, with 25.6% (±2.6) coral cover, of which 27/33 recordings were classified as degraded.

This investigation highlights the value of combining PAM recordings with machine learning analysis for environmental monitoring and demonstrates the ability of PAM to monitor reef recovery over time, reducing reliance on labor-intensive in-water surveys by experts.

As the accessibility of PAM recorders continues to improve rapidly, effective automated analyses will be required to keep pace with these expanding audio datasets.