Healthy coral reefs are often associated with their visual splendor: the brilliant array of colors and shapes that inhabit these magnificent underwater ecosystems.

However, they can be pretty noisy. If you've ever been snorkeling around a coral reef, you'll be familiar with the clicking and popping sounds created by the marine creatures beneath the water, such as snapping shrimp and feeding fish.

The buzzy din of background noise – almost like the chattering hiss of radio static – is such a distinctive component of the coral reef soundscape that it could aid in monitoring the health of these vulnerable marine environments.

In a new study, researchers trained a machine-learning algorithm to distinguish between the sounds of a healthy, active reef and a degraded coral site – an acoustic contrast so subtle it may be impossible for humans to notice.

The researchers say the new technology could offer significant advantages over current reef-monitoring methods, which are labor-intensive and time-consuming: divers visiting reefs to visually measure coral cover, or experts manually listening through reef recordings. Furthermore, many reef animals hide or are only visible at night, making visual surveys much more difficult.

“Our findings show that a computer can pick up patterns that are undetectable to the human ear,” says marine biologist Ben Williams from the University of Exeter in the UK.

“It can tell us faster and more accurately how the reef is doing.”


Williams and his colleagues recorded reef soundscapes at seven locations in the Spermonde Archipelago, off the southwest coast of Sulawesi in Indonesia and home to the Mars Coral Reef Restoration project.

The recordings were made in four different types of reef habitat – healthy, degraded, mature restored, and newly restored – each with a different amount of coral cover and, as a result, a different mix of sounds from the aquatic organisms living and feeding there.

“Previously, we relied on manual listening and annotation of these recordings to make reliable comparisons,” Williams explains in a Twitter thread.

“However, this is a very slow process and the size of marine soundscape databases is skyrocketing given the advent of low-cost recorders.”

To automate the process, the scientists trained a machine-learning algorithm to distinguish between the different types of coral recordings. In testing, the algorithm determined reef health from audio recordings with 92 percent accuracy.
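The study's exact pipeline isn't described in this article, but the general recipe – summarize each recording as a vector of acoustic features, then train a supervised classifier on labeled examples – can be sketched briefly. The sketch below assumes MFCC features, a random-forest classifier, and hypothetical file names; the actual features and model used by the researchers may well differ.

```python
# Minimal sketch of reef-audio classification, assuming MFCC features
# and a random-forest classifier; the study's actual feature set and
# model are not detailed here and may differ. Paths and labels below
# are hypothetical placeholders.
import numpy as np
import librosa
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

def extract_features(wav_path, sr=16000, n_mfcc=20):
    """Summarize one recording as its mean MFCC vector over time."""
    audio, sr = librosa.load(wav_path, sr=sr)
    mfcc = librosa.feature.mfcc(y=audio, sr=sr, n_mfcc=n_mfcc)
    return mfcc.mean(axis=1)  # one n_mfcc-dimensional vector per file

# Hypothetical labeled dataset: in practice this would be many
# recordings per habitat type, not two.
recordings = [
    ("healthy_site_01.wav", "healthy"),
    ("degraded_site_01.wav", "degraded"),
    # ... many more labeled recordings ...
]

X = np.array([extract_features(path) for path, _ in recordings])
y = np.array([label for _, label in recordings])

# Cross-validated accuracy gives an estimate comparable in spirit to
# the 92 percent figure reported above.
clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)
print(f"Mean held-out accuracy: {scores.mean():.2%}")
```

Mean-pooled MFCCs are a crude summary; richer ecoacoustic indices or spectrogram-based deep models are common alternatives for this kind of task.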

“This is a really exciting development,” says co-author and marine biologist Timothy Lamont from Lancaster University in the UK.

“In many cases it’s easier and cheaper to deploy an underwater hydrophone on a reef and leave it there than to have expert divers visiting the reef repeatedly to survey it – especially in remote locations.”

The algorithm’s results are based on a combination of underwater soundscape factors, including the abundance and diversity of fish vocalizations and the sounds made by invertebrates. The researchers suggest there may also be contributions from faint noises thought to be produced by algae, as well as from abiotic sources – subtle differences in how waves and wind sound across different kinds of coral habitat.
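As an illustration of what such soundscape factors might look like numerically, the sketch below computes average spectral power in two frequency bands – a low band loosely associated with fish vocalizations and a high band loosely associated with snapping shrimp. The band edges are rough assumptions for illustration, not values taken from the study.

```python
# Illustrative soundscape summary: average power in frequency bands
# loosely associated with fish calls and snapping shrimp. Band edges
# are assumptions for illustration, not the study's definitions.
import numpy as np
from scipy.signal import welch

def band_power(audio, sr, lo_hz, hi_hz):
    """Mean power spectral density between lo_hz and hi_hz (Welch)."""
    freqs, psd = welch(audio, fs=sr, nperseg=4096)
    mask = (freqs >= lo_hz) & (freqs <= hi_hz)
    return psd[mask].mean()

def soundscape_features(audio, sr=44100):
    """Two simple features a classifier could use alongside others."""
    return {
        "fish_band_power": band_power(audio, sr, 50, 800),        # low band
        "shrimp_band_power": band_power(audio, sr, 2000, 20000),  # high band
    }

# Example with synthetic noise standing in for a real recording:
rng = np.random.default_rng(0)
fake_audio = rng.standard_normal(44100 * 60)  # one minute of "audio"
print(soundscape_features(fake_audio))
```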

While the human ear may not be able to distinguish such faint and hidden sounds, machines appear capable of doing so. However, the researchers acknowledge that the method can still be improved, with future sound sampling expected to deliver “a more nuanced approach to classifying ecostate.”

Unfortunately, time is something the world's corals are rapidly running out of. If we want to save them, we'll have to act quickly.

The findings were published in Ecological Indicators.

Source: www.sciencealert.com
