Dolphins and dugongs are among the marine megafauna found in Singapore waters. However, our baseline understanding of the local populations, such as their visiting patterns and foraging behavior, is insufficient to formulate conservation approaches. This is because traditional monitoring relies on visual sightings of animals near the water surface, and these observations are costly, opportunistic, patchy, constrained to daylight hours, and subject to weather conditions.
As the animals spend most of their time underwater, where visual detection can be ineffective, especially in turbid waters, we deploy both miniature passive acoustic arrays and active acoustic systems on autonomous platforms to conduct the monitoring. A hybrid of the two methods is used to maximize monitoring coverage. Machine learning is also used to detect animal vocalizations in acoustic data cluttered by anthropogenic noise in busy coastal waters.
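To illustrate the detection problem, the sketch below shows a simple spectrogram-based band-energy detector, a common baseline step before machine-learning classification of vocalizations. It is not the project's actual pipeline; the band limits, threshold, and synthetic "whistle" are illustrative assumptions.

```python
import numpy as np
from scipy import signal

def band_energy_detections(x, fs, f_lo, f_hi, threshold_db=10.0, nperseg=256):
    """Flag spectrogram frames whose energy in [f_lo, f_hi] Hz exceeds the
    median band energy (a robust noise-floor estimate) by threshold_db."""
    f, t, Sxx = signal.spectrogram(x, fs=fs, nperseg=nperseg)
    band = (f >= f_lo) & (f <= f_hi)
    band_energy = Sxx[band].sum(axis=0)   # energy per time frame in the band
    ref = np.median(band_energy)          # noise floor, robust to short calls
    mask = 10 * np.log10(band_energy / ref) > threshold_db
    return t, mask

# Synthetic demo: 1 s of broadband noise with a 0.1 s, 8 kHz tone burst at
# t = 0.5 s, standing in for a whistle against vessel noise (illustrative).
fs = 48_000
rng = np.random.default_rng(0)
x = 0.1 * rng.standard_normal(fs)
burst = np.arange(int(0.1 * fs))
x[fs // 2 : fs // 2 + burst.size] += np.sin(2 * np.pi * 8_000 * burst / fs)

t, mask = band_energy_detections(x, fs, 6_000, 10_000)
print("detected near t = 0.5 s:", mask[(t > 0.45) & (t < 0.55)].any())
```

In practice such a detector only screens candidate events; a trained classifier is still needed to separate true vocalizations from snapping shrimp, sonar, and vessel noise.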
We aim to understand animal activity by observing both the vocalizations and the underwater swimming behaviors they reveal. Together, these acoustic methods could form a new set of monitoring tools for biodiversity management and marine megafauna conservation in local and regional waters.