ARL

Dolphin crossmodal matching

Dolphins possess echolocation – a sense that humans do not have – and although we can build devices such as sonar that mimic that ability, our understanding of what information this sense actually provides to the dolphin about its environment is still very limited. Originally, echolocation was thought to serve mainly for finding food and navigating in the ocean, but over the past decades it has become clear that dolphin echolocation is far more sophisticated than originally assumed. Experiments have shown that dolphins can not only detect objects but can also recognize their shape through echolocation – comparable to a medical ultrasound scanner that reconstructs a 3-D image. The interesting part is that dolphins seem to be able to do this at much lower frequencies (about 80-160 kHz) than those used in technical applications (in the MHz range) – lower by almost two orders of magnitude – and still achieve enough resolution to determine the shape of an object!
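
To put those numbers in perspective, the acoustic wavelength sets a rough bound on the detail a sonar can resolve. The short sketch below (assuming a sound speed of about 1500 m/s in seawater, a standard textbook value rather than a figure from our experiments, and purely illustrative frequencies) compares the wavelengths at dolphin click frequencies with that of a typical medical ultrasound system:

# Rough wavelength comparison: dolphin echolocation vs. medical ultrasound.
# Assumes c ~ 1500 m/s in seawater; all frequencies are illustrative values.

SOUND_SPEED_SEAWATER = 1500.0  # m/s, approximate

def wavelength_mm(frequency_hz: float) -> float:
    """Acoustic wavelength in millimetres for a given frequency."""
    return SOUND_SPEED_SEAWATER / frequency_hz * 1000.0

for label, f in [("dolphin click, 80 kHz", 80e3),
                 ("dolphin click, 160 kHz", 160e3),
                 ("medical ultrasound, 5 MHz", 5e6)]:
    print(f"{label}: ~{wavelength_mm(f):.1f} mm")

# Example output:
#   dolphin click, 80 kHz: ~18.8 mm
#   dolphin click, 160 kHz: ~9.4 mm
#   medical ultrasound, 5 MHz: ~0.3 mm

The surprising observation is that, despite wavelengths tens of times longer than those of medical ultrasound, the dolphin still extracts enough information to recognize an object's shape.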

We have been investigating that ability, in collaboration with Ocean Park Hong Kong, through an experimental setup called crossmodal matching-to-sample, in which the animal has to use one sense (in this case echolocation) to examine a sample object concealed in an acoustically transparent box underwater. After inspecting the sample, the dolphin has to select the matching alternative among several (up to 4) objects presented to the visual sense (in air – where dolphin echolocation does not function). During these tests the dolphin's echolocation signals are recorded with several arrays of hydrophones, either inside the box that contains the sample or on a biteplate that the dolphin stations on. The acoustic information collected can then be used to investigate what sounds the dolphin receives, and to build a model of how that information might be processed.
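
As an illustration of what a first processing step on such hydrophone recordings might look like, the sketch below picks out individual echolocation clicks by band-pass filtering a recording and thresholding its envelope. This is a generic approach rather than a description of our actual analysis pipeline, and the filter band, filter order, and threshold are placeholder assumptions:

# Hypothetical first step for analysing a hydrophone recording:
# band-pass filter around typical dolphin click frequencies, then
# pick out clicks as peaks in the signal envelope.
# All parameters below are illustrative placeholders.

import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def detect_clicks(recording: np.ndarray, fs: float,
                  band=(60e3, 160e3), threshold_factor=8.0):
    """Return sample indices of candidate echolocation clicks.

    recording        -- 1-D hydrophone signal
    fs               -- sample rate in Hz (must exceed 2 * band[1])
    band             -- band-pass edges in Hz
    threshold_factor -- detection threshold as a multiple of the
                        median envelope level
    """
    # Band-pass filter to isolate the click band.
    b, a = butter(4, band, btype="bandpass", fs=fs)
    filtered = filtfilt(b, a, recording)

    # Envelope via the analytic signal.
    envelope = np.abs(hilbert(filtered))

    # Threshold relative to the background level.
    threshold = threshold_factor * np.median(envelope)
    above = envelope > threshold

    # Keep the first sample of each run above threshold as one click.
    onsets = np.flatnonzero(above & ~np.roll(above, 1))
    return onsets

The timing and spectral content of the detected clicks, together with the returning echoes, are the kind of raw material such a model of the dolphin's processing would work from.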

By changing the sample objects and making the task more difficult (e.g. changing the size of an object while maintaining its overall shape, or rotating the object), we can determine at what point the dolphin's performance drops to the level where the animal is no longer able to perform the task. This reveals some of the limitations of echolocation and gives us a clearer picture of what the dolphin actually senses.
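
One way such results could be summarized (purely as an illustration, with invented numbers rather than actual experimental data) is to fit a psychometric curve to percent-correct performance as a function of the manipulation, here rotation angle, and read off the point where performance falls toward chance level (e.g. 25% when four alternatives are shown):

# Illustrative psychometric fit: proportion correct vs. rotation angle.
# The data points below are invented for demonstration purposes only.

import numpy as np
from scipy.optimize import curve_fit

CHANCE = 0.25  # guessing rate with four alternatives

def psychometric(angle, threshold, slope):
    """Logistic curve falling from near-perfect performance to chance."""
    return CHANCE + (1.0 - CHANCE) / (1.0 + np.exp((angle - threshold) / slope))

# Hypothetical rotation angles (degrees) and proportion of correct choices.
angles  = np.array([0, 15, 30, 45, 60, 75, 90], dtype=float)
correct = np.array([0.98, 0.95, 0.90, 0.70, 0.45, 0.30, 0.27])

params, _ = curve_fit(psychometric, angles, correct, p0=[45.0, 10.0])
threshold, slope = params
print(f"Estimated rotation threshold: ~{threshold:.0f} degrees")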
Additionally, we have designed and tested a transducer system that attempts to implement the findings from these experiments and to mimic the dolphin's performance. If the man-made system shows the same decrease in performance under the same experimental manipulations (i.e. size change or rotation), that is good evidence that we are on the right track in deciphering the dolphins' amazing ability to “see with sound”.