Google’s AI model helps understand dolphin sounds
Apr. 15, 2025.
The DolphinGemma AI model, developed by Google with the Wild Dolphin Project and Georgia Tech, helps decode dolphin vocalizations.
Scientists have studied dolphin sounds for years. Dolphins make clicks, whistles, and burst pulses to talk. Researchers want to understand these sounds better. Google, along with Georgia Tech and the Wild Dolphin Project, has announced DolphinGemma. This AI model learns dolphin vocalizations and creates new dolphin-like sounds.
This work aims to connect humans and dolphins. The Wild Dolphin Project started in 1985. It studies Atlantic spotted dolphins in the Bahamas. Researchers record underwater video and sound and match sounds to dolphin behaviors. This method avoids disturbing the dolphins. They have noticed that signature whistles act like names. Mothers use them to call their calves. Burst pulses often happen during fights. Clicks appear in courtship or when chasing sharks.
DolphinGemma and its role in research
DolphinGemma uses Google’s audio technology, SoundStream, which turns dolphin sounds into sequences of digital audio tokens. The model has 400 million parameters, making it small enough to run on Google Pixel phones. DolphinGemma analyzes sequences of dolphin sounds and predicts which sound is likely to come next, much as human language models predict the next word.
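To make the prediction idea concrete, here is a minimal, hypothetical sketch. It assumes the audio has already been converted into discrete token IDs (the kind of output a SoundStream-style tokenizer produces) and uses a toy bigram counter to guess the next token. The names and data are purely illustrative; this is not Google's model or API.

```python
# Toy sketch of "predict the next sound": count which audio token tends to
# follow each token, then predict the most frequent follower.
# Illustrative only -- DolphinGemma is a 400M-parameter neural model, not a bigram counter.
from collections import Counter, defaultdict

def train_bigram(token_sequences):
    """Count which token follows each token across all recordings."""
    counts = defaultdict(Counter)
    for seq in token_sequences:
        for prev, nxt in zip(seq, seq[1:]):
            counts[prev][nxt] += 1
    return counts

def predict_next(counts, token):
    """Return the most likely next token after `token`, or None if unseen."""
    if token not in counts:
        return None
    return counts[token].most_common(1)[0][0]

# Hypothetical data: each list stands for one tokenized dolphin recording.
recordings = [[3, 7, 7, 2, 9], [3, 7, 2, 9, 9], [5, 3, 7, 7, 2]]
model = train_bigram(recordings)
print(predict_next(model, 7))  # prints 2, the token that most often followed 7
```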
The Wild Dolphin Project uses DolphinGemma this field season. The model searches for patterns in dolphin sounds, which may help uncover structure in their communication. Researchers also hope to build a shared vocabulary, using synthetic sounds to name objects the dolphins enjoy, such as seagrass or scarves.
The project also uses a system called CHAT (Cetacean Hearing Augmentation Telemetry), an underwater computer. CHAT plays artificial whistles that stand for objects, and dolphins might mimic these whistles to ask for the items. A Google Pixel 9 powers the latest CHAT system. It listens for and quickly identifies dolphin mimics, letting researchers respond fast with the right object; a rough sketch of this matching step follows below. Using Pixel phones also saves space and energy.

Google plans to share DolphinGemma with other scientists this summer. The model was trained mainly on Atlantic spotted dolphins, but with adjustments it can help study other species, such as bottlenose dolphins. This could speed up research on dolphin communication and bring humans closer to understanding these intelligent marine animals.
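For illustration, here is a hedged sketch of what CHAT's listen-and-match step could look like: compare an incoming sound against the synthetic whistles assigned to objects and return the closest match. The whistle templates, threshold, and correlation-based matching are assumptions made for this example, not the actual CHAT implementation.

```python
# Hypothetical sketch of a whistle-matching step: which object's synthetic
# whistle does an incoming sound most resemble? Real systems would also need
# time alignment and far more robust features; this is illustrative only.
import numpy as np

def best_match(incoming, templates, threshold=0.8):
    """Return the object whose whistle template best matches `incoming`
    by normalized correlation, or None if nothing is close enough."""
    def norm(x):
        x = x - x.mean()
        return x / (np.linalg.norm(x) + 1e-9)
    incoming = norm(incoming)
    best_name, best_score = None, threshold
    for name, template in templates.items():
        score = float(np.dot(incoming, norm(template)))
        if score > best_score:
            best_name, best_score = name, score
    return best_name

sr = 44_100                          # sample rate in Hz (assumed)
t = np.arange(int(0.05 * sr)) / sr   # 50 ms of audio
# Made-up whistle templates for two objects the dolphins like.
templates = {
    "seagrass": np.sin(2 * np.pi * 9_000 * t),
    "scarf": np.sin(2 * np.pi * 12_000 * t),
}
# A noisy recording that resembles the "seagrass" whistle.
heard = np.sin(2 * np.pi * 9_000 * t) + 0.1 * np.random.randn(t.size)
print(best_match(heard, templates))  # likely prints "seagrass"
```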
Let us know your thoughts! Sign up for a Mindplex account now, join our Telegram, or follow us on Twitter.