Curated by THEOUTPOST
On Sat, 17 May, 12:01 AM UTC
2 Sources
[1]
Can AI help us talk to dolphins? The race is now on
Will artificial intelligence (AI) finally allow us to talk to animals? To spur progress towards that dream, a contest is offering a US$500,000 cash prize for AI use that achieves a "breakthrough" in interspecies communication, as well as an annual award of US$100,000 until the grand prize is claimed.

The first annual prize in the contest, the Coller Dolittle Challenge, was awarded yesterday to researchers who are planning to apply AI to some four decades' worth of recordings of bottlenose dolphins (Tursiops truncatus). The team has already identified more than 20 distinct sounds in the dolphin 'vocabulary' and plans to incorporate AI classification methods to expand this lexicon.

The contest, which launched a year ago, is organized by the Jeremy Coller Foundation in London -- which funds projects related to animal welfare, among other things -- in partnership with Tel Aviv University in Israel. Annual prizes will be awarded for research that uses AI to improve scientists' understanding of how animals communicate. But "the ultimate goal is indeed what we call two-way, multi-context communication with an animal using the animal's own signals", says Yossi Yovel, a neuroecologist at Tel Aviv University and chair of the award's scientific committee. The grand prize is worth either $500,000 in cash or $10 million in investments; the qualifying criteria will be established in the next year or so, Yovel says.

AI has boosted researchers' ability to process large amounts of animal-communication data, such as recordings of bird whistles or wolf howls. The technology has also made it easier for scientists to look for patterns that hint at the meaning of animal sounds, says Yovel. Together with advances in AI for human communication, this has fuelled interest in using AI tools to decode animal communication.

But researchers say they have yet to see an AI-based revolution in the field. "You still need a lot of zoological know-how. Thinking that you can just put a camera somewhere, record an animal and automatically detect something -- the chances for this being useful are very low," says Yovel.

There's been a lot of hype about applying large language models to animal communication systems, says Arik Kershenbaum, a zoologist at the University of Cambridge, UK, who is not involved in the contest. That approach hasn't yet been proved, but contest organizers are optimistic that AI can improve two-way communication between humans and animals.

Laela Sayigh, a biologist at the Woods Hole Oceanographic Institution in Massachusetts, and her team have been studying a community of some 170 bottlenose dolphins in Sarasota Bay, Florida, for decades. Since 1984, they have been recording sounds from individual dolphins by attaching suction-cup microphones to the animals. This has allowed the team, which won this year's prize, to catalogue most of the dolphins' "signature whistles" -- sounds that function like names and help individuals to identify one another.

About half of the whistles produced by dolphins are of this type, Sayigh says. The other half, called non-signature whistles, have received little attention because they are harder to study. "Since we know the signature whistles of most dolphins in the Sarasota community, we had a unique research opportunity," she says. "We have now identified more than 20 repeated, shared non-signature whistle types."

By playing back some of these shared whistles and observing the dolphins' reactions, the scientists are beginning to uncover their meaning. For instance, the dolphins approached speakers playing some sounds, suggesting these functioned as a way of initiating contact, but swam away when the speakers played others, suggesting these functioned as a type of alarm. The researchers plan to continue to expand the database of dolphin vocabulary and to explore what different sounds might mean.
"We really hope that this prize can push the use of AI in order to reveal even more impressive future results using this immense data set," says Yovel.

Among the three other finalists was a team running a project that revealed that two species of cuttlefish (Sepia officinalis and Sepia bandensis) use distinct arm-wave gestures to communicate with their peers. AI analysis allowed the researchers to identify four arm-wave signs used by the animals: up, side, roll and crown. When watching videos of peers displaying those signs, cuttlefish waved back at the screen. And when the scientists played the sounds generated by arm gestures, the cuttlefish waved back using the same gestures.

Another finalist group used AI to break down the whistles of common nightingales (Luscinia megarhynchos) into syllables to analyse the songs' structure and syntax. The researchers aim to decode the meanings behind the whistles and generate sounds to communicate with the birds in their own 'language'.

The fourth team to reach the finals ran a study that revealed that monkeys called common marmosets (Callithrix jacchus) can name each other. The animals make specific sounds to refer to each of their peers, a behaviour that had previously been seen only in humans, dolphins and elephants.
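Neither article describes the team's actual algorithms, but the idea of using AI classification to expand a whistle lexicon can be illustrated with a toy unsupervised grouping. The sketch below is a minimal plain-Python k-means over invented two-dimensional "acoustic features"; a real pipeline would work on spectrogram embeddings of thousands of labeled recordings, and every number here is made up for illustration.

```python
# Toy sketch: grouping whistle recordings into candidate "types" by
# clustering acoustic features. The Sarasota team's real methods are far
# more sophisticated; features and cluster count here are invented.

def dist2(a, b):
    """Squared Euclidean distance between two feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def mean(group):
    """Component-wise mean of a non-empty list of feature vectors."""
    n = len(group)
    return [sum(p[i] for p in group) / n for i in range(len(group[0]))]

def kmeans(points, k, iters=20):
    """Tiny k-means over lists of floats (no external dependencies)."""
    centroids = points[:k]  # naive init: first k points
    for _ in range(iters):
        # Assign each point to its nearest centroid.
        groups = [[] for _ in range(k)]
        for p in points:
            idx = min(range(k), key=lambda i: dist2(p, centroids[i]))
            groups[idx].append(p)
        # Recompute each centroid as the mean of its group.
        centroids = [mean(g) if g else centroids[i]
                     for i, g in enumerate(groups)]
    return centroids, groups

# Invented 2-D features (say, mean frequency in kHz and duration in s)
# for six whistles: three of one candidate type, three of another.
whistles = [[9.1, 0.4], [9.0, 0.5], [8.9, 0.4],
            [4.2, 1.1], [4.0, 1.2], [4.1, 1.0]]
centroids, groups = kmeans(whistles, k=2)
print([len(g) for g in groups])  # → [3, 3]: the two toy types separate
```

The point of the sketch is only the workflow: extract features per recording, cluster, then treat recurring clusters as candidate whistle "types" for a human expert to verify, which is where the zoological know-how Yovel mentions comes back in.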
[2]
'About as close to aliens as we'll ever get.' Can AI crack animal language?
Can a robot arm wave hello to a cuttlefish -- and get a hello back? Could a dolphin's whistle actually mean "Where are you?" And are monkeys quietly naming each other while we fail to notice? These are just a few of the questions tackled by the finalists for this year's Dolittle prize, a $100,000 award recognizing early breakthroughs in artificial intelligence (AI)-powered interspecies communication.

The winning project -- announced today -- explores how dolphins use shared, learned whistles that may carry specific meanings -- possibly even warning each other about danger, or just expressing confusion. The other contending teams -- working with marmosets, cuttlefish, and nightingales -- are also pushing the boundaries of what human-animal communication might look like.

The prize marks an important milestone in the Coller Dolittle Challenge, a 5-year competition offering up to $10 million to the first team that can achieve genuine two-way communication with animals. "Part of how this initiative was born came from my skepticism," says Yossi Yovel, a neuroecologist at Tel Aviv University and one of the prize's organizers. "But we really have much better tools now. So this is the time to revisit a lot of our previous assumptions about two-way communication within the animal's own world."

Science caught up with the four finalists to hear how close we really are to cracking the animal code. This interview has been edited for clarity and length.

Cuttlefish (Sepia officinalis and S. bandensis) lack ears and voices, but they apparently make up for this with a kind of sign language. When shown videos of comrades waving their arms, they wave back.

Q: Why study cuttlefish?

Peter Neri, a computational neuroscientist at the École Normale Supérieure and co-lead on the cuttlefish project: Cuttlefish are about as close to aliens as we'll ever get. They're invertebrates with astonishing behavioral complexity, and their [ability to] sense water vibrations works surprisingly like our own hearing. Our cochlear fluid and inner ear hair cells are essentially an adaptation of that same ancient system. In a way, we're still carrying a bit of seawater in our heads.

Nightingales (Luscinia megarhynchos) boast an astonishing vocal range, with individual repertoires of up to 200 distinct songs. These small, plain-looking birds reply to scientists' artificial whistles by matching pitch and timing.

Q: How is AI helping decode nightingale songs?

Jan Clemens, a neurologist at the European Neuroscience Institute and co-lead on the nightingale project: Nightingale songs are built from unique syllables. Our AI system can group those syllables by sound structure, which would take months to do by hand. It's opening up patterns we just couldn't see before.

Q: How does this work for species that don't vocalize?

Sophie Cohen-Bodénès, a behavioral biologist at Washington University in St. Louis and co-lead on the cuttlefish project: We first trained AI to detect color patterns on [cuttlefish] skin -- spots, stripes, and camouflaging. That gave us the foundation to start decoding their gestures, too. Now, we're using it to analyze wave signs and figure out what they might be saying with their arms.

Dolphins (Tursiops truncatus) are renowned for their complex communication. They use "baby talk" with their calves and call one another with unique signature whistles -- the equivalent of human names.

Q: Can AI help humans eavesdrop on these conversations?

Laela Sayigh, a biologist at the Woods Hole Oceanographic Institution and co-lead on the dolphin project: The [dolphins] we study in Florida have been tracked for decades. That gives us labeled recordings from known individuals, which is essential for figuring out whether their shared whistles might work like words. Right now we're using AI to classify whistles, but we hope the algorithms will also help us track dolphins across the bay in near-real time, linking the sounds they make to where they are and who they're with.

Marmosets (Callithrix jacchus) -- rat-size New World monkeys -- communicate in loud, high-pitched shrills, about as piercing as a smoke alarm at close range. Like dolphins, they appear to have "names," signified by unique screeches.

Q: Are you finding evidence for anything like human communication in these animals?

David Omer, a neuroscientist at the Hebrew University of Jerusalem and leader on the marmoset project: We've just started working on a language model for marmoset communication. The idea is to predict what call comes next based on the ones that came before -- like how a machine-learning language model might predict the next word in a sentence. If we find consistent structure, that could hint at something grammarlike.

S.C.: We're building a soft robotic arm that can perform the same gestures cuttlefish make. The idea is to let it "talk" to them through movement, and see whether they respond in real time -- like a back-and-forth conversation in their own language.

Q: How close are we really to cracking animal language?

D.O.: If you mean full-on conversations, we're not there. But if you mean meaningful, structured exchanges in the animal's own world? That could be just a few years off.

Daniela Vallentin, a behavioral neuroscientist at the Max Planck Institute for Biological Intelligence and co-lead on the nightingale project: I don't think animals speak in sentences, but I do think they express ideas. "I'm here." "I'm feeling good." Or "I'm not." That's the level I think we're close to understanding.

Q: Were there any moments in your research that made you stop and say, "Wait -- did that just happen?"

L.S.: Male [dolphins] form pairs and call each other's [signature] whistles if they get separated. But once, we were just testing our equipment and played one of those whistles while the pair was still together. They responded with a totally different whistle -- one we hadn't documented before. We've since heard it in other confusing situations. We call it the "WTF whistle," because it really did seem like that's what they were asking.

S.C.: Oh definitely. I was running our experiment with a smaller cuttlefish species, and one escaped into a big tank. I thought she was gone. But when I launched the experiment again -- playing a video of wave signs -- she came back, climbed onto a rock, stared at me, and started waving. It was like she knew what we were doing and wanted to join in.
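Omer's "predict the next call" idea can be sketched at toy scale as a bigram model over symbolic call labels. "Phee" and "trill" are real marmoset call categories, but the sequences below are invented for illustration; the actual project presumably works on learned acoustic representations rather than hand-assigned labels.

```python
# Toy sketch of next-call prediction for marmoset vocalizations:
# a bigram model that counts which call tends to follow which.
# High predictability in real data would hint at grammar-like structure.

from collections import Counter, defaultdict

def train_bigrams(sequences):
    """For each call label, count the calls observed to follow it."""
    follows = defaultdict(Counter)
    for seq in sequences:
        for a, b in zip(seq, seq[1:]):
            follows[a][b] += 1
    return follows

def predict_next(follows, call):
    """Return the most frequent follower of `call`, or None if unseen."""
    return follows[call].most_common(1)[0][0] if follows[call] else None

# Hypothetical recording sessions transcribed as call-label sequences.
sessions = [
    ["phee", "phee", "trill", "twitter"],
    ["phee", "trill", "twitter", "phee"],
    ["trill", "twitter", "phee"],
]
model = train_bigrams(sessions)
print(predict_next(model, "trill"))  # → twitter
```

A modern language model replaces the bigram counts with a neural network conditioned on the whole preceding sequence, but the underlying question is the same one Omer poses: given the calls so far, how well can the next one be predicted?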
A new contest offers substantial prizes for AI-driven advancements in interspecies communication, with researchers making progress in decoding dolphin, cuttlefish, nightingale, and marmoset languages.
In a groundbreaking initiative, the Coller Dolittle Challenge is offering substantial prizes to spur progress in AI-powered interspecies communication. The contest, organized by the Jeremy Coller Foundation in partnership with Tel Aviv University, aims to achieve a "breakthrough" in understanding and potentially conversing with animals [1].
The challenge offers a grand prize of US$500,000 in cash or US$10 million in investments for achieving two-way, multi-context communication with animals using their own signals. Additionally, an annual prize of US$100,000 is awarded for research that advances our understanding of animal communication through AI [1].
The inaugural annual prize was awarded to a team led by Laela Sayigh from the Woods Hole Oceanographic Institution. Their research focuses on decoding the complex whistles of bottlenose dolphins in Sarasota Bay, Florida [1].
Key findings include:
Cuttlefish Communication: Two cuttlefish species (Sepia officinalis and Sepia bandensis) use at least four distinct arm-wave signs -- up, side, roll and crown -- and wave back when shown videos of peers making those gestures [1].
Nightingale Song Analysis: AI is being used to break common nightingale songs into syllables so that researchers can analyse their structure and syntax, with the eventual aim of communicating with the birds in their own 'language' [1].
Marmoset Naming Behavior: Common marmosets make specific calls to refer to individual peers, a behaviour previously documented only in humans, dolphins and elephants [1].
AI is revolutionizing the field by:
- Processing large volumes of animal-communication recordings, such as bird whistles or wolf howls
- Revealing patterns that hint at the meaning of animal sounds and gestures
- Classifying vocalizations and gestures at a scale that would take months by hand
However, researchers emphasize that zoological expertise remains crucial in interpreting results and designing effective studies [1].
While full-fledged conversations with animals remain a distant goal, researchers are optimistic about achieving meaningful, structured exchanges in the near future. The focus is on understanding simple concepts like presence, emotional states, and basic needs within the animal's own communication framework [2].
As AI continues to advance, it promises to unlock new insights into the complex world of animal communication, potentially revolutionizing our understanding of and interaction with other species.