4 Sources
[1]
Can AI help us talk to dolphins? The race is now on
Will artificial intelligence (AI) finally allow us to talk to animals? To spur progress towards that dream, a contest is offering a US$500,000 cash prize for AI use that achieves a "breakthrough" in interspecies communication, as well as an annual award of US$100,000 until the grand prize is claimed.

The first annual prize in the contest, the Coller Dolittle Challenge, was awarded yesterday to researchers who are planning to apply AI to some four decades' worth of recordings of bottlenose dolphins (Tursiops truncatus). The team has already identified more than 20 distinct sounds in the dolphin 'vocabulary' and plans to incorporate AI classification methods to expand this lexicon.

The contest, which launched a year ago, is organized by the Jeremy Coller Foundation in London -- which funds projects related to animal welfare, among other things -- in partnership with Tel Aviv University in Israel. Annual prizes will be awarded for research that uses AI to improve scientists' understanding of how animals communicate. But "the ultimate goal is indeed what we call two-way, multi-context communication with an animal using the animal's own signals", says Yossi Yovel, a neuroecologist at Tel Aviv University and chair of the award's scientific committee. The grand prize is worth either $500,000 in cash or $10 million in investments; the qualifying criteria will be established in the next year or so, Yovel says.

AI has boosted researchers' ability to process large amounts of animal-communication data, such as recordings of bird whistles or wolf howls. The technology has also made it easier for scientists to look for patterns that hint at the meaning of animal sounds, says Yovel. Together with advances in AI for human communication, this has fuelled interest in using AI tools to decode animal communication. But researchers say they have yet to see an AI-based revolution in the field. "You still need a lot of zoological know-how.
Thinking that you can just put a camera somewhere, record an animal and automatically detect something -- the chances for this being useful are very low," says Yovel.

There's been a lot of hype about applying large language models to animal communication systems, says Arik Kershenbaum, a zoologist at the University of Cambridge, UK, who is not involved in the contest. That approach hasn't yet been proved, but contest organizers are optimistic that AI can improve two-way communication between humans and animals.

Laela Sayigh, a biologist at the Woods Hole Oceanographic Institution in Massachusetts, and her team have been studying a community of some 170 bottlenose dolphins in Sarasota Bay, Florida, for decades. Since 1984, they have been recording sounds from individual dolphins by attaching suction-cup microphones to the animals. This has allowed the team, which won this year's prize, to catalogue most of the dolphins' "signature whistles" -- sounds that function like names and help individuals to identify one another. About half of the whistles produced by dolphins are of this type, Sayigh says.

The other half, called non-signature whistles, have received little attention because they are harder to study. "Since we know the signature whistles of most dolphins in the Sarasota community, we had a unique research opportunity," she says. "We have now identified more than 20 repeated, shared non-signature whistle types."

By playing back some of these shared whistles and observing the dolphins' reactions, the scientists are beginning to uncover their meaning. For instance, the dolphins approached speakers playing some sounds, suggesting these functioned as a way of initiating contact, but swam away when the speakers played others, suggesting these functioned as a type of alarm. The researchers plan to continue to expand the database of dolphin vocabulary and to explore what different sounds might mean.
"We really hope that this prize can push the use of AI in order to reveal even more impressive future results using this immense data set," says Yovel.

Among the three other finalists was a team running a project that revealed that two species of cuttlefish (Sepia officinalis and Sepia bandensis) use distinct arm-wave gestures to communicate with their peers. AI analysis allowed the researchers to identify four arm-wave signs used by the animals: up, side, roll and crown. When watching videos of peers displaying those signs, cuttlefish waved back at the screen. And when the scientists played the sounds generated by arm gestures, the cuttlefish waved back using the same gestures.

Another finalist group used AI to break down the whistles of common nightingales (Luscinia megarhynchos) into syllables to analyse the songs' structure and syntax. The researchers aim to decode the meanings behind the whistles and generate sounds to communicate with the birds in their own 'language'.

The fourth team to reach the finals ran a study that revealed that monkeys called common marmosets (Callithrix jacchus) can name each other. The animals make specific sounds to refer to each of their peers, a behaviour that had previously been seen only in humans, dolphins and elephants.
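The whistle-classification step described above can be pictured, very roughly, as matching a new whistle's pitch contour against a catalogue of known whistle types. The sketch below is purely illustrative and is not the Sarasota team's actual pipeline: the contour values, catalogue labels, and nearest-template matching are all invented for the example.

```python
# Minimal sketch (hypothetical data): classify a whistle by comparing its
# pitch contour to a catalogue of known whistle templates, using Euclidean
# distance between contours resampled to a fixed length.

def resample(contour, n=8):
    """Crudely resample a pitch contour to n evenly spaced points."""
    m = len(contour)
    return [contour[min(int(i * m / n), m - 1)] for i in range(n)]

def distance(a, b):
    """Euclidean distance between two equal-length contours."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def classify(contour, catalogue):
    """Return the label of the catalogue template nearest to the contour."""
    q = resample(contour)
    return min(catalogue, key=lambda name: distance(q, resample(catalogue[name])))

# Hypothetical whistle templates (frequency in kHz over time).
catalogue = {
    "whistle_type_A": [5, 7, 9, 11, 13, 15, 14, 12],  # rising-then-falling sweep
    "whistle_type_B": [14, 12, 10, 8, 6, 5, 5, 5],    # falling sweep
}

label = classify([5.2, 7.1, 9.3, 11.0, 12.8, 15.2, 13.9, 12.1], catalogue)
```

Real systems would work from spectrogram features and learned embeddings rather than raw contours, but the nearest-template idea is the same: a new sound is assigned to whichever known category it most resembles.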
[2]
'About as close to aliens as we'll ever get.' Can AI crack animal language?
Can a robot arm wave hello to a cuttlefish -- and get a hello back? Could a dolphin's whistle actually mean "Where are you?" And are monkeys quietly naming each other while we fail to notice? These are just a few of the questions tackled by the finalists for this year's Dolittle prize, a $100,000 award recognizing early breakthroughs in artificial intelligence (AI)-powered interspecies communication.

The winning project -- announced today -- explores how dolphins use shared, learned whistles that may carry specific meanings -- possibly even warning each other about danger, or just expressing confusion. The other contending teams -- working with marmosets, cuttlefish, and nightingales -- are also pushing the boundaries of what human-animal communication might look like.

The prize marks an important milestone in the Coller Dolittle Challenge, a 5-year competition offering up to $10 million to the first team that can achieve genuine two-way communication with animals. "Part of how this initiative was born came from my skepticism," says Yossi Yovel, a neuroecologist at Tel Aviv University and one of the prize's organizers. "But we really have much better tools now. So this is the time to revisit a lot of our previous assumptions about two-way communication within the animal's own world."

Science caught up with the four finalists to hear how close we really are to cracking the animal code. This interview has been edited for clarity and length.

Cuttlefish (Sepia officinalis and S. bandensis) lack ears and voices, but they apparently make up for this with a kind of sign language. When shown videos of comrades waving their arms, they wave back.

Q: Why study cuttlefish?

Peter Neri, a computational neuroscientist at the Ecole Normale Supérieure and co-lead on the cuttlefish project: Cuttlefish are about as close to aliens as we'll ever get.
They're invertebrates with astonishing behavioral complexity, and their [ability to] sense water vibrations works surprisingly like our own hearing. Our cochlear fluid and inner ear hair cells are essentially an adaptation of that same ancient system. In a way, we're still carrying a bit of seawater in our heads.

Nightingales (Luscinia megarhynchos) boast an astonishing vocal range, with individual repertoires of up to 200 distinct songs. These small, plain-looking birds reply to scientists' artificial whistles by matching pitch and timing.

Q: How is AI helping decode nightingale songs?

Jan Clemens, a neurologist at the European Neuroscience Institute and co-lead on the nightingale project: Nightingale songs are built from unique syllables. Our AI system can group those syllables by sound structure, which would take months to do by hand. It's opening up patterns we just couldn't see before.

Q: How does this work for species that don't vocalize?

Sophie Cohen-Bodénès, a behavioral biologist at Washington University in St. Louis and co-lead on the cuttlefish project: We first trained AI to detect color patterns on [cuttlefish] skin -- spots, stripes, and camouflaging. That gave us the foundation to start decoding their gestures, too. Now, we're using it to analyze wave signs and figure out what they might be saying with their arms.

Dolphins (Tursiops truncatus) are renowned for their complex communication. They use "baby talk" with their calves and call one another with unique signature whistles -- the equivalent of human names.

Q: Can AI help humans eavesdrop on these conversations?

Laela Sayigh, a biologist at the Woods Hole Oceanographic Institution and co-lead on the dolphin project: The [dolphins] we study in Florida have been tracked for decades. That gives us labeled recordings from known individuals, which is essential for figuring out whether their shared whistles might work like words.
Right now we're using AI to classify whistles, but we hope the algorithms will also help us track dolphins across the bay in near-real time, linking the sounds they make to where they are and who they're with.

Marmosets (Callithrix jacchus) -- rat-size New World monkeys -- communicate in loud, high-pitched shrills, about as piercing as a smoke alarm at close range. Like dolphins, they appear to have "names," signified by unique screeches.

Q: Are you finding evidence for anything like human communication in these animals?

David Omer, a neuroscientist at the Hebrew University of Jerusalem and leader on the marmoset project: We've just started working on a language model for marmoset communication. The idea is to predict what call comes next based on the ones that came before -- like how a machine-learning language model might predict the next word in a sentence. If we find consistent structure, that could hint at something grammarlike.

S.C.: We're building a soft robotic arm that can perform the same gestures cuttlefish make. The idea is to let it "talk" to them through movement, and see whether they respond in real time -- like a back-and-forth conversation in their own language.

Q: How close are we really to cracking animal language?

D.O.: If you mean full-on conversations, we're not there. But if you mean meaningful, structured exchanges in the animal's own world? That could be just a few years off.

Daniela Vallentin, a behavioral neuroscientist at the Max Planck Institute for Biological Intelligence and co-lead on the nightingale project: I don't think animals speak in sentences, but I do think they express ideas. "I'm here." "I'm feeling good." Or "I'm not." That's the level I think we're close to understanding.

Q: Were there any moments in your research that made you stop and say, "Wait -- did that just happen?"

L.S.: Male [dolphins] form pairs and call each other's [signature] whistles if they get separated.
But once, we were just testing our equipment and played one of those whistles while the pair was still together. They responded with a totally different whistle -- one we hadn't documented before. We've since heard it in other confusing situations. We call it the "WTF whistle," because it really did seem like that's what they were asking.

S.C.: Oh definitely. I was running our experiment with a smaller cuttlefish species, and one escaped into a big tank. I thought she was gone. But when I launched the experiment again -- playing a video of wave signs -- she came back, climbed onto a rock, stared at me, and started waving. It was like she knew what we were doing and wanted to join in.
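Omer's description of a language model for marmoset calls, predicting the next call from the ones that came before, can be illustrated with a toy bigram model. The call labels and training sequence below are invented for the example; a real model would be trained on large corpora of recorded calls.

```python
from collections import Counter, defaultdict

# Toy bigram model over a hypothetical sequence of marmoset call labels:
# count which call tends to follow which, then predict the most frequent
# follower, the same next-token idea a text language model uses at scale.
calls = ["phee", "trill", "phee", "twitter", "phee", "trill", "phee", "trill"]

counts = defaultdict(Counter)
for prev, nxt in zip(calls, calls[1:]):
    counts[prev][nxt] += 1

def predict_next(call):
    """Return the call that most often follows `call` in the training data."""
    return counts[call].most_common(1)[0][0]

prediction = predict_next("phee")
```

If such a model predicts real call sequences much better than chance, that is the kind of "consistent structure" Omer suggests could hint at something grammarlike.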
[3]
Want to Speak to Dolphins? Researchers Won $100,000 AI Prize Studying Their Whistling
A team of scientists studying a community of Florida dolphins has been awarded the first $100,000 Coller Dolittle Challenge prize, set up to award research in interspecies communication algorithms. The US-based team, led by Laela Sayigh of the Woods Hole Oceanographic Institution, found that a type of whistle that dolphins employ is used as an alarm. Another whistle they studied is used by dolphins to respond to unexpected or unfamiliar situations.

The team used non-invasive hydrophones to perform the research, which provides evidence that dolphins may be using whistles like words, shared with multiple members of their communities. Capturing the sounds is just the beginning: researchers will use AI to continue deciphering the whistles to try to find more patterns.

"The main thing stopping us cracking the code of animal communication is a lack of data. Think of the 1 trillion words needed to train a large language model like ChatGPT. We don't have anything like this for other animals," said Jonathan Birch, a professor at the London School of Economics and Political Science and one of the judges for the prize. "That's why we need programs like the Sarasota Dolphin Research Program, which has built up an extraordinary library of dolphin whistles over 40 years. The cumulative result of all that work is that Laela Sayigh and her team can now use deep learning to analyse the whistles and perhaps, one day, crack the code," he said.

The award was part of a ceremony honoring the work of four teams from across the world. In addition to the dolphin project, researchers studied ways in which nightingales, marmoset monkeys and cuttlefish communicate. The challenge is a collaboration between the Jeremy Coller Foundation and Tel Aviv University. Submissions for next year open up in August.
[4]
Want to Speak to Dolphins? Researchers Won $100,000 AI Prize Studying Their Whistling
A team of scientists studying a community of Florida dolphins has been awarded the first $100,000 Coller Dolittle Challenge prize, set up to award research in interspecies communication algorithms. The US-based team, led by Laela Sayigh of the Woods Hole Oceanographic Institution, found that a type of whistle that dolphins employ is used as an alarm. Another whistle they studied is used by dolphins to respond to unexpected or unfamiliar situations.

The team used non-invasive hydrophones to perform the research, which provides evidence that dolphins may be using whistles like words, shared with multiple members of their communities. Capturing the sounds is just the beginning. Researchers will use AI to continue deciphering the whistles to try to find more patterns.

"The main thing stopping us cracking the code of animal communication is a lack of data. Think of the 1 trillion words needed to train a large language model like ChatGPT. We don't have anything like this for other animals," said Jonathan Birch, a professor at the London School of Economics and Political Science and one of the judges for the prize. "That's why we need programs like the Sarasota Dolphin Research Program, which has built up an extraordinary library of dolphin whistles over 40 years. The cumulative result of all that work is that Laela Sayigh and her team can now use deep learning to analyse the whistles and perhaps, one day, crack the code," he said.

The award was part of a ceremony honoring the work of four teams from across the world. In addition to the dolphin project, researchers studied ways in which nightingales, marmoset monkeys and cuttlefish communicate. The challenge is a collaboration between the Jeremy Coller Foundation and Tel Aviv University. Submissions for next year open up in August.

Researching animals and trying to learn the secrets of their communication is nothing new; but AI is speeding up the creation of larger and larger datasets.
"Breakthroughs are inevitable," says Kate Zacarian, CEO and co-founder of Earth Species Project, a California-based nonprofit that also works on breaking down language barriers with the animal world. "Just as AI has revolutionized the fields of medicine and material science, we see a similar opportunity to bring those advances to the study of animal communication and empower researchers in this space with entirely new capabilities," Zacarian said.

Zacarian applauded Sayigh's team and their win and said it will help bring broader recognition to the study of non-human animal communication. It could also bring more attention to ways that AI can change the nature of this type of research. "The AI systems aren't just faster -- they allow for entirely new types of inquiry," she said. "We're moving from decoding isolated signals to exploring communication as a rich, dynamic, and structured phenomenon -- which is a task that's simply too big for our human brains, but possible for large-scale AI models."

Earth Species recently released an open-source large audio language model for analyzing animal sounds called NatureLM-audio. The organization is currently working with biologists and ethologists to study species including carrion crows, orcas, jumping spiders and others, and plans to release some of their findings later this year, Zacarian said.
Scientists studying dolphin communication win the first Coller Dolittle Challenge prize, showcasing AI's potential in decoding animal languages and advancing interspecies communication.
In a groundbreaking development for interspecies communication, a team of researchers led by Laela Sayigh from the Woods Hole Oceanographic Institution has been awarded the first $100,000 Coller Dolittle Challenge prize. The award recognizes their innovative use of artificial intelligence (AI) in decoding dolphin communication [1].
The Coller Dolittle Challenge, a collaboration between the Jeremy Coller Foundation and Tel Aviv University, aims to spur progress in interspecies communication. The contest offers an annual prize of $100,000 and a grand prize of either $500,000 in cash or $10 million in investments for achieving a "breakthrough" in this field [2].
Sayigh's team has been studying a community of bottlenose dolphins in Sarasota Bay, Florida, for decades. Their research has revealed that dolphins use distinct whistles for various purposes: some appear to function as alarms, while others are produced in response to unexpected or unfamiliar situations. The team has identified more than 20 distinct sounds in the dolphin 'vocabulary' and plans to use AI classification methods to expand this lexicon further [2].
Jonathan Birch, a professor at the London School of Economics and Political Science and one of the prize judges, emphasized the importance of large datasets in cracking the code of animal communication. He stated, "The main thing stopping us cracking the code of animal communication is a lack of data. Think of the 1 trillion words needed to train a large language model like ChatGPT. We don't have anything like this for other animals" [1].
The Coller Dolittle Challenge also recognized three other finalist teams:
Cuttlefish communication: Researchers discovered that cuttlefish use distinct arm-wave gestures to communicate, identifying four specific signs: up, side, roll, and crown [2].
Nightingale whistles: A team used AI to break down nightingale whistles into syllables, analyzing the songs' structure and syntax to decode their meanings [2].
Marmoset naming: Scientists found evidence that common marmosets can name each other, a behavior previously observed only in humans, dolphins, and elephants [2].
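The syllable analysis described for the nightingale project amounts to unsupervised grouping of sounds by acoustic structure. A toy sketch of that idea follows; the (duration, pitch) features and their values are invented stand-ins for the spectrogram embeddings a real pipeline would use, and the clustering is a deliberately minimal k-means.

```python
# Minimal sketch (hypothetical features): group song syllables by acoustic
# similarity. Each syllable is summarized as (duration_ms, mean_pitch_khz),
# and a tiny k-means pass collects similar syllables into clusters.

def kmeans(points, k=2, iters=20):
    """Very small k-means: naive init from the first k points."""
    centroids = points[:k]
    clusters = []
    for _ in range(iters):
        # Assign each point to its nearest centroid.
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k),
                    key=lambda j: sum((a - b) ** 2 for a, b in zip(p, centroids[j])))
            clusters[i].append(p)
        # Recompute centroids as cluster means (keep old centroid if empty).
        centroids = [
            tuple(sum(c) / len(cl) for c in zip(*cl)) if cl else centroids[i]
            for i, cl in enumerate(clusters)
        ]
    return clusters

# Hypothetical syllables: three short low-pitched, three long high-pitched.
syllables = [(40, 2.1), (42, 2.0), (120, 6.5), (118, 6.4), (45, 2.2), (125, 6.7)]
clusters = kmeans(syllables)
```

Grouping thousands of syllables this way, as Clemens notes, replaces months of manual sorting and exposes repertoire structure that would otherwise stay hidden.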
Kate Zacarian, CEO and co-founder of Earth Species Project, believes that AI is revolutionizing the field of animal communication research. She stated, "The AI systems aren't just faster -- they allow for entirely new types of inquiry. We're moving from decoding isolated signals to exploring communication as a rich, dynamic, and structured phenomenon" [4].
As AI continues to advance, researchers are optimistic about the potential for breakthrough discoveries in interspecies communication. The Coller Dolittle Challenge, with its annual prizes and grand prize incentive, is poised to drive further innovation in this exciting field of study.
Summarized by Navi