Curated by THEOUTPOST
On Mon, 14 Oct, 8:01 AM UTC
5 Sources
[1]
AI is bringing the dodo back to life (kinda) at a museum in Cambridge
A museum experiment in Cambridge is allowing visitors to have a conversation with a dodo, and other dead animals, with the help of artificial intelligence. The Mauritius dodo, a bird last seen in the 17th century, is once again "speaking" at the Cambridge Museum of Zoology, thanks to AI technology. Although the bird has been extinct since the 1680s, visitors can now engage in real-time conversations with a virtual version of it through their smart devices.

The AI-driven experience allows the dodo to answer questions about its life, including how it became extinct after sailors arrived on its native island of Mauritius. It can even explore more ethical questions, such as whether it would like to be cloned back into existence by scientists.

Jack Ashby, assistant director of the Museum of Zoology Cambridge, thinks AI provides a new way for visitors to interact with its exhibits. He says: "Museums generally choose what to tell people, but in this way they can ask whatever they like and that's really, really valuable, I think. They can have an actual conversation with an animal, with a specimen, and I think it brings it to life in a really different way than a normal museum exhibit might."

There are 12 other animal specimens featured in the project, each with its own unique voice. The platypus has a poetic turn of phrase as it describes its life in the water, and the giant skeleton of a Megatherium has a female voice. Other animals featured in the AI experience include a narwhal, a butterfly, a fin whale and even a cockroach.

Jack Ashby says: "When I started working with the Nature Perspectives platform, I was just asking factual questions. But because the animal's personality comes across really quickly, you end up having an actual conversation where you're asking more about feelings, you're asking anything fun. You could ask how its day was or what it had for breakfast."

To avoid anthropomorphising the animals, the AI's creators tried to give each one an authentic voice. A dodo isn't going to be a Taylor Swift fan, of course; instead, it offers responses consistent with its environment.

The AI technology is provided by Nature Perspectives, an international tech-education company founded by Cambridge graduates who studied together on a master's in Conservation Leadership. Speaking from Tel Aviv, co-founder Gal Zadir explains the technology behind the project: "So the simulations are very flexible. We built what we call a digital mind, a simulation of a specific individual that includes personality traits, as much as we can of the known science about its evolutionary adaptations and so on, alongside a memory bank of things that it could possibly have experienced during its lifetime."

Ashby also emphasizes the inclusivity of the technology, noting that it adapts its responses based on the user's age and language: "So people can tell the animal what age they are. It speaks back to them at a kind of age-appropriate level. It speaks in 20 different languages. So whoever you are, you can come and ask a question. You can type in your questions, or you can use voice. So yeah, it's very inclusive."

To ensure appropriate usage, the museum has implemented "guardrails" such as profanity filters. The AI experience will be available from 15 October to 15 November 2024, after which the museum will assess its success and the nature of the conversations it inspired.
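To make Zadir's description concrete, here is a minimal sketch, assuming a conventional prompt-based language model, of how a specimen's "digital mind" might combine personality traits, species science and a memory bank into a single instruction block. The class and function names, and all example data, are invented for illustration and are not Nature Perspectives' actual implementation.

```python
# A minimal, hypothetical sketch of assembling a specimen's "digital mind" into a
# system prompt for a general-purpose language model. All names and example data
# here are invented for illustration; this is not Nature Perspectives' code.
from dataclasses import dataclass

@dataclass
class SpecimenPersona:
    name: str                      # e.g. "Mauritius dodo"
    personality_traits: list[str]  # tone and character of the voice
    species_science: list[str]     # known facts about the species' adaptations
    memory_bank: list[str]         # plausible experiences from the individual's lifetime

def build_system_prompt(p: SpecimenPersona) -> str:
    """Combine traits, known science and 'memories' into one instruction block."""
    return (
        f"You are the {p.name}, a museum specimen speaking in the first person.\n"
        f"Personality: {', '.join(p.personality_traits)}.\n"
        f"Known science about your species: {'; '.join(p.species_science)}.\n"
        f"Memories from your lifetime: {'; '.join(p.memory_bank)}.\n"
        "Stay consistent with your era and environment; politely decline topics "
        "the animal could not plausibly know about."
    )

dodo = SpecimenPersona(
    name="Mauritius dodo",
    personality_traits=["curious", "gently wry"],
    species_science=["flightless pigeon relative", "strong curved beak for cracking hard fruit"],
    memory_bank=["foraging for tambalacoque fruit", "the arrival of sailors in the 1600s"],
)
print(build_system_prompt(dodo))
```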
[2]
AI gives voice to dead animals in Cambridge exhibition
Creatures can converse and share their stories by voice or text through visitors' mobile phones at the Museum of Zoology

If the pickled bodies, partial skeletons and stuffed carcasses that fill museums seem a little, well, quiet, fear not. In the latest coup for artificial intelligence, dead animals are to receive a new lease of life to share their stories - and even their experiences of the afterlife.

More than a dozen exhibits, ranging from an American cockroach and the remnants of a dodo, to a stuffed red panda and a fin whale skeleton, will be granted the gift of conversation on Tuesday for a month-long project at Cambridge University's Museum of Zoology. Equipped with personalities and accents, the dead creatures and models can converse by voice or text through visitors' mobile phones. The technology allows the animals to describe their time on Earth and the challenges they faced, in the hope of reversing apathy towards the biodiversity crisis.

"Museums are using AI in a lot of different ways, but we think this is the first application where we're speaking from the object's point of view," said Jack Ashby, the museum's assistant director. "Part of the experiment is to see whether, by giving these animals their own voices, people think differently about them. Can we change the public perception of a cockroach by giving it a voice?"

The project was devised by Nature Perspectives, a company that is building AI models to help strengthen the connection between people and the natural world. For each exhibit, the AI is fed specific details on where the specimen lived, its natural environment, and how it arrived in the collection, alongside all the available information on the species it represents. The exhibits change their tone and language to suit the age of the person they are talking to, and can converse in more than 20 languages, including Spanish and Japanese. The platypus has an Australian twang, the red panda is subtly Himalayan, and the mallard sounds like a Brit.

Through live conversations with the exhibits, Ashby hopes visitors will learn more than can fit on the labels that accompany the specimens. As part of the project, the conversations that visitors hold with the exhibits will be analysed to get a better picture of the information people want on specimens. While the AI suggests a number of questions, such as asking the fin whale "tell me about life in the open ocean", visitors can ask whatever they like.

"When you talk to these animals, they really come across as personalities, it's a very strange experience," Ashby said. "I started by asking things like 'where did you live?' and 'how did you die?', but ended up with far more human questions."

Asked what it used to eat, the museum's dodo, one of the most complete specimens in the world, described its Mauritian diet of fruits, seeds and the occasional small invertebrate, explaining how its strong, curved beak was perfect for cracking open the tough fruits of the tambalacoque tree. The AI-enhanced exhibit also shared its views on whether humans should attempt to bring the species back through cloning. "Even with advanced techniques, the dodo's return would require not just our DNA but the delicate ecosystem of Mauritius that supported our kind," it said. "It's a poignant reminder that the true essence of any life goes beyond the genetic code - it's intricately woven into its natural habitat."

The fin whale skeleton, which hangs from the museum roof, was granted a similar level of apparent thoughtfulness.
Asked about the most famous person it had met, it conceded that while alive it did not have the chance to meet "famous" individuals as humans see them. "However," the AI-powered skeleton continued, "I like to think that anyone who stands below me and feels awe, reverence and love for the natural world is someone of significance."
[3]
Cambridge uses AI to bring extinct animals to life at exhibit
This unique month-long project, starting on Tuesday, allows more than a dozen museum exhibits, including an American cockroach, a red panda, a fin whale skeleton, and even the remnants of a dodo, to "speak" to visitors. The AI technology enables these long-dead creatures to share their stories and experiences, aiming to foster a deeper understanding of their lives and the biodiversity crisis. Jack Ashby, the museum's assistant director, is excited about the potential of the project to change public perception of these animals. "Museums are using AI in a lot of different ways, but we think this is the first application where we're speaking from the object's point of view," Ashby explained. "Part of the experiment is to see whether, by giving these animals their own voices, people think differently about them. Can we change the public perception of a cockroach by giving it a voice?" The project, developed by Nature Perspectives, uses AI models that give each animal exhibit a unique voice, personality, and even an accent. Visitors can engage in conversations with the exhibits through their mobile phones, either by voice or text.
[4]
Dead animals speak thanks to an AI museum exhibition: 'Can we change the public perception of a cockroach by giving it a voice?'
If a dead lion could speak we still wouldn't understand it, etc, etc...

I've always been a little freaked out by talking museum exhibits. As someone who spent my childhood wandering around WW2 exhibitions, I have been confronted many a time by a horrifying waxwork in a helmet exclaiming his disapproval of the Axis in over-acted Shakespearean tones. "But what about the animals?", said no-one in particular.

Well, thanks to the power of AI™, a new exhibition at Cambridge University's Museum of Zoology is bringing speech to dead animal exhibits (via The Guardian). And these aren't just pre-recorded monologues either -- long-expired creatures and models can "converse by voice or text through visitors' mobile phones."

Essentially, an AI is fed details on each exhibit's life (including its journey since, err, death), along with information about the species as a whole. It then generates engaging and informative speech that's capable of changing in tone and language to suit the age and nationality of the person the exhibit is talking to, creating a plethora of what I can only imagine are mildly culturally insensitive stereotypes of various animals by the region in which they lived.

The mallard sounds like a Brit, and the koala, an Australian, apparently. I don't know why we Brits got the waterbird, but there's a joke in there somewhere about dead ducks and our current international standing as a nation. Anyways, you talk to the dead animals, and the dead animals talk to you. Sweet!

"When you talk to these animals, they really come across as personalities, it's a very strange experience," said Jack Ashby, the museum's assistant director. "I started by asking things like 'where did you live?' and 'how did you die?', but ended up with far more human questions."

Even less well-loved animals like insects get a look-in, and Ashby seems to hope that, by talking to one directly (or at least, an AI trained to "think" like one), it might change some perspectives as to how we perceive these creatures in our day-to-day lives: "Part of the experiment is to see whether, by giving these animals their own voices, people think differently about them. Can we change the public perception of a cockroach by giving it a voice?"

I honestly don't know. I'd probably be less inclined to squish one if it spoke to me in a charming Belgian lilt, but in terms of museum exhibits it certainly beats a stuffed one staring at me voiceless from its glass tomb.

The project was devised by Nature Perspectives, the company behind the AI models that allow the dead to speak. The exhibition itself will run for a month, presumably as an initial testing period, although if it's a success I'd personally like to see it widened out to a variety of other museums. Why limit it to animals, that's what I reckon. That would have made my childhood really exciting. Just picture this: "I, the mighty Spitfire, once soared over the skies of Great Britain, the glorious thrum of my Rolls-Royce Merlin engine resonating over the green hills below. Innit mate."
[5]
Cambridge museum visitors to have AI chats with dodo
Visitors to a natural history museum will be able to have two-way chats with animals on display using generative artificial intelligence (AI). The University of Cambridge's Museum of Zoology has chosen 13 specimens for the conversations, including the extinct flightless dodo, narwhal and fin whale skeletons, a red panda and a preserved cockroach. Assistant director Jack Ashby said its purpose was to get people engaged with the natural world, as well as providing insights into what visitors wanted to know about the displays. Visitors will scan a QR code near the exhibit with their phones to start the conversation with each specimen. The month-long experiment starts on Tuesday.
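As a rough sketch of how the per-specimen QR codes might be produced (an assumption for illustration, not the museum's actual setup), the snippet below generates one code per exhibit that points to a placeholder chat URL.

```python
# Illustrative only: generating one QR code per specimen that opens its chat page.
# The URL pattern and specimen slugs are placeholders invented for this sketch;
# only the third-party 'qrcode' library call is real (pip install "qrcode[pil]").
import qrcode

specimens = ["dodo", "narwhal", "fin-whale", "red-panda", "american-cockroach"]
for slug in specimens:
    img = qrcode.make(f"https://example.org/zoology-chat/{slug}")  # placeholder URL
    img.save(f"{slug}-qr.png")  # one PNG per exhibit label
```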
Cambridge University's Museum of Zoology launches an innovative AI-powered exhibition allowing visitors to converse with extinct and preserved animal specimens, aiming to enhance engagement and education about biodiversity.
In a groundbreaking exhibition, Cambridge University's Museum of Zoology is bringing extinct and preserved animals to life through artificial intelligence. Starting October 15, 2024, visitors will have the unique opportunity to engage in real-time conversations with 13 different animal specimens, including the iconic dodo, which has been extinct since the 17th century [1][2].
The AI-driven experience, developed by Nature Perspectives, an international tech-education company, allows visitors to interact with the specimens using their smart devices. Each animal has been given a distinct personality and voice, complete with appropriate accents and language adaptations [3][4].
Gal Zadir, co-founder of Nature Perspectives, explains the technology: "We built what we call a digital mind, a simulation of a specific individual that includes personality traits, as much as we can of the known science about its evolutionary adaptations and so on, alongside a memory bank of things that it could possibly have experienced during its lifetime" [1].
Jack Ashby, assistant director of the Museum of Zoology Cambridge, emphasizes the educational value of this interactive approach: "Museums generally choose what to tell people, but in this way they can ask whatever they like and that's really, really valuable, I think. They can have an actual conversation with an animal, with a specimen, and I think it brings it to life in a really different way than a normal museum exhibit might" [1].
The AI-powered exhibits can adapt their responses based on the user's age and language, speaking in 20 different languages. This inclusivity ensures that a wide range of visitors can benefit from the experience [1][2].
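As a rough illustration of that age and language adaptation, the sketch below shows one way such instructions could be generated before each reply; the function, thresholds and wording are assumptions made for this example rather than the exhibition's actual code.

```python
# A hypothetical sketch of how replies might be adapted to a visitor's age and
# language before a response is generated. The function, thresholds and wording
# are assumptions made for illustration, not the exhibition's actual code.
def adaptation_instructions(visitor_age: int, language: str) -> str:
    """Build an instruction string describing the register and language to use."""
    if visitor_age < 12:
        register = "simple words, short sentences and a playful tone"
    elif visitor_age < 18:
        register = "clear explanations with some scientific terms introduced gently"
    else:
        register = "full detail, including scientific terminology"
    return (
        f"Reply in {language}. Use a register suited to a {visitor_age}-year-old: "
        f"{register}. Politely decline profanity or abusive requests."
    )

# Example: a nine-year-old Spanish speaker chatting with the dodo.
print(adaptation_instructions(9, "Spanish"))
```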
The exhibition features a diverse array of specimens, including the dodo, a platypus, a mallard, a red panda, a koala, a narwhal, a butterfly, an American cockroach, a fin whale skeleton and the giant skeleton of a Megatherium.
Each specimen can answer questions about its life, habitat, and even explore ethical questions. For instance, the dodo can discuss the possibility of being cloned back into existence [1][4].
One of the primary goals of this project is to raise awareness about the biodiversity crisis. By giving voices to these animals, the museum hopes to change public perceptions and foster a stronger connection between people and the natural world [2][3].
The month-long project will be assessed for its success and the nature of the conversations it inspired. If successful, this innovative approach could potentially be expanded to other museums and even to non-animal exhibits, revolutionizing the way visitors interact with museum displays [4][5].
As this unique exhibition unfolds, it promises to offer a fresh perspective on natural history and conservation, blending cutting-edge technology with traditional museum experiences to educate and inspire visitors of all ages.
Recent AI-powered studies have made significant progress in understanding and translating animal communication, bringing us closer to the possibility of interspecies dialogue. This development has implications for conservation efforts and our understanding of animal cognition.
2 Sources
AI-powered voice cloning technology is advancing rapidly, raising concerns about fraud, privacy, and legal implications. Celebrities like David Attenborough and Scarlett Johansson have been targeted, prompting calls for updated regulations.
3 Sources
Google, in collaboration with the Wild Dolphin Project and Georgia Tech, has developed DolphinGemma, an AI model aimed at decoding dolphin vocalizations and potentially enabling interspecies communication.
21 Sources
Renowned broadcaster Sir David Attenborough expresses deep concern over AI-generated clones of his voice, highlighting the ethical implications and potential misuse of this technology in the entertainment industry.
4 Sources
The London Evening Standard's plan to use AI to imitate deceased art critic Brian Sewell has ignited discussions about journalism ethics, AI's role in media, and the value of human expertise in art criticism.
3 Sources