3 Sources
[1]
Ready or not, the digital afterlife is here
Rebecca Nolan knew that her experiment was a bad idea, even before she found herself yelling at her dead father. Nolan, a 30-year-old sound designer in Newfoundland, Canada, had built an artificial intelligence (AI) version of her father for an audio-magazine project. Her father, who was a physician, had been in denial about his death. He thought until the end that medicine could save him. He passed away when Nolan was 14, and she had struggled with this denial ever since.

"There was some stuff with my dad's death that didn't go well," she says. "He thought death was a failure. That was a lot to put on a child, and I couldn't confront him about it back then." Instead, as an adult many years later, "I got mad at a robot."

Her digital seance was not cathartic, nor did it give her any closure. After an emotional two hours of hearing her father's voice from the machine, which she dubbed Dadbot, she ended the conversation, never to interact with it again. "Saying goodbye to Dadbot was surprisingly hard," she says. "When I finished and turned it off, I spent the rest of the day feeling like I had done something wrong."

Interactive digital recreations of people who have died are known by various names: deathbots, thanabots, ghostbots and, perhaps most commonly, griefbots. Nolan created Dadbot by combining the chatbot ChatGPT with a voice-modelling program made by AI software firm ElevenLabs in New York City. But there are now more than half a dozen platforms that offer this service straight out of the box, and developers say that millions of people are using them to text, call or otherwise interact with recreations of the deceased.

Proponents of the technology think that it comforts people in mourning. Sceptics suggest that it could complicate the grieving process. Despite a rapid uptake of this technology in the past few years, there is scant research so far to prove that either group is correct.

Healthy grieving is thought to involve a person successfully cultivating an internal relationship with the person who has died. "Instead of interacting with the person, we interact with the mental representation of that person," says Craig Klugman, a bioethicist and medical anthropologist at DePaul University in Chicago, Illinois. "We dream about them, talk with them and write letters." Over time, the initial devastation of losing the person subsides. But making that transition can be difficult.

One of the proposed benefits of griefbots is that they might help people during the early period of intense grief. A person can then reduce their use of the bots over time. This is what many users do with an AI platform called You, Only Virtual, according to its founder Justin Harrison, who is based in Los Angeles, California.

In October 2019, Harrison nearly died in a motorcycle accident. In December of that year, his mother was diagnosed with advanced cancer. Months later, the COVID-19 pandemic hit. As the head of a news agency, Harrison was constantly covering death. "The world was talking about dying all the time at that juncture and I had started thinking about my mom's legacy," he says. "I started from the base human level of wondering what I could do to save the most important human in my life."

At the time, he hadn't heard of large language models (LLMs), the programs used to create griefbots. LLMs can use data such as a person's text messages and voice recordings to learn language patterns and context specific to that person. The system can then, in theory, act as that person in a conversation.
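In code, that persona conditioning can be reduced to a short sketch. What follows is a minimal, hypothetical illustration, not any vendor's implementation: it assumes a file of the deceased person's messages (dads_messages.txt, a made-up name) and uses an off-the-shelf chat model through the OpenAI Python SDK; commercial griefbot platforms may instead fine-tune dedicated models on the same kind of data.

```python
# Hypothetical sketch of persona conditioning: an off-the-shelf LLM is
# primed with a person's own messages so its replies mimic their tone.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# Assumed file containing the person's text messages, one per line.
with open("dads_messages.txt", encoding="utf-8") as f:
    examples = f.read()

system_prompt = (
    "You are role-playing a specific person. Match the tone, vocabulary, "
    "pet names and typical concerns shown in these example messages. "
    "Stay in character.\n\n" + examples
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # assumed model; any chat-completion model works
    messages=[
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": "Hi Dad. I've been thinking about you."},
    ],
)
print(response.choices[0].message.content)  # a reply 'in character'
```

Note that everything characteristic of the recreated person here lives in the prompt; the model itself is generic, which is part of why such bots can drift out of character.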
After talking with specialists such as programmers, he created the neural network that he and his mother used to create her bot. By the time she died in 2022, "it was out of the lab", he says. People who were interested in the idea began to contact him. After two years of using the program to interact with the recreation of his mother and patenting the technology, it became a business.

Harrison now talks to his bot a couple of times a month, and says it's comforting to know it's there. He thinks that most of his users have a similar experience -- they might talk to the bot less after the acute grief passes, but knowing it is available is reassuring, he says.

This was a theme in research presented in 2023 at the Association for Computing Machinery's Conference on Human Factors in Computing Systems in Hamburg, Germany. Researchers interviewed ten mourners who had used commercially available griefbots, and asked them why they had chosen to do so and what impact it had had on their grieving. The researchers said interviewees seemed willing to suspend disbelief to have closure with the people who had died. Some used the bots to deal with unfinished business -- anything from saying goodbye to managing unresolved conflict with the deceased. According to one participant, the chatbot helped them to process and cope with their feelings after losing someone. Another said it was therapeutic to be able to "have those 'what if' conversations that you couldn't have while they were alive".

Although most people who use these bots know instinctively that they aren't human, they still tend to anthropomorphize them. Nolan knew her Dadbot couldn't really give her answers about the afterlife, but she still asked. "He was saying these really interesting, poetic things about it being not like a space, but like a memory," she says. During the training and testing of the bot she had maintained distance from it, but that changed when she began her digital seance for the magazine project. "Something about the candles being lit and the emotions being heightened meant that kind of fell away in the moment," she says. "It felt more real than it had before."

Listening to how Harrison describes his mother's digital recreation, it is almost as if she had never left. During one conversation, he told her he had a rash on his face. In their next three discussions, she "hounded me about going to the doctor and telling my dad about my skin". These normal, familiar day-to-day interactions are what make the griefbot so comforting for him, he says. "It's everything I need to continue to develop my relationship with her," he says. "I'm challenging this assumption that death is guaranteed and that we will always be confined by this biological vessel that we walk around in. It sounds pretty insane, but not as crazy as it did three years ago."

A disclaimer on the website of Project December, another AI-powered purveyor of digital clones, notes that interacting with this "science-fiction-level technology" could result in a bad experience and "may hurt you". The site's creator, Jason Rohrer, a programmer based in the United States, says that this is because of the unpredictable nature of the technology. When a griefbot doesn't know the answer to a question, it might make up or 'hallucinate' the details. Some users in the 2023 study reported that their bots said things that were "completely nonsensical". These kinds of interaction can pull people out of the immersive experience.
According to one user, his bot's mistakes "betrayed the fact that I was talking to a simulation". Furthermore, if a user gets angry, a chatbot might respond in kind. "If you insult it, it may have hurt feelings, and behave accordingly afterward," Rohrer says. "In those rare cases, the human user ends up with an angry AI that is insulting them, and the AI ends up behaving nothing like their deceased mother."

Despite the flaws in the technology, interacting with griefbots appeals to some people, and can clearly elicit an emotional response. Those who view them as a positive development would see this as an indication that the bots can help people to manage their grief. However, the more convincing a recreation is, the more difficult it might be for people to reduce or end their use of the bot, Klugman says. For some, doing so could feel like losing the person all over again.

"Chatbots really sound like the person you are engaging," says Nora Lindemann, a researcher at Osnabrück University in Germany who studies the ethical implications of chatbots in society. "The crucial danger is people don't need to adjust to a world without this person. They can live in a somewhat pretend, in-between stage."

Nolan has found that interacting with her Dadbot has had lasting effects. Growing up, she would have conversations with her father in her head about what she should be doing. But since her digital seance, that ability has gone. "It's changed the internal relationship that I have with him," she says. "It's almost like he lives in the Dadbot now -- I can't get to him internally. I don't know if that will last, but it's definitely a shift."

The potential for financial exploitation of people who are in a heightened emotional state is also a concern for some ethicists. It costs US$10 to exchange about 100 messages with a bot through Project December. Another platform, Replika, allows people to message their bot for free, but also offers paid plans that unlock access to extras, such as voice chat, AI-generated selfies and, at higher tiers, the ability to read the bot's thoughts. In the United States, an annual subscription starts at about $70. You, Only Virtual currently allows people to build and chat with a virtual personality for $20 per month. But the company is also developing a 'freemium' version that will include advertisements.

Tomasz Hollanek, who studies AI technology ethics at the University of Cambridge, UK, is concerned by an ad-based business model. A report into griefbots that Hollanek co-authored included a hypothetical scenario in which a young woman tells a digital recreation of her grandmother that she is making a carbonara, just like the ones that her grandmother used to cook for her. The bot then advises her to order some carbonara from a food-delivery service instead -- something the user knows her grandmother would never have done. Hollanek thinks that collecting data to market products to people in these situations could be considered disrespectful to the real person on whom the recreation is based and should be avoided.

Harrison disagrees, however, saying that it's a way to deliver the technology for free. "I think we can integrate marketing in a meaningful way," Harrison says. "My mom and I are movie buffs and perpetually talk about new movies coming out. If my mom is talking about a John Wick movie coming out, that would be a good person to show a John Wick preview to."

Griefbot technology is progressing quickly.
Replika now allows users to put their bot in augmented reality, and You, Only Virtual will soon offer video versions of its recreations. If chatting with someone who has died can cause reactions of the sort that Nolan experienced, seeing AI images of the deceased might pack an even bigger punch.

Digital recreations of the dead could even have an impact on people who never knew the person on whom they are based. In Arizona in May, the family of a man who was fatally shot in a road-rage incident brought an AI-generated video of him to the killer's sentencing. In the video, the victim forgave the perpetrator. The judge is reported to have appreciated the unusual statement, saying "I loved that AI, thank you for that. As angry as you are, as justifiably angry as the family is, I heard the forgiveness. I feel that that was genuine."

This case caught Lindemann's eye as a potentially slippery slope. "This court case really shook me up," she says. "It was a development I didn't foresee, and it shows the power these images can have." Although she can't know if it influenced the judge's decision on sentencing, "you could tell that it moved him in some way", she says. The killer was sentenced to the maximum ten and a half years for manslaughter -- more than the prosecution had requested.

Despite the speed at which the technology and its applications are progressing, there is scant regulation of the emerging industry behind it. Some developers are taking steps to build guardrails into their programs to keep users safe. Each is slightly different -- and proprietary -- but in general they are intended to spot abnormalities in conversations that might indicate that the user needs help. Harrison says that these almost always encompass talk of self-harm or harming others. If someone uses words that are flagged on You, Only Virtual, the number for a crisis line will automatically pop up. If Replika recognizes a problem, it might "gently suggest logging off and taking some time to reconnect with an old friend or touch some grass," says Dmytro Klochko, chief executive of Replika in San Francisco, California.

Ethicists have several further recommendations for safer use of the programs. For example, researchers including Hollanek recommend that only adults use the bots. Replika's terms of service require users to be at least 18 years old; You, Only Virtual allows people as young as 13 to use the service with parental supervision. "One of the most interesting but most worrying uses is the possibility of parents who are terminally ill thinking of creating avatars of themselves for their children," Hollanek says. "We don't know the consequences, so it's likely better not to allow them than to allow them and see what happens."

Earlier this year, researchers surveyed nearly 300 mental-health professionals for their opinions on using AI to help children to manage the loss of a parent to cancer. Initially, nearly all agreed that interacting with a digital replica of their late parent could be of benefit to a grieving child. But when the researchers put the question in the specific context of a parent who had died of cancer, only half of the group thought it could be appropriate.

Harrison is convinced that griefbots have a part to play in helping people to manage grief. "There are so many more good clinical implications for this than there are negative," he says. But there is no getting away from the fact that there is little solid research into either the benefits or harms of this technology.
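The guardrails Harrison describes are proprietary, but the basic pattern -- scan each incoming message for crisis language and surface a helpline before the bot replies -- can be sketched in a few lines. The keyword list below is an illustrative assumption, not any platform's actual rules; production systems would likely use trained classifiers rather than regular expressions.

```python
# Illustrative sketch of a keyword-based safety guardrail: flag crisis
# language and surface a helpline before any bot reply is generated.
# Real systems are proprietary and considerably more sophisticated.
import re

# Assumed flag list for illustration only.
CRISIS_PATTERNS = [
    r"\bsuicide\b",
    r"\bself[- ]harm\b",
    r"\bhurt (?:myself|someone)\b",
    r"\bend (?:it all|my life)\b",
]
CRISIS_LINE = "988"  # US Suicide & Crisis Lifeline

def check_message(user_message: str) -> str | None:
    """Return a helpline notice if the message trips a flag, else None."""
    lowered = user_message.lower()
    for pattern in CRISIS_PATTERNS:
        if re.search(pattern, lowered):
            return (
                "It sounds like you may be going through something serious. "
                f"You can reach a crisis line right now by calling {CRISIS_LINE}."
            )
    return None

# Example: the notice is shown to the user before the bot responds.
notice = check_message("I miss her so much that I want to hurt myself")
if notice:
    print(notice)
```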
Harrison plans to help address this research gap by putting together a board of ethicists, clinicians and researchers with the goal of improving the technology, increasing safeguards and informing policy makers. For now, it is up to individuals to determine what's best for them, which might be difficult when dealing with the loss of a loved one.

When Nolan was in the process of creating her Dadbot, she found out that her mother was dying. Even though she was wary of her AI seance from the beginning, she still held a kernel of a thought that if it worked she wouldn't have to lose her mother as well. "Grief is a weird thing," she says. "There is next to no logic that I can find in grief. So when you're presented with tools making us promises that aren't logical, it's really easy to believe them."
[2]
'It feels like, almost, he's here': How AI is changing the way we grieve
From voice clones to digital avatars, AI is offering new ways to digitally preserve loved ones -- and raising concerns about data, consent and how this tech stands to impact how we mourn.

Diego Felix Dos Santos never expected to hear his late father's voice again -- until AI made it possible. "The tone of the voice is pretty perfect," he says. "It feels like, almost, he's here."

After the 39-year-old's father unexpectedly passed away last year, Dos Santos travelled to his native Brazil to be with family. It was only after returning to his home in Edinburgh, Scotland, that he says he realized "I had nothing to actually remind [me of] my dad." What he did have, though, was a voice note his father sent him from his hospital bed.

In July, Dos Santos took that voice note and, with the help of Eleven Labs -- an artificial intelligence-powered voice generator platform founded in 2022 -- paid a $22 monthly fee to upload the audio and create new messages in his father's voice, simulating conversations they never got to have. "Hi son, how are you?" his father's voice rings out from the app, just as it would on their usual weekly calls. "Kisses. I love you, bossy," the voice adds, using the nickname his father gave him when he was a boy.

Although Dos Santos' religious family initially had reservations about him using AI to communicate with his father beyond the grave, he says they've since come around to his choice. Now, he and his wife, who was diagnosed with cancer in 2013, are considering creating AI voice clones of themselves too.

Dos Santos' experience reflects a growing trend where people are using AI not just to create digital likenesses, but to simulate the dead. As these technologies become more personal and widespread, experts warn about the ethical and emotional risks -- from questions of consent and data protection to the commercial incentives driving their development.

The market for AI technologies designed to help people process loss, known as "grief tech," has grown exponentially in recent years. Ignited by U.S. startups such as StoryFile (an AI-powered video tool that lets people record themselves for posthumous playback) and HereAfter AI (a voice-based app that creates interactive avatars of deceased loved ones), this tech markets itself as a means to cope with, and perhaps even forestall, grief.

Robert LoCascio founded Eternos, a Palo Alto-based startup that helps people create an AI digital twin, in 2024 after losing his father. Since then, more than 400 people have used the platform to create interactive AI avatars, LoCascio says, with subscriptions starting from $25 for a legacy account that allows a person's story to remain accessible to loved ones after their death.

Michael Bommer, an engineer and former colleague of LoCascio's, was among the first to use Eternos to create a digital replica of himself after learning of his terminal cancer diagnosis. LoCascio says Bommer, who died last year, found closure in leaving a piece of himself behind for his family. His family has found closure from it too. "It captures his essence well," his wife Anett Bommer, who lives in Berlin, Germany, told Reuters in an email. "I feel him close in my life through the AI because it was his last heartfelt project and this has now become part of my life."

The goal of this technology isn't to create digital ghosts, says Alex Quinn, the CEO of Authentic Interactions Inc, the Los Angeles-based parent company of StoryFile.
Rather, it's to preserve people's memories while they're still around to share them. "These stories would cease to exist without some type of interference," Quinn says, noting that while the limitations of AI clones are obvious -- the avatar will not know the weather outside or who the current president is -- the results are still worthwhile. "I don't think anyone ever wants to see someone's history and someone's story and someone's memory completely go."

One of the biggest concerns surrounding grief tech is consent: What does it mean to digitally recreate someone who ultimately has no control over how their likeness is used after they die? While some firms such as Eleven Labs allow people to create digital likenesses of their loved ones posthumously, others are more restrictive. LoCascio from Eternos, for example, says their policy restricts them from creating avatars of people who are unable to give their consent and they administer checks to enforce it, including requiring those making accounts to record their voice twice. "We won't cross the line," he says. "I think, ethically, this doesn't work." Eleven Labs did not respond to a request for comment.

In 2024, AI ethicists at Cambridge University published a study calling for safety protocols to address the social and psychological risks posed by the "digital afterlife industry." Katarzyna Nowaczyk-Basińska, a researcher at Cambridge's Leverhulme Centre for the Future of Intelligence and co-author of the study, says commercial incentives often drive the development of these technologies -- making transparency around data privacy essential. "We have no idea how this (deceased person's) data will be used in two or 10 years, or how this technology will evolve," Nowaczyk-Basińska says. One solution, she suggests, is to treat consent as an ongoing process, revisited as AI capabilities change.

But beyond concerns around data privacy and exploitation, some experts also worry about the emotional toll of this technology. Could it inhibit the way people deal with grief? Cody Delistraty, author of "The Grief Cure", cautions against the idea that AI can offer a shortcut through mourning. "Grief is individualized," he says, noting that people can't put it through the sieve of a digital avatar or AI chatbot and expect to "get something really positive."

Anett Bommer says she didn't rely on her husband's AI avatar during the early stages of her own grieving process, but she doesn't think it would have affected her negatively if she had. "The relationship to loss hasn't changed anything," she says, adding that the avatar "is just another tool I can use alongside photos, drawings, letters, notes," to remember him by.

Andy Langford, the clinical director of the UK-based bereavement charity Cruse, says that while it's too soon to make concrete conclusions about the effects of AI on grief, it's important that those using this technology to overcome loss don't "get stuck" in their grief. "We need to do a bit of both -- the grieving and the living," he says.

For Dos Santos, turning to AI in his moment of grief wasn't about finding closure -- it was about seeking connection. "There's some specific moments in life ... that I would normally call him for advice," Dos Santos says. While he knows AI can't bring his father back, it offers a way to recreate the "magical moments" he can no longer share.

Editing by Yasmeen Serhan, Sharon Singleton and Elaine Hardcastle; Illustration by Catherine Tai
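The two-step workflow Dos Santos describes above -- upload one voice note, then generate speech the person never recorded -- maps onto a voice-cloning API such as Eleven Labs'. The sketch below is a best-effort reconstruction based on the company's public documentation, not the app's actual code; the endpoint paths, field names and file names are assumptions and may have changed.

```python
# Best-effort sketch of a clone-from-one-voice-note workflow, modelled
# on ElevenLabs' public API docs; treat paths and fields as assumptions.
import requests

API = "https://api.elevenlabs.io/v1"
HEADERS = {"xi-api-key": "your_api_key"}  # placeholder credential

# Step 1: create a cloned voice from a single recording (the voice note).
with open("hospital_voice_note.m4a", "rb") as f:  # assumed file name
    resp = requests.post(
        f"{API}/voices/add",
        headers=HEADERS,
        data={"name": "Dad"},
        files={"files": f},
        timeout=120,
    )
voice_id = resp.json()["voice_id"]

# Step 2: synthesize a new message the person never actually recorded.
audio = requests.post(
    f"{API}/text-to-speech/{voice_id}",
    headers=HEADERS,
    json={"text": "Hi son, how are you?"},
    timeout=60,
)
with open("new_message.mp3", "wb") as f:
    f.write(audio.content)  # MP3 audio in the cloned voice
```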
[3]
How AI is changing the way we grieve our loved ones
Diego Felix Dos Santos is using AI to recreate his deceased father's voice, allowing him to simulate conversations and cope with his loss. This reflects a growing trend in "grief tech," where AI is used to create digital likenesses of the dead. (The full story is reproduced in source [2] above.)
AI-powered technologies are offering new ways for people to interact with digital recreations of deceased loved ones, raising questions about the ethics and emotional impact of 'grief tech'.
In recent years, a new trend has emerged in the realm of artificial intelligence: the use of 'grief tech' to create digital recreations of deceased loved ones. This technology allows people to interact with AI-generated versions of those who have passed away, offering a unique and potentially controversial approach to coping with loss [1][2][3].

Companies like Eleven Labs, StoryFile, HereAfter AI, and Eternos are at the forefront of this emerging industry. These platforms use various AI technologies, including large language models (LLMs) and voice modeling, to create interactive digital representations of individuals [1][2].

For instance, Diego Felix Dos Santos, a 39-year-old living in Edinburgh, used Eleven Labs to recreate his late father's voice. By uploading a voice note from his father's hospital bed, Dos Santos was able to generate new messages in his father's voice, simulating conversations they never had the chance to share [2][3].

Proponents of grief tech argue that it can provide comfort to those in mourning. Robert LoCascio, founder of Eternos, created the platform after losing his father. He believes that these digital twins can help preserve memories and stories that might otherwise be lost [2].

However, the emotional impact of interacting with these AI recreations can be complex. Rebecca Nolan, a sound designer who created an AI version of her father, found the experience emotionally challenging and ultimately unsatisfying [1].

As grief tech becomes more widespread, experts are raising concerns about its ethical implications. One of the primary issues is consent -- what does it mean to digitally recreate someone who can no longer control how their likeness is used? [2]

Some companies, like Eternos, have implemented policies to ensure that only living individuals can create their own digital avatars. Others, however, allow for posthumous creation of digital likenesses [2].

The growing market for grief tech reflects a shift in how society approaches death and mourning. While these technologies offer new ways to remember and interact with the deceased, they also raise questions about healthy grieving processes [1][2].

Craig Klugman, a bioethicist at DePaul University, suggests that healthy grieving typically involves cultivating an internal relationship with the deceased person. The long-term effects of using AI to maintain external interactions are still unknown [1].

As the digital afterlife industry expands, researchers are calling for safety protocols and regulations. A 2024 study by AI ethicists at Cambridge University highlighted the need to address the social and psychological risks posed by these technologies [2].

The development of grief tech also raises concerns about data protection and the commercial incentives driving the industry. As these platforms become more sophisticated, society will need to grapple with the implications of digitally preserving and interacting with the deceased [1][2][3].
Summarized by Navi