Researchers warn that AI can create a “digital afterlife industry”
A “digital afterlife industry” that uses AI to offer fake conversations with lost loved ones must be prevented from causing harm, researchers at Cambridge University say in a report: “AI that allows users to hold text and voice conversations with lost loved ones runs the risk of causing psychological harm and even digitally ‘haunting’ those left behind without design safety standards.”
‘Deadbots’ or ‘griefbots’ are AI chatbots that simulate the language and personality traits of the dead using the digital footprints they leave behind.
The researchers note that companies are already offering these services, providing an entirely new type of “postmortem presence”.
AI ethicists from Cambridge’s Leverhulme Centre for the Future of Intelligence outline three design scenarios for platforms that could emerge. The report is published in the journal Philosophy & Technology.
“When the living sign up to be virtually re-created after they die, resulting chatbots could be used by companies to spam surviving family and friends with unsolicited notifications, reminders and updates about the services they provide – akin to being digitally ‘stalked by the dead’,” the researchers warn.
Even those who take initial comfort from a ‘deadbot’ may be drained by daily interactions that become an “overwhelming emotional weight”, the researchers argue, yet they may also be powerless to have an AI simulation suspended if their now-deceased loved one signed a lengthy contract with a digital afterlife service.
“Rapid advancements in generative AI mean that nearly anyone with Internet access and some basic know-how can revive a deceased loved one,” says Dr Katarzyna Nowaczyk-Basińska, study co-author and researcher at Cambridge’s Leverhulme Centre for the Future of Intelligence.
“This area of AI is an ethical minefield. It’s important to prioritise the dignity of the deceased, and ensure that this isn’t encroached on by financial motives of digital afterlife services, for example.”
“At the same time, a person may leave an AI simulation as a farewell gift for loved ones who are not prepared to process their grief in this manner. The rights of both data donors and those who interact with AI afterlife services should be equally safeguarded.”
Platforms offering to recreate the dead with AI for a small fee already exist, such as ‘Project December’, which started out harnessing GPT models before developing its own systems, and apps including ‘HereAfter’. Similar services have also begun to emerge in China.
“Methods and even rituals for retiring deadbots in a dignified way should be considered. This may mean a form of digital funeral, for example, or other types of ceremony depending on the social context”, says co-author Dr Tomasz Hollanek.
“We recommend design protocols that prevent deadbots being utilised in disrespectful ways, such as for advertising or having an active presence on social media.”
Hollanek and Nowaczyk-Basińska say that designers of re-creation services should actively seek consent from data donors before they pass.
They suggest that design processes should involve a series of prompts for those looking to “resurrect” their loved ones, such as ‘have you ever spoken with X about how they would like to be remembered?’, so the dignity of the departed is foregrounded in deadbot development.
Another scenario featured in the paper, an imagined company called “Paren’t”, highlights the example of a terminally ill woman leaving a deadbot to assist her eight-year-old son with the grieving process.
“While the deadbot initially helps as a therapeutic aid, the AI starts to generate confusing responses as it adapts to the needs of the child, such as depicting an impending in-person encounter.”
The researchers recommend age restrictions for deadbots, and also call for “meaningful transparency” to ensure users are consistently aware that they are interacting with an AI.
“It is vital that digital afterlife services consider the rights and consent not just of those they recreate, but those who will have to interact with the simulations,” says Hollanek.
“These services run the risk of causing huge distress to people if they are subjected to unwanted digital hauntings from alarmingly accurate AI recreations of those they have lost. The potential psychological effect, particularly at an already difficult time, could be devastating.”
The researchers call for design teams to prioritise opt-out protocols that allow potential users to terminate their relationships with deadbots in ways that provide emotional closure.