
“Never say goodbye”: Can AI bring the dead back to life? | Technology News

In a world where artificial intelligence can resurrect the dead, grief takes on a new dimension.

From Canadian singer Drake’s use of AI-generated vocals by Tupac Shakur to Indian politicians addressing crowds years after their deaths, technology is blurring the lines between life and death.

But beyond their uncanny appeal in entertainment and politics, AI “zombies” may soon become a reality for people suffering the loss of a loved one through a series of groundbreaking but potentially controversial initiatives.

So how do AI “resurrections” work and are they as dystopian as we might imagine?

What are AI “resurrections” of humans?

In recent years, AI projects around the world have enabled digital “resurrections” of deceased people, allowing friends and relatives to communicate with them.

Typically, users provide the AI tool with information about the deceased. This can include text messages and emails, or simply answers to personality-based questions.

The AI tool then processes this data to converse with the user as if it were the deceased. One of the most popular projects in this area is Replika – a chatbot that can mimic a person’s texting style.

However, other companies now also offer the option of watching a video while talking to the deceased person.

For example, Los Angeles-based StoryFile is using AI to enable people to speak at their own funeral. Before passing, a person can record a video sharing their life story and thoughts. During the funeral, participants can ask questions and AI technology selects relevant answers from the recorded video.

In June, US company Eternos also made headlines for using artificial intelligence to create a digital afterlife of a person. Launched just earlier this year, the project enabled 83-year-old Michael Bommer to leave behind a digital version of himself that his family could continue to interact with.

Do these projects help people?

When a South Korean mother met an AI replica of her deceased daughter in virtual reality, a video of this emotional encounter sparked a heated debate online in 2020 about whether such technology helps or harms its users.

The developers of such projects point to users’ freedom of choice and argue that the technology addresses a deeper suffering.

Jason Rohrer, founder of Project December, which also uses AI to encourage conversations with the dead, said most users typically experience an “unusual level of trauma and grief” and see the tool as a way to cope.

“Many of these people who want to use Project December in this way are willing to try anything because their grief is so insurmountable and so painful for them.”

The project allows users to chat with AI replicas of well-known public figures and also with people the users may know personally.

People who use the service to have thought-provoking conversations with the dead often find that it helps them reach closure, Rohrer said. The bots let them say the words they never got to say to loved ones who died unexpectedly, he added.

Eternos founder Robert LoCasio said he started the company to capture people’s life stories and enable their loved ones to carry those stories forward.

Bommer, his former colleague who died in June, wanted to leave a digital legacy exclusively to his family, LoCasio said.

“I spoke to (Bommer) just days before he died and he said, ‘Remember, that was for me.’ I don’t know if they’ll use that in the future, but it was important to me,” LoCasio said.

What are the pitfalls of this technology?

Some experts and observers are more skeptical of AI “resurrections”, questioning whether people in deep grief can truly make an informed decision to use them and warning of the technology’s negative psychological impact.

“My biggest concern as a clinician is that grief is actually very important. It is an important part of development that we are able to acknowledge the absence of another person,” said Alessandra Lemma, a consultant at the Anna Freud National Centre for Children and Families.

Long-term use can lead to people no longer accepting the other person’s absence and remaining in a kind of “limbo state,” warned Lemma.

In fact, one AI service has marketed a permanent connection with the deceased person as its primary feature.

“Welcome to YOV (You, Only Virtual), the AI startup pioneering advanced digital communications so we never have to say goodbye to the people we love,” the company’s website said before a recent update.

Rohrer said his grief bot has a “built-in” limiting factor: Users pay $10 for a limited conversation.

The fee buys time on a supercomputer, with each response having a different computational cost. This means that $10 doesn’t guarantee a fixed number of responses, but it does allow for one to two hours of talk time. When time runs out, users receive a notification and can say goodbye for good.

There is also a fee for using several other AI-generated conversation services.

Lemma has researched the psychological impact of grief bots and says that while she is concerned about their use outside of a therapeutic context, they can be safely used as a supplement to therapy provided by a trained professional.

Studies around the world are also exploring AI’s potential in psychological counseling, particularly through individualized conversation tools.

These services seem to come straight out of a Black Mirror episode.

Proponents of this technology, however, argue that the digital age simply opens up new opportunities for preserving life stories, potentially filling a gap created by the loss of traditional family storytelling practices.

“In the past, when parents knew they were dying, they would leave boxes full of things they wanted to pass on to their children, or a book,” Lemma said. “So this could be the 21st century version that is then passed on, created by the parents in anticipation of their passing.”

LoCasio from Eternos agrees.

“The ability of a person to tell the stories of his life and pass them on to friends and family is actually the most natural thing,” he said.

Are AI resurrection services safe and private?

Both experts and studies have raised concerns that such services may fail to protect data privacy.

Personal information or data such as text messages shared with these services could potentially be accessed by third parties.

Even if a company promises to keep the data confidential when you first register, data protection cannot be guaranteed due to frequent changes in terms and conditions and possible changes in ownership, warns Renee Richardson Gosline, lecturer at the MIT Sloan School of Management.

Both Rohrer and LoCasio stressed that privacy is at the heart of their projects. Rohrer can only view conversations if users submit a customer support request, while LoCasio’s Eternos limits access to a person’s digital legacy to authorized relatives.

However, both agreed that such concerns could arise elsewhere, especially with technology giants and profit-driven companies.

A major concern is that companies could use AI revivals of the dead to tailor their marketing to users.

Imagine an advertisement delivered in the voice of a loved one, or a product reference slipped into their messages.

“When you do that to vulnerable people, you create a pseudo-advocacy based on a person who never agreed to do something like that. So it’s really a problem of agency and power asymmetry,” Gosline said.

Are there any other concerns about AI chatbots?

The fact that these tools are fundamentally aimed at a market of people struggling with grief makes them risky in itself, says Gosline – especially when large technology companies come into play.

“In a culture of tech companies that is often described as ‘moving fast and breaking things,’ we should be concerned because what typically breaks first are the things of vulnerable people,” Gosline said. “And I can’t think of many people who are more vulnerable than those who are grieving.”

Experts also raise ethical concerns about digitally resurrecting the dead, especially in cases where the deceased never consented and it is users who feed the AI with their data.

The environmental impact of AI-powered tools and chatbots is also a growing concern, particularly in the context of large language models (LLMs) – systems trained to understand and generate human-like text that power applications such as chatbots.

These systems require massive data centers that emit huge amounts of carbon and use vast amounts of water for cooling. In addition, frequent hardware upgrades generate e-waste.

A report from Google in early July showed that the company is far behind its ambitious net-zero goals due to the high demands AI places on its data centers.

Gosline said she understands there is no perfect program and that many users of such AI chatbots would do anything to reconnect with a deceased loved one. But it is up to policymakers and scientists to think more carefully about what kind of world they want to create, she said.

Basically, they have to ask themselves one question: “Do we need this?”
