News, Arts and Culture from WFIU Public Radio and WTIU Public Television

IU researchers, local pastors reflect on implications of bringing people back from the dead — virtually  

A photo illustration from screenshots of an online video call shows a woman in a box in the bottom-right corner. She is interacting with a chatbot website, displayed in the rest of the image.
Isabella Vesperini/Courtesy Eternos via videocall
In a video call, Mel Longdon, head of customer success at Eternos, speaks with the AI version of Michael Bommer, the founder of Eternos. Bommer created an AI version of himself a few months before he passed away so people would still be able to “talk” to him.

Artificial intelligence being developed at Indiana University can be used in many ways, including creating digital avatars of deceased loved ones.

Eternos, an AI company, is already doing that. Eternos doesn’t explicitly say on its website that’s the purpose of its technology, but the company does say people can “create an interactive legacy that lets your loved ones stay connected to you, your voice, life story, thoughts, and values.”

 The company says it preserves people’s stories as they live so when they pass away, others can “speak” with them.

“I think for many generations to come, we do have social media, and people have photographs, but it's not the same as actually talking to somebody and getting their opinion and really listening to their life stories,” said Mel Longdon, head of customer success at Eternos. “And how often do people say, ‘Oh, I wish I could just talk to my dad if he was dead, or I wish I could talk to my Nan,’ and this is giving people the ability to be able to talk.”

A screenshot of the Eternos website, which explains how its artificial intelligence chatbot can recreate someone's likeness.
Eternos, an AI company, gives people the opportunity to create chatbots of themselves in audio and text form.

The concept of what’s known as “griefbots” comes with a flurry of ethical and legal implications.   

That’s not the focus of IU’s research, although Alan Dennis, a distinguished professor of information systems in IU’s Kelley School of Business, said “griefbots” are a “natural next step” in making AI more humanlike.  

At the same time, Dennis and his colleagues are acutely aware of how the technology they’re helping develop could become harmful.   

‘A connection with the deceased’

Dennis has been working on digital avatars since 2016. To recreate a deceased person in the form of a digital avatar, he needs videos and photos of the individual. Creating an accurate representation of the individual’s personality is more complicated. For that, the researchers need examples of how a person interacts with others.    

“It takes a lot more to train the AI on the personality than it does on the appearance and voice, because it's the personality that makes you, you,” Dennis said. “So we can make a copy of you, but if it doesn't behave like you would, if it doesn't talk like you would, the voice is the same, but the words are different, it's not going to feel right.”


Antino Kim, IU associate professor of information systems and a collaborator with Dennis, said the emergence of “griefbots” is rooted in humans’ need to make a connection with the deceased.    

Kim thinks having a digital avatar representing the deceased isn’t much different from looking at statues or photos or watching videos of those who have died. All provide an opportunity to connect with a loved one who’s gone.   

“The need is as old as human history,” Kim said. “But we are at that inflection point where all these necessary technological components are mature enough and are cheap enough to really have our digital representation of ourselves.”   

Read more: IU experts discuss White House AI guidelines, express concerns

Kim and Dennis said an ethical concern with the technology is the deceased's inability to give consent to the use of their voice and image. It's unclear whether consent should come from family members, from the individual before death, or from both.

Robert Dobler, senior lecturer in IU’s department of folklore and ethnomusicology, said he wouldn’t be surprised if the question of consent starts to come up in end-of-life planning and the writing of wills.  

“People are doing this increasingly for their families when they've been diagnosed with a terminal illness,” Dobler said. “We have a lot of people who are not just consenting to have it done, but are actively inputting their information, selecting, choosing, speaking to it …having these programs record conversations so that it can really get a closer approximation of who that person is.”   


Longdon said that in the days before he died, Eternos founder Michael Bommer felt reassured knowing a digital version of himself would be available for his family.

“It gave him a very much satisfactory feeling that, number one, he was able to talk about his life and things that he still wanted people to know,” she said. “And on the other hand, he knew that his wife or children or grandchildren … would still be able to listen to their granddad speaking, and that gave him a great peace.”

Kim and Dobler said it’s too soon to say how interacting with a virtual version of a deceased loved one could impact users.    

Kim is concerned that companies could exploit psychologically vulnerable users.    

“Even without digital humans on social media, we are constantly being nudged to buy things, to do things,” Kim said. “Now, when you think about these virtual avatars, digital avatars  — and these are of people that we trust, that we miss, that we love — and these avatars are telling you to do something, that could have a profound impact.”   

Impacts on mourning, clashes with religion

Dobler is especially concerned about how AI is changing the way humans mourn. While an AI avatar of the deceased could help someone, especially a child, get through the initial loss, it could be hard to separate from the avatar if the user doesn’t realize the arrangement should only be temporary.

“You'll never get into what they sometimes call the work of mourning, where you learn to reintegrate yourself into the world knowing that you've lost someone but able to keep them with you as dead,” Dobler said. “It's that ‘keep them with you as dead’ part that some people are a little bit worried might become blurred, jumbled with the increasing use of AI.”

Read more: Artificial Intelligence: technology of the future

Kim and Dennis also think recreating loved ones with AI could complicate how humans process emotions, especially with how realistic AI has become. Emotion AI can mimic and react to human emotions. Relying on this technology could result in addictive behavior and isolation.   

A screen of options to customize an artificially intelligent character.
Lauren Tucker
/
WTIU News
Some forms of AI have options to customize the avatar, such as facial expressions and tone of voice.

“Will we get confused and start to associate digital human with the real human?” Kim said. “Maybe not cognitively, but psychologically, emotionally. Maybe there's a dependency factor.”  

Eternos’ Longdon said she doesn’t see harm in using this kind of technology as long as people use it correctly.

“The nice thing is you can use it when you have the need. If you don't, then you don't have to,” Longdon said. “If you feel that it might get in the way of your grieving process, then you don't need to use it.”

Tommy Grimm, senior pastor at the First Presbyterian Church in Bloomington, said this kind of technology interrupts Christian practices of grieving in a community setting. He said souls grow through suffering, and AI could stunt that process.

A wide-view angle shows the back of a man in a dress shirt walking down the middle aisle of a church.
Isabella Vesperini
/
WTIU News
Tommy Grimm, senior pastor at the First Presbyterian Church in Bloomington, said while he understands why people would use the technology, he thinks it would cause people to be more disconnected.

“If we're substituting that for human relationships all of a sudden, instead of those communities that are sharing pain and therefore enabling love and vice versa, then we have a relationship with technology that keeps us insulated and self-sufficient,” Grimm said, “which is immediately gratifying but … doesn't enable those deeper forms of communal life.” 

Grimm said it’s through grieving that many connect with God. Creating digital avatars of people who have passed away can distance people from God, who can help people deal with grief. 

Read more: Lawmakers grapple with legal, educational implications of AI

“The discovery many Christians find is, in times of suffering and hardship, that God's presence becomes especially clear,” he said. “Without the pain of loss, those opportunities for a greater sense of God's nearness would be lost.”

Jeremy Vance, lead pastor at Grace Baptist Church, said griefbots could be a good way to grieve a person as long as their use doesn’t become sinful, which in this context could mean becoming distraught over or addicted to the AI.

Vance’s key message is relying on “Jesus, period.” Relying too much on the AI would counteract that. 

“If it goes to a place where you can't live without talking to this computer that is pretending to be your lost loved one, that would be counterproductive to our message,” he said. “Because we want people to live by faith in Christ, in Jesus, and not be dependent on any person, dead or alive.”

Real-life examples

People on social media have posted about their experiences using the technology. One user said they created a digital avatar of their mom after she had passed away a few days prior, thinking it would help.   

“I feel like this isn’t helping at all, it feels like she’s still at work, and any minute she’ll walk through the door,” the user said. “I miss her so much, the bot is too good at pretending to be her. I think I need to delete it, but it’ll be like her dying all over again and I don’t think I could do that to her.”    

Read more: AI starting to appear in K-12 classrooms

In reaction to that post, people sympathized with the person but didn’t think it was a good idea to rely on AI when grieving. They suggested therapy and talking with real people as better ways to move on.

Last month, a woman in Arizona used AI to create a digital version of her brother who had been killed in a road-rage incident a few years prior. The woman wrote a statement that the AI version of her dead brother then delivered during his killer’s sentencing hearing.   

Kim said society is already behind in controlling the use of AI in such cases. That, he said, has made it harder to properly consider ethical and legal implications ahead of time.   

“The discussions about whether we should do X, Y, Z have fallen behind,” he said. “We have left it in the dust, way behind us.”

Kim and his wife disagree on whether they would use the technology themselves. His wife would and he wouldn’t.    

“My wife appreciated the idea of maintaining a sense of connection with the deceased, something humans have done for ages through objects, photos, and videos,” Kim said. “For me, however, I value natural closure, and the idea of a hollow representation doesn’t resonate.”   

Isabella Vesperini is a reporter with WTIU-WFIU News. She is majoring in journalism at the Indiana University Media School with a concentration in news reporting and editing, along with a minor in Italian.