AI advances fuel industry trying to preserve loved ones after death


(NEW YORK) — When Justin Harrison suffered a nasty bout of strep throat two weeks ago, he said he received a barrage of text messages from his mom urging him to take better care of himself — even though his mom is dead.

Harrison, who in 2020 founded You, Only Virtual, an AI company that creates chatbots modeled after deceased loved ones, was getting reprimanded by a digital version of his mom in the same way the real one would have.

“I’ve got a virtual mom talking to me ad nauseam about more rest, asking why I’m not hydrating,” Harrison, 40, told ABC News. “I was getting yelled at.”

Harrison, who has communicated with the digital reproduction of his mom on a daily basis since she died in October at age 61, believes AI-driven chatbots will redefine how some deal with grief.

The industry faces formidable obstacles to building chatbots that accurately mimic a dead person and questions remain over issues like privacy and consent, experts said.

Moreover, generative AI tools like ChatGPT — which scan text from across the internet and string words together based on statistical probability — have displayed a propensity to share arbitrary, false or hateful speech, raising alarm about the personal and societal effects of noxious words delivered with the intimacy and authority of a deceased loved one, some experts added.
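As a rough illustration of that "statistical probability" idea, and not a depiction of how ChatGPT or any company mentioned here actually works, a toy bigram model picks each next word according to how often it followed the previous word in its training text:

```python
import random
from collections import Counter, defaultdict

# Invented stand-in for a person's text messages.
corpus = "take care of yourself . drink water . get some rest . take some rest .".split()

# Count which word follows which (a bigram model).
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def next_word(prev):
    # Sample the next word in proportion to how often it followed `prev`.
    counts = following[prev]
    words = list(counts)
    weights = [counts[w] for w in words]
    return random.choices(words, weights=weights)[0]

print(next_word("take"))  # "care" or "some", weighted by the counts above
```

The sketch also shows why such systems "make stuff up": the model only knows which words tend to follow which, not whether the resulting sentence is true.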

“You will not be reincarnating a relative with GPT-4,” Gary Marcus, an emeritus professor at New York University and author of the book “Rebooting AI,” said in reference to the latest version of ChatGPT. “These systems make stuff up all the time.”

Still, the emergence of sophisticated AI-driven conversation programs brings a life-like product within closer reach, experts said.

For years, advances in the reproduction of audio and video have made digital copies of deceased people possible, said Mark Dredze, a professor of computer science at Johns Hopkins University who helped create a finance-oriented AI language model called BloombergGPT. He pointed to big-budget movies and TV shows that feature impeccable computer-generated images.

In recent seasons of the TV series The Mandalorian, for example, the creators depicted a youthful Luke Skywalker by digitally de-aging Mark Hamill, the actor who played the character in the 1970s Star Wars films, Dredze added. (The Walt Disney Company, the parent company of the studio that made The Mandalorian, is also the parent company of ABC News.)

“That technology will eventually become cheaper and easier,” Dredze said.

The remaining technical challenge, however, is the authenticity of the words coming out of a digital person’s mouth — something that newly improved conversation bots like ChatGPT can help create, Dredze said. “Is it the person?” he added. “Is this something that they would say?”

AI experts who spoke with ABC News said that success or failure on that score hinges upon the volume of data about the deceased loved one that the user enters into a given chatbot, keeping in mind the possibility that a chatbot could still offer up arbitrary or inaccurate information, regardless of the scale of training data.

“If you have massive amounts of text that somebody has produced, you can train a system on that and you’ll capture in some sense someone’s voice,” Kristian Hammond, a professor of computer science at Northwestern University who studies AI, told ABC News.

The chatbot would still struggle to respond to novel or complicated topics, however, Hammond said. “It’s a thing that looks like, sounds like and speaks like a loved one, but it doesn’t have enough in the way of data to capture the point of view and the values of that loved one,” he added.

You, Only Virtual addresses this challenge by focusing on communication between an individual living person and the deceased, thereby attempting to recreate their specific one-on-one dynamic, Harrison said.

“When you start thinking about the nuances of a holistic human being, it gets out of control,” Harrison said. “I stared at five years of messages and recorded phone calls with my mom — 3,800 pages. The amount of consistency through the entirety of it was staggering.”

After scanning communication records such as text messages, emails and phone calls, You, Only Virtual creates a chatbot that can utter original responses in conversation with a user, either through written chats or through audio that mimics a deceased relative’s voice, Harrison said.
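The article does not describe You, Only Virtual's implementation, but a common first step in adapting a general model to one person's conversational style is to reshape a two-person message history into prompt-and-reply training pairs. The helper and message contents below are hypothetical:

```python
# Hypothetical helper: turn a chronological two-person message history
# into training examples pairing the user's message with the loved
# one's reply. This is a generic fine-tuning data format, not
# You, Only Virtual's actual pipeline.

def to_training_pairs(messages):
    """messages: list of (sender, text) tuples in chronological order."""
    pairs = []
    for (s1, t1), (s2, t2) in zip(messages, messages[1:]):
        if s1 == "me" and s2 == "mom":
            pairs.append({"prompt": t1, "response": t2})
    return pairs

history = [
    ("me", "I have strep throat"),
    ("mom", "Drink water and get some rest!"),
    ("mom", "Are you hydrating?"),
    ("me", "Yes"),
]
print(to_training_pairs(history))
# [{'prompt': 'I have strep throat', 'response': 'Drink water and get some rest!'}]
```

Formatting the data this way is what lets a system target the specific one-on-one dynamic Harrison describes, rather than the person's personality in general.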

The company, he added, aims to offer video-chat capability later this year and ultimately to provide augmented reality that allows for interaction with a three-dimensional projection.

Harrison rejected possible privacy concerns raised by the use of personal correspondence to build a chatbot without the consent of the deceased, noting that the user of the chatbot is the same person to whom the communications were initially sent.

“You absolutely don’t need consent from someone who’s dead,” Harrison said. “My mom could’ve hated the idea but this is what I wanted and I’m alive.”

The early-stage startup, which has eight employees, is poised to grow in part through improvements in generative AI, Harrison said.

“Everything that happens with helping the program get better at learning and quantifying information is good for us,” Harrison said.

StoryFile, a company that says it has 40 employees and $10 million in annual revenue, offers an interactive version of a deceased relative by recording an hourslong question-and-answer session with the individual before his or her death, then creating a reproduction that responds to prompts.

In this case, the virtual reproduction utters pre-recorded content in a real-life manner, said Stephen Smith, the CEO of StoryFile. If a topic falls outside a set of established discussion areas, however, the reproduction cannot respond. Currently, users speak with the digital loved one through interactive video but the company is developing the capacity for conversation with a 3D likeness, Smith said.
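The retrieval behavior Smith describes, serving a pre-recorded clip when a question falls within the recorded topics and nothing otherwise, can be sketched with simple keyword overlap. StoryFile's actual matching method is not described in the article; the function, clip names and threshold here are assumptions:

```python
# Hypothetical sketch: match a user's question to the closest
# pre-recorded answer by word overlap, returning None when the
# topic was never covered in the recording session.

def match_clip(question, recorded_qa):
    """recorded_qa maps a recorded question to its video clip."""
    q_words = set(question.lower().split())
    best, best_overlap = None, 0
    for recorded_q, clip in recorded_qa.items():
        overlap = len(q_words & set(recorded_q.lower().split()))
        if overlap > best_overlap:
            best, best_overlap = clip, overlap
    return best  # None when no recorded question shares any words

recorded = {
    "where did you grow up": "clip_childhood.mp4",
    "what was your first job": "clip_first_job.mp4",
}
print(match_clip("tell me where you grew up", recorded))
# clip_childhood.mp4
```

Returning None is the design choice Smith describes: the reproduction stays silent rather than generating new speech outside the established discussion areas.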

The company holds a “hard line” against the use of AI for generating original spoken content, which Smith said he finds “creepy and weird.” (In response, Harrison defended such use of the technology. “By using natural language processing and generative AI, you’re able to keep the process moving forward so it’s relevant, it’s topical and it’s fresh,” Harrison said.)

Instead, StoryFile deploys an AI chatbot as the interviewer during the question-and-answer sessions, allowing the conversation to probe a vast range of topic expertise, said Smith, who previously led the University of Southern California’s Shoah Foundation, which established an archive of oral testimony about the Holocaust.

“I’m an oral historian going, ‘Jeez, I’ve wasted the last 30 years of my life,’” Smith said. “ChatGPT can do it as well as me.”

To be sure, some experts doubted that language models like ChatGPT will bring much progress to this industry and warned of potential risks.

“They’re trying to do the impossible,” said Marcus, of New York University.

Generative AI sometimes responds to prompts with arbitrary or inaccurate information, Marcus added, posing a risk to users who may struggle to fully understand the limits of the technology when it performs as a reproduction of a deceased loved one.

“These models are good at tricking people that they’re people but they’re not,” Marcus said. “It’s kind of like a party trick doing some imitations but certainly not the real thing.”

Meanwhile, the mental health effects of such products remain under examination. Smith, of StoryFile, acknowledged that the immediate aftermath of a death may be too early for some people to see a virtual reproduction of a loved one, but added that the product preserves a legacy for ensuing generations.

You, Only Virtual works with a team of clinical psychologists and offers alternate resources on its website for people in a mental health crisis, Harrison said.

Elena Lister, a professor of clinical psychiatry at Weill Cornell Medical College, said digital reproductions of the deceased could cause harm if they push a grieving individual to withdraw from his or her life. However, she added, the grieving process varies widely.

“When someone dies in your life, you are just so hungry for more of them,” Lister told ABC News. “This is an attempt to bridge that gap.”

“When it comes to grieving, there is very little that is right or wrong,” she added. “If something provides you with comfort, I would in no way say there’s something bad about it.”

Going further, Harrison said he hopes people no longer have to feel grief at all. He wishes he could’ve avoided the painful emotions that have accompanied the death of his mom, he said, even if the experience has brought about some personal growth.

“Have I learned to be more reliant on myself? That’s good,” Harrison said. “Was it worth losing my mom? No.”


Copyright © 2023, ABC Audio. All rights reserved.