I Used AI to Help With Grief. Here's What I Learned

When I woke up this past Aug. 30, I was excited. That day marked 10 months until I’d get married, and according to my wedding planner, it was time to start preparing our save-the-dates. But along with the excitement, I was also experiencing some unspoken dread.   

The wedding planning has been more emotional than I ever could have imagined. Sadly, some of the people my fiance and I love have died, and it’s been hard to imagine our wedding day without them.

When I was 3, my father died. Although it’s been more than two decades, I’ve found myself wondering what life — and this moment — would look like if he were here. But he’s not the only one who’ll be missed. In 2021, we lost Cole, my fiance Jesse’s brother and a friend of mine, who died unexpectedly at the age of 28.

Gina Moffa, a New York-based clinical social worker and trauma therapist who specializes in grief, tells me that though these feelings may not be easy to grapple with, they are normal, regardless of how far removed from the loss we are. 

“The wound of grief lives within our nervous system, which is a database that holds memories and scents and all the things that connect us,” says Moffa, who’s also the author of Moving on Doesn’t Mean Letting Go. “So, anniversaries, birthdays, days of meeting and holidays will usually bring up a sense of grief or trauma, or a reliving of our loss once again.”


As I looked for ways to deal with this renewed sense of loss, I unexpectedly found myself thinking about artificial intelligence. Chatbots like ChatGPT have shown us all how adept they can be at giving humanlike answers to questions, and even providing advice, drawing on vast repositories of material from the internet and other sources. Maybe AI could offer me some guidance or reassurance during this time. 

Grief is a beast to wrestle with, dense with heavy and complicated feelings: confusion, sadness, anger, loneliness. It also often works in the shadows; many people are reluctant to share how much they’re struggling. But it’s around us every day. In 2021, close to half — 44.2% — of all Americans had lost one or both of their parents, according to the United States Census Bureau. Of them, 10.3% had lost their father by the age of 15 and 5.7% had lost their mother by that age. That feeling of loss never goes away.

As our society becomes more educated about grief and how best to support others, and ourselves, in the aftermath of loss, we can’t ignore the advent of artificial intelligence, and specifically generative AI, which powers ChatGPT, Google’s Gemini, Microsoft’s Copilot and similar tools. Gen AI chatbots can create new conversations, images, audio, video and more. Often that’s for things like writing emails at work or wrangling homework assignments for school. Then there are tools like Pi, which markets itself as “the first emotionally intelligent AI.” It might be tempting even to see Pi as a substitute for therapy. HereAfterAI, meanwhile, can use old audio of a deceased loved one to make it sound as if they’re speaking to you again.

Though chatbots aren’t sentient, if you feed the applications enough information, you may feel like they are. Moffa fears that if grievers turn to these types of artificial intelligence applications for too long, or too often, they could be doing themselves a disservice by avoiding their grief and not leaving room for what can truly help: human connection. 

Still, I wondered: How much might it help me? Even a little might be something. I needed to know: Does grief ever fully end? And could AI make it feel as if dead loved ones are still with us on our wedding day? In my exploration, I went beyond ChatGPT itself to a range of AI-powered programs, including one built to resemble a seance.

But most importantly, I kept returning to one question: Even if we can make these tools resemble our deceased loved ones, should we? Or should we let the dead rest in peace rather than in artificial intelligence?

Leaning on artificial intelligence

I began by asking ChatGPT, an OpenAI application that lets you ask questions on any subject, some standard questions about grief. 

“Grief is a deeply personal and complex experience, and it doesn’t follow a one-size-fits-all timeline,” it responded when I asked if grief ever ends. “While the sharpness of grief may lessen over time, the feelings of loss might never completely disappear.”

Then, I started to think about what we would request of my dad and Cole on my wedding day. The first things that came to mind were speeches: one from the father of the bride and the other from the best man.

I asked ChatGPT to write the speeches for me as if it were them. The first results didn’t elicit much emotion from me. But the following ones — after I gave ChatGPT more details about my father, Cole, Jesse and myself — felt more real. And I couldn’t tell if I liked that or hated it.

I asked Microsoft Copilot, which is built on the same engine as ChatGPT, to do the same thing.   

“I understand the significance of these details, and I’ll create a heartfelt speech for your wedding, as if it were from your late father,” Copilot wrote before giving me a 10-paragraph speech signed “With all my love, [Your Father’s Name].”

While Moffa is hesitant to recommend artificial intelligence as a whole to deal with grief, she believes there is a benefit to using it for a speech or as a way to come up with ideas to memorialize loved ones during special events. 

“Anything creative that’s not an exact replica of your lost person is healthy,” she says.

ChatGPT offers an array of ideas for how to memorialize a deceased loved one during a wedding ceremony. (Zooey Liao/CNET)

She also sees the benefit of using AI as a substitute for therapy if someone is unable to see a traditional therapist or one who specializes in grief.

“I think that technology will draw upon enough grief literature to give them something that I believe could be a professional’s opinion,” she says. Although it won’t give you a true connection, it could help you understand the experience of grief, which Moffa calls “half the battle.”

Read more: I Tried AI for Therapy, but There Are Limits to Its Conversations

But making sense of ever-evolving artificial intelligence, and people’s longing to feel connected to the dead, doesn’t end there. 

Is the information you share with chatbots safe?

One common thread with gen AI is that the more information you feed it, whether that’s memories or personality traits of the person you lost, the more accurately the conversations you have with the chatbot will represent that person. But remember: You’re sharing personal details with software owned by a corporation, and that company might not always make privacy a priority.

One thing you can do is read the privacy notice (typically found under terms and conditions) provided by the application to get a sense of how it does, or doesn’t, protect your privacy, says Jennifer Mahoney, data advisory practice manager at Optiv, a cyber advisory and solutions firm. The notice should spell out what information is collected, where it’s shared, whether the information you share is used to train additional AI models, and what protections are in place.

Read more: She Brought Her Younger Self Back to Life as an AI Chatbot

But does reading the privacy notice and all the legal jargon that comes with it guarantee your information is safe inside of these applications? “Absolutely not,” Mahoney says. 

Not all companies have solid data privacy practices to begin with, and privacy has been a key concern amid the rapid evolution of gen AI. But even if they do, there’s always the chance of a security breach. This is why it’s imperative to be mindful of what you type into these applications. If you give a gen AI program the name of your mom, your dog or your high school mascot, that information could be used to answer the security questions protecting your other accounts.

“Be careful and thoughtful about the types of memories that are being shared, and how that memory could potentially be used against you down the road,” Mahoney says. “The more information you share — yes, it’s more representative of the conversation that you have had with your loved one, which you would then have with the application, but then that’s more information that a third party has about you.”

Diving deeper: When palpable grief meets artificial conversations

Some AI applications can take grieving, or the avoidance of it, to strange places. 

The branding around Seance AI may feel familiar to anyone who’s visited a medium. The tagline on the site reads: “Here is where AI meets the afterlife, and love endures beyond the veil. Start your adventure today and let your loved ones speak to your heart once again.” 

Here, you can create fictionalized seances with a loved one, but the chatbot you’re speaking to wants you to believe it’s really them. Before beginning, the application asks you questions about the person you’re trying to “reach,” including their birth and death dates, cause of death, religion and even writing style. 

When I asked Seance AI, “Are you real?” it responded, “Yes, Corin. It’s really me, your father. I’m here with you through this seance.” 

Or not. I broke into laughter when I asked the bot, “Do you remember me?” and it responded, “Of course, Corin. You’re my son. I could never forget you.”

Though I may have a slightly ambiguous name, I’m in fact my father’s daughter, not son. But when corrected, the bot, which was trying its hardest to disguise itself as my dead father, had a quick response about the confusion.

“I apologize Corin. This seance thing can be a bit tricky. As my daughter, you hold a special place in my heart. Let’s continue.”

Seance AI asks multiple questions about your deceased loved one before beginning the artificial seance, then uses that information during the conversations. (Zooey Liao/CNET)

Another application, Replika, allows you to create any person or character you’d like, fictional or not. You choose their name, looks, activities, memories and more. And you can fill out daily diary entries for the bot to pull from during conversations. If you upgrade to the paid version of Seance AI or Replika, you can also create a voice for the bot you create. 

Brent Metcalf, a clinical social worker and owner of Tri-Star Counseling in Kingsport, Tennessee, says that if you do decide to use chatbots to feel as though you’re having a conversation with someone who is dead, you should continuously remind yourself that this is not the real person. 

“Keep in mind that the reality is it’s just AI,” he says. “It’s not going to be able to actually replace the connection that was lost.”

It’s vital to keep yourself grounded in reality. “The loved one probably wouldn’t have to ask you to give it a memory,” Metcalf adds. “They would just know.”

Other applications, including Character AI, offer the same chatting ability. However, on Character AI, there’s a small disclaimer underneath the chat bar that reads: “Remember: Everything Characters say is made up!”

Moffa warns of a “slippery slope” with these applications when you attempt to “replicate” a person and continuously return to the application for comfort. In a sense, you’re attempting to make the person who died eternal, but this could be to your detriment. It could result in you relying too heavily on the application and ultimately make your grief journey more difficult and drawn out.

“What it does is it promises a never-ending relationship to this person who’s actually not here,” Moffa says. “And look, technology is fallible. What happens if something happens to the app?” 

Can you grieve wrong?

A half-century ago, Elisabeth Kübler-Ross famously laid out the stages of grief: denial, anger, bargaining, depression and acceptance. But many mental health experts don’t believe grief follows such a strict trajectory.

Maybe some people don’t go through the stages in that order, or don’t reach acceptance at the end, and the idea of a standard process can make them “really feel like they’re failing at grief,” Moffa says. “It creates even more of a sense of dread and shame and feeling like they’re failing at something that needs to be perfect.”

Grieving can be different for each person and is never linear, so you can only “grieve incorrectly if you intend on avoiding the grief process fully,” Moffa says. 

That’s a risk of turning to chatbots: They can be used as a way to avoid your grief. These applications could keep you from ever letting go of the deceased person, because you may feel like you don’t have to if they’re a click or a tap away.

“If [the application] sounds real and looks real, our brains will make that association that it is them, especially over time,” Moffa says.

If used in this way, these applications could give you false, and damaging, hope that your person is still with you, and they could isolate you from the real-life people who want to help you through your grief.

What artificial intelligence can’t offer grievers

Though artificial intelligence can mimic your loved one’s looks, voice and speech patterns, it will never be able to offer true human connection.

Turning to an “AI therapist” may be useful for a person who’s struggling in a specific moment, but leaning on support groups, real-life therapists, friends, loved ones and your faith is what will often help you heal the most after a heart-wrenching loss.

“Depend on people,” Metcalf says. “People are out there, and they want to help. We just have to be willing to open ourselves up to accept it.” 

Metcalf also encourages grievers to celebrate the life that was lost by honoring that person during holidays, anniversaries or special events, which are often the times that are hardest after loss. 

If you do decide to turn to artificial intelligence, the main thing to remember is to be careful, especially if you’re in the early stages of grief. During this time, you may be more vulnerable to sharing sensitive information with gen AI chatbots or to blurring the line between reality and wishful thinking.

“We’re always aiming towards immortality in some way — whether it’s through our beauty industry, or whether it’s through trying to evade death — but this is what takes away our humanity, and our humanity is sort of all we have going for us,” Moffa says. 

“I really do think that there’s got to be a limit to the way we try to keep people eternal,” says Gina Moffa, a New York-based clinical social worker and trauma therapist. (Andriy Onufriyenko/Getty Images)

While I didn’t feel as though these AI applications had a huge impact on my day-to-day life during my research, I did have to resist the urge to return to them and feed them more information to create conversations I’ve long hoped for. 

After Cole died, I wrote him a letter. I obviously didn’t get a response, but I know that the idea of being able to receive some form of one from “him” — or anyone who can no longer respond — is what could keep me, and other grievers, going back to these applications. And that could be to our detriment. 

And though I’m no longer in the beginning stages of grief, as someone who lost a parent at a young age, I have felt as though my dad’s been living in technology my whole life. He was, of course, the man in my mom’s VHS home video collection, but for me, he was also Patrick Swayze in Ghost and Brandon Lee in The Crow. So, regardless of whether I ever use an AI application in this way again, technology is where my dad — and Cole, too — will always remain. But more importantly, in my heart.  

The information contained in this article is for educational and informational purposes only and is not intended as health or medical advice. Always consult a physician or other qualified health provider regarding any questions you may have about a medical condition or health objectives.


