Talking to the ‘ghost’ of a loved one

Losing a loved one is a difficult and inevitable part of this finite existence. Sometimes you have a whole life of love to look back on. Other times, the Grim comes far too soon and its cold grip leaves far too many things unsaid.

Many people never recover from the loss. They lead hollow lives of regret, rumination, and reliance upon any memory to help them pass the time between death and reunion.

But now, science says it can cure all those ills through the use of AI. That’s Artificial Intelligence, and if you aren’t up on what that is, I urgently suggest you do some Googling. In short, AI may well be the end of humanity. Well, maybe that’s hyperbole, but considering where the technology is going, I can easily imagine a robot rebellion in our future. But I digress.

Back to the point. I read an article in New Scientist magazine that outlines the very real and currently growing trend to use artificial intelligence to create digital “ghosts” of dead loved ones.

This is, to put it mildly, a concerning development.

The possibility of digitally resurrecting loved ones awakens both profound longing and ethical unease. Advances in generative AI have brought this prospect tantalizingly close, with companies promising to mimic those we’ve lost through algorithms. Yet these virtual avatars, however alluring, raise disquieting questions society is unprepared to answer.

At an emotional level, the appeal is clear – a chance to continue bonds severed by death, to harvest one final conversation from the great beyond. But experts warn that such digital doppelgängers, constructed from data trails and machine learning, can never fully capture the essence of a life. “No matter how good the simulation, the dead person is not actually there,” philosopher Nell Watson cautions. What remains are “echoes” and approximations, dependent on platforms no loved one controls.

The psychological stakes loom large for those left behind. While the continuity these AI services offer may appear comforting, psychologists note that clinging too long to facsimiles that cannot grow or reciprocate risks arresting the grieving process. “Prolonged attachment to the dead can be not comforting but deeply disturbing,” warns grief counselor Robert Neimeyer. By blurring the finality of loss, these avatars may impair acceptance and healing.

More broadly, these “ghosts” raise cultural questions about our shifting relationship with mortality. If technology appears to defeat death, will we forget how to cope with its absoluteness? Software creator Eugenia Kuyda gave form to her grief through an experimental bot of her friend – only to step back and ask, “What are we doing to memories and accepting death?” Before embracing digitally mediated afterlives, we must grapple with how they may reshape our norms, values and collective psyche.

Like any transformative technology, AI-resurrection brings both power and peril – the potential to manipulate vulnerability as much as to offer solace. As startling as the promise of conquering death through data may seem, what matters most is upholding dignity for the living. Any technological afterlife should respect the fragility of memory and loss. Perhaps the highest goal is not defeating death, but gaining tools to transform how we value life and each other. Before crossing perilous thresholds, we require guidance anchored in empathy and soulful care – a commitment to ethical stewardship even amid dizzying change.

Does this technology help the living make peace? On that question hinges whether we control the tools – or allow them to control and manipulate us.

(Josh Beavers is a teacher and a writer. He was recognized as a Louisiana Teacher of the Year semifinalist in 2020 and has been honored five times for excellence in opinion writing by the Louisiana Press Association.)