Re;Memory — A New AI Program Makes Talking To The Dead Come Alive
There are many frontiers being crossed by AI lately, sparking debate and anxiety. But now, we're entering strange, new territory: an algorithm that lets bereaved family members communicate with deceased loved ones in the most realistic of ways. Yet it comes with very real and complicated risks.
TURIN — Generative artificial intelligence is said to be a threat to jobs across a variety of creative fields. Are professional psychics next? Communing with the dead, real or imagined, is an experience in which the digital world may now be ready to outflank its human competition.
The technical term for these algorithms is "deadbots," which offer a sort of ephemeral evocation of the spirit of a deceased person. You don't have to look far to find them — even the usual suspect, ChatGPT, can light the path to the dead and establish a mutual, tangible dialogue between you and the dearly departed.
Yet the most realistic of these chatbot models is the consolatory Re;Memory. This ectoplasmic recreation, designed by South Korean company DeepBrain, comes almost as a natural evolution of the spiritual séances to which we're accustomed.
Your ghost, in video chat!
In a promotional video for the chatbot we meet Mr. Lee, who is — or rather, was — an elegant and composed Korean gentleman. It's conceivable that by the time he agreed to be the spokesperson for the service, he already knew he was nearing the end of his life. Perhaps, being terminally ill, he decided to visit the studios of the company that would ensure his survival, albeit an ephemeral one. He wore a blue suit for the occasion, and maybe someone lovingly adjusted his tie knot.
Specially equipped operators then recorded a monologue in which he addressed his family, as if he had already passed away. During this intensely emotional session, which lasted seven hours, Mr. Lee, fully aware of his impending death, played his part in the immersive experience his loved ones would take part in after his passing.
The artificial intelligence recorded not only his facial expressions but also his gestures and voice. Re;Memory then created a video clone of Mr. Lee, with whom it would be possible to interact, after his death, as if it were a video call to the afterlife.
In the video, we see him once again seated in an armchair, smiling, surrounded by a dreamlike landscape with trees, sunsets, skies and clouds — recreations of his supposedly blissful afterlife. Both his daughter and widow engage in a dialogue with him, finding solace and reassurance.
He doesn't say anything transcendental; it's the same kind of phrases that have been uttered at séances since the mid-1800s, when ectoplasmic apparitions reassured bereaved families. "I missed you so much, I love you, I'm here waiting for you." The daughter seems speechless, while the widow bursts into tears and asks questions.
This is exactly like the evocations performed by palmists, mediums, or mentalists when they want to deceive a client into believing they are truly communicating with a deceased person. Still, like all AI chatbots, this algorithm continuously improves upon itself. The producers of this digital ghost assure us that the more video calls you have with your deceased loved one, the more the deadbot will learn, becoming more accurate and believable over time.
Money and consent
The business of communing with the dead is not a cheap one — the person nearing the end of their life must pay between $12,000 and $24,000 just to create the model. Each video call, once the person has passed away, will cost their relatives an additional $1,200.
We can already imagine the debate soon to be ignited over the possible extension of consent, even from those who agree to feed a "deadbot." A year ago, Sara Suarez-Gonzalo, a researcher at the Universitat Oberta de Catalunya, raised the ethical question of whether it is right to create artificial intelligence for generating conversations with a deceased person, which she discussed in an article for The Conversation. The researcher analyzed the legally controversial case of Joshua Barbeau, a 33-year-old man who, through a website called "Project December," created a bot built on GPT-3 that allowed him to converse with his girlfriend, Jessica, who had passed away from a rare disease.
The problem is that in just one year, the processing capacity of generative AI has grown enormously. Eventually, we will have to question the validity of informed consent: after all, the person being modeled may consent to the current version of the algorithm, but there is no post-mortem autonomy if that algorithm later evolves into something new.
Indeed, the question is not easily answered. Consider the scenario where your "ghost" is capable of evolving beyond your will, with the hypothetical possibility of becoming something different from what you were in life — a whole new person, who has grown from the seed of your seven-hour interview/set-up session.
Certainly, the issue of the "freedom of one's thoughts to evolve post-mortem" will create a new dilemma regarding the protection of any future digital manifestation of our consciousness. The ethical and legal implications will raise important questions about privacy, consent, and the boundaries of posthumous representation in the digital age — not to mention the existential anxieties it will stir.