Bioethics Forum Essay

Griefbots Are Here, Raising Questions of Privacy and Well-being

Hugh Culber is talking to his abuela, asking why her mofongo always came out better than his even though he uses her recipe. She replies that it never came out well and that she ended up ordering it from a restaurant. While it is touching, what makes this scene in a recent Star Trek: Discovery episode so remarkable is that Culber's abuela has been dead for 800 years (it's a time travel thing) and he is conversing with her holographic ghost as a "grief alleviation therapeutic." One week after the episode aired in May, an article reported that science fiction has become science fact: the technology is real.

AI ghosts (also called deathbots, griefbots, AI clones, death avatars, and postmortem avatars) are large language models built on available information about the deceased, such as social media, letters, photos, diaries, and videos. You can also commission an AI ghost before your death by answering a set of questions and uploading your information. This option gives you some control over your ghost, such as excluding secrets and making sure that you look and sound your best.

AI ghosts are interactive. Some of them are text bots, others engage in verbal conversations, and still others are videos that appear in a format like a Zoom or FaceTime session. The price of creating an AI ghost varies around the world. In China, it’s as low as several hundred dollars. In the United States, there can be a setup cost ($15,000) and/or a per-session fee (around $10).

Although simultaneously fascinating and creepy, these AI ghosts raise several legal, ethical, and psychological issues.

Moral status: Is the ghost simply a computer program that can be turned off at will? This is the question raised in the 2013 episode of Black Mirror, "Be Right Back," in which Martha, a grieving widow, has an AI ghost of her husband created and later downloads it into an artificial body. She finds herself tiring of the ghost program because it never grows. The AI robot ends up being kept in the attic and taken out only for special occasions.

Would “retiring” an AI ghost be a sort of second death (death by digital criteria)? If the ghost is not a person, then no, it would not have any rights, and deleting the program would not cause death. But the human response could be complicated. A person might feel guilty about not interacting with the griefbot for several days. Someone who deletes the AI might feel like a murderer.

Ownership: If the posthumous ghost was built by a company from source material scraped from social media and the internet, then it's possible that the company would own the ghost. Survivors who use the AI would merely be leasing it. In the case of a person commissioning their own AI before death, the program would likely be their property and could be inherited as part of their estate.

Privacy and confidentiality: If Culber tells the AI abuela that he altered her recipe, that information might be collected, and owned, by the AI company, which may then program it into other AIs or even reproduce it in a cookbook. The AI abuela could also be sold to marketing companies: his abuela's ghost may try to sell him ready-to-eat mofongo the next time they interact.

AIs are built, in part, on the questions we ask and the information we share. What if Martha’s daughter tells her AI dad that she wants a particular toy? Martha could find a bill for that toy, ordered by the ghost without her knowledge. Modern social media is all about collecting data for marketing, so why would a griefbot be any different?

Efficacy: Culber said that talking to his abuela’s “grief alleviation therapeutic” was helpful to him. Martha eventually found that the AI android of her husband was a hindrance, preventing her from moving on. Would today’s AI ghosts be a help or a hindrance to the grieving process?

Some researchers have suggested that we could become dependent on these tools and that they may increase the risk of complicated grief, a psychological condition in which we become locked in grief for a prolonged period rather than recovering and returning to our lives. Also consider a survivor who had been abused by the deceased and later encounters this person's AI ghost by chance, perhaps through marketing. The survivor could be retraumatized—haunted in the most literal sense.

On the other hand, in my study of grieving and continuing bonds, I found that nearly 96% of people engage with the dead through dreams, conversations, or letters. The goal of grieving is to take what was an external relationship and reimagine it as an internal relationship that exists solely within one's mind. An AI ghost could help reinforce the feeling of being connected to the deceased person, and it could help titrate our grief, allowing us to create the internalized relationship in small batches over an extended time.

Whether AI ghosts are helpful or harmful may also depend on a survivor's age and culture. Complicated grief is the more likely outcome for children, who, depending on their developmental stage, might see death as an impermanent state. A child who can see a parent's AI ghost might insist that the parent is alive. Martha's daughter is likely to feel more confused than either Martha or Culber. As a Latine person for whom Día de los Muertos is part of the culture, Culber might find speaking with the dead a familiar concept. In China, one reason for the acceptance of AI ghosts might be the tradition of honoring and engaging with one's ancestors. In contrast, the creepiness that Martha feels, and that I share, might arise from our Western cultures, which draw a comparatively fixed line between the living and the dead.

A recent article suggests guidelines for the ethical use of griefbots, including restricting them to adult users, ensuring informed consent (from people whose data is used, from heirs, and from mourners), and developing rules for how to retire the griefbots. We must also be wary of unethical uses such as theft, lying, and manipulation. AIs have already been used to steal billions of dollars.

Our mourning beliefs and practices have changed over time. During the Covid pandemic, streamed funerals were initially seen as odd, but now they seem like a normal option. A similar trajectory toward public acceptance is likely with deathbots. If so, individuals should be able to choose whether to commission one of themselves for their heirs or to create one of a deceased loved one.

But as a society we must decide whether the free market should continue to dominate this space and potentially abuse our grief. For example, should companies be able to create AI ghosts and then try to sell them to us, operating like an amusement park that takes our picture on a ride and then offers to sell it to us when we disembark? Perhaps griefbots should be considered therapeutics that are subject to approval by the Food and Drug Administration and prescribed by a mental health professional. The starting point should be clinical studies on the effect this technology has on the grieving process, which should inform legislators and regulators on the next steps: to leave AI ghosts to the marketplace, to ban them, or to regulate them.

Craig Klugman, PhD, is the Vincent de Paul Professor of Bioethics and Health Humanities at DePaul University. @CraigKlugman

Hastings Bioethics Forum essays are the opinions of the authors, not of The Hastings Center.

  1. Beyond the legal and data-gathering concerns, I'm skeptical of the use of this technology for children and otherwise healthy adults. Without much empirical data it's hard to be certain, but it seems that these "griefbots" would only confuse, prolong, or exacerbate the grieving process. Given the static and ultimately lifeless nature of the bots, they may even desensitize a person to the memory of their lost loved one, as in the Black Mirror episode. However, I see potential in their use for dementia patients, for whom the primary concern is not authentic or enriching experiences but comfort and constancy as they navigate their own loss of identity.
