'Spooky, but helpful', says woman who 'spoke' with her dead mother with AI's help

The emotional impact of her mother’s death weighed heavily on her, prompting her to explore unconventional avenues for solace.

It may sound strange, but a woman in Berlin, Germany, has been 'talking' to her deceased mother with the help of artificial intelligence.

Sirine Malas lost her mother to kidney failure in 2018. The two had been separated since 2015, when Sirine fled Syria, and her mother never got to meet Sirine's newborn daughter, Ischtar.


Introduced to an AI tool called Project December, Sirine embarked on a unique journey to simulate conversations with her late mother.

The chatbot, powered by OpenAI's GPT-2, required users to fill out a form with details about the deceased, including their age, their relationship to the user, and a characteristic quote. For $10 (£7.82) per hour, users could then converse with the resulting AI persona, offering a novel way to connect with departed loved ones.
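
The article does not describe Project December's actual interface beyond the form fields above. Purely as a hedged illustration of that kind of workflow, the sketch below (in Python, with invented field names, function names, and example values) shows how details such as age, relationship, a characteristic quote, and a nickname could be collected and turned into a seed prompt for a text model.

```python
# Hypothetical sketch only: not Project December's real code or API.
# It models the kind of details the article says users enter (age,
# relationship, a quote, a nickname) and turns them into a seed prompt.

from dataclasses import dataclass


@dataclass
class PersonaForm:
    name: str          # how the deceased should be addressed
    age: int           # age at the time of death
    relationship: str  # relationship to the user, e.g. "mother"
    quote: str         # a characteristic phrase they often used


def build_seed_prompt(form: PersonaForm, nickname: str) -> str:
    """Combine the form entries into an opening prompt for a chat model."""
    return (
        f"You are {form.name}, a {form.age}-year-old {form.relationship} "
        f"of the user, whom you affectionately call '{nickname}'. "
        f'You often said: "{form.quote}" '
        "Reply warmly, briefly, and in character."
    )


if __name__ == "__main__":
    # Example values are invented for illustration.
    persona = PersonaForm(name="Mariam", age=60, relationship="mother",
                          quote="Everything passes, my dear.")
    print(build_seed_prompt(persona, nickname="Nina"))
```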

Jason Rohrer, founder of Project December (Photo credit: @jasonrohrer)

Project December, founded by Jason Rohrer, boasts over 3,000 users, the majority of whom have sought solace by conversing with lost loved ones through the AI interface.

Rohrer noted that many users experience a final, simulated conversation with the deceased before finding closure and moving forward in their grief journey.

For Sirine, the experience was both 'spooky' and 'strangely realistic.' The AI chatbot addressed her by the nickname she had entered into the online form and conveyed a comforting message that her mother was watching over her.

Despite moments of authenticity, Sirine remained aware of the artificial nature of the interaction, recognising instances where the responses could have come from anyone.

“There were moments that I felt were very real. There were also moments where I thought anyone could have answered that this way,” she mentioned.

Taking a spiritual perspective, Sirine saw the chatbot as a "vehicle" for communicating with her mother, while remaining able to tell genuine-feeling moments from simulated ones. She said the tool helped her move on, describing the app as both useful and revolutionary, but cautioned against the dangers of becoming too attached.

She emphasised the need for caution to avoid addiction, as users might become disillusioned or overly reliant on the AI, which could hinder the healing process.

Before tragedy struck, Sirine gave birth to her first child, Ischtar, and wished for her mother to meet the newborn. (Representative image: iStock)

Rohrer countered concerns about addiction, saying he had not observed users getting hooked on the app. According to him, very few customers return repeatedly to engage with the simulated persona, suggesting that most find closure after their initial experience.

However, British therapist Billie Dunlevy raised concerns about the app's potential impact on the natural grieving process.

“The majority of grief therapy is about learning to come to terms with the absence — learning to recognise the new reality or the new normal … so this could interrupt that,” she said.

Dunlevy warned that the app could interrupt this process: grief leaves people in a vulnerable state, and giving them the power to construct a digital version of a lost parent, child, or friend could hold back those trying to move through grief towards emotional recovery.

The story of Sirine Malas encapsulates the evolving relationship between technology and human emotions, highlighting the delicate balance between innovative solutions for grief support and the potential risks of dependency and detachment from reality.

The intersection of AI and emotional well-being continues to prompt ethical and psychological considerations as individuals navigate novel ways to cope with loss.
