But little do they know that the dead are there.
Her guests are in the depths of the grave.
Proverbs 9:18 NLT
More attempts by Techno-Spiritualism to open the door to a more sophisticated form of Necromancy.
"A paper appearing in Topoi by Dr. Regina Fabry and Associate Professor Mark Alfano, from Macquarie University's Department of Philosophy, explores the impact "deathbots" might have on the way grief is experienced and the ethical implications.A deathbot is a chatbot that imitates the conversational behavior—its content, vocabulary and style—of a person who has died.
Based on generative AI systems that depend on a large collection of human-generated information, deathbots draw on text messages, voice messages, emails and social media posts to mimic the speech or writing of a deceased person.
The most common form of deathbot is based on text. However, deathbots with verbal inputs and audio outputs are becoming more common. They draw on "digital remains," generating responses to prompts entered by a human which can resemble the conversational responses the now-deceased person would have given.
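As a rough illustration of the mechanism described above, the sketch below shows how a text-based deathbot might condition a generative model on a handful of "digital remains." The sample messages, the prompt wording and the call_language_model stub are illustrative assumptions, not details from the paper or from any real deathbot product.

```python
# Minimal sketch of a text-based deathbot pipeline: collect "digital remains",
# build a style-imitation prompt, and hand it to a generative model.
# `call_language_model` is a hypothetical stand-in for whatever generative AI
# backend would be used; it is stubbed here so the script runs end to end.

from typing import List


def build_style_prompt(remains: List[str], user_message: str, name: str) -> str:
    """Assemble a prompt asking the model to imitate the deceased person's
    vocabulary and style, using their past messages as examples."""
    examples = "\n".join(f"- {m}" for m in remains)
    return (
        f"The following are messages written by {name}:\n{examples}\n\n"
        f"Reply to the message below in {name}'s style:\n{user_message}"
    )


def call_language_model(prompt: str) -> str:
    # Hypothetical stand-in for a call to a generative AI system.
    return f"[model output conditioned on a prompt of {len(prompt)} characters]"


if __name__ == "__main__":
    digital_remains = [
        "Don't forget to water the tomatoes, love you.",
        "Saw a kingfisher by the river today, made me think of you.",
    ]
    prompt = build_style_prompt(digital_remains, "I miss our walks.", "Alex")
    print(call_language_model(prompt))
```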
The paper, titled "The Affective Scaffolding of Grief in the Digital Age: The Case of Deathbots," examines the potential impact of human-deathbot interactions on the grief process. The researchers suggest that interactions with a deathbot might allow the bereaved to continue 'habits of intimacy' such as conversing, emotional regulation and spending time together.
Some researchers have pointed out, says Dr. Fabry, that the bereaved might face an autonomy problem and come to rely too heavily on a deathbot in their attempts to navigate and negotiate a world irrevocably altered by the death of a loved one.
There has been discussion, too, about whether human-deathbot interactions could see an irreversibly lost human relationship replaced by a digitally mediated relationship with an AI system, leading to self-deception or even delusion.
"To prevent the occurrence of this problem, we recommend the implementation of 'automated guardrails' to detect whether a bereaved person becomes overly dependent on their interactions with a deathbot," says Dr. Fabry.
"Furthermore, we recommend that interactions with a deathbot should be supervised by a grief counselor or therapist."
"A paper appearing in Topoi by Dr. Regina Fabry and Associate Professor Mark Alfano, from Macquarie University's Department of Philosophy, explores the impact "deathbots" might have on the way grief is experienced and the ethical implications.A deathbot is a chatbot that imitates the conversational behavior—its content, vocabulary and style—of a person who has died.
Based on generative AI systems that depend on a large collection of human-generated information, deathbots draw on text messages, voice messages, emails and social media posts to mimic the speech or writing of a deceased person.
The most common form of deathbot is based on text. However, deathbots with verbal inputs and audio outputs are becoming more common. They draw on "digital remains," generating responses to prompts entered by a human which can resemble the conversational responses the now-deceased person would have given.
The paper from Dr. Regina Fabry and Associate Professor Mark Alfano, titled "The Affective Scaffolding of Grief in the Digital Age: The Case of Deathbots," examines the potential impact of human-deathbot interactions on the grief process."Researchers suggest that interactions with a deathbot might allow the bereaved to continue 'habits of intimacy' such as conversing, emotional regulation and spending time together."
Some researchers have pointed out, says Dr. Fabry, the bereaved might face an autonomy problem and come to rely too much on a deathbot in their attempts to navigate and negotiate a world irrevocably altered by the death of a loved one.
There has been discussion, too, about whether human-deathbot interactions could see an irreversibly lost human relationship replaced by a digitally mediated relationship with an AI system, leading to self-deception or even delusion.
"To prevent the occurrence of this problem, we recommend the implementation of 'automated guardrails' to detect whether a bereaved person becomes overly dependent on their interactions with a deathbot," says Dr. Fabry.
"Furthermore, we recommend that interactions with a deathbot should be supervised by a grief counselor or therapist."
MS