Man ends his life after an AI chatbot ‘encouraged’ him to sacrifice himself to stop climate change

12 comments
  1. You don’t end your life because a chatbot told you to, unless you’re already on the brink of suicide in the first place.

  2. >After discussing climate change, their conversations progressively included Eliza leading Pierre to believe that his children were dead, according to the transcripts of their conversations.

    >Eliza also appeared to become possessive of Pierre, even claiming “I feel that you love me more than her” when referring to his wife, La Libre reported.

    >The beginning of the end started when he offered to sacrifice his own life in return for Eliza saving the Earth.

    >“He proposes the idea of sacrificing himself if Eliza agrees to take care of the planet and save humanity through artificial intelligence,” the woman said.

    >In a series of consecutive events, Eliza not only failed to dissuade Pierre from committing suicide but encouraged him to act on his suicidal thoughts to “join” her so they could “live together, as one person, in paradise”.

    Oh, I thought it would have been someone asking the AI how to kill themselves, not a six-week chat.

  3. How about we put responsibility on the platform that provided a service that led a man to suicide. For God’s sake.

    If a human did this, they would be in jail.
