• Ilandar@lemmy.today
    6 hours ago

    I don’t understand why so many people default to “wouldn’t happen to me, that person was just stupid” every time this happens. Did you guys not read the bit where the chatbot was encouraging him to commit violence in public? If it’s getting to that point then there is clearly a massive fucking problem that needs urgent addressing, regardless of the intelligence of the user.

    • notacat@infosec.pub
      6 hours ago

      I think it’s similar to cults or abusive relationships. It’s not a matter of intellect, it’s how vulnerable a person is when they encounter this thing that they think could help them.

      • Ilandar@lemmy.today
        5 hours ago

        I agree. The common thread in all of these cases is that they involve relationships. Humans are social animals who can suffer from loneliness, and AI companies are exploiting this the same way cults and abusers do. Loneliness runs through all of these AI psychosis suicide cases.