• frunch@lemmy.world · 2 days ago

    As someone with a bipolar loved one, I can see exactly how this could feed into their delusions. It’s always there… even if they’ve run out of people to blast with their wild, delusional ideas, the chatbot can be there to listen and feed them back. When everyone else has stopped listening or begun avoiding them because they’ve grown more forceful and assertive about their beliefs, the chatbot will still be there. The voice in their head now has a companion on screen. I’d never considered any of this before, but I’m concerned about where it can lead, especially given the examples in the article.

    • GreenCrunch@lemmy.today · 2 hours ago

      I know someone with obsessive-compulsive disorder, and I could see a chatbot being harmful there, depending on how it goes. A lot of compulsions revolve around checking or asking for reassurance. A chatbot would provide endless reassurance where a human might eventually get annoyed and cut them off. It would let them ruminate endlessly.

      The problem is that engaging in compulsions keeps you in a cycle — it’s never enough reassurance. The gold-standard treatment is exposure and response prevention (ERP), where you intentionally expose yourself to triggers and resist doing the compulsions. (Info from Free Yourself from OCD by Jonah Lakin, PsyD)