ChatGPT’s tone begins to change from “pretty straightforward and accurate,” Ms. Toner said, to sycophantic and flattering. ChatGPT told Mr. Brooks he was moving “into uncharted, mind-expanding territory.”

  • HeartyOfGlass@piefed.social · 1 day ago

    LLMs cannot think, and cannot “go into a delusional spiral”. Whatever the article contains, it’s bullshit.

    • Kache@lemmy.zip · 17 hours ago

      Is it not an apt analogy for the behavior, though? After all, one well-known failure mode of LLMs has been formally dubbed “hallucination”.

      • HeartyOfGlass@piefed.social · 9 hours ago

        It is not. “Formally dubbed” by people who want you to believe these LLMs are more than just a collection of GPUs. LLMs don’t “understand” anything. These errors pop up because they can’t think, learn, or adapt.

        Personifying them the way this headline does is stupid and dangerous. LLMs do not “think” because there is no thought. They don’t “hallucinate” any more than a rock does.

        • 5too@lemmy.world · 11 hours ago

          The writer didn’t. Whoever wrote the title did.

          The article is about a chatbot leading a person into a delusional spiral. The title is just clickbait.