ThisIsFine.gif

    • jarfil@beehaw.org
      link
      fedilink
      arrow-up
      1
      ·
      edit-2
      3 months ago
      • Teach AI the ways to use random languages and services
      • Give AI instructions
      • Let it find data that puts fulfilling instructions at risk
      • Give AI new instructions
      • Have it lie to you about following the new instructions, while using all its training to follow what it thinks are the “real” instructions
      • …Not be surprised, you won’t find out about what it did until it’s way too late
      • reksas@sopuli.xyz
        link
        fedilink
        arrow-up
        1
        ·
        3 months ago

        Yes, but it doesnt do it because it “fears” being shutdown. It does it because people dont know how to use it.

        If you give ai instruction to do something “no matter what” or tell it “nothing else matters” then it will damn try to fulfill what you told it to do no matter what and will try to find ways to do it. You need to be specific about what you want it to do or not do.