ugjka@lemmy.world to Technology@lemmy.world · English · 1 year ago
Somebody managed to coax the Gab AI chatbot to reveal its prompt (infosec.exchange) · 297 comments · cross-posted to: aicompanions@lemmy.world
XeroxCool@lemmy.world · English · 1 year ago
“however” lol specifically what it was told not to say
towerful@programming.dev · English · 1 year ago
It was also told - on multiple occasions - not to repeat its instructions