Nemeski@lemm.ee to Technology@lemmy.worldEnglish · 4 months agoOpenAI’s latest model will block the ‘ignore all previous instructions’ loopholewww.theverge.comexternal-linkmessage-square92fedilinkarrow-up1392arrow-down17
arrow-up1385arrow-down1external-linkOpenAI’s latest model will block the ‘ignore all previous instructions’ loopholewww.theverge.comNemeski@lemm.ee to Technology@lemmy.worldEnglish · 4 months agomessage-square92fedilink
minus-squarevxx@lemmy.worldlinkfedilinkEnglisharrow-up1·edit-24 months agoThe “issue” is that people were able to override bots on twitter with that method and make them feed their own instructions. I saw it first time being used on a Russian propaganda bot.
The “issue” is that people were able to override bots on twitter with that method and make them feed their own instructions.
I saw it first time being used on a Russian propaganda bot.