shish_mish@lemmy.world to Technology@lemmy.world · English · 8 months ago
Researchers jailbreak AI chatbots with ASCII art -- ArtPrompt bypasses safety measures to unlock malicious queries (www.tomshardware.com)
2 comments
shish_mish@lemmy.world to Lemmy Shitpost@lemmy.world · English · 11 months ago
puzzling (image, lemmy.world)
0 comments