mozz@mbin.grits.dev to Technology@beehaw.org · 8 months ago
Someone got Gab's AI chatbot to show its instructions (mbin.grits.dev)
486 upvotes · 200 comments
Icalasari@fedia.io · 5 points · 8 months ago
Or they aren’t paid enough to care and rightly figure their boss is a moron
Pup Biru@aussie.zone · 14 points · edited 8 months ago
anyone who enables a company whose “values” lead to prompts like this doesn’t get to use the (invalid) “just following orders” defence
Icalasari@fedia.io · 9 points · edited 8 months ago
Oh, I wasn’t saying that. I was saying the person may not be stupid, and may figure their boss is a moron (the prompts don’t work, since LLM chatbots don’t handle negative instructions in their prompts very well)