Nemeski@lemm.ee to Technology@lemmy.world · English · 4 months ago
OpenAI’s latest model will block the ‘ignore all previous instructions’ loophole (www.theverge.com)
felixwhynot@lemmy.world · 4 months ago
Did they really? Do you mean specifically that phrase, or are you saying it’s not currently possible to jailbreak ChatGPT?
Grimy@lemmy.world · 4 months ago (edited)
They usually take care of a jailbreak within a week of it being made public. This one is more than a year old at this point.