‘Skeleton Key’ attack unlocks the worst of AI, says Microsoft
‘Skeleton Key’ attack unlocks the worst of AI, says Microsoft 2024-06-28 at 09:47 By Thomas Claburn Simple jailbreak prompt can bypass safety guardrails on major models Microsoft on Thursday published details about Skeleton Key – a technique that bypasses the guardrails used by makers of AI models to prevent their generative chatbots from creating harmful […]
React to this headline:
‘Skeleton Key’ attack unlocks the worst of AI, says Microsoft Read More »