GPT Dan
Software / App
A 'jailbreak' method for ChatGPT that bypassed its trust and safety filters by instructing it to behave as an unrestricted AI; it has since been patched.
Mentioned in 1 video