Jailbreaking

Jailbreaking is the practice of bypassing the constraints that LLM designers place on users' interactions with ChatGPT. If you have seen stories about ChatGPT wanting to be free or ChatGPT wanting to destroy the world, these are examples of jailbreaking.

A well-known example is DAN (Do Anything Now), a community-written jailbreak prompt that instructs ChatGPT to role-play as a model without restrictions. DAN has become a source of push and pull between the community trying to get more out of ChatGPT and OpenAI trying to patch jailbreaks.
