While the two terms are often used synonymously, prompt injections and jailbreaking are different techniques. Prompt injections disguise malicious instructions as benign inputs, while jailbreaking makes an LLM ignore its safeguards.
System…
Article Source
https://www.ibm.com/think/topics/prompt-injection