Sunday, 27 July 2025

Difference between prompt injections and jail breaking



Although often confused, prompt injections and jailbreaking are distinct methods. Prompt injections involve cleverly crafting seemingly harmless inputs to conceal malicious commands, whereas jailbreaking involves bypassing an LLM's built-in security measures.



Is Jail breaking and prompt injections similar? 


Prompt injections and jailbreaking might sound similar, but they're actually different. One tricks the system with sneaky inputs, while the other breaks free from the rules altogether.

No comments:

Answer of How do you design a Retrieval-Augmented Generation system to minimize hallucinations and handle conflicting information?

*Designing a RAG system that stays factual + handles conflicts* RAG reduces hallucinations by grounding the LLM in retrieved docs. But garba...