LMTWT is an AI security testing framework for evaluating LLM prompt injection vulnerabilities.
Updated Apr 13, 2025 · Python