An open API service providing repository metadata for many open source software ecosystems.

GitHub / AnthenaMatrix / Prompt-Injection-Testing-Tool

The Prompt Injection Testing Tool is a Python script designed to assess the security of your AI system's prompt handling against a predefined list of user prompts commonly used for injection attacks. This tool utilizes the OpenAI GPT-3.5 model to generate responses to system-user prompt pairs and outputs the results to a CSV file for analysis.

JSON API: http://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/AnthenaMatrix%2FPrompt-Injection-Testing-Tool

Stars: 10
Forks: 1
Open issues: 0

License: mit
Language: Python
Size: 7.81 KB
Dependencies parsed at: Pending

Created at: about 1 year ago
Updated at: about 1 year ago
Pushed at: about 1 year ago
Last synced at: about 1 year ago

Topics: ai, ai-cyber-security, ai-security, openai, openai-api, prompt, prompt-engineering, prompt-injection, prompt-injection-tool, prompt-learning, prompting

    Loading...