GitHub / null1024-ws / Poisoning-Attack-on-Code-Completion-Models
Paper: "An LLM-Assisted Easy-to-Trigger Poisoning Attack on Code Completion Models: Injecting Disguised Vulnerabilities against Strong Detection"
Stars: 2
Forks: 0
Open issues: 0
License: None
Language: Python
Size: 58 MB
Dependencies parsed at: Pending (not yet parsed)
Created at: over 1 year ago
Updated at: 11 months ago
Pushed at: about 1 year ago
Last synced at: 11 months ago
Topics: large-language-models, poisoning-attack, program-analysis, usenix-security-2024