LLM Security Guard for Code with LLMSecGuard

LLMs such as ChatGPT speed up software development, but recent studies have raised serious concerns about vulnerabilities in LLM-generated code.

What if hints from security analysis tools could guide LLMs to write secure code? The Secure Software Engineering (SSE) Group has led the development of LLMSecGuard, an open-source framework that facilitates this goal. The framework will be presented at EASE 2024, June 18-21, Italy. Preprint: arxiv.org/abs/2405.01103
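The core idea — feeding security-analyzer findings back to the LLM as hints for a revised attempt — can be sketched roughly as below. This is an illustrative sketch only: `generate_code` and `analyze` are hypothetical stubs standing in for an LLM call and a static analyzer, not LLMSecGuard's actual API.

```python
# Sketch of the hint-driven feedback loop: generate code with an LLM,
# check it with a security analyzer, and re-prompt with any findings.
# generate_code() and analyze() are hypothetical stand-ins.

def generate_code(prompt: str) -> str:
    """Hypothetical LLM call: returns code for the given prompt."""
    if "avoid" in prompt:                    # security hints present
        return 'subprocess.run(["ls", user_dir])'
    return "os.system('ls ' + user_dir)"     # naive, injectable first draft

def analyze(code: str) -> list[str]:
    """Hypothetical static analyzer: returns a list of security findings."""
    findings = []
    if "os.system" in code:
        findings.append("CWE-78: possible OS command injection via os.system")
    return findings

def secure_generate(task: str, max_rounds: int = 3) -> tuple[str, list[str]]:
    """Generate code, iteratively re-prompting with analyzer findings."""
    prompt = task
    code = generate_code(prompt)
    for _ in range(max_rounds):
        findings = analyze(code)
        if not findings:
            return code, []                  # analyzer is satisfied
        # Fold the findings into the next prompt as security hints.
        prompt = task + "\nPlease avoid these issues:\n" + "\n".join(findings)
        code = generate_code(prompt)
    return code, analyze(code)               # best effort after max_rounds

code, remaining = secure_generate("List the files in user_dir")
```

With the stubs above, the first draft triggers a CWE-78 finding, the finding is folded into the prompt, and the second attempt passes the analyzer.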