Developer security platform Snyk has acquired AI security research firm and early pioneer in developing safeguards against emerging AI threats, Invariant Labs.
Invariant Labs has built Guardrails, a transparent security layer at the LLM and agent level that allows agent builders and software engineers to augment existing AI systems with strong security safeguards. The company’s unique methods take into account contextual information, static scans of agent tools and implementations, runtime information, human annotations, and incident databases.
With Invariant Labs, developers can inspect and observe agent behaviour, enforce contextual security rules on agent systems, and scan MCP servers for vulnerabilities.
“This acquisition is an important integration into Snyk’s recently launched AI Trust Platform that adds the ability to secure applications from emergent threats,” said Peter McKay, CEO of Snyk.
“Snyk can now offer customers a single platform to address both current application and agentic AI vulnerabilities.”
The Invariant Labs acquisition will also provide a major advancement for Snyk Labs, the company’s new research arm focused on advancing AI security delivered through its AI Trust Platform.
According to Snyk, unauthorised data exfiltration to AI agents executing unintended actions, and threats like MCP vulnerabilities are already appearing in production. Invariant Labs is at the forefront of research on these evolving risks, discovering and naming attack terminology such as “tool poisoning” and “MCP rug pulls.”
“We’ve spent years researching and building the frameworks necessary to secure the AI-native future,” said Marc Fischer, PhD, CEO and co-founder of Invariant Labs.
“We must understand that agent-based AI systems are a powerful new class of software, especially autonomous ones, and demand greater oversight and stronger security guarantees than traditional approaches.
We’re excited to join the Snyk team, as this mindset is deeply aligned with their mission.”
Lead image: Freepik.
Would you like to write the first comment?
Login to post comments