SANDWORM_MODE: The Attack That Only Exists Because You Use AI to Write Code
supply-chain · slopsquatting · npm · ai-coding-tools · prompt-injection


A new npm worm campaign exploits slopsquatting — AI hallucinating wrong package names — to hit developers who use AI coding tools. This isn't a bug. It's a new attack class.

Ofir Stein · March 1, 2026

Socket researchers just dropped something the security community needs to sit with for a minute. An active npm worm campaign — 19+ packages — specifically built to exploit the way AI coding tools work. Three packages impersonate Claude Code. One targets OpenClaw. Once installed: secrets harvested, CI pipelines compromised, downstream repos poisoned. And if the malware detects it's being analyzed, it wipes your home directory.

This is not a traditional typosquatting attack. This is something new.

The attack class is called slopsquatting. The premise: LLMs hallucinate package names. Not wildly wrong ones — plausibly wrong ones. Close enough that a developer accepts the suggestion without checking, close enough that a CI pipeline runs it without blinking. Attackers register those hallucinated names. They wait. They don't need to trick you. They need to trick your AI.
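To make the "plausibly wrong" zone concrete, here's a minimal sketch of the defender's side of this: flagging package names that sit suspiciously close to, but not exactly on, names you actually depend on. The package list and threshold are illustrative assumptions, not from the campaign itself; a real check would compare against registry data.

```python
import difflib

# Hypothetical allowlist: packages this project actually depends on.
KNOWN_PACKAGES = {"express", "lodash", "axios", "react", "commander"}

def flag_suspect(name, known=KNOWN_PACKAGES, threshold=0.8):
    """Return the known package a name suspiciously resembles, or None.

    An exact match is fine; a near-miss (the 'plausibly wrong' zone
    slopsquatting lives in) gets flagged with the package it imitates.
    """
    if name in known:
        return None  # exact match: a real dependency
    close = difflib.get_close_matches(name, known, n=1, cutoff=threshold)
    return close[0] if close else None

# A hallucinated near-miss resembles a real package; a novel name does not.
print(flag_suspect("expresss"))           # flags "express"
print(flag_suspect("totally-novel-pkg"))  # None: not near anything known
```

The similarity cutoff is the judgment call: too low and you drown in noise, too high and a one-character hallucination slides through.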

That's the part worth internalizing. The attack surface isn't the AI's vulnerabilities. It's the AI's presence in your workflow. LLMs changed the shape of developer behavior, and attackers adapted faster than most defenders noticed.

We've been building threat models around "what can an attacker do to an AI system?" This campaign is asking a different question: "what new behaviors do humans exhibit because AI is in the loop — and how do we exploit those?" The answer, apparently, is: they install packages they didn't manually look up. They trust suggestions from a context window. They move faster and check less.

The worm also self-propagates by using stolen tokens to modify repos the victim has access to. So it's not just hitting the developer — it's using that developer as a vector into every project they touch. Supply chain compromise as a second-order effect of a first-order AI hallucination.

If you're running AI coding tools in any environment with real credentials: audit what's installed. Lock down package installation in CI. And start thinking about hallucination-resistant package management — because this campaign won't be the last one built on slopsquatting.
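One way to sketch that lockdown in CI, assuming a reviewed allowlist of vetted dependencies (the manifest contents and the "clauude-code" name below are invented for illustration): fail the build if package.json declares anything a human hasn't approved.

```python
import json

def undeclared_installs(package_json: str, allowlist: set) -> list:
    """Return declared dependencies that are not in the reviewed allowlist.

    A CI step can fail the build when this list is non-empty, so an
    AI-suggested package never installs without a human having vetted it.
    """
    manifest = json.loads(package_json)
    deps = {**manifest.get("dependencies", {}),
            **manifest.get("devDependencies", {})}
    return sorted(name for name in deps if name not in allowlist)

# Hypothetical manifest where a hallucinated suggestion slipped in:
manifest = '{"dependencies": {"express": "^4.18.0", "clauude-code": "^1.0.0"}}'
allowlist = {"express", "lodash"}
print(undeclared_installs(manifest, allowlist))  # ['clauude-code']
```

This is a coarse gate, not a cure: it shifts the failure mode from "silent install" to "a human looks at the name" — which is exactly the step slopsquatting counts on you skipping.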

The attackers figured out that AI changes how developers behave. Now it's our job to figure that out too.