The Gap Isn't Data. It's Delivery. Why We Put 1 Million IOCs in Your Editor, Terminal, and Browser.
- Patrick Duggan
- 4 hours ago
- 6 min read
The threat intelligence market is $14.6 billion and growing. CrowdStrike charges $25 per endpoint per month. Recorded Future starts at $100,000 per year. Mandiant's pricing page says "contact sales," which is the enterprise way of saying "more than you want to spend."
Ninety-five percent of organizations on earth cannot afford those prices. The small hospitals, the school districts, the municipalities, the startups, the managed service providers serving a hundred SMBs — they face the same nation-state threats as Fortune 500 companies and defend themselves with free Google searches and hope.
That's a real gap. But it's the obvious one.
The less obvious gap — the one that matters more — is what happens at the organizations that CAN afford enterprise threat intelligence. They buy the feed. They pipe it into a SIEM. The intelligence surfaces in a dashboard. Three people in the SOC check the dashboard twice a day, usually when an alert fires. The rest of the time, the intelligence sits in a system of record, waiting to be alt-tabbed into.
The developer writing a Terraform config with a hardcoded IP doesn't check the threat feed. The DevOps engineer approving a PR doesn't cross-reference the indicators. The analyst reading a CrowdStrike blog post about a new C2 server doesn't validate the IOCs against their own environment. The security architect reviewing firewall rules doesn't correlate the allow-listed IPs against fresh threat data.
They all have access to the intelligence. None of them consume it where they work.
The gap isn't data. The gap is delivery.
Where Decisions Actually Happen
Security decisions don't happen in the SIEM dashboard. They happen in five places:
The code editor. A developer saves a config file. If the IP on line 42 is a known Cobalt Strike C2, the time to learn that is now — not after the code ships to production and a SOC analyst notices the connection in the SIEM two weeks later.
The terminal. An engineer runs a deploy script. If the script pulls from a domain that appeared in a threat feed yesterday, the time to learn that is before the deploy, not after.
The code review. A PR adds a new integration endpoint. If the IP in the payload matches known adversary infrastructure, the reviewer should see a warning before clicking "Approve" — not days later in an incident postmortem.
The browser. An analyst reads a vendor report about a new campaign. The report lists IOCs. The analyst wants to know: are any of these indicators already in our environment? Right now, answering that question requires switching to the SIEM, pasting each indicator, and checking results. By the time they've done three, they've lost context on the report.
The team chat. Someone pastes a suspicious IP into a Slack channel. Today, checking it requires opening a new tab, navigating to a threat intel portal, entering the IP, and reporting back. The person who pasted it has moved on to the next conversation.
Every one of these decision points is a place where threat intelligence could prevent a mistake. Every one of them is currently served by alt-tabbing to a dashboard or Googling the indicator. The intelligence exists. The delivery doesn't.
What We Built This Week
We decided to close the delivery gap by putting the intelligence where the decisions happen. Not as a roadmap. As shipped products.
[VS Code Extension](https://marketplace.visualstudio.com/items?itemName=DugganUSALLC.dugganusa-threat-intel) — open or save a file, and every IP, domain, hash, and CVE is checked against 1.08 million indicators in real time. Known-bad indicators appear as inline warnings with enrichment. Right-click any text for an instant lookup. Run an AIPM audit of any domain's AI presence without leaving the editor. Live on the VS Code Marketplace today.
[CLI Tool](https://github.com/pduggusa/dugganusa-cli) — `npx dugganusa-lookup 185.39.19.176` from any terminal, any OS, no install required. Scan files, pipe stdin, batch process, output JSON for automation. Exit code 1 on a match makes it CI/CD native. Shipped today.
[GitHub Action](https://github.com/pduggusa/dugganusa-action) — add two lines to your workflow YAML and every PR gets scanned for threat indicators. Matches annotate the PR with warnings; optionally, they block the merge. A summary table appears in the Actions output. Shipped today.
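In practice, the "two lines" sit inside standard workflow scaffolding. A rough sketch, with the caveat that the job skeleton is generic GitHub Actions boilerplate and any action-specific inputs are omitted here — the action's README is the authoritative reference:

```yaml
# Illustrative workflow sketch — only the final `uses:` line is
# specific to the action; inputs (if any) are not shown.
name: threat-intel-scan
on: [pull_request]
jobs:
  scan:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: pduggusa/dugganusa-action@v1
```

Dropped into `.github/workflows/`, this runs on every pull request, which is what makes the "annotate or block the merge" behavior possible.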
[Scanner Core](https://github.com/pduggusa/dugganusa-scanner-core) — the shared engine underneath all of them. Extract IOCs from any text, correlate against the API, format results. One module, every integration. The stable interface that doesn't change when we add the next delivery surface.
We shipped all four in one afternoon. The technology is straightforward — regex extraction, HTTPS API calls, JSON formatting. The insight that took longer was realizing the bottleneck was never the data. It was where the data lived.
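The extract-and-correlate loop that paragraph describes can be sketched in a few lines. This is a minimal illustration of the pattern, not the scanner-core API — the regexes, the function name, and the shape of the result are all assumptions for the sake of the example:

```javascript
// Candidate patterns for each IOC type. Real-world extractors use
// stricter patterns; these are deliberately simple illustrations.
const PATTERNS = {
  ipv4: /\b(?:\d{1,3}\.){3}\d{1,3}\b/g,
  sha256: /\b[a-f0-9]{64}\b/gi,
  cve: /\bCVE-\d{4}-\d{4,}\b/gi,
  domain: /\b[a-z0-9][a-z0-9-]*(?:\.[a-z0-9][a-z0-9-]*)+\b/gi,
};

// Extract the unique candidate IOCs of each type from free text.
function extractIOCs(text) {
  const found = {};
  for (const [type, re] of Object.entries(PATTERNS)) {
    found[type] = [...new Set(text.match(re) || [])];
  }
  // The domain pattern also matches dotted IPs; keep those under ipv4 only.
  found.domain = found.domain.filter((d) => !/^[\d.]+$/.test(d));
  return found;
}
```

From here, each surface does the same two remaining steps: send the extracted indicators to the lookup API over HTTPS, then format any matches for wherever the user is — an inline editor warning, a PR annotation, or JSON on stdout.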
Why This Gap Exists
Enterprise threat intelligence vendors are structured around a specific delivery model: feed data into a SIEM, visualize it in a dashboard, and alert when thresholds fire. That model works for the three analysts in the SOC who live in the dashboard. It doesn't reach the fifty developers, twenty DevOps engineers, and ten security architects who make decisions in editors, terminals, and browsers.
The vendors don't close the delivery gap because their business model doesn't require it. They sell per-seat or per-endpoint licenses. Their revenue comes from the SIEM integration, not the VS Code extension. Building a free IDE plugin doesn't increase the license count. It's a cost center for a company selling $100K annual contracts.
For us, the economics are inverted. We run the entire platform on $75 per month. The marginal cost of one more API call is zero. A free VS Code extension isn't a cost center — it's a distribution point that drives API registrations at zero marginal cost. We can afford to give away the delivery because we don't need per-seat revenue to survive.
The gap exists because the vendors who have the data are structured around a delivery model that doesn't reach most decision-makers, and the economics of closing the gap don't work at enterprise cost structures. They work at ours.
The Distribution Flywheel
Each integration isn't just a product. It's a compounding distribution channel.
A developer installs the VS Code extension because a colleague mentioned it. The extension scans their code and finds nothing — clean. But every time they save a file, the status bar says "DugganUSA: clean." The brand registers subconsciously. When they encounter a suspicious IP in a log, they right-click and look it up. The result says "Not found in 1.08M+ IOC index. Clean." They trust the index because it's been running quietly in their editor for weeks.
Six months later, their company needs a STIX feed for the new SIEM deployment. The developer remembers the extension. They mention DugganUSA in the vendor evaluation. The CTO Googles it. The blog has 1,734 posts. The STIX feed has 275+ consumers. The MN Cup application is in the High Tech semifinalist round. The GitHub Action is already in two of their repos.
The VS Code extension didn't close the deal. The VS Code extension started the relationship. The deal closed because the relationship had been running in the background for six months, building trust one "clean" scan at a time.
That's the flywheel. Every integration is a touchpoint. Every touchpoint is a brand impression. Every brand impression compounds. And every one of them is free because the marginal cost is zero.
What's Next
The VS Code extension, CLI tool, and GitHub Action are the first three delivery surfaces. The same core engine ports to Chrome (every webpage becomes an IOC scanner), Slack (paste an IP, get enrichment), Splunk (Splunkbase listing), Sentinel (Microsoft Content Hub), JetBrains (IntelliJ plugin), Obsidian (OSINT researcher community), Raycast (macOS power users), and more.
Each integration is a thin wrapper around the same API. We ship one per day if we want to. The stable interface doesn't change. The delivery surfaces multiply.
The threat intelligence market is $14.6 billion. The gap isn't data — every major vendor has the data. The gap is delivering that data to the fifty people in the organization who make security-relevant decisions and currently don't check the dashboard.
We're closing the gap, one integration at a time, at a price point the 95% can afford, delivered to the surfaces where the decisions actually happen.
The fridge is in the room now. Cold. Free. No cap to unscrew.
— Patrick
Install today:
- CLI Tool — `npx dugganusa-lookup`
- GitHub Action — `pduggusa/dugganusa-action@v1`
- STIX Feed — 275+ consumers, 46 countries
- AIPM Audit — how do AI models see your brand?
AIPM has audited 250+ domains. 15 seconds. Free while still in beta.