CI for your documentation. Automatically tests code examples, detects API drift, and posts PR comments with AI fixes.
- Hacker News: Show HN thread — discuss the sandbox architecture, drift detection, and WASM runtimes
- Product Hunt: upvote DocsCI to help other developers find it
- Indie Hackers: building in public — progress updates, metrics, and learnings
- BetaList: early-access signup for startup hunters
Show HN: DocsCI – CI pipeline for your documentation (snippetci.com)
Hey HN,
I built DocsCI to solve a problem I kept hitting at API companies: broken code examples.
Every API-first company has seen it: a developer copies a curl example from the docs, gets a 422, and opens a support ticket. The fix takes 4 minutes; the ticket costs 45 minutes of support time. By then the developer has already left.
DocsCI runs a docs-specific CI pipeline on every PR:
- Executes code examples (Python, JS/TS, curl, Go, Ruby) in hermetic WASM/V8 sandboxes
- Diffs documented endpoints against your OpenAPI spec to detect drift
- Scans for secrets before execution (40+ patterns)
- Posts inline PR comments with AI-generated fixes on exact failing lines
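As a rough illustration of the drift check in the second bullet (everything here, from the file names to the endpoint regex, is a stand-in for the sketch, not DocsCI internals):

```shell
# Toy drift check: endpoints mentioned in docs vs. paths in an OpenAPI spec.
cat > openapi.json <<'EOF'
{"paths": {"/v1/users": {}, "/v1/orders": {}}}
EOF
cat > quickstart.md <<'EOF'
Call `POST /v1/users` to create a user, then `GET /v1/invoices` to list them.
EOF
jq -r '.paths | keys[]' openapi.json | sort > spec_paths.txt
grep -ohE '/v1/[a-z]+' quickstart.md | sort -u > doc_paths.txt
# Lines unique to doc_paths.txt are documented endpoints missing from the spec
comm -13 spec_paths.txt doc_paths.txt   # -> /v1/invoices
```

The real check also compares methods and parameters; this toy version only diffs paths.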
It integrates with GitHub Actions in about 5 minutes:
```yaml
- name: Run DocsCI
  run: |
    tar czf docs.tar.gz docs/ *.md
    curl -sf -X POST https://snippetci.com/api/runs/queue \
      -H "Authorization: Bearer ${{ secrets.DOCSCI_TOKEN }}" \
      -F "docs_archive=@docs.tar.gz" | jq -e '.status == "passed"'
```
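For a feel of what "executes code examples" means, here is a toy extract-and-run loop; the tilde fences, file names, and awk one-liner are illustrative, and the real service runs snippets inside WASM/V8 sandboxes rather than on the host:

```shell
# Toy sketch of the "execute the examples" step, not DocsCI's sandbox.
cat > example.md <<'EOF'
Add two numbers:

~~~python
print(1 + 1)
~~~
EOF
# Pull the fenced snippet out of the markdown and run it
awk '/^~~~python$/{f=1;next} /^~~~$/{f=0} f' example.md > snippet.py
python3 snippet.py   # -> 2
```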
We analyzed support ticket data from 12 API-first companies. Median cost of a single broken example: $47K/quarter in developer time, support overhead, and churn.
Happy to answer questions about the sandbox architecture, drift detection, or the decision to use WASM for language runtimes.
Tech stack: Next.js, Supabase, Vercel, V8 isolates + Pyodide for Python.
Live: https://snippetci.com
Docs: https://snippetci.com/docs
Comparison pages: https://snippetci.com/vs

Tagline
CI pipeline for your API documentation — tests code examples, detects drift, fixes PRs
Description
DocsCI is a GitHub/GitLab-integrated CI pipeline that automatically tests your documentation code examples in sandboxes, detects API drift against your OpenAPI spec, and posts inline PR comments with AI-generated fixes.

**Problems it solves:**
• Broken code examples that cost ~$47K/quarter in support tickets
• API drift (docs diverge from the live API after releases)
• No automated gate on documentation quality

**How it works:**
1. Connect your docs repo (GitHub Action, 5-minute setup)
2. DocsCI runs on every PR — executes Python, JS/TS, curl, and Go examples
3. Posts precise inline comments with errors + an AI patch on failing lines
4. Detects when documented API params drift from your OpenAPI spec

**Tech:** Next.js, Supabase, V8 isolates + Pyodide for Python, hermetic sandboxes with network allowlists.

**Free tier:** unlimited public repos, 100 runs/month, full drift detection.
Launching DocsCI: automated CI for your API documentation

After watching support tickets pile up at every API company I worked with, I built DocsCI — a GitHub/GitLab-integrated CI pipeline specifically for documentation.

**The problem:**
- Developers copy broken code examples from docs → support tickets
- SDK releases happen without updating docs → API drift
- Nobody runs the examples during PR review → silent failures ship

**What I built:**
DocsCI runs on every PR that touches docs. It executes code snippets in sandboxes, diffs docs against your OpenAPI spec, and posts inline PR comments with AI fixes.

**Early results from beta users:**
- Avg 23 broken examples found on first scan
- ~$47K/quarter in estimated developer time saved per broken example fixed
- 5-minute GitHub Actions setup

**Free tier:** unlimited public repos, 100 runs/month.

Live: https://snippetci.com

Would love feedback from developers who've dealt with broken documentation.
Copy-paste-ready templates for GitHub Actions and DocsCI configuration.
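A starting-point workflow built around the queue call from the Show HN post; the trigger paths, checkout step, and secret name are assumptions for this sketch, not official DocsCI configuration:

```yaml
name: DocsCI
on:
  pull_request:
    paths: ["docs/**", "**/*.md"]   # only run when docs change (assumption)
jobs:
  docsci:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Run DocsCI
        run: |
          tar czf docs.tar.gz docs/ *.md
          curl -sf -X POST https://snippetci.com/api/runs/queue \
            -H "Authorization: Bearer ${{ secrets.DOCSCI_TOKEN }}" \
            -F "docs_archive=@docs.tar.gz" | jq -e '.status == "passed"'
```

The `jq -e` check makes the step (and the job) fail whenever the run status is anything other than `"passed"`.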