Codebase to LLM context — tools.voiddo/ctxstuff vs repomix

Both pack a codebase into a single file for LLM consumption. repomix is more popular and battle-tested. ctxstuff adds multi-model cost estimation, context window splitting, and a watch mode for iterative development.

tools.voiddo/ctxstuff

  • Pack repo → token count → cost estimate across 60+ models
  • Context splitting: auto-chunk for any model's context window
  • Watch mode: rebuild on file change
  • 2026 pricing snapshot: GPT-5, Claude Opus 4.7, Gemini 2.5, Grok 4, Llama 4, DeepSeek…
  • Optimize output: remove comments, minify, strip whitespace
  • CLI-first, pipe-friendly, zero deps
  • MIT licensed, Node 14+

repomix

  • 50,000+ GitHub stars — the most popular codebase packer
  • XML-tagged output with file boundaries and tree structure
  • Security check: flags secrets and sensitive content before packing
  • Remote GitHub URL support — pack any public repo
  • VS Code extension available
  • No cost estimation; no context splitting; no watch mode
  • More output format options (XML, Markdown, plain)

Feature comparison

Feature                                | tools.voiddo/ctxstuff | repomix
Pack directory to single context file  | ✓                     | ✓
Token count                            | ✓                     | ✓
Cost estimation (60+ models)           | ✓ with 2026 pricing   | ✗
Context window splitting               | ✓ --split flag        | ✗
Watch mode (rebuild on change)         | ✓                     | ✗
Respect .gitignore                     | ✓                     | ✓
Custom ignore patterns                 | ✓                     | ✓
Markdown output                        | ✓ (default)           | ✓
XML-tagged output                      | ✗                     | ✓ (default)
Plain text output                      | ✓                     | ✓
Remove comments / minify               | ✓ --optimize          | partial
Security / secret scan before pack     | ✗                     | ✓
Pack from GitHub URL (remote)          | ✗                     | ✓
VS Code extension                      | ✗                     | ✓
Multi-model cost comparison            | ✓                     | ✗
No API key required                    | ✓                     | ✓
GitHub community / stars               | smaller community     | 50,000+ stars
Open source                            | ✓ MIT                 | ✓ MIT

Comparison based on publicly observable behavior as of 2026-05. For remote GitHub packing, secret scanning, and VS Code integration, repomix is the better choice. For cost estimation and iterative development with watch mode, ctxstuff is the stronger fit.

FAQ

What does ctxstuff add over repomix?
ctxstuff adds three features repomix lacks: (1) cost estimation — after packing, it shows how much the context would cost to send to GPT-5, Claude Opus 4.7, Gemini 2.5 Pro, Grok 4, or 55+ other models at 2026 pricing; (2) context splitting — if your packed context exceeds a model's window, --split automatically divides it into properly-sized chunks; (3) watch mode — rebuilds the packed context whenever source files change, useful during active development.
Is repomix more trusted than ctxstuff?
Yes, by community size. repomix reached 50,000+ GitHub stars, has a VS Code extension, and is widely referenced in AI developer communities. If you want the most-used tool with the most community support, repomix is the safer choice. ctxstuff is a newer, smaller tool focused on cost visibility and developer-loop features.
How do I estimate the cost of sending my codebase to Claude?
Run ctxstuff . --model claude-opus-4-7. ctxstuff packs your directory, counts tokens, and outputs the estimated input cost at 2026 pricing. You can compare multiple models at once: ctxstuff . --compare outputs a table of token counts and costs for all 60+ supported models.
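The arithmetic behind an estimate like this is straightforward: token count times the model's per-million-token input price. A minimal Python sketch of what a --compare table computes, using hypothetical prices (not the tool's actual 2026 pricing data):

```python
# Hypothetical per-million-token input prices, for illustration only.
PRICES_PER_MTOK = {
    "claude-opus-4-7": 15.00,
    "gpt-5": 10.00,
    "gemini-2-5-pro": 3.50,
}

def estimate_cost(token_count: int, model: str) -> float:
    """Estimated input cost in USD for sending token_count tokens to model."""
    return token_count / 1_000_000 * PRICES_PER_MTOK[model]

def compare(token_count: int) -> dict[str, float]:
    """One estimate per model -- conceptually what a --compare table shows."""
    return {m: estimate_cost(token_count, m) for m in PRICES_PER_MTOK}

print(compare(250_000))
# → {'claude-opus-4-7': 3.75, 'gpt-5': 2.5, 'gemini-2-5-pro': 0.875}
```

Output token pricing is usually higher than input pricing, so treat any such figure as the cost of sending the context, not of the full round trip.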
My repo is too big for one context window. Can ctxstuff handle that?
Yes. Use ctxstuff . --split --model gpt-4o. ctxstuff measures the total token count, divides the files into groups that each fit within gpt-4o's 128K window, and writes one output file per chunk. repomix does not split; you would need to manually select directories.
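The grouping step can be pictured as a greedy first-fit pass over per-file token counts. This is an illustrative sketch of one plausible strategy, not ctxstuff's actual algorithm:

```python
def split_into_chunks(file_tokens: dict[str, int], window: int) -> list[list[str]]:
    """Greedy first-fit split: group files so each chunk stays within
    `window` tokens. A single file larger than the window still gets
    a chunk of its own (a real tool would warn or split the file)."""
    chunks: list[list[str]] = []
    current: list[str] = []
    used = 0
    for path, tokens in file_tokens.items():
        if current and used + tokens > window:
            chunks.append(current)
            current, used = [], 0
        current.append(path)
        used += tokens
    if current:
        chunks.append(current)
    return chunks

# Three files against a 100K-token budget: the second file would
# overflow the first chunk, so it starts a new one.
files = {"a.py": 60_000, "b.py": 50_000, "c.py": 40_000}
print(split_into_chunks(files, 100_000))
# → [['a.py'], ['b.py', 'c.py']]
```

In practice you would also want each chunk to repeat the directory tree and any shared preamble, so every chunk is self-describing when sent to the model on its own.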
Can ctxstuff pack a GitHub repo without cloning it?
No — ctxstuff requires local files. repomix supports remote GitHub URLs directly (repomix --remote https://github.com/owner/repo), which is useful for analyzing public repos without cloning. Clone the repo first if you want to use ctxstuff on a remote codebase.
Does ctxstuff scan for secrets before packing?
ctxstuff respects .gitignore and custom ignore patterns, which typically excludes .env files and secret files. However, it does not actively scan for secrets the way repomix does. If you are working with sensitive codebases, repomix's built-in secret detection is an advantage.
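If you want a rough pre-pack check of your own before handing a directory to any packer, a few regexes catch the most common leaks. This is a homegrown sketch, not repomix's actual detector, and real scanners ship far larger rulesets:

```python
import re

# Illustrative patterns only; real secret scanners use hundreds of rules.
SECRET_PATTERNS = {
    "AWS access key": re.compile(r"AKIA[0-9A-Z]{16}"),
    "private key header": re.compile(r"-----BEGIN (?:RSA |EC )?PRIVATE KEY-----"),
    "generic api_key assignment": re.compile(
        r"(?i)api[_-]?key\s*[=:]\s*['\"][^'\"]{16,}['\"]"
    ),
}

def scan_text(text: str) -> list[str]:
    """Return the names of secret patterns that match anywhere in text."""
    return [name for name, pat in SECRET_PATTERNS.items() if pat.search(text)]
```

Running a check like this over the packed output file (rather than the source tree) is the simplest safety net, since it sees exactly what would be sent to the model.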

Try tools.voiddo/ctxstuff

Pack your codebase into LLM-ready context. See token counts and cost estimates for 60+ models. Watch mode, context splitting, and optimization included. Zero deps.

Install: npm install @v0idd0/ctxstuff

Competitor names and trademarks belong to their respective owners. This comparison reflects publicly observable tool behavior.