Overview
Features
File collection and context assembly
Collects files from specified absolute paths (with wildcards in filenames) and assembles them into a single context for the model.
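A minimal sketch of that collection step, assuming standard glob expansion; the function name and the file-separator format are illustrative, not the tool's actual implementation:

```python
import glob
import os

def assemble_context(patterns):
    """Collect files from absolute paths (wildcards allowed in
    filenames) and concatenate them into one context string."""
    parts = []
    for pattern in patterns:
        for path in sorted(glob.glob(pattern)):
            if not os.path.isfile(path):
                continue
            with open(path, encoding="utf-8", errors="replace") as f:
                # Label each file so the model can tell them apart.
                parts.append(f"--- {path} ---\n{f.read()}")
    return "\n\n".join(parts)
```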
OpenRouter-backed model support
Supports 500+ OpenRouter models with dynamic context window sizing to fit the assembled context.
Performance modes
Offers fast, mid, and think modes to control reasoning depth and speed.
Output to file
Optional output_file parameter saves results to disk and returns a concise notification.
Dynamic size limits
File size limits scale dynamically with each model's context window; for example, Gemini 3 Pro supports up to 1M tokens.
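One way to derive such a limit can be sketched as follows; the context-window figures, the reserve amount, and the chars-per-token heuristic are assumptions for illustration, not the tool's actual values:

```python
# Illustrative per-model context windows in tokens.
CONTEXT_WINDOWS = {
    "google/gemini-pro": 1_000_000,
    "openai/gpt-4o": 128_000,
}

CHARS_PER_TOKEN = 4  # rough heuristic for English text

def max_context_chars(model, reserve_tokens=8_000):
    """Derive a character budget for collected files from the
    model's context window, reserving room for prompt and reply."""
    window = CONTEXT_WINDOWS.get(model, 128_000)
    return (window - reserve_tokens) * CHARS_PER_TOKEN
```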
CLI and API integration
Provides CLI usage and a Python API for programmatic access, with JSON-based parameters.
Ignore rules
Automatically excludes common files/folders such as __pycache__, .env, secrets.py, .DS_Store, .git, node_modules.
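A filter implementing the exclusions above can be sketched like this, assuming the rules match individual path components (the function name is hypothetical):

```python
import fnmatch

# Default ignore patterns, mirroring the list above.
IGNORE_PATTERNS = [
    "__pycache__", ".env", "secrets.py", ".DS_Store",
    ".git", "node_modules",
]

def is_ignored(path):
    """Return True if any component of the path matches one of
    the default ignore patterns."""
    return any(
        fnmatch.fnmatch(part, pattern)
        for part in path.replace("\\", "/").split("/")
        for pattern in IGNORE_PATTERNS
    )
```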
Mnemonics for model/mode combos
Includes mnemonics like gemt, gptt, grot, gemf, ULTRA to reference model + mode configurations.
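Each mnemonic can be thought of as expanding to a (model, mode) pair. The table below is a sketch of that idea; the model slugs and mode assignments are assumptions, not the tool's actual mappings:

```python
# Hypothetical expansion table: each mnemonic bundles a model
# slug with a performance mode.
MNEMONICS = {
    "gemt": ("google/gemini-pro", "think"),
    "gemf": ("google/gemini-pro", "fast"),
    "gptt": ("openai/gpt-4o", "think"),
    "grot": ("x-ai/grok-2", "think"),
}

def resolve(mnemonic):
    """Expand a mnemonic into its (model, mode) configuration."""
    try:
        return MNEMONICS[mnemonic]
    except KeyError:
        raise ValueError(f"unknown mnemonic: {mnemonic}")
```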
Who Is This For?
- AI agents: Offload large-context file analysis to OpenRouter-backed models.
- Developers: Analyze large codebases and document repositories beyond agent context limits.
- Research teams: Summarize and analyze large document collections with high-context models.
