Human-Readable First
Always exports a markdown handoff you can skim in 2 minutes.
Use it when switching tools, agents, or worktrees.
Export the current state for a deeper reasoning pass with another model.
Keep sources, decisions, and constraints bundled for seamless context transfer.
Assign tasks across models and track who is doing what.
Use Ollama or LM Studio when you need privacy or offline access.
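As a rough sketch of the local-model path: Ollama serves pulled models over a local HTTP API (port 11434 by default), and the CLI could be pointed at it instead of a hosted provider. The executable name `handoff` and the `--provider`/`--model` flags below are placeholders, since the source does not name the actual commands.

```bash
# Pull a model locally with Ollama (real Ollama command; the local
# API is served at http://localhost:11434 by default).
ollama pull llama3.2

# Hypothetical: tell the CLI to use the local model instead of a
# hosted provider. The executable name and flags are illustrative only.
handoff export --provider ollama --model llama3.2
```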
Simple three-step process: import, extract, export (sketched in the example below).
Bring in a conversation from any AI tool or IDE.
Pull out decisions, code blocks, and next steps automatically.
Generate a clean markdown handoff for the next tool or model.
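A minimal sketch of what that flow could look like from the shell. The executable name `handoff`, the subcommands, and the file names are assumptions for illustration; the source describes the steps but not the exact commands.

```bash
# 1. Import a conversation exported from another AI tool or IDE
#    (command and file names are hypothetical).
handoff import ./cursor-session.json

# 2. Extract decisions, code blocks, and next steps from it.
handoff extract

# 3. Export a clean markdown handoff for the next tool or model.
handoff export --out HANDOFF.md

# The exported file bundles sources, decisions, constraints, and next
# steps so the next model starts with full context.
cat HANDOFF.md
```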
Install via pip or pipx and start orchestrating.
Install globally with pip for system-wide access.
Recommended: isolated environment with pipx.
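For example (the package name `handoff-cli` is a placeholder; substitute the real distribution name):

```bash
# Global install with pip (system-wide access).
pip install handoff-cli

# Recommended: isolated environment with pipx.
pipx install handoff-cli
```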
Initialize a new project with default templates.
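Once installed, initializing a project might look like the following; the `init` subcommand and the project name are assumptions, not documented commands.

```bash
# Hypothetical: scaffold a new project using the default templates.
handoff init my-project
```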
No SaaS lock-in. No mandatory cloud. Just a CLI that helps you move faster.
CLI-first design with no mandatory cloud dependency.
MIT licensed. Fork, customize, and contribute.
Works with any LLM provider or local model.