AI/ML

    Crush CLI - Your new coding bestie, now available in your favourite terminal.


    Introduction

    Developers don’t need another bloated editor. They need acceleration. They need clarity. They need control inside the terminal they already love.

    Crush isn’t a cute add‑on; it’s a glamorously capable, agentic AI that plugs into real workflows, speaks to real tools and respects real projects right in the shell.

    It keeps sessions per project, expands context with LSPs and connects to the LLMs preferred by professional teams, from OpenAI and Anthropic to Gemini, Groq, OpenRouter, Bedrock and even local engines via OpenAI‑compatible APIs. It’s multi‑model, model‑switching and terminal‑native, so builders ship faster and debug with drive. From messy manual hunts to AI-powered flow, Crush turns terminal time into momentum.

    Traditional Terminal AI vs Crush CLI

    Model choice

    • Traditional terminal AI helpers: Single provider, rigid
    • Crush CLI: Multi-model: OpenAI, Anthropic, Gemini, Groq, OpenRouter, Bedrock; add custom OpenAI/Anthropic-compatible; local (Ollama/LM Studio) via OpenAI-compatible APIs

    Context

    • Traditional terminal AI helpers: Stateless chat windows
    • Crush CLI: Session-based, per-project memory and context

    Code awareness

    • Traditional terminal AI helpers: Minimal
    • Crush CLI: LSP-enhanced semantics across languages for richer assistance

    Extensibility

    • Traditional terminal AI helpers: Closed or ad-hoc
    • Crush CLI: First-class MCP (http/stdio/sse) for capabilities and tools

    Platform coverage

    • Traditional terminal AI helpers: macOS/Linux only
    • Crush CLI: macOS, Linux, Windows (PowerShell/WSL)

    Safety & control

    • Traditional terminal AI helpers: Opaque tool use
    • Crush CLI: Permissioned tool calls, logs and explicit "yolo" escape hatch

    Configuration

    • Traditional terminal AI helpers: Proprietary formats
    • Crush CLI: Human-readable JSON, project and global precedence

    Observability

    • Traditional terminal AI helpers: Sparse
    • Crush CLI: Built-in logs and debug flags; tail and follow inside the CLI

    Performance and productivity wins

    • Switch models mid‑session without losing context: cut context resets and churn.
    • Enrich reasoning with LSP context: reduce back‑and‑forth and fix cycles.
    • Wire in tools via MCP: delegate structured tasks safely and visibly.

    Metrics & Social Proof

    • Community momentum: 8,900+ GitHub stars, 400+ forks and 26 contributors, per repository stats at the time of writing.
    • Licensing: FSL‑1.1‑MIT (the Functional Source License, under which each release converts to MIT after two years), optimised for sustainable, trusted integrations.
    • Public releases: Active cadence with v0.4.0 latest as of Aug 8, 2025, plus 11 prior releases.
    • Real‑world chatter: Developers compare Crush favourably against Gemini CLI for terminal coding flows and model flexibility.
    • Team transparency: Building in public across GitHub, issues and social; the project encourages community contributions via the model index (Catwalk) and provider updates.

    Selected praise and signals

    “High-performance, agentic coding tool built with Charm libraries” - public positioning from the maintainers.

    Technical Deep Dive: Spec table

    Languages

    • Go (CLI implementation)

    Platforms

    • macOS, Linux, Windows (PowerShell, WSL)

    Model providers

    • OpenAI, Anthropic, Google Gemini, Groq, OpenRouter, AWS Bedrock (Claude), Azure OpenAI, Vertex AI; custom OpenAI/Anthropic‑compatible

    Local models

    • Via OpenAI‑compatible endpoints (e.g., Ollama: localhost:11434/v1; LM Studio: localhost:1234/v1)
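As a sketch of wiring up a local engine, the fragment below writes a minimal crush.json provider entry pointing at Ollama's OpenAI‑compatible endpoint from the row above. The exact key names, the "openai" provider type and the model id are assumptions for illustration; verify against the Crush README before use.

```shell
# Illustrative crush.json fragment for a local Ollama endpoint.
# Field names and the "qwen3" model id are assumptions, not a verified schema.
cat > crush.json <<'EOF'
{
  "providers": {
    "ollama": {
      "type": "openai",
      "base_url": "http://localhost:11434/v1",
      "api_key": "ollama",
      "models": [
        { "id": "qwen3", "name": "Qwen 3 (local)" }
      ]
    }
  }
}
EOF
```

LM Studio would follow the same shape with base_url http://localhost:1234/v1.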

    Context & code intelligence

    • LSP integration; user‑configurable per language
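A hedged sketch of per‑language LSP configuration: the fragment below registers gopls and typescript-language-server (both must already be on PATH). The "lsp" key layout is an assumption based on the project docs; check the README for the authoritative schema.

```shell
# Illustrative crush.json fragment: one LSP entry per language.
# Key names are assumptions; servers must be installed separately.
cat > crush.json <<'EOF'
{
  "lsp": {
    "go": { "command": "gopls" },
    "typescript": { "command": "typescript-language-server", "args": ["--stdio"] }
  }
}
EOF
```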

    Extensibility

    • MCP transports: stdio, http, sse; environment expansion supported
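The transports above can be sketched in configuration as follows. The "mcp" key layout, the "$(echo …)" environment‑expansion syntax and the example server package are assumptions for illustration; confirm the exact format against the Crush README.

```shell
# Illustrative crush.json fragment: one stdio and one http MCP server.
# The https://example.com URL and EXAMPLE_MCP_TOKEN variable are placeholders.
cat > crush.json <<'EOF'
{
  "mcp": {
    "filesystem": {
      "type": "stdio",
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "."]
    },
    "issues": {
      "type": "http",
      "url": "https://example.com/mcp/",
      "headers": { "Authorization": "$(echo Bearer $EXAMPLE_MCP_TOKEN)" }
    }
  }
}
EOF
```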

    Permissions

    • Allowed‑tools list; a global --yolo flag suppresses permission prompts (use carefully)
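As a sketch, an allowlist keeps routine read‑only tools prompt‑free while still gating edits. The "permissions"/"allowed_tools" key names and the tool names below are assumptions for illustration; --yolo (from the row above) bypasses prompts entirely, so an explicit allowlist like this is usually the safer option.

```shell
# Illustrative crush.json fragment: pre-approve read-only tools only.
# Key and tool names are assumptions, not a verified schema.
cat > crush.json <<'EOF'
{
  "permissions": {
    "allowed_tools": ["view", "ls", "grep"]
  }
}
EOF
```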

    Logging

    • Log file at ./.crush/logs/crush.log; logs, tail and follow commands; --debug and debug_lsp options

    License

    • FSL‑1.1‑MIT

    Sample I/O table

    Prompt (in Crush): Scan this Go project, identify flaky tests and propose a parallelisation strategy. Use LSP context.

    • AI Response: Enumerates test files, highlights data races and shared-state issues, proposes subtest structure, table‑driven patterns and run filters; suggests go test -race and per‑package concurrency with reasoning grounded in LSP symbols.

    Prompt (in Crush): Switch to Anthropic for reasoning; summarise diffs across src/* and propose a refactor plan.

    • AI Response: Switches provider mid‑session without losing context; outputs component‑grouped refactor plan, risk list and stepwise commits, referencing file symbols gleaned from LSP.

    Prompt (in Crush): Add an MCP filesystem server and generate a migration for the models package; ask permission before edits.

    • AI Response: Registers MCP server (stdio/http/sse), requests edit permissions, writes migration file and logs actions for review; schema adheres to crush.json MCP format.

    Community & Adoption

    Ecosystem and contributions

    • Model listing via the Catwalk community‑supported repository for Crush‑compatible models.
    • Active issues shaping roadmap, including progress visibility and Windows ergonomics.
    • Tutorials and third‑party content amplifying adoption and comparisons (https://www.youtube.com/watch?v=IdNr-PngQCI).

    Installation & Getting Started

    System requirements

    • A supported OS: macOS, Linux, Windows (PowerShell/WSL)
    • An API key for at least one provider (OpenAI, Anthropic, Gemini, Groq, OpenRouter, Bedrock, Azure OpenAI, Vertex AI) or a local OpenAI‑compatible endpoint.

    Install options

    • Homebrew (macOS/Linux): brew install charmbracelet/tap/crush
    • npm (Node.js): npm install -g @charmland/crush
    • Arch: yay -S crush-bin
    • Nix: nix run github:numtide/nix-ai-tools#crush
    • Windows: winget install charmbracelet.crush or Scoop (add bucket then scoop install crush)

    First run

    • Launch crush; enter provider API key when prompted.
    • Optional: configure LSPs in crush.json for Go/TS/Nix or others.

    Roadmap & Version History

    Version: v0.4.0

    • Date: Aug 8, 2025
    • Notable items: Latest release; continued improvements across agent flow and UX.

    Version: Prior 11 releases

    • Date: Various
    • Notable items: Incremental features across providers, logging, permissions and platform polish, visible in GitHub releases list.

    Upcoming capabilities and focus areas

    • Better progress status and background activity clarity based on community feedback.
    • Windows terminal UX refinements (selection/copy behaviours surfaced by users).
    • Ongoing provider/model updates surfaced via Catwalk and configuration improvements for local engines.

    FAQ

    Q1. Is Crush open source?

    It is licensed under FSL‑1.1‑MIT, the Functional Source License: source‑available on release, with each release converting to the MIT license after two years.

    Q2. Which LLMs does Crush support?

    OpenAI, Anthropic, Google Gemini, Groq, OpenRouter, AWS Bedrock (Claude), Azure OpenAI, Vertex AI and any OpenAI/Anthropic‑compatible provider; local engines via OpenAI‑compatible APIs (e.g., Ollama, LM Studio).

    Q3. Does it run on Windows?

    Yes, Windows PowerShell and WSL are first‑class, alongside macOS, Linux and multiple BSDs.

    Q4. How does Crush understand my codebase?

    It integrates Language Server Protocols (LSPs) for semantic awareness and richer context.

    Q5. Can I extend Crush with tools?

    Yes, via MCP servers over stdio, http and sse; configure in crush.json with environment expansion.

    Call to Action

    Try Crush CLI today. Install in minutes. Ship in hours. Level up from junior to genius, inside your favourite terminal.

    Join the community on Discord.

    Need help with AI transformation? Partner with OneClick to unlock your AI potential. Get in touch today!
