
llm

Agent-ready · Local AI

Prompting, local plugins, and structured outputs from the terminal.

Built by Simon Willison. Supports structured output, making it a good fit for scripts and agents.

Task fit

Prompting, local plugins, and structured outputs from the terminal.

Lane

Set up coding agents, local models, and AI-first terminal workflows.

Operator brief

Use llm for prompting, local plugins, and structured outputs from the terminal.

Run `llm 'Explain this command: rg TODO src'` and see what comes back.
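That one-shot prompt can be wrapped for scripting. A minimal sketch, assuming the llm CLI is installed and an API key is already configured; the helper names are ours, not part of llm:

```python
import shutil
import subprocess

def llm_command(prompt: str) -> list[str]:
    # `llm 'prompt text'` runs a one-shot prompt by default
    return ["llm", prompt]

def ask_llm(prompt: str) -> str:
    """Run one prompt through the llm CLI and return its reply."""
    result = subprocess.run(
        llm_command(prompt), capture_output=True, text=True, check=True
    )
    return result.stdout.strip()

# Guard so the sketch degrades gracefully when llm is not on PATH.
if shutil.which("llm"):
    print(ask_llm("Explain this command: rg TODO src"))
```

The `check=True` matters for agents: a failed model call raises instead of silently returning an empty string.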

Repository family

Simon Willison

First trust check

The llm CLI responds and is ready for prompts.

Safe first loop

Install, verify, then run one real command.

Agent capability loop

Install command

$ pipx install llm

Operator pack

Copy or export the working notes for this CLI before handing it to an agent.

Verify

$ llm --help

The llm CLI responds and is ready for prompts.

First real command

$ llm 'Explain this command: rg TODO src'
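Because llm supports structured output, the first real command extends naturally to JSON you can parse. A sketch, assuming a recent llm release with the `--schema` flag and a configured key; the schema string and helpers here are illustrative:

```python
import json
import shutil
import subprocess

def schema_command(prompt: str, schema: str) -> list[str]:
    # llm's concise schema syntax, e.g. "name, age int", shapes the JSON reply
    return ["llm", "--schema", schema, prompt]

def structured_llm(prompt: str, schema: str) -> dict:
    """Ask llm for JSON matching a concise schema and parse it."""
    out = subprocess.run(
        schema_command(prompt, schema), capture_output=True, text=True, check=True
    ).stdout
    return json.loads(out)

if shutil.which("llm"):
    print(structured_llm("Invent a command-line tool", "name, description"))
```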

First steps

  1. Install llm.
  2. Run `llm --help` first.
  3. Start with `llm 'Explain this command: rg TODO src'`.
  4. Authenticate llm before asking the agent to do real work.
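The steps above can be sketched as one script. Authentication is ordered ahead of the first prompt here, since most remote models need a key before they answer; `openai` is one common key name, not the only option:

```python
import shlex
import shutil
import subprocess

# First-steps plan, with authentication moved ahead of the first prompt.
STEPS = [
    ["llm", "--help"],                             # verify the CLI responds
    ["llm", "keys", "set", "openai"],              # authenticate (prompts for a key)
    ["llm", "Explain this command: rg TODO src"],  # first real prompt
]

def run_steps(execute: bool = False) -> None:
    """Print the plan, or execute it when llm is available and execute=True."""
    for cmd in STEPS:
        print("$ " + shlex.join(cmd))
        if execute and shutil.which("llm"):
            subprocess.run(cmd, check=True)

run_steps()  # dry run; pass execute=True once llm is installed
```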

When to use / hold off when

Best for

Prompting, local plugins, and structured outputs from the terminal.

Use this when

You want AI models and inference you can script with structured output.

Hold off when

You need something that works offline or without an account.

Trust and constraints

Automation-ready: 100/100
Install ready: Yes
JSON output: Yes
Non-interactive: Yes
CI-friendly: Yes
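Those properties (non-interactive, CI-friendly) are what make piping text into llm from a script workable. A sketch, assuming llm is installed with a key; `-s` sets the system prompt and stdin carries the text to work on:

```python
import shutil
import subprocess

SUMMARIZE_CMD = ["llm", "-s", "Summarize this in one sentence."]

def summarize(text: str) -> str:
    """Pipe text to llm on stdin, CI-style: non-interactive, exit code checked."""
    result = subprocess.run(
        SUMMARIZE_CMD, input=text, capture_output=True, text=True, check=True
    )
    return result.stdout.strip()

if shutil.which("llm"):
    print(summarize("rg recursively searches directories for a regex pattern."))
```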

Why operators pick it

  • llm fits local AI work well, especially prompting, local plugins, and structured outputs from the terminal.
  • 1,418 Homebrew installs in the last 30 days.
  • Good for scripts and agents.

Constraints

  • Sign in before real work.
  • Needs network access.

Facts and links

Install with: pipx
Homebrew installs (30d): 1.4K
GitHub stars: 11.4K
License: Apache-2.0
Updated: Mar 17, 2026