hey is a terminal-native AI assistant that translates plain English into executable shell commands using a locally-hosted LLM via Ollama. Your data never leaves your machine.
- 100% Local & Private — All reasoning happens on your machine via Ollama. No API keys, no cloud, no telemetry.
- Cross-Platform Intelligence — Detects your OS (macOS/Linux/Windows) and generates the correct flags. Won't suggest `xargs -d` on BSD or `apt` on Arch.
- Agentic Context Gathering — Ask "is Docker running?" and `hey` silently runs diagnostics, reads the output, and answers in plain English.
- Security Governance — Dangerous commands (`rm -rf`, `mkfs`, `DROP TABLE`) are intercepted and require explicit confirmation before execution.
- Pipe-Friendly — Pipe error logs directly: `npm run build 2>&1 | hey what is causing this error?`
- Conversational Memory — Remembers your recent interactions for follow-up questions.
hey requires Ollama running locally. Install it first, then choose your platform:
```sh
brew tap sinsniwal/hey-cli
brew install hey-cli
```

> Tip — Apple Silicon (M1/M2/M3): If you see an error about Rosetta 2 while installing via Homebrew, ensure your terminal is running natively (not under Rosetta emulation). You can force a native installation by running:

```sh
arch -arm64 brew install hey-cli
```
macOS & Linux:

```sh
curl -sL https://raw.githubusercontent.com/sinsniwal/hey-cli/main/install.sh | bash
```

Windows (Scoop):

```powershell
scoop install https://raw.githubusercontent.com/sinsniwal/hey-cli/main/scoop/hey-cli.json
```

Windows (PowerShell):

```powershell
Invoke-WebRequest -Uri "https://raw.githubusercontent.com/sinsniwal/hey-cli/main/install.ps1" -OutFile "$env:TEMP\hey_install.ps1"; & "$env:TEMP\hey_install.ps1"
```

Alternatively, download hey.exe from the latest release. No Python required.
```sh
uv tool install hey-cli-python
```

Note: After installation, authenticate with `ollama login` and pull the default model:

```sh
ollama pull gpt-oss:20b-cloud
```
macOS & Linux:

```sh
curl -sL https://raw.githubusercontent.com/sinsniwal/hey-cli/main/uninstall.sh | bash
```

Windows (PowerShell):

```powershell
Invoke-WebRequest -Uri "https://raw.githubusercontent.com/sinsniwal/hey-cli/main/uninstall.ps1" -OutFile "$env:TEMP\hey_uninstall.ps1"; & "$env:TEMP\hey_uninstall.ps1"
```

```sh
hey <your objective in plain English>
```

| Command | What happens |
|---|---|
| `hey list all running docker containers` | Generates and runs `docker ps` |
| `hey is port 8080 in use?` | Silently runs `lsof -i :8080`, reads output, answers in English |
| `hey forcefully delete all .pyc files` | Generates `find . -name "*.pyc" -delete`, pauses for confirmation |
| `hey compress this folder into a tar.gz` | Generates the correct tar command for your OS |
| `npm run build 2>&1 \| hey what broke?` | Reads piped stderr and explains the error |
| `hey --clear` | Wipes conversational memory |
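The pipe workflow above relies on `2>&1`, which redirects stderr into stdout so both streams reach the pipe. A generic shell sketch (no `hey` required; `emit` is a made-up helper for illustration):

```sh
# emit() writes one line to stdout and one to stderr
emit() { echo "to stdout"; echo "to stderr" >&2; }

# Without 2>&1 only stdout reaches the pipe; with it, both lines do
emit 2>&1 | wc -l
```

This is why the examples use `npm run build 2>&1 | hey ...` rather than a bare pipe — build errors usually land on stderr, which a plain `|` would not forward.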
By default, CLI tools cannot change your terminal's directory because they run in a subshell. To enable hey to change your directory (e.g., hey go to desktop), add the following to your shell configuration (~/.zshrc or ~/.bashrc):
```sh
eval "$(hey --shell-init)"
```

For Windows (PowerShell):

```powershell
hey --shell-init | Out-String | iex
```

| Level | Flag | Behavior |
|---|---|---|
| 0 | `--level 0` | Dry-run — shows the command but never executes |
| 1 | (default) | Supervised — safe commands auto-run, risky ones ask for confirmation |
| 2 | `--level 2` | Unrestricted — executes everything without confirmation |
| 3 | `--level 3` | Troubleshooter — iteratively debugs until the objective is resolved |
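The `--shell-init` integration described earlier is needed because a child process's `cd` never propagates to its parent. The effect is easy to see with a plain subshell:

```sh
# A cd inside parentheses runs in a subshell and is discarded on exit
cd /tmp
( cd / && pwd )   # the subshell is at /
pwd               # the parent shell is still /tmp
```

`eval "$(hey --shell-init)"` works around this by defining a shell function in your current session, so directory changes happen in the parent shell itself rather than in a subprocess.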
hey works locally by default, but it also supports authenticated Ollama instances and custom hosts:
- Standard Login: Most users should run `ollama login` once to authenticate their terminal with the local or cloud instance.
- Auth Key: If you are in a CI/CD or server environment, set the `OLLAMA_API_KEY` environment variable.
- Custom Host: If Ollama is running on a different port or machine, set `OLLAMA_HOST` (e.g., `export OLLAMA_HOST="http://192.168.1.10:11434"`).
- Custom Model: Provide a custom model via `--model`: `hey "summarize this file" --model llama3`
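Host resolution can be emulated with standard shell parameter expansion. The `localhost:11434` fallback below matches Ollama's documented default port; the expansion itself is an illustration, not hey's exact code:

```sh
# Fall back to Ollama's default address when OLLAMA_HOST is unset
unset OLLAMA_HOST
HOST="${OLLAMA_HOST:-http://localhost:11434}"
echo "$HOST"   # http://localhost:11434
```

Exporting `OLLAMA_HOST` before running `hey` is therefore enough to redirect all requests to a remote instance — no config file edits needed.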
Safety is enforced at runtime via a local governance engine (`~/.hey-rules.json`):
- Blocked — `rm -rf /`, `mkfs`, `:(){ :|:& };:` are permanently rejected.
- Explicit Confirm — High-risk operations (`rm`, `truncate`, `DROP`) require typing a keyword to authorize.
- Y/N Confirm — Moderate-risk operations require a quick `y`.
- Auto-Run — Safe diagnostics (`ls`, `cat`, `grep`, `git status`) execute immediately.
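The four tiers could be sketched as ordered pattern matching over the generated command. This is a hypothetical illustration only — hey's actual engine and rule syntax live in `~/.hey-rules.json`:

```sh
# Hypothetical risk classifier: first matching pattern wins,
# so the most dangerous patterns are checked first
classify() {
  case "$1" in
    "rm -rf /"*|mkfs*)               echo "blocked" ;;
    rm\ *|truncate\ *|DROP\ *)       echo "explicit-confirm" ;;
    ls*|cat\ *|grep\ *|"git status") echo "auto-run" ;;
    *)                               echo "yn-confirm" ;;
  esac
}

classify "git status"   # auto-run
classify "rm old.log"   # explicit-confirm
classify "rm -rf /"     # blocked
```

Ordering matters: `rm -rf /` also matches the `rm ` prefix, but the blocked pattern is tested first, so it can never fall through to a mere confirmation prompt.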
Initialize or customize your rules:
```sh
hey --init
```

hey ships with built-in knowledge for macOS, Ubuntu/Debian, Arch Linux, Fedora/RHEL, Windows PowerShell, FreeBSD, and ChromeOS.
Want to improve it for your OS? Add a Markdown file to `hey_cli/skills/` with plain-English rules (e.g., "On Alpine, use `apk add` instead of `apt install`"). The engine loads them dynamically at runtime.
Pull requests for new OS skills are welcome!
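A skill file is plain Markdown. A hypothetical Alpine entry might look like this (the filename and exact wording are illustrative, not a required format):

```markdown
<!-- hey_cli/skills/alpine.md (hypothetical example) -->
# Alpine Linux
- Use `apk add` instead of `apt install`.
- BusyBox `grep` does not support `-P`; prefer `grep -E`.
```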
```
                     hey "your question"
                              │
                              ▼
┌──────────────┐     ┌──────────────────┐
│  CLI Parser  │────▶│ Governance Check │
└──────────────┘     └────────┬─────────┘
                              │
                    ┌─────────▼──────────┐
                    │ Ollama (local LLM) │
                    │  localhost:11434   │
                    └─────────┬──────────┘
                              │
                    ┌─────────▼──────────┐
                    │   Command Runner   │
                    │ (execute / confirm)│
                    └────────────────────┘
```
- Zero external API calls — communicates with Ollama via `localhost:11434` using Python's built-in `urllib`.
- Zero compiled dependencies — the core logic is pure Python (only `rich` is used for formatting).
- Pure Python 3.9–3.14 — works with `uv`, `pipx`, or standard `pip`. No build tools required.
- Instant Speed — installation with `uv` or `brew` binaries takes < 2 seconds.
- Fork the repository
- Create a feature branch: `git checkout -b feature/my-feature`
- Commit changes: `git commit -m "feat: add my feature"`
- Push and open a Pull Request
See RELEASING.md for maintainer release instructions.
MIT — Mohit Singh Sinsniwal
