ez-term
Natural language in. Shell commands out.
100% local · ollama · open source
What it does
Type what you want. Get the command. Review, then run.
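A hypothetical round trip might look like this. The English prompt and the suggested command below are illustrative, not output captured from ez-term:

```shell
# You ask: "find all log files larger than 10MB in the current directory"
# ez-term might propose a command like this for you to review before running:
find . -name "*.log" -size +10M
```

The point is the workflow: the tool suggests, you inspect, and only then do you execute.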
AI-Powered
Local language models translate plain English into accurate shell commands.
Runs Locally
100% local execution via Ollama. No cloud API calls. No data leaving your machine.
Open Source
Read the code. Fork it. Audit it. No black boxes.
Context-Aware
Knows your shell, current directory, and environment. Suggestions match where you are.
Questions
The short answers.
Does ez-term send my data to the cloud?

No. ez-term uses Ollama to run AI models locally on your machine. Nothing leaves your device.
What AI models does it support?
Any model supported by Ollama — Llama, Mistral, CodeLlama, and many others.
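Models are fetched once with Ollama's own CLI and then run entirely offline. The model name here is an example, and the guard skips the commands if Ollama isn't installed:

```shell
# Download a model once; later runs need no network at all.
if command -v ollama > /dev/null 2>&1; then
  ollama pull llama3.2   # fetch the model weights
  ollama list            # show models installed locally
fi
```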
Do I need a powerful computer?
Not necessarily. A modern machine with 8 GB+ of RAM runs smaller models comfortably; larger models benefit from more RAM and a GPU.
Is it safe to run generated commands?
ez-term shows you the command before execution. Always review commands before you run them.