Ollama is simple to install and runs on macOS, Linux, and Windows:
macOS Installation:

```bash
brew install ollama
```
Linux Installation:

```bash
curl -fsSL https://ollama.com/install.sh | sh
```
Windows Installation: Download the installer from the official website and run it.
Common Commands:
- Run a model:

```bash
ollama run llama2
ollama run mistral
ollama run codellama
```
- List installed models:

```bash
ollama list
```
- Download a model:

```bash
ollama pull llama2
```
- Remove a model:

```bash
ollama rm llama2
```
- View model information:

```bash
ollama show llama2
```
- Start the API service:

```bash
ollama serve
```
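By default the server listens on `127.0.0.1:11434`. The Ollama docs describe an `OLLAMA_HOST` environment variable for changing the bind address; the address and port below are arbitrary example values, not defaults:

```shell
# Bind the API server to all interfaces on a custom port
# (0.0.0.0:8080 is an illustrative choice, not a recommendation).
OLLAMA_HOST=0.0.0.0:8080 ollama serve
```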
- View logs: the Ollama CLI has no `logs` subcommand. On Linux installs managed by systemd, use the journal; on macOS, read the server log file directly:

```bash
# Linux (systemd)
journalctl -u ollama

# macOS
cat ~/.ollama/logs/server.log
```
API Call Example:

```bash
curl http://localhost:11434/api/generate -d '{
  "model": "llama2",
  "prompt": "Why is the sky blue?"
}'
```
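By default `/api/generate` streams its answer as newline-delimited JSON objects, each carrying a `"response"` fragment. A minimal sketch of joining those fragments into the full answer; since it assumes no running server, a here-doc stands in for the curl output (the two sample lines are invented for illustration):

```shell
# Stand-in for the streamed output of:
#   curl -s http://localhost:11434/api/generate -d '{"model":"llama2","prompt":"..."}'
cat > /tmp/ollama_stream.ndjson <<'EOF'
{"model":"llama2","response":"The sky ","done":false}
{"model":"llama2","response":"is blue.","done":true}
EOF

# Concatenate the "response" fragments into one string.
python3 - <<'EOF'
import json
with open("/tmp/ollama_stream.ndjson") as f:
    text = "".join(json.loads(line)["response"] for line in f if line.strip())
print(text)
EOF
```

Alternatively, adding `"stream": false` to the request body makes the server return a single JSON object instead of a stream.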