Ollama Configuration & Environment Variables

OLLAMA_API_KEY

What: Environment variable that holds your Ollama API key.
Why: Authenticates your requests to the Ollama API.
How: Set it in your shell before running the Ollama CLI or SDK.

Example:

export OLLAMA_API_KEY="your_api_key_here"
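If you call the HTTP API directly, one common convention is to send the key as a Bearer token. Treat this as a sketch rather than guaranteed behavior: a default local server accepts unauthenticated requests, hosted endpoints may expect a different header, and the host below is a placeholder.

curl https://your-ollama-host/api/generate \
  -H "Authorization: Bearer $OLLAMA_API_KEY" \
  -d '{"model": "llama2", "prompt": "Hello"}'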

OLLAMA_HOST

What: Points the Ollama CLI and SDKs at a custom host endpoint.
Why: Useful when Ollama runs on a remote server or a non-default port.
How: Set the variable to the full URL, including scheme and port.

Example:

export OLLAMA_HOST="http://localhost:11434"
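The Ollama CLI (and typically the official SDKs) picks up OLLAMA_HOST automatically, so a single export redirects all of them. A quick reachability check is to list the models on that host via the /api/tags endpoint:

curl "$OLLAMA_HOST/api/tags"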

OLLAMA_MODEL_PATH

What: Overrides the default directory for locally stored models.
Why: Useful if you keep models on a larger disk or in a shared location.
How: Set the variable to that directory before starting the server.

Example:

export OLLAMA_MODEL_PATH="/custom/models/path"
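For example, to keep models on a larger disk, create the directory before starting the server. The variable name here follows this guide; verify it against your installed version:

mkdir -p /custom/models/path
export OLLAMA_MODEL_PATH="/custom/models/path"
ollama serve   # the server now stores and loads models under the custom path
ollama list    # from another shell: confirm your models are visible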

OLLAMA_LOG_LEVEL

What: Sets the logging verbosity.
Why: Increase detail while debugging, or cut log noise otherwise.
How: Set it to one of DEBUG, INFO, WARN, or ERROR.

Example:

export OLLAMA_LOG_LEVEL="DEBUG"
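To avoid noisy logs in everyday use, you can also scope the level to a single server run instead of exporting it globally:

OLLAMA_LOG_LEVEL=DEBUG ollama serve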

config.yaml

What: Ollama configuration file for setting default CLI options.
Why: Avoids retyping the same flags on every invocation.
How: Create config.yaml in your home or project directory.

Example:

default_model: llama2
temperature: 0.7
max_tokens: 200
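
Assuming the file lives in your home directory as described above (the exact lookup path depends on your installation), you can create it from the shell in one step:

cat > ~/config.yaml <<'EOF'
default_model: llama2
temperature: 0.7
max_tokens: 200
EOF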