Ollama Chat & Interaction Commands

ollama chat <model>

What: Starts an interactive chat session with the specified model.
Why: Allows continuous, back-and-forth conversation with the model.
How: Run the command with the model's name.

Example:

ollama chat llama2
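For scripted use, the same conversation can be driven through Ollama's REST API (`POST /api/chat`, served on port 11434 by default). A minimal sketch in Python of building that request; the actual network call is shown commented out so the sketch runs without a server, and the model name `llama2` is taken from the example above:

```python
import json

# Build the request body for Ollama's chat endpoint (POST /api/chat).
# "stream": False asks for one complete JSON response instead of a token stream.
def build_chat_request(model, user_message):
    return {
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
        "stream": False,
    }

payload = build_chat_request("llama2", "Hello!")
body = json.dumps(payload)

# To actually send it (requires a running Ollama server):
# import urllib.request
# req = urllib.request.Request(
#     "http://localhost:11434/api/chat",
#     data=body.encode(),
#     headers={"Content-Type": "application/json"},
# )
# reply = json.loads(urllib.request.urlopen(req).read())
# print(reply["message"]["content"])
```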

--history <file>

What: Loads chat history from a file.
Why: Maintains context across multiple sessions.
How: Pass the path to a history file when starting the chat.

Example:

ollama chat llama2 --history chat_history.json
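The history file's format is not specified above. A plausible sketch, assuming history is stored as a JSON array of role/content messages (an assumption, not a documented format), of saving a session and reloading it next time:

```python
import json
import os
import tempfile

# Assumed format: a JSON array of {"role": ..., "content": ...} messages.
def save_history(path, messages):
    with open(path, "w") as f:
        json.dump(messages, f)

def load_history(path):
    if not os.path.exists(path):
        return []  # no prior session: start with empty context
    with open(path) as f:
        return json.load(f)

# Round-trip demo, using a temporary file in place of chat_history.json.
history = [
    {"role": "user", "content": "Hi"},
    {"role": "assistant", "content": "Hello!"},
]
path = os.path.join(tempfile.mkdtemp(), "chat_history.json")
save_history(path, history)
restored = load_history(path)
```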

--no-history

What: Runs the chat without saving or loading history.
Why: Useful for one-off chats that need no persistent context.
How: Add this flag to the chat command.

Example:

ollama chat llama2 --no-history

--system <string>

What: Overrides the system prompt for the chat session.
Why: Changes the assistant's behavior or personality.
How: Provide the system prompt as a quoted string.

Example:

ollama chat llama2 --system "You are a helpful assistant."
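In Ollama's chat API, the same effect is achieved by placing a `system`-role message first in the messages array, where it steers the whole conversation. A sketch:

```python
# Prepend a system message so it applies to every turn that follows.
def with_system_prompt(system_prompt, messages):
    return [{"role": "system", "content": system_prompt}] + messages

messages = with_system_prompt(
    "You are a helpful assistant.",
    [{"role": "user", "content": "What is Ollama?"}],
)
```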

--max-tokens <number>

What: Limits the maximum number of tokens per response.
Why: Controls response length and, indirectly, response time.
How: Set the flag to the desired token count.

Example:

ollama chat llama2 --max-tokens 150
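Through the REST API, the equivalent knob is the `num_predict` option, which caps how many tokens the model may generate per response. A sketch of a request body carrying that option (150 matches the example above):

```python
# Chat request body with a per-response token cap via options.num_predict.
def build_limited_request(model, user_message, max_tokens):
    return {
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
        "options": {"num_predict": max_tokens},  # cap tokens per response
        "stream": False,
    }

payload = build_limited_request("llama2", "Summarize Ollama in one line.", 150)
```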