# ollama

Client for the Ollama endpoints.

## Classes

### OllamaClient

OllamaClient(client: OWUIClientBase)

Bases: ResourceBase

Client for the Ollama endpoints.

This client handles interaction with the Ollama service managed by Open WebUI, including model management (pull, push, create, delete), text generation (chat, completion), and configuration.
Source code in src/owui_client/client_base.py
#### Functions

##### get_status

Check the status of the Ollama service.

Matches GET /ollama/

Returns:

| Type | Description |
|---|---|
| Dict[str, bool] | Dict containing status (e.g. {"status": True}) |
Source code in src/owui_client/routers/ollama.py
##### head_status

Check the status of the Ollama service using a HEAD request.

Returns:

| Type | Description |
|---|---|
| bool | True if the service is reachable. |
Source code in src/owui_client/routers/ollama.py
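A minimal sketch of both status checks, assuming the top-level OpenWebUI client exposes this router as its ollama attribute; the import path and constructor arguments below are illustrative assumptions, not the documented API:

```python
from owui_client import OpenWebUI  # assumed import path

# Hypothetical constructor arguments; check the client docs for the real ones.
client = OpenWebUI(base_url="http://localhost:8080", token="sk-...")

status = client.ollama.get_status()  # e.g. {"status": True}
if status.get("status"):
    print("Ollama service is up")

# head_status() is a lighter liveness probe that returns a bare bool.
if client.ollama.head_status():
    print("Reachable via HEAD")
```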
##### verify_connection

verify_connection(form: ConnectionVerificationForm) -> Any

Verify connection to an external Ollama instance.

Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| form | ConnectionVerificationForm | Configuration containing URL and optional key. | required |

Returns:

| Type | Description |
|---|---|
| Any | Response data from the verification endpoint (usually version info). |
Source code in src/owui_client/routers/ollama.py
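A hedged sketch; the import path and the form's field names (url, key) are guesses inferred from the description above:

```python
from owui_client.models import ConnectionVerificationForm  # assumed import path

# client: the OpenWebUI instance from the get_status example above.
# Field names are assumptions based on "URL and optional key".
form = ConnectionVerificationForm(url="http://ollama.internal:11434", key=None)
info = client.ollama.verify_connection(form)
print(info)  # usually version info
```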
##### get_config

Get the current global Ollama configuration.

Returns:

| Type | Description |
|---|---|
| Dict[str, Any] | Dict containing ENABLE_OLLAMA_API, OLLAMA_BASE_URLS, OLLAMA_API_CONFIGS. |
Source code in src/owui_client/routers/ollama.py
##### update_config

update_config(form: OllamaConfigForm) -> Dict[str, Any]

Update the global Ollama configuration.

Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| form | OllamaConfigForm | New configuration settings. | required |

Returns:

| Type | Description |
|---|---|
| Dict[str, Any] | Updated configuration dictionary. |
Source code in src/owui_client/routers/ollama.py
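A config round trip might look like the following sketch; the OllamaConfigForm field names are assumed to mirror the keys that get_config returns:

```python
from owui_client.models import OllamaConfigForm  # assumed import path

# client: the OpenWebUI instance from the get_status example above.
config = client.ollama.get_config()
print(config["OLLAMA_BASE_URLS"])

# Field names below are assumptions mirroring get_config()'s documented keys.
form = OllamaConfigForm(
    ENABLE_OLLAMA_API=True,
    OLLAMA_BASE_URLS=["http://localhost:11434", "http://gpu-box:11434"],
    OLLAMA_API_CONFIGS={},
)
updated = client.ollama.update_config(form)
```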
##### get_models

List available models.

Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| url_idx | int | Optional index of the Ollama server to query. If None, queries all. | None |

Returns:

| Type | Description |
|---|---|
| Dict[str, Any] | Dict containing list of models. |
Source code in src/owui_client/routers/ollama.py
##### get_loaded_models

List models currently loaded in memory (ps).

Returns:

| Type | Description |
|---|---|
| Dict[str, Any] | Dict containing list of loaded models and their details. |
Source code in src/owui_client/routers/ollama.py
##### get_version

Get the Ollama version.

Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| url_idx | int | Optional index of the Ollama server to query. | None |

Returns:

| Type | Description |
|---|---|
| Dict[str, Any] | Dict containing version string. |
Source code in src/owui_client/routers/ollama.py
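A discovery sketch tying these three read-only calls together; the response keys are assumptions based on Ollama's native API shapes:

```python
# client: the OpenWebUI instance from the get_status example above.

# With url_idx=None, results from all configured Ollama servers are aggregated.
models = client.ollama.get_models()
for m in models.get("models", []):  # "models" key assumed from Ollama's /api/tags shape
    print(m.get("name"))

loaded = client.ollama.get_loaded_models()  # models resident in memory (ps)
version = client.ollama.get_version()       # e.g. {"version": "..."}
```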
##### unload_model

unload_model(form: ModelNameForm) -> Any

Unload a model from memory.

Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| form | ModelNameForm | Form containing the model name. | required |

Returns:

| Type | Description |
|---|---|
| Any | Status of operation. |
Source code in src/owui_client/routers/ollama.py
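For example, to free memory after a batch job (the form's field name is an assumption based on Ollama's own API):

```python
from owui_client.models import ModelNameForm  # assumed import path

# client: the OpenWebUI instance from the get_status example above.
client.ollama.unload_model(ModelNameForm(name="llama3:8b"))  # "name" field assumed
```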
##### pull_model

pull_model(form: ModelNameForm, url_idx: int = 0) -> str

Pull a model from the registry.

Note: This endpoint streams responses (NDJSON) by default. The client will wait for the operation to complete and return the full NDJSON response as a string.

Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| form | ModelNameForm | Form containing the model name. | required |
| url_idx | int | Index of the Ollama server to use (default 0). | 0 |

Returns:

| Type | Description |
|---|---|
| str | The full NDJSON response string. |
Source code in src/owui_client/routers/ollama.py
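Because the client returns the finished NDJSON log as one string, progress events can be recovered line by line, as in this sketch (form field name assumed):

```python
import json

from owui_client.models import ModelNameForm  # assumed import path

# client: the OpenWebUI instance from the get_status example above.
# pull_model blocks until the pull finishes, then returns the whole NDJSON log.
raw = client.ollama.pull_model(ModelNameForm(name="llama3:8b"), url_idx=0)

# Each non-empty line is one JSON progress event; the final event signals completion.
events = [json.loads(line) for line in raw.splitlines() if line.strip()]
print(events[-1])  # e.g. {"status": "success"}
```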
##### push_model

push_model(form: PushModelForm, url_idx: int = None) -> str

Push a model to the registry.

Note: This endpoint streams responses (NDJSON) by default. The client will wait for the operation to complete and return the full NDJSON response as a string.

Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| form | PushModelForm | Form containing model name and options. | required |
| url_idx | int | Optional index of the Ollama server. | None |

Returns:

| Type | Description |
|---|---|
| str | The full NDJSON response string. |
Source code in src/owui_client/routers/ollama.py
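The push contract mirrors pull_model, so the same line-by-line NDJSON parsing applies; a sketch with an assumed form field:

```python
from owui_client.models import PushModelForm  # assumed import path

# client: the OpenWebUI instance from the get_status example above.
# Field name is an assumption; model tag below is a placeholder.
raw = client.ollama.push_model(PushModelForm(name="myuser/llama3-concise:latest"))
```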
##### create_model

create_model(form: CreateModelForm, url_idx: int = 0) -> str

Create a model from a Modelfile.

Note: This endpoint streams responses (NDJSON) by default. The client will wait for the operation to complete and return the full NDJSON response as a string.

Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| form | CreateModelForm | Form containing model name and creation options. | required |
| url_idx | int | Index of the Ollama server (default 0). | 0 |

Returns:

| Type | Description |
|---|---|
| str | The full NDJSON response string. |
Source code in src/owui_client/routers/ollama.py
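A sketch under the assumption that CreateModelForm carries a name plus a raw Modelfile string; the real schema may use structured fields instead, so check the form definition:

```python
from owui_client.models import CreateModelForm  # assumed import path

# client: the OpenWebUI instance from the get_status example above.
# Field names below are assumptions.
form = CreateModelForm(
    name="llama3-concise",
    modelfile="FROM llama3:8b\nSYSTEM Answer as concisely as possible.",
)
raw = client.ollama.create_model(form, url_idx=0)  # blocks, returns the NDJSON log
```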
##### copy_model

copy_model(form: CopyModelForm, url_idx: int = None) -> Any

Copy a model.

Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| form | CopyModelForm | Form containing source and destination names. | required |
| url_idx | int | Optional index of the Ollama server. | None |

Returns:

| Type | Description |
|---|---|
| Any | True if successful. |
Source code in src/owui_client/routers/ollama.py
##### delete_model

delete_model(form: ModelNameForm, url_idx: int = None) -> Any

Delete a model.

Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| form | ModelNameForm | Form containing the model name. | required |
| url_idx | int | Optional index of the Ollama server. | None |

Returns:

| Type | Description |
|---|---|
| Any | True if successful. |
Source code in src/owui_client/routers/ollama.py
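A copy-then-delete sketch; the CopyModelForm field names are assumptions borrowed from Ollama's native copy API:

```python
from owui_client.models import CopyModelForm, ModelNameForm  # assumed import path

# client: the OpenWebUI instance from the get_status example above.
# "source"/"destination" field names are assumptions.
client.ollama.copy_model(CopyModelForm(source="llama3:8b", destination="llama3:backup"))
client.ollama.delete_model(ModelNameForm(name="llama3:backup"))
```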
##### show_model

show_model(form: ModelNameForm) -> Dict[str, Any]

Show information about a model.

Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| form | ModelNameForm | Form containing the model name. | required |

Returns:

| Type | Description |
|---|---|
| Dict[str, Any] | Dict containing model details (modelfile, parameters, etc.). |
Source code in src/owui_client/routers/ollama.py
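For instance, to inspect a model's parameters (form field name assumed, as above):

```python
from owui_client.models import ModelNameForm  # assumed import path

# client: the OpenWebUI instance from the get_status example above.
info = client.ollama.show_model(ModelNameForm(name="llama3:8b"))
print(info.get("parameters"))  # alongside modelfile, template, etc.
```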
##### embed

embed(form: GenerateEmbedForm, url_idx: int = None) -> Dict[str, Any]

Generate embeddings for the given input (new endpoint).

Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| form | GenerateEmbedForm | Form containing model and input. | required |
| url_idx | int | Optional index of the Ollama server. | None |

Returns:

| Type | Description |
|---|---|
| Dict[str, Any] | Response containing embeddings. |
Source code in src/owui_client/routers/ollama.py
##### embeddings

embeddings(form: GenerateEmbeddingsForm, url_idx: int = None) -> Dict[str, Any]

Generate embeddings for the given prompt (legacy endpoint).

Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| form | GenerateEmbeddingsForm | Form containing model and prompt. | required |
| url_idx | int | Optional index of the Ollama server. | None |

Returns:

| Type | Description |
|---|---|
| Dict[str, Any] | Response containing embeddings. |
Source code in src/owui_client/routers/ollama.py
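A sketch of the newer embed call, assuming the form mirrors Ollama's /api/embed schema (model plus input); the legacy embeddings method takes a single prompt instead:

```python
from owui_client.models import GenerateEmbedForm  # assumed import path

# client: the OpenWebUI instance from the get_status example above.
# Field names and the "embeddings" response key are assumptions from /api/embed.
resp = client.ollama.embed(
    GenerateEmbedForm(model="nomic-embed-text", input=["hello", "world"])
)
vectors = resp.get("embeddings")  # one vector per input string
```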
##### generate

generate(form: GenerateCompletionForm, url_idx: int = None) -> Union[Dict[str, Any], str]

Generate a completion for the given prompt.

If stream=True (default), the client waits for the full NDJSON response and returns it as a string. If stream=False, returns a Dict containing the completion.

Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| form | GenerateCompletionForm | Form containing model, prompt, and options. | required |
| url_idx | int | Optional index of the Ollama server. | None |

Returns:

| Type | Description |
|---|---|
| Union[Dict[str, Any], str] | Dict or NDJSON string, depending on the stream setting. |
Source code in src/owui_client/routers/ollama.py
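A non-streaming sketch; the form fields and the response key are assumed to mirror Ollama's /api/generate:

```python
from owui_client.models import GenerateCompletionForm  # assumed import path

# client: the OpenWebUI instance from the get_status example above.
# With stream=False the call returns a plain dict instead of an NDJSON string.
resp = client.ollama.generate(
    GenerateCompletionForm(model="llama3:8b", prompt="Why is the sky blue?", stream=False)
)
print(resp["response"])  # "response" key assumed from /api/generate
```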
##### chat

chat(form: GenerateChatCompletionForm, url_idx: int = None) -> Union[Dict[str, Any], str]

Generate a chat completion.

If stream=True (default), the client waits for the full NDJSON response and returns it as a string. If stream=False, returns a Dict containing the completion.

Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| form | GenerateChatCompletionForm | Form containing model, messages, and options. | required |
| url_idx | int | Optional index of the Ollama server. | None |

Returns:

| Type | Description |
|---|---|
| Union[Dict[str, Any], str] | Dict or NDJSON string, depending on the stream setting. |
Source code in src/owui_client/routers/ollama.py
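A non-streaming chat sketch under the same assumptions about the form schema:

```python
from owui_client.models import GenerateChatCompletionForm  # assumed import path

# client: the OpenWebUI instance from the get_status example above.
resp = client.ollama.chat(
    GenerateChatCompletionForm(
        model="llama3:8b",
        messages=[{"role": "user", "content": "Give me one haiku about tea."}],
        stream=False,  # return a dict rather than the raw NDJSON stream
    )
)
print(resp["message"]["content"])  # response shape assumed from Ollama's /api/chat
```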
##### generate_openai_completion

Generate completion using OpenAI-compatible endpoint.

Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| payload | Dict | OpenAI completion request payload. | required |
| url_idx | int | Optional index of the Ollama server. | None |

Returns:

| Type | Description |
|---|---|
| Dict[str, Any] | OpenAI-compatible completion response. |
Source code in src/owui_client/routers/ollama.py
##### generate_openai_chat_completion

Generate chat completion using OpenAI-compatible endpoint.

Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| payload | Dict | OpenAI chat completion request payload. | required |
| url_idx | int | Optional index of the Ollama server. | None |

Returns:

| Type | Description |
|---|---|
| Dict[str, Any] | OpenAI-compatible chat completion response. |
Source code in src/owui_client/routers/ollama.py
##### get_openai_models

List models using OpenAI-compatible endpoint.

Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| url_idx | int | Optional index of the Ollama server. | None |

Returns:

| Type | Description |
|---|---|
| Dict[str, Any] | OpenAI-compatible model list. |
Source code in src/owui_client/routers/ollama.py
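Since these methods take raw OpenAI-style payload dicts, no form models are needed; a sketch assuming the usual OpenAI response shape:

```python
# client: the OpenWebUI instance from the get_status example above.
resp = client.ollama.generate_openai_chat_completion(
    {
        "model": "llama3:8b",
        "messages": [{"role": "user", "content": "Hello!"}],
    }
)
print(resp["choices"][0]["message"]["content"])

models = client.ollama.get_openai_models()  # OpenAI-style {"object": "list", "data": [...]}
```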
##### download_model

download_model(form: UrlForm, url_idx: int = None) -> str

Download a model from a URL (e.g. Hugging Face).

Note: The backend returns a Server-Sent Events (SSE) stream of progress updates. The client waits for completion and returns the full SSE text.

Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| form | UrlForm | Form containing the URL. | required |
| url_idx | int | Optional index of the Ollama server. | None |

Returns:

| Type | Description |
|---|---|
| str | The full SSE response string. |
Source code in src/owui_client/routers/ollama.py
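A sketch with a placeholder URL; the UrlForm field name is an assumption:

```python
from owui_client.models import UrlForm  # assumed import path

# client: the OpenWebUI instance from the get_status example above.
# Blocks until the download completes and returns the raw SSE progress text.
sse_log = client.ollama.download_model(
    UrlForm(url="https://huggingface.co/some-org/some-model/resolve/main/model.gguf")
)
```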
##### upload_model

Upload a model file to the server.

Note: The backend returns a Server-Sent Events (SSE) stream of progress updates. The client waits for completion and returns the full SSE text.

Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| file_path | Union[str, Path] | Local path to the file to upload. | required |
| url_idx | int | Optional index of the Ollama server. | None |

Returns:

| Type | Description |
|---|---|
| str | The full SSE response string. |

Raises:

| Type | Description |
|---|---|
| FileNotFoundError | If the file does not exist. |
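A sketch; the path is checked locally first, so a missing file fails fast with FileNotFoundError before any network traffic:

```python
from pathlib import Path

# client: the OpenWebUI instance from the get_status example above.
# The path below is a placeholder; raises FileNotFoundError if it does not exist.
sse_log = client.ollama.upload_model(Path("~/models/model.gguf").expanduser())
```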