ollama

Pydantic models for Ollama endpoints.

Classes

OllamaConfigForm

Bases: BaseModel

Configuration for Ollama API settings.

This form is used to update the global Ollama configuration, including enabling or disabling the API, setting base URLs, and configuring per-connection settings such as API keys.

Attributes

ENABLE_OLLAMA_API
ENABLE_OLLAMA_API: Optional[bool] = None

Whether to enable the Ollama API integration.

OLLAMA_BASE_URLS
OLLAMA_BASE_URLS: List[str]

A list of base URLs for Ollama instances (e.g., http://localhost:11434).

OLLAMA_API_CONFIGS
OLLAMA_API_CONFIGS: Dict[str, Any]

A dictionary mapping URL indices (as strings) or URLs to configuration objects.

Dict Fields
  • enable (bool, optional): Whether this specific URL is enabled.
  • key (str, optional): API key for authentication (if required).
  • prefix_id (str, optional): A prefix to prepend to model names from this source.
  • tags (List[str], optional): Tags to apply to models from this source.
  • model_ids (List[str], optional): Allowlist of model IDs to show.
  • connection_type (str, optional): Type of connection (e.g., "local").

access_control
access_control: Optional[dict] = None

Access control configuration for Ollama resources.

Dict Fields
  • read (dict, optional): Read access control configuration.
    • group_ids (List[str], optional): List of group IDs that have read access.
    • user_ids (List[str], optional): List of user IDs that have read access.
  • write (dict, optional): Write access control configuration.
    • group_ids (List[str], optional): List of group IDs that have write access.
    • user_ids (List[str], optional): List of user IDs that have write access.

When access_control is None, resources are publicly accessible. When access_control is provided, only specified users and groups have access.
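
For orientation, the sketch below assembles a configuration payload with the shape documented above. All field names come from this page; the concrete URLs, API key, tags, model IDs, group IDs, and user IDs are placeholder values.

```python
# Example payload matching OllamaConfigForm. Values are placeholders.
ollama_config_payload = {
    "ENABLE_OLLAMA_API": True,
    "OLLAMA_BASE_URLS": [
        "http://localhost:11434",
        "https://ollama.internal.example",
    ],
    "OLLAMA_API_CONFIGS": {
        # Keyed by URL index (as a string) or by URL.
        "0": {"enable": True, "connection_type": "local"},
        "1": {
            "enable": True,
            "key": "example-api-key",      # placeholder API key
            "prefix_id": "remote",         # prepended to model names from this source
            "tags": ["gpu"],
            "model_ids": ["llama3:8b"],    # allowlist of models to show
        },
    },
    "access_control": {
        "read": {"group_ids": ["engineering-group-id"], "user_ids": []},
        "write": {"group_ids": [], "user_ids": ["admin-user-id"]},
    },
}
```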

ModelNameForm

Bases: BaseModel

Form for specifying a model name.

Used in various operations like unloading, deleting, or showing model information.

Attributes

model
model: Optional[str] = None

The model identifier (e.g., llama2:latest).

PushModelForm

Bases: BaseModel

Form for pushing a model to a registry.

Attributes

model
model: str

The name of the model to push.

insecure
insecure: Optional[bool] = None

Allow insecure connections to the registry.

stream
stream: Optional[bool] = None

Whether to stream the progress of the push operation.
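
As a rough sketch of how the optional fields behave, the snippet below mirrors the documented fields in a locally defined Pydantic model (for illustration only; the real class lives in this module) and shows that fields left at None can be dropped when building a request body.

```python
from typing import Optional

from pydantic import BaseModel


# Local mirror of the documented PushModelForm fields, for illustration only.
class PushModelFormSketch(BaseModel):
    model: str
    insecure: Optional[bool] = None
    stream: Optional[bool] = None


form = PushModelFormSketch(model="myuser/mymodel:latest", insecure=False)

# With Pydantic v2, optional fields left at None can be omitted from the body.
print(form.model_dump(exclude_none=True))
# {'model': 'myuser/mymodel:latest', 'insecure': False}
```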

CreateModelForm

Bases: BaseModel

Form for creating a new model.

Attributes

model
model: Optional[str] = None

The name of the model to create.

stream
stream: Optional[bool] = None

Whether to stream the progress of the creation.

path
path: Optional[str] = None

Path to the model file (deprecated in newer Ollama versions).

CopyModelForm

Bases: BaseModel

Form for copying a model.

Attributes

source
source: str

The name of the source model.

destination
destination: str

The name of the destination model.

GenerateEmbedForm

Bases: BaseModel

Form for generating embeddings.

Attributes

model
model: str

The model to use for embeddings.

input
input: Union[List[str], str]

The input text or list of texts to embed.

truncate
truncate: Optional[bool] = None

Whether to truncate the input to the model's context length.

options
options: Optional[Dict[str, Any]] = None

Model options (e.g., temperature, context size).

Dict Fields

See Ollama Modelfile documentation for valid parameters.

keep_alive
keep_alive: Optional[Union[int, str]] = None

How long to keep the model loaded (e.g., "5m").
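
Because input accepts either a single string or a list of strings, one request can embed a whole batch. The sketch below shows both shapes; the model name and option values are illustrative.

```python
# Single-text embedding request.
single = {
    "model": "nomic-embed-text",
    "input": "Why is the sky blue?",
}

# Batch embedding request with optional tuning fields.
batch = {
    "model": "nomic-embed-text",
    "input": ["Why is the sky blue?", "Why is grass green?"],
    "truncate": True,              # clip each input to the model's context length
    "options": {"num_ctx": 2048},  # see the Ollama Modelfile docs for parameters
    "keep_alive": "5m",            # keep the model loaded for five minutes
}
```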

GenerateEmbeddingsForm

Bases: BaseModel

Form for generating embeddings (legacy endpoint).

Attributes

model
model: str

The model to use.

prompt
prompt: str

The prompt text to embed.

options
options: Optional[Dict[str, Any]] = None

Model options.

Dict Fields

See Ollama Modelfile documentation for valid parameters.

keep_alive
keep_alive: Optional[Union[int, str]] = None

How long to keep the model loaded.
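
The legacy form differs from GenerateEmbedForm mainly in taking a single prompt string rather than input. A minimal request might look like this (model name illustrative):

```python
# Legacy embeddings request: one prompt string per call.
legacy_embeddings = {
    "model": "nomic-embed-text",
    "prompt": "Why is the sky blue?",
    "keep_alive": "5m",
}
```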

GenerateCompletionForm

Bases: BaseModel

Form for generating a completion (single prompt).

Attributes

model
model: str

The model to use.

prompt
prompt: str

The prompt text.

suffix
suffix: Optional[str] = None

A suffix to append to the generated text (for infilling).

images
images: Optional[List[str]] = None

A list of base64-encoded images.

format
format: Optional[Union[Dict[str, Any], str]] = None

The format of the response (e.g., "json").

Dict Fields

If provided as a dictionary, it should be a JSON Schema to enforce a specific output structure.

options
options: Optional[Dict[str, Any]] = None

Model parameters like temperature, top_k, etc.

Dict Fields

See Ollama Modelfile documentation for valid parameters.

system
system: Optional[str] = None

System prompt to override the model's default.

template
template: Optional[str] = None

Prompt template to override the model's default.

context
context: Optional[List[int]] = None

Context parameter returned from a previous request (legacy).

stream
stream: Optional[bool] = True

Whether to stream the response.

raw
raw: Optional[bool] = None

If True, no formatting (templating) is applied and the prompt is passed to the model as-is.

keep_alive
keep_alive: Optional[Union[int, str]] = None

How long to keep the model loaded.
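
To make the format field concrete: passing a dict is treated as a JSON Schema the output must follow, while passing the string "json" simply requests JSON output. A sketch, with an illustrative model name and schema:

```python
# Completion request constraining the output to a JSON Schema via `format`.
completion_request = {
    "model": "llama3.1",
    "prompt": "Name the capital of France as JSON.",
    "format": {
        "type": "object",
        "properties": {"capital": {"type": "string"}},
        "required": ["capital"],
    },
    "options": {"temperature": 0},  # low temperature for a stable structured answer
    "stream": False,                # return a single complete response
    "keep_alive": "5m",
}
```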

ChatMessage

Bases: BaseModel

A message in a chat conversation.

Attributes

role
role: str

The role of the message sender (e.g., "user", "assistant", "system").

content
content: Optional[str] = None

The content of the message.

tool_calls
tool_calls: Optional[List[Dict[str, Any]]] = None

List of tool calls generated by the model.

Dict Fields
  • index (int, optional): The index of the tool call in the sequence.
  • id (str, optional): Unique identifier for the tool call.
  • function (dict, required): Function call details.
    • name (str, required): Name of the function/tool to call.
    • arguments (dict, required): JSON-serializable arguments for the function.

images
images: Optional[List[str]] = None

List of base64-encoded images included in the message.
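
Two illustrative messages built from the fields above: an assistant turn that carries a tool call, and a user turn with an attached image. The function name, arguments, and image data are invented placeholders.

```python
# Assistant message that requests a tool call instead of replying with text.
assistant_message = {
    "role": "assistant",
    "content": None,
    "tool_calls": [
        {
            "index": 0,
            "id": "call_0",
            "function": {
                "name": "get_current_weather",  # hypothetical tool
                "arguments": {"city": "Paris", "unit": "celsius"},
            },
        }
    ],
}

# User message with an attached base64-encoded image (truncated placeholder).
user_message = {
    "role": "user",
    "content": "What is in this picture?",
    "images": ["iVBORw0KGgoAAAANSUhEUg..."],
}
```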

GenerateChatCompletionForm

Bases: BaseModel

Form for generating a chat completion.

Attributes

model
model: str

The model to use.

messages
messages: List[ChatMessage]

The conversation history.

format
format: Optional[Union[Dict[str, Any], str]] = None

Response format (e.g., "json").

Dict Fields

If provided as a dictionary, it should be a JSON Schema to enforce a specific output structure.

options
options: Optional[Dict[str, Any]] = None

Model parameters.

Dict Fields

See Ollama Modelfile documentation for valid parameters.

template
template: Optional[str] = None

Prompt template to use.

stream
stream: Optional[bool] = True

Whether to stream the response.

keep_alive
keep_alive: Optional[Union[int, str]] = None

How long to keep the model loaded.

tools
tools: Optional[List[Dict[str, Any]]] = None

List of tools available to the model.

Dict Fields
  • type (str, required): The type of tool, e.g. "function".
  • function (dict, required): The function definition.
    • name (str, required): The name of the function.
    • description (str, optional): A description of the function.
    • parameters (dict, required): A JSON schema defining the function parameters.
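
Putting the pieces together, here is a sketch of a chat-completion request that exposes a single tool. The messages reuse the ChatMessage shape above, the tool definition follows the dict fields just listed, and the weather function is hypothetical.

```python
# Chat-completion request advertising one callable tool to the model.
chat_request = {
    "model": "llama3.1",
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "What's the weather in Paris right now?"},
    ],
    "tools": [
        {
            "type": "function",
            "function": {
                "name": "get_current_weather",  # hypothetical tool
                "description": "Look up the current weather for a city.",
                "parameters": {
                    "type": "object",
                    "properties": {
                        "city": {"type": "string"},
                        "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
                    },
                    "required": ["city"],
                },
            },
        }
    ],
    "stream": False,
    "keep_alive": "5m",
}
```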

ConnectionVerificationForm

Bases: BaseModel

Form for verifying an Ollama connection.

Attributes

url
url: str

The URL of the Ollama instance.

key
key: Optional[str] = None

The API key for authentication.

UrlForm

Bases: BaseModel

Form containing a URL, used for downloading models.

Attributes

url
url: str

The URL to process (e.g., a Hugging Face model URL).

UploadBlobForm

Bases: BaseModel

Form for uploading a blob (model file).

Attributes

filename
filename: str

The name of the file being uploaded.