

The PydanticAI integration provides Chat and Embedding model classes that automatically download models and start local servers as needed.

Chat model usage

Here is a minimal setup example for using PydanticAI chat models with Anaconda AI:

from pydantic import BaseModel
from anaconda_ai.integrations.pydantic_ai import AnacondaChatModel

class UserInfo(BaseModel):
    name: str
    age: int

model = AnacondaChatModel(
    "OpenHermes-2.5-Mistral-7B_Q4_K_M.gguf",
    extra_options={
        'ctx_size': 4096,      # context window size in tokens
        'n_gpu_layers': 20,    # number of layers to offload to the GPU
        'temp': 0.7            # sampling temperature
    }
)
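Once created, the model can be handed to a PydanticAI Agent, which is how the UserInfo schema above comes into play. The sketch below is illustrative: it assumes a recent pydantic_ai release where structured output is requested via output_type (older releases use result_type), and the helper name extract_user is not part of either library. Imports are deferred into the function so the schema can be read and tested without the packages installed.

```python
from pydantic import BaseModel


class UserInfo(BaseModel):
    name: str
    age: int


def extract_user(prompt: str) -> UserInfo:
    # Deferred imports: these require pydantic-ai and anaconda-ai to
    # be installed; the class names follow the examples in this page.
    from pydantic_ai import Agent
    from anaconda_ai.integrations.pydantic_ai import AnacondaChatModel

    model = AnacondaChatModel("OpenHermes-2.5-Mistral-7B_Q4_K_M.gguf")
    # Ask the agent to return output validated against UserInfo.
    agent = Agent(model, output_type=UserInfo)
    return agent.run_sync(prompt).output
```

Calling extract_user("Samuel is 32 years old.") would download the model (if needed), start the server, and return a validated UserInfo instance.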

To use an already running server, pass server/<server-name> as the model name:

model = AnacondaChatModel("server/my-server")

Embedding model usage

from anaconda_ai.integrations.pydantic_ai import AnacondaEmbeddingModel

embed = AnacondaEmbeddingModel(
    "sentence-transformers/bge-small-en-v1.5/q4_k_m"
)

# embed() is a coroutine, so it must be awaited inside an async function
result = await embed.embed("cat", input_type="document")

To use an already running server, pass server/<server-name> as the model name:

embed = AnacondaEmbeddingModel("server/my-server")
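Because embed() is a coroutine, it must run inside an event loop. Below is a minimal sketch of wrapping it in an async helper for use from synchronous code; the helper name embed_document is an assumption for illustration, and the deferred import requires the anaconda-ai package.

```python
async def embed_document(text: str):
    # Deferred import: requires the anaconda-ai package; the class
    # name matches the example above.
    from anaconda_ai.integrations.pydantic_ai import AnacondaEmbeddingModel

    embed = AnacondaEmbeddingModel(
        "sentence-transformers/bge-small-en-v1.5/q4_k_m"
    )
    return await embed.embed(text, input_type="document")

# From synchronous code, run it with:
#   vector = asyncio.run(embed_document("cat"))
```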
For more information on using PydanticAI, see the official documentation.