Ollama

First, make sure you have completed these pre-requisites:

- Install Ollama and make sure it is running locally (by default it listens on http://localhost:11434).
- Pull the model you want to use, for example with ollama pull llama3.2.
- Run a self-hosted Arcade Engine, since Ollama is not available on Arcade Cloud.

You can confirm the first two with the sketch below.
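
To check that Ollama is up and the model has been pulled, you can query Ollama's local /api/tags endpoint, which lists locally available models. This is a minimal sketch, assuming Ollama's default port 11434 and the llama3.2 model used throughout this page:

import json
import urllib.request

# Ollama's /api/tags endpoint lists the models that have been pulled locally
with urllib.request.urlopen("http://localhost:11434/api/tags") as resp:
    models = [m["name"] for m in json.load(resp)["models"]]

print(models)  # e.g. ['llama3.2:latest']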

Connecting

To send a message to Ollama, you can use the OpenAI client; Arcade uses the OpenAI interface internally to send messages to Ollama.

Ollama is currently not supported on Arcade Cloud, but it can be used with a self-hosted engine.
import os

from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:9099/v1",  # Where your self-hosted Arcade Engine is running
    api_key=os.environ.get("ARCADE_API_KEY"),
)

response = client.chat.completions.create(
    model="ollama/llama3.2",
    user="[email protected]",
    messages=[{"role": "user", "content": "hello"}],
    stream=False,
)

print(response.choices[0].message.content)
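
Streaming works the same way through the OpenAI client. This is a minimal sketch, reusing the client from above and the same placeholder user; with stream=True the client yields incremental chunks:

stream = client.chat.completions.create(
    model="ollama/llama3.2",
    user="[email protected]",
    messages=[{"role": "user", "content": "hello"}],
    stream=True,
)

# Print each incremental piece of the response as it arrives
for chunk in stream:
    delta = chunk.choices[0].delta.content
    if delta:
        print(delta, end="", flush=True)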

Configuration

Here is a simple example of enabling Ollama in the Arcade Engine configuration:

llm:
  models:
    - id: ollama
      openai: # Use the OpenAI interface for Ollama
        base_url: http://localhost:11434
        chat_endpoint: /v1/chat/completions
        model: llama3.2
        api_key: ollama # Required, but ignored
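
After restarting the engine with this configuration, requests addressed to ollama/llama3.2 (which appears to combine the id and model fields above) should be proxied to your local Ollama instance. A quick way to verify, reusing the client from the Connecting section:

# A small round-trip confirms the engine is routing requests to Ollama
response = client.chat.completions.create(
    model="ollama/llama3.2",
    messages=[{"role": "user", "content": "Say pong."}],
)
print(response.choices[0].message.content)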

For more advanced configuration, see the model docs.