Chat

The Chat endpoint allows you to send structured messages to a model and receive a conversational response. It's useful for creating chatbots, interactive assistants, or multi-turn conversations.

public class OllamaChatExample : MonoBehaviour
{
    private async void Start()
    {
        // Point the client at the local Ollama server
        var api = new OllamaClient("http://localhost:11434");

        // A chat request with a single user message
        var request = new ChatRequest
        {
            Model = "gpt-oss:latest",
            Messages =
            {
                new ChatMessage { Role = "user", Content = "Hello!" }
            }
        };

        // Await the response and log the assistant's reply
        ChatResponse response = await api.ChatAsync(request);
        Debug.Log(response.Message.Content);
    }
}
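
Because a ChatRequest carries the full message history, a multi-turn conversation is just a matter of appending earlier replies to Messages before the next call. Here is a minimal sketch continuing inside the Start() method above; the follow-up prompt and the "assistant" role label are illustrative assumptions:

// First turn
ChatResponse first = await api.ChatAsync(request);

// Keep the reply in the history, then ask a follow-up question
request.Messages.Add(new ChatMessage { Role = "assistant", Content = first.Message.Content });
request.Messages.Add(new ChatMessage { Role = "user", Content = "Can you say that in one sentence?" });

// Second turn: the model now sees the whole conversation so far
ChatResponse second = await api.ChatAsync(request);
Debug.Log(second.Message.Content);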

Unity Extension Methods (Non-Async)

For Unity projects, you can use the provided extension methods to avoid async/await entirely. These methods run the request on a background thread and invoke their callbacks on the Unity main thread.

public class OllamaChatExtensionExample : MonoBehaviour
{
    private void Start()
    {
        var api = new OllamaClient("http://localhost:11434");

        var request = new ChatRequest
        {
            Model = "gpt-oss:latest",
            Messages =
            {
                new ChatMessage { Role = "user", Content = "Hello!" }
            }
        };

        // Kick off the request; the callbacks are invoked on the Unity main thread
        api.Chat(
            request,
            OnChatSuccess,
            OnChatError
        );
    }

    private void OnChatSuccess(ChatResponse response)
    {
        Debug.Log(response.Message.Content);
    }

    private void OnChatError(System.Exception ex)
    {
        Debug.LogError(ex);
    }
}
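
The success and error handlers are ordinary delegates, so for short handling you can pass inline lambdas instead of named methods (assuming Chat accepts standard callback delegates, as the method-group usage above implies):

api.Chat(
    request,
    response => Debug.Log(response.Message.Content),
    ex => Debug.LogError(ex)
);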