MockOllama

open class MockOllama(port: Int = 0, verbose: Boolean = true) : AbstractMockLlm

Mock implementation of an Ollama-compatible service for testing purposes.

This class provides an HTTP mock server that simulates the Ollama APIs, including the generate completion (/api/generate), chat completion (/api/chat), and embeddings (/api/embed) endpoints. It mimics the behavior of the Ollama APIs locally to facilitate integration testing and development.

Extends AbstractMockLlm to provide Ollama-specific functionality.
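As a rough illustration of the idea (not MockOllama's actual implementation), an embedded JDK `HttpServer` can stand in for an Ollama endpoint in the same way: bind to port 0, register a handler for `/api/chat`, and return a canned JSON body. The model name and response body below are made up for the example.

```kotlin
import com.sun.net.httpserver.HttpServer
import java.net.InetSocketAddress
import java.net.URI
import java.net.http.HttpClient
import java.net.http.HttpRequest
import java.net.http.HttpResponse

fun main() {
    // Bind to port 0 so the OS assigns any free port, as MockOllama does by default.
    val server = HttpServer.create(InetSocketAddress(0), 0)
    server.createContext("/api/chat") { exchange ->
        // Canned response body, loosely shaped like an Ollama chat reply (illustrative only).
        val body = """{"model":"llama3","message":{"role":"assistant","content":"Hello!"},"done":true}"""
        exchange.sendResponseHeaders(200, body.toByteArray().size.toLong())
        exchange.responseBody.use { it.write(body.toByteArray()) }
    }
    server.start()

    // Exercise the mock endpoint with the JDK HTTP client.
    val response = HttpClient.newHttpClient().send(
        HttpRequest.newBuilder(URI("http://localhost:${server.address.port}/api/chat"))
            .POST(HttpRequest.BodyPublishers.ofString("""{"model":"llama3"}"""))
            .build(),
        HttpResponse.BodyHandlers.ofString()
    )
    println(response.statusCode())
    server.stop(0)
}
```

MockOllama wraps this kind of setup behind a configurable, assertion-friendly API instead of raw handler code.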

Author

Konstantin Pavlov

Parameters

port

The port on which the mock server will run. Defaults to 0, which allows the server to select an available port.

verbose

Controls whether the mock server's operations are logged in detail. Defaults to true.
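The port-0 default relies on the standard ephemeral-port convention: binding to port 0 asks the operating system to pick any free port. A quick self-contained check of that behavior (plain `java.net`, independent of MockOllama):

```kotlin
import java.net.ServerSocket

fun main() {
    // Binding to port 0 asks the OS for any free ephemeral port --
    // the same convention behind MockOllama's default `port = 0`.
    ServerSocket(0).use { socket ->
        println(socket.localPort > 0)  // a real, nonzero port was assigned
    }
}
```

This is why tests should read the actual port back (see port() below) rather than hard-coding one.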

Constructors

constructor(port: Int = 0, verbose: Boolean = true)

Functions

open override fun baseUrl(): String

Returns the base URL of the mock Ollama server.


Sets up a mock handler for the Ollama chat completion (/api/chat) endpoint.


Provides a Java-compatible overload for configuring a mock Ollama chat request using a Consumer.


Sets up a mock handler for the Ollama /api/embed endpoint, allowing configuration of request matching for embedding requests.


Provides a Java-compatible overload for configuring a mock /api/embed endpoint using a Consumer.


Sets up a mock handler for the Ollama /api/generate completion endpoint.


Configures a mock /api/generate endpoint using a Java Consumer to specify the request criteria.

fun port(): Int

Returns the actual port on which the mock server is listening.
open fun shutdown(gracePeriodMillis: Long, timeoutMillis: Long)

Shuts down the mock server, waiting up to gracePeriodMillis for in-flight requests to complete before forcing termination after timeoutMillis.
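The two-parameter (gracePeriodMillis, timeoutMillis) signature follows the common two-phase graceful-shutdown pattern. A self-contained sketch of that pattern using a JDK executor (illustrative only, not MockOllama's code):

```kotlin
import java.util.concurrent.Executors
import java.util.concurrent.TimeUnit

fun main() {
    val pool = Executors.newSingleThreadExecutor()
    pool.submit { Thread.sleep(50) }   // simulated in-flight work

    pool.shutdown()                    // phase 1: stop accepting new work
    // Grace period: wait politely for in-flight work; on timeout, force termination.
    if (!pool.awaitTermination(2, TimeUnit.SECONDS)) {
        pool.shutdownNow()             // phase 2: hard stop
    }
    println(pool.isTerminated)
}
```

Calling shutdown with a short grace period keeps test suites fast while still letting pending mock responses flush.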