ChatRequest

data class ChatRequest(
    val model: String,
    val messages: List<Message>,
    val tools: List<Tool>? = null,
    val think: Boolean? = null,
    val format: Format? = null,
    val options: ModelOptions? = null,
    val stream: Boolean = true,
    val keepAlive: String? = null
)

Represents a request to generate the next message in a chat with a provided model.

See Ollama API for details.
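For illustration, a minimal sketch of building a request. The Message(role, content) constructor shown here is an assumption (the Message class is documented separately), and "llama3.2" is just an example model name:

// Hypothetical usage sketch — Message(role, content) is an assumption.
val request = ChatRequest(
    model = "llama3.2",
    messages = listOf(
        Message(role = "system", content = "You are a concise assistant."),
        Message(role = "user", content = "Why is the sky blue?")
    ),
    stream = false,     // return one complete response object instead of a stream
    keepAlive = "5m"    // keep the model loaded for 5 minutes after the request
)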

Constructors

constructor(model: String, messages: List<Message>, tools: List<Tool>? = null, think: Boolean? = null, format: Format? = null, options: ModelOptions? = null, stream: Boolean = true, keepAlive: String? = null)

Properties

val format: Format?

The format to return a response in. Format can be "json" or a JSON schema.

@SerialName(value = "keep_alive")
val keepAlive: String?

Controls how long the model will stay loaded into memory following the request (default: "5m")

val messages: List<Message>

The messages of the chat, used to keep chat memory.

val model: String

The model name to use for generation (required)

val options: ModelOptions?

Additional model parameters such as temperature. See Valid Parameters and Values.

val stream: Boolean

If false, the response will be returned as a single response object rather than a stream of objects.

val think: Boolean?

Should the model think before responding? (for thinking models)

val tools: List<Tool>?

List of tools in JSON for the model to use if supported.
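A second sketch, exercising only the primitive optional fields. Here "deepseek-r1" is just an illustrative name for a thinking-capable model, and Message(role, content) remains an assumption as above:

// Hypothetical usage sketch for a thinking-capable model.
val thinkingRequest = ChatRequest(
    model = "deepseek-r1",
    messages = listOf(Message(role = "user", content = "Plan a three-day trip to Kyoto.")),
    think = true,       // ask the model to reason before producing its answer
    keepAlive = "10m"   // keep the model in memory for 10 minutes after this request
)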