ChatRequest
data class ChatRequest(
    val model: String,
    val messages: List<Message>,
    val tools: List<Tool>? = null,
    val think: Boolean? = null,
    val format: Format? = null,
    val options: ModelOptions? = null,
    val stream: Boolean = true,
    val keepAlive: String? = null
)
Represents a request to generate the next message in a chat with a provided model.
See Ollama API for details.
Constructors
Properties
keepAlive

Controls how long the model stays loaded in memory after the request (default: "5m").
options

Additional model parameters such as temperature. See Valid Parameters and Values.
stream

If false, the response is returned as a single response object rather than a stream of objects.
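The properties above can be sketched in use as follows. This is a minimal, self-contained illustration: the `Message`, `Tool`, `Format`, and `ModelOptions` stubs stand in for the library's own types (the `Message` fields here are an assumption based on the Ollama chat API), and `ChatRequest` is reproduced from the signature documented above.

```kotlin
// Illustrative stand-ins for types the library provides.
// Message's fields (role, content) are assumed from the Ollama chat API.
data class Message(val role: String, val content: String)
class Tool
class Format
class ModelOptions

// Reproduced from the documented signature.
data class ChatRequest(
    val model: String,
    val messages: List<Message>,
    val tools: List<Tool>? = null,
    val think: Boolean? = null,
    val format: Format? = null,
    val options: ModelOptions? = null,
    val stream: Boolean = true,
    val keepAlive: String? = null
)

fun main() {
    // A non-streaming request that keeps the model loaded for 10 minutes
    // instead of the default "5m".
    val request = ChatRequest(
        model = "llama3.2",
        messages = listOf(Message(role = "user", content = "Why is the sky blue?")),
        stream = false,
        keepAlive = "10m"
    )
    println(request)
}
```

Because `stream` defaults to `true`, omitting it yields a stream of response objects; setting it to `false`, as here, requests a single response object.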