Interface OllamaStreamingChatModelConfig
- All Superinterfaces:
Prototype.Api
- All Known Implementing Classes:
OllamaStreamingChatModelConfig.BuilderBase.OllamaStreamingChatModelConfigImpl
Configuration interface for OllamaStreamingChatModel, generated from its blueprint definition.
Nested Class Summary
- static class OllamaStreamingChatModelConfig.Builder - Fluent API builder for OllamaStreamingChatModelConfig.
- static class OllamaStreamingChatModelConfig.BuilderBase<BUILDER extends OllamaStreamingChatModelConfig.BuilderBase<BUILDER, PROTOTYPE>, PROTOTYPE extends OllamaStreamingChatModelConfig> - Fluent API builder base for OllamaStreamingChatModelConfig.
-
Field Summary
- static final String CONFIG_ROOT - The root configuration key for this builder.
-
Method Summary
- baseUrl() - Generated from OllamaBaseChatModel.Builder.baseUrl(java.lang.String)
- builder() - Create a new fluent API builder to customize configuration.
- builder(OllamaStreamingChatModelConfig instance) - Create a new fluent API builder from an existing instance.
- default dev.langchain4j.model.ollama.OllamaStreamingChatModel.OllamaStreamingChatModelBuilder configuredBuilder() - Actual Lc4j model builder configured with this blueprint.
- create() - Create a new instance with default values.
- create(Config config) - Create a new instance from configuration.
- create(io.helidon.config.Config config) - Deprecated. Create a new instance from configuration.
- customHeaders() - Generated from OllamaBaseChatModel.Builder.customHeaders(java.util.Map)
- Optional<dev.langchain4j.model.chat.request.ChatRequestParameters> defaultRequestParameters() - Generated from OllamaBaseChatModel.Builder.defaultRequestParameters(dev.langchain4j.model.chat.request.ChatRequestParameters)
- boolean enabled() - If set to false (default), OllamaStreamingChatModel will not be available even if configured.
- Optional<dev.langchain4j.http.client.HttpClientBuilder> httpClientBuilder() - Generated from OllamaBaseChatModel.Builder.httpClientBuilder(dev.langchain4j.http.client.HttpClientBuilder)
- List<dev.langchain4j.model.chat.listener.ChatModelListener> listeners() - Generated from OllamaBaseChatModel.Builder.listeners(java.util.List)
- Optional<org.slf4j.Logger> logger() - Generated from OllamaBaseChatModel.Builder.logger(org.slf4j.Logger)
- logRequests() - Generated from OllamaBaseChatModel.Builder.logRequests(java.lang.Boolean)
- logResponses() - Generated from OllamaBaseChatModel.Builder.logResponses(java.lang.Boolean)
- minP() - Generated from OllamaBaseChatModel.Builder.minP(java.lang.Double)
- mirostat() - Generated from OllamaBaseChatModel.Builder.mirostat(java.lang.Integer)
- mirostatEta() - Generated from OllamaBaseChatModel.Builder.mirostatEta(java.lang.Double)
- mirostatTau() - Generated from OllamaBaseChatModel.Builder.mirostatTau(java.lang.Double)
- modelName() - Generated from OllamaBaseChatModel.Builder.modelName(java.lang.String)
- numCtx() - Generated from OllamaBaseChatModel.Builder.numCtx(java.lang.Integer)
- numPredict() - Generated from OllamaBaseChatModel.Builder.numPredict(java.lang.Integer)
- repeatLastN() - Generated from OllamaBaseChatModel.Builder.repeatLastN(java.lang.Integer)
- repeatPenalty() - Generated from OllamaBaseChatModel.Builder.repeatPenalty(java.lang.Double)
- Optional<dev.langchain4j.model.chat.request.ResponseFormat> responseFormat() - Generated from OllamaBaseChatModel.Builder.responseFormat(dev.langchain4j.model.chat.request.ResponseFormat)
- returnThinking() - Generated from OllamaBaseChatModel.Builder.returnThinking(java.lang.Boolean)
- seed() - Generated from OllamaBaseChatModel.Builder.seed(java.lang.Integer)
- stop() - Generated from OllamaBaseChatModel.Builder.stop(java.util.List)
- Set<dev.langchain4j.model.chat.Capability> supportedCapabilities() - Generated from OllamaBaseChatModel.Builder.supportedCapabilities(java.util.Set)
- temperature() - Generated from OllamaBaseChatModel.Builder.temperature(java.lang.Double)
- think() - Generated from OllamaBaseChatModel.Builder.think(java.lang.Boolean)
- timeout() - Generated from OllamaBaseChatModel.Builder.timeout(java.time.Duration)
- topK() - Generated from OllamaBaseChatModel.Builder.topK(java.lang.Integer)
- topP() - Generated from OllamaBaseChatModel.Builder.topP(java.lang.Double)
-
Field Details
-
CONFIG_ROOT
static final String CONFIG_ROOT
The root configuration key for this builder.
-
Method Details
-
builder
Create a new fluent API builder to customize configuration.
- Returns: a new builder
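For illustration, a minimal sketch of configuring the prototype programmatically. The builder setter names are assumed to mirror the getters documented below (baseUrl, modelName, temperature, enabled); check the generated OllamaStreamingChatModelConfig.Builder for the exact signatures.

    // Sketch only: setter names assumed to mirror the documented getters.
    OllamaStreamingChatModelConfig config = OllamaStreamingChatModelConfig.builder()
            .baseUrl("http://localhost:11434")   // Ollama endpoint (example value)
            .modelName("llama3")                 // model served by Ollama (example value)
            .temperature(0.2)
            .enabled(true)                       // disabled by default
            .build();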
-
builder
Create a new fluent API builder from an existing instance.
- Parameters: instance - an existing instance used as a base for the builder
- Returns: a builder based on an instance
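As a sketch, the instance-based builder can be used to derive a modified copy of an existing configuration; the temperature setter name is assumed to mirror the temperature() getter.

    // existingConfig is any previously built OllamaStreamingChatModelConfig.
    OllamaStreamingChatModelConfig tuned = OllamaStreamingChatModelConfig.builder(existingConfig)
            .temperature(0.7)   // override a single property, keep the rest
            .build();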
-
create
Create a new instance from configuration.
- Parameters: config - used to configure the new instance
- Returns: a new instance configured from configuration
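A hedged sketch of creating the prototype from Helidon configuration; passing the node rooted at CONFIG_ROOT is an assumption here, so verify which config node this method expects.

    import io.helidon.config.Config;

    Config root = Config.create();  // loads the default application configuration
    OllamaStreamingChatModelConfig chatConfig =
            OllamaStreamingChatModelConfig.create(root.get(OllamaStreamingChatModelConfig.CONFIG_ROOT));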
-
create
Deprecated.
Create a new instance from configuration.
- Parameters: config - used to configure the new instance
- Returns: a new instance configured from configuration
-
create
Create a new instance with default values.
- Returns: a new instance
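For instance (sketch), a default-valued instance can be useful in tests; note that enabled() defaults to false.

    // All properties take their defaults; enabled() is false by default.
    OllamaStreamingChatModelConfig defaults = OllamaStreamingChatModelConfig.create();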
-
enabled
boolean enabled()
If set to false (default), OllamaStreamingChatModel will not be available even if configured.
- Returns: whether OllamaStreamingChatModel is enabled, defaults to false
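A sketch of how a consumer might honor this flag before building the model; chatConfig here is any built OllamaStreamingChatModelConfig instance.

    if (chatConfig.enabled()) {
        // Only build the streaming model when explicitly enabled in configuration.
        var model = chatConfig.configuredBuilder().build();
    }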
-
mirostat
Generated from OllamaBaseChatModel.Builder.mirostat(java.lang.Integer)
- Returns: Integer property
-
seed
Generated from OllamaBaseChatModel.Builder.seed(java.lang.Integer)
- Returns: Integer property
-
logger
Optional<org.slf4j.Logger> logger()
Generated from OllamaBaseChatModel.Builder.logger(org.slf4j.Logger)
- Returns: Logger property
-
responseFormat
Optional<dev.langchain4j.model.chat.request.ResponseFormat> responseFormat()
Generated from OllamaBaseChatModel.Builder.responseFormat(dev.langchain4j.model.chat.request.ResponseFormat)
- Returns: ResponseFormat property
-
defaultRequestParameters
Optional<dev.langchain4j.model.chat.request.ChatRequestParameters> defaultRequestParameters()
Generated from OllamaBaseChatModel.Builder.defaultRequestParameters(dev.langchain4j.model.chat.request.ChatRequestParameters)
- Returns: ChatRequestParameters property
-
minP
Generated from OllamaBaseChatModel.Builder.minP(java.lang.Double)
- Returns: Double property
-
timeout
Generated from OllamaBaseChatModel.Builder.timeout(java.time.Duration)
- Returns: Duration property
-
httpClientBuilder
Optional<dev.langchain4j.http.client.HttpClientBuilder> httpClientBuilder()
Generated from OllamaBaseChatModel.Builder.httpClientBuilder(dev.langchain4j.http.client.HttpClientBuilder)
- Returns: HttpClientBuilder property
-
temperature
Generated from OllamaBaseChatModel.Builder.temperature(java.lang.Double)
- Returns: Double property
-
numCtx
Generated from OllamaBaseChatModel.Builder.numCtx(java.lang.Integer)
- Returns: Integer property
-
repeatPenalty
Generated from OllamaBaseChatModel.Builder.repeatPenalty(java.lang.Double)
- Returns: Double property
-
numPredict
Generated from OllamaBaseChatModel.Builder.numPredict(java.lang.Integer)
- Returns: Integer property
-
customHeaders
Generated from OllamaBaseChatModel.Builder.customHeaders(java.util.Map)
- Returns: Map property
-
topK
Generated from OllamaBaseChatModel.Builder.topK(java.lang.Integer)
- Returns: Integer property
-
think
Generated from OllamaBaseChatModel.Builder.think(java.lang.Boolean)
- Returns: Boolean property
-
mirostatEta
Generated from OllamaBaseChatModel.Builder.mirostatEta(java.lang.Double)
- Returns: Double property
-
listeners
List<dev.langchain4j.model.chat.listener.ChatModelListener> listeners()
Generated from OllamaBaseChatModel.Builder.listeners(java.util.List)
- Returns: List property
-
supportedCapabilities
Set<dev.langchain4j.model.chat.Capability> supportedCapabilities()
Generated from OllamaBaseChatModel.Builder.supportedCapabilities(java.util.Set)
- Returns: Set property
-
logResponses
Generated from OllamaBaseChatModel.Builder.logResponses(java.lang.Boolean)
- Returns: Boolean property
-
topP
Generated from OllamaBaseChatModel.Builder.topP(java.lang.Double)
- Returns: Double property
-
logRequests
Generated from OllamaBaseChatModel.Builder.logRequests(java.lang.Boolean)
- Returns: Boolean property
-
returnThinking
Generated from OllamaBaseChatModel.Builder.returnThinking(java.lang.Boolean)
- Returns: Boolean property
-
modelName
Generated from OllamaBaseChatModel.Builder.modelName(java.lang.String)
- Returns: String property
-
baseUrl
Generated from OllamaBaseChatModel.Builder.baseUrl(java.lang.String)
- Returns: String property
-
stop
Generated from OllamaBaseChatModel.Builder.stop(java.util.List)
- Returns: List property
-
mirostatTau
Generated from OllamaBaseChatModel.Builder.mirostatTau(java.lang.Double)
- Returns: Double property
-
repeatLastN
Generated from OllamaBaseChatModel.Builder.repeatLastN(java.lang.Integer)
- Returns: Integer property
-
configuredBuilder
default dev.langchain4j.model.ollama.OllamaStreamingChatModel.OllamaStreamingChatModelBuilder configuredBuilder()
Skipped:
- build - doesn't have exactly one parameter
- supportedCapabilities - property already exists -> B extends dev.langchain4j.model.ollama.OllamaBaseChatModel.Builder<C extends dev.langchain4j.model.ollama.OllamaBaseChatModel> supportedCapabilities(java.util.Set<dev.langchain4j.model.chat.Capability>)
- Returns: Actual Lc4j model builder configured with this blueprint.
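A minimal sketch of turning this blueprint into an actual langchain4j model, assuming the standard langchain4j builder contract where build() produces an OllamaStreamingChatModel; chatConfig is any built configuration instance.

    import dev.langchain4j.model.ollama.OllamaStreamingChatModel;

    // The returned builder is pre-populated from this configuration;
    // any remaining defaults come from langchain4j.
    OllamaStreamingChatModel model = chatConfig.configuredBuilder().build();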