Class OllamaStreamingChatModelConfig.BuilderBase.OllamaStreamingChatModelConfigImpl
java.lang.Object
io.helidon.integrations.langchain4j.providers.ollama.OllamaStreamingChatModelConfig.BuilderBase.OllamaStreamingChatModelConfigImpl
- All Implemented Interfaces:
Prototype.Api, OllamaStreamingChatModelConfig
- Enclosing class:
OllamaStreamingChatModelConfig.BuilderBase<BUILDER extends OllamaStreamingChatModelConfig.BuilderBase<BUILDER,PROTOTYPE>, PROTOTYPE extends OllamaStreamingChatModelConfig>
protected static class OllamaStreamingChatModelConfig.BuilderBase.OllamaStreamingChatModelConfigImpl
extends Object
implements OllamaStreamingChatModelConfig
Generated implementation of the prototype; it can be extended by descendant prototype implementations.
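For orientation, below is a minimal sketch of obtaining this prototype programmatically. It assumes the usual Helidon prototype factory OllamaStreamingChatModelConfig.builder() and fluent setters mirroring the accessors documented below; builder(), enabled(boolean), baseUrl(String) and modelName(String) are not shown on this page and should be verified against OllamaStreamingChatModelConfig.Builder.

// Hedged sketch: builder(), enabled(boolean), baseUrl(String) and modelName(String)
// are assumed to follow the standard Helidon builder conventions.
import io.helidon.integrations.langchain4j.providers.ollama.OllamaStreamingChatModelConfig;

public class OllamaStreamingChatModelConfigExample {
    public static void main(String[] args) {
        OllamaStreamingChatModelConfig config = OllamaStreamingChatModelConfig.builder()
                .enabled(true)                      // disabled by default, must be switched on
                .baseUrl("http://localhost:11434")  // assumed local Ollama endpoint
                .modelName("llama3")                // hypothetical model name
                .build();

        // build() returns the prototype; this generated impl class is what backs it.
        System.out.println("Ollama streaming chat model enabled: " + config.enabled());
    }
}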
-
Nested Class Summary
Nested classes/interfaces inherited from interface io.helidon.integrations.langchain4j.providers.ollama.OllamaStreamingChatModelConfig
OllamaStreamingChatModelConfig.Builder, OllamaStreamingChatModelConfig.BuilderBase<BUILDER extends OllamaStreamingChatModelConfig.BuilderBase<BUILDER,PROTOTYPE>, PROTOTYPE extends OllamaStreamingChatModelConfig>
-
Field Summary
Fields
static final String CONFIG_ROOT
The root configuration key for this builder.
-
Constructor Summary
Constructors
protected OllamaStreamingChatModelConfigImpl(OllamaStreamingChatModelConfig.BuilderBase<?, ?> builder)
Create an instance providing a builder.
-
Method Summary
- baseUrl() - Generated from OllamaBaseChatModel.Builder.baseUrl(java.lang.String)
- default dev.langchain4j.model.ollama.OllamaStreamingChatModel.OllamaStreamingChatModelBuilder configuredBuilder() - Skipped: build - doesn't have exactly one parameter; supportedCapabilities - property already exists -> B extends dev.langchain4j.model.ollama.OllamaBaseChatModel.Builder<C extends dev.langchain4j.model.ollama.OllamaBaseChatModel> supportedCapabilities(java.util.Set<dev.langchain4j.model.chat.Capability>); build - doesn't have exactly one parameter
- customHeaders() - Generated from OllamaBaseChatModel.Builder.customHeaders(java.util.Map)
- Optional<dev.langchain4j.model.chat.request.ChatRequestParameters> defaultRequestParameters() - Generated from OllamaBaseChatModel.Builder.defaultRequestParameters(dev.langchain4j.model.chat.request.ChatRequestParameters)
- boolean enabled() - If set to false (default), OllamaStreamingChatModel will not be available even if configured.
- boolean equals(Object)
- int hashCode()
- Optional<dev.langchain4j.http.client.HttpClientBuilder> httpClientBuilder() - Generated from OllamaBaseChatModel.Builder.httpClientBuilder(dev.langchain4j.http.client.HttpClientBuilder)
- List<dev.langchain4j.model.chat.listener.ChatModelListener> listeners() - Generated from OllamaBaseChatModel.Builder.listeners(java.util.List)
- Optional<org.slf4j.Logger> logger() - Generated from OllamaBaseChatModel.Builder.logger(org.slf4j.Logger)
- logRequests() - Generated from OllamaBaseChatModel.Builder.logRequests(java.lang.Boolean)
- logResponses() - Generated from OllamaBaseChatModel.Builder.logResponses(java.lang.Boolean)
- minP() - Generated from OllamaBaseChatModel.Builder.minP(java.lang.Double)
- mirostat() - Generated from OllamaBaseChatModel.Builder.mirostat(java.lang.Integer)
- mirostatEta() - Generated from OllamaBaseChatModel.Builder.mirostatEta(java.lang.Double)
- mirostatTau() - Generated from OllamaBaseChatModel.Builder.mirostatTau(java.lang.Double)
- modelName() - Generated from OllamaBaseChatModel.Builder.modelName(java.lang.String)
- numCtx() - Generated from OllamaBaseChatModel.Builder.numCtx(java.lang.Integer)
- numPredict() - Generated from OllamaBaseChatModel.Builder.numPredict(java.lang.Integer)
- repeatLastN() - Generated from OllamaBaseChatModel.Builder.repeatLastN(java.lang.Integer)
- repeatPenalty() - Generated from OllamaBaseChatModel.Builder.repeatPenalty(java.lang.Double)
- Optional<dev.langchain4j.model.chat.request.ResponseFormat> responseFormat() - Generated from OllamaBaseChatModel.Builder.responseFormat(dev.langchain4j.model.chat.request.ResponseFormat)
- returnThinking() - Generated from OllamaBaseChatModel.Builder.returnThinking(java.lang.Boolean)
- seed() - Generated from OllamaBaseChatModel.Builder.seed(java.lang.Integer)
- stop() - Generated from OllamaBaseChatModel.Builder.stop(java.util.List)
- Set<dev.langchain4j.model.chat.Capability> supportedCapabilities() - Generated from OllamaBaseChatModel.Builder.supportedCapabilities(java.util.Set)
- temperature() - Generated from OllamaBaseChatModel.Builder.temperature(java.lang.Double)
- think() - Generated from OllamaBaseChatModel.Builder.think(java.lang.Boolean)
- timeout() - Generated from OllamaBaseChatModel.Builder.timeout(java.time.Duration)
- topK() - Generated from OllamaBaseChatModel.Builder.topK(java.lang.Integer)
- topP() - Generated from OllamaBaseChatModel.Builder.topP(java.lang.Double)
- toString()
-
Field Details
-
CONFIG_ROOT
static final String CONFIG_ROOT
The root configuration key for this builder.
-
-
Constructor Details
-
OllamaStreamingChatModelConfigImpl
protected OllamaStreamingChatModelConfigImpl(OllamaStreamingChatModelConfig.BuilderBase<?, ?> builder)
Create an instance providing a builder.
- Parameters:
builder - extending builder base of this prototype
-
-
Method Details
-
enabled
public boolean enabled()
Description copied from interface: OllamaStreamingChatModelConfig
If set to false (default), OllamaStreamingChatModel will not be available even if configured.
- Specified by:
enabled in interface OllamaStreamingChatModelConfig
- Returns:
- whether OllamaStreamingChatModel is enabled, defaults to false
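Because enabled() defaults to false, the streaming model stays inactive until configuration switches it on. Below is a minimal sketch of that guard, assuming the create(Config) factory that Helidon prototypes typically generate; the configuration key used here is hypothetical and stands in for the CONFIG_ROOT constant documented above.

// Hedged sketch: create(Config) is assumed to follow standard Helidon prototype conventions.
import io.helidon.config.Config;
import io.helidon.integrations.langchain4j.providers.ollama.OllamaStreamingChatModelConfig;

public class EnabledGuardExample {
    public static void main(String[] args) {
        Config config = Config.create(); // e.g. application.yaml on the classpath

        // "langchain4j.ollama.streaming-chat-model" is a hypothetical key used for
        // illustration; the real root key is the CONFIG_ROOT constant above.
        OllamaStreamingChatModelConfig modelConfig = OllamaStreamingChatModelConfig
                .create(config.get("langchain4j.ollama.streaming-chat-model"));

        if (!modelConfig.enabled()) {
            // enabled() defaults to false: nothing is built unless enabled=true is configured
            System.out.println("Ollama streaming chat model is disabled");
            return;
        }
        // ... proceed to build the model, see configuredBuilder() below
    }
}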
-
mirostat
Description copied from interface: OllamaStreamingChatModelConfig
Generated from OllamaBaseChatModel.Builder.mirostat(java.lang.Integer)
- Specified by:
mirostat in interface OllamaStreamingChatModelConfig
- Returns:
- Integer property
-
seed
Description copied from interface: OllamaStreamingChatModelConfig
Generated from OllamaBaseChatModel.Builder.seed(java.lang.Integer)
- Specified by:
seed in interface OllamaStreamingChatModelConfig
- Returns:
- Integer property
-
logger
Description copied from interface: OllamaStreamingChatModelConfig
Generated from OllamaBaseChatModel.Builder.logger(org.slf4j.Logger)
- Specified by:
logger in interface OllamaStreamingChatModelConfig
- Returns:
- Logger property
-
responseFormat
Description copied from interface: OllamaStreamingChatModelConfig
Generated from OllamaBaseChatModel.Builder.responseFormat(dev.langchain4j.model.chat.request.ResponseFormat)
- Specified by:
responseFormat in interface OllamaStreamingChatModelConfig
- Returns:
- ResponseFormat property
-
defaultRequestParameters
public Optional<dev.langchain4j.model.chat.request.ChatRequestParameters> defaultRequestParameters()
Description copied from interface: OllamaStreamingChatModelConfig
Generated from OllamaBaseChatModel.Builder.defaultRequestParameters(dev.langchain4j.model.chat.request.ChatRequestParameters)
- Specified by:
defaultRequestParameters in interface OllamaStreamingChatModelConfig
- Returns:
- ChatRequestParameters property
-
minP
Description copied from interface: OllamaStreamingChatModelConfig
Generated from OllamaBaseChatModel.Builder.minP(java.lang.Double)
- Specified by:
minP in interface OllamaStreamingChatModelConfig
- Returns:
- Double property
-
timeout
Description copied from interface: OllamaStreamingChatModelConfig
Generated from OllamaBaseChatModel.Builder.timeout(java.time.Duration)
- Specified by:
timeout in interface OllamaStreamingChatModelConfig
- Returns:
- Duration property
-
httpClientBuilder
Description copied from interface: OllamaStreamingChatModelConfig
Generated from OllamaBaseChatModel.Builder.httpClientBuilder(dev.langchain4j.http.client.HttpClientBuilder)
- Specified by:
httpClientBuilder in interface OllamaStreamingChatModelConfig
- Returns:
- HttpClientBuilder property
-
temperature
Description copied from interface: OllamaStreamingChatModelConfig
Generated from OllamaBaseChatModel.Builder.temperature(java.lang.Double)
- Specified by:
temperature in interface OllamaStreamingChatModelConfig
- Returns:
- Double property
-
numCtx
Description copied from interface: OllamaStreamingChatModelConfig
Generated from OllamaBaseChatModel.Builder.numCtx(java.lang.Integer)
- Specified by:
numCtx in interface OllamaStreamingChatModelConfig
- Returns:
- Integer property
-
repeatPenalty
Description copied from interface: OllamaStreamingChatModelConfig
Generated from OllamaBaseChatModel.Builder.repeatPenalty(java.lang.Double)
- Specified by:
repeatPenalty in interface OllamaStreamingChatModelConfig
- Returns:
- Double property
-
numPredict
Description copied from interface: OllamaStreamingChatModelConfig
Generated from OllamaBaseChatModel.Builder.numPredict(java.lang.Integer)
- Specified by:
numPredict in interface OllamaStreamingChatModelConfig
- Returns:
- Integer property
-
customHeaders
Description copied from interface: OllamaStreamingChatModelConfig
Generated from OllamaBaseChatModel.Builder.customHeaders(java.util.Map)
- Specified by:
customHeaders in interface OllamaStreamingChatModelConfig
- Returns:
- Map property
-
topK
Description copied from interface: OllamaStreamingChatModelConfig
Generated from OllamaBaseChatModel.Builder.topK(java.lang.Integer)
- Specified by:
topK in interface OllamaStreamingChatModelConfig
- Returns:
- Integer property
-
think
Description copied from interface: OllamaStreamingChatModelConfig
Generated from OllamaBaseChatModel.Builder.think(java.lang.Boolean)
- Specified by:
think in interface OllamaStreamingChatModelConfig
- Returns:
- Boolean property
-
mirostatEta
Description copied from interface: OllamaStreamingChatModelConfig
Generated from OllamaBaseChatModel.Builder.mirostatEta(java.lang.Double)
- Specified by:
mirostatEta in interface OllamaStreamingChatModelConfig
- Returns:
- Double property
-
listeners
Description copied from interface: OllamaStreamingChatModelConfig
Generated from OllamaBaseChatModel.Builder.listeners(java.util.List)
- Specified by:
listeners in interface OllamaStreamingChatModelConfig
- Returns:
- List property
-
supportedCapabilities
Description copied from interface: OllamaStreamingChatModelConfig
Generated from OllamaBaseChatModel.Builder.supportedCapabilities(java.util.Set)
- Specified by:
supportedCapabilities in interface OllamaStreamingChatModelConfig
- Returns:
- Set property
-
logResponses
Description copied from interface: OllamaStreamingChatModelConfig
Generated from OllamaBaseChatModel.Builder.logResponses(java.lang.Boolean)
- Specified by:
logResponses in interface OllamaStreamingChatModelConfig
- Returns:
- Boolean property
-
topP
Description copied from interface: OllamaStreamingChatModelConfig
Generated from OllamaBaseChatModel.Builder.topP(java.lang.Double)
- Specified by:
topP in interface OllamaStreamingChatModelConfig
- Returns:
- Double property
-
logRequests
Description copied from interface: OllamaStreamingChatModelConfig
Generated from OllamaBaseChatModel.Builder.logRequests(java.lang.Boolean)
- Specified by:
logRequests in interface OllamaStreamingChatModelConfig
- Returns:
- Boolean property
-
returnThinking
Description copied from interface: OllamaStreamingChatModelConfig
Generated from OllamaBaseChatModel.Builder.returnThinking(java.lang.Boolean)
- Specified by:
returnThinking in interface OllamaStreamingChatModelConfig
- Returns:
- Boolean property
-
modelName
Description copied from interface: OllamaStreamingChatModelConfig
Generated from OllamaBaseChatModel.Builder.modelName(java.lang.String)
- Specified by:
modelName in interface OllamaStreamingChatModelConfig
- Returns:
- String property
-
baseUrl
Description copied from interface: OllamaStreamingChatModelConfig
Generated from OllamaBaseChatModel.Builder.baseUrl(java.lang.String)
- Specified by:
baseUrl in interface OllamaStreamingChatModelConfig
- Returns:
- String property
-
stop
Description copied from interface: OllamaStreamingChatModelConfig
Generated from OllamaBaseChatModel.Builder.stop(java.util.List)
- Specified by:
stop in interface OllamaStreamingChatModelConfig
- Returns:
- List property
-
mirostatTau
Description copied from interface: OllamaStreamingChatModelConfig
Generated from OllamaBaseChatModel.Builder.mirostatTau(java.lang.Double)
- Specified by:
mirostatTau in interface OllamaStreamingChatModelConfig
- Returns:
- Double property
-
repeatLastN
Description copied from interface: OllamaStreamingChatModelConfig
Generated from OllamaBaseChatModel.Builder.repeatLastN(java.lang.Integer)
- Specified by:
repeatLastN in interface OllamaStreamingChatModelConfig
- Returns:
- Integer property
-
toString
-
equals
-
hashCode
public int hashCode()
-
configuredBuilder
default dev.langchain4j.model.ollama.OllamaStreamingChatModel.OllamaStreamingChatModelBuilder configuredBuilder()
Skipped:
- build - doesn't have exactly one parameter
- supportedCapabilities - property already exists -> B extends dev.langchain4j.model.ollama.OllamaBaseChatModel.Builder<C extends dev.langchain4j.model.ollama.OllamaBaseChatModel> supportedCapabilities(java.util.Set<dev.langchain4j.model.chat.Capability>)
- build - doesn't have exactly one parameter
- Returns:
- Actual Lc4j model builder configured with this blueprint.
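In other words, configuredBuilder() hands back the LangChain4j OllamaStreamingChatModelBuilder already populated from this prototype, and any builder method it exposes (for example temperature(Double) from OllamaBaseChatModel.Builder) can still override individual values before build(). A minimal sketch, assuming the prototype instance comes from one of the factories sketched earlier:

// Hedged sketch: the prototype is assumed to have been created via its builder or config factory.
import dev.langchain4j.model.ollama.OllamaStreamingChatModel;
import io.helidon.integrations.langchain4j.providers.ollama.OllamaStreamingChatModelConfig;

public final class ConfiguredBuilderExample {
    private ConfiguredBuilderExample() {
    }

    static OllamaStreamingChatModel streamingModel(OllamaStreamingChatModelConfig config) {
        return config.configuredBuilder()   // LangChain4j builder pre-filled from this blueprint
                .temperature(0.2)           // optional override on top of the configured values
                .build();                   // build() is not mapped as a property (see Skipped above), so call it directly
    }
}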
-