Class JlamaStreamingChatModelConfig.BuilderBase.JlamaStreamingChatModelConfigImpl
java.lang.Object
io.helidon.integrations.langchain4j.providers.jlama.JlamaStreamingChatModelConfig.BuilderBase.JlamaStreamingChatModelConfigImpl
- All Implemented Interfaces:
Prototype.Api, JlamaStreamingChatModelConfig
- Enclosing class:
JlamaStreamingChatModelConfig.BuilderBase<BUILDER extends JlamaStreamingChatModelConfig.BuilderBase<BUILDER,PROTOTYPE>, PROTOTYPE extends JlamaStreamingChatModelConfig>
protected static class JlamaStreamingChatModelConfig.BuilderBase.JlamaStreamingChatModelConfigImpl
extends Object
implements JlamaStreamingChatModelConfig
Generated implementation of the prototype; it can be extended by descendant prototype implementations.
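The pattern behind this generated class can be sketched in isolation: a prototype interface, a self-typed builder base, and a protected implementation class that the builder's build() returns. All names below (ChatConfig, ChatConfigImpl, PrototypeSketch) are hypothetical stand-ins for illustration, not the real Helidon-generated API.

```java
import java.util.Optional;

interface ChatConfig {

    Optional<String> modelName();

    boolean enabled();

    // The self-referential BUILDER type parameter lets descendant builders
    // return their own concrete type from every setter.
    abstract class BuilderBase<BUILDER extends BuilderBase<BUILDER, PROTOTYPE>,
                               PROTOTYPE extends ChatConfig> {

        private String modelName;
        private boolean enabled = true; // mirrors enabled() defaulting to true

        @SuppressWarnings("unchecked")
        private BUILDER self() {
            return (BUILDER) this;
        }

        public BUILDER modelName(String modelName) {
            this.modelName = modelName;
            return self();
        }

        public BUILDER enabled(boolean enabled) {
            this.enabled = enabled;
            return self();
        }

        // Generated implementation of the prototype; descendant prototype
        // implementations can extend it.
        protected static class ChatConfigImpl implements ChatConfig {

            private final String modelName;
            private final boolean enabled;

            // Create an instance providing a builder.
            protected ChatConfigImpl(BuilderBase<?, ?> builder) {
                this.modelName = builder.modelName;
                this.enabled = builder.enabled;
            }

            @Override
            public Optional<String> modelName() {
                return Optional.ofNullable(modelName);
            }

            @Override
            public boolean enabled() {
                return enabled;
            }
        }
    }

    class Builder extends BuilderBase<Builder, ChatConfig> {
        public ChatConfig build() {
            return new ChatConfigImpl(this);
        }
    }
}

public class PrototypeSketch {
    public static void main(String[] args) {
        ChatConfig config = new ChatConfig.Builder()
                .modelName("example-model") // hypothetical value
                .build();
        System.out.println(config.modelName().orElse("<unset>"));
        System.out.println(config.enabled());
    }
}
```

The self-referential generic (`BUILDER extends BuilderBase<BUILDER, PROTOTYPE>`) is what allows fluent setters on a descendant builder to keep returning the descendant type rather than the base type.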
-
Nested Class Summary
Nested classes/interfaces inherited from interface io.helidon.integrations.langchain4j.providers.jlama.JlamaStreamingChatModelConfig:
JlamaStreamingChatModelConfig.Builder, JlamaStreamingChatModelConfig.BuilderBase<BUILDER extends JlamaStreamingChatModelConfig.BuilderBase<BUILDER,PROTOTYPE>, PROTOTYPE extends JlamaStreamingChatModelConfig>
Field Summary
Fields
PROVIDER_KEY - AI provider config key.
CONFIG_ROOT - The root configuration key for this builder.
Constructor Summary
Constructors
protected JlamaStreamingChatModelConfigImpl(builder) - Create an instance providing a builder.
Method Summary
authToken() - Generated from JlamaStreamingChatModel.JlamaStreamingChatModelBuilder.authToken(java.lang.String)
boolean enabled() - If set to false, JlamaStreamingChatModel will not be available even if configured.
boolean equals(Object)
int hashCode()
maxTokens() - Generated from JlamaStreamingChatModel.JlamaStreamingChatModelBuilder.maxTokens(java.lang.Integer)
modelCachePath() - Generated from JlamaStreamingChatModel.JlamaStreamingChatModelBuilder.modelCachePath(java.nio.file.Path)
modelName() - Configure the model name.
quantizeModelAtRuntime() - Generated from JlamaStreamingChatModel.JlamaStreamingChatModelBuilder.quantizeModelAtRuntime(java.lang.Boolean)
temperature() - Generated from JlamaStreamingChatModel.JlamaStreamingChatModelBuilder.temperature(java.lang.Float)
threadCount() - Generated from JlamaStreamingChatModel.JlamaStreamingChatModelBuilder.threadCount(java.lang.Integer)
toString()
workingDirectory() - Generated from JlamaStreamingChatModel.JlamaStreamingChatModelBuilder.workingDirectory(java.nio.file.Path)
Optional<com.github.tjake.jlama.safetensors.DType> workingQuantizedType() - Generated from JlamaStreamingChatModel.JlamaStreamingChatModelBuilder.workingQuantizedType(com.github.tjake.jlama.safetensors.DType)
Methods inherited from class java.lang.Object:
clone, finalize, getClass, notify, notifyAll, wait, wait, wait
Methods inherited from interface io.helidon.integrations.langchain4j.providers.jlama.JlamaStreamingChatModelConfig:
configuredBuilder
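The one visible return type above, Optional<com.github.tjake.jlama.safetensors.DType> for workingQuantizedType(), suggests that unset properties are exposed as empty Optionals, so callers supply their own fallbacks. A minimal consumer-side sketch; the fallback values here are hypothetical, not defaults defined by Jlama or Helidon:

```java
import java.util.Optional;

public class ConfigConsumerSketch {
    public static void main(String[] args) {
        // Stand-ins for values a generated config might expose;
        // the real accessors come from JlamaStreamingChatModelConfig.
        Optional<Integer> maxTokens = Optional.empty();    // not configured
        Optional<Float> temperature = Optional.of(0.2f);   // configured

        // Callers apply their own fallbacks for unset properties.
        int effectiveMaxTokens = maxTokens.orElse(512);
        float effectiveTemperature = temperature.orElse(0.7f);

        System.out.println(effectiveMaxTokens);
        System.out.println(effectiveTemperature);
    }
}
```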
-
Field Details
-
PROVIDER_KEY
AI provider config key.
-
CONFIG_ROOT
The root configuration key for this builder.
-
Constructor Details
-
JlamaStreamingChatModelConfigImpl
Create an instance providing a builder.
Parameters:
builder - extending builder base of this prototype
-
Method Details
-
enabled
public boolean enabled()
Description copied from interface: JlamaStreamingChatModelConfig
If set to false, JlamaStreamingChatModel will not be available even if configured.
Specified by:
enabled in interface JlamaStreamingChatModelConfig
Returns:
whether JlamaStreamingChatModel is enabled, defaults to true
-
workingQuantizedType
Description copied from interface: JlamaStreamingChatModelConfig
Generated from JlamaStreamingChatModel.JlamaStreamingChatModelBuilder.workingQuantizedType(com.github.tjake.jlama.safetensors.DType)
Specified by:
workingQuantizedType in interface JlamaStreamingChatModelConfig
Returns:
DType property
-
modelCachePath
Description copied from interface: JlamaStreamingChatModelConfig
Generated from JlamaStreamingChatModel.JlamaStreamingChatModelBuilder.modelCachePath(java.nio.file.Path)
Specified by:
modelCachePath in interface JlamaStreamingChatModelConfig
Returns:
Path property
-
workingDirectory
Description copied from interface: JlamaStreamingChatModelConfig
Generated from JlamaStreamingChatModel.JlamaStreamingChatModelBuilder.workingDirectory(java.nio.file.Path)
Specified by:
workingDirectory in interface JlamaStreamingChatModelConfig
Returns:
Path property
-
authToken
Description copied from interface: JlamaStreamingChatModelConfig
Generated from JlamaStreamingChatModel.JlamaStreamingChatModelBuilder.authToken(java.lang.String)
Specified by:
authToken in interface JlamaStreamingChatModelConfig
Returns:
String property
-
temperature
Description copied from interface: JlamaStreamingChatModelConfig
Generated from JlamaStreamingChatModel.JlamaStreamingChatModelBuilder.temperature(java.lang.Float)
Specified by:
temperature in interface JlamaStreamingChatModelConfig
Returns:
Float property
-
maxTokens
Description copied from interface: JlamaStreamingChatModelConfig
Generated from JlamaStreamingChatModel.JlamaStreamingChatModelBuilder.maxTokens(java.lang.Integer)
Specified by:
maxTokens in interface JlamaStreamingChatModelConfig
Returns:
Integer property
-
threadCount
Description copied from interface: JlamaStreamingChatModelConfig
Generated from JlamaStreamingChatModel.JlamaStreamingChatModelBuilder.threadCount(java.lang.Integer)
Specified by:
threadCount in interface JlamaStreamingChatModelConfig
Returns:
Integer property
-
quantizeModelAtRuntime
Description copied from interface: JlamaStreamingChatModelConfig
Generated from JlamaStreamingChatModel.JlamaStreamingChatModelBuilder.quantizeModelAtRuntime(java.lang.Boolean)
Specified by:
quantizeModelAtRuntime in interface JlamaStreamingChatModelConfig
Returns:
Boolean property
-
modelName
Description copied from interface: JlamaStreamingChatModelConfig
Configure the model name.
Specified by:
modelName in interface JlamaStreamingChatModelConfig
Returns:
model name
-
toString
-
equals
-
hashCode
public int hashCode()
-