Interface JlamaStreamingChatModelConfig
- All Superinterfaces:
Prototype.Api
- All Known Implementing Classes:
JlamaStreamingChatModelConfig.BuilderBase.JlamaStreamingChatModelConfigImpl
Interface generated from definition. Please add javadoc to the definition interface.
Nested Class Summary
- static class JlamaStreamingChatModelConfig.Builder
  Fluent API builder for JlamaStreamingChatModelConfig.
- static class JlamaStreamingChatModelConfig.BuilderBase<BUILDER extends JlamaStreamingChatModelConfig.BuilderBase<BUILDER, PROTOTYPE>, PROTOTYPE extends JlamaStreamingChatModelConfig>
  Fluent API builder base for JlamaStreamingChatModelConfig.
Field Summary
- static final String CONFIG_ROOT
  The root configuration key for this builder.
Method Summary
- builder()
  Create a new fluent API builder to customize configuration.
- builder(JlamaStreamingChatModelConfig instance)
  Create a new fluent API builder from an existing instance.
- create()
  Create a new instance with default values.
- create(Config config)
  Create a new instance from configuration.
- create(Config config) (deprecated)
  Deprecated. Create a new instance from configuration.
- enabled()
  If set to false (default), JlamaStreamingChatModel will not be available even if configured.
- modelName()
  Configure the model name.
- authToken()
  Generated from JlamaStreamingChatModel.JlamaStreamingChatModelBuilder.authToken(java.lang.String)
- temperature()
  Generated from JlamaStreamingChatModel.JlamaStreamingChatModelBuilder.temperature(java.lang.Float)
- maxTokens()
  Generated from JlamaStreamingChatModel.JlamaStreamingChatModelBuilder.maxTokens(java.lang.Integer)
- threadCount()
  Generated from JlamaStreamingChatModel.JlamaStreamingChatModelBuilder.threadCount(java.lang.Integer)
- modelCachePath()
  Generated from JlamaStreamingChatModel.JlamaStreamingChatModelBuilder.modelCachePath(java.nio.file.Path)
- workingDirectory()
  Generated from JlamaStreamingChatModel.JlamaStreamingChatModelBuilder.workingDirectory(java.nio.file.Path)
- quantizeModelAtRuntime()
  Generated from JlamaStreamingChatModel.JlamaStreamingChatModelBuilder.quantizeModelAtRuntime(java.lang.Boolean)
- workingQuantizedType()
  Returns Optional<com.github.tjake.jlama.safetensors.DType>. Generated from JlamaStreamingChatModel.JlamaStreamingChatModelBuilder.workingQuantizedType(com.github.tjake.jlama.safetensors.DType)
- configuredBuilder()
  Returns the actual Lc4j model builder (dev.langchain4j.model.jlama.JlamaStreamingChatModel.JlamaStreamingChatModelBuilder) configured with this blueprint.
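This generated interface follows Helidon's prototype/builder pattern: an immutable "prototype" interface whose values are set through a fluent builder and read back through getter-style methods. The self-contained sketch below illustrates that pattern with hypothetical names; it is not the generated API (which additionally supports binding from configuration via create(Config)), just a minimal model of how the builder and prototype relate.

```java
// Minimal self-contained sketch of the prototype/builder pattern used by
// generated Helidon config interfaces such as JlamaStreamingChatModelConfig.
// All names here are illustrative, not the generated API.
import java.util.Optional;

public class PrototypeSketch {

    // The "prototype": an immutable view of configured values.
    interface ChatModelConfig {
        boolean enabled();              // analogous to enabled()
        String modelName();             // analogous to modelName()
        Optional<Float> temperature();  // optional property, analogous to temperature()

        static Builder builder() {
            return new Builder();
        }
    }

    // The fluent builder, analogous to JlamaStreamingChatModelConfig.Builder.
    static class Builder {
        private boolean enabled;
        private String modelName;
        private Float temperature;

        Builder enabled(boolean enabled) { this.enabled = enabled; return this; }
        Builder modelName(String modelName) { this.modelName = modelName; return this; }
        Builder temperature(float temperature) { this.temperature = temperature; return this; }

        ChatModelConfig build() {
            // Capture the builder state into an immutable prototype instance.
            boolean on = enabled;
            String name = modelName;
            Optional<Float> temp = Optional.ofNullable(temperature);
            return new ChatModelConfig() {
                public boolean enabled() { return on; }
                public String modelName() { return name; }
                public Optional<Float> temperature() { return temp; }
            };
        }
    }

    public static void main(String[] args) {
        ChatModelConfig config = ChatModelConfig.builder()
                .enabled(true)
                .modelName("my-model")
                .temperature(0.3f)
                .build();
        System.out.println(config.modelName() + " enabled=" + config.enabled());
    }
}
```

The generated BuilderBase exists so that subtypes can share builder logic; the simple Builder above collapses that hierarchy for clarity.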
Field Details
- CONFIG_ROOT
  static final String CONFIG_ROOT
  The root configuration key for this builder.
Method Details
- builder
  Create a new fluent API builder to customize configuration.
  Returns: a new builder
- builder
  Create a new fluent API builder from an existing instance.
  Parameters: instance - an existing instance used as a base for the builder
  Returns: a builder based on an instance
- create
  Create a new instance from configuration.
  Parameters: config - used to configure the new instance
  Returns: a new instance configured from configuration
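For create(Config), the backing configuration tree might look like the fragment below. The root key and property names are assumptions inferred from the builder methods listed above (Helidon typically derives kebab-case keys from method names); the actual root key is the value of CONFIG_ROOT, which is not shown on this page.

```yaml
# Hypothetical configuration fragment; actual keys come from CONFIG_ROOT
# and the generated property names.
jlama:
  chat-model:
    enabled: true
    model-name: "my-model"
    temperature: 0.3
    max-tokens: 512
    thread-count: 4
    quantize-model-at-runtime: true
```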
- create
  Deprecated. Create a new instance from configuration.
  Parameters: config - used to configure the new instance
  Returns: a new instance configured from configuration
- create
  Create a new instance with default values.
  Returns: a new instance
- enabled
  boolean enabled()
  If set to false (default), JlamaStreamingChatModel will not be available even if configured.
  Returns: whether JlamaStreamingChatModel is enabled, defaults to false
- workingQuantizedType
  Optional<com.github.tjake.jlama.safetensors.DType> workingQuantizedType()
  Generated from JlamaStreamingChatModel.JlamaStreamingChatModelBuilder.workingQuantizedType(com.github.tjake.jlama.safetensors.DType)
  Returns: DType property
- modelCachePath
  Generated from JlamaStreamingChatModel.JlamaStreamingChatModelBuilder.modelCachePath(java.nio.file.Path)
  Returns: Path property
- workingDirectory
  Generated from JlamaStreamingChatModel.JlamaStreamingChatModelBuilder.workingDirectory(java.nio.file.Path)
  Returns: Path property
- authToken
  Generated from JlamaStreamingChatModel.JlamaStreamingChatModelBuilder.authToken(java.lang.String)
  Returns: String property
- temperature
  Generated from JlamaStreamingChatModel.JlamaStreamingChatModelBuilder.temperature(java.lang.Float)
  Returns: Float property
- maxTokens
  Generated from JlamaStreamingChatModel.JlamaStreamingChatModelBuilder.maxTokens(java.lang.Integer)
  Returns: Integer property
- threadCount
  Generated from JlamaStreamingChatModel.JlamaStreamingChatModelBuilder.threadCount(java.lang.Integer)
  Returns: Integer property
- quantizeModelAtRuntime
  Generated from JlamaStreamingChatModel.JlamaStreamingChatModelBuilder.quantizeModelAtRuntime(java.lang.Boolean)
  Returns: Boolean property
- modelName
  String modelName()
  Configure the model name.
  Overridden: JlamaLc4jProvider.modelName()
  Returns: model name
- configuredBuilder
  default dev.langchain4j.model.jlama.JlamaStreamingChatModel.JlamaStreamingChatModelBuilder configuredBuilder()
  Skipped:
  - build - doesn't have exactly one parameter
  - toString - doesn't have exactly one parameter
  Returns: the actual Lc4j model builder configured with this blueprint
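Putting the pieces together, the sketch below shows how a generated config type like this is typically consumed from Helidon configuration. Only the type and member names (builder, create, CONFIG_ROOT, enabled, configuredBuilder) come from this page; the surrounding setup, and the assumption that the returned Lc4j builder exposes a build() method, are illustrative and require the Helidon Jlama provider and langchain4j dependencies on the classpath.

```java
import io.helidon.config.Config;

// Load application configuration and bind the subtree rooted at CONFIG_ROOT.
Config config = Config.create();
JlamaStreamingChatModelConfig modelConfig =
        JlamaStreamingChatModelConfig.create(config.get(JlamaStreamingChatModelConfig.CONFIG_ROOT));

// Only build the model when it is explicitly enabled (enabled() defaults to false).
if (modelConfig.enabled()) {
    var streamingModel = modelConfig.configuredBuilder().build(); // build() assumed on the Lc4j builder
}
```

An existing instance can also serve as a template: builder(instance) copies its values into a fresh builder for selective overrides before building a new prototype.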