Interface OpenAiStreamingChatModelConfig
- All Superinterfaces:
Prototype.Api
- All Known Implementing Classes:
OpenAiStreamingChatModelConfig.BuilderBase.OpenAiStreamingChatModelConfigImpl
Interface generated from definition. Please add javadoc to the definition interface.
Nested Class Summary
- static class OpenAiStreamingChatModelConfig.Builder
  Fluent API builder for OpenAiStreamingChatModelConfig.
- static class OpenAiStreamingChatModelConfig.BuilderBase&lt;BUILDER extends OpenAiStreamingChatModelConfig.BuilderBase&lt;BUILDER, PROTOTYPE&gt;, PROTOTYPE extends OpenAiStreamingChatModelConfig&gt;
  Fluent API builder base for OpenAiStreamingChatModelConfig.
Field Summary
- static final String CONFIG_ROOT
  The root configuration key for this builder.
Method Summary
- apiKey(): Generated from OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder.apiKey(java.lang.String)
- baseUrl(): Generated from OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder.baseUrl(java.lang.String)
- builder(): Create a new fluent API builder to customize configuration.
- builder(OpenAiStreamingChatModelConfig instance): Create a new fluent API builder from an existing instance.
- default dev.langchain4j.model.openai.OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder configuredBuilder(): Actual Lc4j model builder configured with this blueprint; the generated mapping skips modelName and responseFormat (properties already exist) and build (does not have exactly one parameter).
- create(): Create a new instance with default values.
- create(Config config): Create a new instance from configuration.
- create(Config config): Deprecated. Create a new instance from configuration.
- customHeaders(): Generated from OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder.customHeaders(java.util.Map)
- Optional&lt;dev.langchain4j.model.chat.request.ChatRequestParameters&gt; defaultRequestParameters(): Generated from OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder.defaultRequestParameters(dev.langchain4j.model.chat.request.ChatRequestParameters)
- boolean enabled(): If set to false (default), OpenAiStreamingChatModel will not be available even if configured.
- frequencyPenalty(): Generated from OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder.frequencyPenalty(java.lang.Double)
- Optional&lt;dev.langchain4j.http.client.HttpClientBuilder&gt; httpClientBuilder(): Generated from OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder.httpClientBuilder(dev.langchain4j.http.client.HttpClientBuilder)
- List&lt;dev.langchain4j.model.chat.listener.ChatModelListener&gt; listeners(): Generated from OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder.listeners(java.util.List)
- Optional&lt;org.slf4j.Logger&gt; logger(): Generated from OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder.logger(org.slf4j.Logger)
- logitBias(): Generated from OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder.logitBias(java.util.Map)
- logRequests(): Generated from OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder.logRequests(java.lang.Boolean)
- logResponses(): Generated from OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder.logResponses(java.lang.Boolean)
- maxCompletionTokens(): Generated from OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder.maxCompletionTokens(java.lang.Integer)
- maxTokens(): Generated from OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder.maxTokens(java.lang.Integer)
- metadata(): Generated from OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder.metadata(java.util.Map)
- modelName(): Generated from OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder.modelName(java.lang.String)
- organizationId(): Generated from OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder.organizationId(java.lang.String)
- parallelToolCalls(): Generated from OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder.parallelToolCalls(java.lang.Boolean)
- presencePenalty(): Generated from OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder.presencePenalty(java.lang.Double)
- projectId(): Generated from OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder.projectId(java.lang.String)
- responseFormat(): Enable "JSON mode" in the model configuration.
- returnThinking(): Generated from OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder.returnThinking(java.lang.Boolean)
- seed(): Generated from OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder.seed(java.lang.Integer)
- serviceTier(): Generated from OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder.serviceTier(java.lang.String)
- stop(): Generated from OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder.stop(java.util.List)
- store(): Generated from OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder.store(java.lang.Boolean)
- strictJsonSchema(): Generated from OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder.strictJsonSchema(java.lang.Boolean)
- strictTools(): Generated from OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder.strictTools(java.lang.Boolean)
- temperature(): Generated from OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder.temperature(java.lang.Double)
- timeout(): Generated from OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder.timeout(java.time.Duration)
- topP(): Generated from OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder.topP(java.lang.Double)
- user(): Generated from OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder.user(java.lang.String)
Field Details
- CONFIG_ROOT
  static final String CONFIG_ROOT
  The root configuration key for this builder.
Method Details
- builder
  Create a new fluent API builder to customize configuration.
  Returns: a new builder
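  For illustration, a minimal programmatic sketch, assuming the generated builder exposes fluent setters named after the properties listed above (as is typical for a Helidon Prototype.Api builder); the values shown are placeholders:

    // Sketch only: setter names are assumed to match the property names above.
    OpenAiStreamingChatModelConfig chatConfig = OpenAiStreamingChatModelConfig.builder()
            .apiKey(System.getenv("OPENAI_API_KEY"))   // placeholder credential source
            .modelName("gpt-4o-mini")                  // placeholder model name
            .enabled(true)
            .build();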
- builder
  Create a new fluent API builder from an existing instance.
  Parameters: instance - an existing instance used as a base for the builder
  Returns: a builder based on an instance
- create
  Create a new instance from configuration.
  Parameters: config - used to configure the new instance
  Returns: a new instance configured from configuration
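  A hedged sketch of configuration-driven creation; the io.helidon.config.Config bootstrap shown is illustrative, and the configuration node is resolved through the CONFIG_ROOT constant documented in the Field Details above:

    import io.helidon.config.Config;

    // Sketch only: load the default configuration and create the prototype from
    // the node identified by the generated CONFIG_ROOT key.
    Config config = Config.create();
    OpenAiStreamingChatModelConfig chatConfig = OpenAiStreamingChatModelConfig.create(
            config.get(OpenAiStreamingChatModelConfig.CONFIG_ROOT));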
- create
  Deprecated.
  Create a new instance from configuration.
  Parameters: config - used to configure the new instance
  Returns: a new instance configured from configuration
- create
  Create a new instance with default values.
  Returns: a new instance
- enabled
  boolean enabled()
  If set to false (default), OpenAiStreamingChatModel will not be available even if configured.
  Returns: whether OpenAiStreamingChatModel is enabled, defaults to false
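  Because the model is opt-in, a caller would typically guard on this flag before wiring the model; a minimal sketch:

    // Sketch only: skip model setup when the configuration disables it.
    if (chatConfig.enabled()) {
        // build and register the OpenAiStreamingChatModel
    } else {
        // the model stays unavailable even if other properties are configured
    }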
- parallelToolCalls
  Generated from OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder.parallelToolCalls(java.lang.Boolean)
  Returns: Boolean property
- metadata
  Generated from OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder.metadata(java.util.Map)
  Returns: Map property
- apiKey
  Generated from OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder.apiKey(java.lang.String)
  Returns: String property
- strictJsonSchema
  Generated from OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder.strictJsonSchema(java.lang.Boolean)
  Returns: Boolean property
- seed
  Generated from OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder.seed(java.lang.Integer)
  Returns: Integer property
- logger
  Optional&lt;org.slf4j.Logger&gt; logger()
  Generated from OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder.logger(org.slf4j.Logger)
  Returns: Logger property
- defaultRequestParameters
  Optional&lt;dev.langchain4j.model.chat.request.ChatRequestParameters&gt; defaultRequestParameters()
  Generated from OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder.defaultRequestParameters(dev.langchain4j.model.chat.request.ChatRequestParameters)
  Returns: ChatRequestParameters property
- timeout
  Generated from OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder.timeout(java.time.Duration)
  Returns: Duration property
- organizationId
  Generated from OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder.organizationId(java.lang.String)
  Returns: String property
- presencePenalty
  Generated from OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder.presencePenalty(java.lang.Double)
  Returns: Double property
- httpClientBuilder
  Optional&lt;dev.langchain4j.http.client.HttpClientBuilder&gt; httpClientBuilder()
  Generated from OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder.httpClientBuilder(dev.langchain4j.http.client.HttpClientBuilder)
  Returns: HttpClientBuilder property
- temperature
  Generated from OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder.temperature(java.lang.Double)
  Returns: Double property
- maxTokens
  Generated from OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder.maxTokens(java.lang.Integer)
  Returns: Integer property
- logitBias
  Generated from OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder.logitBias(java.util.Map)
  Returns: Map property
- frequencyPenalty
  Generated from OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder.frequencyPenalty(java.lang.Double)
  Returns: Double property
- customHeaders
  Generated from OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder.customHeaders(java.util.Map)
  Returns: Map property
- maxCompletionTokens
  Generated from OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder.maxCompletionTokens(java.lang.Integer)
  Returns: Integer property
- listeners
  List&lt;dev.langchain4j.model.chat.listener.ChatModelListener&gt; listeners()
  Generated from OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder.listeners(java.util.List)
  Returns: List property
- store
  Generated from OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder.store(java.lang.Boolean)
  Returns: Boolean property
- logResponses
  Generated from OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder.logResponses(java.lang.Boolean)
  Returns: Boolean property
- topP
  Generated from OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder.topP(java.lang.Double)
  Returns: Double property
- logRequests
  Generated from OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder.logRequests(java.lang.Boolean)
  Returns: Boolean property
- returnThinking
  Generated from OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder.returnThinking(java.lang.Boolean)
  Returns: Boolean property
- modelName
  Generated from OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder.modelName(java.lang.String)
  Returns: String property
- baseUrl
  Generated from OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder.baseUrl(java.lang.String)
  Returns: String property
- stop
  Generated from OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder.stop(java.util.List)
  Returns: List property
- strictTools
  Generated from OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder.strictTools(java.lang.Boolean)
  Returns: Boolean property
- serviceTier
  Generated from OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder.serviceTier(java.lang.String)
  Returns: String property
- projectId
  Generated from OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder.projectId(java.lang.String)
  Returns: String property
- user
  Generated from OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder.user(java.lang.String)
  Returns: String property
- responseFormat
  Enable "JSON mode" in the model configuration, which forces the LLM to respond with valid JSON. For newer models that support Structured Outputs, use supported-capabilities instead.
  Returns: "json_object" to enable JSON mode on older models such as gpt-3.5-turbo or gpt-4
- configuredBuilder
  default dev.langchain4j.model.openai.OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder configuredBuilder()
  Skipped:
  - modelName - property already exists -> dev.langchain4j.model.openai.OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder modelName(java.lang.String)
  - responseFormat - property already exists -> dev.langchain4j.model.openai.OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder responseFormat(java.lang.String)
  - build - does not have exactly one parameter
  Overridden:
  - OpenAiLc4jProvider.responseFormat()
  Returns: actual Lc4j model builder configured with this blueprint
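  A hedged usage sketch: the returned langchain4j builder can be completed into a streaming model, setting any property the generated mapping skips (chatConfig is assumed to be an instance of this prototype):

    import dev.langchain4j.model.openai.OpenAiStreamingChatModel;

    // Sketch only: finish the pre-configured langchain4j builder; modelName is set
    // explicitly because the generated mapping skips it, as listed above.
    OpenAiStreamingChatModel model = chatConfig.configuredBuilder()
            .modelName("gpt-4o-mini")   // placeholder model name
            .build();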