All Superinterfaces:
Prototype.Api
All Known Implementing Classes:
OpenAiStreamingChatModelConfig.BuilderBase.OpenAiStreamingChatModelConfigImpl

public interface OpenAiStreamingChatModelConfig extends Prototype.Api
Interface generated from definition. Please add javadoc to the definition interface.
  • Method Details

    • builder

      Create a new fluent API builder to customize configuration.
      Returns:
      a new builder
    • builder

      Create a new fluent API builder from an existing instance.
      Parameters:
      instance - an existing instance used as a base for the builder
      Returns:
      a builder based on an instance
    • create

      static OpenAiStreamingChatModelConfig create(Config config)
      Create a new instance from configuration.
      Parameters:
      config - used to configure the new instance
      Returns:
      a new instance configured from configuration
    • create

      Create a new instance from configuration.
      Parameters:
      config - used to configure the new instance
      Returns:
      a new instance configured from configuration
    • create

      Create a new instance with default values.
      Returns:
      a new instance
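
      A minimal usage sketch of the factory methods above. It assumes the standard Helidon generated-builder conventions (builder setters mirroring the accessor names documented below, with build() returning the configuration instance); the configuration key used here is a hypothetical example, not taken from this page:

          import io.helidon.config.Config;

          class OpenAiStreamingChatModelConfigExample {
              public static void main(String[] args) {
                  Config config = Config.create();

                  // From configuration; the key below is an assumption used only for illustration.
                  OpenAiStreamingChatModelConfig fromConfig = OpenAiStreamingChatModelConfig.create(
                          config.get("langchain4j.open-ai.streaming-chat-model"));

                  // Programmatically, via the fluent builder; setter names are assumed to mirror
                  // the accessors documented below, and build() is assumed to return the config.
                  OpenAiStreamingChatModelConfig modelConfig = OpenAiStreamingChatModelConfig.builder()
                          .enabled(true)
                          .apiKey("demo-key")
                          .modelName("gpt-4o-mini")
                          .build();
              }
          }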
    • enabled

      boolean enabled()
      If set to false (default), OpenAiStreamingChatModel will not be available even if configured.
      Returns:
      whether OpenAiStreamingChatModel is enabled, defaults to false
    • parallelToolCalls

      Optional<Boolean> parallelToolCalls()
      Generated from OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder.parallelToolCalls(java.lang.Boolean)
      Returns:
      Boolean property
    • metadata

      Map<String,String> metadata()
      Generated from OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder.metadata(java.util.Map)
      Returns:
      Map property
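
      The remaining accessors follow two patterns: scalar options are exposed as Optional values, while collection options are exposed as Map or List values (presumably empty when unset). A hedged reading sketch, reusing the modelConfig instance from the earlier example:

          // Scalar options: check presence or fall back to a default.
          modelConfig.parallelToolCalls().ifPresent(v -> System.out.println("parallelToolCalls=" + v));
          double temperature = modelConfig.temperature().orElse(1.0);

          // Collection options: returned directly, without Optional.
          java.util.Map<String, String> metadata = modelConfig.metadata();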
    • apiKey

      Optional<String> apiKey()
      Generated from OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder.apiKey(java.lang.String)
      Returns:
      String property
    • strictJsonSchema

      Optional<Boolean> strictJsonSchema()
      Generated from OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder.strictJsonSchema(java.lang.Boolean)
      Returns:
      Boolean property
    • seed

      Optional<Integer> seed()
      Generated from OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder.seed(java.lang.Integer)
      Returns:
      Integer property
    • logger

      Optional<org.slf4j.Logger> logger()
      Generated from OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder.logger(org.slf4j.Logger)
      Returns:
      Logger property
    • defaultRequestParameters

      Optional<dev.langchain4j.model.chat.request.ChatRequestParameters> defaultRequestParameters()
      Generated from OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder.defaultRequestParameters(dev.langchain4j.model.chat.request.ChatRequestParameters)
      Returns:
      ChatRequestParameters property
    • timeout

      Optional<Duration> timeout()
      Generated from OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder.timeout(java.time.Duration)
      Returns:
      Duration property
    • organizationId

      Optional<String> organizationId()
      Generated from OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder.organizationId(java.lang.String)
      Returns:
      String property
    • presencePenalty

      Optional<Double> presencePenalty()
      Generated from OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder.presencePenalty(java.lang.Double)
      Returns:
      Double property
    • httpClientBuilder

      Optional<dev.langchain4j.http.client.HttpClientBuilder> httpClientBuilder()
      Generated from OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder.httpClientBuilder(dev.langchain4j.http.client.HttpClientBuilder)
      Returns:
      HttpClientBuilder property
    • temperature

      Optional<Double> temperature()
      Generated from OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder.temperature(java.lang.Double)
      Returns:
      Double property
    • maxTokens

      Optional<Integer> maxTokens()
      Generated from OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder.maxTokens(java.lang.Integer)
      Returns:
      Integer property
    • logitBias

      Map<String,Integer> logitBias()
      Generated from OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder.logitBias(java.util.Map)
      Returns:
      Map property
    • frequencyPenalty

      Optional<Double> frequencyPenalty()
      Generated from OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder.frequencyPenalty(java.lang.Double)
      Returns:
      Double property
    • customHeaders

      Map<String,String> customHeaders()
      Generated from OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder.customHeaders(java.util.Map)
      Returns:
      Map property
    • maxCompletionTokens

      Optional<Integer> maxCompletionTokens()
      Generated from OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder.maxCompletionTokens(java.lang.Integer)
      Returns:
      Integer property
    • listeners

      List<dev.langchain4j.model.chat.listener.ChatModelListener> listeners()
      Generated from OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder.listeners(java.util.List)
      Returns:
      List property
    • store

      Optional<Boolean> store()
      Generated from OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder.store(java.lang.Boolean)
      Returns:
      Boolean property
    • logResponses

      Optional<Boolean> logResponses()
      Generated from OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder.logResponses(java.lang.Boolean)
      Returns:
      Boolean property
    • topP

      Optional<Double> topP()
      Generated from OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder.topP(java.lang.Double)
      Returns:
      Double property
    • logRequests

      Optional<Boolean> logRequests()
      Generated from OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder.logRequests(java.lang.Boolean)
      Returns:
      Boolean property
    • returnThinking

      Optional<Boolean> returnThinking()
      Generated from OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder.returnThinking(java.lang.Boolean)
      Returns:
      Boolean property
    • modelName

      Optional<String> modelName()
      Generated from OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder.modelName(java.lang.String)
      Returns:
      String property
    • baseUrl

      Optional<String> baseUrl()
      Generated from OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder.baseUrl(java.lang.String)
      Returns:
      String property
    • stop

      List<String> stop()
      Generated from OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder.stop(java.util.List)
      Returns:
      List property
    • strictTools

      Optional<Boolean> strictTools()
      Generated from OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder.strictTools(java.lang.Boolean)
      Returns:
      Boolean property
    • serviceTier

      Optional<String> serviceTier()
      Generated from OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder.serviceTier(java.lang.String)
      Returns:
      String property
    • projectId

      Optional<String> projectId()
      Generated from OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder.projectId(java.lang.String)
      Returns:
      String property
    • user

      Optional<String> user()
      Generated from OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder.user(java.lang.String)
      Returns:
      String property
    • responseFormat

      Optional<String> responseFormat()
      Enables "JSON mode" in the model configuration, forcing the LLM to respond with valid JSON. For newer models that support Structured Outputs, use supported-capabilities instead.
      Returns:
      "json_object" to enable JSON mode on older models like gpt-3.5-turbo or gpt-4
    • configuredBuilder

      default dev.langchain4j.model.openai.OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder configuredBuilder()
      Skipped:
      • modelName - property already exists -> dev.langchain4j.model.openai.OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder modelName(java.lang.String)
      • responseFormat - property already exists -> dev.langchain4j.model.openai.OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder responseFormat(java.lang.String)
      • build - doesn't have exactly one parameter

      Overridden:

      • OpenAiLc4jProvider.responseFormat()
      Returns:
      Actual Lc4j model builder configured with this blueprint.
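
      A sketch of turning this blueprint into an actual LangChain4j streaming model, reusing the modelConfig instance from the earlier example; OpenAiStreamingChatModelBuilder.build() is the LangChain4j builder's terminal call, and further LangChain4j builder methods can still be applied before it:

          dev.langchain4j.model.openai.OpenAiStreamingChatModel model =
                  modelConfig.configuredBuilder()
                             .build();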