All Superinterfaces:
Prototype.Api
All Known Implementing Classes:
OpenAiChatModelConfig.BuilderBase.OpenAiChatModelConfigImpl

public interface OpenAiChatModelConfig extends Prototype.Api
Interface generated from its definition (blueprint) interface; it exposes the configuration options used to set up an OpenAiChatModel.

  • Method Details

    • builder

      static OpenAiChatModelConfig.Builder builder()
      Create a new fluent API builder to customize configuration.
      Returns:
      a new builder
    • builder

      static OpenAiChatModelConfig.Builder builder(OpenAiChatModelConfig instance)
      Create a new fluent API builder from an existing instance.
      Parameters:
      instance - an existing instance used as a base for the builder
      Returns:
      a builder based on an instance
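
      A minimal usage sketch of the fluent builder. It assumes the generated Builder exposes setters named after the property methods documented below (enabled, apiKey, modelName, temperature) and that build() returns the immutable configuration; the model name and environment variable are illustrative only:

          OpenAiChatModelConfig config = OpenAiChatModelConfig.builder()
                  .enabled(true)                            // defaults to false, see enabled()
                  .apiKey(System.getenv("OPENAI_API_KEY"))  // key read from the environment for illustration
                  .modelName("gpt-4o-mini")                 // illustrative model name
                  .temperature(0.2)
                  .build();
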
    • create

      static OpenAiChatModelConfig create(Config config)
      Create a new instance from configuration.
      Parameters:
      config - used to configure the new instance
      Returns:
      a new instance configured from configuration
    • create

      @Deprecated static OpenAiChatModelConfig create(Config config)
      Deprecated. Create a new instance from configuration; prefer the non-deprecated create(Config) overload.
      Parameters:
      config - used to configure the new instance
      Returns:
      a new instance configured from configuration
    • create

      static OpenAiChatModelConfig create()
      Create a new instance with default values.
      Returns:
      a new instance
    • enabled

      boolean enabled()
      If set to false (default), OpenAiChatModel will not be available even if configured.
      Returns:
      whether OpenAiChatModel is enabled, defaults to false
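
      Putting create(Config) and the enabled flag together, a hedged sketch that builds the configuration from an in-memory Helidon Config node. The key names are assumed to be the kebab-case form of the property methods on this page:

          import io.helidon.config.Config;
          import io.helidon.config.ConfigSources;
          import java.util.Map;

          Config node = Config.just(ConfigSources.create(Map.of(
                  "enabled", "true",              // must be true, otherwise the model stays unavailable
                  "api-key", "demo-key",          // assumed kebab-case key for apiKey()
                  "model-name", "gpt-4o-mini"))); // assumed kebab-case key for modelName()

          OpenAiChatModelConfig modelConfig = OpenAiChatModelConfig.create(node);
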
    • parallelToolCalls

      Optional<Boolean> parallelToolCalls()
      Generated from OpenAiChatModel.OpenAiChatModelBuilder.parallelToolCalls(java.lang.Boolean)
      Returns:
      Boolean property
    • metadata

      Map<String,String> metadata()
      Generated from OpenAiChatModel.OpenAiChatModelBuilder.metadata(java.util.Map)
      Returns:
      Map property
    • apiKey

      Optional<String> apiKey()
      Generated from OpenAiChatModel.OpenAiChatModelBuilder.apiKey(java.lang.String)
      Returns:
      String property
    • strictJsonSchema

      Optional<Boolean> strictJsonSchema()
      Generated from OpenAiChatModel.OpenAiChatModelBuilder.strictJsonSchema(java.lang.Boolean)
      Returns:
      Boolean property
    • seed

      Optional<Integer> seed()
      Generated from OpenAiChatModel.OpenAiChatModelBuilder.seed(java.lang.Integer)
      Returns:
      Integer property
    • logger

      Optional<org.slf4j.Logger> logger()
      Generated from OpenAiChatModel.OpenAiChatModelBuilder.logger(org.slf4j.Logger)
      Returns:
      Logger property
    • defaultRequestParameters

      Optional<dev.langchain4j.model.chat.request.ChatRequestParameters> defaultRequestParameters()
      Generated from OpenAiChatModel.OpenAiChatModelBuilder.defaultRequestParameters(dev.langchain4j.model.chat.request.ChatRequestParameters)
      Returns:
      ChatRequestParameters property
    • timeout

      Optional<Duration> timeout()
      Generated from OpenAiChatModel.OpenAiChatModelBuilder.timeout(java.time.Duration)
      Returns:
      Duration property
    • organizationId

      Optional<String> organizationId()
      Generated from OpenAiChatModel.OpenAiChatModelBuilder.organizationId(java.lang.String)
      Returns:
      String property
    • presencePenalty

      Optional<Double> presencePenalty()
      Generated from OpenAiChatModel.OpenAiChatModelBuilder.presencePenalty(java.lang.Double)
      Returns:
      Double property
    • httpClientBuilder

      Optional<dev.langchain4j.http.client.HttpClientBuilder> httpClientBuilder()
      Generated from OpenAiChatModel.OpenAiChatModelBuilder.httpClientBuilder(dev.langchain4j.http.client.HttpClientBuilder)
      Returns:
      HttpClientBuilder property
    • temperature

      Optional<Double> temperature()
      Generated from OpenAiChatModel.OpenAiChatModelBuilder.temperature(java.lang.Double)
      Returns:
      Double property
    • maxTokens

      Optional<Integer> maxTokens()
      Generated from OpenAiChatModel.OpenAiChatModelBuilder.maxTokens(java.lang.Integer)
      Returns:
      Integer property
    • logitBias

      Map<String,Integer> logitBias()
      Generated from OpenAiChatModel.OpenAiChatModelBuilder.logitBias(java.util.Map)
      Returns:
      Map property
    • frequencyPenalty

      Optional<Double> frequencyPenalty()
      Generated from OpenAiChatModel.OpenAiChatModelBuilder.frequencyPenalty(java.lang.Double)
      Returns:
      Double property
    • customHeaders

      Map<String,String> customHeaders()
      Generated from OpenAiChatModel.OpenAiChatModelBuilder.customHeaders(java.util.Map)
      Returns:
      Map property
    • maxCompletionTokens

      Optional<Integer> maxCompletionTokens()
      Generated from OpenAiChatModel.OpenAiChatModelBuilder.maxCompletionTokens(java.lang.Integer)
      Returns:
      Integer property
    • listeners

      List<dev.langchain4j.model.chat.listener.ChatModelListener> listeners()
      Generated from OpenAiChatModel.OpenAiChatModelBuilder.listeners(java.util.List)
      Returns:
      List property
    • supportedCapabilities

      Set<dev.langchain4j.model.chat.Capability> supportedCapabilities()
      Generated from OpenAiChatModel.OpenAiChatModelBuilder.supportedCapabilities(java.util.Set)
      Returns:
      Set property
    • store

      Optional<Boolean> store()
      Generated from OpenAiChatModel.OpenAiChatModelBuilder.store(java.lang.Boolean)
      Returns:
      Boolean property
    • logResponses

      Optional<Boolean> logResponses()
      Generated from OpenAiChatModel.OpenAiChatModelBuilder.logResponses(java.lang.Boolean)
      Returns:
      Boolean property
    • topP

      Optional<Double> topP()
      Generated from OpenAiChatModel.OpenAiChatModelBuilder.topP(java.lang.Double)
      Returns:
      Double property
    • logRequests

      Optional<Boolean> logRequests()
      Generated from OpenAiChatModel.OpenAiChatModelBuilder.logRequests(java.lang.Boolean)
      Returns:
      Boolean property
    • returnThinking

      Optional<Boolean> returnThinking()
      Generated from OpenAiChatModel.OpenAiChatModelBuilder.returnThinking(java.lang.Boolean)
      Returns:
      Boolean property
    • modelName

      Optional<String> modelName()
      Generated from OpenAiChatModel.OpenAiChatModelBuilder.modelName(java.lang.String)
      Returns:
      String property
    • baseUrl

      Optional<String> baseUrl()
      Generated from OpenAiChatModel.OpenAiChatModelBuilder.baseUrl(java.lang.String)
      Returns:
      String property
    • maxRetries

      Optional<Integer> maxRetries()
      Generated from OpenAiChatModel.OpenAiChatModelBuilder.maxRetries(java.lang.Integer)
      Returns:
      Integer property
    • stop

      List<String> stop()
      Generated from OpenAiChatModel.OpenAiChatModelBuilder.stop(java.util.List)
      Returns:
      List property
    • strictTools

      Optional<Boolean> strictTools()
      Generated from OpenAiChatModel.OpenAiChatModelBuilder.strictTools(java.lang.Boolean)
      Returns:
      Boolean property
    • serviceTier

      Optional<String> serviceTier()
      Generated from OpenAiChatModel.OpenAiChatModelBuilder.serviceTier(java.lang.String)
      Returns:
      String property
    • projectId

      Optional<String> projectId()
      Generated from OpenAiChatModel.OpenAiChatModelBuilder.projectId(java.lang.String)
      Returns:
      String property
    • user

      Optional<String> user()
      Generated from OpenAiChatModel.OpenAiChatModelBuilder.user(java.lang.String)
      Returns:
      String property
    • responseFormat

      Optional<String> responseFormat()
      Enables "JSON mode" in the model configuration, which forces the LLM to respond with valid JSON. For newer models that support Structured Outputs, use supported-capabilities instead.
      Returns:
      "json_object" to enable JSON mode on older models like gpt-3.5-turbo or gpt-4
    • configuredBuilder

      default dev.langchain4j.model.openai.OpenAiChatModel.OpenAiChatModelBuilder configuredBuilder()
      Skipped:
      • modelName - property already exists -> dev.langchain4j.model.openai.OpenAiChatModel.OpenAiChatModelBuilder modelName(java.lang.String)
      • responseFormat - property already exists -> dev.langchain4j.model.openai.OpenAiChatModel.OpenAiChatModelBuilder responseFormat(java.lang.String)
      • supportedCapabilities - property already exists -> dev.langchain4j.model.openai.OpenAiChatModel.OpenAiChatModelBuilder supportedCapabilities(java.util.Set<dev.langchain4j.model.chat.Capability>)
      • build - doesn't have exactly one parameter

      Overridden:

      • OpenAiLc4jProvider.responseFormat()
      Returns:
      the actual LangChain4j (Lc4j) model builder configured from this blueprint
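
      A hedged end-to-end sketch: the blueprint is built first, configuredBuilder() then returns the LangChain4j OpenAiChatModelBuilder with the blueprint's properties applied, and LangChain4j's own build() produces the model. The builder setters and the chat(String) call assume a current LangChain4j API:

          import dev.langchain4j.model.openai.OpenAiChatModel;

          OpenAiChatModelConfig modelConfig = OpenAiChatModelConfig.builder()
                  .enabled(true)
                  .apiKey(System.getenv("OPENAI_API_KEY"))
                  .modelName("gpt-4o-mini")
                  .build();

          // configuredBuilder() pre-populates the LangChain4j builder from this blueprint;
          // further customization can still be applied before build().
          OpenAiChatModel model = modelConfig.configuredBuilder().build();
          String answer = model.chat("Say hello in one word.");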