All Superinterfaces:
Prototype.Api
All Known Implementing Classes:
OllamaStreamingChatModelConfig.BuilderBase.OllamaStreamingChatModelConfigImpl

public interface OllamaStreamingChatModelConfig extends Prototype.Api
Configuration interface for dev.langchain4j.model.ollama.OllamaStreamingChatModel, generated from its blueprint definition.

  • Method Details

    • builder

      Create a new fluent API builder to customize configuration.
      Returns:
      a new builder
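      A minimal usage sketch, assuming the generated builder exposes fluent setters that mirror the property getters documented below; the endpoint and model name are placeholder values:

        OllamaStreamingChatModelConfig config = OllamaStreamingChatModelConfig.builder()
                .enabled(true)                        // models are disabled by default
                .baseUrl("http://localhost:11434")    // assumed local Ollama endpoint
                .modelName("llama3")                  // placeholder model name
                .temperature(0.2)
                .build();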
    • builder

      Create a new fluent API builder from an existing instance.
      Parameters:
      instance - an existing instance used as a base for the builder
      Returns:
      a builder based on an instance
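      A sketch of deriving a modified copy from an existing instance (existingConfig stands for any previously built OllamaStreamingChatModelConfig; setter names are assumed to mirror the getters):

        OllamaStreamingChatModelConfig tuned = OllamaStreamingChatModelConfig.builder(existingConfig)
                .temperature(0.7)   // override a single property, inherit the rest
                .build();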
    • create

      static OllamaStreamingChatModelConfig create(Config config)
      Create a new instance from configuration.
      Parameters:
      config - used to configure the new instance
      Returns:
      a new instance configured from configuration
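      A sketch of loading this configuration from Helidon config; the configuration key used here is an assumption and should match wherever the streaming chat model is configured in your application:

        io.helidon.config.Config root = io.helidon.config.Config.create();
        OllamaStreamingChatModelConfig chatConfig =
                OllamaStreamingChatModelConfig.create(root.get("langchain4j.ollama.streaming-chat-model"));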
    • create

      Create a new instance with default values.
      Returns:
      a new instance
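      A sketch of creating an instance with defaults only; since enabled() defaults to false, such an instance describes a model that will not be activated:

        OllamaStreamingChatModelConfig defaults = OllamaStreamingChatModelConfig.create();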
    • enabled

      boolean enabled()
      If set to false (default), OllamaStreamingChatModel will not be available even if configured.
      Returns:
      whether OllamaStreamingChatModel is enabled, defaults to false
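      A sketch of honoring this flag before building the model (chatConfig is an instance obtained as in the earlier sketches; configuredBuilder() is documented at the end of this section):

        if (chatConfig.enabled()) {
            var model = chatConfig.configuredBuilder().build();
            // ... use or register the streaming chat model
        }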
    • mirostat

      Optional<Integer> mirostat()
      Generated from OllamaBaseChatModel.Builder.mirostat(java.lang.Integer)
      Returns:
      Integer property
    • seed

      Optional<Integer> seed()
      Generated from OllamaBaseChatModel.Builder.seed(java.lang.Integer)
      Returns:
      Integer property
    • logger

      Optional<org.slf4j.Logger> logger()
      Generated from OllamaBaseChatModel.Builder.logger(org.slf4j.Logger)
      Returns:
      Logger property
    • responseFormat

      Optional<dev.langchain4j.model.chat.request.ResponseFormat> responseFormat()
      Generated from OllamaBaseChatModel.Builder.responseFormat(dev.langchain4j.model.chat.request.ResponseFormat)
      Returns:
      ResponseFormat property
    • defaultRequestParameters

      Optional<dev.langchain4j.model.chat.request.ChatRequestParameters> defaultRequestParameters()
      Generated from OllamaBaseChatModel.Builder.defaultRequestParameters(dev.langchain4j.model.chat.request.ChatRequestParameters)
      Returns:
      ChatRequestParameters property
    • minP

      Optional<Double> minP()
      Generated from OllamaBaseChatModel.Builder.minP(java.lang.Double)
      Returns:
      Double property
    • timeout

      Optional<Duration> timeout()
      Generated from OllamaBaseChatModel.Builder.timeout(java.time.Duration)
      Returns:
      Duration property
    • httpClientBuilder

      Optional<dev.langchain4j.http.client.HttpClientBuilder> httpClientBuilder()
      Generated from OllamaBaseChatModel.Builder.httpClientBuilder(dev.langchain4j.http.client.HttpClientBuilder)
      Returns:
      HttpClientBuilder property
    • temperature

      Optional<Double> temperature()
      Generated from OllamaBaseChatModel.Builder.temperature(java.lang.Double)
      Returns:
      Double property
    • numCtx

      Optional<Integer> numCtx()
      Generated from OllamaBaseChatModel.Builder.numCtx(java.lang.Integer)
      Returns:
      Integer property
    • repeatPenalty

      Optional<Double> repeatPenalty()
      Generated from OllamaBaseChatModel.Builder.repeatPenalty(java.lang.Double)
      Returns:
      Double property
    • numPredict

      Optional<Integer> numPredict()
      Generated from OllamaBaseChatModel.Builder.numPredict(java.lang.Integer)
      Returns:
      Integer property
    • customHeaders

      Map<String,String> customHeaders()
      Generated from OllamaBaseChatModel.Builder.customHeaders(java.util.Map)
      Returns:
      Map property
    • topK

      Optional<Integer> topK()
      Generated from OllamaBaseChatModel.Builder.topK(java.lang.Integer)
      Returns:
      Integer property
    • think

      Optional<Boolean> think()
      Generated from OllamaBaseChatModel.Builder.think(java.lang.Boolean)
      Returns:
      Boolean property
    • mirostatEta

      Optional<Double> mirostatEta()
      Generated from OllamaBaseChatModel.Builder.mirostatEta(java.lang.Double)
      Returns:
      Double property
    • listeners

      List<dev.langchain4j.model.chat.listener.ChatModelListener> listeners()
      Generated from OllamaBaseChatModel.Builder.listeners(java.util.List)
      Returns:
      List property
    • supportedCapabilities

      Set<dev.langchain4j.model.chat.Capability> supportedCapabilities()
      Generated from OllamaBaseChatModel.Builder.supportedCapabilities(java.util.Set)
      Returns:
      Set property
    • logResponses

      Optional<Boolean> logResponses()
      Generated from OllamaBaseChatModel.Builder.logResponses(java.lang.Boolean)
      Returns:
      Boolean property
    • topP

      Optional<Double> topP()
      Generated from OllamaBaseChatModel.Builder.topP(java.lang.Double)
      Returns:
      Double property
    • logRequests

      Optional<Boolean> logRequests()
      Generated from OllamaBaseChatModel.Builder.logRequests(java.lang.Boolean)
      Returns:
      Boolean property
    • returnThinking

      Optional<Boolean> returnThinking()
      Generated from OllamaBaseChatModel.Builder.returnThinking(java.lang.Boolean)
      Returns:
      Boolean property
    • modelName

      Optional<String> modelName()
      Generated from OllamaBaseChatModel.Builder.modelName(java.lang.String)
      Returns:
      String property
    • baseUrl

      Optional<String> baseUrl()
      Generated from OllamaBaseChatModel.Builder.baseUrl(java.lang.String)
      Returns:
      String property
    • stop

      List<String> stop()
      Generated from OllamaBaseChatModel.Builder.stop(java.util.List)
      Returns:
      List property
    • mirostatTau

      Optional<Double> mirostatTau()
      Generated from OllamaBaseChatModel.Builder.mirostatTau(java.lang.Double)
      Returns:
      Double property
    • repeatLastN

      Optional<Integer> repeatLastN()
      Generated from OllamaBaseChatModel.Builder.repeatLastN(java.lang.Integer)
      Returns:
      Integer property
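      The Optional-returning property getters above carry a value only when the corresponding property was explicitly configured; a consumption sketch (the fallback values are arbitrary examples, not library defaults):

        java.time.Duration timeout = chatConfig.timeout().orElse(java.time.Duration.ofMinutes(1));
        String baseUrl = chatConfig.baseUrl().orElse("http://localhost:11434");
        chatConfig.temperature().ifPresent(t -> System.out.println("temperature: " + t));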
    • configuredBuilder

      default dev.langchain4j.model.ollama.OllamaStreamingChatModel.OllamaStreamingChatModelBuilder configuredBuilder()
      Skipped (langchain4j builder methods not mapped to configuration properties):
      • build - doesn't have exactly one parameter
      • supportedCapabilities - property already exists -> B extends dev.langchain4j.model.ollama.OllamaBaseChatModel.Builder<C extends dev.langchain4j.model.ollama.OllamaBaseChatModel> supportedCapabilities(java.util.Set<dev.langchain4j.model.chat.Capability>)
      Returns:
      the actual langchain4j (lc4j) model builder, configured from this blueprint
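      A usage sketch: obtain the langchain4j builder pre-populated from this configuration, optionally adjust it further, and build the actual streaming model:

        dev.langchain4j.model.ollama.OllamaStreamingChatModel model =
                chatConfig.configuredBuilder()
                        .logRequests(true)   // further tweaks on the langchain4j builder remain possible
                        .build();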