Interface OpenAIHttpExecutor<I extends Streamable, O extends Mergeable<O>>

    • Method Detail

      • execute

        O execute(I request)
        Executes the HTTP request synchronously.
        Parameters:
        request - The request (Input) model
        Returns:
        deserialized HTTP Response in the Output model
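The synchronous contract can be sketched with toy stand-ins for the library types. EchoRequest, EchoResponse, and SyncExecutor below are illustrative names, not part of the library; the point is only that execute blocks until the whole body is read and returns the deserialized output model.

```java
// Minimal stand-ins for the library's type bounds; illustrative only.
interface Streamable { boolean stream(); }
interface Mergeable<O> { O merge(O other); }

// Hypothetical request/response pair playing the roles of I and O.
class EchoRequest implements Streamable {
    final String prompt;
    EchoRequest(String prompt) { this.prompt = prompt; }
    public boolean stream() { return false; }
}

class EchoResponse implements Mergeable<EchoResponse> {
    final String text;
    EchoResponse(String text) { this.text = text; }
    // Mergeable lets partial (streamed) responses be folded into one.
    public EchoResponse merge(EchoResponse other) {
        return new EchoResponse(text + other.text);
    }
}

class SyncExecutor {
    // Mirrors O execute(I request): block for the full HTTP round trip,
    // then return the deserialized output model. The "HTTP" part is faked.
    EchoResponse execute(EchoRequest request) {
        return new EchoResponse("echo: " + request.prompt);
    }
}
```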
      • executeAsync

        void executeAsync(I request,
                          Consumer<String> callBack,
                          Consumer<O> finalizer)
        Executes the HTTP request asynchronously. Since the response can be streamed, it may be useful for the developer to subscribe to each line as it arrives, hence the callBack parameter. To consume the whole response at once instead, use the finalizer parameter.
        Parameters:
        request - The request (Input) model
        callBack - A callback of type stringLine -> consume(stringLine)
        finalizer - A callback of type outputModel -> consume(outputModel)
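The division of labor between the two callbacks can be sketched as follows. AsyncSketch is a hypothetical stand-in (the output model is simplified to a String, and the "stream" is a pre-built list of lines): callBack fires once per line, finalizer fires once with the merged whole.

```java
import java.util.List;
import java.util.function.Consumer;

// Illustrative sketch of the callBack/finalizer contract; not library code.
class AsyncSketch {
    static void executeAsync(List<String> streamedLines,
                             Consumer<String> callBack,
                             Consumer<String> finalizer) {
        StringBuilder merged = new StringBuilder();
        for (String line : streamedLines) {
            callBack.accept(line);           // per-line subscription
            merged.append(line);
        }
        finalizer.accept(merged.toString()); // whole-response subscription
    }
}
```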
      • executeReactive

        OpenAIHttpExecutor.ReactiveExecution<O> executeReactive(I request)
        Executes the HTTP request in a reactive fashion. We strongly recommend using this only when a real reactive runtime is present, such as Reactor Netty.
        Parameters:
        request - The request (Input) model
        Returns:
        OpenAIHttpExecutor.ReactiveExecution object holding a single-emission observable (Mono) of the whole response and a multi-emission observable (Flux) of the individual response lines.
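The shape of the returned holder can be sketched without a Reactor dependency. Here CompletableFuture stands in for Mono (one terminal value) and java.util.stream.Stream for Flux (many emissions); the class and factory method are illustrative, not the library's actual ReactiveExecution.

```java
import java.util.concurrent.CompletableFuture;
import java.util.stream.Stream;

// Stand-in for OpenAIHttpExecutor.ReactiveExecution<O>: one observable for
// the whole response, one for the individual lines.
class ReactiveExecution<O> {
    final CompletableFuture<O> whole;  // Mono-like: single terminal value
    final Stream<String> lines;        // Flux-like: many emissions

    ReactiveExecution(CompletableFuture<O> whole, Stream<String> lines) {
        this.whole = whole;
        this.lines = lines;
    }

    // Hypothetical factory: expose the same body both as a whole and
    // line by line.
    static ReactiveExecution<String> of(String body) {
        return new ReactiveExecution<>(
            CompletableFuture.completedFuture(body),
            body.lines());
    }
}
```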
      • canStream

        boolean canStream(I input)
        Denotes whether the response can be streamed, based mostly on the value of Streamable.stream(); implementations usually follow the pattern inputModel -> inputModel.stream() && specificApiLogic.
        Parameters:
        input - The request (Input) model
        Returns:
        true if the response can be streamed, false otherwise. Mainly useful for internal calls such as execute(Streamable)
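The usual implementation pattern, inputModel.stream() && specificApiLogic, can be sketched as a plain predicate. StreamCheck and its two flags are hypothetical; the second flag stands in for whatever endpoint-specific logic the concrete executor applies.

```java
// Illustrative canStream: streaming happens only when the caller asked for
// it (Streamable.stream()) AND the endpoint actually supports it.
class StreamCheck {
    static boolean canStream(boolean requestedStream,
                             boolean endpointSupportsStreaming) {
        return requestedStream && endpointSupportsStreaming;
    }
}
```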