Expected Behavior: I want to get the original response from the LLM provider, but I can't find a way to do it. Current Behavior: the chat model (and possibly the audio and embedding models as well) returns a ChatResponse, which is transformed from the ChatCompletion (the original API response). How can I directly get the ChatCompletion?
Still waiting for help.
Comment From: fantasy-lotus
Maybe the low-level API class, such as OpenAiApi, can be used directly by calling chatCompletionEntity()? That seems to work, but it is a little cumbersome, and it also discards the Model and Client abstractions.
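A minimal sketch of that workaround, assuming a recent Spring AI release (the OpenAiApi construction style, the ChatCompletionRequest convenience constructor, and the model name below are assumptions and vary between versions):

```java
import org.springframework.ai.openai.api.OpenAiApi;
import org.springframework.ai.openai.api.OpenAiApi.ChatCompletion;
import org.springframework.ai.openai.api.OpenAiApi.ChatCompletionMessage;
import org.springframework.ai.openai.api.OpenAiApi.ChatCompletionMessage.Role;
import org.springframework.ai.openai.api.OpenAiApi.ChatCompletionRequest;
import org.springframework.http.ResponseEntity;

import java.util.List;

public class RawChatCompletionExample {

    // Returns the untransformed ChatCompletion record instead of a ChatResponse.
    // Bypasses OpenAiChatModel / ChatClient entirely, as described above.
    static ChatCompletion rawChatCompletion(OpenAiApi openAiApi, String prompt) {
        // Convenience constructor: (messages, model, temperature).
        // "gpt-4o-mini" is a placeholder model name.
        ChatCompletionRequest request = new ChatCompletionRequest(
                List.of(new ChatCompletionMessage(prompt, Role.USER)),
                "gpt-4o-mini",
                0.7);

        // chatCompletionEntity returns the provider's response body as-is,
        // wrapped in a ResponseEntity (headers and status code included).
        ResponseEntity<ChatCompletion> entity = openAiApi.chatCompletionEntity(request);
        return entity.getBody();
    }
}
```

The trade-off is exactly the one noted above: you get the raw ChatCompletion (including fields ChatResponse does not carry over), but you lose the portability and conveniences of the Model and Client layers, so any provider-specific request options must be set by hand on the ChatCompletionRequest.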