Expected Behavior

I want to get the original response from the LLM provider directly, but I can't find a solution.

Current Behavior

The ChatModel (and presumably the audio and embedding models as well) returns a ChatResponse that was transformed from the ChatCompletion (the original API response). How can I directly get the ChatCompletion?
Waiting for help.
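For illustration, here is a minimal sketch of what I mean (the ChatModel wiring and the prompt are assumptions, not my actual code):

```java
import org.springframework.ai.chat.model.ChatModel;
import org.springframework.ai.chat.model.ChatResponse;
import org.springframework.ai.chat.prompt.Prompt;

class CurrentBehaviorExample {

    // Assumes an injected ChatModel bean (e.g. the auto-configured OpenAiChatModel).
    ChatResponse ask(ChatModel chatModel) {
        // call() returns Spring AI's portable ChatResponse; the provider's raw
        // ChatCompletion has already been mapped away and is not reachable here.
        return chatModel.call(new Prompt("Hello"));
    }
}
```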
Comment From: fantasy-lotus
Maybe we can use an API class like OpenAiApi directly by calling chatCompletionEntity()? It seems to work, but it's a little complex, and it also discards the Model and Client classes.
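A rough sketch of that approach (the API-key lookup and model name are placeholders; the OpenAiApi constructor and the exact ChatCompletionRequest parameters vary between Spring AI versions, with newer versions using a builder):

```java
import java.util.List;

import org.springframework.ai.openai.api.OpenAiApi;
import org.springframework.ai.openai.api.OpenAiApi.ChatCompletion;
import org.springframework.ai.openai.api.OpenAiApi.ChatCompletionMessage;
import org.springframework.ai.openai.api.OpenAiApi.ChatCompletionMessage.Role;
import org.springframework.ai.openai.api.OpenAiApi.ChatCompletionRequest;
import org.springframework.http.ResponseEntity;

class RawCompletionExample {

    ChatCompletion rawCompletion() {
        // Low-level client; this bypasses the ChatModel/ChatClient abstractions entirely.
        OpenAiApi openAiApi = new OpenAiApi(System.getenv("OPENAI_API_KEY"));

        ChatCompletionMessage message = new ChatCompletionMessage("Hello world", Role.USER);

        // chatCompletionEntity() returns the provider's ChatCompletion untransformed.
        ResponseEntity<ChatCompletion> entity = openAiApi.chatCompletionEntity(
                new ChatCompletionRequest(List.of(message), "gpt-4o-mini", 0.7, false));

        return entity.getBody();
    }
}
```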
Comment From: sunyuhan1998
So far, all the methods in Spring AI for obtaining model response results seem to be based on the ChatCompletionChunk object that Spring AI uses to wrap the raw output from the model. There have indeed been some previous discussions regarding this limitation, and a few workarounds have emerged, such as using a custom WebClientCustomizer to intercept the raw responses from the underlying WebClient. However, up to this point, I haven't yet seen an elegant way to directly obtain the original JSON returned by the model.
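A minimal sketch of that WebClientCustomizer workaround (names here are illustrative; it assumes the model's WebClient is built from the auto-configured builder, and note that the synchronous path goes through RestClient, which would need a RestClientCustomizer instead; buffering the body like this would also break streaming/SSE responses):

```java
import org.springframework.boot.web.reactive.function.client.WebClientCustomizer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.web.reactive.function.client.ExchangeFilterFunction;

@Configuration
class RawResponseCaptureConfig {

    // Applies to every WebClient.Builder that Spring Boot auto-configures,
    // which (assumption) includes the one used by the OpenAI streaming client.
    @Bean
    WebClientCustomizer rawResponseCapture() {
        return builder -> builder.filter(ExchangeFilterFunction.ofResponseProcessor(response ->
                // Buffer the raw body, inspect it, then hand an equivalent copy downstream.
                response.bodyToMono(String.class)
                        .defaultIfEmpty("")
                        .map(body -> {
                            System.out.println("Raw provider response: " + body);
                            return response.mutate().body(body).build();
                        })));
    }
}
```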
Comment From: fantasy-lotus
> However, up to this point, I haven't yet seen an elegant way to directly obtain the original JSON returned by the model.
Thanks, I got it. I've thought about using an interceptor or AOP to accomplish this, but it may introduce some complexity. If the community plans to add this feature or has a relevant roadmap in the future, please notify me; I'd be happy to contribute.
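For what it's worth, a hypothetical sketch of the AOP idea (the pointcut targets OpenAiApi.chatCompletionEntity and only fires if OpenAiApi is a proxyable Spring bean, which the default auto-configuration does not provide out of the box; this is part of the complexity I mentioned):

```java
import org.aspectj.lang.ProceedingJoinPoint;
import org.aspectj.lang.annotation.Around;
import org.aspectj.lang.annotation.Aspect;
import org.springframework.http.ResponseEntity;
import org.springframework.stereotype.Component;

@Aspect
@Component
class RawCompletionAspect {

    // Hypothetical pointcut: intercept the low-level entity call and peek at the
    // untransformed ChatCompletion before Spring AI maps it into a ChatResponse.
    @Around("execution(* org.springframework.ai.openai.api.OpenAiApi.chatCompletionEntity(..))")
    Object captureRawCompletion(ProceedingJoinPoint pjp) throws Throwable {
        Object result = pjp.proceed();
        if (result instanceof ResponseEntity<?> entity) {
            System.out.println("Raw ChatCompletion: " + entity.getBody());
        }
        return result;
    }
}
```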