Bug description
When I use the ChatOptions.builder() method to create a set of chat options and set these options on my Prompt instance, no tool calls are available to the LLM.
Environment
OpenJDK 21.0.7 with Spring AI 1.1.0-SNAPSHOT
Steps to reproduce
```java
ChatOptions chatOptions = ChatOptions.builder().model("qwen3:14b").build();

ChatResponse chatResponse = chatClient.prompt()
        .tools(new DateTimeTools())
        .user("What is the current date and time?")
        .options(chatOptions)
        .call()
        .chatResponse();
```
Expected behavior
My expectation was that the tools defined in the DateTimeTools class would be available to the model, but they were not. 😲
If you visit the DefaultChatClientUtils file and place a breakpoint on line 96, you will see what actually happens: we check whether the provided ChatOptions instance is of the type ToolCallingChatOptions, and when we see that it's not, we skip the rest of the tool processing. In this case we miss the opportunity to add the tool callbacks from the prompt itself on line 103.
https://github.com/spring-projects/spring-ai/blob/main/spring-ai-client-chat/src/main/java/org/springframework/ai/chat/client/DefaultChatClientUtils.java#L96-L114
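To illustrate the shape of that guard, here is a self-contained sketch with hypothetical stand-in types (not the real Spring AI classes): tools attached to the prompt are only merged when the options object implements the tool-calling subtype, otherwise they are dropped.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical stand-ins for the Spring AI option types, just to show
// the effect of the instanceof guard in DefaultChatClientUtils.
interface Options {}
interface ToolOptions extends Options { List<String> toolNames(); }

record PlainOptions(String model) implements Options {}
record ToolAwareOptions(String model, List<String> toolNames) implements ToolOptions {}

public class ToolGuardDemo {

    // Prompt tools are merged only when the options object implements
    // the tool-calling subtype; otherwise they are silently skipped.
    static List<String> effectiveTools(Options options, List<String> promptTools) {
        if (options instanceof ToolOptions toolOptions) {
            List<String> merged = new ArrayList<>(toolOptions.toolNames());
            merged.addAll(promptTools);
            return merged;
        }
        return List.of(); // plain options: the prompt's tools are dropped
    }

    public static void main(String[] args) {
        List<String> promptTools = List.of("getCurrentDateTime");
        // Plain options: the prompt's tools never reach the model.
        System.out.println(effectiveTools(new PlainOptions("qwen3:14b"), promptTools));
        // Tool-aware options: the prompt's tools are merged in.
        System.out.println(effectiveTools(new ToolAwareOptions("qwen3:14b", List.of()), promptTools));
    }
}
```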
Is this the expected behavior? If so, I think this is worthy of a callout in the documentation. It took me a while to figure this out; much time would have been saved if I had been creating a ToolCallingChatOptions instance in the first place.
I would be happy to either open a PR to ensure we add prompt tools every time or to update the documentation, whichever makes the most sense.
Minimal Complete Reproducible example
The code above provides most of the code needed for a working example; please let me know if more information is needed.
Thank you! This project is already a big time saver for me, I really appreciate it! 💜
Comment From: cmiles74
I found a similar issue with the Bedrock Converse API. I haven't dug in yet, but if I create a ToolCallingChatOptions instance and set it on the prompt, no tools are available. It does work if I create a BedrockChatOptions instance.
Comment From: sunyuhan1998
Hi @cmiles74, this issue is currently being followed up in PR #3417. Can you help confirm whether that PR addresses the issue you mentioned?
Comment From: cmiles74
Hi @sunyuhan1998, this does resolve one half of my issue. 😎 When I set options with the ChatOptions.Builder and invoke a model with ollama, the tool calls are present.
I can see that your change sets all of the options on processedChatOptions one-by-one in the toChatClientRequest method of DefaultChatClientUtils. This works, but it might be a little nicer to add another option to the ChatOptions.Builder that accepts another ChatOptions instance. Then you are left with a nice one-liner:
```java
if (processedChatOptions instanceof DefaultChatOptions defaultChatOptions) {
    if (!inputRequest.getToolNames().isEmpty() || !inputRequest.getToolCallbacks().isEmpty()
            || !CollectionUtils.isEmpty(inputRequest.getToolContext())) {
        processedChatOptions = DefaultToolCallingChatOptions.builder().chatOptions(defaultChatOptions).build();
    }
}
```
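For what it's worth, the builder shape I have in mind can be sketched in a self-contained way (hypothetical names, not the actual Spring AI API): a builder method that accepts an existing options instance and copies its fields, so upgrading to a tool-aware copy is a one-liner.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch of a builder that can seed itself from an
// existing options instance (names are illustrative only).
class SimpleChatOptions {
    String model;
    Double temperature;
}

class ToolChatOptions extends SimpleChatOptions {
    final List<String> toolNames = new ArrayList<>();

    static Builder builder() { return new Builder(); }

    static class Builder {
        private final ToolChatOptions options = new ToolChatOptions();

        // The suggested addition: copy every field from an existing instance.
        Builder chatOptions(SimpleChatOptions source) {
            options.model = source.model;
            options.temperature = source.temperature;
            return this;
        }

        Builder toolNames(List<String> names) {
            options.toolNames.addAll(names);
            return this;
        }

        ToolChatOptions build() { return options; }
    }
}

public class BuilderCopyDemo {
    public static void main(String[] args) {
        SimpleChatOptions plain = new SimpleChatOptions();
        plain.model = "qwen3:14b";

        // One-liner upgrade from plain options to a tool-aware copy.
        ToolChatOptions upgraded = ToolChatOptions.builder().chatOptions(plain).build();
        System.out.println(upgraded.model);
    }
}
```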
I have these changes in this branch if you would like to have them. I tried to open a PR against your branch but I couldn't get it done.

* https://github.com/cmiles74/spring-ai/commit/6d31b5ae3c4360b05517a15d4393a60da93e287e
However, when I invoke a bedrock-converse model, the tool options are missing. I will need to look into this one further. 😢
Thank you for your help with this, I really appreciate it! 💜
Comment From: sunyuhan1998
@cmiles74 Thank you for your suggestion! This inspired me, and I noticed that we can directly use org.springframework.ai.model.ModelOptionsUtils#copyToTarget to avoid lengthy code. I have already added that commit to the PR.
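For readers following along, the idea behind a generic copy-to-target helper can be sketched in plain Java. This is an illustrative reflection-based stand-in, not the real ModelOptionsUtils implementation: walk the source's fields and copy any field the target class also has, which is what replaces the one-by-one setter calls.

```java
import java.lang.reflect.Field;
import java.util.List;

// Illustrative sketch only: a generic "copy options to a target type"
// helper in the spirit of a copy-to-target utility, replacing
// field-by-field copying code.
public class CopyToTargetDemo {

    static <T> T copyToTarget(Object source, Class<T> targetType) throws Exception {
        T target = targetType.getDeclaredConstructor().newInstance();
        for (Field sourceField : source.getClass().getDeclaredFields()) {
            sourceField.setAccessible(true);
            try {
                Field targetField = findField(targetType, sourceField.getName());
                targetField.setAccessible(true);
                targetField.set(target, sourceField.get(source));
            }
            catch (NoSuchFieldException ignored) {
                // target has no matching field; skip it
            }
        }
        return target;
    }

    // Look the field up on the class and its superclasses.
    static Field findField(Class<?> type, String name) throws NoSuchFieldException {
        for (Class<?> c = type; c != null; c = c.getSuperclass()) {
            try { return c.getDeclaredField(name); }
            catch (NoSuchFieldException ignored) { }
        }
        throw new NoSuchFieldException(name);
    }

    // Hypothetical option types for the demo.
    static class PlainOptions { String model = "qwen3:14b"; Double temperature = 0.7; }
    static class ToolOptions extends PlainOptions { List<String> toolNames = List.of(); }

    public static void main(String[] args) throws Exception {
        // All shared fields land on the tool-aware copy in one call.
        ToolOptions copied = copyToTarget(new PlainOptions(), ToolOptions.class);
        System.out.println(copied.model + " " + copied.temperature);
    }
}
```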
If you encounter any issues while using bedrock-converse, we can also discuss them together!