Bug Report: com.fasterxml.jackson.core.JsonParseException: Unexpected close marker ']': expected '}' (for Object starting at [Source: REDACTED (StreamReadFeature.INCLUDE_SOURCE_IN_LOCATION disabled); line: 1, column: 235]) at [Source: REDACTED (StreamReadFeature.INCLUDE_SOURCE_IN_LOCATION disabled); line: 1, column: 289]

Spring AI version: 1.0.1
Spring Boot version: 3.5.5
Java version: 17

Description: the JSON content is the following, which is malformed (note the two places where an object is closed with `]` instead of `}]`, inside the `weCubeKeys` arrays):

```json
{"jobParam": {"clusterType": "xx", "noAddClusterAgain": false, "noQuitCluster": false, "receiveAccount": "xx", "noReturnMachine": true}, "borrowNodeCondition": {"nodeFromLabel": "buffer", "notBorrow": false, "weCubeKeys": [{"field": "idc", "relation": "eq", "value": "x"]}, "creator": "XXX", "rate": 6, "sourceCondition": {"weCubeKeys": [{"field": "idc", "relation": "in", "values": ["s", "a"]}, {"field": "cluster_alias", "relation": "eq", "value": "abc"}, {"field": "node_ip", "relation": "eq", "value": "127.0.0.1"]}, "taskName": ""}, "conversionId": "1"}
```

This is not valid JSON.
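The parse failure can be reproduced outside Spring AI with Jackson alone; a minimal sketch (class and helper names are mine) using the same broken array-closing pattern as the arguments above:

```java
import com.fasterxml.jackson.core.JsonParseException;
import com.fasterxml.jackson.databind.ObjectMapper;

public class MalformedJsonRepro {

    // Returns Jackson's parse-error message, or null if the JSON is valid.
    static String parseError(String json) {
        try {
            new ObjectMapper().readTree(json);
            return null;
        } catch (JsonParseException e) {
            return e.getOriginalMessage();
        } catch (Exception e) {
            return e.getMessage();
        }
    }

    public static void main(String[] args) {
        // Object inside an array is closed with ']' instead of '}', as in the report:
        String bad = "{\"weCubeKeys\": [{\"field\": \"idc\", \"relation\": \"eq\", \"value\": \"x\"]}";
        System.out.println(parseError(bad)); // reports: Unexpected close marker ']': expected '}' ...
    }
}
```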


The exception occurs with this streaming call:

```java
chatClient.prompt()
    .toolCallbacks(ToolCallbacks.from(XXX))
    .user(xxx)
    .advisors(spec -> spec.param(ChatMemory.CONVERSATION_ID, XXX))
    .stream()
    .content()
```

The tool code is:

```java
@Tool(description = "发起k8s节点迁移任务") // "Start a k8s node migration task"
public String startMgrTask(
        @ToolParam(description = "迁移任务参数") AiMgrJobParam jobParam, // "migration task parameters"
        @ToolParam(description = "当前会话id") String conversionId) {   // "current conversation id"
}
```

I suspect this is a bug in tool calling over the streaming API, because no exception occurs when using the synchronous way:

```java
chatClient.prompt()
    .system(SYSTEM_PROMPT)
    .toolCallbacks(ToolCallbacks.from())
    .user()
    .advisors(spec -> spec.param(ChatMemory.CONVERSATION_ID, conversionKey))
    .call()
    .content()
```
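A hedged guess at why only the streaming path fails: in streaming mode, tool-call arguments arrive as string deltas that must be concatenated in order, so a chunk that is dropped or mis-merged during stream aggregation yields exactly this kind of truncated JSON. A minimal sketch of that failure mode (all names hypothetical, not Spring AI internals):

```java
import java.util.List;

public class ToolArgAggregation {

    // Hypothetical aggregator: tool-call argument deltas from a stream
    // must be joined in arrival order to rebuild the full JSON string.
    static String aggregate(List<String> deltas) {
        return String.join("", deltas);
    }

    public static void main(String[] args) {
        List<String> deltas = List.of("{\"jobParam\": {\"clusterType\":", " \"xx\"}", "}");

        // All chunks kept: the result is valid JSON.
        System.out.println(aggregate(deltas));

        // A faulty aggregator that loses the middle chunk produces
        // broken JSON like the one in this report.
        System.out.println(aggregate(List.of(deltas.get(0), deltas.get(2))));
    }
}
```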

Comment From: kmw10693

Hi, thanks for reporting this issue.

I’d like to take this one and work on a fix.

Comment From: injae-kim

Fix PR: https://github.com/spring-projects/spring-ai/pull/4274 👍

Comment From: wykeli

Thank you. Your feedback is extremely prompt; you are the best

Comment From: wykeli

> Fix PR: #4274 👍

May I ask a question: what Maven configuration and dependency version are used in your code case? I cannot find `OpenAiChatOptions.builder().toolExecutionEligibilityPredicate()`. Thank you for your support!

Comment From: dev-jonghoonpark

@wykeli The PR has been created, but it hasn’t been merged yet.

Once it gets merged and included in the official release, you’ll be able to use it via Maven. If you need to use it right away, you can also build it yourself and use it directly.

Comment From: dev-jonghoonpark

@wykeli Looking at the PR, it seems like it would work if you just add the implementation code from the PR into your project. Would you like to try it?

Comment From: wykeli

> @wykeli Looking at the PR, it seems like it would work if you just add the implementation code from the PR into your project. Would you like to try it?

@dev-jonghoonpark OK, thanks. Additionally, I've found that when Spring AI performs streaming calls to certain LLMs with "thinking" capabilities, every additional tool call during the process causes one more think tag to appear in the model output. This doesn't occur when tool calls aren't used or when the calls are synchronous.


Comment From: wykeli

> Fix PR: #4274 👍

I think this PR is not working. `ToolExecutionEligibilityPredicate` is the interface that decides whether to execute a tool call, and when a tool call finishes the finish reason is set to TOOL_CALLS or STOP. However, I suspect no tool call will ever execute under `ToolOnFinishPredicate`, because the condition

`hasToolCalls && ("tool_calls".equalsIgnoreCase(finishReason) || "stop".equalsIgnoreCase(finishReason))`

will never be true: no tool will have been executed at the point it is evaluated.
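The condition quoted above can be sketched in isolation to see exactly when it passes and fails (names are hypothetical; this is not the actual PR code):

```java
public class FinishReasonCheck {

    // Sketch of the eligibility condition under discussion: tool execution
    // is allowed only once the response carries tool calls AND the stream
    // has reached a terminal finish reason.
    static boolean eligible(boolean hasToolCalls, String finishReason) {
        return hasToolCalls
                && ("tool_calls".equalsIgnoreCase(finishReason)
                    || "stop".equalsIgnoreCase(finishReason));
    }

    public static void main(String[] args) {
        System.out.println(eligible(true, "tool_calls")); // true: terminal chunk with tool calls
        System.out.println(eligible(true, null));         // false: mid-stream chunk, no finish reason yet
        System.out.println(eligible(false, "stop"));      // false: finished without tool calls
    }
}
```

Note that mid-stream chunks typically carry no finish reason, so in streaming mode the condition can only become true on the final chunk, which is the crux of the concern raised here.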