**Bug description**
The ChatClient/LLM does not make use of the provided tool(s) when using Gemini.

**Environment**
- Spring AI 1.0.0-M8
- Spring Boot 3.4.5
- Gemini 2.0 Flash

**Steps to reproduce**
- Define a tool with name and description (`@Tool`)
- Use `VertexAiGeminiChatModel` for the `ChatClient`
- Call the prompt with `tools()` and `toolContext()`
- User prompt with an image
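The steps above might look like the following sketch (the class, tool, and file names are illustrative assumptions, not taken from the report; `chatClient` is assumed to be built on a `VertexAiGeminiChatModel`):

```java
import java.util.Map;

import org.springframework.ai.chat.client.ChatClient;
import org.springframework.ai.tool.annotation.Tool;
import org.springframework.core.io.ClassPathResource;
import org.springframework.util.MimeTypeUtils;

// Hypothetical tool class: a tool with a name and description, as in the report
class WeatherTools {

    @Tool(name = "getWeather", description = "Get the current weather for a city")
    String getWeather(String city) {
        return "Sunny in " + city;
    }
}

class GeminiToolRepro {

    // Prompt with tools(), toolContext(), and an image in the user message
    String ask(ChatClient chatClient) {
        return chatClient.prompt()
                .user(u -> u.text("What is the weather in the city shown in the image?")
                        .media(MimeTypeUtils.IMAGE_PNG, new ClassPathResource("city.png")))
                .tools(new WeatherTools())              // register the @Tool methods
                .toolContext(Map.of("userId", "42"))    // extra context passed to the tool
                .call()
                .content();
    }
}
```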

**Expected behavior**
When running the same ChatClient via OpenAI, it makes use of the tool as expected. Gemini seems to ignore the provided tool. I'm not sure if something is wrong on my side (the `"vertex.ai.gemini.chat.options.tool-names"` config is provided).

Comment From: thomasflad

This seems to be related to the issue mentioned here. It does indeed work when using the ANY mode, but that leads to the tool being called far too often. Has anyone figured this out?

It works properly with OpenAI.

Comment From: nlinhvu

From what you describe, I understand that Gemini always returns a completed result for your prompt instead of a function call with its arguments. If that is the case, the ChatClient can only return that final result instead of invoking the `MethodToolCallback` that actually executes the tool.
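One way to check whether Gemini ever emits a tool call for the prompt is to turn off Spring AI's internal tool-execution loop and inspect the raw response. This is a sketch under assumed names (the tool class and prompt text are illustrative):

```java
import org.springframework.ai.chat.client.ChatClient;
import org.springframework.ai.chat.model.ChatResponse;
import org.springframework.ai.model.tool.ToolCallingChatOptions;
import org.springframework.ai.tool.annotation.Tool;

class ToolCallInspector {

    // Hypothetical tool, mirroring the reporter's @Tool setup
    static class WeatherTools {
        @Tool(description = "Get the current weather for a city")
        String getWeather(String city) {
            return "Sunny in " + city;
        }
    }

    // Returns true if the model emitted a tool call rather than a final answer.
    boolean modelRequestedTool(ChatClient chatClient) {
        ChatResponse response = chatClient.prompt()
                .user("What is the weather in Berlin?")
                .tools(new WeatherTools())
                .options(ToolCallingChatOptions.builder()
                        // hand tool calls back to the caller instead of executing them
                        .internalToolExecutionEnabled(false)
                        .build())
                .call()
                .chatResponse();

        // An empty tool-call list means the model answered directly
        // and ignored the tool, which matches the reported behavior.
        return response.getResult().getOutput().hasToolCalls();
    }
}
```

If this returns `false` for Gemini but `true` for OpenAI with the same prompt, that would confirm the model never requests the tool in the first place.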

Comment From: wisnuwardoyo

Add this as the system prompt: "You are an API assistant. You must only output the function or tool to call, using strict JSON. Do not output commentary or internal thoughts."

```java
ChatClient chatClient = ChatClient.builder(chatModel)
        .defaultAdvisors(MessageChatMemoryAdvisor.builder(chatMemory).build())
        .defaultSystem("You are an API assistant. You must only output the function or tool to call, "
                + "using strict JSON. Do not output commentary or internal thoughts.")
        .build();
```