Bug description

```
'name': '', 'refusal': '', 'role': 'USER', 'tool_call_id': '', 'tool_calls': None}], 'metadata': None, 'modalities': None, 'model': 'Qwen/Qwen3-30B-A3B', 'n': None, 'parallel_tool_calls': None, 'presence_penalty': None, 'reasoning_effort': '', 'response_format': None, 'seed': None, 'service_tier': '', 'stop': None, 'store': None, 'stream': False, 'stream_options': None, 'temperature': 0.1, 'tool_choice': None, 'tools': None, 'top_logprobs': None, 'top_p': None, 'user': ''}, 'ctx': {'error': ValueError('When using tool_choice, tools must be set.')}}]","type":"BadRequestError","param":null,"code":400}
```

Environment: Java 21, Spring AI version: M7

Steps to reproduce

Image
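For reference, the 400 above is the server-side rule that `tool_choice` may only be sent together with `tools`. A minimal, hedged repro sketch (stdlib Java only; the endpoint URL is a hypothetical local vLLM instance, and the request is built but not sent here — sending it is what triggers the 400):

```java
import java.net.URI;
import java.net.http.HttpRequest;

public class ToolChoiceRepro {
    // Build a chat-completions body that sets tool_choice but omits tools.
    // An OpenAI-compatible server such as vLLM rejects this with:
    //   400 ValueError('When using tool_choice, tools must be set.')
    static String badBody() {
        return """
                {"model":"Qwen/Qwen3-30B-A3B",
                 "messages":[{"role":"user","content":"hi"}],
                 "tool_choice":"auto"}""";   // note: no "tools" array
    }

    public static void main(String[] args) {
        String body = badBody();
        // Hypothetical endpoint; POSTing this body reproduces the error.
        HttpRequest req = HttpRequest.newBuilder()
                .uri(URI.create("http://localhost:8000/v1/chat/completions"))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(body))
                .build();
        System.out.println(req.method() + " " + req.uri());
        System.out.println(body);
    }
}
```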

Comment From: markpollack

Not sure what is going on in this issue; I suspect it has to do with something that is incompatible with the new Qwen3 model. I've only run Qwen using Ollama, and I can't run a 30B parameter model.

I was not able to reproduce using qwen2.5:3b in OllamaFunctionCallbackIT. See here.

I am able to run qwen3 using ollama in the terminal, but when I switch to that model in the test, it isn't able to resolve the tag: `404 - {"error":"model \"qwen3:4b\" not found, try pulling it first"}`

I'll change the title to investigating qwen3 support.

If you can provide a full stack trace and/or reproducible example, that would help greatly.

Comment From: sunyuhan1998

> Not sure what is going on in this issue; I suspect it has to do with something that is incompatible with the new Qwen3 model. I've only run Qwen using Ollama, and I can't run a 30B parameter model.
>
> I was not able to reproduce using qwen2.5:3b in OllamaFunctionCallbackIT. See here.
>
> I am able to run qwen3 using ollama in the terminal, but when I switch to that model in the test, it isn't able to resolve the tag: `404 - {"error":"model \"qwen3:4b\" not found, try pulling it first"}`
>
> I'll change the title to investigating qwen3 support.
>
> If you can provide a full stack trace and/or reproducible example, that would help greatly.

I used the latest Spring AI code to connect to the qwen3-30b-a3b model provided by Alibaba Cloud DashScope through its OpenAI-compatible endpoint, attempting to reproduce this issue, but I was unable to replicate it.

Comment From: yunfong

I encountered a similar error while using Spring AI 1.0.0 with an OpenAI-compatible interface. The server indicated that non-stream requests must have the parameter enabled_thinking set to false, but there was no place to configure that parameter. Could Spring AI provide an extended OpenAI-compatible approach that allows passing additional parameters to the model?
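One possible workaround until such an extension point exists (a sketch, not an official Spring AI API): intercept the outgoing HTTP request and inject the extra field into the serialized JSON body before it reaches the model server. The stdlib-only helper below shows the body rewrite; in a real application you could apply it from a Spring `ClientHttpRequestInterceptor` registered on the `RestClient.Builder` that the OpenAI client is built with. The parameter name for disabling Qwen3's thinking mode is an assumption — check your server's documentation.

```java
public class ExtraBodyParam {
    /**
     * Insert a top-level key into a serialized JSON object.
     * Minimal string-based rewrite (no JSON library) for illustration;
     * a real interceptor would more likely re-serialize with Jackson.
     */
    static String addTopLevelField(String json, String key, String rawJsonValue) {
        int open = json.indexOf('{');
        String entry = "\"" + key + "\":" + rawJsonValue;
        String rest = json.substring(open + 1);
        // If the object is empty, no trailing comma is needed.
        String sep = rest.stripLeading().startsWith("}") ? "" : ",";
        return json.substring(0, open + 1) + entry + sep + rest;
    }

    public static void main(String[] args) {
        String body = "{\"model\":\"Qwen/Qwen3-30B-A3B\",\"stream\":false}";
        // Parameter name assumed from the discussion above -- verify against
        // your OpenAI-compatible server's documentation.
        System.out.println(addTopLevelField(body, "enable_thinking", "false"));
        // -> {"enable_thinking":false,"model":"Qwen/Qwen3-30B-A3B","stream":false}
    }
}
```

Wiring this into Spring AI would mean registering an interceptor that rewrites the request body bytes on the client used by `OpenAiApi`; whether your Spring AI version exposes a suitable `RestClient.Builder` hook should be verified against its docs.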