Local AI inference engines such as Ollama and Docker Model Runner expose OpenAI-compatible endpoints:

- Ollama: http://localhost:11434
- Docker Model Runner: http://localhost:12434/engines

This makes them accessible from my application through the spring-ai-starter-model-openai dependency. They don't require an API key.
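For reference, pointing the starter at one of these engines is just a matter of overriding the base URL. A minimal sketch; the URLs come from above, while the model name is an assumption (use whatever model you have pulled locally):

```properties
# Route the OpenAI starter to the local Ollama endpoint instead of api.openai.com
spring.ai.openai.base-url=http://localhost:11434
# For Docker Model Runner, use this base URL instead:
# spring.ai.openai.base-url=http://localhost:12434/engines

# Assumed model name; replace with a model available on your local engine
spring.ai.openai.chat.options.model=llama3
```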

However, when I don't add the setting spring.ai.openai.api-key to my application, I get this exception when the application starts:

```
java.lang.IllegalArgumentException: OpenAI API key must be set. Use the connection property: spring.ai.openai.api-key or spring.ai.openai.chat.api-key property.
```

For the moment I work around this by adding the following setting: spring.ai.openai.api-key=dummy
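With the dummy key and the base URL in place, requests from the starter reach the local engine. A minimal usage sketch, assuming the properties shown above; the class, bean, and prompt are illustrative, not from my actual application:

```java
import org.springframework.ai.chat.client.ChatClient;
import org.springframework.boot.CommandLineRunner;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
class LocalModelDemo {

    // ChatClient.Builder is auto-configured by spring-ai-starter-model-openai;
    // with the properties above, calls go to the local OpenAI-compatible endpoint.
    @Bean
    CommandLineRunner demo(ChatClient.Builder builder) {
        return args -> {
            ChatClient chat = builder.build();
            String answer = chat.prompt()
                    .user("Say hello")   // illustrative prompt
                    .call()
                    .content();
            System.out.println(answer);
        };
    }
}
```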

Comment From: ilayaperumalg

@hansdesmet While this would be a use case to support, not requiring the API key by default would make applications that connect to API-key-based LLMs fail as well. As a trade-off, we went ahead with requiring a valid API key in the first place.