Today, when using

```java
Flux<String> flux = chatClient.prompt()
        .user(userText)
        .stream().content();
return flux;
```

an error occurred: `Duration evalDuration = response.getEvalDuration();` was null, and the subsequent `plus` call threw a NullPointerException.


```
java.lang.NullPointerException: Cannot invoke "java.time.Duration.plus(java.time.Duration)" because "evalDuration" is null
	at org.springframework.ai.ollama.OllamaChatModel.from(OllamaChatModel.java:174) ~[spring-ai-ollama-1.0.0.jar:1.0.0]
	Suppressed: reactor.core.publisher.FluxOnAssembly$OnAssemblyException:
Assembly trace from producer [reactor.core.publisher.FluxMapFuseable]:
	reactor.core.publisher.Flux.map(Flux.java:6631)
	org.springframework.ai.ollama.OllamaChatModel.lambda$internalStream$10(OllamaChatModel.java:309)
Error has been observed at the following site(s):
	*__Flux.map ⇢ at org.springframework.ai.ollama.OllamaChatModel.lambda$internalStream$10(OllamaChatModel.java:309)
	|_ Flux.flatMap ⇢ at org.springframework.ai.ollama.OllamaChatModel.lambda$internalStream$10(OllamaChatModel.java:336)
	|_ Flux.doOnError ⇢ at org.springframework.ai.ollama.OllamaChatModel.lambda$internalStream$10(OllamaChatModel.java:359)
	|_ Flux.doFinally ⇢ at org.springframework.ai.ollama.OllamaChatModel.lambda$internalStream$10(OllamaChatModel.java:360)
	|_ Flux.contextWrite ⇢ at org.springframework.ai.ollama.OllamaChatModel.lambda$internalStream$10(OllamaChatModel.java:363)
	|_ Flux.doOnSubscribe ⇢ at org.springframework.ai.chat.model.MessageAggregator.aggregate(MessageAggregator.java:73)
	|_ Flux.doOnNext ⇢ at org.springframework.ai.chat.model.MessageAggregator.aggregate(MessageAggregator.java:84)
	|_ Flux.doOnComplete ⇢ at org.springframework.ai.chat.model.MessageAggregator.aggregate(MessageAggregator.java:123)
	|_ Flux.doOnError ⇢ at org.springframework.ai.chat.model.MessageAggregator.aggregate(MessageAggregator.java:150)
	*__Flux.deferContextual ⇢ at org.springframework.ai.ollama.OllamaChatModel.internalStream(OllamaChatModel.java:293)
	|______Flux.defer ⇢ at org.springframework.ai.ollama.OllamaChatModel.lambda$internalStream$7(OllamaChatModel.java:340)
	|_ Flux.subscribeOn ⇢ at org.springframework.ai.ollama.OllamaChatModel.lambda$internalStream$7(OllamaChatModel.java:353)
	|____Flux.flatMap ⇢ at org.springframework.ai.ollama.OllamaChatModel.lambda$internalStream$10(OllamaChatModel.java:336)
	|_ Flux.doOnError ⇢ at org.springframework.ai.ollama.OllamaChatModel.lambda$internalStream$10(OllamaChatModel.java:359)
	|_ Flux.doFinally ⇢ at org.springframework.ai.ollama.OllamaChatModel.lambda$internalStream$10(OllamaChatModel.java:360)
	|_ Flux.contextWrite ⇢ at org.springframework.ai.ollama.OllamaChatModel.lambda$internalStream$10(OllamaChatModel.java:363)
	|_ Flux.doOnSubscribe ⇢ at org.springframework.ai.chat.model.MessageAggregator.aggregate(MessageAggregator.java:73)
	|_ Flux.doOnNext ⇢ at org.springframework.ai.chat.model.MessageAggregator.aggregate(MessageAggregator.java:84)
	|_ Flux.doOnComplete ⇢ at org.springframework.ai.chat.model.MessageAggregator.aggregate(MessageAggregator.java:123)
```
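For reference, the NPE comes from accumulating metadata durations without a null check: Ollama's streamed chunks may omit `eval_duration`, so the getter returns null. A minimal sketch of null-safe duration accumulation (hypothetical class and method names, not the actual Spring AI fix):

```java
import java.time.Duration;

public class DurationSum {
    // Treat a missing duration as zero instead of dereferencing null,
    // the kind of guard that avoids the NPE in the trace above.
    static Duration plusNullable(Duration base, Duration maybeNull) {
        return maybeNull == null ? base : base.plus(maybeNull);
    }

    public static void main(String[] args) {
        Duration total = Duration.ZERO;
        total = plusNullable(total, Duration.ofMillis(120)); // eval duration present
        total = plusNullable(total, null);                   // omitted in a streamed chunk
        System.out.println(total.toMillis()); // prints 120
    }
}
```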

Comment From: 192902649

```java
@Bean
public ChatClient ollamaChatClient(OllamaChatModel chatModel) {
    ChatClient.Builder builder = ChatClient.builder(chatModel);
    builder.defaultTools(new DateTimeTools());
    return builder.build();
}
```
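The `DateTimeTools` class registered above is not shown in the thread; a hypothetical sketch of such a class (in a real Spring AI app the method would carry `@org.springframework.ai.tool.annotation.Tool(description = "...")` so the model can invoke it; the annotation is omitted here to keep the sketch dependency-free):

```java
import java.time.LocalDateTime;

// Hypothetical sketch: a tools class exposing the current date and time.
public class DateTimeTools {
    // Returns the current local date-time as an ISO-8601 string.
    public String getCurrentDateTime() {
        return LocalDateTime.now().toString();
    }
}
```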

Comment From: sunyuhan1998

I'm guessing you're not using the latest version of the code? You can try the 1.1.0-SNAPSHOT version, as this issue was previously fixed in #3372.

Comment From: lihuagang03

```
java.lang.NullPointerException: Cannot invoke "java.time.Duration.plus(java.time.Duration)" because "evalDuration" is null
	at org.springframework.ai.ollama.OllamaChatModel.from(OllamaChatModel.java:174) ~[spring-ai-ollama-1.0.0.jar:1.0.0]
```

> I'm guessing you're not using the latest version of the code? You can try the 1.1.0-SNAPSHOT version, as this issue was previously fixed in #3372.

Upgraded Spring AI from 1.0.0 to 1.1.0-SNAPSHOT. @192902649 CC @sunyuhan1998

```xml
<!-- from -->
<spring-ai.version>1.0.0</spring-ai.version>

<!-- to -->
<spring-ai.version>1.1.0-SNAPSHOT</spring-ai.version>
```
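Note that SNAPSHOT builds are not published to Maven Central, so the pom also needs the Spring snapshot repository (standard Spring setup, not specific to this issue):

```xml
<repositories>
    <repository>
        <id>spring-snapshots</id>
        <name>Spring Snapshots</name>
        <url>https://repo.spring.io/snapshot</url>
        <releases>
            <enabled>false</enabled>
        </releases>
    </repository>
</repositories>
```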


Comment From: sunyuhan1998

@lihuagang03 In Spring AI 1.1.0-SNAPSHOT, the dependency coordinates for Ollama have been changed to:

```xml
<dependency>
    <groupId>org.springframework.ai</groupId>
    <artifactId>spring-ai-starter-model-ollama</artifactId>
</dependency>
```

Comment From: lihuagang03

> @lihuagang03 In Spring AI 1.1.0-SNAPSHOT, the dependency coordinates for Ollama have been changed to:
>
> ```xml
> <dependency>
>     <groupId>org.springframework.ai</groupId>
>     <artifactId>spring-ai-starter-model-ollama</artifactId>
> </dependency>
> ```

Yes, the stack trace prints the original source jar (spring-ai-ollama), but the Maven dependency I declare is the Boot starter, which pulls that jar in transitively.