Bug description
After migrating from M6 to the stable release and adjusting the parameters/dependencies accordingly, Observability exposes input and output as null.
Environment
Java 21, Spring AI 1.0.0
Steps to reproduce
M6:
spring.ai.chat.observations.include-prompt=true
spring.ai.chat.observations.include-completion=true
↓
stable:
spring.ai.chat.observations.log-prompt=true
spring.ai.chat.observations.log-completion=true
management.tracing.sampling.probability=1.0
management.observations.annotations.enabled=true
spring.ai.chat.client.observations.log-prompt=true
Minimal Complete Reproducible example
stable:
M6:
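(No reproducer code was attached above. As an illustration only, a minimal call that would exercise these observation handlers on stable 1.0.0 might look like the following; the `ChatModel` bean is assumed to be auto-configured by a model starter, and the class/method names follow the public ChatClient fluent API.)

```java
import org.springframework.ai.chat.client.ChatClient;
import org.springframework.ai.chat.model.ChatModel;

public class Repro {

    // Any chat call should do, since the prompt/completion observation
    // handlers fire on every chat-model invocation. With the stable
    // log-prompt/log-completion properties set, the content is expected
    // in the trace but shows up as null.
    public static String ask(ChatModel chatModel) {
        return ChatClient.create(chatModel)
            .prompt()
            .user("Hello")
            .call()
            .content();
    }
}
```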
Comment From: dev-jonghoonpark
This issue seems to be related: https://github.com/spring-projects/spring-ai/issues/3257
Comment From: zinit
This issue seems to be related: #3257
Somewhat, yes... reading that PR, I can see that the issue lies exactly here:
M6:
public void onStop(ChatModelObservationContext context) {
    TracingObservationHandler.TracingContext tracingContext = context
        .get(TracingObservationHandler.TracingContext.class);
    Span otelSpan = TracingHelper.extractOtelSpan(tracingContext);
    if (otelSpan != null) {
        // The prompt content is attached to the OTel span as an event,
        // so external observability backends receive it.
        otelSpan.addEvent(AiObservationEventNames.CONTENT_PROMPT.value(),
            Attributes.of(AttributeKey.stringArrayKey(AiObservationAttributes.PROMPT.value()),
                ChatModelObservationContentProcessor.prompt(context)));
    }
}
stable:
public void onStop(ChatModelObservationContext context) {
    // The prompt is only written to the application log; no span event is
    // recorded, so tracing backends see null input/output.
    logger.info("Chat Model Prompt Content:\n{}", ObservabilityHelper.concatenateStrings(prompt(context)));
}
Anyway, for now, I'm reverting to the previous implementation via custom ChatPrompt/CompletionObservationHandlers, and I'm eagerly looking forward to restored compatibility with external observability tools on the Spring AI side.
Comment From: DannySortino
Just to add: the same issue is present in both handlers.

ChatModelPromptContentObservationHandler - for the prompt (input) side.
ChatModelCompletionObservationHandler - for the completion (output) side.

Just reverting the implementation locally and loading it into the context as a bean was enough for me to at least see the output content on my spans in observability platforms.
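The workaround described above can be sketched roughly as follows. This is a hedged sketch, not Spring AI's implementation: the configuration class name is invented, the span-event plumbing simply mirrors the M6 `onStop` code quoted earlier, and it assumes Spring Boot auto-registers `ObservationHandler` beans with the `ObservationRegistry` (as it does for Micrometer observations).

```java
import io.micrometer.observation.Observation;
import io.micrometer.observation.ObservationHandler;
import io.micrometer.tracing.handler.TracingObservationHandler;
import io.opentelemetry.api.common.AttributeKey;
import io.opentelemetry.api.common.Attributes;
import io.opentelemetry.api.trace.Span;
import org.springframework.ai.chat.observation.ChatModelObservationContext;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

// Hypothetical workaround config: restores the M6 behavior of attaching
// prompt content to the trace span, by registering a local handler bean.
@Configuration
public class ObservabilityWorkaroundConfig {

    @Bean
    ObservationHandler<ChatModelObservationContext> promptContentSpanHandler() {
        return new ObservationHandler<>() {

            @Override
            public boolean supportsContext(Observation.Context context) {
                return context instanceof ChatModelObservationContext;
            }

            @Override
            public void onStop(ChatModelObservationContext context) {
                // Same idea as the M6 handler quoted above: pull the tracing
                // context and attach the prompt to the span as an event.
                // TracingHelper, AiObservationEventNames, etc. refer to the
                // (copied) M6-era classes, not a published stable API.
                TracingObservationHandler.TracingContext tracingContext = context
                    .get(TracingObservationHandler.TracingContext.class);
                Span otelSpan = TracingHelper.extractOtelSpan(tracingContext);
                if (otelSpan != null) {
                    otelSpan.addEvent(AiObservationEventNames.CONTENT_PROMPT.value(),
                        Attributes.of(AttributeKey.stringArrayKey(AiObservationAttributes.PROMPT.value()),
                            ChatModelObservationContentProcessor.prompt(context)));
                }
            }
        };
    }
}
```

An analogous handler for the completion (output) side would do the same with the completion event name and `ChatModelObservationContentProcessor.completion(context)`.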