I wanted to get the number of tokens consumed by each call to the LLM, so I wrote a custom Advisor, but it does not work as expected.

Here is my definition of the Advisor (log below is a plain SLF4J logger):

public class CustomizeLoggerAdvisor implements CallAdvisor, StreamAdvisor {

private static final Logger log = LoggerFactory.getLogger(CustomizeLoggerAdvisor.class);

private Integer order;

@Override
public String getName() {
    return this.getClass().getSimpleName();
}

@Override
public int getOrder() {
    return null != order ? order : 1;
}

private ChatClientRequest before(ChatClientRequest request) {
    log.info("AI Request: {}", request.prompt().getContents());
    return request;
}

private void observeAfter(ChatClientResponse advisedResponse) {
    ChatResponse response = advisedResponse.chatResponse();
    if (null == response) {
        log.info("AI Response is null");
        return;
    }
    ChatResponseMetadata responseMetadata = response.getMetadata();
    Usage usage = responseMetadata.getUsage();
    log.info("total tokens:{}", usage.getTotalTokens());
    log.info("input tokens:{}", usage.getPromptTokens());
    log.info("output tokens:{}", usage.getCompletionTokens());
}

@Override
public ChatClientResponse adviseCall(ChatClientRequest advisedRequest, CallAdvisorChain chain) {
    advisedRequest = this.before(advisedRequest);
    ChatClientResponse advisedResponse = chain.nextCall(advisedRequest);
    this.observeAfter(advisedResponse);
    return advisedResponse;
}

@Override
public Flux<ChatClientResponse> adviseStream(ChatClientRequest advisedRequest,
                                             StreamAdvisorChain chain) {
    advisedRequest = this.before(advisedRequest);
    Flux<ChatClientResponse> advisedResponses = chain.nextStream(advisedRequest);
    // Aggregate the streamed chunks into one response before reading the usage metadata
    return new ChatClientMessageAggregator()
            .aggregateChatClientResponse(advisedResponses, this::observeAfter);
}
}

I registered the Advisor on the ChatClient:

ChatClient.builder(openedAiChatModel)
        .defaultAdvisors(new CustomizeLoggerAdvisor())
        .build();
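This is roughly how the client is then invoked (a minimal sketch; the chatClient variable and the prompt text are my own placeholders):

// Blocking path: adviseCall(...) runs and observeAfter(...) should log the usage.
String answer = chatClient.prompt()
        .user("What is Spring AI?")
        .call()
        .content();

// Streaming path: adviseStream(...) aggregates the chunks, then observeAfter(...) should log the usage.
Flux<String> stream = chatClient.prompt()
        .user("What is Spring AI?")
        .stream()
        .content();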

After calling the model, however, all the token counts logged were 0.
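For reference, the usage I am trying to read comes from the ChatResponse metadata; reading it directly, without the advisor, would look roughly like this (a sketch, with placeholder names):

ChatResponse response = chatClient.prompt()
        .user("What is Spring AI?")
        .call()
        .chatResponse();

// If the provider reports usage at all, these values come from the same metadata the advisor reads.
Usage usage = response.getMetadata().getUsage();
log.info("direct usage - total: {}, prompt: {}, completion: {}",
        usage.getTotalTokens(), usage.getPromptTokens(), usage.getCompletionTokens());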