Expected Behavior

Logprobs for Vertex AI have been Generally Available since June 09, 2025.

See the Google Cloud Generative AI reference on how to request logprobs.

Logprobs allow users to gauge a model's confidence in its outputs. It's desirable to have that information available if the user wants it.

Below is an example of how this could be used:

var chatOptions = VertexAiGeminiChatOptions.builder()
    .responseLogprobs(true)
    .logProbs(1) // can range from 1 to 20
    .temperature(0.5)
    .model("gemini-2.0-flash-lite")
    .build();

var logprobs = (VertexAiGeminiApi.LogProbs) this.chatModel
    .call(new Prompt("Explain Bulgaria? Answer in 10 paragraphs.", chatOptions))
    .getResult()
    .getOutput()
    .getMetadata()
    .get("logprobs");

Alternatively, the same value can be extracted null-safely:

VertexAiGeminiApi.LogProbs logprobs = Optional.of(response)
    .map(ChatResponse::getResult)
    .map(Generation::getMetadata)
    .map(metadata -> metadata.get("logprobs"))
    .map(VertexAiGeminiApi.LogProbs.class::cast)
    .orElse(null);

System.out.println(logprobs.avgLogprobs());

logprobs.chosenCandidates().stream()
    .map(VertexAiGeminiApi.LogProbs.Content::logprob)
    .forEach(System.out::println);

logprobs.topCandidates().stream()
    .map(VertexAiGeminiApi.LogProbs.TopContent::candidates)
    .flatMap(List::stream)
    .map(VertexAiGeminiApi.LogProbs.Content::logprob)
    .forEach(System.out::println);

Current Behavior

The current version doesn't handle the newly added logprob-related request and response fields.
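For context, the response shapes implied by the example above could be modeled roughly as the records below. This is only a sketch: the names (`LogProbs`, `Content`, `TopContent`) and fields mirror the accessors used in the example (`avgLogprobs()`, `chosenCandidates()`, `topCandidates()`, `logprob()`) and are assumptions, not the actual Spring AI types.

```java
import java.util.List;

// Hypothetical data model mirroring the accessors used in the example above;
// not the real VertexAiGeminiApi types.
public class LogProbsSketch {

    // One token the model emitted (or considered), with its log probability.
    record Content(String token, double logprob) {}

    // The top-N alternative candidates at a given output position.
    record TopContent(List<Content> candidates) {}

    // Aggregate logprob info for a whole response.
    record LogProbs(double avgLogprobs,
                    List<Content> chosenCandidates,
                    List<TopContent> topCandidates) {}

    public static void main(String[] args) {
        var lp = new LogProbs(
                -0.12,
                List.of(new Content("Sofia", -0.12)),
                List.of(new TopContent(List.of(new Content("Sofia", -0.12)))));
        System.out.println(lp.avgLogprobs());
    }
}
```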

Context

This implementation will allow us to obtain logprobs from Google Vertex AI, being able to assess the model's confidence in its answers.

From https://docs.together.ai/docs/logprobs:

Logprobs, short for log probabilities, are logarithms of probabilities that indicate the likelihood of each token occurring based on the previous tokens in the context. They allow users to gauge a model's confidence in its outputs and explore alternative responses considered by the model and are beneficial for various applications such as classification tasks, retrieval evaluations, and autocomplete suggestions. One big use case of using logprobs is to assess how confident a model is in its answer.
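To make the definition above concrete, a logprob can be turned back into a plain probability with `Math.exp`: a logprob of 0.0 means the token had probability 1.0, and more negative values mean lower confidence. A minimal sketch (the helper name is ours, for illustration):

```java
// Sketch: converting a logprob back to a probability to gauge confidence.
public class LogprobDemo {

    // exp(logprob) recovers the token's probability in [0, 1].
    static double toProbability(double logprob) {
        return Math.exp(logprob);
    }

    public static void main(String[] args) {
        System.out.println(toProbability(0.0));    // 1.0 -> fully confident
        System.out.println(toProbability(-0.693)); // ~0.5 -> a coin flip
        System.out.println(toProbability(-4.6));   // ~0.01 -> very unlikely
    }
}
```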

This will help us improve the quality of the answers provided to customers.

This feature is already available for OpenAI, but we need it available for Vertex AI.