I am working on an agentic workflow project using Spring AI, and I have a question/problem with ChatMemory.

Question: How can I implement memory in a workflow?

Example: I have an agentic workflow that includes the following steps:
1. Perform a GoogleSearch based on user input.
2. Send the Google results to an LLM (e.g., OpenAI) to summarize them into a short article.
3. Return the summarized result.

Workflow: Start→GoogleSearch→OpenAI LLM→End

Problem: When I run this workflow and enter "NBA" the first time, it returns an article about the NBA. However, when I then input "please reduce the content of previous answer to 20 words", the workflow does not retain any memory. Instead, it sends "please reduce the content of previous answer to 20 words" to Google and generates a new article.

Is there any way to solve this problem? I would greatly appreciate any responses.

Comment From: alexcheng1982

Spring AI provides built-in advisors to support chat memory. In your case, you can use either MessageChatMemoryAdvisor or PromptChatMemoryAdvisor. The configuration may look like the following (untested code):

@Configuration
public class AppConfiguration {
  @Bean
  ChatMemory chatMemory() {
    return new InMemoryChatMemory(); // Use in-memory chat memory
  }

  @Bean
  MessageChatMemoryAdvisor messageChatMemoryAdvisor(ChatMemory chatMemory) {
    return new MessageChatMemoryAdvisor(chatMemory); // create the advisor
  }

  @Bean
  ChatClient chatClient(ChatClient.Builder builder, MessageChatMemoryAdvisor advisor) {
    return builder.defaultAdvisors(advisor).build(); // create a ChatClient
  }
}
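For completeness, here is how such a ChatClient might be called so that a follow-up question shares memory with the first one. This is an untested sketch: the conversation-ID parameter key has changed across Spring AI versions (e.g. `CHAT_MEMORY_CONVERSATION_ID_KEY` in older milestones, `ChatMemory.CONVERSATION_ID` in 1.0), so check the key name against your version.

```java
// First question; the advisor records it under the conversation ID.
String article = chatClient.prompt()
    .user("NBA")
    .advisors(a -> a.param(ChatMemory.CONVERSATION_ID, "user-123")) // key name varies by version
    .call()
    .content();

// The follow-up reuses the same conversation ID, so the previous
// answer is sent back to the model as context.
String shorter = chatClient.prompt()
    .user("please reduce the content of previous answer to 20 words")
    .advisors(a -> a.param(ChatMemory.CONVERSATION_ID, "user-123"))
    .call()
    .content();
```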

Comment From: FredLiu0813

Thanks for your answers.

I am considering a solution: before invoking the Flow, I send the Flow's description along with the user's question to the LLM. Let the LLM determine if the question is suitable for the Flow.

If it is not suitable, I will use the LLM's response directly. If it is suitable, I will proceed with the Flow.

In this process, each round of Chat uses ChatMemory and MessageChatMemoryAdvisor.

I am not sure if this approach is correct.

Comment From: alexcheng1982

From your description, you only need to determine whether the flow should be used the first time the user interacts with your service. You can use another advisor to call the LLM and decide whether the flow applies. The result should be stored somewhere; based on it, you can choose to bypass ChatMemory in the advisor implementation.
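To make the routing idea concrete, here is a minimal, self-contained sketch in plain Java. `FlowRouter` and its `Function`-typed `llm` and `flow` parameters are hypothetical names: in a real application, `llm` would wrap a ChatClient call (with the chat-memory advisor attached) and `flow` would wrap the GoogleSearch → summarize pipeline.

```java
import java.util.function.Function;

// Hypothetical router: asks the LLM whether the question fits the flow,
// then either runs the flow or returns the LLM's direct answer.
public class FlowRouter {
    private final Function<String, String> llm;   // stands in for a ChatClient call
    private final Function<String, String> flow;  // stands in for search + summarize
    private final String flowDescription;

    public FlowRouter(Function<String, String> llm,
                      Function<String, String> flow,
                      String flowDescription) {
        this.llm = llm;
        this.flow = flow;
        this.flowDescription = flowDescription;
    }

    public String handle(String userInput) {
        // Send the flow's description along with the user's question,
        // as proposed above, and let the LLM decide.
        String verdict = llm.apply(
            "Flow description: " + flowDescription + "\n"
            + "User question: " + userInput + "\n"
            + "Answer YES if the flow should handle this question; "
            + "otherwise answer the question directly.");
        if (verdict.trim().equalsIgnoreCase("YES")) {
            return flow.apply(userInput); // new topic: run search + summarize
        }
        return verdict;                   // follow-up: the LLM's direct answer
    }
}
```

Each round of chat (both the routing call and the direct answer) can still go through a ChatClient configured with ChatMemory, so follow-ups like "reduce to 20 words" see the previous answer.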


Comment From: FredLiu0813

How do I use "another advisor to call the LLM and determine if the flow should be used"? Can you help me with some sample code?

Comment From: matosoma123

@alexcheng1982, is it possible to set MessageChatMemoryAdvisor for BedrockAnthropic3ChatModel?

Comment From: ThomasVitale

@FredLiu0813 the new documentation includes examples of how ChatMemory can be used with a ChatModel directly, instead of having to use ChatClient + Advisors: https://docs.spring.io/spring-ai/reference/api/chat-memory.html#_memory_in_chat_model

There's also an example combining chat memory and tool calling: https://docs.spring.io/spring-ai/reference/api/tools.html#_user_controlled_tool_execution
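Following the linked documentation, using ChatMemory directly with a ChatModel looks roughly like this. This is an untested sketch whose method names follow the Spring AI 1.0 API; check them against your version.

```java
// Keep a sliding window of messages per conversation.
ChatMemory chatMemory = MessageWindowChatMemory.builder().build();
String conversationId = "conv-1";

// First turn: store the user message, send the whole history to the model.
chatMemory.add(conversationId, new UserMessage("NBA"));
ChatResponse response = chatModel.call(new Prompt(chatMemory.get(conversationId)));
chatMemory.add(conversationId, response.getResult().getOutput());

// Second turn: the follow-up now carries the previous answer as context,
// so "reduce to 20 words" refers to the stored article.
chatMemory.add(conversationId,
    new UserMessage("please reduce the content of previous answer to 20 words"));
ChatResponse shorter = chatModel.call(new Prompt(chatMemory.get(conversationId)));
chatMemory.add(conversationId, shorter.getResult().getOutput());
```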

Comment From: ilayaperumalg

@matosoma123

is it possible to set MessageChatMemoryAdvisor for BedrockAnthropic3ChatModel ?

Sorry about the delayed response. The BedrockAnthropic3ChatModel was removed in favor of the BedrockConverse chat model, and you can use MessageChatMemoryAdvisor with the BedrockConverse chat model.

Comment From: ilayaperumalg

@ThomasVitale @alexcheng1982 Thanks for your inputs. Closing this for now. Please feel free to re-open if you still need clarification.