Chat Memory
In xFlow, Chat Memory allows AI agents to retain context across multiple interactions, which is essential for coherent and contextually relevant conversations. Chat Memory supports several memory configurations that can be customized to fit specific requirements. This documentation describes the supported memory types and how to configure them.
Chat Memory Properties
Memory ID
Name: memoryId
Type: string
Description: The unique identifier used to store and retrieve the agent's memory. This ID ensures that the conversation history is correctly associated with the specific AI agent.
Example:
memoryId: "session123"
Memory Type
Name: memoryType
Type: string
Description: Defines the type of memory to use. The supported types are message_window and token_window. The default type is message_window.
Enum Values:
message_window: Retains a fixed number of messages.
token_window: Retains a fixed number of tokens.
Default: message_window
Example:
memoryType: "token_window"
Max Messages
Name: maxMessages
Type: integer
Description: The maximum number of messages to retain in memory. If the memory exceeds this limit, the oldest messages are evicted. This property is used when the memory type is message_window.
Example:
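For instance, to retain only the 20 most recent messages (the value 20 is illustrative):
maxMessages: 20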
Max Tokens
Name: maxTokens
Type: integer
Description: The maximum number of tokens to retain in memory. The memory will retain as many of the most recent messages as can fit within the maxTokens limit. If an old message does not fit within the limit, it is completely evicted. This property is used when the memory type is token_window.
Example:
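For instance, to cap the memory at 1000 tokens (an illustrative limit):
maxTokens: 1000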
Types of Chat Memory
Message Window Memory
message_window memory retains a fixed number of recent messages. This type of memory is useful when the exact number of past interactions needs to be remembered, regardless of the length of the individual messages.
Configuration Example:
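A minimal sketch combining the properties described above (the maxMessages value is illustrative):
memoryId: "session123"
memoryType: "message_window"
maxMessages: 20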
Token Window Memory
token_window memory retains a fixed number of tokens. This type of memory is beneficial when controlling the total size of the memory is more important than the number of messages. It ensures that the memory does not exceed a certain number of tokens, making it easier to manage the cost and performance of the AI agent.
Configuration Example:
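A minimal sketch combining the properties described above (the maxTokens value is illustrative):
memoryId: "session123"
memoryType: "token_window"
maxTokens: 1000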
Usage Example
Here is an example of how Chat Memory might be configured in an AI Agent State within an xFlow workflow:
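Only the chat memory properties (memoryId, memoryType, maxTokens) are defined on this page; the surrounding state keys in the sketch below are hypothetical placeholders, so consult the AI Agent State spec for the exact schema:
name: "support-agent"            # hypothetical state name
type: "ai-agent"                 # hypothetical state type
chatMemory:                      # hypothetical wrapper key for the properties on this page
  memoryId: "session123"
  memoryType: "token_window"
  maxTokens: 1000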
For more detailed information and advanced configurations, refer to the AI Agent State spec.