Memory Feature #11
Based on testing, we might need to create a special deployment for this feature and run it only when the conversation finishes, as a post-processing step.

The system reviews the entire conversation, represented by the variable `chat_history`.

Required variables:
- `memories` (Array): holds the list of previously saved memory entries. Any duplicate information already present in this array should be discarded.
- `inputs.is_talk` (Boolean): activates the system only when set to `False`. When `True`, memory processing is skipped.

Outputs:
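The gating and deduplication rules above can be sketched as follows. This is a minimal illustration, not the actual deployment: `extract_candidates` is a hypothetical placeholder for the real extraction step (e.g. an LLM call over the finished conversation).

```python
from typing import List, Dict

def extract_candidates(chat_history: List[Dict[str, str]]) -> List[str]:
    # Hypothetical stand-in for the real extraction step:
    # here we just treat each user message as a candidate memory.
    return [m["content"] for m in chat_history if m.get("role") == "user"]

def process_memory(chat_history: List[Dict[str, str]],
                   memories: List[str],
                   is_talk: bool) -> List[str]:
    """Review a finished conversation and return new memory entries.

    Runs only when is_talk is False; candidates already present in
    `memories` (duplicates) are discarded.
    """
    if is_talk:  # memory processing is skipped while still talking
        return []
    seen = set(memories)
    new_entries = []
    for entry in extract_candidates(chat_history):
        if entry not in seen:
            seen.add(entry)
            new_entries.append(entry)
    return new_entries
```

For example, with `memories = ["User likes tea"]`, a conversation that again mentions "User likes tea" produces no new entries, while a fresh detail is returned for saving.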
Implement a memory feature for the LLM that allows it to store, update, and retrieve relevant information shared by users across conversations. The memory should persist important details, such as user preferences, key events, personal information, and recurring topics, to enhance contextual understanding in future interactions.
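As a rough sketch of the store/update/retrieve surface described above (the file path, JSON schema, and keyword-match retrieval are assumptions for illustration; a real implementation would likely use a database and embedding-based retrieval):

```python
import json
import os
from typing import List

class MemoryStore:
    """Minimal persistent store for user memories (illustrative sketch)."""

    def __init__(self, path: str = "memories.json"):
        self.path = path
        self.memories: List[str] = []
        if os.path.exists(path):
            with open(path) as f:
                self.memories = json.load(f)

    def store(self, entry: str) -> bool:
        # Discard duplicates, mirroring the dedup rule in the spec.
        if entry in self.memories:
            return False
        self.memories.append(entry)
        self._save()
        return True

    def update(self, old: str, new: str) -> bool:
        if old not in self.memories:
            return False
        self.memories[self.memories.index(old)] = new
        self._save()
        return True

    def retrieve(self, query: str) -> List[str]:
        # Naive case-insensitive keyword match; stand-in for semantic search.
        q = query.lower()
        return [m for m in self.memories if q in m.lower()]

    def _save(self) -> None:
        with open(self.path, "w") as f:
            json.dump(self.memories, f)
```

Persisting to disk is what lets details like preferences survive across conversations; re-opening the store in a later session reloads everything saved earlier.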