
Memory Feature #11

Open · hectoritr opened this issue Oct 9, 2024 · 2 comments

hectoritr (Contributor) commented Oct 9, 2024

Implement a memory feature for the LLM that allows it to store, update, and retrieve relevant information shared by users across conversations. The memory should persist important details, such as user preferences, key events, personal information, and recurring topics, to enhance contextual understanding in future interactions.

  • Identify and Store Relevant Data: Capture important user inputs, such as preferences, goals, biographical details, and frequently discussed topics. Avoid storing unnecessary or sensitive information unless explicitly requested by the user.
  • Update and Manage Stored Information: Allow the LLM to update stored information as conversations evolve (e.g., change in user preferences, new events).
  • Retrieve Memory Context: Retrieve relevant stored data when necessary to maintain context in future conversations and enhance responses.
  • Token Efficiency: Ensure that the memory retrieval and updates are handled efficiently to minimize token usage and processing costs.
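The store/update/retrieve cycle above can be sketched as a small in-memory class. This is a minimal illustration, not the issue's actual design: the class name, the dict-based storage, and the keyword-based retrieval are all assumptions (a real implementation would likely use embeddings for retrieval and an LLM call to decide what is worth storing).

```python
from dataclasses import dataclass, field

@dataclass
class MemoryStore:
    """Minimal sketch of the store/update/retrieve cycle described above."""
    memories: dict[int, str] = field(default_factory=dict)
    _next_id: int = 1

    def store(self, memory: str) -> int:
        """Save a new memory unless an identical one already exists."""
        for mem_id, text in self.memories.items():
            if text == memory:
                return mem_id  # duplicate: keep the existing entry
        mem_id = self._next_id
        self.memories[mem_id] = memory
        self._next_id += 1
        return mem_id

    def update(self, mem_id: int, memory: str) -> None:
        """Overwrite an existing memory as user details evolve."""
        self.memories[mem_id] = memory

    def retrieve(self, keyword: str) -> list[str]:
        """Naive keyword retrieval; a real system might use embeddings."""
        return [m for m in self.memories.values() if keyword.lower() in m.lower()]
```

Deduplicating at store time (rather than at retrieval time) also helps the token-efficiency goal, since the memories array never grows with repeated facts.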
@hectoritr hectoritr self-assigned this Oct 9, 2024
hectoritr (Contributor, Author) commented:

Based on testing, we might need to create a special deployment for this feature and send the conversation to it only when the conversation finishes, for processing.

@hectoritr hectoritr added this to the Fluid Mind Alpha milestone Oct 18, 2024
hectoritr (Contributor, Author) commented:

The system reviews the entire conversation, represented by the variable chat_history.
For each interaction in chat_history, both the user's input and the assistant's response are considered to determine whether a relevant memory should be generated.
If a relevant piece of information is identified but already exists in the memories array, it is discarded to avoid duplication.
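The review pass described above can be sketched as a single loop. Here `generate_memory` is a hypothetical callable standing in for the LLM step that turns one interaction into a candidate memory string (or None); the turn keys `"user"` and `"assistant"` and the id-assignment scheme are likewise assumptions, not part of the issue's spec.

```python
def extract_memories(chat_history, memories, generate_memory):
    """Review each interaction and append new, non-duplicate memories.

    chat_history: list of {"user": ..., "assistant": ...} turns.
    memories: list of {"id": int, "memory": str} entries (mutated in place).
    generate_memory: hypothetical LLM step returning a memory string or None.
    """
    existing = {m["memory"] for m in memories}
    next_id = max((m["id"] for m in memories), default=0) + 1
    for turn in chat_history:
        candidate = generate_memory(turn.get("user", ""), turn.get("assistant", ""))
        if candidate and candidate not in existing:
            memories.append({"id": next_id, "memory": candidate})
            existing.add(candidate)
            next_id += 1
    return memories
```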

Required Variables:
Inputs:
chat_history (Array): Contains the history of the conversation, with user input and assistant responses.

memories (Array): Holds the list of previously saved memory entries. Any duplicate information found in this array should be discarded.

[
  { "id": 1, "memory": "User loves hiking." }
]

inputs.is_talk (Boolean): The system activates only when this is set to False; when True, memory processing is skipped.

Outputs:
memories (Array): An array of objects, each containing a memory with the following structure:
id (Number): A unique identifier for each memory.
memory (String): The relevant information extracted from the conversation.
If no memory is generated, the system should return "NaM" in place of the memory content.

{
  "replies": [
    { "id": 1, "memory": "User loves hiking." },
    { "id": 2, "memory": "User prefers concise lists." }
  ]
}
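The full contract (the is_talk gate, deduplication against the existing memories array, and the "NaM" fallback) can be sketched end to end. Everything here beyond the spec is an assumption: `generate_memory` is a hypothetical per-turn LLM step, and the exact placement of the "NaM" sentinel in the reply payload is one reading of the spec, since the issue does not pin down the empty-result shape.

```python
def process_conversation(chat_history, memories, is_talk, generate_memory):
    """End-to-end sketch of the contract described above."""
    if is_talk:
        return None  # memory processing is skipped when is_talk is True
    existing = {m["memory"] for m in memories}
    next_id = max((m["id"] for m in memories), default=0) + 1
    new_entries = []
    for turn in chat_history:
        candidate = generate_memory(turn)  # hypothetical LLM step
        if candidate and candidate not in existing:
            new_entries.append({"id": next_id, "memory": candidate})
            existing.add(candidate)
            next_id += 1
    if not memories and not new_entries:
        return {"replies": "NaM"}  # sentinel placement is an assumption
    return {"replies": memories + new_entries}
```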
