Feature: Saved information
under review
André Mikalsen
I came across a feature I liked very much on another platform for saving important information, and I see a clear use case for improving productivity with agents/assistants.
A centralized place to store information that is not bound to variables inside an agent.
I see this being a great feature when chatting with an agent (just type # to pull up saved information).
We have noticed interest from client-based businesses, such as lawyers and accountants, in saving information about their clients, so that questions to the LLMs become more relevant without having to type or copy/paste it each time.
Take a look at how copy.ai solved this with their Infobase feature (https://www.copy.ai/blog/infobase-by-copyai).
Chenhe Gu
under review
Pascal M
Great idea. In short: allow Dify agents to read and write user-specific knowledge.
Currently, knowledge is stored for the whole app, but one could allow the agent assistant to access knowledge as a tool with read and write permissions.
The tool would require the ID of the current user chatting with the agent, to ensure that the agent only reads and writes data for that user. A "User Knowledge" store is then automatically synced to the app and is only active in that user's sessions.
For retrieval, the Dify agent would use a tool to query for data, so the stored data is not fetched on every turn by the default RAG pipeline but only when the LLM decides the user wants it. Since the tool receives the user's prompt, it can trigger the RAG pipeline and provide the result to the agent.
Knowledge could get a lifetime (TTL) so that it is deleted automatically after some time.
This feature could be extended with a frontend component that allows the user to manually query, update, and delete their knowledge.
An alternative would be to extend integrations like Notion to let users sync their Notion space. That way, the "User Knowledge" is not managed inside Dify but by the user in an existing tool like Notion, Dropbox, you name it. Again, the agent would decide when to query the User Knowledge (in this case Notion), but the user needs to place the data there.
I think I like the first idea more because it feels more like "AI magic" and allows saying "Store the following information for me: ..." and "Search for information about ...", which would let the LLM bring this data from the User Knowledge into the chat history, where it can be used as context for the conversation.
André Mikalsen I did not check what copy.ai is doing, but I hope this concept matches your idea.
André Mikalsen
Pascal M Thanks, Pascal, for the additional feedback. In my opinion, this would work best as a small frontend feature that saves to the DB for the tenant/workspace. Then we would not rely on the LLM's tool retrieval and risk that logic failing.
Example:
I'm an accountant who wants to regularly use Assistants/Agents to discuss the finances of clients A, B, and C.
If I could save information in this new feature like:
#Client A
"Client A is a big tech company with 400 employees, they did 10 million in revenue last year, etc."
then go to chat with an Assistant, I could write "Given #Client A, how should they reduce tax this year when they have an 80% increase in revenue?"
The Prompt to the LLM would then be:
Given Client A is a big tech company with 400 employees, they did 10 million in revenue last year, etc., how should they reduce tax this year when they have an 80% increase in revenue?
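The expansion in this example could be done entirely frontend-side, as plain text substitution before the prompt is sent to the LLM, with no tool call involved. A minimal sketch, where `saved_info` and `expand_references` are hypothetical names, not part of any existing product:

```python
# Hypothetical saved-information store, keyed by the name the user
# types after "#". In the real feature this would live in the DB
# for the tenant/workspace.
saved_info = {
    "Client A": ("Client A is a big tech company with 400 employees, "
                 "they did 10 million in revenue last year"),
}


def expand_references(prompt: str, info: dict[str, str]) -> str:
    """Replace each "#Key" in the prompt with the saved text.

    Unknown references are left untouched, so the LLM still sees them.
    """
    for key, text in info.items():
        prompt = prompt.replace("#" + key, text)
    return prompt


message = "Given #Client A, how should they reduce tax this year?"
expanded = expand_references(message, saved_info)
```

Because this is pure string substitution, it is deterministic and cannot fail the way LLM-driven tool retrieval can, which matches the preference for a frontend feature.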
This could have very broad utility for users, e.g. brand guidelines or templates for social media posts.