For my use case, I need to create a new graph each time a request comes in. I generate and store the graph in a temporary directory.
When another request comes in, it throws an error: "Directory not found for cache." I think it is still mapping to the old directory even though I regenerate the configuration each time.
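For context, the per-request flow described above might look like this minimal Python sketch. `build_graph` is a hypothetical stand-in for the actual indexing entry point; the point is that the config (including the cache path) must be rebuilt to reference the fresh temporary directory on every request:

```python
import tempfile


def build_graph(config: dict) -> str:
    # Hypothetical stand-in for the real graph-building call.
    return f"graph built with cache at {config['cache_dir']}"


def handle_request() -> str:
    # A fresh temporary directory per request; it is removed when the
    # context manager exits, so a stale config pointing here would fail
    # with exactly the "directory not found" symptom described above.
    with tempfile.TemporaryDirectory() as tmp:
        config = {"cache_dir": tmp}  # regenerate the config every time
        return build_graph(config)
```

If any component caches the old config between requests, the deleted temp path will still be referenced, which matches the error described.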
The prompt templates are exported as text files in the project root, which is useful for editing/tuning, but those are not the per-call prompts.
As @IT-Bill mentions, we do store the entire context in the cache, so the prompts are available, but not really intended for direct consumption. It sounds like your use case may require dynamically updating the config to point to the correct place each time. That's not really something we'll be able to put into the library directly, but with some scripting it seems like you should be able to put the cache entries to use.
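As a rough illustration of the "some scripting" idea, one could walk the cache directory and pull the prompts out of the stored entries. The entry layout assumed here (JSON files with an "input" field holding the prompt) is a guess and may differ from what the library actually writes, so adjust the key names to the real format:

```python
import json
from pathlib import Path


def extract_prompts(cache_dir: str) -> list[str]:
    """Collect prompt text from cache entries.

    Assumes each cache entry is a JSON file whose "input" field holds
    the prompt sent to the LLM -- a hypothetical layout to adapt.
    """
    prompts = []
    for path in Path(cache_dir).rglob("*"):
        if not path.is_file():
            continue
        try:
            entry = json.loads(path.read_text(encoding="utf-8"))
        except (json.JSONDecodeError, UnicodeDecodeError):
            continue  # skip non-JSON artifacts in the cache dir
        if isinstance(entry, dict) and "input" in entry:
            prompts.append(entry["input"])
    return prompts
```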
I delete the old cache directory (the temp directory) once graph creation is done. But looking at the graphrag source code, it internally creates a cache factory and stores the file locations...
So even if I point it to the correct place, it still looks for the old, deleted cache files, which is why I always have to disable the cache when using it.
Do you need to file an issue?
Is your feature request related to a problem? Please describe.
Can we also store the prompts sent to the LLM?
Describe the solution you'd like
The prompts and their respective completions should be stored in a properly formatted way.
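One "properly formatted" option would be an append-only JSONL log, one record per LLM call. The schema below (field names `prompt` and `completion`) is just a suggestion, not anything the library currently provides:

```python
import json


def log_call(log_path: str, prompt: str, completion: str) -> None:
    # Append one JSON record per LLM call (hypothetical schema).
    record = {"prompt": prompt, "completion": completion}
    with open(log_path, "a", encoding="utf-8") as f:
        f.write(json.dumps(record) + "\n")
```

JSONL keeps each call self-contained and is easy to grep or load line by line for tuning.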
Additional context
No response