
[Feature Request]: Access the GraphRAG prompts #1727

Open
pg018 opened this issue Feb 20, 2025 · 4 comments
Labels
enhancement New feature or request

Comments

@pg018

pg018 commented Feb 20, 2025

Do you need to file an issue?

  • I have searched the existing issues and this feature is not already filed.
  • My model is hosted on OpenAI or Azure. If not, please look at the "model providers" issue and don't file a new one here.
  • I believe this is a legitimate feature request, not just a question. If this is a question, please use the Discussions area.

Is your feature request related to a problem? Please describe.

Can we also store the prompts sent to the LLM?

Describe the solution you'd like

The prompts and their respective completions should be stored in a properly formatted way.

Additional context

No response

@pg018 pg018 added the enhancement New feature or request label Feb 20, 2025
@IT-Bill

IT-Bill commented Feb 20, 2025

I think the files in the cache folder are all you need.

@pg018
Author

pg018 commented Feb 20, 2025

For my use case I need to create a new graph each time a request comes in. I am using a temporary directory for generating and storing the graph.

When another request comes in, it throws an error: directory not found for cache. I think it is still mapping to the old one even though I regenerate the configuration.

@natoverse
Collaborator

The prompt templates are exported as text files in the project root, which is useful for editing/tuning, but those are not the per-call prompts.

As @IT-Bill mentions, we do store the entire context in the cache, so the prompts are available, but not really intended for direct consumption. It sounds like your use case may require dynamically updating the config to point to the correct place each time. That's not really something we'll be able to put into the library directly, but with some scripting it seems like you should be able to put the cache entries to use.
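The scripting approach suggested above could look something like the sketch below. It walks a cache directory, parses every JSON-serialized entry it finds, and surfaces the keys so you can locate the prompt and completion fields. Note this is an assumption-laden sketch, not GraphRAG's API: the function name `load_cache_entries` is hypothetical, and the exact schema of a cache entry should be checked against a real cache folder from your installed version.

```python
# Hypothetical helper: collect JSON-serialized LLM calls from a GraphRAG
# cache directory. The entry schema (which keys hold the prompt and the
# completion) is an assumption -- inspect a real cache folder to confirm.
import json
from pathlib import Path


def load_cache_entries(cache_dir: str) -> list[dict]:
    """Parse every JSON file under cache_dir into a dict, silently
    skipping files that are not valid JSON."""
    entries: list[dict] = []
    root = Path(cache_dir)
    if not root.is_dir():
        return entries
    for path in root.rglob("*"):
        if not path.is_file():
            continue
        try:
            entries.append(json.loads(path.read_text(encoding="utf-8")))
        except (json.JSONDecodeError, UnicodeDecodeError):
            continue  # not a JSON cache entry
    return entries


if __name__ == "__main__":
    for entry in load_cache_entries("./cache"):
        # Print the top-level keys so you can see where the prompt and
        # completion live in your version's cache format.
        print(sorted(entry.keys()))
```

From there, dumping the prompt/completion pairs into whatever "properly formatted" store the request asks for is a straightforward post-processing step.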

@pg018
Author

pg018 commented Feb 21, 2025

I delete the old cache directory (the temp directory) once graph creation is done. But looking at the GraphRAG source code, it internally creates a cache factory and stores the file locations.
So even if I point it to the correct place, it still looks for the old, deleted cache files, which is why I always have to disable the cache when using it.
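One workaround for this per-request pattern, rather than deleting a shared cache between runs, is to give every run its own fresh cache directory so stale paths can never be looked up. The sketch below shows the idea; the `make_run_config` helper is hypothetical, and the `cache`/`type`/`base_dir` keys mirror GraphRAG's YAML cache section but should be verified against your installed version.

```python
# Hypothetical per-request setup: each indexing run gets a brand-new
# cache directory, so a previously deleted cache is never referenced.
# The settings keys are modeled on GraphRAG's YAML cache section and
# should be checked against your installed version.
import tempfile


def make_run_config(base_settings: dict) -> dict:
    """Return a copy of base_settings whose cache section points at a
    freshly created temporary directory."""
    settings = dict(base_settings)
    cache_dir = tempfile.mkdtemp(prefix="graphrag-cache-")
    settings["cache"] = {"type": "file", "base_dir": cache_dir}
    return settings


# Usage: build the config anew for every incoming request.
cfg = make_run_config({"llm": {"model": "gpt-4o"}})
```

The temporary directories can then be cleaned up after the run completes, since no later run will ever resolve paths inside them.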
