[Bug]: Something wrong with fnllm / tenacity #1666
Labels
- awaiting_response — Maintainers or community have suggested solutions or requested info; awaiting filer response
- bug — Something isn't working
- triage — Default label assignment; indicates the issue needs review by a maintainer
Do you need to file an issue?
Describe the bug
Title: Error in tenacity, KeyError: 'idle_for' — something wrong with fnllm / tenacity
Description:
Hello, I hope this message finds you well.
I am building an index. When the pipeline enters the extract_graph flow, the entities and edges are extracted correctly, but an error occurs during the summarization step, in the summarize_descriptions_with_llm function.
Here is the relevant code snippet:
I have not modified this code.
However, I noticed a similar usage in graphrag/index/operations/extract_entities/graph_extractor.py:
That part works correctly, so I tried removing `model_parameters={"max_tokens": self._max_summary_length}`, but the error still persists.
I am using the deepseek-chat model, but it does not seem to be the cause. I suspect something is wrong in fnllm or tenacity.
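For context on where `idle_for` comes from: nothing in the summarization code itself references that key — it is a statistic tenacity's retryer tracks (the cumulative time it has slept between attempts), and fnllm drives that retryer for each LLM call. The sketch below is purely illustrative, not fnllm's actual code (`retry_call` and its `stats` dict are hypothetical names); it shows roughly what the retry loop maintains, and why a `KeyError: 'idle_for'` points at the statistics dict being read before it is initialized for the current call:

```python
# Hypothetical stdlib sketch of a tenacity-style async retry loop.
# 'idle_for' is the cumulative sleep time between attempts; if the dict
# were consulted before being initialized, reading/updating 'idle_for'
# would raise exactly the KeyError seen in this issue.
import asyncio


async def retry_call(fn, attempts=3, base_delay=0.01):
    """Call async `fn` up to `attempts` times with exponential backoff."""
    stats = {"attempt_number": 0, "idle_for": 0.0}  # initialized up front
    last_exc = None
    for i in range(attempts):
        stats["attempt_number"] = i + 1
        try:
            return await fn(), stats
        except Exception as exc:
            last_exc = exc
            delay = base_delay * (2 ** i)
            stats["idle_for"] += delay  # would KeyError if never initialized
            await asyncio.sleep(delay)
    raise last_exc
```

This is only meant to clarify what the traceback is about; the real bookkeeping lives inside tenacity and is managed by fnllm.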
Could you please help me identify and resolve this problem?
Thank you!
Steps to reproduce
Just initialize the project normally and run the indexing pipeline; no special steps are needed.
Expected Behavior
The summarization step completes successfully.
GraphRAG Config Used
Logs and screenshots
error:
Additional Information
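It may help maintainers to know the exact versions of the packages involved, since this could be a version-compatibility problem between fnllm and tenacity. A small diagnostic sketch (the package names to check are an assumption based on the traceback) for collecting them:

```python
# Diagnostic sketch: print installed versions of the packages suspected in
# this issue, so a known-bad fnllm/tenacity combination can be ruled out.
from importlib.metadata import PackageNotFoundError, version


def installed_version(package: str) -> str:
    """Return the installed version of `package`, or 'not installed'."""
    try:
        return version(package)
    except PackageNotFoundError:
        return "not installed"


if __name__ == "__main__":
    for pkg in ("graphrag", "fnllm", "tenacity", "openai"):
        print(f"{pkg}: {installed_version(pkg)}")
```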