This repository has been archived by the owner on Nov 13, 2024. It is now read-only.
context engine docstring #130
Merged
Changes from 2 commits
@@ -24,6 +24,25 @@ async def aquery(self, queries: List[Query], max_context_tokens: int, ) -> Context:

class ContextEngine(BaseContextEngine):
    """
    ContextEngine is responsible for providing context to the LLM, given a set of search queries.

    Once called with a set of queries, the ContextEngine will go through the following steps:
    1. Query the knowledge base for relevant documents
    2. Build a context from the retrieved documents that can be injected into the LLM prompt

    The context engine considers token budgeting when building the context, and tries to maximize the amount of relevant information that can be provided to the LLM within the token budget.

    To create a context engine, you must provide a knowledge base and, optionally, a context builder.
[Review comment] Maybe we should also mention that the returned …
[Review comment] IMO it's too deep to put in this layer of docstrings; it's something to look at in the context builder, I think, because it's relevant only once you want to customize it.
    Example:
        >>> from canopy.context_engine import ContextEngine
        >>> from canopy.models.data_models import Query
        >>> context_engine = ContextEngine(knowledge_base=knowledge_base)
        >>> context_engine.query([Query(text="What is the capital of France?")], max_context_tokens=1000)

    To create a knowledge base, see the documentation for the knowledge base module (canopy.knowledge_base.knowledge_base).
    """  # noqa: E501

    _DEFAULT_COMPONENTS = {
        'knowledge_base': KnowledgeBase,
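To make the docstring's two steps concrete, here is a slightly fuller usage sketch (not part of the diff). It assumes `knowledge_base` is an already-initialized knowledge base instance, as described in canopy.knowledge_base.knowledge_base; the query texts and the 1000-token budget are placeholder values.

from canopy.context_engine import ContextEngine
from canopy.models.data_models import Query

# `knowledge_base` is assumed to have been created and connected elsewhere
# (see canopy.knowledge_base.knowledge_base); it is not shown in this sketch.
context_engine = ContextEngine(knowledge_base=knowledge_base)

# Step 1: the engine queries the knowledge base for documents relevant to
# each query. Step 2: it builds a token-budgeted context from the results.
queries = [
    Query(text="What is the capital of France?"),
    Query(text="How large is its metropolitan area?"),
]
context = context_engine.query(queries, max_context_tokens=1000)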
@@ -36,6 +55,14 @@ def __init__(self,
                 context_builder: Optional[ContextBuilder] = None,
                 global_metadata_filter: Optional[dict] = None
                 ):
        """
        Initialize a new ContextEngine.

        Args:
            knowledge_base: The knowledge base to query for retrieving documents
            context_builder: The context builder to use for building the context from the retrieved documents

[acatav marked this conversation as resolved.]

            global_metadata_filter: A metadata filter to apply to all queries. See: https://docs.pinecone.io/docs/metadata-filtering
        """  # noqa: E501

        if not isinstance(knowledge_base, BaseKnowledgeBase):
            raise TypeError("knowledge_base must be an instance of BaseKnowledgeBase, "
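To make the global_metadata_filter argument concrete, here is a small sketch (not part of the diff). The filter dict follows Pinecone's metadata filtering syntax linked above; the `source` field name and its value are hypothetical, and `knowledge_base` is again assumed to be initialized elsewhere.

from canopy.context_engine import ContextEngine
from canopy.models.data_models import Query

# Hypothetical filter: only retrieve documents whose `source` metadata field
# equals "internal-wiki". Any metadata field present in the underlying index
# could be used instead.
engine = ContextEngine(
    knowledge_base=knowledge_base,
    global_metadata_filter={"source": {"$eq": "internal-wiki"}},
)

# The filter is applied to every query issued through this engine.
context = engine.query([Query(text="What is our refund policy?")],
                       max_context_tokens=512)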
@@ -55,6 +82,22 @@ def __init__(self,
        self.global_metadata_filter = global_metadata_filter

    def query(self, queries: List[Query], max_context_tokens: int, ) -> Context:
        """
        Query the knowledge base for relevant documents and build a context from the retrieved documents that can be injected into the LLM prompt.

        Args:
            queries: A list of queries to use for retrieving documents from the knowledge base
            max_context_tokens: The maximum number of tokens to use for the context

        Returns:
            A Context object containing the retrieved documents and metadata

        Example:
            >>> from canopy.context_engine import ContextEngine
            >>> from canopy.models.data_models import Query
            >>> context_engine = ContextEngine(knowledge_base=knowledge_base)
            >>> context_engine.query([Query(text="What is the capital of France?")], max_context_tokens=1000)
        """  # noqa: E501
        query_results = self.knowledge_base.query(
            queries,
            global_metadata_filter=self.global_metadata_filter)
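To illustrate how the returned Context might be consumed (not part of the diff): the Context model itself is not shown here, so the attribute names below, to_text() and num_tokens, are assumptions about its interface rather than something this PR documents.

from canopy.context_engine import ContextEngine
from canopy.models.data_models import Query

# As in the docstring example, `knowledge_base` is assumed to exist already.
context_engine = ContextEngine(knowledge_base=knowledge_base)

context = context_engine.query(
    [Query(text="What is the capital of France?")],
    max_context_tokens=1000,
)

# Assumed interface: the Context can be serialized to a string for prompt
# injection and reports how many tokens of the budget it actually used.
prompt_context = context.to_text()
print(f"Context uses {context.num_tokens} of the 1000-token budget")

# The serialized context would then be injected into the LLM prompt, e.g.
# prepended to a system message.
system_prompt = "Answer using only the following context:\n" + prompt_context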
[Review comment] Love it! 😍