adding pinecone test #6
base: main
Conversation
I had this exact idea: store the vectors of entities/attributes in a vector database, tokenize the user's input, and use the embeddings to augment the prompt to include only the relevant data. Glad I found this before I started that project.
Glad to hear that! I tried to hook a vector DB into this component, but it blurs the border between simply using OpenAI and using OpenAI combined with a vector DB, which makes the component hard to understand. So I stopped there, and will probably try a different approach.
Can you add a little more detail about where you parked this, @jekalmin, and what other options you see?
Of course.

**Work done**

In this PR, I tried to use Pinecone to store entities and retrieve only the relevant entities via similarity search, to be used in the system prompt. I added an optional "Pinecone API Key" field at setup, along with a few Pinecone-related options.

**Problem**
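The retrieval step described above can be sketched without the actual Pinecone client. Below is a minimal stand-in using in-memory vectors and cosine similarity; in the real setup the vectors would live in a Pinecone index and the scoring would happen server-side via `index.query(...)`. The entity ids and vectors here are made up for illustration.

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def retrieve_entities(query_vec, entity_vecs, top_k=5, score_threshold=0.7):
    """Return up to top_k entity ids whose similarity clears the threshold.

    Stand-in for the Pinecone similarity search: filter by score,
    sort by score descending, keep the best top_k matches.
    """
    scored = [(eid, cosine(query_vec, vec)) for eid, vec in entity_vecs.items()]
    scored = [(eid, s) for eid, s in scored if s >= score_threshold]
    scored.sort(key=lambda pair: pair[1], reverse=True)
    return [eid for eid, _ in scored[:top_k]]

def build_system_prompt(entity_ids):
    """Inject only the relevant entities into the system prompt."""
    lines = "\n".join(f"- {eid}" for eid in entity_ids)
    return f"You control these Home Assistant entities:\n{lines}"

# Toy vectors: in practice these would come from an embedding model.
entities = {
    "light.kitchen": [1.0, 0.0],
    "light.bedroom": [0.9, 0.1],
    "sensor.outdoor_temp": [0.0, 1.0],
}
relevant = retrieve_entities([1.0, 0.05], entities, top_k=2)
print(build_system_prompt(relevant))
```

This keeps the system prompt small: only entities that score above the threshold ever reach the model, which is the whole point of doing the search before the chat completion.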
**Next step**

What I want to try next is to make a separate Pinecone integration and hook it in via a function call, like below.

**System Prompt**
**Functions**

```yaml
- spec:
    name: get_entities
    description: Get all entities of HA
    parameters:
      type: object
      properties:
        query:
          type: string
          description: User requested query
      required:
        - query
  function:
    type: script
    sequence:
      - service: pinecone.ask_database
        data:
          prompt: "{{ query }}"
          top_k: 5
          score_threshold: 0.7
        response_variable: _function_result
```

By creating a Pinecone integration apart from extended_openai_integration:

**Advantages**
Disadvantages
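The function-call flow proposed in the YAML above could be wired up roughly as follows. Both the model's tool-call payload and the `pinecone.ask_database` service are stubs here: the former would come from the OpenAI API, and the latter is the hypothetical service the separate Pinecone integration would expose.

```python
import json

# Stub for the proposed `pinecone.ask_database` service: in the real
# design this would embed the prompt, query Pinecone, and return the
# matching entity ids (hypothetical service, not an existing HA one).
def ask_database(prompt, top_k=5, score_threshold=0.7):
    return ["light.kitchen", "light.bedroom"][:top_k]

# Registry mapping function-call names to script-backed handlers,
# mirroring the `- spec:` / `function:` pairs in the YAML config.
FUNCTIONS = {"get_entities": ask_database}

def handle_tool_call(call):
    """Run the function the model asked for and package the result as a
    message to feed back into the conversation (the `_function_result`)."""
    handler = FUNCTIONS[call["name"]]
    args = json.loads(call["arguments"])
    result = handler(args["query"])
    return {"role": "function", "name": call["name"],
            "content": json.dumps(result)}

# A model response requesting get_entities (stubbed, not a real API reply).
call = {"name": "get_entities",
        "arguments": json.dumps({"query": "turn on the kitchen light"})}
print(handle_tool_call(call))
```

The appeal of this shape is the clean boundary the comment thread asked about: extended_openai_integration only dispatches function calls, and everything Pinecone-specific lives behind one service name.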
add pinecone to perform similarity search before calling chat completion
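The ordering in that title — similarity search first, chat completion second — can be sketched as a two-step pipeline. The embedder and the chat call are stubs standing in for an embedding model and the OpenAI API:

```python
def embed(text):
    # Stub embedder: a real one would call an embedding model.
    return [1.0, 0.0] if "light" in text else [0.0, 1.0]

def similarity_search(query, store, top_k=5):
    # Stand-in for the Pinecone query: rank stored vectors by dot product.
    qv = embed(query)
    scored = sorted(store.items(),
                    key=lambda kv: sum(a * b for a, b in zip(qv, kv[1])),
                    reverse=True)
    return [eid for eid, _ in scored[:top_k]]

def chat_completion(system_prompt, user_msg):
    # Stub for the OpenAI chat call; just echoes what it was given.
    return f"(model saw {system_prompt!r} and {user_msg!r})"

def converse(user_msg, store):
    # 1) Similarity search narrows the entity list...
    relevant = similarity_search(user_msg, store, top_k=1)
    system_prompt = "Entities: " + ", ".join(relevant)
    # 2) ...before chat completion is called with the slimmer prompt.
    return chat_completion(system_prompt, user_msg)

store = {"light.kitchen": [1.0, 0.0], "sensor.rain": [0.0, 1.0]}
print(converse("turn on the kitchen light", store))
```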