Is this a new bug?
Current Behavior
Verbatim from `langchain-retrieval-augmentation.ipynb`:

> We will download a pre-embedding dataset from pinecone-datasets. Allowing us to skip the embedding and preprocessing steps, if you'd rather work through those steps you can find the [full notebook here](https://colab.research.google.com/github/pinecone-io/examples/blob/master/docs/langchain-retrieval-augmentation.ipynb).
The "full notebook here" link does not point to a reference that explains how `pinecone-datasets` was embedded and preprocessed.

Personally, I would like to learn how to embed webpages of documentation (e.g. the LangChain docs) for the retrieval-augmentation use case. Thank you for your attention!
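To illustrate the kind of preprocessing I am asking about, here is a minimal sketch of what I assume the pipeline looks like (my own guess, not the notebook's actual code): fetch a documentation page's text, split it into overlapping chunks, and embed each chunk into records suitable for a vector index. Both `split_text` and `embed` are hypothetical placeholders; a real pipeline would use something like LangChain's text splitters and an actual embedding model.

```python
# Assumed sketch of a docs-embedding pipeline; NOT the notebook's actual code.
import hashlib

def split_text(text: str, chunk_size: int = 400, overlap: int = 50) -> list[str]:
    """Split text into overlapping character chunks (a crude stand-in for
    a real text splitter such as LangChain's RecursiveCharacterTextSplitter)."""
    chunks = []
    start = 0
    while start < len(text):
        chunks.append(text[start:start + chunk_size])
        start += chunk_size - overlap
    return chunks

def embed(chunk: str) -> list[float]:
    """Placeholder embedding: hashes the chunk and scales bytes to floats.
    A real pipeline would call an embedding model here instead."""
    digest = hashlib.sha256(chunk.encode()).digest()
    return [b / 255 for b in digest[:8]]

# Stand-in for text scraped from a documentation page.
page_text = (
    "LangChain is a framework for developing applications "
    "powered by language models. " * 20
)

# Records shaped like typical vector-index upserts: id, vector, metadata.
records = [
    {"id": f"doc-{i}", "values": embed(chunk), "metadata": {"text": chunk}}
    for i, chunk in enumerate(split_text(page_text))
]
print(len(records), len(records[0]["values"]))
```

A reference explaining how the real versions of these steps were run for `pinecone-datasets` is exactly what I hoped the link would provide.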
Expected Behavior
The "full notebook here" link correctly points to a reference that explains how `pinecone-datasets` was embedded and preprocessed.

Steps To Reproduce
1. Open the "Building the Knowledge Base" section of the notebook.
2. Click the "full notebook here" link.
3. Observe that the linked page does not explain how `pinecone-datasets` was embedded and preprocessed.

Relevant log output
No response
Environment
Additional Context
No response