Add Portkey Observability Documentation #174
base: main
@@ -142,6 +142,68 @@ Here is a step-by-step [example](https://github.com/mistralai/cookbook/blob/main

<img src="/img/guides/obs_langfuse2.png" alt="drawing" width="700"/>
Here's the PortkeyAI observability section in the same style as the Mistral docs:

### Integration with PortkeyAI
> **Reviewer comment:** Hey @siddharthsambharia-portkey, thank you for this PR 🙏 May I ask you to add the block on PortkeyAI after the other observability tools already listed (that is, after …)? Thank you so very much!

PortkeyAI is an open-source AI gateway that provides unified observability across all your Mistral AI requests. It offers real-time analytics, detailed logs, tracing, and metadata tracking through a single API layer.

<img src="https://raw.githubusercontent.com/siddharthsambharia-portkey/Portkey-Product-Images/refs/heads/main/Portkey-Dashboard.png" alt="Portkey Analytics Dashboard" width="700"/>
**Pros:**

* **Open-source AI Gateway** - Self-host or use the cloud-managed version
* **Unified API** - OpenAI-compatible API that works with Mistral models
* **Real-time Analytics** - Track cost, latency, and token usage across all models
* **Metadata Filtering** - Add custom metadata and filter logs by any parameter
* **Request Tracing** - Visualize the complete request lifecycle and LLM call chains
* **Feedback API** - Programmatically collect user feedback on generations
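The Feedback API mentioned above boils down to attaching a score to a previously traced request. As a minimal sketch, the payload such a call might send can be modeled as a plain dict; the field names (`trace_id`, `value`, `weight`) and the value range are illustrative assumptions, not a confirmed Portkey API contract:

```python
import json


def build_feedback_payload(trace_id: str, value: int, weight: float = 1.0) -> str:
    """Build a JSON feedback payload keyed to a request's trace ID.

    NOTE: field names and the [-10, 10] value range are assumptions
    for illustration; check the Portkey docs for the real contract.
    """
    if not -10 <= value <= 10:
        raise ValueError("feedback value expected in [-10, 10]")
    return json.dumps({"trace_id": trace_id, "value": value, "weight": weight})


payload = build_feedback_payload("req-1729", 5)
```

Keying feedback to a trace ID rather than to raw text is what lets the dashboard correlate user ratings with the exact generation that produced them.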
**Key Observability Features:**

1. **Cost Tracking** - Real-time spend tracking across models and providers
2. **Latency Monitoring** - P95/P99 latency metrics with error budgets
3. **Token Analytics** - Input/output token tracking with cost estimates
4. **Custom Metadata** - Add business context (user IDs, session IDs, etc.)
5. **Semantic Search** - Search across all LLM requests and responses
6. **Tracing** - Visualize complex workflows and multi-LLM call chains
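Gateways of this kind typically carry the trace ID and custom metadata as HTTP headers on each proxied request, with the metadata JSON-encoded into a single header value. A minimal sketch of that assembly step, where the header names are assumptions for illustration rather than Portkey's documented contract:

```python
import json


def observability_headers(trace_id: str, metadata: dict) -> dict:
    """Assemble per-request observability headers for a gateway call.

    NOTE: the header names below are illustrative assumptions;
    consult the Portkey documentation for the exact names.
    """
    return {
        "x-portkey-trace-id": trace_id,
        # All metadata travels as one JSON-encoded header value
        "x-portkey-metadata": json.dumps(metadata, sort_keys=True),
    }


headers = observability_headers("trace-001", {"_user": "USER_ID", "session_id": "1729"})
```

Because the context rides along as headers, the application code stays unchanged while every request becomes filterable in the dashboard by any metadata key.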
**Mistral Integration Example:**
```bash
pip install portkey-ai
```
```python
from portkey_ai import Portkey

portkey = Portkey(
    api_key="PORTKEY_API_KEY",
    virtual_key="MISTRAL_VIRTUAL_KEY",
    metadata={"environment": "production", "user_id": "123"}
)

# Attach per-request metadata for observability
response = portkey.with_options(
    metadata={
        "_user": "USER_ID",
        "environment": "production",
        "prompt": "test_prompt",
        "session_id": "1729"
    }
).chat.completions.create(
    messages=[{"role": "user", "content": "What is 1729?"}],
    model="your-mistral-model"
)

print(response.choices[0].message)
```
**Getting Started:**

> **Reviewer comment:** Would you add a link to the cookbook? Let's agree on which PR you'd rather have reviewed and then add it here! I am sure it would make using PortkeyAI muuuch easier :))

- [Mistral Integration Guide on Portkey Docs](https://portkey.sh/mistral)
- [Open-source Gateway GitHub](https://github.com/Portkey-AI/gateway)
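Because the gateway exposes an OpenAI-compatible API, pointing an existing client at a self-hosted deployment is mostly a matter of swapping the base URL. The sketch below builds that URL; the default port and `/v1` path are assumptions for illustration, so check the gateway README for the actual defaults:

```python
def gateway_base_url(host: str = "localhost", port: int = 8787, secure: bool = False) -> str:
    """Build the base URL for a self-hosted AI gateway.

    NOTE: port 8787 and the /v1 path are illustrative assumptions;
    consult the gateway's README for its real defaults.
    """
    scheme = "https" if secure else "http"
    return f"{scheme}://{host}:{port}/v1"


# An OpenAI-compatible client would then be configured with this as its base URL.
url = gateway_base_url()
```

Keeping the URL construction in one helper makes it easy to switch between the cloud-managed endpoint and a self-hosted gateway per environment.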
### Integration with Arize Phoenix

Phoenix is an open-source observability library designed for experimentation, evaluation, and troubleshooting. It supports agents, RAG pipelines, and other LLM applications.
> **Reviewer comment:** Thank you for the PR @siddharthsambharia-portkey. I am pretty sure you did not intend to add this ;) Can you remove it please?