| title | description |
|---|---|
| Introduction | Welcome to the Langtrace AI documentation |
Langtrace is an open-source observability tool that collects and analyzes traces to help you improve your LLM apps. Langtrace has two components:
- SDK: The SDK is a lightweight library that can be installed and imported into your project to collect traces. The traces are OpenTelemetry-based and can be exported to Langtrace or any other observability stack (Grafana, Datadog, Honeycomb, etc.) without having to use a Langtrace API key (see the sketch after this list).
GitHub: Python SDK, TypeScript SDK, trace-attributes
- Langtrace Dashboard: The dashboard is a web-based interface where you can view and analyze your traces.
GitHub: Langtrace
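Because the traces are OpenTelemetry-based, the export destination is configurable at init time. Here is a minimal sketch, assuming the Python SDK's init accepts a `custom_remote_exporter` option (verify against your SDK version), that sends spans to a local OTLP endpoint such as an OpenTelemetry Collector instead of Langtrace Cloud:

```python
# A hedged sketch: export spans to any OpenTelemetry-compatible backend
# (for example an OTel Collector feeding Grafana, Datadog, or Honeycomb)
# instead of Langtrace Cloud, so no Langtrace API key is needed.
# The `custom_remote_exporter` argument is an assumption based on the
# Python SDK's init options and may differ between SDK versions.
from langtrace_python_sdk import langtrace
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter

exporter = OTLPSpanExporter(endpoint="http://localhost:4318/v1/traces")
langtrace.init(custom_remote_exporter=exporter)
```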
Langtrace optimizes for the following 3 pillars of observability for your LLM apps:
- Usage - Tokens and Cost
- Accuracy
- Performance - Latency and Success Rate
Need help with the SDK? Contact us
The first step to using Langtrace is to install and import the SDK into your Python or TypeScript project.
1. Sign up to Langtrace here
2. Go to the Langtrace dashboard
3. Create a new project; be sure to provide a name and description
4. Click on the project name
5. Copy the generated API key. You will need this key to initialize the SDK in your project.
- Python: Install the Langtrace SDK using pip

```bash
pip install langtrace-python-sdk
```

- TypeScript: Install the Langtrace SDK using npm

```bash
npm i @langtrase/typescript-sdk
```
```typescript
// Must precede any LLM module imports
import * as Langtrace from "@langtrase/typescript-sdk";
Langtrace.init({ api_key: "<LANGTRACE_API_KEY>" });
```
```python
from langtrace_python_sdk import langtrace
langtrace.init(api_key='<LANGTRACE_API_KEY>')
```
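Once the SDK is initialized, calls made through supported LLM client libraries are captured automatically. Here is a minimal sketch, assuming the `openai` package is installed, `OPENAI_API_KEY` is set, and OpenAI is among the libraries instrumented by your SDK version:

```python
from langtrace_python_sdk import langtrace

# Initialize before importing LLM client modules so they get instrumented.
langtrace.init(api_key="<LANGTRACE_API_KEY>")

from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment
response = client.chat.completions.create(
    model="gpt-4o-mini",  # model chosen for illustration only
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
)
print(response.choices[0].message.content)  # the call above is traced automatically
```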
You can now view your traces on the Langtrace dashboard.
Dive into the SDK features. Here are the supported languages:
- Integrate with your Python project
- Integrate with your TypeScript project

We are working hard to add support for more languages.
Langtrace offers flexible pricing options to suit various needs and usage levels. Here's an overview of our pricing structure:
Free Forever: Ideal for individuals or hackers just beginning their LLM observability journey.
- Free forever
- 1 user
- Includes up to 10K spans/month, prompt management, and evaluations
Growth: Designed for growing teams and projects with increasing observability needs.
- $39/user/month
- Usage-based pricing: $0.005 per additional span ingested per month (above 50K spans)
- Includes everything in Free Forever
- Evaluations in the cloud
- Team collaboration features
- Security guardrails
Scale: Tailored solutions for large-scale enterprises with complex requirements.
- Custom pricing
- Custom retention policy
- Custom SLAs
- Security guardrails
- Team collaboration features
- SOC 2 Type II Compliance
Self-Hosted: Perfect for developers who prioritize data control and customization.
- Free to get started
- Includes tracing, prompt management, and evaluations, all accessible when you host Langtrace locally
- What is a span? A span represents a single unit of work or operation in your application. For LLM apps, this typically corresponds to an API call or a specific step in your language model pipeline (see the code sketch after these FAQs).
- Which plan is right for me? If you're just getting started or want to self-host, our Self-Hosted option is perfect. For small teams or projects using our managed service, the Starter plan is ideal. As your usage grows, the Growth plan offers more flexibility and features. Large enterprises with specific needs should consider our Scale plan.
- Can I change my plan later? Yes, you can change your plan at any time to match your current needs.
- What happens if I exceed my span limit? For the Growth plan, you'll be charged for additional spans above the 50K limit. For other plans, please contact our sales team at [email protected] to discuss options.
- Is there a free trial for paid features? Please contact our sales team to discuss trial options for our paid features.
- How can I learn more about Enterprise options? To discuss specific requirements or to learn more about our Enterprise offerings, please contact us at [email protected].
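To make the span definition above concrete, here is a hedged sketch using the Python SDK's `with_langtrace_root_span` decorator (the span name argument shown is hypothetical, and the decorator's signature may vary by SDK version). Each instrumented LLM call made inside the decorated function is recorded as a child span, and every ingested span counts toward the plan limits above:

```python
from langtrace_python_sdk import langtrace, with_langtrace_root_span

langtrace.init(api_key="<LANGTRACE_API_KEY>")

@with_langtrace_root_span("summarize_document")  # hypothetical span name for illustration
def summarize(text: str) -> str:
    # Any instrumented LLM call made here would appear as a child span;
    # each span ingested counts toward the monthly span limit.
    return text[:100]

summarize("A long document about observability ...")
```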
- What is Langtrace? Langtrace is an open-source observability tool that helps you collect and analyze traces to improve your LLM apps.
- How do I get started with Langtrace? To get started with Langtrace, sign up and generate an API key. Then install and import the SDK into your project to start shipping traces to Langtrace.
- What are the benefits of using Langtrace?
Langtrace helps you with the following:
- Monitor your LLM usage: tokens and cost, latency, and success rate.
- Evaluate the responses of the LLM to measure the accuracy of your LLM apps.
- Create and manage datasets and prompt sets for your LLM apps.
- What languages does Langtrace support? Langtrace currently supports Python and TypeScript. We are adding support for more languages.