
Commit

Added changes to pattern match existing docs.
mgregerson committed Apr 10, 2024
1 parent 6ee6c14 commit 3fa3fff
Showing 25 changed files with 114 additions and 81 deletions.
2 changes: 2 additions & 0 deletions fern/docs.yml
@@ -61,6 +61,8 @@ navigation:
path: ./pages/openllmetry/tracing/entities-traces.mdx
- page: Tracking User Feedback
path: ./pages/openllmetry/tracing/tracking-feedback.mdx
+ - page: Manually reporting calls to LLMs and Vector DBs
+   path: ./pages/openllmetry/tracing/manually-reporting-calls.mdx
- page: Manual Implementations (Typescript / Javascript)
path: ./pages/openllmetry/tracing/manual-implementations.mdx
- page: Usage with Threads (Python)
@@ -12,8 +12,8 @@ To disable polling altogether, set the `TRACELOOP_SYNC_ENABLED` environment variable…

Make sure you’ve configured the SDK with the right environment and API Key. See the [SDK documentation](/docs/openllmetry/integrations/traceloop) for more information.

<Callout intent="info">
The SDK uses smart caching mechanisms to proide zero latency for fetching prompts.
<Callout intent="note">
The SDK uses smart caching mechanisms to provide zero latency for fetching prompts.
</Callout>

## Get Prompt API
@@ -67,6 +67,6 @@ Then, you can retrieve it in your code using `get_prompt`:
</CodeBlock>
</CodeBlocks>

<Callout intent="info">
<Callout intent="note">
The returned variable `prompt_args` is compatible with the API used by the foundation model SDKs (OpenAI, Anthropic, etc.), which means you can plug the response directly into the appropriate API call.
</Callout>
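
As an illustrative sketch only (the prompt key, variable names, and exact `get_prompt` signature below are assumptions, not the documented API):

```python
from openai import OpenAI
from traceloop.sdk import Traceloop
from traceloop.sdk.prompts import get_prompt

Traceloop.init()  # assumes TRACELOOP_API_KEY is set in the environment

# Hypothetical prompt key and variables, registered beforehand in the registry.
prompt_args = get_prompt(key="joke_generator", variables={"persona": "pirate"})

# prompt_args mirrors the foundation-model SDK's expected arguments,
# so it can be unpacked directly into the completion call.
client = OpenAI()
completion = client.chat.completions.create(**prompt_args)
print(completion.choices[0].message.content)
```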
@@ -15,7 +15,7 @@ The prompt configuration is composed of two parts:
- The prompt template (system and/or user prompts)
- The model configuration (`temperature`, `top_p`, etc.)

<Callout intent="info">
<Callout intent="tip">
Your prompt template can include variables. Variables are defined according to the syntax of the parser specified. For example, if using `jinja2` the syntax will be `{{ variable_name }}`. You can then pass variable values to the SDK when calling `get_prompt`. See the example in the [SDK Usage](/fetching-prompts) section.
</Callout>
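
For illustration, here is how a `jinja2`-style template resolves its variables; the template text and variable names are hypothetical:

```python
from jinja2 import Template

# A prompt template using jinja2 variable syntax, as described above.
template = Template("Tell me a {{ tone }} joke about {{ topic }}.")

# The same variable names would be passed to get_prompt at fetch time.
print(template.render(tone="dry", topic="observability"))
# -> Tell me a dry joke about observability.
```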

@@ -49,7 +49,7 @@ Choose the `Deploy` Tab to navigate to the deployments page for your prompt.

Here, you can see all recent prompt versions, and which environments they are deployed to. Simply click on the `Deploy` button to deploy a prompt version to an environment. Similarly, click `Rollback` to revert to a previous prompt version for a specific environment.

<Callout intent="info">
<Callout intent="note">
As a safeguard, you cannot deploy a prompt to the `Staging` environment before first deploying it to `Development`. Similarly, you cannot deploy to `Production` without first deploying to `Staging`.
</Callout>

@@ -59,6 +59,6 @@ To fetch prompts from a specific environment, you must supply that environment…

If you want to make changes to your prompt after deployment, simply create a new version by clicking on the `New Version` button. New versions will be created in `Draft Mode`.

<Callout intent="warn">
<Callout intent="warning">
If you change the names of variables or add/remove existing variables, you will be required to create a new prompt.
</Callout>
6 changes: 3 additions & 3 deletions fern/pages/openllmetry/contribute/gen-ai.mdx
@@ -11,7 +11,7 @@ This is a work in progress, and we welcome your feedback and contributions!

## Definitions

- ## LLM Foundation Models
+ ### LLM Foundation Models

| Field | Description |
|------------------------------|--------------------------------------------------------------------------------------------------|
@@ -34,14 +34,14 @@
| `llm.user` | The user ID sent with the request |
| `llm.headers` | The headers used for the request |

- ## Vector DBs
+ ### Vector DBs

| Field | Description |
|--------------------------|----------------------------------------------------------------|
| `vector_db.vendor` | The vendor of the Vector DB (e.g. Chroma, Pinecone, etc.) |
| `vector_db.query.top_k` | The top k used for the query |
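
To make the convention concrete, here is a hedged sketch that tags an OpenTelemetry span with a couple of the fields above; the span name and values are illustrative, not part of the spec:

```python
from opentelemetry import trace

tracer = trace.get_tracer("genai-conventions-example")

# Illustrative only: attach Vector DB fields from the table above to a span.
with tracer.start_as_current_span("chroma.query") as span:
    span.set_attribute("vector_db.vendor", "Chroma")
    span.set_attribute("vector_db.query.top_k", 10)
```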

- ## LLM Frameworks
+ ### LLM Frameworks

| Field | Description |
|---------------------------------|---------------------------------------------------------------------------------------------------|
4 changes: 3 additions & 1 deletion fern/pages/openllmetry/contribute/overview.mdx
@@ -1,4 +1,6 @@
- <h1>We welcome any contributions to OpenLLMetry, big or small.</h1>
+ ---
+ excerpt: We welcome any contributions to OpenLLMetry, big or small
+ ---

## Community

8 changes: 4 additions & 4 deletions fern/pages/openllmetry/integrations/azure-insights.mdx
@@ -10,15 +10,15 @@ Review how to set up [OpenTelemetry with Python in Azure Application Insights](ht…
![integrations-azure](https://fern-image-hosting.s3.amazonaws.com/traceloop/integrations-azure.png)
</Frame>

- ## Provision an Application Insights instance in the [Azure portal](https://portal.azure.com/).
+ 1. Provision an Application Insights instance in the [Azure portal](https://portal.azure.com/).

- ## Get your Connection String from the instance - [details here](https://learn.microsoft.com/en-us/azure/azure-monitor/app/sdk-connection-string?tabs=python).
+ 2. Get your Connection String from the instance - [details here](https://learn.microsoft.com/en-us/azure/azure-monitor/app/sdk-connection-string?tabs=python).

- ## Install required packages
+ 3. Install required packages

`pip install azure-monitor-opentelemetry-exporter traceloop-sdk openai`

- ## Example implementation
+ 4. Example implementation

```python
import os
# … (remainder of the example is collapsed in this view)
```
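
Since the example is truncated in this view, a minimal sketch of what such an implementation can look like follows; the app name is hypothetical, and it assumes the Azure exporter's `from_connection_string` constructor together with the Traceloop SDK's `exporter` init parameter:

```python
import os

from azure.monitor.opentelemetry.exporter import AzureMonitorTraceExporter
from openai import OpenAI
from traceloop.sdk import Traceloop

# Build an exporter from the Application Insights connection string (step 2).
exporter = AzureMonitorTraceExporter.from_connection_string(
    os.environ["APPLICATIONINSIGHTS_CONNECTION_STRING"]
)

# Route OpenLLMetry traces to Azure instead of the default endpoint.
Traceloop.init(app_name="azure-insights-example", exporter=exporter)

client = OpenAI()
completion = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Tell me a joke about OpenTelemetry"}],
)
print(completion.choices[0].message.content)
```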
12 changes: 1 addition & 11 deletions fern/pages/openllmetry/integrations/dynatrace.mdx
@@ -6,19 +6,12 @@ excerpt: LLM Observability with Dynatrace and OpenLLMetry
![integrations-dynatrace](https://fern-image-hosting.s3.amazonaws.com/traceloop/integrations-dynatrace.png)
</Frame>


- ## Initialize Dynatrace OpenTelemetry Integration

Analyze all collected LLM traces within Dynatrace by using the native OpenTelemetry ingest endpoint of your Dynatrace environment.

- ## Generate Access Token for Dynatrace Environment

Go to your Dynatrace environment and create a new access token under **Manage Access Tokens**.

The access token needs the following permission scopes that allow the ingest of OpenTelemetry spans, metrics and logs (`openTelemetryTrace.ingest`, `metrics.ingest`, `logs.ingest`).

- ## Set Environment Variables for Dynatrace Integration

Set the `TRACELOOP_BASE_URL` environment variable to the URL of your Dynatrace OpenTelemetry ingest endpoint.

```
TRACELOOP_BASE_URL=<your Dynatrace OTLP ingest endpoint URL>
```

@@ -31,7 +24,4 @@

Set the `TRACELOOP_HEADERS` environment variable to include your previously created access token:

```
TRACELOOP_HEADERS=Authorization=Api-Token%20<YOUR_ACCESS_TOKEN>
```
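
Taken together, the two variables look roughly like this; the endpoint shape is an assumption based on Dynatrace's standard OTLP ingest URL, so verify it against your environment:

```
TRACELOOP_BASE_URL=https://<YOUR_ENV_ID>.live.dynatrace.com/api/v2/otlp
TRACELOOP_HEADERS=Authorization=Api-Token%20<YOUR_ACCESS_TOKEN>
```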


- You’re all set!
-
- All the exported spans along with their span attributes will show up within the Dynatrace trace view.
+ You’re all set! All the exported spans along with their span attributes will show up within the Dynatrace trace view.
15 changes: 3 additions & 12 deletions fern/pages/openllmetry/integrations/grafana.mdx
@@ -21,9 +21,8 @@ Note also the **Stack ID** value.
You can find it in the URL `https://grafana.com/orgs/<Your Org Name>/stacks/<Stack ID>`.

## With Grafana Agent
+ Make sure you have an agent installed and running in your cluster. The host to target your traces is the hostname of the `URL` noted above, without the `https://` and the trailing `/tempo`.

- ### Update Agent Configuration
- Make sure you have an agent installed and running in your cluster. The host to target your traces is constructed is the hostname of the `URL` noted above, without the `https://` and the trailing `/tempo`.

Add this to the configuration of your agent:

@@ -42,12 +41,10 @@

```yaml
traces:
  # … (intermediate configuration collapsed in this view)
          grpc:
```
<Callout intent="warning">
Note the endpoint. The URL you need to use is without `https` and the trailing `/`. So `https://tempo-us-central1.grafana.net/tempo` should be used as `tempo-us-central1.grafana.net:443`.
</Callout>

- ### Set Environmental Variable

Set this as an environment variable in your app:

```
Expand All @@ -56,12 +53,10 @@ TRACELOOP_BASE_URL=http://<grafana-agent-hostname>:4318

## Without Grafana Agent

<Callout intent="info">
<Callout intent="note">
Grafana cloud currently only supports sending traces to some of its regions. Before you begin, [check out this list](https://grafana.com/docs/grafana-cloud/monitor-infrastructure/otlp/send-data-otlp/) and make sure your region is supported.
</Callout>

- ### Get your encoded User ID and API Key

In a terminal, type:

```
Expand All @@ -70,12 +65,8 @@ echo -n "<your stack id>:<your api key>" | base64

The result is a base64 encoding of your user ID and API key.

- ### Get your URL

The URL you’ll use as the destination for the traces depends on your region/zone. For example, for AWS US Central this will be `prod-us-central-0`. See [here](https://grafana.com/docs/grafana-cloud/monitor-infrastructure/otlp/send-data-otlp/#before-you-begin) for the names of the zones you should use below.

- ### Set your environmental variables

Finally, you can set the following environment variables when running your app with Traceloop SDK installed:

```
# … (environment variable values collapsed in this view)
```
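
As a sketch of the collapsed values, they typically take the following shape; the OTLP gateway URL format and the Basic-auth header are assumptions to check against the Grafana documentation linked above:

```
TRACELOOP_BASE_URL=https://otlp-gateway-<zone>.grafana.net/otlp
TRACELOOP_HEADERS=Authorization=Basic%20<base64 of stack_id:api_key>
```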
1 change: 0 additions & 1 deletion fern/pages/openllmetry/integrations/instana.mdx
@@ -8,7 +8,6 @@ excerpt: LLM Observability with Instana and OpenLLMetry

With Instana, you can export directly to an Instana Agent in your cluster. The Instana Agent will report back the tracing and metrics to the Instana Backend and display them on the Instana UI.


## Edit the agent config file

After an Instana OS agent is installed, edit the agent config file `configuration.yaml` under the `/opt/instana/agent/etc/instana` folder.
9 changes: 4 additions & 5 deletions fern/pages/openllmetry/integrations/traceloop.mdx
@@ -12,20 +12,19 @@ excerpt: LLM Observability with Traceloop

On Traceloop, API keys can be generated from the [Traceloop Dashboard](https://app.traceloop.com/settings/api-keys), for each of the three supported environments (Development, Staging, Production).


- ### Go to [Traceloop Environments Management](https://app.traceloop.com/settings/api-keys)
+ Go to [Traceloop Environments Management](https://app.traceloop.com/settings/api-keys)

You can also reach here by clicking on **Environments** on the left-hand navigation bar.

- ### Click on **Generate API Key**
+ Click on **Generate API Key**

- ### Click **Copy Key** to copy the API key
+ Click **Copy Key** to copy the API key

<Callout intent="info">
API keys are only displayed once, at the time of their creation, and are not stored anywhere. If you lose your API key, you will need to revoke the old one and generate a new one.
</Callout>

- ### Set the API key as an environment variable named `TRACELOOP_API_KEY`.
+ Set the API key as an environment variable named `TRACELOOP_API_KEY`.

Done! You’ll get instant visibility into everything that’s happening with your LLM. If you’re calling a vector DB, or any other external service or database, you’ll also see it in the Traceloop dashboard.
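
Once the variable is set, initializing the SDK is a single call. A minimal sketch (the app name is an arbitrary example):

```python
from traceloop.sdk import Traceloop

# Picks up TRACELOOP_API_KEY from the environment set above.
Traceloop.init(app_name="my_llm_app")
```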

2 changes: 1 addition & 1 deletion fern/pages/openllmetry/intro/what-is-llmetry.mdx
@@ -4,7 +4,7 @@

OpenLLMetry is an open source project that allows you to easily start monitoring and debugging the execution of your LLM app. Tracing is done in a non-intrusive way, built on top of OpenTelemetry. You can choose to export the traces to Traceloop, or to your existing observability stack.

<Callout intent="info">
<Callout intent="tip">
You can use OpenLLMetry whether you use a framework like LangChain, or directly interact with a foundation model API.
</Callout>
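
For a sense of how non-intrusive the tracing is, here is a minimal sketch: two lines ahead of your existing LLM code, with an illustrative app name:

```python
from traceloop.sdk import Traceloop

# One call instruments supported LLM and vector DB libraries via OpenTelemetry.
Traceloop.init(app_name="my_llm_app")
```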

@@ -52,7 +52,6 @@ You can decide to selectively enable prompt logging for specific workflows, task…

You can decide to selectively enable or disable prompt logging for specific users or workflows.


### Using the Traceloop Platform

We have an API to enable content tracing for specific users, as defined by [association entities](/docs/openllmetry/tracing/associating-entities-with-traces). See the [Traceloop API documentation](/dashboard-api/endpoints) for more information.
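
A hedged sketch of marking the current trace with the user it belongs to via the SDK's association-properties API; the property names are illustrative:

```python
from traceloop.sdk import Traceloop

# Associate subsequent spans in this trace with a specific user and session.
Traceloop.set_association_properties({
    "user_id": "user-12345",   # hypothetical identifiers
    "chat_id": "chat-67890",
})
```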
2 changes: 1 addition & 1 deletion fern/pages/openllmetry/quickstart/go.mdx
@@ -105,7 +105,7 @@ For Traceloop, read on. For other options, see [Exporting](/docs/openllmetry/int…

Go to [Traceloop](https://app.traceloop.com/), and create a new account. Then, click on **Environments** on the left-hand navigation bar. Or go directly to https://app.traceloop.com/settings/api-keys. Click **Generate API Key** to generate an API key for the development environment and click **Copy API Key** to copy it over.

<Callout intent="warn">
<Callout intent="warning">
Make sure to copy it as it won’t be shown again.
</Callout>

19 changes: 11 additions & 8 deletions fern/pages/openllmetry/quickstart/next.mdx
@@ -58,7 +58,7 @@ Run the following command in your terminal:
});
```

<Callout intent="warn">
<Callout intent="warning">
Make sure to explicitly pass any LLM modules you want to instrument, as otherwise auto-instrumentation won’t work on Next.js. Also make sure to set `disableBatch` to `true`.
</Callout>

@@ -75,7 +75,7 @@ Run the following command in your terminal:
module.exports = nextConfig;
```

<Callout intent="info">
<Callout intent="tip">
See official Next.js [OpenTelemetry docs](https://nextjs.org/docs/pages/building-your-application/optimizing/open-telemetry) for more information.
</Callout>

@@ -141,7 +141,7 @@ traceloop.initialize({
});
```

<Callout intent="info">
<Callout intent="tip">
See official Next.js [OpenTelemetry docs](https://nextjs.org/docs/pages/building-your-application/optimizing/open-telemetry) for more information.
</Callout>

@@ -157,7 +157,7 @@ We have a set of [methods and decorators](/docs/openllmetry/tracing/workflows-ta…

We also have compatible Typescript decorators for class methods which are more convenient.

<Callout intent="info">
<Callout intent="note">
If you’re using an LLM framework like Haystack, Langchain or LlamaIndex - we’ll do that for you. No need to add any annotations to your code.
</Callout>

@@ -195,12 +195,15 @@ For Traceloop, read on. For other options, see [Exporting](/docs/openllmetry/int…

Go to [Traceloop](https://app.traceloop.com/), and create a new account. Then, click on **Environments** on the left-hand navigation bar. Or go directly to https://app.traceloop.com/settings/api-keys. Click **Generate API Key** to generate an API key for the development environment and click **Copy API Key** to copy it over.

<Callout intent="warn">
<Callout intent="warning">
Make sure to copy it as it won’t be shown again.
</Callout>

- Set the copied Traceloop’s API key as an environment variable in your app named `TRACELOOP_API_KEY`.

<Frame>
![openllmetry-next-2](https://fern-image-hosting.s3.amazonaws.com/traceloop/openllmetry-next-2.png)
</Frame>

+ Set the copied Traceloop’s API key as an environment variable in your app named `TRACELOOP_API_KEY`.

You're all set! You’ll get instant visibility into everything that’s happening with your LLM. If you’re calling a vector DB, or any other external service or database, you’ll also see it in the Traceloop dashboard.

8 changes: 4 additions & 4 deletions fern/pages/openllmetry/quickstart/node.mdx
@@ -2,7 +2,7 @@
excerpt: Install OpenLLMetry for Node.js by following these 3 easy steps and get instant monitoring.
---

<Callout intent="info">
<Callout intent="note">
If you’re on Next.js, follow the [Next.js guide](/next).
</Callout>

@@ -35,7 +35,7 @@ import * as traceloop from "@traceloop/node-server-sdk";
traceloop.initialize();
```

<Callout intent="info">
<Callout intent="warning">
Because of the way Javascript works, you must import the Traceloop SDK before importing any LLM module like OpenAI.
</Callout>

@@ -57,7 +57,7 @@ We have a set of [methods and decorators](/docs/openllmetry/tracing/workflows-ta…

We also have compatible Typescript decorators for class methods which are more convenient.

<Callout intent="info">
<Callout intent="launch">
If you’re using an LLM framework like Haystack, Langchain or LlamaIndex - we’ll do that for you. No need to add any annotations to your code.
</Callout>

@@ -95,7 +95,7 @@ For Traceloop, read on. For other options, see [Exporting](/docs/openllmetry/int…

Go to [Traceloop](https://app.traceloop.com/), and create a new account. Then, click on **Environments** on the left-hand navigation bar. Or go directly to https://app.traceloop.com/settings/api-keys. Click **Generate API Key** to generate an API key for the development environment and click **Copy API Key** to copy it over.

<Callout intent="warn">
<Callout intent="warning">
Make sure to copy it as it won’t be shown again.
</Callout>

4 changes: 2 additions & 2 deletions fern/pages/openllmetry/quickstart/python.mdx
@@ -45,7 +45,7 @@ If you have complex workflows or chains, you can annotate them to get a better u…

We have a set of [decorators](/docs/openllmetry/tracing/workflows-tasks-agents-and-tools) to make this easier. Assume you have a function that renders a prompt and calls an LLM, simply add `@workflow` (or for asynchronous methods - `@aworkflow`).

<Callout intent="info">
<Callout intent="note">
If you’re using an LLM framework like Haystack, Langchain or LlamaIndex - we’ll do that for you. No need to add any annotations to your code.
</Callout>
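
A minimal sketch of the decorator in use; the workflow name and prompt are arbitrary examples:

```python
from openai import OpenAI
from traceloop.sdk import Traceloop
from traceloop.sdk.decorators import workflow

Traceloop.init(app_name="joke_app")
client = OpenAI()

@workflow(name="joke_generator")
def generate_joke(topic: str) -> str:
    # The LLM call inside the workflow is traced as part of this span.
    completion = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": f"Tell me a joke about {topic}"}],
    )
    return completion.choices[0].message.content
```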

@@ -71,7 +71,7 @@ For Traceloop, read on. For other options, see [Exporting](/docs/openllmetry/int…

Go to [Traceloop](https://app.traceloop.com/), and create a new account. Then, click on **Environments** on the left-hand navigation bar. Or go directly to https://app.traceloop.com/settings/api-keys. Click **Generate API Key** to generate an API key for the development environment and click **Copy API Key** to copy it over.

<Callout intent="warn">
<Callout intent="warning">
Make sure to copy it as it won’t be shown again.
</Callout>

6 changes: 3 additions & 3 deletions fern/pages/openllmetry/quickstart/ruby.mdx
@@ -2,7 +2,7 @@
excerpt: Install OpenLLMetry for Ruby by following these 3 easy steps and get instant monitoring.
---

<Callout intent="info">
<Callout intent="note">
This is still in beta. Give us feedback at [[email protected]](mailto:[email protected])
</Callout>

Expand Down Expand Up @@ -31,7 +31,7 @@ require "traceloop/sdk"
traceloop = Traceloop::SDK::Traceloop.new
```

<Callout intent="info">
<Callout intent="tip">
If you’re using Rails, this needs to be in `config/initializers/traceloop.rb`
</Callout>

Expand Down Expand Up @@ -77,7 +77,7 @@ For Traceloop, read on. For other options, see [Exporting](/docs/openllmetry/int

Go to [Traceloop](https://app.traceloop.com/), and create a new account. Then, click on **Environments** on the left-hand navigation bar. Or go directly to https://app.traceloop.com/settings/api-keys. Click **Generate API Key** to generate an API key for the development environment and click **Copy API Key** to copy it over.

<Callout intent="warn">
<Callout intent="warning">
Make sure to copy it as it won’t be shown again.
</Callout>


