
Commit

Updated to match current design on Mintlify. Ensured all links are pointing to the correct page and not breaking.
mgregerson committed Apr 10, 2024
1 parent 3fa3fff commit d3439dc
Showing 28 changed files with 241 additions and 257 deletions.
@@ -12,7 +12,7 @@ To disable polling altogether, set the `TRACELOOP_SYNC_ENABLED` environment va

Make sure you’ve configured the SDK with the right environment and API Key. See the [SDK documentation](/docs/openllmetry/integrations/traceloop) for more information.

<Callout intent="note">
<Callout intent="tip">
The SDK uses smart caching mechanisms to provide zero latency for fetching prompts.
</Callout>

@@ -67,6 +67,6 @@ Then, you can retrieve it in your code using `get_prompt`:
</CodeBlock>
</CodeBlocks>

<Callout intent="note">
<Callout intent="tip">
The returned variable `prompt_args` is compatible with the API used by the foundation models SDKs (OpenAI, Anthropic, etc.) which means you should directly plug in the response to the appropriate API call.
</Callout>
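
To make the flow concrete, here is a minimal Python sketch. It assumes the SDK's `get_prompt(key, variables=...)` signature and a hypothetical prompt registered under the key `joke_generator` with a single `persona` variable:

```python
from openai import OpenAI
from traceloop.sdk import Traceloop
from traceloop.sdk.prompts import get_prompt

Traceloop.init()  # reads TRACELOOP_API_KEY from the environment

# Fetch the rendered prompt and model configuration from the registry
# (key and variable names here are hypothetical)
prompt_args = get_prompt(key="joke_generator", variables={"persona": "pirate"})

# prompt_args mirrors the foundation-model SDK's request shape,
# so it can be passed straight into the completion call
completion = OpenAI().chat.completions.create(**prompt_args)
print(completion.choices[0].message.content)
```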
@@ -16,7 +16,7 @@ The prompt configuration is composed of two parts:
- The model configuration (`temperature`, `top_p`, etc.)

<Callout intent="tip">
-Your prompt template can include variables. Variables are defined according to the syntax of the parser specified. For example, if using `jinja2` the syntax will be `{{ variable_name }}`. You can then pass variable values to the SDK when calling `get_prompt`. See the example in the [SDK Usage](/fetching-prompts) section.
+Your prompt template can include variables. Variables are defined according to the syntax of the parser specified. For example, if using `jinja2` the syntax will be `{{ variable_name }}`. You can then pass variable values to the SDK when calling `get_prompt`. See the example in the [SDK Usage](/docs/documentation/prompt-management/fetching-prompts) section.
</Callout>
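
For instance (hypothetical prompt text), a user prompt using the `jinja2` parser could read:

```
Tell me a joke about OpenTelemetry, in the style of {{ persona }}.
```

The value of `persona` is then supplied at fetch time through the SDK's `get_prompt` call.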

Initially, prompts are created in `Draft Mode`. In this mode, you can make changes to the prompt and configuration. You can also test your prompt in the playground (see below).
@@ -53,7 +53,7 @@ Here, you can see all recent prompt versions, and which environments they are de
As a safeguard, you cannot deploy a prompt to the `Staging` environment before first deploying it to `Development`. Similarly, you cannot deploy to `Production` without first deploying to `Staging`.
</Callout>

-To fetch prompts from a specific environment, you must supply that environment’s API key to the Traceloop SDK. See the [SDK Configuration](/docs/openllmetry/integrations/traceloop) for details
+To fetch prompts from a specific environment, you must supply that environment’s API key to the Traceloop SDK. See the [SDK Configuration](/openllmetry/integrations/traceloop) for details
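
For example (shell sketch; the key value is a placeholder), fetching from `Staging` just means exporting that environment's key before starting your app:

```
export TRACELOOP_API_KEY=<your Staging environment API key>
```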

## Prompt Versions

10 changes: 5 additions & 5 deletions fern/pages/documentation/prompt-management/quickstart.mdx
@@ -4,7 +4,7 @@

You can use Traceloop to manage your prompts and model configurations. That way you can easily experiment with different prompts, and roll out changes gradually and safely.

<Callout intent="info">
<Callout intent="note">
Make sure you’ve created an API key and set it as an environment variable `TRACELOOP_API_KEY` before you start. Check out the SDK’s [getting started guide](/docs/openllmetry/quick-start/python) for more information.
</Callout>

@@ -17,9 +17,9 @@ Click **New Prompt** to create a new prompt. Give it a name, which will be used

Set the system and/or user prompt. You can use variables in your prompt by following the [Jinja format](https://jinja.palletsprojects.com/en/3.1.x/templates/) of `{{ variable_name }}`. The values of these variables will be passed in when you retrieve the prompt in your code.

-For more information see the [Registry Documentation](/prompt-registry).
+For more information see the [Registry Documentation](/docs/documentation/prompt-management/prompt-registry).

<Callout intent="info" icon="fa-light fa-lightbulb">
<Callout intent="tip">
This screen is also a prompt playground. Give the prompt a try by clicking **Test** at the bottom.
</Callout>

@@ -90,9 +90,9 @@ Retrieve your prompt by using the `get_prompt` function. For example, if you’v
</CodeBlock>
</CodeBlocks>

<Callout intent="info">
<Callout intent="note">
The returned variable `prompt_args` is compatible with the API used by the foundation models SDKs (OpenAI, Anthropic, etc.) which means you can directly plug in the response to the appropriate API call.
</Callout>

-For more information see the [SDK Usage Documentation](/fetching-prompts).
+For more information see the [SDK Usage Documentation](/docs/documentation/prompt-management/fetching-prompts).
</Steps>
2 changes: 1 addition & 1 deletion fern/pages/openllmetry/integrations/axiom.mdx
@@ -1,5 +1,5 @@
---
-excerpt: LLM Observability with Axiom and OpenLLMetry
+title: LLM Observability with Axiom and OpenLLMetry
---

<Frame>
2 changes: 1 addition & 1 deletion fern/pages/openllmetry/integrations/azure-insights.mdx
@@ -1,5 +1,5 @@
---
-excerpt: Azure Application Insights
+title: Azure Application Insights
---

Traceloop supports sending traces to Azure Application Insights via standard OpenTelemetry integrations.
2 changes: 1 addition & 1 deletion fern/pages/openllmetry/integrations/datadog.mdx
@@ -1,5 +1,5 @@
---
-excerpt: LLM Observability with Datadog and OpenLLMetry
+title: LLM Observability with Datadog and OpenLLMetry
---

With Datadog, there are two options: you can either export directly to a Datadog Agent in your cluster, or go through an OpenTelemetry Collector (which requires that you deploy one in your cluster).
14 changes: 7 additions & 7 deletions fern/pages/openllmetry/integrations/dynatrace.mdx
@@ -1,27 +1,27 @@
---
-excerpt: LLM Observability with Dynatrace and OpenLLMetry
+title: LLM Observability with Dynatrace and OpenLLMetry
---

<Frame>
![integrations-dynatrace](https://fern-image-hosting.s3.amazonaws.com/traceloop/integrations-dynatrace.png)
</Frame>

-Analyze all collected LLM traces within Dynatrace by using the native OpenTelemetry ingest endpoint of your Dynatrace environment.
+- Analyze all collected LLM traces within Dynatrace by using the native OpenTelemetry ingest endpoint of your Dynatrace environment.

-Go to your Dynatrace environment and create a new access token under **Manage Access Tokens**.
+- Go to your Dynatrace environment and create a new access token under **Manage Access Tokens**.

-The access token needs the following permission scopes that allow the ingest of OpenTelemetry spans, metrics and logs (`openTelemetryTrace.ingest`, `metrics.ingest`, `logs.ingest`).
+- The access token needs the following permission scopes that allow the ingest of OpenTelemetry spans, metrics and logs (`openTelemetryTrace.ingest`, `metrics.ingest`, `logs.ingest`).

-Set `TRACELOOP_BASE_URL` environment variable to the URL of your Dynatrace OpenTelemetry ingest endpoint.
+- Set `TRACELOOP_BASE_URL` environment variable to the URL of your Dynatrace OpenTelemetry ingest endpoint.

```
TRACELOOP_BASE_URL=https://<YOUR_ENV>.live.dynatrace.com/api/v2/otlp
```

-Set the `TRACELOOP_HEADERS` environment variable to include your previously created access token
+- Set the `TRACELOOP_HEADERS` environment variable to include your previously created access token

```
TRACELOOP_HEADERS=Authorization=Api-Token%20<YOUR_ACCESS_TOKEN>
```

-You're all set! All the exported spans along with their span attributes will show up within the Dynatrace trace view.
+- You're all set! All the exported spans along with their span attributes will show up within the Dynatrace trace view.
22 changes: 3 additions & 19 deletions fern/pages/openllmetry/integrations/grafana.mdx
@@ -1,24 +1,8 @@
---
-excerpt: LLM Observability with Grafana and OpenLLMetry
+title: LLM Observability with Grafana and OpenLLMetry
---

-## Access Grafana Cloud Account for Tempo Integration
-
-Go to the Grafana Cloud account page under `https://grafana.com/orgs/<your org name>`, and click on **Send Traces** under Tempo
-
-### Retrieve URL from Grafana Data Source Settings
-
-In **Grafana Data Source settings**, note the **URL** value
-
-### Generate API Key for Tempo Integration
-
-Click **Generate now** to generate an API key and copy it
-
-### Record Stack ID for Integration Configuration
-
-Note also the **Stack ID** value.
-
-You can find it in the URL `https://grafana.com/orgs/<Your Org Name>/stacks/<Stack ID>`.
+First, go to the Grafana Cloud account page under `https://grafana.com/orgs/<your org name>`, and click on **Send Traces** under Tempo. In **Grafana Data Source settings**, note the **URL** value. Click **Generate now** to generate an API key and copy it. Note also the **Stack ID** value. You can find it in the URL `https://grafana.com/orgs/<Your Org Name>/stacks/<Stack ID>`.

## With Grafana Agent

@@ -41,7 +25,7 @@ traces:
grpc:
```
<Callout intent="warning">
<Callout intent="info">
Note the endpoint. The URL you need to use is without `https` and the trailing `/`. So `https://tempo-us-central1.grafana.net/tempo` should be used as `tempo-us-central1.grafana.net:443`.
</Callout>
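
For context, the agent configuration collapsed above generally follows Grafana's standard `traces` block; the following is a sketch under that assumption, with placeholder credentials and the endpoint format from the note:

```yaml
traces:
  configs:
    - name: default
      remote_write:
        - endpoint: tempo-us-central1.grafana.net:443
          basic_auth:
            username: <Stack ID>
            password: <API Key>
      receivers:
        otlp:
          protocols:
            grpc:
```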

2 changes: 1 addition & 1 deletion fern/pages/openllmetry/integrations/honeycomb.mdx
@@ -1,5 +1,5 @@
---
-excerpt: LLM Observability with Honeycomb and OpenLLMetry
+title: LLM Observability with Honeycomb and OpenLLMetry
---

<Frame>
2 changes: 1 addition & 1 deletion fern/pages/openllmetry/integrations/hyperdx.mdx
@@ -1,5 +1,5 @@
---
-excerpt: LLM Observability with HyperDX and OpenLLMetry
+title: LLM Observability with HyperDX and OpenLLMetry
---

<Frame>
10 changes: 3 additions & 7 deletions fern/pages/openllmetry/integrations/instana.mdx
@@ -1,5 +1,5 @@
---
-excerpt: LLM Observability with Instana and OpenLLMetry
+title: LLM Observability with Instana and OpenLLMetry
---

<Frame>
@@ -8,16 +8,14 @@ excerpt: LLM Observability with Instana and OpenLLMetry

With Instana, you can export directly to an Instana Agent in your cluster. The Instana Agent will report the traces and metrics back to the Instana Backend and display them on the Instana UI.

-## Edit the agent config file

After an Instana OS agent is installed, edit the agent config file `configuration.yaml` under the `/opt/instana/agent/etc/instana` folder.

```bash
cd /opt/instana/agent/etc/instana
vi configuration.yaml
```

-## Add the following to the file
+Add the following to the file:

```yaml
com.instana.plugin.opentelemetry:
@@ -26,16 +24,14 @@ com.instana.plugin.opentelemetry:
enabled: true
```
-## Restart the Instana agent
+Restart the Instana agent:
```
systemctl restart instana-agent.service
```

The Instana agent should be ready for OpenTelemetry data at `port 4317`.

-## Set your TRACELOOP_BASE_URL variable

Finally, set this env var, and you’re done!

```
2 changes: 1 addition & 1 deletion fern/pages/openllmetry/integrations/new-relic.mdx
@@ -1,5 +1,5 @@
---
-excerpt: LLM observability with New Relic and OpenLLMetry
+title: LLM observability with New Relic and OpenLLMetry
---

<Frame>
@@ -1,5 +1,5 @@
---
-excerpt: LLM observability with OpenTelemetry Collector
+title: LLM observability with OpenTelemetry Collector
---

Since Traceloop is emitting standard OTLP HTTP (standard OpenTelemetry protocol), you can use any OpenTelemetry Collector, which gives you the flexibility to then connect to any backend you want. First, [deploy an OpenTelemetry Collector](https://opentelemetry.io/docs/kubernetes/operator/automatic/#create-an-opentelemetry-collector-optional) in your cluster. Then, point the output of the Traceloop SDK to the collector by setting:
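
The exact value is collapsed in this view; following the pattern used on the Splunk page below, it would presumably point at the collector's OTLP HTTP port (placeholder hostname, an assumption rather than the page's own text):

```
TRACELOOP_BASE_URL=http://<opentelemetry-collector>:4318
```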
2 changes: 1 addition & 1 deletion fern/pages/openllmetry/integrations/overview.mdx
@@ -17,7 +17,7 @@ Since Traceloop SDK is using OpenTelemetry under the hood, you can see everythin
<Card title="HyperDX" href="/docs/openllmetry/integrations/hyper-dx"/>
<Card title="Instana" href="/docs/openllmetry/integrations/instana"/>
<Card title="New Relic" href="/docs/openllmetry/integrations/new-relic"/>
<Card title="OpenTelemetry Collector" href="/docs/openllmetry/integrations/open-telemetry-collector"/>
<Card title="OpenTelemetry Collector" href="/docs/openllmetry/integrations/docs/open-telemetry-collector"/>
<Card title="Service Now Cloud Observability" href="/docs/openllmetry/integrations/service-now-cloud-observability"/>
<Card title="Sentry" href="/docs/openllmetry/integrations/sentry"/>
<Card title="SigNoz" href="/docs/openllmetry/integrations/sig-noz"/>
2 changes: 1 addition & 1 deletion fern/pages/openllmetry/integrations/sentry.mdx
@@ -1,5 +1,5 @@
---
-excerpt: LLM observability with Sentry
+title: LLM Observability with Sentry and OpenLLMetry
---

## Install Sentry SDK with OpenTelemetry support
2 changes: 1 addition & 1 deletion fern/pages/openllmetry/integrations/service-now.mdx
@@ -1,5 +1,5 @@
---
-excerpt: LLM Observability with Service Now Cloud Observability and OpenLLMetry
+title: LLM Observability with Service Now Cloud Observability and OpenLLMetry
---

<Frame>
2 changes: 1 addition & 1 deletion fern/pages/openllmetry/integrations/signoz.mdx
@@ -1,5 +1,5 @@
---
-excerpt: LLM Observability with SigNoz and OpenLLMetry
+title: LLM Observability with SigNoz and OpenLLMetry
---

<Frame>
16 changes: 4 additions & 12 deletions fern/pages/openllmetry/integrations/splunk.mdx
@@ -1,5 +1,5 @@
---
-excerpt: LLM Observability with Splunk and OpenLLMetr
+excerpt: LLM Observability with Splunk and OpenLLMetry
---

<Frame>
@@ -8,8 +8,6 @@ excerpt: LLM Observability with Splunk and OpenLLMetr

Collecting and analyzing LLM traces in [Splunk Observability Cloud](https://www.splunk.com/en_us/products/observability.html) can be achieved by configuring the `TRACELOOP_BASE_URL` environment variable to point to the [Splunk OpenTelemetry Collector](https://github.com/signalfx/splunk-otel-collector/releases) OTLP endpoint.

-## Configure Collector for OTLP Reception

Have the Collector run in agent or gateway mode and ensure the OTLP receiver is configured; see [Get data into Splunk Observability Cloud](https://docs.splunk.com/observability/en/gdi/get-data-in/get-data-in.html).

```yaml
@@ -22,9 +20,7 @@ receivers:
endpoint: "0.0.0.0:4318"
```
-## Set Up OTLP Exporter for Splunk Cloud
-Ensure the OTLP exporter is configured to send to Splunk Observability Cloud:
+Secondly, ensure the OTLP exporter is configured to send to Splunk Observability Cloud:
```yaml
exporters:
@@ -36,9 +32,7 @@ exporters:
num_consumers: 32
```
-## Integrate OTLP in Traces Pipeline
-Make sure otlp is defined in the traces pipeline:
+Thirdly, make sure otlp is defined in the traces pipeline:
```yaml
pipelines:
@@ -51,9 +45,7 @@ Make sure otlp is defined in the traces pipeline:
exporters: [sapm]
```
-## Define Endpoint Environment Variable
-Define the `TRACELOOP_BASE_URL` environment variable to point to the Splunk OpenTelemetry Collector OTLP endpoint:
+Finally, define the `TRACELOOP_BASE_URL` environment variable to point to the Splunk OpenTelemetry Collector OTLP endpoint:

```
TRACELOOP_BASE_URL=http://<splunk-otel-collector>:4318
2 changes: 1 addition & 1 deletion fern/pages/openllmetry/integrations/traceloop.mdx
@@ -1,5 +1,5 @@
---
-excerpt: LLM Observability with Traceloop
+title: LLM Observability with Traceloop
---

<Frame>
@@ -54,7 +54,7 @@ You can decide to selectively enable or disable prompt logging for specific users

### Using the Traceloop Platform

-We have an API to enable content tracing for specific users, as defined by [association entities](/docs/openllmetry/tracing/associating-entities-with-traces). See the [Traceloop API documentation](/dashboard-api/endpoints) for more information.
+We have an API to enable content tracing for specific users, as defined by [association entities](/docs/openllmetry/tracing/associating-entities-with-traces). See the [Traceloop API documentation](/docs/dashboard-api/endpoints) for more information.

### Without the Traceloop Platform

2 changes: 1 addition & 1 deletion fern/pages/openllmetry/privacy/telemetry.mdx
@@ -1,6 +1,6 @@
OpenLLMetry contains a telemetry feature that collects anonymous usage information.

<Callout intent="info">
<Callout intent="success">
Not to be confused with OpenTelemetry. Telemetry refers to anonymous product usage statistics we collect. It is a completely different stream of data, and is not related to OpenTelemetry, traces, or instrumentations.
</Callout>
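
The opt-out itself is collapsed in this view. To the best of our knowledge the SDK honors a `TRACELOOP_TELEMETRY` environment variable; treat the following as an assumption and confirm against the SDK docs:

```
TRACELOOP_TELEMETRY=FALSE
```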

15 changes: 9 additions & 6 deletions fern/pages/openllmetry/quickstart/go.mdx
@@ -2,8 +2,9 @@
excerpt: Install OpenLLMetry for Go by following these 3 easy steps and get instant monitoring.
---

+<Steps>

-## Install the SDK
+### Install the SDK

Run the following command in your terminal:

Expand All @@ -27,7 +28,7 @@ func main() {
}
```

-## Log your prompts
+### Log your prompts

<Frame>
![openllmetry-go](https://fern-image-hosting.s3.amazonaws.com/traceloop/openllmetry-go.png)
@@ -95,13 +96,13 @@ func call_llm() {

```

-## Configure Trace Exporting
+### Configure Trace Exporting

Lastly, you’ll need to configure where to export your traces. The 2 environment variables controlling this are `TRACELOOP_API_KEY` and `TRACELOOP_BASE_URL`.

-For Traceloop, read on. For other options, see [Exporting](/docs/openllmetry/integrations/overview).
+For Traceloop, read on. For other options, see [Exporting](/openllmetry/integrations/overview).
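
A shell sketch of the Traceloop-bound setup (the key is a placeholder, and `https://api.traceloop.com` is assumed to be the SDK's default endpoint):

```
export TRACELOOP_API_KEY=<your API key>
export TRACELOOP_BASE_URL=https://api.traceloop.com
```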

-## Using Traceloop Cloud
+<h3>Using Traceloop Cloud</h3>

Go to [Traceloop](https://app.traceloop.com/) and create a new account. Then, click on **Environments** on the left-hand navigation bar, or go directly to https://app.traceloop.com/settings/api-keys. Click **Generate API Key** to generate an API key for the development environment and click **Copy API Key** to copy it over.

@@ -115,4 +116,6 @@ Make sure to copy it as it won’t be shown again.

Set the copied Traceloop API key as an environment variable named `TRACELOOP_API_KEY` in your app.

-You're all set! You’ll get instant visibility into everything that’s happening with your LLM. If you’re calling a vector DB, or any other external service or database, you’ll also see it in the Traceloop dashboard.
+You're all set! You’ll get instant visibility into everything that’s happening with your LLM. If you’re calling a vector DB, or any other external service or database, you’ll also see it in the Traceloop dashboard.

+</Steps>