(chore): prefix links to point to /docs #29

Merged · 2 commits · Apr 9, 2024
2 changes: 1 addition & 1 deletion fern/definition/gdpr-privacy.yml
@@ -22,7 +22,7 @@ service:
docs: |
By default, all prompts and responses are logged.

- If you’ve disabled this behavior by following [this guide](/open-ll-metry/privacy/prompts-completions-and-embeddings), and then selectively enabled it for some of your users, then you can use this API to view which users you’ve enabled.
+ If you’ve disabled this behavior by following [this guide](/docs/openllmetry/privacy/prompts-completions-and-embeddings), and then selectively enabled it for some of your users, then you can use this API to view which users you’ve enabled.
path: "/data-deletion"
request:
name: GetDeletionStatusRequest
6 changes: 3 additions & 3 deletions fern/definition/tracing.yml
@@ -9,7 +9,7 @@ service:
docs: |
By default, all prompts and responses are logged.

- If you want to disable this behavior by following [this guide](/open-ll-metry/privacy/prompts-completions-and-embeddings), you can selectively enable it for some of your users with this API.
+ If you want to disable this behavior by following [this guide](/docs/openllmetry/privacy/prompts-completions-and-embeddings), you can selectively enable it for some of your users with this API.
path: "/tracing-allow-list"
method: POST
request: EnableLogging
@@ -20,7 +20,7 @@ service:
docs: |
By default, all prompts and responses are logged.

- If you’ve disabled this behavior by following [this guide](/open-ll-metry/privacy/prompts-completions-and-embeddings), and then selectively enabled it for some of your users, then you can use this API to view which users you’ve enabled.
+ If you’ve disabled this behavior by following [this guide](/docs/openllmetry/privacy/prompts-completions-and-embeddings), and then selectively enabled it for some of your users, then you can use this API to view which users you’ve enabled.
path: "/tracing-allow-list"
method: GET
response: GetIdentifiersResponse
@@ -32,7 +32,7 @@ service:
docs: |
By default, all prompts and responses are logged.

- If you’ve disabled this behavior by following [this guide](/open-ll-metry/privacy/prompts-completions-and-embeddings), and then selectively enabled it for some of your users, then you can use this API to disable it for previously enabled ones.
+ If you’ve disabled this behavior by following [this guide](/docs/openllmetry/privacy/prompts-completions-and-embeddings), and then selectively enabled it for some of your users, then you can use this API to disable it for previously enabled ones.
path: "/tracing-allow-list"
method: DELETE
request: DisableLogging
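For context on what this service definition exposes: a minimal sketch of calling the three `/tracing-allow-list` endpoints with Python's `requests`. The base URL, bearer-token auth, and request-body shape are assumptions for illustration, not taken from this diff — check the Dashboard API reference for the actual schema.

```python
import requests

BASE_URL = "https://api.traceloop.com"  # assumed host
HEADERS = {"Authorization": "Bearer <TRACELOOP_API_KEY>"}  # assumed auth scheme

# POST /tracing-allow-list — selectively enable prompt/response logging for a user
requests.post(
    f"{BASE_URL}/tracing-allow-list",
    headers=HEADERS,
    json={"identifiers": ["user-123"]},  # hypothetical body shape
)

# GET /tracing-allow-list — view which users you've enabled
enabled = requests.get(f"{BASE_URL}/tracing-allow-list", headers=HEADERS).json()

# DELETE /tracing-allow-list — disable a previously enabled user
requests.delete(
    f"{BASE_URL}/tracing-allow-list",
    headers=HEADERS,
    json={"identifiers": ["user-123"]},
)
```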
4 changes: 2 additions & 2 deletions fern/docs.yml
@@ -1,6 +1,6 @@
instances:
- url: traceloop.docs.buildwithfern.com/docs
- custom-domain: traceloop.com/docs
+ custom-domain: fern.traceloop.com/docs

title: Traceloop | Docs

@@ -15,7 +15,7 @@ tabs:
api:
display-name: Dashboard API
icon: "fa-duotone fa-webhook"

slug: dashboard-api
navigation:
- tab: docs
layout:
8 changes: 4 additions & 4 deletions fern/pages/documentation/learn/intro.mdx
@@ -19,28 +19,28 @@ Traceloop natively plugs into OpenLLMetry SDK. To get started, pick the language
<Card
title="Python"
icon="fa-brands fa-python"
href="/open-ll-metry/quick-start/python"
href="/docs/openllmetry/quick-start/python"
>
Available
</Card>
<Card
title="Javascript / Typescript"
icon="fa-brands fa-node"
href="/open-ll-metry/quick-start/node-js"
href="/docs/openllmetry/quick-start/node-js"
>
Available
</Card>
<Card
title="Go"
icon="fa-brands fa-golang"
href="/open-ll-metry/quick-start/go"
href="/docs/openllmetry/quick-start/go"
>
Beta
</Card>
<Card
title="Ruby"
icon="fa-regular fa-gem"
href="/open-ll-metry/quick-start/ruby"
href="/docs/openllmetry/quick-start/ruby"
>
Beta
</Card>
@@ -10,7 +10,7 @@ The default poll interval is 60 seconds but can be configured with the `TRACELOO

To disable polling altogether, set the `TRACELOOP_SYNC_ENABLED` environment variable to false (it’s enabled by default).

- Make sure you’ve configured the SDK with the right environment and API Key. See the [SDK documentation](/open-ll-metry/integrations/traceloop) for more information.
+ Make sure you’ve configured the SDK with the right environment and API Key. See the [SDK documentation](/docs/openllmetry/integrations/traceloop) for more information.

<Callout intent="info">
The SDK uses smart caching mechanisms to provide zero latency for fetching prompts.
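As a concrete illustration of the polling behavior described above — a minimal sketch, assuming the Python SDK, of turning sync off via the environment variable named in the docs:

```python
import os

# Named in the docs above; disables prompt-registry polling entirely
os.environ["TRACELOOP_SYNC_ENABLED"] = "false"

from traceloop.sdk import Traceloop

Traceloop.init()  # SDK now starts without the background prompt sync
```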
@@ -53,7 +53,7 @@ Here, you can see all recent prompt versions, and which environments they are de
As a safeguard, you cannot deploy a prompt to the `Staging` environment before first deploying it to `Development`. Similarly, you cannot deploy to `Production` without first deploying to `Staging`.
</Callout>

- To fetch prompts from a specific environment, you must supply that environment’s API key to the Traceloop SDK. See the [SDK Configuration](/open-ll-metry/integrations/traceloop) for details.
+ To fetch prompts from a specific environment, you must supply that environment’s API key to the Traceloop SDK. See the [SDK Configuration](/docs/openllmetry/integrations/traceloop) for details.

## Prompt Versions

6 changes: 3 additions & 3 deletions fern/pages/documentation/prompt-management/quickstart.mdx
@@ -1,11 +1,11 @@
- <div align="center">
+ <Frame>
  ![prompt-management-quickstart](https://fern-image-hosting.s3.amazonaws.com/traceloop/prompt-management-quickstart.png)
- </div>
+ </Frame>

You can use Traceloop to manage your prompts and model configurations. That way you can easily experiment with different prompts, and roll out changes gradually and safely.

<Callout intent="info">
- Make sure you’ve created an API key and set it as an environment variable `TRACELOOP_API_KEY` before you start. Check out the SDK’s [getting started guide](/open-ll-metry/quick-start/python) for more information.
+ Make sure you’ve created an API key and set it as an environment variable `TRACELOOP_API_KEY` before you start. Check out the SDK’s [getting started guide](/docs/openllmetry/quick-start/python) for more information.
</Callout>

<Steps>
@@ -8,4 +8,4 @@ Since Traceloop is emitting standard OTLP HTTP (standard OpenTelemetry protocol)
TRACELOOP_BASE_URL=https://<opentelemetry-collector-hostname>:4318
```

- You can connect your collector to Traceloop by following the instructions in the [Traceloop integration section](/open-ll-metry/integrations/traceloop#using-an-opentelemetry-collector).
+ You can connect your collector to Traceloop by following the instructions in the [Traceloop integration section](/docs/openllmetry/integrations/traceloop#using-an-opentelemetry-collector).
30 changes: 15 additions & 15 deletions fern/pages/openllmetry/integrations/overview.mdx
@@ -7,19 +7,19 @@ Since Traceloop SDK is using OpenTelemetry under the hood, you can see everythin
## The Integrations Catalog

<Cards>
<Card title="Traceloop" href="/open-ll-metry/integrations/traceloop"/>
<Card title="Axiom" href="/open-ll-metry/integrations/axiom"/>
<Card title="Azure Application Insights" href="/open-ll-metry/integrations/azure-applications-insights"/>
<Card title="Datadog" href="/open-ll-metry/integrations/datadog"/>
<Card title="Dynatrace" href="/open-ll-metry/integrations/dynatrace"/>
<Card title="Grafana Tempo" href="/open-ll-metry/integrations/grafana"/>
<Card title="Honeycomb" href="/open-ll-metry/integrations/honeycomb"/>
<Card title="HyperDX" href="/open-ll-metry/integrations/hyper-dx"/>
<Card title="Instana" href="/open-ll-metry/integrations/instana"/>
<Card title="New Relic" href="/open-ll-metry/integrations/new-relic"/>
<Card title="OpenTelemetry Collector" href="/open-ll-metry/integrations/open-telemetry-collector"/>
<Card title="Service Now Cloud Observability" href="/open-ll-metry/integrations/service-now-cloud-observability"/>
<Card title="Sentry" href="/open-ll-metry/integrations/sentry"/>
<Card title="SigNoz" href="/open-ll-metry/integrations/sig-noz"/>
<Card title="Splunk" href="/open-ll-metry/integrations/splunk"/>
<Card title="Traceloop" href="/docs/openllmetry/integrations/traceloop"/>
<Card title="Axiom" href="/docs/openllmetry/integrations/axiom"/>
<Card title="Azure Application Insights" href="/docs/openllmetry/integrations/azure-applications-insights"/>
<Card title="Datadog" href="/docs/openllmetry/integrations/datadog"/>
<Card title="Dynatrace" href="/docs/openllmetry/integrations/dynatrace"/>
<Card title="Grafana Tempo" href="/docs/openllmetry/integrations/grafana"/>
<Card title="Honeycomb" href="/docs/openllmetry/integrations/honeycomb"/>
<Card title="HyperDX" href="/docs/openllmetry/integrations/hyper-dx"/>
<Card title="Instana" href="/docs/openllmetry/integrations/instana"/>
<Card title="New Relic" href="/docs/openllmetry/integrations/new-relic"/>
<Card title="OpenTelemetry Collector" href="/docs/openllmetry/integrations/open-telemetry-collector"/>
<Card title="Service Now Cloud Observability" href="/docs/openllmetry/integrations/service-now-cloud-observability"/>
<Card title="Sentry" href="/docs/openllmetry/integrations/sentry"/>
<Card title="SigNoz" href="/docs/openllmetry/integrations/sig-noz"/>
<Card title="Splunk" href="/docs/openllmetry/integrations/splunk"/>
</Cards>
2 changes: 1 addition & 1 deletion fern/pages/openllmetry/integrations/traceloop.mdx
@@ -55,6 +55,6 @@ excerpt: LLM Observability with Traceloop
exporters: [otlp/traceloop]
```

- You can route OpenLLMetry to your collector by following the [OpenTelemetry Collector](/open-ll-metry/integrations/open-telemetry-collector) integration instructions.
+ You can route OpenLLMetry to your collector by following the [OpenTelemetry Collector](/docs/openllmetry/integrations/open-telemetry-collector) integration instructions.
</Tab>
</Tabs>
12 changes: 6 additions & 6 deletions fern/pages/openllmetry/intro/what-is-llmetry.mdx
@@ -53,22 +53,22 @@ You can use OpenLLMetry whether you use a framework like LangChain, or directly
## Getting Started

<Cards>
<Card title="Start with Python" icon="fa-brands fa-python" href="/open-ll-metry/quick-start/python">
<Card title="Start with Python" icon="fa-brands fa-python" href="/docs/openllmetry/quick-start/python">
Set up Traceloop Python SDK in your project
</Card>
<Card title="Start with Javascript / Typescript" icon="fa-brands fa-node" href="/open-ll-metry/quick-start/node-js">
<Card title="Start with Javascript / Typescript" icon="fa-brands fa-node" href="/docs/openllmetry/quick-start/node-js">
Set up Traceloop Javascript SDK in your project
</Card>
<Card title="Start with Go" icon="fa-brands fa-golang" href="/open-ll-metry/quick-start/go">
<Card title="Start with Go" icon="fa-brands fa-golang" href="/docs/openllmetry/quick-start/go">
Set up Traceloop Go SDK in your project
</Card>
<Card title="Workflows, Agents, and Tools" icon="fa-regular fa-code" href="/open-ll-metry/tracing/workflows-tasks-agents-and-tools">
<Card title="Workflows, Agents, and Tools" icon="fa-regular fa-code" href="/docs/openllmetry/tracing/workflows-tasks-agents-and-tools">
Learn how to annotate your code to enrich your traces
</Card>
<Card title="Integrations" icon="fa-regular fa-bars-staggered" href="/open-ll-metry/integrations/overview">
<Card title="Integrations" icon="fa-regular fa-bars-staggered" href="/docs/openllmetry/integrations/overview">
Learn how to connect to your existing observability stack
</Card>
<Card title="Privacy" icon="fa-regular fa-shield" href="/open-ll-metry/privacy/prompts-completions-and-embeddings">
<Card title="Privacy" icon="fa-regular fa-shield" href="/docs/openllmetry/privacy/prompts-completions-and-embeddings">
How we secure your data
</Card>
</Cards>
@@ -56,7 +56,7 @@ You can decide to selectively enable or disable prompt logging for specific user

<Tabs>
<Tab title="Using the Traceloop Platform">
- We have an API to enable content tracing for specific users, as defined by [association entities](/open-ll-metry/tracing/associating-entities-with-traces). See the [Traceloop API documentation](/dashboard-api/endpoints) for more information.
+ We have an API to enable content tracing for specific users, as defined by [association entities](/docs/openllmetry/tracing/associating-entities-with-traces). See the [Traceloop API documentation](/docs/dashboard-api/endpoints) for more information.
</Tab>
<Tab title="Without the Traceloop Platform">
Set a key called `override_enable_content_tracing` in the OpenTelemetry context to `True` right before making the LLM call you want to trace with prompts. This will create a new context that will instruct instrumentations to log prompts and completions as span attributes.
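The second tab’s instructions translate to a short sketch using the OpenTelemetry context API (Python shown; the LLM call itself is a placeholder):

```python
from opentelemetry.context import attach, detach, set_value

# Turn prompt/completion logging on for just this call, then restore the context
token = attach(set_value("override_enable_content_tracing", True))
try:
    response = call_llm("What's the weather like?")  # placeholder LLM call
finally:
    detach(token)
```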
2 changes: 1 addition & 1 deletion fern/pages/openllmetry/quickstart/go.mdx
@@ -99,7 +99,7 @@ func call_llm() {

Lastly, you’ll need to configure where to export your traces. The 2 environment variables controlling this are `TRACELOOP_API_KEY` and `TRACELOOP_BASE_URL`.

- For Traceloop, read on. For other options, see [Exporting](/open-ll-metry/integrations/overview).
+ For Traceloop, read on. For other options, see [Exporting](/docs/openllmetry/integrations/overview).

### Using Traceloop Cloud

6 changes: 3 additions & 3 deletions fern/pages/openllmetry/quickstart/next.mdx
@@ -158,7 +158,7 @@ You can check out our full working example with Next.js 13 [here](https://github

If you have complex workflows or chains, you can annotate them to get a better understanding of what’s going on. You’ll see the complete trace of your workflow on Traceloop or any other dashboard you’re using.

- We have a set of [methods and decorators](/open-ll-metry/tracing/workflows-tasks-agents-and-tools) to make this easier. Assume you have a function that renders a prompt and calls an LLM, simply wrap it in a `withWorkflow()` function call.
+ We have a set of [methods and decorators](/docs/openllmetry/tracing/workflows-tasks-agents-and-tools) to make this easier. Assume you have a function that renders a prompt and calls an LLM, simply wrap it in a `withWorkflow()` function call.

We also have compatible Typescript decorators for class methods which are more convenient.

@@ -188,13 +188,13 @@ You can check out our full working example with Next.js 13 [here](https://github
</CodeBlock>
</CodeBlocks>

- For more information, see the [dedicated section in the docs](/open-ll-metry/tracing/workflows-tasks-agents-and-tools).
+ For more information, see the [dedicated section in the docs](/docs/openllmetry/tracing/workflows-tasks-agents-and-tools).

### Configure Trace Exporting

Lastly, you’ll need to configure where to export your traces. The 2 environment variables controlling this are `TRACELOOP_API_KEY` and `TRACELOOP_BASE_URL`.

- For Traceloop, read on. For other options, see [Exporting](/open-ll-metry/integrations/overview).
+ For Traceloop, read on. For other options, see [Exporting](/docs/openllmetry/integrations/overview).

### Using Traceloop Cloud

6 changes: 3 additions & 3 deletions fern/pages/openllmetry/quickstart/node.mdx
@@ -54,7 +54,7 @@ traceloop.initialize({ disableBatch: true });

If you have complex workflows or chains, you can annotate them to get a better understanding of what’s going on. You’ll see the complete trace of your workflow on Traceloop or any other dashboard you’re using.

- We have a set of [methods and decorators](/open-ll-metry/tracing/workflows-tasks-agents-and-tools) to make this easier. Assume you have a function that renders a prompt and calls an LLM, simply wrap it in a `withWorkflow()` function call.
+ We have a set of [methods and decorators](/docs/openllmetry/tracing/workflows-tasks-agents-and-tools) to make this easier. Assume you have a function that renders a prompt and calls an LLM, simply wrap it in a `withWorkflow()` function call.

We also have compatible Typescript decorators for class methods which are more convenient.

@@ -84,13 +84,13 @@ If you’re using an LLM framework like Haystack, Langchain or LlamaIndex - we
</CodeBlock>
</CodeBlocks>

- For more information, see the [dedicated section in the docs](/open-ll-metry/tracing/workflows-tasks-agents-and-tools).
+ For more information, see the [dedicated section in the docs](/docs/openllmetry/tracing/workflows-tasks-agents-and-tools).

### Configure Trace Exporting

Lastly, you’ll need to configure where to export your traces. The 2 environment variables controlling this are `TRACELOOP_API_KEY` and `TRACELOOP_BASE_URL`.

- For Traceloop, read on. For other options, see [Exporting](/open-ll-metry/integrations/overview).
+ For Traceloop, read on. For other options, see [Exporting](/docs/openllmetry/integrations/overview).

### Using Traceloop Cloud

6 changes: 3 additions & 3 deletions fern/pages/openllmetry/quickstart/python.mdx
@@ -45,7 +45,7 @@ Traceloop.init(disable_batch=True)

If you have complex workflows or chains, you can annotate them to get a better understanding of what’s going on. You’ll see the complete trace of your workflow on Traceloop or any other dashboard you’re using.

- We have a set of [decorators](/open-ll-metry/tracing/workflows-tasks-agents-and-tools) to make this easier. Assume you have a function that renders a prompt and calls an LLM, simply add `@workflow` (or for asynchronous methods - `@aworkflow`).
+ We have a set of [decorators](/docs/openllmetry/tracing/workflows-tasks-agents-and-tools) to make this easier. Assume you have a function that renders a prompt and calls an LLM, simply add `@workflow` (or for asynchronous methods - `@aworkflow`).

<Callout intent="info">
If you’re using an LLM framework like Haystack, Langchain or LlamaIndex - we’ll do that for you. No need to add any annotations to your code.
@@ -60,13 +60,13 @@ def suggest_answers(question: str):
...
```
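
Filling out the truncated snippet above — a minimal runnable sketch of the `@workflow` decorator from the Traceloop Python SDK; the OpenAI model and prompt are illustrative only:

```python
from openai import OpenAI
from traceloop.sdk import Traceloop
from traceloop.sdk.decorators import workflow

Traceloop.init(disable_batch=True)
client = OpenAI()

@workflow(name="suggest_answers")  # traced as a single workflow span
def suggest_answers(question: str) -> str:
    completion = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model choice
        messages=[{"role": "user", "content": question}],
    )
    return completion.choices[0].message.content
```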

- For more information, see the [dedicated section in the docs](/open-ll-metry/tracing/workflows-tasks-agents-and-tools).
+ For more information, see the [dedicated section in the docs](/docs/openllmetry/tracing/workflows-tasks-agents-and-tools).

### Configure trace exporting

Lastly, you’ll need to configure where to export your traces. The 2 environment variables controlling this are `TRACELOOP_API_KEY` and `TRACELOOP_BASE_URL`.

- For Traceloop, read on. For other options, see [Exporting](/open-ll-metry/integrations/overview).
+ For Traceloop, read on. For other options, see [Exporting](/docs/openllmetry/integrations/overview).


### Using Traceloop Cloud
2 changes: 1 addition & 1 deletion fern/pages/openllmetry/quickstart/ruby.mdx
@@ -72,7 +72,7 @@ end

Lastly, you’ll need to configure where to export your traces. The 2 environment variables controlling this are `TRACELOOP_API_KEY` and `TRACELOOP_BASE_URL`.

- For Traceloop, read on. For other options, see [Exporting](/open-ll-metry/integrations/overview).
+ For Traceloop, read on. For other options, see [Exporting](/docs/openllmetry/integrations/overview).

### Using Traceloop Cloud

4 changes: 2 additions & 2 deletions fern/pages/openllmetry/quickstart/sdk-initialization.mdx
@@ -41,7 +41,7 @@ This defines the OpenTelemetry endpoint to connect to. It defaults to https://ap

If you prefix it with `http` or `https`, it will use the OTLP/HTTP protocol. Otherwise, it will use the OTLP/GRPC protocol.

- For configuring this to a different observability platform, check out our [integrations section](/open-ll-metry/integrations/overview).
+ For configuring this to a different observability platform, check out our [integrations section](/docs/openllmetry/integrations/overview).

<Callout intent="info">
The OpenTelemetry standard defines that the actual endpoint should always end with `/v1/traces`. Thus, if you specify a base URL, we always append `/v1/traces` to it. This is similar to how `OTEL_EXPORTER_OTLP_ENDPOINT` works in all OpenTelemetry SDKs.
@@ -69,7 +69,7 @@ The OpenTelemetry standard defines that the actual endpoint should always end wi

If set, this is sent as a bearer token on the Authorization header.

- [Traceloop](/open-ll-metry/integrations/traceloop), for example, uses this to authenticate incoming traces and requests.
+ [Traceloop](/docs/openllmetry/integrations/traceloop), for example, uses this to authenticate incoming traces and requests.

<Callout intent="info">
If this is not set, and the base URL is set to `https://api.traceloop.com`, the SDK will generate a new API key automatically with the Traceloop dashboard.
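Putting this page’s options together — a minimal sketch, with placeholder values, of routing the Python SDK to a custom endpoint:

```python
import os

# An http(s) prefix selects OTLP/HTTP; the SDK appends /v1/traces itself
os.environ["TRACELOOP_BASE_URL"] = "https://collector.example.com:4318"
os.environ["TRACELOOP_API_KEY"] = "<your-key>"  # sent as a Bearer token if set

from traceloop.sdk import Traceloop

Traceloop.init(app_name="my-app")
```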