Commit

docs: update readme
rxliuli committed Aug 24, 2024
1 parent 9dbed19 commit d37acf5
Showing 2 changed files with 12 additions and 0 deletions.
6 changes: 6 additions & 0 deletions README.md
@@ -85,3 +85,9 @@ const response = await openai.chat.completions.create({

console.log(response)
```

## Motivation

I'm using the Anthropic models on Vertex AI, but I found that many LLM tools don't support configuring them directly. That prompted me to develop this API proxy: with it, I can seamlessly use other AI models in any tool that supports the OpenAI API.

Although there are commercial services that resell LLM tokens, they usually require routing requests through their own servers, and there's no need for yet another third party to know how I'm using these models. This proxy can be deployed to any Edge Runtime environment, such as Cloudflare Workers, whose free plan offers individuals up to 100,000 requests per day.
6 changes: 6 additions & 0 deletions README.zh-CN.md
@@ -85,3 +85,9 @@ const response = await openai.chat.completions.create({

console.log(response)
```

## 动机

我正在使用 Vertex AI 的 Anthropic 模型,但发现许多 LLM 工具不支持直接配置它。这促使我萌生了开发一个 API 代理的想法。通过这个代理,我可以在任何支持 OpenAI API 的工具中无缝使用其他 AI 模型。

虽然已经有一些转卖 LLM token 的商业服务,但它们通常需要通过自己的服务器中转请求,而我没必要让另一个第三方知道我是如何使用这些模型的。这个代理可以部署到任何 Edge Runtime 环境,例如 Cloudflare Workers,它的免费套餐为个人提供每天多达 100,000 次的请求额度。
