
Commit: changes to naming
MichaelUnkey committed Aug 14, 2024
1 parent 0b6414c commit 88f408a
Showing 15 changed files with 67 additions and 68 deletions.
5 changes: 2 additions & 3 deletions apps/www/components/blog/blog-code-block.tsx
@@ -92,8 +92,8 @@ export function BlogCodeBlockSingle({ className, children }: any) {
element.click();
}
return (
-<div className={cn(CN_BLOG_CODE_BLOCK, className)}>
-<div className="flex flex-row justify-end gap-4 mt-2 mr-4 border-white/10">
+<div className={cn(CN_BLOG_CODE_BLOCK, className, "pl-4 pb-4")}>
+<div className="flex flex-row justify-end gap-4 mt-2 mr-4 border-white/10 ">
<CopyButton value={copyData} />
<button
type="button"
@@ -109,7 +109,6 @@ export function BlogCodeBlockSingle({ className, children }: any) {
style={darkTheme}
showLineNumbers={true}
highlighter={"hljs"}
-s
>
{block.children}
</SyntaxHighlighter>
8 changes: 4 additions & 4 deletions apps/www/content/blog/announcing-unkey-cache-package.mdx
@@ -28,7 +28,7 @@ This performed well, but the developer experience left something to be desired.

Caching is a common requirement in many applications, but traditional approaches often fall short. Here's a typical example of what developers have to deal with:

-```ts
+```typescript
const cache = new Some3rdPartyCache(...)

type User = { email: string };
@@ -46,7 +46,7 @@ if (!user) {

`@unkey/cache` abstracts all the boilerplate away and gives you a clean API that is fully type-safe:

-```ts
+```typescript
const user = await cache.user.swr("chronark", async (id) => {
return await db.query.users.findFirst({
where: (table, { eq }) => eq(table.id, id),
@@ -78,7 +78,7 @@ npm install @unkey/cache
### Basic cache


-```ts
+```typescript
import { createCache, DefaultStatefulContext, Namespace } from "@unkey/cache";
import { MemoryStore } from "@unkey/cache/stores";

@@ -138,7 +138,7 @@ const user = await cache.user.swr("userId", async () => {

Tiered caching is a powerful feature that allows you to chain multiple caches together. This is useful when you want to use a fast, in-memory cache as the first tier and a slower, more persistent cache as the second tier.

-```ts
+```typescript
import { createCache, DefaultStatefulContext, Namespace } from "@unkey/cache";
import { CloudflareStore, MemoryStore } from "@unkey/cache/stores";

2 changes: 1 addition & 1 deletion apps/www/content/blog/fixing-serverless-with-a-vps.mdx
@@ -17,7 +17,7 @@ So why do we use serverless if it's limited by servers? Well in our case, we use

Our API emits multiple different events, such as the outcome of a verification, or a rate-limit being hit. We use Tinybird for all of our analytics, and we were sending each event individually to Tinybird. This is easy on the worker's side because you just fire and forget.

-```ts
+```typescript
executionContext.waitUntil(tinybird.ingestKeyVerification({ ... }))
```

2 changes: 1 addition & 1 deletion apps/www/content/blog/high-frequency-usage-billing.mdx
@@ -176,7 +176,7 @@ Once a month, a workflow starts to load all billable workspaces, load their usag

We simply store a `stripeCustomerId` and `subscription` column in our database and can query all workspaces easily:

-```ts title="query billable workspaces"
+```typescript title="query billable workspaces"
const workspaces = await io.runTask("list workspaces", async () =>
db.query.workspaces.findMany({
where: (table, { isNotNull, isNull, not, eq, and }) =>
10 changes: 5 additions & 5 deletions apps/www/content/blog/identities-beta.mdx
@@ -46,7 +46,7 @@ You can now do this by configuring 2 or more limits on the identity:
- A base level of 10,000 per day
Whichever limit is reached first will be enforced and the request will be rejected.

-```ts
+```typescript
{
ratelimits: [
{
@@ -70,7 +70,7 @@ The most common way of limiting is setting a limit on how many requests they may

In this example we offer LLM inference as a service via multiple models and want to limit the number of requests and tokens consumed by each model.

-```ts
+```typescript
{
ratelimits: [
// baseline ratelimit of 100 requests per second
@@ -122,7 +122,7 @@ An identity consists of an `externalId`, `meta` and `ratelimits`.
When you create a key, you can now assign it to an identity by providing the `identityId` or `externalId` in the key creation request. This will automatically group the key under the specified identity.
-```ts
+```json
{
"identityId": "id_123",
"externalId": "user_123"
@@ -136,7 +136,7 @@ If your identity is configured with metadata and/or ratelimits, you can now [ver
To use the configured ratelimits, you'll need to specify which one you want to use in the `ratelimits` parameter. You can optionally specify a `cost` for each ratelimit, which will be deducted from the ratelimit when the request is verified.
-```ts
+```bash
curl --request POST \
--url https://api.unkey.dev/v1/keys.verifyKey \
--header 'Content-Type: application/json' \
@@ -153,7 +153,7 @@ curl --request POST \

The response will include the metadata of the identity, which you can use to make decisions in your API handler.

-```ts
+```typescript
{
// ...
"valid": true,
12 changes: 6 additions & 6 deletions apps/www/content/blog/introducing-ratelimiting.mdx
@@ -19,7 +19,7 @@ To use our rate limit feature, you can use our API directly, an existing package

Firstly, you will want to configure the ratelimiter. Unkey uses namespaces to allow you to separate different parts of your application and have isolated limits for them. If the namespace doesn't exist, we will create it with the first request. For example, each of your tRPC routes could be a namespace. In this example, we use our `ai.generate` route.

-```tsx
+```typescript
import { Ratelimit } from "@unkey/ratelimit"

const unkey = new Ratelimit({
@@ -34,7 +34,7 @@ const unkey = new Ratelimit({

With the ratelimiter now configured, we can use it in a route to decide whether to proceed with handling the request or reject it with a 429 response. You will need to pass an identifier for the request; this could be anything, but commonly, it is a userId or an IP address.

-```tsx
+```typescript
async function handler(request) {
const identifier = request.getUserId();

@@ -62,7 +62,7 @@ You can enable async requests, which will sacrifice minimal accuracy and improve

**Ratelimit configuration**

-```tsx
+```typescript
const unkey = new Ratelimit({
// ...
async: true,
@@ -71,7 +71,7 @@ const unkey = new Ratelimit({

**Using the ratelimiter**

-```tsx
+```typescript
async function handler(request: NextApiRequest) {
const identifier = request.getUserId();

@@ -89,7 +89,7 @@ async function handler(request: NextApiRequest) {

Sometimes, you may have an expensive resource. We allow you to set a cost for the request, and we will deduct that cost from the current window and reject a request if it exceeds the allowed amount. For example:

-```tsx
+```typescript
async function handler(request: NextApiRequest) {
const identifier = request.getUserId();

@@ -109,7 +109,7 @@ This request would now cost 4 tokens versus 1, allowing you to be flexible about

Unkey provides audit logging out of the box at no additional cost to you. In cases where you want to create a paper trail, you can do so by providing the ratelimit request with resource details, for example:

-```tsx
+```typescript
async function handler(request: NextApiRequest) {
const identifier = request.getUserId();

2 changes: 1 addition & 1 deletion apps/www/content/blog/ocr-service.mdx
@@ -41,7 +41,7 @@ It takes an image either as a file or base64 and does OCR on it and returns the

OCR is done via an npm package [tesseract.js](https://www.npmjs.com/package/tesseract.js). Following is its function which takes in the image and recognizes English and Spanish languages.

-```tsx
+```typescript
const doOcr = async (image) => {
try {
// It detects English and Spanish
12 changes: 6 additions & 6 deletions apps/www/content/blog/ratelimit-trpc-routes.mdx
@@ -44,7 +44,7 @@ Now that the package is installed and our `.env` has been updated, we can config

In this example, we will configure our ratelimiter in the procedure itself. Of course, you can abstract this into a utility file if you prefer. First, we must import `Ratelimit` from the `@unkey/ratelimit` package and `TRPCError` and `env`.

-```ts
+```typescript
import { z } from "zod";

import {
@@ -60,7 +60,7 @@ import { Ratelimit } from "@unkey/ratelimit";

To configure the Ratelimiter, we need to pass four things along: the root key, the namespace, the limit, and the duration of our ratelimiting. Inside the mutation, add the following:

-```ts
+```typescript
const unkey = new Ratelimit({
rootKey: env.UNKEY_ROOT_KEY,
namespace: "posts.create",
@@ -75,13 +75,13 @@ The namespace can be anything, but we are using the tRPC route and procedure to

To use the ratelimit, we need an identifier. This can be anything you like, such as a user ID or an IP address. We will be using our user's ID as they are required to be logged in to create a new post. Then, we can call `unkey.limit` with the identifier, and unkey will return a boolean of true or false, which we can use to make a decision.

-```ts
+```typescript
const { success } = await unkey.limit(ctx.session.user.id);
```

So now that we have the boolean, we can check if it's false, throw a TRPCError telling the user they have been ratelimited, and stop any more logic from running.

-```ts
+```typescript
const { success } = await unkey.limit(ctx.session.user.id);

if (!success) {
@@ -95,7 +95,7 @@ At this point, our code is ready to test. Give it a whirl, and try posting multi

Unkey allows you to tell us how expensive a request should be. For example, maybe you have an AI route that costs you a lot more than any other route, so you want to reduce the number of requests that can be used.

-```ts
+```typescript
const { success } = await unkey.limit(ctx.session.user.id, {
cost: 3,
});
@@ -107,7 +107,7 @@ This request costs three instead of one, giving you extra flexibility around exp

Although Unkey response times are fast, there are some cases where you are willing to give up some accuracy in favor of quicker response times. You can use our `async` option, which has 98% accuracy, but we don't need to confirm the limit with the origin before returning a decision. You can set this either on the `limit` request or on the configuration itself.

-```ts
+```typescript
const unkey = new Ratelimit({
rootKey: env.UNKEY_ROOT_KEY,
namespace: "posts.create",
14 changes: 7 additions & 7 deletions apps/www/content/blog/ratelimiting-otp.mdx
@@ -28,7 +28,7 @@ Before we begin with the tutorial, it should be stated that OTP implementations

Let’s start with the sending of an OTP. Below is an insecure OTP implementation with a fake email that sends a random 6-digit code to the user via a next.js server action.

-```jsx
+```typescript
"use server";
import { randomInt } from "crypto";

@@ -70,14 +70,14 @@ export async function sendOTP(formData: FormData) {

First, you’ll need to install the `@unkey/ratelimit` package to your project and then add the following imports.

-```jsx
+```typescript
import { Ratelimit } from "@unkey/ratelimit";
import { headers } from "next/headers";
```

We will use the headers to retrieve the IP of the requester and use that as an identifier to limit against. Now we need to configure the ratelimiter.

-```jsx
+```typescript
const unkey = new Ratelimit({
rootKey: process.env.UNKEY_ROOT_KEY,
namespace: "otp-send",
@@ -93,7 +93,7 @@ The above code will configure a new namespace named `otp-send` if it doesn’t e
Now that we have our ratelimiter configured, we can modify the request to first retrieve the IP address; this will check for both the forwarded IP address and the real IP from the headers. We will use the forwarded IP first and fall back to the real IP.
-```jsx
+```typescript
export async function sendOTP(formData: FormData) {
try {
// check for forwarded
@@ -109,7 +109,7 @@ export async function sendOTP(formData: FormData) {
Now we have access to an identifier, and we can run our rate limit against it. Add the following code before checking if the user has provided an email.
-```jsx
+```typescript
const { success, reset } = await unkey.limit(
forwardedIP || realIP || "no-ip",
);
@@ -138,7 +138,7 @@ The endpoint that verifies an OTP has more potential for brute force attacks; se
This is where the flexibility of Unkey's ratelimiting comes into play; it is similar to the server action above. For example:
-```tsx
+```typescript
export async function verifyOTP(prevState: any, formData: FormData) {
try {
// check for forwarded
@@ -180,7 +180,7 @@ export async function verifyOTP(prevState: any, formData: FormData) {
You can set the limits and namespace to be different, allowing you to be more restrictive and keep all your analytical data separated, for example.
-```jsx
+```typescript
const unkey = new Ratelimit({
rootKey: process.env.UNKEY_ROOT_KEY!,
namespace: "otp-verify",
16 changes: 8 additions & 8 deletions apps/www/content/blog/secure-supabase-functions-using-unkey.mdx
@@ -59,7 +59,7 @@ supabase functions new hello-world

This command creates a function stub in your Supabase folder at `./functions/hello-world/index.ts`. This stub will have a function that returns the name passed in as data for the request.

-```tsx title="./functions/hello-world/index.ts"
+```typescript title="./functions/hello-world/index.ts"
import { serve } from "https://deno.land/[email protected]/http/server.ts";

console.log("Hello from Functions!");
@@ -97,7 +97,7 @@ After invoking your Edge Function, you should see the response `{ "message":"Hel

Now that we have a function, we must add Unkey to secure the endpoint. Supabase uses Deno, so instead of installing our npm package, we will use ESM CDN to provide the `verifyKey` function we need.

-```jsx {2} title="./functions/hello-world/index.ts"
+```typescript {2} title="./functions/hello-world/index.ts"
import { serve } from "https://deno.land/[email protected]/http/server.ts";
import { verifyKey } from "https://esm.sh/@unkey/api";
```
@@ -122,7 +122,7 @@ Unkey's `verifykey` lets you verify a key from your end users. We will return a

First, let's remove the boilerplate code from the function so we can work on adding Unkey.

-```jsx title="./functions/hello-world/index.ts"
+```typescript title="./functions/hello-world/index.ts"
import { serve } from "https://deno.land/[email protected]/http/server.ts";
import { verifyKey } from "https://esm.sh/@unkey/api";

@@ -131,7 +131,7 @@ serve(async (req) => {});

Next, we will wrap the `serve` function inside a try-catch. Just in case something goes wrong, we can handle that.

-```jsx title="./functions/hello-world/index.ts"
+```typescript title="./functions/hello-world/index.ts"
import { serve } from "https://deno.land/[email protected]/http/server.ts";
import { verifyKey } from "https://esm.sh/@unkey/api";

@@ -151,7 +151,7 @@ serve(async (req) => {

Inside our try, we can look for a header containing the user's API key. In this example we will use `x-unkey-api-key`, but you could call the header whatever you want. If there is no header, we will immediately return a 401.

-```jsx title="./functions/hello-world/index.ts"
+```typescript title="./functions/hello-world/index.ts"
import { serve } from "https://deno.land/[email protected]/http/server.ts";
import { verifyKey } from "https://esm.sh/@unkey/api";

@@ -174,7 +174,7 @@ serve(async (req) => {

The `verifyKey` function returns a `result` and `error`, making the logic easy to handle. Below is a simplified example of the verification flow.

-```tsx
+```typescript
const { result, error } = await verifyKey("key_123");
if (error) {
// handle potential network or bad request error
@@ -192,7 +192,7 @@ console.log(result);

Now you have a basic understanding of verification, let's add this to our Supabase function.

-```tsx title="./functions/hello-world/index.ts"
+```typescript title="./functions/hello-world/index.ts"
serve(async (req) => {
try {
const token = req.headers.get("x-unkey-api-key");
@@ -232,7 +232,7 @@ curl -XPOST -H 'Authorization: Bearer <SUPBASE_BEARER_TOKEN>' \
Adding CORS allows us to call our function from the frontend and decide what headers can be passed to our function. Inside your `functions` folder, add a file called `cors.ts`. Inside this cors file, we will tell the Supabase function which headers and origins are allowed.
-```tsx title="./functions/cors.ts"
+```typescript title="./functions/cors.ts"
export const corsHeaders = {
"Access-Control-Allow-Origin": "*",
"Access-Control-Allow-Headers":
2 changes: 1 addition & 1 deletion apps/www/content/blog/semantic-caching.mdx
@@ -65,7 +65,7 @@ the same question receive a cache hit. The below diagrams give an overview of th
In response to feedback from our AI customers, we're offering semantic caching now as part of Unkey. You can enable it now through
[signing up](https://unkey.dev/semantic-caching) and changing the baseUrl parameter of the OpenAI SDK:

-```jsx
+```typescript
const openai = new OpenAI({
apiKey: process.env.OPENAI_API_KEY,
baseURL: "https://<gateway>.llm.unkey.io", // change the baseUrl parameter to your gateway name