Integrate with Vercel AI SDK or React components #1231

Open
andrewhopper opened this issue Dec 11, 2024 · 2 comments

Comments

@andrewhopper

I'd like to call BAML directly from React client-side components to enable rapid prototyping.

Example pseudocode:


interface BamlReactProps {
    callback: (result: unknown) => unknown | Promise<unknown>
    prompt: string
    input_data: string
    output_format: string
}

export default function BamlContextProvider({ callback, prompt, input_data, output_format }: BamlReactProps) {
    // Hypothetical generic BAML entry point: run the prompt over the input,
    // then hand the result to the caller-supplied callback.
    const baml_ai_get_feedback = async () => {
        const ai_result = await b.execute(prompt, input_data, output_format);
        return callback(ai_result);
    };
    // ...expose baml_ai_get_feedback (e.g. via context) and render children...
}

import React, { useState } from 'react';
import BamlContext from 'baml-react-context';

interface TodoFeedbackProps {
    todo: string
}

export function TodoFeedback({ todo }: TodoFeedbackProps) {
    const [feedback, setFeedback] = useState<string[]>([]);

    return (
        <BamlContext prompt="Provide feedback for the todo input" in={todo} out={setFeedback}>
            <div className="flex flex-wrap gap-2">
                {feedback.map((item, index) => (
                    <div
                        key={index}
                        className="px-3 py-1 text-sm bg-gray-200 rounded-full hover:bg-gray-300 transition-colors"
                    >
                        {item}
                    </div>
                ))}
            </div>
        </BamlContext>
    );
}




interface Todo {
    text: string
    completed: boolean
}

export default function AddTodo() {
    const [todo, setTodo] = useState('');
    const [todos, setTodos] = useState<Todo[]>([]);

    // Add the current input as a new todo and clear the field.
    const handleSubmit = (e: React.FormEvent) => {
        e.preventDefault();
        if (!todo.trim()) return;
        setTodos([...todos, { text: todo, completed: false }]);
        setTodo('');
    };

    const toggleTodo = (index: number) =>
        setTodos(todos.map((t, i) => (i === index ? { ...t, completed: !t.completed } : t)));

    const deleteTodo = (index: number) =>
        setTodos(todos.filter((_, i) => i !== index));

    return (
        <div className="flex flex-col gap-4">
            <form className="flex gap-2" onSubmit={handleSubmit}>
                <input
                    type="text"
                    placeholder="Add a new todo..."
                    className="flex-1 px-4 py-2 border rounded-lg focus:outline-none focus:ring-2 focus:ring-blue-500"
                    value={todo}
                    onChange={(e) => setTodo(e.target.value)}
                />
                <button
                    type="submit"
                    className="px-4 py-2 text-white bg-blue-500 rounded-lg hover:bg-blue-600 focus:outline-none focus:ring-2 focus:ring-blue-500"
                >
                    Add Todo
                </button>
                {/* Ask BAML for feedback on the todo being typed */}
                <TodoFeedback todo={todo} />
            </form>
            <ul className="space-y-2">
                {todos.map((item, index) => (
                    <li key={index} className="flex items-center gap-2">
                        <input
                            type="checkbox"
                            checked={item.completed}
                            onChange={() => toggleTodo(index)}
                            className="w-4 h-4"
                        />
                        <span className={item.completed ? 'line-through' : ''}>
                            {item.text}
                        </span>
                        <button
                            onClick={() => deleteTodo(index)}
                            className="ml-auto text-red-500 hover:text-red-600"
                        >
                            Delete
                        </button>
                    </li>
                ))}
            </ul>
        </div>
    );
}
@hellovai
Contributor

Great idea. Curious: how do you plan on dealing with API keys? The primary reason we didn't do this is that most folks have to add AI models to their backends anyway, so direct web integration felt like it wouldn't be practical.

Or is the idea to run BAML as a web REST service and then produce a JS client to talk to it?
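
For illustration, the browser side of that REST setup could be a thin wrapper around fetch. This is only a sketch under stated assumptions: the /api/baml/GetTodoFeedback route, the payload shape, and the function name are placeholders, not an existing BAML API.

// Hypothetical JS client for a BAML function exposed over HTTP.
// Endpoint path, payload shape, and return type are placeholders.
export async function getTodoFeedback(todo: string): Promise<string[]> {
    const res = await fetch('/api/baml/GetTodoFeedback', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({ todo }),
    });
    if (!res.ok) throw new Error(`BAML call failed: ${res.status}`);
    return res.json();
}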

@andrewhopper
Author

When developing locally I would give the frontend direct keys to the models. When deploying, I'd either move the execution to the server or build a lightweight proxy to the LLM that integrates with the frontend auth via something like a Supabase or Clerk JWT-based session.
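
As a concrete sketch of that deployed path, assuming a Next.js App Router route handler and BAML's generated TypeScript client: the GetTodoFeedback function and the verifyJwt helper below are hypothetical placeholders for whatever BAML function and Supabase/Clerk session check a real app would use.

// app/api/feedback/route.ts — lightweight proxy in front of BAML (sketch).
// Assumes Next.js; `GetTodoFeedback` and `verifyJwt` are hypothetical.
import { NextResponse } from 'next/server';
import { b } from '@/baml_client';
import { verifyJwt } from './auth';

export async function POST(req: Request) {
    // Only forward requests from authenticated frontend sessions.
    const token = req.headers.get('authorization')?.replace('Bearer ', '');
    if (!token || !(await verifyJwt(token))) {
        return NextResponse.json({ error: 'unauthorized' }, { status: 401 });
    }

    const { todo } = await req.json();
    // Model API keys stay on the server; the browser only ever sees this route.
    const feedback = await b.GetTodoFeedback(todo);
    return NextResponse.json(feedback);
}

A component like TodoFeedback above would then call this route instead of holding model keys in the browser.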
