Ollama + Mixtral + Langchain example (deployed to Cloudflare)
You can use Cloudflare Tunnel to open a secure HTTPS tunnel to a running Ollama instance on your local computer.
Using Cloudflare Workers, you can connect the Ollama instance to Langchain and use cutting-edge models like Mixtral.
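Below is a minimal sketch of what such a Worker could look like, assuming the `@langchain/community` package, a `TUNNEL` secret holding the tunnel URL, and a `mixtral` model already pulled into the local Ollama instance. The file layout and names are illustrative, not the exact template code, and the import path may differ depending on your LangChain version.

```ts
// src/index.ts — sketch of a Worker that proxies requests to a local
// Ollama instance (running Mixtral) through a Cloudflare Tunnel via LangChain.
import { ChatOllama } from "@langchain/community/chat_models/ollama";

export interface Env {
  // Set with: echo "YOURURL" | npx wrangler secret put TUNNEL
  TUNNEL: string;
}

export default {
  async fetch(request: Request, env: Env): Promise<Response> {
    const { searchParams } = new URL(request.url);
    const query = searchParams.get("query") ?? "Why is the sky blue?";

    // Point LangChain's Ollama chat model at the tunnel URL instead of localhost.
    const model = new ChatOllama({
      baseUrl: env.TUNNEL,
      model: "mixtral",
    });

    // Stream tokens back to the client as they arrive from the local model.
    const stream = await model.stream(query);
    const encoder = new TextEncoder();
    const body = new ReadableStream({
      async start(controller) {
        for await (const chunk of stream) {
          controller.enqueue(encoder.encode(chunk.content as string));
        }
        controller.close();
      },
    });

    return new Response(body, {
      headers: { "content-type": "text/plain; charset=utf-8" },
    });
  },
};
```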
- Configure your Workers project and deploy it (see "Get started" if you don't know how).
- Install `cloudflared`.
- Start a new tunnel with `tunnel.sh`.
- Find the output of your new Cloudflare Tunnel and set it as the `TUNNEL` secret in your Workers project: `echo "YOURURL" | npx wrangler secret put TUNNEL`
- Request `GET $workersURL/` with an optional query param `query` to directly query the model on your local machine. The response is streaming! (See the client sketch after this list.)
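As a quick way to exercise the deployed endpoint, here is a hedged sketch of a small client (Node 18+, where `fetch` is global) that reads the streamed response; `your-worker.example.workers.dev` is a placeholder for your own `$workersURL`.

```ts
// Sketch of a client that queries the deployed Worker and prints the
// streamed response as it arrives. Replace workersURL with your own URL.
const workersURL = "https://your-worker.example.workers.dev"; // placeholder

async function ask(query: string): Promise<void> {
  const res = await fetch(`${workersURL}/?query=${encodeURIComponent(query)}`);
  if (!res.ok || !res.body) throw new Error(`Request failed: ${res.status}`);

  const reader = res.body.getReader();
  const decoder = new TextDecoder();
  // The Worker streams plain text, so decode and print chunks as they come in.
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    process.stdout.write(decoder.decode(value, { stream: true }));
  }
}

ask("Why is the sky blue?");
```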
This is not recommended for production use, but it shows how you can use local tooling to run models and generate embeddings instead of relying on a third-party API.
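For the embeddings case specifically, a hedged sketch using LangChain's `OllamaEmbeddings` against the same tunnel URL could look like the following; the `nomic-embed-text` model name is just an example of an embedding-capable model you might have pulled into Ollama.

```ts
// Sketch: generating embeddings against the tunneled local Ollama instance.
import { OllamaEmbeddings } from "@langchain/community/embeddings/ollama";

export async function embed(texts: string[], tunnelUrl: string): Promise<number[][]> {
  const embeddings = new OllamaEmbeddings({
    baseUrl: tunnelUrl,        // e.g. env.TUNNEL inside a Worker
    model: "nomic-embed-text", // example embedding model available in Ollama
  });
  // embedDocuments returns one vector per input string.
  return embeddings.embedDocuments(texts);
}
```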