
There was an error processing your request: Custom error: Failed after 3 attempts. Last error: Internal Server Error #1178

Closed
loki-smip opened this issue Jan 26, 2025 · 11 comments
Labels
question Further information is requested

Comments

@loki-smip

loki-smip commented Jan 26, 2025

Describe the bug

There was an error processing your request: Custom error: Failed after 3 attempts. Last error: Internal Server Error
I'm using Ollama with deepseek-r1:latest on Ubuntu and I'm getting this:

DEBUG api.chat Total message length: 2, words
INFO LLMManager Getting dynamic models for Ollama
INFO LLMManager Got 3 dynamic models for Ollama
INFO stream-text Sending llm call to Ollama with model qwen2.5-coder:latest
DEBUG Ollama Base Url used: http://127.0.0.1:11434
ERROR api.chat AI_RetryError: Failed after 3 attempts. Last error: Internal Server Error
DEBUG api.chat Total message length: 2, words
INFO LLMManager Found 3 cached models for Ollama
INFO stream-text Sending llm call to Ollama with model qwen2.5-coder:latest
DEBUG Ollama Base Url used: http://127.0.0.1:11434
ERROR api.chat AI_RetryError: Failed after 3 attempts. Last error: Internal Server Error
ERROR LLMManager Error getting dynamic models Hyperbolic : Missing Api Key configuration for Hyperbolic provider
INFO LLMManager Caching 3 dynamic models for Ollama
Error: No baseUrl found for OLLAMA provider
at OllamaProvider.getDynamicModels (/home/loki/Downloads/bolt.diy-0.0.6/app/lib/modules/llm/providers/ollama.ts:58:13)
at loader (/home/loki/Downloads/bolt.diy-0.0.6/app/routes/api.models.ts:66:26)
at Object.callRouteLoader (/home/loki/Downloads/bolt.diy-0.0.6/node_modules/.pnpm/@remix-run[email protected][email protected]/node_modules/@remix-run/server-runtime/dist/data.js:59:22)
at handler (/home/loki/Downloads/bolt.diy-0.0.6/node_modules/.pnpm/@remix-run[email protected][email protected]/node_modules/@remix-run/server-runtime/dist/routes.js:54:39)
at actualHandler (/home/loki/Downloads/bolt.diy-0.0.6/node_modules/.pnpm/@remix-run[email protected]/node_modules/@remix-run/router/router.ts:4887:14)
at /home/loki/Downloads/bolt.diy-0.0.6/node_modules/.pnpm/@remix-run[email protected]/node_modules/@remix-run/router/router.ts:4901:13
at runHandler (/home/loki/Downloads/bolt.diy-0.0.6/node_modules/.pnpm/@remix-run[email protected]/node_modules/@remix-run/router/router.ts:4906:6)
at callLoaderOrAction (/home/loki/Downloads/bolt.diy-0.0.6/node_modules/.pnpm/@remix-run[email protected]/node_modules/@remix-run/router/router.ts:4963:22)
at Object.resolve (/home/loki/Downloads/bolt.diy-0.0.6/node_modules/.pnpm/@remix-run[email protected]/node_modules/@remix-run/router/router.ts:4814:11)
at map (/home/loki/Downloads/bolt.diy-0.0.6/node_modules/.pnpm/@remix-run[email protected]/node_modules/@remix-run/router/router.ts:4772:62)
at Array.map ()
at dataStrategyImpl (/home/loki/Downloads/bolt.diy-0.0.6/node_modules/.pnpm/@remix-run[email protected]/node_modules/@remix-run/router/router.ts:4772:49)
at callDataStrategyImpl (/home/loki/Downloads/bolt.diy-0.0.6/node_modules/.pnpm/@remix-run[email protected]/node_modules/@remix-run/router/router.ts:4835:23)
at callDataStrategy (/home/loki/Downloads/bolt.diy-0.0.6/node_modules/.pnpm/@remix-run[email protected]/node_modules/@remix-run/router/router.ts:3992:25)
at loadRouteData (/home/loki/Downloads/bolt.diy-0.0.6/node_modules/.pnpm/@remix-run[email protected]/node_modules/@remix-run/router/router.ts:3937:25)
at queryImpl (/home/loki/Downloads/bolt.diy-0.0.6/node_modules/.pnpm/@remix-run[email protected]/node_modules/@remix-run/router/router.ts:3696:26)
at Object.queryRoute (/home/loki/Downloads/bolt.diy-0.0.6/node_modules/.pnpm/@remix-run[email protected]/node_modules/@remix-run/router/router.ts:3629:24)
at handleResourceRequest (/home/loki/Downloads/bolt.diy-0.0.6/node_modules/.pnpm/@remix-run[email protected][email protected]/node_modules/@remix-run/server-runtime/dist/server.js:402:40)
at requestHandler (/home/loki/Downloads/bolt.diy-0.0.6/node_modules/.pnpm/@remix-run[email protected][email protected]/node_modules/@remix-run/server-runtime/dist/server.js:156:24)
at /home/loki/Downloads/bolt.diy-0.0.6/node_modules/.pnpm/@remix-run+dev@2.15.0_@remix-run[email protected][email protected]_react@[email protected]_typ_3djlhh3t6jbfog2cydlrvgreoy/node_modules/@remix-run/dev/dist/vite/cloudflare-proxy-plugin.js:70:31
at processTicksAndRejections (node:internal/process/task_queues:95:5)
ERROR LLMManager Error getting dynamic models Hyperbolic : Missing Api Key configuration for Hyperbolic provider
DEBUG api.chat Total message length: 2, words
INFO LLMManager Getting dynamic models for Ollama
INFO LLMManager Got 3 dynamic models for Ollama
INFO stream-text Sending llm call to Ollama with model deepseek-r1:latest
DEBUG Ollama Base Url used: http://127.0.0.1:11434
ERROR api.chat AI_RetryError: Failed after 3 attempts. Last error: Internal Server Error
Error: No baseUrl found for OLLAMA provider
at OllamaProvider.getDynamicModels (/home/loki/Downloads/bolt.diy-0.0.6/app/lib/modules/llm/providers/ollama.ts:58:13)
at loader (/home/loki/Downloads/bolt.diy-0.0.6/app/routes/api.models.ts:66:26)
at Object.callRouteLoader (/home/loki/Downloads/bolt.diy-0.0.6/node_modules/.pnpm/@remix-run[email protected][email protected]/node_modules/@remix-run/server-runtime/dist/data.js:59:22)
at handler (/home/loki/Downloads/bolt.diy-0.0.6/node_modules/.pnpm/@remix-run[email protected][email protected]/node_modules/@remix-run/server-runtime/dist/routes.js:54:39)
at actualHandler (/home/loki/Downloads/bolt.diy-0.0.6/node_modules/.pnpm/@remix-run[email protected]/node_modules/@remix-run/router/router.ts:4887:14)
at /home/loki/Downloads/bolt.diy-0.0.6/node_modules/.pnpm/@remix-run[email protected]/node_modules/@remix-run/router/router.ts:4901:13
at runHandler (/home/loki/Downloads/bolt.diy-0.0.6/node_modules/.pnpm/@remix-run[email protected]/node_modules/@remix-run/router/router.ts:4906:6)
at callLoaderOrAction (/home/loki/Downloads/bolt.diy-0.0.6/node_modules/.pnpm/@remix-run[email protected]/node_modules/@remix-run/router/router.ts:4963:22)
at Object.resolve (/home/loki/Downloads/bolt.diy-0.0.6/node_modules/.pnpm/@remix-run[email protected]/node_modules/@remix-run/router/router.ts:4814:11)
at map (/home/loki/Downloads/bolt.diy-0.0.6/node_modules/.pnpm/@remix-run[email protected]/node_modules/@remix-run/router/router.ts:4772:62)
at Array.map ()
at dataStrategyImpl (/home/loki/Downloads/bolt.diy-0.0.6/node_modules/.pnpm/@remix-run[email protected]/node_modules/@remix-run/router/router.ts:4772:49)
at callDataStrategyImpl (/home/loki/Downloads/bolt.diy-0.0.6/node_modules/.pnpm/@remix-run[email protected]/node_modules/@remix-run/router/router.ts:4835:23)
at callDataStrategy (/home/loki/Downloads/bolt.diy-0.0.6/node_modules/.pnpm/@remix-run[email protected]/node_modules/@remix-run/router/router.ts:3992:25)
at loadRouteData (/home/loki/Downloads/bolt.diy-0.0.6/node_modules/.pnpm/@remix-run[email protected]/node_modules/@remix-run/router/router.ts:3937:25)
at queryImpl (/home/loki/Downloads/bolt.diy-0.0.6/node_modules/.pnpm/@remix-run[email protected]/node_modules/@remix-run/router/router.ts:3696:26)
at Object.queryRoute (/home/loki/Downloads/bolt.diy-0.0.6/node_modules/.pnpm/@remix-run[email protected]/node_modules/@remix-run/router/router.ts:3629:24)
at handleResourceRequest (/home/loki/Downloads/bolt.diy-0.0.6/node_modules/.pnpm/@remix-run[email protected][email protected]/node_modules/@remix-run/server-runtime/dist/server.js:402:40)
at requestHandler (/home/loki/Downloads/bolt.diy-0.0.6/node_modules/.pnpm/@remix-run[email protected][email protected]/node_modules/@remix-run/server-runtime/dist/server.js:156:24)
at /home/loki/Downloads/bolt.diy-0.0.6/node_modules/.pnpm/@remix-run+dev@2.15.0_@remix-run[email protected][email protected]_react@[email protected]_typ_3djlhh3t6jbfog2cydlrvgreoy/node_modules/@remix-run/dev/dist/vite/cloudflare-proxy-plugin.js:70:31
at processTicksAndRejections (node:internal/process/task_queues:95:5)
DEBUG api.chat Total message length: 2, words
INFO LLMManager Getting dynamic models for Ollama
INFO LLMManager Got 3 dynamic models for Ollama
INFO stream-text Sending llm call to Ollama with model deepseek-r1:latest
DEBUG Ollama Base Url used: http://127.0.0.1:11434
ERROR api.chat AI_RetryError: Failed after 3 attempts. Last error: Internal Server Error
ERROR LLMManager Error getting dynamic models Hyperbolic : Missing Api Key configuration for Hyperbolic provider
INFO LLMManager Caching 3 dynamic models for Ollama

Screen Recording / Screenshot

[screenshot of the error in the app]

and in the terminal:

[screenshot of the terminal output]

Platform

  • OS: Linux (Ubuntu)
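For context on the Error: No baseUrl found for OLLAMA provider entries in the log above: this is the kind of guard a provider throws when neither the settings UI nor an environment variable supplies a base URL. A minimal sketch of such a lookup (the function name and env-key pattern here are hypothetical illustrations, not bolt.diy's actual code):

```typescript
// Hypothetical sketch of a base-URL lookup that fails like the log above.
// resolveBaseUrl and the `<PROVIDER>_API_BASE_URL` key are illustrative.
function resolveBaseUrl(
  providerName: string,
  env: Record<string, string | undefined>,
  settingsBaseUrl?: string,
): string {
  // Prefer an explicit per-provider setting, then fall back to the environment.
  const baseUrl =
    settingsBaseUrl ?? env[`${providerName.toUpperCase()}_API_BASE_URL`];
  if (!baseUrl) {
    throw new Error(`No baseUrl found for ${providerName.toUpperCase()} provider`);
  }
  return baseUrl;
}
```

Under this reading, the fix is simply to make sure one of those two sources actually carries the Ollama URL, which is what the workaround further down does.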
@Ranger-Dj

same issue

@iiinvent

iiinvent commented Jan 26, 2025 via email

@kareem-g

same issue here.

@leex279
Collaborator

leex279 commented Jan 26, 2025

Do you have problems with just this model, or with all models you try to use?

See my video on YouTube; it might help. Let me know if you still have issues.

https://youtu.be/JItHTxqE0KQ

@leex279 leex279 added the question Further information is requested label Jan 26, 2025
@kareem-g

Hi, I resolved the issue by adding the following rules to the Ubuntu environment:

  1. Open a terminal and run: sudo gedit ~/.bashrc
  2. Add the following lines:
# OLLAMA
export OLLAMA_HOST=0.0.0.0
export OLLAMA_ORIGINS=*
  3. Save the file.
  4. Restart your device and try running the Bolt app again.

Additionally, make sure to set the Ollama base URL in the .env or .env.local file to http://127.0.0.1:11434; using localhost:11434 causes an error.

Hope this helps!
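The steps above can be sketched as a short shell session. The OLLAMA_API_BASE_URL variable name is an assumption taken from bolt.diy's .env.example; verify it against your own checkout:

```shell
# Expose Ollama on all interfaces and allow cross-origin requests
# (the same lines the fix above adds to ~/.bashrc):
export OLLAMA_HOST=0.0.0.0
export OLLAMA_ORIGINS='*'

# Point bolt.diy at the loopback address instead of "localhost"
# (variable name assumed from .env.example; check your copy):
echo 'OLLAMA_API_BASE_URL=http://127.0.0.1:11434' > .env.local
```

After editing ~/.bashrc for real, either open a new terminal or run source ~/.bashrc so the exports take effect before restarting the app.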

@mahimsafa

mahimsafa commented Jan 27, 2025

@kareem-g Your solution works. Thanks a lot

@kareem-g

@kareem-g Your solution works. Thanks a lot

You're welcome!

Do you know how to make Ollama run on the GPU instead of the CPU? I use Ubuntu with a GTX 1650 Ti 4 GB.

@mahimsafa

@kareem-g As far as I know, Ollama tries to run on the GPU by default. While it's running, you can check GPU usage with the nvidia-smi command to verify.

Here is the supported GPU list: https://github.com/ollama/ollama/blob/main/docs/gpu.md
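If it helps, here is a small sketch of checking for GPU offload via the output of ollama ps, which prints a PROCESSOR column such as "100% GPU" or "100% CPU". The sample line piped in below is illustrative, not real output; run a model and pipe in actual ollama ps output to check for real:

```shell
# Reads "ollama ps"-style text on stdin and reports GPU vs. CPU offload.
check_offload() {
  if grep -q 'GPU'; then echo "GPU"; else echo "CPU"; fi
}

# Illustrative sample line (replace with real `ollama ps` output):
echo 'deepseek-r1:latest  abc123  5.5 GB  100% GPU  4 minutes from now' | check_offload
```

Watching nvidia-smi in a second terminal while a prompt is running gives the same answer from the driver's side.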

@kareem-g

@kareem-g As far as I know, Ollama tries to run on the GPU by default. While it's running, you can check GPU usage with the nvidia-smi command to verify.

Here is the supported GPU list: https://github.com/ollama/ollama/blob/main/docs/gpu.md

I checked this page before, but it doesn't work; it always uses the CPU. When I use a small model like phi3, it sometimes uses the GPU and sometimes the CPU. I don't know why; I've seen people run on both CPU and GPU, and I don't know how.

The CPU is slow, but it works fine and I use it inside VS Code; I need the GPU for faster results.

My current CUDA version:

nvcc: NVIDIA (R) Cuda compiler driver
Copyright (c) 2005-2023 NVIDIA Corporation
Built on Fri_Jan__6_16:45:21_PST_2023
Cuda compilation tools, release 12.0, V12.0.140
Build cuda_12.0.r12.0/compiler.32267302_0

@kareem-g

@kareem-g As far as I know, Ollama tries to run on the GPU by default. While it's running, you can check GPU usage with the nvidia-smi command to verify.
Here is the supported GPU list: https://github.com/ollama/ollama/blob/main/docs/gpu.md

I checked this page before, but it doesn't work; it always uses the CPU. When I use a small model like phi3, it sometimes uses the GPU and sometimes the CPU. I don't know why; I've seen people run on both CPU and GPU, and I don't know how.

The CPU is slow, but it works fine and I use it inside VS Code; I need the GPU for faster results.

My current CUDA version:

nvcc: NVIDIA (R) Cuda compiler driver
Copyright (c) 2005-2023 NVIDIA Corporation
Built on Fri_Jan__6_16:45:21_PST_2023
Cuda compilation tools, release 12.0, V12.0.140
Build cuda_12.0.r12.0/compiler.32267302_0

Lol. I was able to fix it by running nvidia-smi -L in the terminal to force Ollama to use the GPU alongside the CPU, and then restarting Ollama did the trick:

sudo systemctl stop ollama
sudo systemctl start ollama
sudo systemctl status ollama

I wanna thank me hehe. xD


@thecodacus
Collaborator

I believe the issue is fixed.

@thecodacus thecodacus pinned this issue Jan 29, 2025

7 participants