Hello, I just found this fantastic TUI, but it seems like the Ollama models cannot be loaded?
![image](https://private-user-images.githubusercontent.com/10807209/334824442-528f1253-3f29-491f-94f9-e0fd4604491e.png?jwt=eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpc3MiOiJnaXRodWIuY29tIiwiYXVkIjoicmF3LmdpdGh1YnVzZXJjb250ZW50LmNvbSIsImtleSI6ImtleTUiLCJleHAiOjE3MzkxMTk5ODcsIm5iZiI6MTczOTExOTY4NywicGF0aCI6Ii8xMDgwNzIwOS8zMzQ4MjQ0NDItNTI4ZjEyNTMtM2YyOS00OTFmLTk0ZjktZTBmZDQ2MDQ0OTFlLnBuZz9YLUFtei1BbGdvcml0aG09QVdTNC1ITUFDLVNIQTI1NiZYLUFtei1DcmVkZW50aWFsPUFLSUFWQ09EWUxTQTUzUFFLNFpBJTJGMjAyNTAyMDklMkZ1cy1lYXN0LTElMkZzMyUyRmF3czRfcmVxdWVzdCZYLUFtei1EYXRlPTIwMjUwMjA5VDE2NDgwN1omWC1BbXotRXhwaXJlcz0zMDAmWC1BbXotU2lnbmF0dXJlPWQ1ODI3NzRkOWUwOWEyMjhjMTFiZDI4ODZiYzQ0YWEzZDZkMWQxYzA3MzJlNDVlMmVhZjg0MTM4NzU2NzJhOGEmWC1BbXotU2lnbmVkSGVhZGVycz1ob3N0In0.pxzaEb7lhPot5jGlWIF_9EDLVpQR_8r-qD9UhVVAafQ)
This is what I added to the config.toml, and Ollama is running in server mode, but I still can't load the model in Elia. Is there anything wrong with my configuration, maybe?
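One way I can sanity-check that the Ollama server itself is reachable (independent of Elia) is its standard tags endpoint; this assumes the default port 11434 and no custom OLLAMA_HOST:

```sh
# List the models the local Ollama server knows about.
# /api/tags is Ollama's standard endpoint for this;
# a JSON response here means the server side is fine.
curl http://localhost:11434/api/tags
```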
-

Replies: 1 comment

I checked, and it still seems to work with local models for me. Your config looks correct. I tested it with ollama/llama3. Unfortunately, my machine can't run the model you're trying, so I can't check that one.
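For reference, the local-model entry I tested is shaped roughly like this; it's a minimal sketch following the README's litellm-style `ollama/` naming, so treat the exact field names as assumptions if your Elia version differs:

```toml
# ~/.config/elia/config.toml
# Optional: make the local model the one selected on startup.
default_model = "ollama/llama3"

[[models]]
# litellm-style name: "ollama/" prefix plus the tag you pulled
# locally with `ollama pull llama3`.
name = "ollama/llama3"
```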