Firefox AI features (Summarize, etc.) does not work with llamafile #640
Comments
Yes, it's just an overly simple `replace` instead of using the `URL()` constructor on this line: https://github.com/Mozilla-Ocho/llamafile/blob/main/llama.cpp/server/public/index.html#L422
I got excited when I saw a new release... they didn't even bother to fix the issue. I haven't really looked at what this "llamafiler" thing is yet, but I don't feel like going through more disappointment right now.
everything will be fine, thank you
@sizvix would you happen to know how to use GitHub Actions to build just the llamafile binary and publish it as a downloadable artifact? I'm not familiar with builds that have as many imports as this project does.
Discussed in #633
Originally posted by TFWol November 19, 2024
Windows 10 (not WSL)
I'm not entirely sure where the problem lies, but I haven't been able to get llamafiles to work with Firefox. It worked before.
I've tried running it with a handful of separate commands; all have the same outcome:
llamafile
Firefox
More info
Going directly to http://localhost:8080/ or http://127.0.0.1:8080/ in a browser tab and typing works, but not when using Firefox's built-in ML features like 'Summarize'.
Setting browser.ml.chat.sidebar to false and using the Firefox ML feature from the context menu will open a tab with a URL like http://localhost:8080/?q=I’m+on+page+“Test+Webpage”+with+“foobar”+selected.%0A%0APlease+summarize+the+selection+using+precise+and+concise+language.+Use+headers+and+bulleted+lists+in+the+summary%2C+to+make+it+scannable.+Maintain+the+meaning+and+factual+accuracy.
I can see Firefox talk to the llamafile server via the command prompt window, but there is no response back from the server. Even typing in the chatbox and submitting doesn't yield a response. In Dev Tools, I see a 404 on the POST: