Need support for in-house hosted models #607
Comments
Hi there,
We are planning to deploy LLaVA on our private cloud hardware. Is it possible to use that model instead of interacting with ChatGPT (GPT-4o)?
In my case, when initializing

from lavague.core import WorldModel, ActionEngine
from lavague.core.agents import WebAgent
from lavague.drivers.selenium import SeleniumDriver
from llama_index.multi_modal_llms.huggingface import HuggingFaceMultiModal
from llama_index.embeddings.huggingface import HuggingFaceEmbedding

and running the script, I get:

Traceback (most recent call last):
File "./test1.py", line 26, in <module>
action_engine = ActionEngine(driver=selenium_driver, llm=llm, embedding=embed_model)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "./venv/lib/python3.12/site-packages/lavague/core/action_engine.py", line 84, in __init__
python_engine = PythonEngine(driver, llm, embedding)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "./venv/lib/python3.12/site-packages/lavague/core/python_engine.py", line 66, in __init__
self.ocr_mm_llm = ocr_mm_llm or OpenAIMultiModal(
^^^^^^^^^^^^^^^^^
File "./venv/lib/python3.12/site-packages/llama_index/multi_modal_llms/openai/base.py", line 107, in __init__
self._messages_to_prompt = messages_to_prompt or generic_messages_to_prompt
^^^^^^^^^^^^^^^^^^^^^^^^
File "./venv/lib/python3.12/site-packages/pydantic/main.py", line 865, in __setattr__
if self.__pydantic_private__ is None or name not in self.__private_attributes__:
^^^^^^^^^^^^^^^^^^^^^^^^^
File "./venv/lib/python3.12/site-packages/pydantic/main.py", line 853, in __getattr__
return super().__getattribute__(item) # Raises AttributeError if appropriate
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
AttributeError: 'OpenAIMultiModal' object has no attribute '__pydantic_private__'. Did you mean: '__pydantic_complete__'?

So it initializes OpenAIMultiModal even when using local models. Is there a working non-OpenAI example that I could use with local models? Thank you in advance!
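For anyone hitting the same error, here is a minimal sketch of one possible workaround: build PythonEngine yourself and hand it a local multimodal model as ocr_mm_llm, so the OpenAIMultiModal default in python_engine.py is never constructed. The ocr_mm_llm keyword is inferred from the traceback above; the HuggingFaceMultiModal.from_model_name classmethod and all model names are assumptions about the installed llama-index release, not a tested recipe.

```python
# Minimal sketch, not a tested fix. Assumptions: PythonEngine accepts an
# ocr_mm_llm keyword (suggested by the traceback, where python_engine.py only
# falls back to OpenAIMultiModal when ocr_mm_llm is missing), and
# HuggingFaceMultiModal.from_model_name exists in the installed llama-index.
from lavague.core.python_engine import PythonEngine
from lavague.drivers.selenium import SeleniumDriver
from llama_index.embeddings.huggingface import HuggingFaceEmbedding
from llama_index.llms.huggingface import HuggingFaceLLM
from llama_index.multi_modal_llms.huggingface import HuggingFaceMultiModal

selenium_driver = SeleniumDriver(headless=True)

# Fully local text LLM and embedding model (model names are placeholders).
llm = HuggingFaceLLM(
    model_name="HuggingFaceH4/zephyr-7b-beta",
    tokenizer_name="HuggingFaceH4/zephyr-7b-beta",
)
embed_model = HuggingFaceEmbedding(model_name="BAAI/bge-small-en-v1.5")

# Local multimodal model to take the place of the OpenAIMultiModal OCR default.
ocr_mm_llm = HuggingFaceMultiModal.from_model_name("Qwen/Qwen2-VL-2B-Instruct")

# Positional order matches the call in action_engine.py shown in the traceback.
python_engine = PythonEngine(selenium_driver, llm, embed_model, ocr_mm_llm=ocr_mm_llm)
```

What is still unclear is whether ActionEngine can be told to reuse such an engine (or be given an ocr_mm_llm of its own) instead of building its default PythonEngine, which is essentially what this issue is asking for.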
This issue seems related: #565
Is your feature request related to a problem? Please describe.
We are exploring LaVague for web automation, but the limitation is that it relies on public-facing models. Can LaVague support in-house hosted models to eliminate the cost constraints?
Describe the solution you'd like
LaVague should support custom in-house deployed models (one possible direction is sketched below).
Describe alternatives you've considered
NA
Additional context
NA
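One possible direction, sketched under the assumption that the in-house deployment (e.g. LLaVA behind vLLM, TGI, or a LiteLLM proxy) exposes an OpenAI-compatible API: point the llama-index OpenAI wrappers at the internal base URL and hand the resulting models to LaVague the same way the hosted ones are used. The mm_llm keyword on WorldModel and the api_base field on OpenAIMultiModal are assumptions about the installed versions; the endpoint URL and model names are placeholders.

```python
# Sketch only, not a tested recipe: assumes the in-house server speaks the
# OpenAI chat/completions API (vLLM, TGI with an OpenAI shim, LiteLLM, ...).
from lavague.core import WorldModel, ActionEngine
from lavague.core.agents import WebAgent
from lavague.drivers.selenium import SeleniumDriver
from llama_index.embeddings.huggingface import HuggingFaceEmbedding
from llama_index.llms.openai_like import OpenAILike
from llama_index.multi_modal_llms.openai import OpenAIMultiModal

INHOUSE_BASE = "https://llm.internal.example.com/v1"  # placeholder endpoint

# Text model served in-house behind an OpenAI-compatible route.
llm = OpenAILike(
    model="inhouse-llm",       # placeholder model name
    api_base=INHOUSE_BASE,
    api_key="not-needed",
    is_chat_model=True,
)

# Multimodal model (e.g. LLaVA) on the same server; api_base on
# OpenAIMultiModal is assumed to be honoured by the installed llama-index.
mm_llm = OpenAIMultiModal(
    model="llava",             # placeholder model name
    api_base=INHOUSE_BASE,
    api_key="not-needed",
)

# Embeddings can stay fully local.
embed_model = HuggingFaceEmbedding(model_name="BAAI/bge-small-en-v1.5")

selenium_driver = SeleniumDriver(headless=True)
world_model = WorldModel(mm_llm=mm_llm)  # mm_llm keyword assumed
action_engine = ActionEngine(driver=selenium_driver, llm=llm, embedding=embed_model)
agent = WebAgent(world_model, action_engine)
```

Note that this alone would not change the OpenAIMultiModal OCR default inside PythonEngine discussed in the traceback above; that would still need an ocr_mm_llm pointing at the same endpoint, or a workaround like the one sketched earlier.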