The response template is appended to the user_prompt before the latter is processed by each LLM class.
Some LLM APIs natively support a structured response format (e.g., ChatGPT and Gemini) and can get confused by the JSON dump appended to the user prompt.
One solution is to let each LLM class decide how to add the response format, for example by implementing user prompt creation in the base class and allowing subclasses to specialize its behaviour.
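A minimal sketch of that idea: the base class provides a default prompt-building hook that appends the template, and subclasses backed by APIs with native structured output override it to leave the prompt untouched. All class and variable names here (`BaseLLM`, `GenericLLM`, `OpenAILLM`, `RESPONSE_TEMPLATE`) are hypothetical, not taken from the actual codebase.

```python
from abc import ABC, abstractmethod

# Hypothetical template; the real project would supply its own JSON schema dump.
RESPONSE_TEMPLATE = '{"answer": "<your answer here>"}'


class BaseLLM(ABC):
    def build_user_prompt(self, user_prompt: str) -> str:
        # Default behaviour: append the JSON response template to the prompt.
        return (
            f"{user_prompt}\n\n"
            f"Respond using this JSON format:\n{RESPONSE_TEMPLATE}"
        )

    @abstractmethod
    def complete(self, user_prompt: str) -> str:
        ...


class GenericLLM(BaseLLM):
    """Backend without native structured output: keeps the default prompt."""

    def complete(self, user_prompt: str) -> str:
        return self.build_user_prompt(user_prompt)


class OpenAILLM(BaseLLM):
    """Backend with native structured output (e.g. response_format=...)."""

    def build_user_prompt(self, user_prompt: str) -> str:
        # Skip the template: the API enforces the format itself,
        # so the JSON dump would only confuse the model.
        return user_prompt

    def complete(self, user_prompt: str) -> str:
        # A real implementation would pass response_format to the API call.
        return self.build_user_prompt(user_prompt)
```

This keeps the template logic in one place while letting each subclass opt out when its API already guarantees the output format.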