Refactor to reuse stream parsing across ChatModels (#380)
* Copy discard_none_arguments to LitellmChatModel
* Switch OpenaiChatModel to use Stream approach
* Add TODO to test retry logic
* Add StreamParser to DRY openai streaming code
* Add _if_given
* Fix typing for openai Stream classes
* Add parse_stream and use in OpenaiChatModel
* Tidy function schema matching
* Copy Stream classes into stream.py
* Switch LitellmChatModel to use stream parsers
* Switch OpenaiChatModel to use shared streaming classes
* Add docstring for is_instance_origin
* Use type origins in parse_stream
* Fix complete type hints for LitellmChatModel
* Change function_schemas type list -> Iterable
* Add StreamState and OpenaiStreamState
* Add TODOs
* Update openai retry test cassettes
* Add back usage for OpenaiChatModel
* Consolidate parsers into one
* Make LitellmChatModel use new parsing format
* Fix litellm_ollama
* Handle multiple tools in a chunk, for Mistral
* Remove anthropic context manager usage
* Add _if_given helper for anthropic
* Allow parser.get_content to return None
* Switch AnthropicChatModel to new parsing logic
* Delete unused validate_str_content functions
* Remove redundant TODOs
* Fix prompt_chain and unskip tests
* Only yield tool call args if not falsy
* Add FunctionCallNotAllowedError, ObjectNotAllowedError
* Add UnknownToolError, raise in OutputStream
* Remove done todo for unknown tool call
* Remove is_content_ended in favor of is_tool_call
* Add typecheck to make all
* Fix remaining mypy errors
* Add get_function_schemas. Fix some mypy errors
* Tidy calculation of allow_string_output
* Delete is_instance_origin
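The core idea of this refactor is to extract the chunk-parsing logic shared by OpenaiChatModel, LitellmChatModel, and AnthropicChatModel into one reusable parser. The sketch below is a hypothetical illustration of that pattern, not the repo's actual code: the names `Chunk`, `StreamParser`, and `parse_stream` echo identifiers from the commit messages above but their signatures here are assumptions.

```python
from dataclasses import dataclass
from typing import Iterable, Iterator, Optional


@dataclass
class Chunk:
    """Minimal stand-in for a provider-specific streaming chunk (assumed shape)."""
    content: Optional[str] = None
    tool_call_args: Optional[str] = None


class StreamParser:
    """Shared parsing logic reused across chat models (hypothetical sketch).

    Each provider would subclass this and override how content and tool
    call arguments are extracted from its native chunk type.
    """

    def get_content(self, chunk: Chunk) -> Optional[str]:
        # Per the change above, get_content is allowed to return None
        # when a chunk carries no text content.
        return chunk.content

    def get_tool_call_args(self, chunk: Chunk) -> Optional[str]:
        # Per the change above, tool call args are only yielded if not falsy.
        return chunk.tool_call_args or None


def parse_stream(chunks: Iterable[Chunk], parser: StreamParser) -> Iterator[str]:
    """Yield text content from a raw chunk stream via the shared parser."""
    for chunk in chunks:
        content = parser.get_content(chunk)
        if content is not None:
            yield content


chunks = [Chunk(content="Hel"), Chunk(tool_call_args=""), Chunk(content="lo")]
print("".join(parse_stream(chunks, StreamParser())))  # prints "Hello"
```

With this shape, each ChatModel only supplies a provider-specific parser while the streaming loop, tool-call handling, and error raising (e.g. an `UnknownToolError` for unrecognized tool calls) live in one place.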