
build(deps): bump mlflow from 2.17.2 to 2.18.0 in /runtimes/mlflow #1970

Merged
merged 1 commit into master from dependabot/pip/runtimes/mlflow/mlflow-2.18.0
Nov 26, 2024

Conversation

dependabot[bot]
Contributor

@dependabot dependabot bot commented on behalf of github Nov 25, 2024

Bumps mlflow from 2.17.2 to 2.18.0.

Release notes

Sourced from mlflow's releases.

MLflow 2.18.0

We are excited to announce the release of MLflow 2.18.0! This release includes a number of significant features, enhancements, and bug fixes.

Python Version Update

Python 3.8 has reached end of life. With official support for this legacy version dropped, MLflow now requires Python 3.9 as its minimum supported version.

Note: If you are currently using MLflow's ChatModel interface for authoring custom GenAI applications, please ensure that you have read the future breaking changes section below.

Major New Features

  • 🦺 Fluent API Thread/Process Safety - MLflow's fluent APIs for tracking and the model registry have been overhauled to support both thread and multi-process safety. You are no longer forced to use the Client APIs for managing experiments, runs, and logging from within multiprocessing and threaded applications (see the threading sketch after this list). (#13456, #13419, @WeichenXu123)

  • 🧩 DSPy flavor - MLflow now supports logging, loading, and tracing of DSPy models, broadening the support for advanced GenAI authoring within MLflow. Check out the MLflow DSPy Flavor documentation to get started! (#13131, #13279, #13369, #13345, @​chenmoneygithub, #13543, #13800, #13807, @​B-Step62, #13289, @​michael-berk)

  • 🖥️ Enhanced Trace UI - MLflow Tracing's UI has undergone a significant overhaul, bringing usability and quality-of-life updates to auditing and investigating the contents of GenAI traces, from enhanced span content rendering using markdown to a standardized span component structure. (#13685, #13357, #13242, @daniellok-db)

  • 🚄 New Tracing Integrations - MLflow Tracing now supports DSPy, LiteLLM, and Google Gemini, enabling a one-line, fully automated tracing experience (see the autolog sketch after this list). These integrations unlock enhanced observability across a broader range of industry tools. Stay tuned for upcoming integrations and updates! (#13801, @TomeHirata, #13585, @B-Step62)

  • 📊 Expanded LLM-as-a-Judge Support - MLflow now enhances its evaluation capabilities with support for additional providers, including Anthropic, Bedrock, Mistral, and TogetherAI, alongside existing providers like OpenAI. Users can now also configure proxy endpoints or self-hosted LLMs that follow the provider API specs by using the new proxy_url and extra_headers options. Visit the LLM-as-a-Judge documentation for more details! (#13715, #13717, @​B-Step62)

  • ⏰ Environment Variable Detection - MLflow now detects environment variables set during model logging and reminds users to configure them for deployment. In addition, the mlflow.models.predict utility now includes these variables in serving simulations, improving pre-deployment validation (see the mlflow.models.predict sketch after this list). (#13584, @serena-ruan)

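A minimal sketch of the now thread-safe fluent APIs, assuming a simple parallel hyperparameter sweep; the experiment name, the train_one helper, and the logged values are illustrative only and not taken from the release notes.

```python
# Sketch: fluent tracking APIs called concurrently from worker threads.
# Prior to 2.18 this pattern typically required MlflowClient.
from concurrent.futures import ThreadPoolExecutor

import mlflow

mlflow.set_experiment("parallel-sweep")  # hypothetical experiment name


def train_one(learning_rate: float) -> None:
    # Each thread manages its own active run through the fluent API.
    with mlflow.start_run(run_name=f"lr={learning_rate}"):
        mlflow.log_param("learning_rate", learning_rate)
        mlflow.log_metric("loss", 1.0 / (1.0 + learning_rate))  # placeholder metric


with ThreadPoolExecutor(max_workers=4) as pool:
    list(pool.map(train_one, [0.001, 0.01, 0.1, 1.0]))  # force results to surface errors
```
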
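A one-line setup sketch for the new tracing integrations, assuming the flavor-level autolog() entry points follow the same pattern as the existing mlflow.openai.autolog() / mlflow.langchain.autolog() hooks.

```python
# Sketch: enable automatic tracing for the newly supported libraries.
# The autolog() entry points are assumed to mirror existing integrations.
import mlflow

mlflow.dspy.autolog()     # trace DSPy module execution
mlflow.litellm.autolog()  # trace LiteLLM completion calls
mlflow.gemini.autolog()   # trace Google Gemini SDK calls

# Calls made through these libraries afterwards are recorded as traces
# and appear in the Trace UI of the active experiment.
```
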
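A sketch of pre-deployment validation with mlflow.models.predict, which as of 2.18 also carries the environment variables detected at logging time into the serving simulation; the model URI, payload, and keyword arguments shown here are assumptions based on the public API rather than the release notes.

```python
# Sketch: simulate serving in an isolated environment before deployment.
# "models:/my-model/1" and the payload are hypothetical.
import mlflow

mlflow.models.predict(
    model_uri="models:/my-model/1",
    input_data={"inputs": [[1.0, 2.0, 3.0]]},
)
```
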
Breaking Changes to ChatModel Interface

  • ChatModel Interface Updates - As part of a broader unification effort within MLflow and the services that rely on or deeply integrate with MLflow's GenAI features, we are taking a phased approach to establishing a consistent, standard interface for custom GenAI application development and usage. In the first phase (planned for the next few MLflow releases), we are marking several interfaces as deprecated, as they will be changing (a sketch of the current interface appears after this list). These changes will be:

    • Renaming of Interfaces:
      • ChatRequest → ChatCompletionRequest to provide disambiguation for future planned request interfaces.
      • ChatResponse → ChatCompletionResponse for the same reason as the input interface.
      • metadata fields within ChatRequest and ChatResponse → custom_inputs and custom_outputs, respectively.
    • Streaming Updates:
      • predict_stream will be updated to enable true streaming for custom GenAI applications. Currently, it returns a generator with synchronous outputs from predict. In a future release, it will return a generator of ChatCompletionChunks, enabling asynchronous streaming. While the API call structure will remain the same, the returned data payload will change significantly, aligning with LangChain’s implementation.
    • Legacy Dataclass Deprecation:
      • Dataclasses in mlflow.models.rag_signatures will be deprecated, merging into unified ChatCompletionRequest, ChatCompletionResponse, and ChatCompletionChunks.

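For reference, a minimal sketch of a custom ChatModel as it is authored today, using the mlflow.types.llm dataclasses that the renames above will affect; the echo logic is illustrative, and optional response fields (usage, id, model) are omitted on the assumption that they default sensibly.

```python
# Sketch: a present-day ChatModel; ChatRequest/ChatResponse are the
# interfaces slated for renaming to ChatCompletionRequest/Response.
import mlflow
from mlflow.pyfunc import ChatModel
from mlflow.types.llm import ChatChoice, ChatMessage, ChatParams, ChatResponse


class EchoChatModel(ChatModel):
    def predict(self, context, messages: list[ChatMessage], params: ChatParams) -> ChatResponse:
        # Echo the last user message back as the assistant reply.
        reply = ChatMessage(role="assistant", content=messages[-1].content)
        return ChatResponse(choices=[ChatChoice(index=0, message=reply)])


with mlflow.start_run():
    mlflow.pyfunc.log_model(artifact_path="chat_model", python_model=EchoChatModel())
```
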
Other Features:

  • [Evaluate] Add Huggingface BLEU metrics to MLflow Evaluate (#12799, @​nebrass)
  • [Models / Databricks] Add support for spark_udf when running on Databricks Serverless runtime, Databricks connect, and prebuilt python environments (#13276, #13496, @​WeichenXu123)
  • [Scoring] Add a model_config parameter for pyfunc.spark_udf for customizing the batch inference payload submission (see the sketch after this list) (#13517, @WeichenXu123)
  • [Tracing] Standardize retriever span outputs to a list of MLflow Documents (#13242, @​daniellok-db)
  • [UI] Add support for visualizing and comparing nested parameters within the MLflow UI (#13012, @​jescalada)

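A hedged sketch of the new model_config pass-through on pyfunc.spark_udf; the Spark session setup, model URI, column name, and config keys are illustrative only.

```python
# Sketch: forwarding a model_config to a pyfunc model during Spark batch scoring.
import mlflow.pyfunc
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

udf = mlflow.pyfunc.spark_udf(
    spark,
    model_uri="models:/my-model/1",   # hypothetical registered model
    model_config={"batch_size": 64},  # consumed by the model at inference time
)

df = spark.createDataFrame([(1.0,), (2.0,)], ["feature"])
df.withColumn("prediction", udf("feature")).show()
```
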
... (truncated)

Changelog

Sourced from mlflow's changelog.

2.18.0 (2024-11-18)

We are excited to announce the release of MLflow 2.18.0! This release includes a number of significant features, enhancements, and bug fixes.

Python Version Update

Python 3.8 has reached end of life. With official support for this legacy version dropped, MLflow now requires Python 3.9 as its minimum supported version.

Note: If you are currently using MLflow's ChatModel interface for authoring custom GenAI applications, please ensure that you have read the future breaking changes section below.

Major New Features

  • 🦺 Fluent API Thread/Process Safety - MLflow's fluent APIs for tracking and the model registry have been overhauled to support both thread and multi-process safety. You are no longer forced to use the Client APIs for managing experiments, runs, and logging from within multiprocessing and threaded applications. (#13456, #13419, @WeichenXu123)

  • 🧩 DSPy flavor - MLflow now supports logging, loading, and tracing of DSPy models, broadening the support for advanced GenAI authoring within MLflow. Check out the MLflow DSPy Flavor documentation to get started! (#13131, #13279, #13369, #13345, @​chenmoneygithub, #13543, #13800, #13807, @​B-Step62, #13289, @​michael-berk)

  • 🖥️ Enhanced Trace UI - MLflow Tracing's UI has undergone a significant overhaul, bringing usability and quality-of-life updates to auditing and investigating the contents of GenAI traces, from enhanced span content rendering using markdown to a standardized span component structure. (#13685, #13357, #13242, @daniellok-db)

  • 🚄 New Tracing Integrations - MLflow Tracing now supports DSPy, LiteLLM, and Google Gemini, enabling a one-line, fully automated tracing experience. These integrations unlock enhanced observability across a broader range of industry tools. Stay tuned for upcoming integrations and updates! (#13801, @​TomeHirata, #13585, @​B-Step62)

  • 📊 Expanded LLM-as-a-Judge Support - MLflow now enhances its evaluation capabilities with support for additional providers, including Anthropic, Bedrock, Mistral, and TogetherAI, alongside existing providers like OpenAI. Users can now also configure proxy endpoints or self-hosted LLMs that follow the provider API specs by using the new proxy_url and extra_headers options. Visit the LLM-as-a-Judge documentation for more details! (#13715, #13717, @​B-Step62)

  • ⏰ Environment Variable Detection - MLflow now detects environment variables set during model logging and reminds users to configure them for deployment. In addition, the mlflow.models.predict utility now includes these variables in serving simulations, improving pre-deployment validation. (#13584, @serena-ruan)

Breaking Changes to ChatModel Interface

  • ChatModel Interface Updates - As part of a broader unification effort within MLflow and the services that rely on or deeply integrate with MLflow's GenAI features, we are taking a phased approach to establishing a consistent, standard interface for custom GenAI application development and usage. In the first phase (planned for the next few MLflow releases), we are marking several interfaces as deprecated, as they will be changing. These changes will be:

    • Renaming of Interfaces:
      • ChatRequest → ChatCompletionRequest to provide disambiguation for future planned request interfaces.
      • ChatResponse → ChatCompletionResponse for the same reason as the input interface.
      • metadata fields within ChatRequest and ChatResponse → custom_inputs and custom_outputs, respectively.
    • Streaming Updates:
      • predict_stream will be updated to enable true streaming for custom GenAI applications. Currently, it returns a generator with synchronous outputs from predict. In a future release, it will return a generator of ChatCompletionChunks, enabling asynchronous streaming. While the API call structure will remain the same, the returned data payload will change significantly, aligning with LangChain’s implementation.
    • Legacy Dataclass Deprecation:
      • Dataclasses in mlflow.models.rag_signatures will be deprecated, merging into unified ChatCompletionRequest, ChatCompletionResponse, and ChatCompletionChunks.

Other Features:

  • [Evaluate] Add Huggingface BLEU metrics to MLflow Evaluate (#12799, @​nebrass)
  • [Models / Databricks] Add support for spark_udf when running on Databricks Serverless runtime, Databricks connect, and prebuilt python environments (#13276, #13496, @​WeichenXu123)
  • [Scoring] Add a model_config parameter for pyfunc.spark_udf for customization of batch inference payload submission (#13517, @​WeichenXu123)

... (truncated)

Commits

Dependabot compatibility score

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


Dependabot commands and options

You can trigger Dependabot actions by commenting on this PR:

  • @dependabot rebase will rebase this PR
  • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
  • @dependabot merge will merge this PR after your CI passes on it
  • @dependabot squash and merge will squash and merge this PR after your CI passes on it
  • @dependabot cancel merge will cancel a previously requested merge and block automerging
  • @dependabot reopen will reopen this PR if it is closed
  • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
  • @dependabot show <dependency name> ignore conditions will show all of the ignore conditions of the specified dependency
  • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)

Bumps [mlflow](https://github.com/mlflow/mlflow) from 2.17.2 to 2.18.0.
- [Release notes](https://github.com/mlflow/mlflow/releases)
- [Changelog](https://github.com/mlflow/mlflow/blob/master/CHANGELOG.md)
- [Commits](mlflow/mlflow@v2.17.2...v2.18.0)

---
updated-dependencies:
- dependency-name: mlflow
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <[email protected]>
@dependabot dependabot bot added the dependencies (Pull requests that update a dependency file) and python (Pull requests that update Python code) labels on Nov 25, 2024
@dependabot dependabot bot requested a review from a team November 25, 2024 05:32
@sakoush sakoush merged commit 3f2c043 into master Nov 26, 2024
26 checks passed
@sakoush sakoush deleted the dependabot/pip/runtimes/mlflow/mlflow-2.18.0 branch November 26, 2024 11:34