New Relic's MLOps solution provides a Python library that makes it easy to monitor your ML models.
The key features:
- Easy to integrate
- ML framework agnostic
- Full compatibility with the New Relic observability framework
- Based on the newrelic-telemetry-sdk-python library
- Monitoring and alerting of:
  - Predictions
  - Features
  - Missing values
  - Custom model and data drift
  - Custom inference data
  - Custom performance metrics
- Documentation - Overview of the New Relic MLOps docs and related resources.
- Additional Guides - Learn about New Relic's Telemetry Software Development Kit.
- Try out an XGBoost model on the California housing prices dataset (a condensed sketch follows this list).
- Try out a TensorFlow model on the California housing prices dataset.
- Try simulating 24 hours of model inference data using New Relic MLOps.
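The linked notebooks walk through these examples in full. As a quick taste, the condensed sketch below trains an XGBoost regressor on the California housing dataset and streams its inference data to New Relic. It assumes `xgboost` and `scikit-learn` are installed and that `NEW_RELIC_LICENSE_KEY` is set; the monitor construction mirrors the quick start further down, and the model name is illustrative (check the example notebooks for the exact arguments they use).

```python
import xgboost as xgb
from sklearn.datasets import fetch_california_housing
from sklearn.model_selection import train_test_split

from ml_performance_monitoring.monitor import MLPerformanceMonitoring

# Train a small XGBoost regressor on the California housing dataset
housing = fetch_california_housing(as_frame=True)
X_train, X_test, y_train, y_test = train_test_split(
    housing.data, housing.target, test_size=0.2, random_state=42
)
model = xgb.XGBRegressor(n_estimators=50, max_depth=4)
model.fit(X_train, y_train)

# Monitor construction mirrors the quick start below
ml_monitor = MLPerformanceMonitoring(
    insert_key=None,  # falls back to the NEW_RELIC_LICENSE_KEY environment variable
    model_name="California housing XGBoost",  # illustrative name
    features_columns=list(X_train.columns),
    labels_columns=["MedHouseVal"],
    model_version="1.0",
)

# Record the test-set predictions as inference data
y_pred = model.predict(X_test)
ml_monitor.record_inference_data(X_test, y_pred)
```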
Easily try an end-to-end example of model monitoring in three steps:
```python
# STEP 1: Initialize the monitoring
ml_monitor = MLPerformanceMonitoring(...)

# STEP 2: Add your algorithm
y = my_model.predict(X)

# STEP 3: Record your data
ml_monitor.record_inference_data(X, y)
```
With pip:
```bash
pip install git+https://github.com/newrelic-experimental/ml-performance-monitoring.git
```
Get your license key (also referenced as an ingest - license key) and set it as the environment variable `NEW_RELIC_LICENSE_KEY`.
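One convenient way to set the key from Python, before the monitor is created (alternatively, export `NEW_RELIC_LICENSE_KEY` in your shell):

```python
import os

# Make the ingest license key available to the library before constructing
# the monitor (the placeholder value below is illustrative).
os.environ["NEW_RELIC_LICENSE_KEY"] = "<your-ingest-license-key>"
```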
Click here for more details and instructions.
Are you reporting data to the New Relic EU region? Click here for more instructions.
```python
from ml_performance_monitoring.monitor import MLPerformanceMonitoring

metadata = {"environment": "notebook"}
model_version = "1.0"
features_columns, labels_columns = (
    ["feature_1", "feature_2", "feature_3", "feature_4"],
    ["target"],
)

ml_monitor = MLPerformanceMonitoring(
    insert_key=None,  # set the environment variable NEW_RELIC_LICENSE_KEY or send your insert key here
    model_name="My stunning model",
    metadata=metadata,
    features_columns=features_columns,
    labels_columns=labels_columns,
    model_version=model_version,
)

# `my_model` is your trained model and `X` is its input data
y = my_model.predict(X)

ml_monitor.record_inference_data(X, y)
```
Done! Check your application in the New Relic UI to see the real-time data.
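You can also report custom performance metrics alongside the inference data. The sketch below assumes the library's `record_metrics` helper and the `ml_monitor` created above; check the repository README for the exact signature.

```python
# Report custom model performance metrics (the values here are illustrative,
# e.g. computed on a held-out validation set). `record_metrics` is assumed
# from the library's README; verify the exact API in the repository.
metrics = {
    "Accuracy": 0.9,
    "Recall": 0.8,
}
ml_monitor.record_metrics(metrics=metrics)
```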
If you are using an EU account, set the EVENT_CLIENT_HOST and METRIC_CLIENT_HOST environment variables as follows:
- US region account (default):
  - EVENT_CLIENT_HOST: insights-collector.newrelic.com
  - METRIC_CLIENT_HOST: metric-api.newrelic.com
- EU region account:
  - EVENT_CLIENT_HOST: insights-collector.eu01.nr-data.net
  - METRIC_CLIENT_HOST: metric-api.eu.newrelic.com/metric/v1

They can also be passed as parameters when calling MLPerformanceMonitoring.
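For example, an EU account can be pointed at the right endpoints from Python before the monitor is created (a minimal sketch; the equivalent constructor parameter names can be checked in the library source):

```python
import os

# Point the event and metric clients at the EU endpoints; the US endpoints
# are the defaults, so this is only needed for EU accounts.
os.environ["EVENT_CLIENT_HOST"] = "insights-collector.eu01.nr-data.net"
os.environ["METRIC_CLIENT_HOST"] = "metric-api.eu.newrelic.com/metric/v1"
```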
Because this is an open source library, customers can get help by opening GitHub issues in the repository, where they can interact with New Relic employees as well as other customers.
We encourage your contributions to improve ml-performance-monitoring! Keep in mind that when you submit your pull request, you'll need to sign the CLA via the click-through using CLA-Assistant. You only have to sign the CLA once per project. If you have any questions, or would like to execute our corporate CLA (required if your contribution is on behalf of a company), please drop us an email at [email protected].
A note about vulnerabilities: As noted in our security policy, New Relic is committed to the privacy and security of our customers and their data. We believe that providing coordinated disclosure by security researchers and engaging with the security community are important means to achieve our security goals.
If you believe you have found a security vulnerability in this project or any of New Relic's products or websites, we welcome and greatly appreciate you reporting it to New Relic through HackerOne.
ml-performance-monitoring is licensed under the Apache 2.0 License.
ml-performance-monitoring also uses source code from third-party libraries. You can find full details on which libraries are used and the terms under which they are licensed in the third-party notices document.