# Release: 1.3.0-0.35.4
This release contains everything from 1.2.0-0.33.7 and includes and/or updates the following:
## Major Features and Improvements
- Based on Debian 9.6
## Package Additions
- boost
- conda-pack
- gensim
- h2oai::h2o
- ibis-framework
- ipyleaflet
- nbdime
- nbserverproxy
- numexpr
- openblas
- plotly
- pygdf
- pyomo
- pyomo.extras
- pyomo.solvers
- quilt[img,pytorch,torchvision]
- r-caret
- r-devtools
- r-forecast
- r-nycflights13
- r-plotly
- r-randomforest
- r-shiny
- r-sparklyr
- r-sqlite
- r-tidyverse
- s3cmd
- setproctitle
- typing
## Package Bumps
- dask 1.0.0
- distributed 1.25.1
- hadoop 2.9.2
- horovod 0.15.2
- jupyterlab 0.35.4
- mlflow 0.8.1
- pyarrow 0.11.0
- pytorch 1.0.0
- r-base 3.5.1
- ray[debug,rllib] 0.6.1
- setuptools 40.6.3
- tensorflow 1.11.0
- tensorflowonspark 1.4.1
- toree 0.3.0-incubating
- torchvision 0.2.1
## Jupyter Extensions Additions
- @jupyterlab/celltags
- dask-labextension
- jupyterlab/git
- jupyterlab-drawio
- jupyter-leaflet
- jupyterlab-kernelspy
- jupyterlab_iframe
- nbdime-jupyterlab
- nbserverproxy
- qgrid
## NVIDIA Library Bumps
- cuDNN 7.4.1.5-1+cuda9.0
- NCCL 2.3.7-1+cuda9.0
## Miscellaneous Bumps

## OpenID Connect
- Added support for specifying the following (see the sketch after this list):
  - Authorization Parameters
  - Redirect After Logout URI
  - Redirect After Logout With ID Token Hint (default: `true`)
  - Refresh Session Interval (default: `3300` seconds)
  - Whether to renew the Access Token on Expiry (default: `true`)
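For illustration, these options would be set through `OIDC_*` environment variables. A minimal sketch, assuming hypothetical variable names modeled on the `OIDC_*` convention from the Breaking Changes section; only the defaults shown are documented in this release:

```bash
# Variable names below are hypothetical; consult the service docs for the
# real ones. Only the defaults (true, 3300, true) come from these notes.
export OIDC_AUTHORIZATION_PARAMS="prompt=login"                # extra authorization parameters
export OIDC_REDIRECT_AFTER_LOGOUT_URI="https://example.com/"   # where to land after logout
export OIDC_REDIRECT_AFTER_LOGOUT_WITH_ID_TOKEN_HINT="true"    # default: true
export OIDC_REFRESH_SESSION_INTERVAL="3300"                    # seconds; default: 3300
export OIDC_RENEW_ACCESS_TOKEN_ON_EXPIRY="true"                # default: true
```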
## Breaking Changes

### Configuration
- The `OIDC_REDIRECT_URI` environment variable must now be specified as an absolute URI, since `redirect_uri_path` is deprecated
- Rename the `OIDC_AUTH_METHOD` environment variable to `OIDC_TOKEN_ENDPOINT_AUTH_METHOD` to disambiguate it from the Introspection Endpoint Authentication method (a migration sketch follows)
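A minimal migration sketch; the URI and the `client_secret_basic` value are placeholders, not values taken from this release:

```bash
# Before: relative path, resolved through the now-deprecated redirect_uri_path
# export OIDC_REDIRECT_URI="/oidc/callback"
# export OIDC_AUTH_METHOD="client_secret_basic"

# After: an absolute URI, and the renamed authentication-method variable
export OIDC_REDIRECT_URI="https://jupyter.example.com/oidc/callback"
export OIDC_TOKEN_ENDPOINT_AUTH_METHOD="client_secret_basic"
```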
### Features

- As of 0.3.0-incubating, Apache Toree has removed support for PySpark and SparkR; only the Scala and SQL interpreters remain available. The vanilla PySpark and SparkR kernels, however, retain their ability to launch pre-configured Spark jobs, as sketched below.
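A minimal sketch from the vanilla PySpark kernel (unaffected by the Toree change), assuming the image's pre-configured Spark settings are picked up by `SparkSession`:

```python
# Runs in the vanilla PySpark kernel, which retains Spark support even
# though Toree 0.3.0-incubating dropped its PySpark/SparkR interpreters.
from pyspark.sql import SparkSession

# getOrCreate() picks up the pre-configured Spark settings for the session.
spark = SparkSession.builder.appName("pyspark-kernel-check").getOrCreate()
print(spark.range(100).count())  # expect: 100
spark.stop()
```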