The Open Neural Network Exchange (ONNX) is an open standard for representing machine learning models. The biggest advantage of ONNX is that it allows interoperability across different open source AI frameworks, which in turn offers more flexibility when adopting AI frameworks.
ONNX provides TensorRT (Windows / Linux, fast NVIDIA GPU backend), DirectML (Windows, works on all GPUs) and WebGL (all platforms, works on all GPUs) backends, which have big advantages over the PyTorch backend.
Currently, the official Node.js backend does not support any GPU functionality. Would it make sense to run a Python backend to handle all of the ML work?
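For illustration, here is a minimal sketch of what the Python side could look like with the onnxruntime package, preferring a GPU execution provider (TensorRT, DirectML, CUDA) and falling back to CPU. The model path, input name, and input shape are placeholders, and which providers are actually available depends on the onnxruntime build that is installed.

```python
# Minimal sketch, not a drop-in implementation: run an ONNX model in Python
# with onnxruntime, using the fastest execution provider that is available.
# "model.onnx" and the input shape below are placeholders.
import numpy as np
import onnxruntime as ort

preferred = [
    "TensorrtExecutionProvider",  # NVIDIA GPUs (TensorRT build of onnxruntime)
    "DmlExecutionProvider",       # DirectML on Windows (onnxruntime-directml build)
    "CUDAExecutionProvider",      # NVIDIA GPUs (CUDA build)
    "CPUExecutionProvider",       # always-available fallback
]
# Keep only the providers compiled into the installed onnxruntime package.
providers = [p for p in preferred if p in ort.get_available_providers()]

session = ort.InferenceSession("model.onnx", providers=providers)

# Feed a dummy input just to show the call; a real backend would receive
# tensors from the Node.js side (e.g. over HTTP or a socket).
input_name = session.get_inputs()[0].name
dummy = np.random.rand(1, 3, 224, 224).astype(np.float32)
outputs = session.run(None, {input_name: dummy})
print([o.shape for o in outputs])
```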
https://www.npmjs.com/package/onnxruntime (Node.js, CPU only)
https://github.com/microsoft/onnxjs (WebGL, but not all ops we need are implemented)