MegEngine is a fast, scalable, and easy-to-use deep learning framework with auto-differentiation.
NOTE: MegEngine currently supports Python installation on Linux-64bit/Windows-64bit/macOS(CPU-only)-10.14+ platforms with Python 3.5 to 3.8. On Windows 10 you can either install the Linux distribution through Windows Subsystem for Linux (WSL) or install the Windows distribution directly. Many other platforms are supported for inference.
To install the pre-built binaries via pip wheels:
python3 -m pip install megengine -f https://megengine.org.cn/whl/mge.html
Most of MegEngine's dependencies are located in the third_party directory and can be prepared by executing:
./third_party/prepare.sh
./third_party/install-mkl.sh
Some dependencies, however, must be installed manually:
- CUDA (>=10.1) and cuDNN (>=7.6) are required when building MegEngine with CUDA support.
- TensorRT (>=5.1.5) is required when building with TensorRT support.
- LLVM/Clang (>=6.0) is required when building with Halide JIT support.
- Python (>=3.5) and numpy are required to build Python modules.
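Before building, it can help to confirm the manually installed prerequisites meet the minimum versions; a quick check (the CUDA and Clang commands only apply if you build the corresponding variants) might look like:

```shell
# Prerequisites for building the Python modules
python3 --version                                    # needs >= 3.5
python3 -c "import numpy; print(numpy.__version__)"  # numpy must be importable

# Only needed for the corresponding build variants:
# nvcc --version    # CUDA toolkit, needs >= 10.1
# clang --version   # LLVM/Clang, needs >= 6.0 (Halide JIT support)
```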
MegEngine uses CMake as the build tool. We provide the following scripts to facilitate building.
- host_build.sh builds MegEngine that runs on the same host machine (i.e., no cross compiling).
The following command displays the usage:
scripts/cmake-build/host_build.sh -h
- cross_build_android_arm_inference.sh builds MegEngine for DNN inference on Android-ARM platforms.
The following command displays the usage:
scripts/cmake-build/cross_build_android_arm_inference.sh -h
- cross_build_linux_arm_inference.sh builds MegEngine for DNN inference on Linux-ARM platforms.
The following command displays the usage:
scripts/cmake-build/cross_build_linux_arm_inference.sh -h
- cross_build_ios_arm_inference.sh builds MegEngine for DNN inference on iOS (iPhone/iPad) platforms.
The following command displays the usage:
scripts/cmake-build/cross_build_ios_arm_inference.sh -h
Please refer to BUILD_README.md for more details.
- MegEngine adopts the Contributor Covenant as the guideline for running our community. Please read the Code of Conduct.
- Every contributor of MegEngine must sign a Contributor License Agreement (CLA) to clarify the intellectual property license granted with the contributions.
- You can help improve MegEngine in many ways:
- Write code.
- Improve documentation.
- Answer questions on MegEngine Forum, or Stack Overflow.
- Contribute new models in MegEngine Model Hub.
- Try a new idea on MegStudio.
- Report or investigate bugs and issues.
- Review Pull Requests.
- Star MegEngine repo.
- Cite MegEngine in your papers and articles.
- Recommend MegEngine to your friends.
- Any other form of contribution is welcome.
We strive to build an open and friendly community. We aim to power humanity with AI.
- Issue: github.com/MegEngine/MegEngine/issues
- Email: [email protected]
- Forum: discuss.megengine.org.cn
- QQ Group: 1029741705
- OPENI: openi.org.cn/MegEngine
MegEngine is licensed under the Apache License, Version 2.0.
Copyright (c) 2014-2021 Megvii Inc. All rights reserved.