
Collective Knowledge repository for collaboratively benchmarking and optimising the embedded deep vision runtime library for the Jetson TX1

This fork is maintained by Krai Ltd.


Introduction

CK-TensorRT is an open framework for collaborative and reproducible optimisation of convolutional neural networks on the Jetson TX1. It builds on the Collective Knowledge framework and on the Deep Inference framework from Dustin Franklin (a Jetson developer at NVIDIA). In essence, CK-TensorRT is a suite of convenient wrappers with a unified JSON API for customisable building, evaluation and multi-objective optimisation of the Jetson Inference runtime library for the Jetson TX1.

Authors/contributors

Quick installation on Ubuntu

TBD

Installing general dependencies

$ sudo apt install coreutils \
                   build-essential \
                   make \
                   cmake \
                   wget \
                   git \
                   python \
                   python-pip

Installing CK-TensorRT dependencies

$ sudo apt install libqt4-dev \
                   libglew-dev \
                   libgstreamer1.0-dev

Installing CK

$ sudo pip install ck
$ ck version

Installing CK-TensorRT repository

$ ck pull repo:ck-tensorrt

Building CK-TensorRT and all dependencies via CK

The first time you run a TensorRT program (e.g. tensorrt-test), CK will build and install all missing dependencies on your machine, download the required data sets and start the benchmark:

$ ck run program:tensorrt-test
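
The same benchmark can also be invoked programmatically through CK's Python interface, which is what the unified JSON API mentioned in the introduction refers to. The following is a minimal sketch, assuming CK and the ck-tensorrt repository are installed as above; the exact set of keys accepted by the program module may vary between CK versions.

```python
# Minimal sketch: invoke the tensorrt-test benchmark via CK's Python JSON API.
# Assumes CK ('sudo pip install ck') and repo:ck-tensorrt are already installed.
import ck.kernel as ck

# Roughly equivalent to the command line: ck run program:tensorrt-test
r = ck.access({'action': 'run',
               'module_uoa': 'program',
               'data_uoa': 'tensorrt-test'})

# CK calls return a dictionary; a non-zero 'return' code signals an error.
if r['return'] > 0:
    print('CK error: ' + r.get('error', 'unknown'))
```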

Related projects and initiatives

We are working with the community to unify and crowdsource performance analysis and tuning of various DNN frameworks (or any realistic workload) using the Collective Knowledge Technology: