<!DOCTYPE html>
<html lang="en">
<head>
<base href=".">
<meta http-equiv="Content-Type" content="text/html; charset=utf-8">
<meta http-equiv="X-UA-Compatible" content="IE=edge">
<meta name="viewport" content="width=device-width, initial-scale=1">
<title>How to run examples</title>
<link rel="stylesheet" href="assets/css/dark-frontend.css" type="text/css" title="dark">
<link rel="alternate stylesheet" href="assets/css/light-frontend.css" type="text/css" title="light">
<link rel="stylesheet" href="assets/css/bootstrap-toc.min.css" type="text/css">
<link rel="stylesheet" href="assets/css/jquery.mCustomScrollbar.min.css">
<link rel="stylesheet" href="assets/js/search/enable_search.css" type="text/css">
<link rel="stylesheet" href="assets/css/extra_frontend.css" type="text/css">
<link rel="stylesheet" href="assets/css/prism-tomorrow.css" type="text/css" title="dark">
<link rel="alternate stylesheet" href="assets/css/prism.css" type="text/css" title="light">
<script src="assets/js/mustache.min.js"></script>
<script src="assets/js/jquery.js"></script>
<script src="assets/js/bootstrap.js"></script>
<script src="assets/js/scrollspy.js"></script>
<script src="assets/js/typeahead.jquery.min.js"></script>
<script src="assets/js/search.js"></script>
<script src="assets/js/compare-versions.js"></script>
<script src="assets/js/jquery.mCustomScrollbar.concat.min.js"></script>
<script src="assets/js/bootstrap-toc.min.js"></script>
<script src="assets/js/jquery.touchSwipe.min.js"></script>
<script src="assets/js/anchor.min.js"></script>
<script src="assets/js/tag_filtering.js"></script>
<script src="assets/js/language_switching.js"></script>
<script src="assets/js/styleswitcher.js"></script>
<script src="assets/js/lines_around_headings.js"></script>
<script src="assets/js/prism-core.js"></script>
<script src="assets/js/prism-autoloader.js"></script>
<script src="assets/js/prism_autoloader_path_override.js"></script>
<script src="assets/js/trie.js"></script>
<link rel="icon" type="image/png" href="assets/images/nnstreamer_logo.png">
</head>
<body class="no-script
">
<script>
$('body').removeClass('no-script');
</script>
<nav class="navbar navbar-fixed-top navbar-default" id="topnav">
<div class="container-fluid">
<div class="navbar-right">
<a id="toc-toggle">
<span class="glyphicon glyphicon-menu-right"></span>
<span class="glyphicon glyphicon-menu-left"></span>
</a>
<button type="button" class="navbar-toggle collapsed" data-toggle="collapse" data-target="#navbar-wrapper" aria-expanded="false">
<span class="sr-only">Toggle navigation</span>
<span class="icon-bar"></span>
<span class="icon-bar"></span>
<span class="icon-bar"></span>
</button>
<span title="light mode switch" class="glyphicon glyphicon-sunglasses pull-right" id="lightmode-icon"></span>
<form class="navbar-form pull-right" id="navbar-search-form">
<div class="form-group has-feedback">
<input type="text" class="form-control input-sm" name="search" id="sidenav-lookup-field" placeholder="search" disabled>
<span class="glyphicon glyphicon-search form-control-feedback" id="search-mgn-glass"></span>
</div>
</form>
</div>
<div class="navbar-header">
<a id="sidenav-toggle">
<span class="glyphicon glyphicon-menu-right"></span>
<span class="glyphicon glyphicon-menu-left"></span>
</a>
<a id="home-link" href="index.html" class="hotdoc-navbar-brand">
<img src="assets/images/nnstreamer_logo.png" alt="Home">
</a>
</div>
<div class="navbar-collapse collapse" id="navbar-wrapper">
<ul class="nav navbar-nav" id="menu">
<li class="dropdown">
<a class="dropdown-toggle" role="button" data-toggle="dropdown" aria-haspopup="true" aria-expanded="false">
API References<span class="caret"></span>
</a>
<ul class="dropdown-menu" id="modules-menu">
<li>
<a href="doc-index.html">NNStreamer doc</a>
</li>
<li>
<a href="gst/nnstreamer/README.html">NNStreamer Elements</a>
</li>
<li>
<a href="nnstreamer-example/index.html">NNStreamer Examples</a>
</li>
<li>
<a href="API-reference.html">API reference</a>
</li>
</ul>
</li>
<li>
<a href="doc-index.html">Documents</a>
</li>
<li>
<a href="gst/nnstreamer/README.html">Elements</a>
</li>
<li>
<a href="tutorials.html">Tutorials</a>
</li>
<li>
<a href="API-reference.html">API reference</a>
</li>
</ul>
<div class="hidden-xs hidden-sm navbar-text navbar-center">
</div>
</div>
</div>
</nav>
<main>
<div data-extension="core" data-hotdoc-in-toplevel="True" data-hotdoc-project="NNStreamer" data-hotdoc-ref="how-to-run-examples.html" class="page_container" id="page-wrapper">
<script src="assets/js/utils.js"></script>
<div class="panel panel-collapse oc-collapsed" id="sidenav" data-hotdoc-role="navigation">
<script src="assets/js/full-width.js"></script>
<div id="sitenav-wrapper">
<iframe src="hotdoc-sitemap.html" id="sitenav-frame"></iframe>
</div>
</div>
<div id="body">
<div id="main">
<div id="page-description" data-hotdoc-role="main">
<h2 id="note-more-examples-can-be-found-here-nnstreamerexample">Note: more examples can be found in the <a href="https://github.com/nnstreamer/nnstreamer-example">nnstreamer-example</a> repository.
</h2>
<h1 id="table-of-contents">Table of Contents</h1>
<ul>
<li>
<a href="how-to-run-examples.html#preparing-nnstreamer-for-execution">Preparing nnstreamer for execution</a>
<ul>
<li><a href="how-to-run-examples.html#use-ppa">Use PPA</a></li>
<li><a href="how-to-run-examples.html#build-examples-ubuntu-1604-and-1804">Build examples (Ubuntu 16.04 and 18.04)</a></li>
</ul>
</li>
<li>
<a href="how-to-run-examples.html#usage-examples">Usage Examples</a>
<ul>
<li><a href="how-to-run-examples.html#example-camera-liveview-image-classification-w-gstlaunch-decoded-by-tensor_decoder">Example : camera live-view image classification. w/ gst-launch, decoded by tensor_decoder</a></li>
<li><a href="how-to-run-examples.html#example-camera-liveview-image-classification-decoded-by-user-application">Example : camera live-view image classification, decoded by user application</a></li>
<li><a href="how-to-run-examples.html#example-camera-liveview-object-detection-decoded-by-user-application-with-tensorflowlite">Example : camera live-view object detection, decoded by user application with Tensorflow-Lite</a></li>
<li><a href="how-to-run-examples.html#example-camera-liveview-object-detection-decoded-by-user-application-with-tensorflow">Example : camera live-view object detection, decoded by user application with Tensorflow</a></li>
<li><a href="how-to-run-examples.html#example-video-mixer-with-nnstreamer-plugin">Example : video mixer with NNStreamer plug-in</a></li>
<li><a href="how-to-run-examples.html#example-tensor-sink">Example : tensor sink</a></li>
</ul>
</li>
</ul>
<h1 id="preparing-nnstreamer-for-execution">Preparing nnstreamer for execution</h1>
<h2 id="use-ppa">Use PPA</h2>
<p>If you would rather not build the binaries yourself, you can install them directly from the PPA, which provides daily releases:</p>
<ul>
<li>Install nnstreamer:</li>
</ul>
<pre><code>$ sudo add-apt-repository ppa:nnstreamer/ppa
$ sudo apt-get update
$ sudo apt-get install nnstreamer
</code></pre>
<p>Note that this may also install the TensorFlow package provided by the PPA.</p>
<ul>
<li>Install nnstreamer-example:</li>
</ul>
<pre><code>$ sudo add-apt-repository ppa:nnstreamer-example/ppa
$ sudo apt-get update
$ sudo apt-get install nnstreamer-example
$ cd /usr/lib/nnstreamer/bin
</code></pre>
<p>As of 2018/10/13, Ubuntu 16.04 and 18.04 are supported.</p>
<h2 id="build-examples-ubuntu-1604-and-1804">Build examples (Ubuntu 16.04 and 18.04)</h2>
<ul>
<li><a href="getting-started.html">getting-started</a></li>
</ul>
<ul>
<li>Install the following packages before building nnstreamer and the examples:</li>
</ul>
<ol>
<li>ninja-build, meson (>=0.50)</li>
<li>liborc (>=0.4.25, optional)</li>
<li>tensorflow, protobuf (>=3.6.1)</li>
<li>tensorflow-lite</li>
</ol>
<ul>
<li>
<p>Build options (meson)</p>
<ul>
<li>You can find the option definitions here: <a href="https://github.com/nnstreamer/nnstreamer/blob/main/meson_options.txt">meson options</a>
</li>
<li>For more information about meson build options in general, see <a href="https://mesonbuild.com/Build-options.html">here</a>.</li>
</ul>
<p>For example, to configure a build of NNStreamer with TensorFlow2-Lite support disabled:</p>
<pre><code>$ meson --prefix=${NNST_ROOT} --sysconfdir=${NNST_ROOT} --libdir=lib --bindir=bin --includedir=include -Dtflite2-support=disabled build
</code></pre>
</li>
<li>
<p>Build source code</p>
</li>
</ul>
<pre><code># Install packages for python example
$ sudo apt-get install python-gi python3-gi
$ sudo apt-get install python-gst-1.0 python3-gst-1.0
$ sudo apt-get install python-gst-1.0-dbg python3-gst-1.0-dbg
# Set your own path to install libraries and header files
$ sudo vi ~/.bashrc
export NNST_ROOT=$HOME/nnstreamer
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$NNST_ROOT/lib
export GST_PLUGIN_PATH=$GST_PLUGIN_PATH:$NNST_ROOT/lib/gstreamer-1.0
# Include NNStreamer headers and libraries
export C_INCLUDE_PATH=$C_INCLUDE_PATH:$NNST_ROOT/include
export CPLUS_INCLUDE_PATH=$CPLUS_INCLUDE_PATH:$NNST_ROOT/include
export PKG_CONFIG_PATH=$PKG_CONFIG_PATH:$NNST_ROOT/lib/pkgconfig
$ source ~/.bashrc
# Download source, then compile it.
# Build and install nnstreamer
$ git clone https://github.com/nnstreamer/nnstreamer.git nnstreamer.git
$ cd nnstreamer.git
$ meson --prefix=${NNST_ROOT} --sysconfdir=${NNST_ROOT} --libdir=lib --bindir=bin --includedir=include build
$ ninja -C build install
$ cd ..
# Build and install examples
$ git clone https://github.com/nnstreamer/nnstreamer-example.git nnstreamer-example.git
$ cd nnstreamer-example.git
$ meson --prefix=${NNST_ROOT} --libdir=lib --bindir=bin --includedir=include build
$ ninja -C build install
$ rm -rf build
$ cd ..
</code></pre>
<ul>
<li>Download the TensorFlow-Lite model &amp; labels:</li>
</ul>
<pre><code>$ cd $NNST_ROOT/bin/
$ ./get-model.sh image-classification-tflite # for the example image classification
$ ./get-model.sh object-detection-tflite # for the example object detection
</code></pre>
<hr>
<h1 id="usage-examples">Usage Examples</h1>
<h2 id="example-camera-liveview-image-classification-w-gstlaunch-decoded-by-tensor_decoder">Example : camera live-view image classification. w/ gst-launch, decoded by tensor_decoder</h2>
<pre><code>$ cd $NNST_ROOT/bin/
$ gst-launch-1.0 textoverlay name=overlay font-desc="Sans, 26" ! \
videoconvert ! ximagesink name=img_test \
v4l2src name=cam_src ! videoconvert ! videoscale ! video/x-raw,width=640,height=480,format=RGB ! \
tee name=t_raw \
t_raw. ! queue ! overlay.video_sink \
t_raw. ! queue ! videoscale ! video/x-raw,width=224,height=224 ! \
tensor_converter ! \
tensor_filter framework=tensorflow-lite \
model=tflite_model_img/mobilenet_v1_1.0_224_quant.tflite ! \
tensor_decoder mode=image_labeling \
option1=tflite_model_img/labels.txt ! \
overlay.text_sink
</code></pre>
<p>The stream pipeline in this example is:</p>
<pre><code>[CAM] - [videoconvert] - [videoscale] - [tee] -+- [queue] - [videoscale] - [tensor_converter] - [tensor_filter] - [tensor_decoder] -+- [textoverlay] - [videoconvert] - [ximagesink]
+- [queue] --------------------------------------------------------------------------+
</code></pre>
<p>A C-API implementation of the same stream pipeline is available at <code>nnstreamer-example.git/native/example_decoder_image_labelling/</code>.</p>
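The same launch line can also be driven from Python via PyGObject, which accepts the identical textual pipeline syntax. A minimal sketch, assuming PyGObject and the NNStreamer plugins are installed; the model and label paths are the ones used above, and `launch` is an illustrative helper, not part of the shipped examples:

```python
# Sketch: drive the classification pipeline from Python instead of gst-launch.
# The description string below mirrors the gst-launch invocation above.
MODEL = "tflite_model_img/mobilenet_v1_1.0_224_quant.tflite"
LABELS = "tflite_model_img/labels.txt"

PIPELINE_DESC = (
    'textoverlay name=overlay font-desc="Sans, 26" ! '
    "videoconvert ! ximagesink name=img_test "
    "v4l2src name=cam_src ! videoconvert ! videoscale ! "
    "video/x-raw,width=640,height=480,format=RGB ! tee name=t_raw "
    "t_raw. ! queue ! overlay.video_sink "
    "t_raw. ! queue ! videoscale ! video/x-raw,width=224,height=224 ! "
    "tensor_converter ! "
    "tensor_filter framework=tensorflow-lite model=" + MODEL + " ! "
    "tensor_decoder mode=image_labeling option1=" + LABELS + " ! "
    "overlay.text_sink"
)

def launch(desc):
    # Requires PyGObject (python3-gi) with GStreamer introspection data
    # and the NNStreamer plugins on GST_PLUGIN_PATH.
    import gi
    gi.require_version("Gst", "1.0")
    from gi.repository import Gst, GLib
    Gst.init(None)
    pipeline = Gst.parse_launch(desc)
    pipeline.set_state(Gst.State.PLAYING)
    GLib.MainLoop().run()

# launch(PIPELINE_DESC)  # uncomment to run; blocks until interrupted
```

Keeping the pipeline as one description string means the gst-launch command and the application stay trivially in sync.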
<h2 id="example-camera-liveview-image-classification-decoded-by-user-application">Example : camera live-view image classification, decoded by user application.</h2>
<pre><code>[CAM] - [videoconvert] - [videoscale] - [tee] -+- [queue] - [textoverlay] - [videoconvert] - [ximagesink]
+- [queue] - [videoscale] - [tensor_converter] - [tensor_filter] - [tensor_sink]
</code></pre>
<p>Displays a video sink.</p>
<ol>
<li>
<code>tensor_filter</code> for image recognition (classification with a 224x224 image).</li>
<li>The example application gets the buffer from <code>tensor_sink</code>, then updates the recognition result displayed in <code>textoverlay</code>.</li>
</ol>
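The application-side decoding in step 2 can be sketched in plain Python: the quantized MobileNet model emits one uint8 score per class, and the handler picks the arg-max and looks up its label from labels.txt. The helper name and the sample data below are illustrative, not the example's actual code:

```python
def decode_classification(scores, labels):
    """Pick the label with the highest score from a uint8 score tensor.

    `scores` is the raw byte content pulled from tensor_sink (one uint8
    per class); `labels` is the list of lines read from labels.txt.
    """
    best = max(range(len(scores)), key=lambda i: scores[i])
    return labels[best], scores[best]

# Illustrative data: three classes, the second one wins.
labels = ["background", "cat", "dog"]
label, score = decode_classification(bytes([3, 250, 7]), labels)
```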
<ul>
<li>Run example</li>
</ul>
<pre><code>$ cd $NNST_ROOT/bin/
$ ./nnstreamer_example_image_classification # run C/C++ example
$ python ./nnstreamer_example_image_classification.py # run Python example
</code></pre>
<ul>
<li>Screenshot</li>
</ul>
<p><img src="Documentation/media/exam_filter_keyboard.png" width="320"> <img src="Documentation/media/exam_filter_mouse.png" width="320"></p>
<h2 id="example-camera-liveview-object-detection-decoded-by-user-application-with-tensorflowlite">Example : camera live-view object detection, decoded by user application with Tensorflow-Lite.</h2>
<pre><code>[CAM] - [videoconvert] - [videoscale] - [tee] -+- [queue] - [videoconvert] - [cairooverlay] - [ximagesink]
+- [queue] - [videoscale] - [tensor_converter] - [tensor_transform] - [tensor_filter] - [tensor_sink]
</code></pre>
<p>Displays a video sink.</p>
<ol>
<li>
<code>tensor_transform</code> for typecasting (<code>uint8</code> to <code>float32</code>) and normalizing the <code>other/tensor</code> stream produced by the previous element.</li>
<li>
<code>tensor_filter</code> for SSD (Single-Shot object Detection). It generates <code>other/tensors</code> as its output.</li>
<li>The example application gets the buffer from <code>tensor_sink</code>, then updates the recognition result displayed in <code>cairooverlay</code>.</li>
</ol>
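The typecast-and-normalize step (1) can be pictured in plain Python. Assuming the transform maps uint8 pixels onto [-1.0, 1.0] via an option string along the lines of `typecast:float32,add:-127.5,div:127.5` (check the bundled script for the exact string used), it is equivalent to:

```python
def transform(frame_u8):
    # Equivalent of tensor_transform with an assumed option of
    # typecast:float32,add:-127.5,div:127.5 (see the bundled script
    # for the exact string): cast each uint8 pixel to float and map
    # the range [0, 255] onto [-1.0, 1.0].
    return [(float(v) - 127.5) / 127.5 for v in frame_u8]

normalized = transform(bytes([0, 128, 255]))
```

Performing this inside the pipeline keeps the application free of per-pixel work; the model then receives data already in the range it was trained on.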
<ul>
<li>Run example - with execution file</li>
</ul>
<pre><code class="language-bash">$ cd $NNST_ROOT/bin/
$ ./nnstreamer_example_object_detection_tflite # run C/C++ example
</code></pre>
<ul>
<li>Run example - with bash script</li>
</ul>
<pre><code class="language-bash">$ cd $NNST_ROOT/bin/
$ ./gst-launch-object-detection-tflite.sh # run bash script example
</code></pre>
<ul>
<li>Screenshot</li>
</ul>
<img src="Documentation/media/exam_object_detection_result_tflite.png" width="640">
<h2 id="example-camera-liveview-object-detection-decoded-by-user-application-with-tensorflow">Example : camera live-view object detection, decoded by user application with Tensorflow.</h2>
<pre><code>[CAM] - [videoconvert] - [videoscale] - [tee] -+- [queue] - [videoconvert] - [cairooverlay] - [ximagesink]
+- [queue] - [videoscale] - [tensor_converter] - [tensor_filter] - [tensor_sink]
</code></pre>
<p>Displays a video sink.</p>
<ol>
<li>
<code>tensor_filter</code> for SSD (Single-Shot object Detection). It generates <code>other/tensors</code> as its output.</li>
<li>The example application gets the buffer from <code>tensor_sink</code>, then updates the recognition result displayed in <code>cairooverlay</code>.</li>
</ol>
<ul>
<li>Run example - with execution file</li>
</ul>
<pre><code class="language-bash">$ cd $NNST_ROOT/bin/
$ ./nnstreamer_example_object_detection_tf # run C/C++ example
</code></pre>
<ul>
<li>Run example - with bash script</li>
</ul>
<pre><code class="language-bash">$ cd $NNST_ROOT/bin/
$ ./gst-launch-object-detection-tf.sh # run bash script example
</code></pre>
<ul>
<li>Screenshot</li>
</ul>
<img src="Documentation/media/exam_object_detection_result_tf.png" width="640">
<h2 id="example-video-mixer-with-nnstreamer-plugin">Example : video mixer with NNStreamer plug-in</h2>
<pre><code>[CAM] - [videoconvert] - [videoscale] - [tee] -+- [queue] -------------------------------------------------------+- [videomixer] - [videoconvert] -- [ximagesink (Mixed)]
+- [queue] - [tensor_converter] - [tensor_decoder] - [videoscale]-+
+- [queue] - [videoconvert] - [ximagesink (Original)]
</code></pre>
<p>Displays two video sinks:</p>
<ol>
<li>Original from cam</li>
<li>Mixed: the original plus a scaled copy (tensor_converter - tensor_decoder - videoscale)</li>
</ol>
<p>In this pipeline, the converter-decoder pair simply passes the video frames through.</p>
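The pass-through behavior can be sketched in Python: tensor_converter reinterprets an RGB frame as other/tensor without touching the payload, and tensor_decoder (in its direct-video mode) reinterprets it back, so the round trip is the identity on the bytes. Function names and the tiny frame are illustrative:

```python
def video_to_tensor(frame):
    # tensor_converter: video/x-raw(RGB) -> other/tensor; the bytes are
    # unchanged, only the caps (type information) differ.
    return bytes(frame)

def tensor_to_video(tensor):
    # tensor_decoder in direct-video mode: other/tensor -> video/x-raw.
    return bytes(tensor)

frame = bytes(range(12))  # a fake 2x2 RGB frame (12 bytes)
roundtrip = tensor_to_video(video_to_tensor(frame))
```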
<ul>
<li>Run example</li>
</ul>
<pre><code>$ cd $NNST_ROOT/bin/
$ ./nnstreamer_example_cam
</code></pre>
<ul>
<li>Screenshot</li>
</ul>
<img src="Documentation/media/exam_cam_mix.png" width="640">
<h2 id="example-tensor-sink">Example : tensor sink</h2>
<p>Two simple examples of using <code>tensor_sink</code>.</p>
<h4 id="1-100-buffers-passed-to-tensor-sink">1. 100 buffers passed to tensor sink</h4>
<pre><code>[videotestsrc] - [tensor_converter] - [tensor_sink]
</code></pre>
<p>Displays nothing; this sample code shows how to get buffers from <code>tensor_sink</code>.</p>
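The handler pattern can be sketched as below: the sink delivers one callback per buffer, and the sample simply counts what it receives. The class and its names are illustrative placeholders, not the example's actual code:

```python
class BufferCounter:
    """Illustrative per-buffer handler: count incoming buffers."""

    def __init__(self, limit=100):
        self.count = 0
        self.limit = limit

    def on_new_data(self, sink, buffer):
        # In the real example this runs once per buffer delivered by
        # tensor_sink; here `sink` and `buffer` are opaque placeholders.
        self.count += 1
        if self.count == self.limit:
            print("received", self.count, "buffers")

counter = BufferCounter(limit=100)
for _ in range(100):
    counter.on_new_data(sink=None, buffer=b"\x00" * 16)
```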
<ul>
<li>Run example</li>
</ul>
<pre><code>$ cd $NNST_ROOT/bin/
$ ./nnstreamer_sink_example
</code></pre>
<h4 id="2-launch-two-pipelines">2. Launch two pipelines</h4>
<pre><code>[videotestsrc] - [tensor_converter] - [tensor_sink]
[appsrc] - [tensor_decoder] - [videoconvert] - [ximagesink]
[push buffer from tensor_sink to appsrc]
</code></pre>
<p>Displays a video sink.</p>
<p><code>tensor_sink</code> in the first pipeline receives each buffer and pushes it into the <code>appsrc</code> of the second pipeline.</p>
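The hand-off between the two pipelines can be sketched as a callback that forwards each buffer. In the real example the push is appsrc's buffer push; here a plain function stands in for it, and all names are illustrative:

```python
class Relay:
    """Forward buffers from a tensor_sink handler into an appsrc-like push."""

    def __init__(self, push):
        # `push` stands in for pushing a buffer into the second
        # pipeline's appsrc element.
        self.push = push

    def on_new_data(self, sink, buffer):
        # Called once per buffer arriving at tensor_sink; forward as-is.
        self.push(buffer)

received = []
relay = Relay(received.append)
for payload in (b"frame0", b"frame1", b"frame2"):
    relay.on_new_data(sink=None, buffer=payload)
```

Decoupling the two pipelines this way lets the application inspect or modify buffers between them without writing a custom GStreamer element.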
<ul>
<li>Run example</li>
</ul>
<pre><code>$ cd $NNST_ROOT/bin/
$ ./nnstreamer_sink_example_play
</code></pre>
</div>
<div id="subpages">
<p><b>Subpages:</b></p>
<div class="thumb-subpages">
</div>
<p>
<a href="gst-launch-script-example.html">gst-launch script examples</a>
</p>
</div>
</div>
<div id="search_results">
<p>The results of the search are</p>
</div>
<div id="footer">
</div>
</div>
<div id="toc-column">
<div class="edit-button">
</div>
<div id="toc-wrapper">
<nav id="toc"></nav>
</div>
</div>
</div>
</main>
<script src="assets/js/navbar_offset_scroller.js"></script>
</body>
</html>