<div class="section" id="obtain-onnx-models">
<h1>Obtain ONNX models<a class="headerlink" href="#obtain-onnx-models" title="Permalink to this headline">¶</a></h1>
<div class="section" id="convert-from-pytorch">
<h2>Convert from PyTorch<a class="headerlink" href="#convert-from-pytorch" title="Permalink to this headline">¶</a></h2>
<p>ONNX conversion is natively supported in PyTorch with the <code class="docutils literal notranslate"><span class="pre">torch.onnx.export</span></code>
function. An example of a pre-trained PyTorch model conversion to ONNX is
provided in <code class="docutils literal notranslate"><span class="pre">tools/pytorch_to_onnx.py</span></code>:</p>
<div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="kn">import</span> <span class="nn">torch</span>
<span class="kn">from</span> <span class="nn">MobileNetV2</span> <span class="kn">import</span> <span class="n">mobilenet_v2</span>
<span class="n">dummy_input</span> <span class="o">=</span> <span class="n">torch</span><span class="o">.</span><span class="n">randn</span><span class="p">(</span><span class="mi">10</span><span class="p">,</span> <span class="mi">3</span><span class="p">,</span> <span class="mi">224</span><span class="p">,</span> <span class="mi">224</span><span class="p">)</span>
<span class="n">model</span> <span class="o">=</span> <span class="n">mobilenet_v2</span><span class="p">(</span><span class="n">pretrained</span><span class="o">=</span><span class="kc">True</span><span class="p">)</span>
<span class="n">input_names</span> <span class="o">=</span> <span class="p">[</span> <span class="s2">"input"</span> <span class="p">]</span>
<span class="n">output_names</span> <span class="o">=</span> <span class="p">[</span> <span class="s2">"output"</span> <span class="p">]</span>
<span class="n">torch</span><span class="o">.</span><span class="n">onnx</span><span class="o">.</span><span class="n">export</span><span class="p">(</span><span class="n">model</span><span class="p">,</span> <span class="n">dummy_input</span><span class="p">,</span> <span class="s2">"mobilenet_v2_pytorch.onnx"</span><span class="p">,</span> <span class="n">verbose</span><span class="o">=</span><span class="kc">True</span><span class="p">,</span> <span class="n">input_names</span><span class="o">=</span><span class="n">input_names</span><span class="p">,</span> <span class="n">output_names</span><span class="o">=</span><span class="n">output_names</span><span class="p">)</span>
</pre></div>
</div>
</div>
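Once exported, the resulting file can be sanity-checked before importing it
into N2D2. A minimal sketch, assuming the ``onnx`` and ``onnxruntime`` Python
packages are installed (they are not shipped with N2D2):

.. code-block:: python

    import numpy as np
    import onnx
    import onnxruntime as ort

    # Check that the exported graph is structurally valid ONNX.
    model = onnx.load("mobilenet_v2_pytorch.onnx")
    onnx.checker.check_model(model)

    # Run a forward pass with ONNX Runtime, using the names given at export time.
    session = ort.InferenceSession("mobilenet_v2_pytorch.onnx")
    dummy = np.random.randn(10, 3, 224, 224).astype(np.float32)
    outputs = session.run(["output"], {"input": dummy})
    print(outputs[0].shape)  # e.g. (10, 1000) for an ImageNet classifier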
<div class="section" id="convert-from-tf-keras">
<h2>Convert from TF/Keras<a class="headerlink" href="#convert-from-tf-keras" title="Permalink to this headline">¶</a></h2>
<p>ONNX conversion is not natively supported by TF/Keras. Instead, a third-party
tool must be used, like <code class="docutils literal notranslate"><span class="pre">keras2onnx</span></code> or <code class="docutils literal notranslate"><span class="pre">tf2onnx</span></code>. Currently, the <code class="docutils literal notranslate"><span class="pre">tf2onnx</span></code>
is the most active and most maintained solution.</p>
<p>The <code class="docutils literal notranslate"><span class="pre">tf2onnx</span></code> tool can be used in command line, by providing a TensorFlow
frozen graph (.pb).</p>
<div class="admonition note">
<p class="admonition-title">Note</p>
<p>Make sure to use the option <code class="docutils literal notranslate"><span class="pre">--inputs-as-nchw</span></code> on the model input(s)
because N2D2 expects NCHW inputs, but the default format in TF/Keras is
NHWC. Otherwise you would typically get an error like:</p>
<div class="highlight-default notranslate"><div class="highlight"><pre><span></span><span class="n">Error</span><span class="p">:</span> <span class="n">Unexpected</span> <span class="n">size</span> <span class="k">for</span> <span class="n">ONNX</span> <span class="nb">input</span> <span class="s2">"conv2d_77_input"</span><span class="p">:</span> <span class="n">got</span> <span class="mi">3</span> <span class="mi">224</span> <span class="mi">224</span> <span class="p">,</span> <span class="n">but</span> <span class="n">StimuliProvider</span> <span class="n">provides</span> <span class="mi">224</span> <span class="mi">224</span> <span class="mi">3</span>
</pre></div>
</div>
<p>The format of the exported ONNX graph from TF/Keras will depend on the
execution platform (CPU or GPU). The default format is NHWC on CPU and
NCHW on GPU. ONNX mandates the NCHW format for the operators, so exporting
an ONNX model on CPU can result in the insertion of many <code class="docutils literal notranslate"><span class="pre">Transpose</span></code>
operations in the graph before and after other operators.</p>
</div>
<div class="highlight-bash notranslate"><div class="highlight"><pre><span></span><span class="nv">tfmodel</span><span class="o">=</span>mobilenet_v1_1.0_224_frozen.pb
<span class="nv">onnxmodel</span><span class="o">=</span>mobilenet_v1_1.0_224.onnx
<span class="nv">url</span><span class="o">=</span>http://download.tensorflow.org/models/mobilenet_v1_2018_08_02/mobilenet_v1_1.0_224.tgz
<span class="nv">tgz</span><span class="o">=</span><span class="k">$(</span>basename <span class="nv">$url</span><span class="k">)</span>
<span class="k">if</span> <span class="o">[</span> ! -r <span class="nv">$tgz</span> <span class="o">]</span><span class="p">;</span> <span class="k">then</span>
wget -q <span class="nv">$url</span>
tar zxvf <span class="nv">$tgz</span>
<span class="k">fi</span>
python3 -m tf2onnx.convert --input <span class="nv">$tfmodel</span> --output <span class="nv">$onnxmodel</span> <span class="se">\</span>
--opset <span class="m">10</span> --verbose <span class="se">\</span>
--inputs-as-nchw input:0 <span class="se">\</span>
--inputs input:0 <span class="se">\</span>
--outputs MobilenetV1/Predictions/Reshape_1:0
</pre></div>
</div>
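For a Keras model that has not been frozen to a ``.pb`` graph, recent versions
of ``tf2onnx`` also expose a Python API. A minimal sketch, assuming the
``tensorflow`` and ``tf2onnx`` packages are installed and using
``tf.keras.applications.MobileNetV2`` purely as an illustrative model (exact
input-name handling may vary with the ``tf2onnx`` version):

.. code-block:: python

    import tensorflow as tf
    import tf2onnx

    # Illustrative Keras model; any tf.keras.Model can be used instead.
    model = tf.keras.applications.MobileNetV2(weights="imagenet")

    # Fix the input signature and request NCHW inputs, as expected by N2D2.
    spec = (tf.TensorSpec((1, 224, 224, 3), tf.float32, name="input"),)
    tf2onnx.convert.from_keras(model,
                               input_signature=spec,
                               opset=10,
                               inputs_as_nchw=["input"],
                               output_path="mobilenet_v2_keras.onnx")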
Example conversion scripts are provided for the MobileNet families:
``tools/mobilenet_v1_to_onnx.sh``, ``tools/mobilenet_v2_to_onnx.sh`` and
``tools/mobilenet_v3_to_onnx.sh``.
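After conversion, it can be worth checking that the graph really exposes NCHW
inputs and did not accumulate spurious ``Transpose`` nodes. A minimal
inspection sketch, again assuming the ``onnx`` package is installed:

.. code-block:: python

    import onnx

    model = onnx.load("mobilenet_v1_1.0_224.onnx")

    # Print the shape of each graph input; N2D2 expects (N, C, H, W).
    for inp in model.graph.input:
        dims = [d.dim_value for d in inp.type.tensor_type.shape.dim]
        print(inp.name, dims)

    # Count the Transpose nodes inserted by the converter.
    n_transpose = sum(1 for node in model.graph.node
                      if node.op_type == "Transpose")
    print("Transpose nodes:", n_transpose)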
<div class="section" id="download-pre-trained-models">
<h2>Download pre-trained models<a class="headerlink" href="#download-pre-trained-models" title="Permalink to this headline">¶</a></h2>
<p>Many already trained ONNX models are freely available and ready to use in the
ONNX Model Zoo: <a class="reference external" href="https://github.com/onnx/models/blob/master/README.md">https://github.com/onnx/models/blob/master/README.md</a></p>
</div>
</div>
</div>
</div>