Which version are you using?
When I run the mmclassification prediction demo, I get the correct prediction results, but when I export the model to ONNX with torch.onnx.export(model, x, "resnet50.onnx", verbose=True), I always get an error. The same error occurs when I try to export a traced model for invocation from C++ using traced_module = torch.jit.trace(net, x).
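
For comparison, here is a minimal sketch of the export calls described above, using a plain torchvision ResNet-50 as a stand-in for the actual mmclassification model (the ExportWrapper class, input shape, and opset version below are illustrative assumptions, not part of the original report). mmclassification classifiers typically accept extra forward arguments such as a return_loss flag, which torch.onnx.export and torch.jit.trace cannot handle directly, so tracing a thin single-tensor wrapper is a common workaround:

```python
import torch
import torchvision


class ExportWrapper(torch.nn.Module):
    """Hypothetical wrapper exposing a single-tensor forward for tracing.

    mmclassification models usually take additional forward arguments
    (e.g. return_loss), which tracing cannot handle, so the inference-only
    path is wrapped before export.
    """

    def __init__(self, model):
        super().__init__()
        self.model = model

    def forward(self, x):
        # Replace this with the inference-only path of the wrapped model.
        return self.model(x)


# Stand-in model; substitute the loaded mmclassification classifier here.
model = torchvision.models.resnet50().eval()
wrapped = ExportWrapper(model).eval()

# Dummy input matching the shape used during prediction.
x = torch.randn(1, 3, 224, 224)

with torch.no_grad():
    # ONNX export of the wrapped, inference-only model.
    torch.onnx.export(
        wrapped,
        x,
        "resnet50.onnx",
        verbose=True,
        input_names=["input"],
        output_names=["output"],
        opset_version=11,
    )

    # The same wrapper also traces cleanly for use from C++ (libtorch).
    traced_module = torch.jit.trace(wrapped, x)
    traced_module.save("resnet50_traced.pt")
```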