I don't know how to do that; I tried to export a 512x512 ONNX model, but the export failed. It is possible, however, to export ONNX models with dynamic width and height, although I previously ran into problems doing inference on dynamic-width/height models with onnxruntime. @antithing
Hi, and thanks for making this code available!
Is it possible to run at low resolution, e.g. 512x512, and reach faster inference speeds?
I am aiming for 20ms per image.