Hi,
I have trained an object detection model on my custom data using the EfficientDet-Lite D0 architecture.
I created a TFLite model from a checkpoint using the export example.
The model trains fine according to TensorBoard, and I can run inference on my PC without problems.
I want to run this model on an NPU, so I quantized it to uint8, but when I try to run inference on that platform I get the following error:
RuntimeError: Attempting to use a delegate that only supports static-sized tensors with a graph that has dynamic-sized tensors.
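For context, the uint8 conversion I mean is the usual TFLiteConverter full-integer flow. This is only a rough sketch, not my exact script: the saved-model path, the representative dataset, and the 320x320 input size (the EfficientDet-Lite0 default) are placeholders.

import numpy as np
import tensorflow as tf

def rep_data():
    # Placeholder representative dataset: a few random input-sized samples.
    # In practice this would yield preprocessed images from my training data.
    for _ in range(100):
        yield [np.random.rand(1, 320, 320, 3).astype(np.float32)]

converter = tf.lite.TFLiteConverter.from_saved_model("exported_savedmodel/")  # placeholder path
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = rep_data
# Depending on the exported graph (e.g. custom post-processing ops),
# this op set restriction may need adjusting.
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.uint8
converter.inference_output_type = tf.uint8

with open("model_quant.tflite", "wb") as f:
    f.write(converter.convert())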
From what I have checked so far, this error means there is a tensor with a dynamic shape somewhere in the graph.
I tried examining the model with Netron, but I couldn't find any operations with dynamic shapes.
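In case it helps narrow things down, here is a small sketch of how I could list tensors with dynamic (-1) dimensions programmatically via the TFLite Python interpreter instead of Netron; the model path is a placeholder, and the 'shape_signature' field may not be present in older TF versions, so it falls back to 'shape':

import tensorflow as tf

interpreter = tf.lite.Interpreter(model_path="model_quant.tflite")  # placeholder path
interpreter.allocate_tensors()

for detail in interpreter.get_tensor_details():
    # 'shape_signature' keeps -1 for dynamic dimensions; 'shape' is the resolved shape.
    sig = detail.get("shape_signature", detail["shape"])
    if any(dim == -1 for dim in sig):
        print(detail["index"], detail["name"], sig)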
Do you have any clue what this dynamic part might be?
Thanks.