Translation of inputs to Resize, when using 'sizes' argument, drops required inputs #2007

Open
kubaraczkowski opened this issue Jan 13, 2025 · 1 comment
Labels: bug (Something isn't working), topic: ast converter

Comments

@kubaraczkowski

Hi,

I found a bug when translating a Resize op that uses the 'sizes' argument.
The optional inputs 'roi' and 'scales' are dropped because they are None, but onnxruntime requires them to be present as empty placeholders so that 'sizes' ends up in the correct (fourth) input slot.

The strange thing is that eager-mode execution ('from function') works fine, but the exported model is missing those inputs.
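
For reference, here is a minimal sketch (the tensor names "X", "sizes" and "Y" are just illustrative) of how the serialized Resize node has to look when only 'sizes' is given: the skipped optional inputs are encoded as empty-string entries in the node's input list, so that 'sizes' stays in the fourth slot.

import onnx

node = onnx.helper.make_node(
    "Resize",
    inputs=["X", "", "", "sizes"],  # roi and scales passed as empty names
    outputs=["Y"],
)
print(node)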

Here is a minimal working example:

import numpy as np
import onnx
import onnxruntime as ort
from onnxscript import FLOAT
from onnxscript import opset20 as op
from onnxscript import script


@script()
def resize(X: FLOAT[800, 600]) -> FLOAT[512, 512]:
    return op.Resize(X, sizes=[512, 512])


# checker is OK
onnx.checker.check_model(resize.to_model_proto())

# This works - skipped arguments `roi` and `scales` get correctly translated to "empty tensors"
print("From function - OK")
X = np.eye(800, 600, dtype=np.float32)
Y = resize(X)

# This doesn't - `roi` and `scales` are omitted, which makes `sizes` the second input while it needs to be the fourth.
try:
    print("\n\n\nBug - not OK \n==================")
    session = ort.InferenceSession(resize.to_model_proto().SerializeToString())
    Y = session.run(None, {"X": X})
except Exception as e:
    print(e)
    print("==================")

# workaround - add two empty arguments for `roi` and `scales`
args = resize.function_ir.stmts[-1].args
resize.function_ir.stmts[-1].args = (args[0], "", "", args[1])

# Now it's OK
print("\nWith workaround - OK")
session = ort.InferenceSession(resize.to_model_proto().SerializeToString())
Y = session.run(None, {"X": X})

And its output:

From function - OK

Bug - not OK 
==================
[ONNXRuntimeError] : 10 : INVALID_GRAPH : This is an invalid model. Type Error: Type 'tensor(int64)' of input parameter (const) of operator (Resize) in node (n1) is invalid.
==================

With workaround - OK
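
A quick way to double-check what the converter emitted is to inspect the Resize node's input list in the exported graph (just a sanity-check sketch; the exact name of the sizes constant depends on the converter):

model = resize.to_model_proto()
for node in model.graph.node:
    if node.op_type == "Resize":
        # With the workaround applied this should show two empty-string
        # entries between the data input and the sizes constant.
        print(list(node.input))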

I'm using onnxscript=0.1.0.dev20250108

And lastly - THANKS - onnxscript is such a great idea for writing pre/post-processing functions!

@justinchuby added the bug (Something isn't working) and topic: ast converter labels on Jan 13, 2025
@justinchuby
Collaborator

Thanks for reporting. We will look into this.
