
How to include an ONNX model into onnxscript script? #1882

Open
kubaraczkowski opened this issue Sep 25, 2024 · 6 comments
Labels: contribution welcome (We welcome code contributions for this), question (Further information is requested)

Comments

@kubaraczkowski

Hi onnxscript people,

Onnxscript is great, I love it, thanks!
I currently use it for preprocessing tasks, for example, but I would also like to include the 'real' model in the flow. Beyond that, I would like to include, say, three models as one assembly in an onnxscript script.

So far I haven't found a way (tutorial or documentation) to do this. It's perhaps slightly related to #952, which converts an ONNX model to a script, I guess so that it can also be 'merged' with other scripts (and the whole thing tweaked).

So, in essence: is it possible to include a 'compiled' (proto) ONNX model as part of an onnxscript script?

Thanks!

@justinchuby
Collaborator

Cc @gramalingam

@justinchuby justinchuby added the question (Further information is requested) label Sep 27, 2024
@gramalingam
Collaborator

Agree, it would be great to have this feature. Unfortunately, onnxscript doesn't have it yet.

Ideally, we should have a wrapper that lets us treat an existing ONNX model as a function and call it from an onnxscript function. It would be great to add this feature.

@kubaraczkowski
Author

Yes!
My workaround is the following:

  • write the preprocessing and postprocessing functions
  • merge them with the models using onnx.compose.merge_models. This is not trivial, since names often clash, so it requires adding prefixes everywhere
  • run using onnxruntime

Works, but misses the fantastic debuggability of onnxscript!

Looking forward to this happening :)

@titaiwangms
Contributor

Feel free to reopen if the question is not answered.

@kubaraczkowski
Author

The workaround mentioned earlier works OK, but it is REALLY cumbersome and goes against the ease of use of onnxscript.

With anything but a trivial model graph, it becomes a pain to manage the manual merging of all the models and the manual wiring of inputs to outputs.
Additionally, you need to prefix all the onnxscript-generated models, since they are all generated from the same "empty" starting point.

So although this is possible, the function wrapper that @gramalingam suggested would be oh-so-much better. Any chance it could be put on the roadmap?

@justinchuby justinchuby reopened this Jan 10, 2025
@justinchuby justinchuby added the contribution welcome (We welcome code contributions for this) label Jan 10, 2025
@justinchuby justinchuby self-assigned this Jan 10, 2025
@justinchuby
Collaborator

Thank you for your feedback! I will take a closer look at this.

4 participants