
Frozen model for inference #1

Open
noumanriazkhan opened this issue Aug 13, 2018 · 2 comments

Comments

@noumanriazkhan

Thanks for writing this beautiful piece.

I was looking to run inference from a frozen graph (for higher FPS after optimization). The inference script loads from a checkpoint; for a frozen graph, what would the input and output nodes be?

Thanks!

@KleinYuan
Owner

@noumanriazkhan As I mentioned here, I tried to freeze the model and test it, but failed; I may have missed some preprocessing steps.
You can easily loop through the nodes in TF to find the input/output nodes by name, I guess.
I suggest you check the original training code for reference.
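Looping through the nodes can be sketched roughly as follows. This is a minimal illustration, not code from this repo: with TensorFlow installed you would parse the `.pb` file into a `tf.compat.v1.GraphDef` (`graph_def.ParseFromString(f.read())`) and iterate over `graph_def.node`; here plain dicts stand in for `NodeDef` protos so the heuristic itself is clear. The usual rule of thumb is that input nodes are `Placeholder` ops and output nodes are nodes that no other node consumes.

```python
# Sketch: guess likely input/output nodes of a TF1-style frozen graph.
# Assumption: each node record has "name", "op", and "input" fields,
# mirroring the NodeDef proto inside a GraphDef.

def find_io_nodes(nodes):
    """Inputs: Placeholder ops. Outputs: nodes nothing else consumes."""
    consumed = set()
    for n in nodes:
        for inp in n["input"]:
            # Strip output-slot suffix ("conv:0") and control-dep marker ("^x").
            consumed.add(inp.split(":")[0].lstrip("^"))
    inputs = [n["name"] for n in nodes if n["op"] == "Placeholder"]
    outputs = [n["name"] for n in nodes
               if n["name"] not in consumed and n["op"] != "Placeholder"]
    return inputs, outputs

# Toy graph standing in for a real GraphDef: x -> Conv2D -> ArgMax.
nodes = [
    {"name": "x", "op": "Placeholder", "input": []},
    {"name": "conv/weights", "op": "Const", "input": []},
    {"name": "conv", "op": "Conv2D", "input": ["x", "conv/weights"]},
    {"name": "ArgMax/dimension", "op": "Const", "input": []},
    {"name": "ArgMax", "op": "ArgMax", "input": ["conv", "ArgMax/dimension"]},
]
print(find_io_nodes(nodes))  # → (['x'], ['ArgMax'])
```

On a real frozen graph the same loop over `graph_def.node` prints candidate names; you would then feed/fetch them as `"<name>:0"` tensors in a session.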

@noumanriazkhan
Author

Thank you for suggesting I go through the original code; however, DrSleep himself has no idea how to do it.

I am working on it and will update here.
