About argument parsing #2
Comments
I'm way too late, but I was stuck here as well. After tinkering with different formats, I went to the directory that contained both the converter Python script and the MATLAB model, ran the converter from there, and it worked.
@greed2411 Thanks for this, greed! However, I got an error and thought it might come from the way I converted the weight file. Did you get something like this? Traceback (most recent call last):
I got an error: TypeError: split() got an unexpected keyword argument 'num_or_size_splits'. Can you give me some help?
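For what it's worth, that particular TypeError usually indicates a TensorFlow version mismatch: the code passes the num_or_size_splits keyword introduced with the TensorFlow 1.x tf.split signature, but the installed TensorFlow is an older 0.x release whose tf.split does not accept it. A minimal sketch of the two signatures, assuming the repository targets TensorFlow 1.x:

```python
# tf.split across TensorFlow versions (illustrative).
import numpy as np
import tensorflow as tf

x = tf.constant(np.ones((4, 8), dtype=np.float32))

# TensorFlow 1.x style -- this is the call that fails on older installs:
a, b = tf.split(x, num_or_size_splits=2, axis=1)

# TensorFlow 0.x style used positional (split_dim, num_split, value) and had no
# num_or_size_splits keyword, hence the TypeError:
# a, b = tf.split(1, 2, x)
```

If that is the cause, upgrading TensorFlow (or rewriting the call in the old positional style) should make the error go away.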
Thanks for sharing, but I still have some questions about matconvnet_hr101_to_pickle.py.
First, I cannot run it successfully: even though I've downloaded hr_res101.mat, it still throws "AssertionError: Matlab pretrained model: /path/to/hr_res101.mat not found". I read up on how ArgumentParser is used, but I still could not solve it. If you can give me some help, I would greatly appreciate it.
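That AssertionError just means the script is still using its default placeholder path (/path/to/hr_res101.mat) rather than the location where you actually saved the file, so the real path has to be passed on the command line. Here is a rough sketch of what the argparse setup and the assertion presumably look like; the flag names --matlab_model_path and --weight_file_path are assumptions, so check the parser definition inside matconvnet_hr101_to_pickle.py for the real ones:

```python
# Hypothetical sketch of the converter's argument handling (flag names assumed).
import argparse
import os

parser = argparse.ArgumentParser()
parser.add_argument('--matlab_model_path', default='/path/to/hr_res101.mat',
                    help='path to the downloaded hr_res101.mat')
parser.add_argument('--weight_file_path', default='hr_res101_weights.pkl',
                    help='where to write the converted pickle file')
args = parser.parse_args()

# This is the check that produces the error above: it fires whenever the
# placeholder default is left in place or the given path does not exist.
assert os.path.exists(args.matlab_model_path), \
    "Matlab pretrained model: " + args.matlab_model_path + " not found"
```

So something like `python matconvnet_hr101_to_pickle.py --matlab_model_path ./hr_res101.mat` (with whatever the flag is really called) should get past the assertion.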
Second, I do not know much about MATLAB, so I cannot follow code such as clusters = np.copy(net['meta'][0][0][0][0][6]). More precisely, I would expect layers = net['layers'] to be correct, so why is it layers = net['layers'][0][0][0]? Furthermore, I think layer_name and layer_type are just strings, so why do you put all of the data into layer_inputs and layer_outputs?
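If it helps, the strange-looking indexing is usually not MATLAB knowledge at all but an artifact of how scipy.io.loadmat represents MATLAB structs and cell arrays: each one comes back wrapped in an extra object array, typically of shape (1, 1), so every nesting level costs another [0][0] on the Python side. A minimal sketch, assuming the converter loads the model with scipy.io.loadmat and default options (the exact shapes depend on the .mat file itself):

```python
# Illustrative only: peeling loadmat's singleton wrappers off hr_res101.mat.
import scipy.io

net = scipy.io.loadmat('hr_res101.mat')

# net['layers'] is not the list of layers yet; it is a small object array
# (often shape (1, 1)) wrapping the original MATLAB struct/cell array.
print(net['layers'].shape)

# Each [0] strips one wrapper level, which is why the converter ends up with:
layers = net['layers'][0][0][0]

first = layers[0]
# MATLAB char arrays come back as one-element string arrays, hence the extra [0].
layer_name = str(first['name'][0])
layer_type = str(first['type'][0])
```

Loading with scipy.io.loadmat('hr_res101.mat', squeeze_me=True) removes most of those singleton dimensions and makes the structure much easier to explore interactively. As for layer_inputs and layer_outputs, my guess (without the full script in front of me) is that the converter records each layer's input and output names so the graph connectivity can be rebuilt on the TensorFlow side.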
Third, what does weight_file mean? Can you elaborate on the data you write into the weight_file?
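My understanding (hedged, since I am going by the script's name rather than its code) is that weight_file is simply the output path: the converter collects the MatConvNet parameters as numpy arrays, stores them in a plain dict keyed by layer/parameter name, and serializes that dict with pickle so the TensorFlow side can load it back later. A toy sketch of that round trip, with made-up key names and shapes:

```python
# Hypothetical illustration of what gets written to weight_file.
import pickle
import numpy as np

weights = {
    'conv1_filter': np.zeros((7, 7, 3, 64), dtype=np.float32),  # dummy values
    'conv1_bias':   np.zeros((64,), dtype=np.float32),
}

# Converter side: dump the whole dict into the weight file.
with open('hr_res101_weights.pkl', 'wb') as f:
    pickle.dump(weights, f)

# Detector side: load the same dict back and assign the arrays to TF variables.
with open('hr_res101_weights.pkl', 'rb') as f:
    restored = pickle.load(f)
print(sorted(restored.keys()))
```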
Maybe you think the questions I asked are stupid; yes, I admit it, but who hasn't gone through such a stage? Thanks very much for sharing, and I will be even more grateful if you can help me with these questions.