
Error running FOM colab, animated.py #395

Open
G-force78 opened this issue Oct 17, 2023 · 5 comments

Comments

@G-force78

2023-10-17 10:57:05,909 [nnabla][INFO]: Initializing CPU extension...
usage: animate.py [-h] [--config CONFIG] [--params PARAMS] [--source SOURCE] [--driving DRIVING]
[--out-dir OUT_DIR] [--context {cudnn,cpu}] [--output-png] [--fps FPS]
[--only-generated] [--detailed] [--full] [--adapt-movement-scale]
[--unuse-relative-movement] [--unuse-relative-jacobian]
animate.py: error: unrecognized arguments: \


NameError Traceback (most recent call last)

in <cell line: 2>()
1 get_ipython().system('python animate.py --source imgs/sample_src.png --driving imgs/sample_drv.mp4 --adapt-movement-scale --fps 24 \')
----> 2 --detailed --full

NameError: name 'detailed' is not defined
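For what it's worth, the NameError is consistent with the trailing backslash sitting inside the quoted shell command: IPython then treats the next notebook line, `--detailed --full`, as Python, where it parses as unary minuses applied to an undefined name. A minimal sketch reproducing the same error outside Colab (no animate.py needed):

```python
# "--detailed --full" is syntactically valid Python: it parses as
# (-(-detailed)) - (-full), so evaluation fails on the name lookup
# with exactly the NameError shown in the traceback above.
try:
    eval("--detailed --full")
except NameError as e:
    print(e)  # name 'detailed' is not defined
```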

@G-force78
Author

Also, how can it be used to generate with the ted384 model and yaml? Thanks

@TomonobuTsujikawa
Contributor

Thank you for creating the issue; we will check it soon.

@TomonobuTsujikawa
Contributor

TomonobuTsujikawa commented Oct 18, 2023

Sorry, I couldn't reproduce your issue.
Did you click the "RESTART RUNTIME" button after executing the first cell? Or did you add a backslash at the end of the line?

!python animate.py --source imgs/sample_src.png \
                  --driving imgs/sample_drv.mp4 \
                  --adapt-movement-scale --fps 24 --detailed --full

2023-10-18 02:52:36,823 [nnabla][INFO]: Initializing CPU extension...
2023-10-18 02:52:37,717 [nnabla][INFO]: Initializing CUDA extension...
2023-10-18 02:52:37,740 [nnabla][INFO]: Initializing cuDNN extension...
voxceleb_trained_info.yaml: 100% 1.71k/1.71k [00:00<00:00, 7.49MB/s]
pretrained_fomm_params.h5: 100% 228M/228M [00:24<00:00, 9.58MB/s]
Loading pretrained_fomm_params.h5 for image animation...
100% 125/125 [00:20<00:00,  6.15it/s]

@TakuyaYashima
Contributor

Hi, thanks for trying our demo. Regarding the 2nd question,

Also, how can it be used to generate with ted384 model and yaml? Thanks

I suppose you mean this model (or, to be precise, its pretrained weights and config file).
Since it depends on the internal neural network architecture and parameter names, I don't think the ted384 model can be applied to our model as is.

The best way would be to git clone our source code and modify it so that it can load their model.

@G-force78
Author


Hi, thanks for the quick reply. The modifications are what I would like to know about. Most of the code seems almost identical to articulated-animation, so I assume there would not be much to modify to get the 384 model to run?
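If the architectures really do match, the main work would likely be reconciling parameter names between the foreign checkpoint and this repo's definitions. A hypothetical sketch of that renaming step (the prefixes below are made up for illustration; the real ted384 parameter names would have to be inspected first):

```python
# Hypothetical sketch: rewrite parameter-name prefixes so a foreign
# checkpoint's entries line up with the names this codebase expects.
def remap_params(params, rename_rules):
    """Return a copy of `params` with key prefixes rewritten per `rename_rules`."""
    remapped = {}
    for name, value in params.items():
        new_name = name
        for old_prefix, new_prefix in rename_rules.items():
            if name.startswith(old_prefix):
                new_name = new_prefix + name[len(old_prefix):]
                break
        remapped[new_name] = value
    return remapped

# Example with made-up names:
foreign = {"generator.conv1/W": 1, "region_predictor.fc/b": 2}
rules = {"region_predictor.": "kp_detector."}
print(remap_params(foreign, rules))
# {'generator.conv1/W': 1, 'kp_detector.fc/b': 2}
```

The remapped dictionary could then be fed to the framework's parameter-loading path; any leftover shape or name mismatches would point to genuine architecture differences.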
