This repository has been archived by the owner on Feb 7, 2023. It is now read-only.

Training without pretrained state #17

Open
nirvitarka opened this issue May 3, 2022 · 2 comments

Comments

@nirvitarka

nirvitarka commented May 3, 2022

I removed the line
resume_state: checkpoints/pretrained.state
from train_basicsr.yml.

Then I got an error about 128 x 128 dimensions, so I resized the "hq" images to 128x128 and the "lq" images to 32x32.

Now I am getting the error
LQ (32, 32) is smaller than patch size (96, 96)
in
/basicsr/data/transforms.py, line 59, in paired_random_crop

Where exactly do I need to set this patch size?
Has anyone got training working without the pretrained state?
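For context, in BasicSR-style configs the crop size is usually controlled by a gt_size key under the training dataset, and paired_random_crop then uses gt_size // scale as the LQ patch size. A minimal sketch of the relevant section (key names follow BasicSR conventions and are assumptions here, not verified against this repo's train_basicsr.yml):

```yaml
# Hypothetical excerpt from train_basicsr.yml -- keys follow BasicSR
# conventions; the actual file in this repo may differ.
scale: 4            # upscaling factor between lq and hq images
datasets:
  train:
    gt_size: 128    # HQ crop size; LQ crop becomes gt_size // scale = 32
    use_hflip: true
```

With gt_size: 128 and scale: 4, paired_random_crop would take 32x32 LQ patches, which matches 32x32 LQ images; an error mentioning a 96x96 patch size suggests the effective gt_size/scale combination differs from the image sizes being fed in.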

@shehrum

shehrum commented Jun 9, 2022

Hi, did you manage to solve this?

@nirvitarka
Author

No, I could not get it to work, and I have not tried anything further since then.

This was referenced Aug 16, 2022
2 participants