When I attempt to train the Adapter (main branch) using the following terminal command, executed in the background:
```bash
nohup accelerate launch train_sketch.py \
  --pretrained_model_name_or_path stabilityai/stable-diffusion-xl-base-1.0 \
  --output_dir experiments/adapter_sketch_xl \
  --config configs/train/Adapter-XL-sketch.yaml \
  --mixed_precision="fp16" \
  --resolution=1024 \
  --learning_rate=1e-5 \
  --max_train_steps=10 \
  --train_batch_size=1 \
  --gradient_accumulation_steps=4 \
  --report_to="wandb" \
  --seed=42 \
  --num_train_epochs 1 > pre_experiment.out &
```
the script reports that several files under the data/ directory cannot be found, which causes training to fail. However, there is no data/ directory in the project files, as shown in the screenshot below.
Could you please clarify the purpose of the files under data/? Are they the datasets used for training? If so, how should I modify the script so that these files load correctly and training can proceed? Since the script accesses them by hard-coded path (the errors look like 'No such file or directory: data/LAION_6plus/train_00000/00039.tar'), do I also need to adjust the data-loading code if I want to train the model on my own dataset?
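For context, my current guess is that the script expects LAION data packaged as webdataset .tar shards and reads them roughly like the sketch below. The shard pattern and the per-sample keys ("jpg", "txt") are my assumptions, not something I found in the repository, so please correct me if the actual loader works differently:

```python
import webdataset as wds

# Hypothetical shard pattern, inferred from the error-message path;
# it would be replaced by wherever one's own .tar shards actually live.
shards = "data/LAION_6plus/train_00000/{00000..00039}.tar"

dataset = (
    wds.WebDataset(shards)
    .decode("pil")            # decode image bytes into PIL images
    .to_tuple("jpg", "txt")   # assumed keys: image under "jpg", caption under "txt"
)

# Quick check that samples stream correctly from the shards.
for image, caption in dataset:
    print(image.size, caption)
    break
```

If that is roughly how the loader works, I assume pointing the shard path at my own data (and matching the per-sample keys) would be the main change needed, but I would appreciate confirmation.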