I see the warning below in the logs when running a LoRA training. Can it be ignored?
/text-generation-webui-main/installer_files/env/lib/python3.11/site-packages/torch/utils/checkpoint.py:429: UserWarning: torch.utils.checkpoint: please pass in use_reentrant=True or use_reentrant=False explicitly. The default value of use_reentrant will be updated to be False in the future. To maintain current behavior, pass use_reentrant=True. It is recommended that you use use_reentrant=False. Refer to docs for more details on the differences between the two variants. warnings.warn(
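For what it's worth, the warning itself is harmless for now (the default is still `use_reentrant=True`), and it goes away once `use_reentrant` is passed explicitly to `torch.utils.checkpoint.checkpoint`. A minimal sketch of the explicit call, outside of text-generation-webui (whether the webui exposes a knob for this is an assumption; in plain PyTorch it looks like this):

```python
import torch
from torch.utils.checkpoint import checkpoint

def block(x):
    # Stand-in for a transformer layer that gets checkpointed during training.
    return torch.relu(x) * 2

x = torch.randn(4, 4, requires_grad=True)

# Passing use_reentrant explicitly (False is the recommended variant)
# avoids the UserWarning about the future default change.
y = checkpoint(block, x, use_reentrant=False)
y.sum().backward()
print(x.grad is not None)
```

With Hugging Face `transformers` models, the equivalent is usually `model.gradient_checkpointing_enable(gradient_checkpointing_kwargs={"use_reentrant": False})`, but whether the webui's training code path forwards that argument is something to check in its source.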
settings used
Text: (Service_API.txt) has 1604 blocks (Block Size 256 tokens)
[Batch Size: 7, Epochs: 3.0, Gradient Accumulation: 1]
Total number of steps: 690
Steps per each Epoch: 230
Suggestions:
Checkpoints: Save every 69 - 138 steps (Current: 192)
Warmup steps: 69 (Current: 96)
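As a sanity check on the numbers above, the step counts are consistent with the usual formula (assuming the trainer rounds partial batches up; the variable names here are illustrative, not from the webui source):

```python
import math

blocks = 1604        # text blocks from Service_API.txt
batch_size = 7
grad_accum = 1
epochs = 3

# One optimizer step per (batch_size * grad_accum) blocks, last partial batch included.
steps_per_epoch = math.ceil(blocks / (batch_size * grad_accum))
total_steps = steps_per_epoch * epochs

print(steps_per_epoch, total_steps)  # 230 690, matching the log
```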