If your hardware supports the FP8 format, you can certainly use FP8 for training and adopt mixed precision, where the base model runs in BF16 and the LoRA module in FP8, to reduce memory usage. However, note that the LoRA module itself has relatively few parameters, so FP8 is unlikely to yield significant savings in this case.
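As a rough illustration of why the adapter precision matters so little, here is a minimal PyTorch sketch (the 4096x4096 layer size and rank-8 LoRA are assumed for illustration, not taken from this repo). The frozen BF16 base weights dominate memory, so storing only the LoRA factors in FP8 saves almost nothing:

```python
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    """A frozen BF16 base linear layer with a small trainable LoRA adapter."""
    def __init__(self, in_features, out_features, rank=8, alpha=16):
        super().__init__()
        # Base weight stays frozen in BF16 (the bulk of the memory).
        self.base = nn.Linear(in_features, out_features, bias=False, dtype=torch.bfloat16)
        self.base.weight.requires_grad_(False)
        # LoRA factors are tiny by comparison; their dtype barely matters.
        self.lora_a = nn.Parameter(torch.randn(rank, in_features, dtype=torch.bfloat16) * 0.01)
        self.lora_b = nn.Parameter(torch.zeros(out_features, rank, dtype=torch.bfloat16))
        self.scaling = alpha / rank

    def forward(self, x):
        return self.base(x) + (x @ self.lora_a.T @ self.lora_b.T) * self.scaling

layer = LoRALinear(4096, 4096)
base_bytes = layer.base.weight.numel() * 2                        # BF16 = 2 bytes/param
lora_bytes = (layer.lora_a.numel() + layer.lora_b.numel()) * 2    # BF16 adapter
print(f"base: {base_bytes / 1e6:.1f} MB, LoRA: {lora_bytes / 1e6:.3f} MB")
# Even if the LoRA factors were kept in FP8 (1 byte/param), the saving would be
# only ~0.07 MB for this layer -- negligible next to the ~33 MB base weight.
```

Real FP8 compute additionally requires hardware and kernel support (e.g. Hopper/Ada-class GPUs with a library such as NVIDIA Transformer Engine); the sketch above only compares storage sizes.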
Is it possible to use FP8 training to bring the GPU memory usage down to 24 GB?