You signed in with another tab or window. Reload to refresh your session.You signed out in another tab or window. Reload to refresh your session.You switched accounts on another tab or window. Reload to refresh your session.Dismiss alert
AttributeError: 'LightningDistributedDataParallel' object has no attribute '_sync_params'
On these tests:
tests/callbacks/learning_rate_test.py:# TODO: fix test with num_processes=2
tests/callbacks/training_timer_test.py:# TODO: fix test with num_processes=2
tests/loggers/epoch_csv_logger_test.py:# TODO: fix test with num_processes=2
tests/scripts/htr/decode_ctc_test.py:# TODO: fix test with nprocs=2
tests/scripts/htr/netout_test.py:# TODO: fix test with nprocs=2
tests/scripts/htr/train_ctc_test.py:# TODO: fix "ddp_cpu" mode
tests/scripts/htr/train_ctc_test.py:# TODO: fix "ddp" mode
tests/scripts/htr/train_ctc_test.py:# TODO: fix first assertion
I skipped the tests for now, but I need to investigate why we are getting this error and how to fix it.
The text was updated successfully, but these errors were encountered:
The bump to Pytorch 1.13 broke some tests related to multiprocessing on CPU and GPU. We get the following errors:
torch.multiprocessing.spawn.ProcessRaisedException
AttributeError: 'LightningDistributedDataParallel' object has no attribute '_sync_params'
On these tests:
I skipped the tests for now, but I need to investigate why we are getting this error and how to fix it.
The text was updated successfully, but these errors were encountered: