In the file FoodSeg103/mmseg/datasets/build.py:
if dist:
    sampler = DistributedSampler(
        dataset, world_size, rank, shuffle=shuffle)
    shuffle = False
    batch_size = samples_per_gpu  # 2
    num_workers = workers_per_gpu
else:
    sampler = None
    batch_size = num_gpus * samples_per_gpu  # 2
    num_workers = num_gpus * workers_per_gpu
When dist is True, batch_size = 2, but the paper reports a batch size of 8 (4 GPUs), so I don't understand why. Is batch_size (=2) the batch size for a single GPU?
Yes, we use 4 GPU cards to run the experiments, and for each GPU we set the batch size to 2, so the total batch size is 2 × 4 = 8.
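For reference, a minimal sketch of the arithmetic under distributed training; the values are assumed for illustration and mirror the snippet above (samples_per_gpu=2) and the paper's setup (4 GPUs):

samples_per_gpu = 2  # per-process batch size passed to each DataLoader (assumed, matches the snippet above)
num_gpus = 4         # one process per GPU under distributed training (assumed, matches the paper)

# When dist is True, every process builds its own DataLoader with
# batch_size = samples_per_gpu, and DistributedSampler gives each process
# a disjoint shard of the dataset, so one optimizer step consumes:
effective_batch_size = samples_per_gpu * num_gpus
print(effective_batch_size)  # 8, the total batch size reported in the paper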
Thanks very much