
The question of batch_size #8

Open
Mark1Dong opened this issue Aug 23, 2021 · 2 comments

Comments

@Mark1Dong

In the file FoodSeg103/mmseg/datasets/build.py:

if dist:
    sampler = DistributedSampler(
        dataset, world_size, rank, shuffle=shuffle)
    shuffle = False
    batch_size = samples_per_gpu  # 2
    num_workers = workers_per_gpu
else:
    sampler = None
    batch_size = num_gpus * samples_per_gpu  # 2
    num_workers = num_gpus * workers_per_gpu

When dist is True, batch_size = 2, but the paper reports a batch size of 8 (4 GPUs), so I don't understand why. Is batch_size (= 2) the value for a single GPU?

@XiongweiWu
Collaborator

Yes, we use 4 GPU cards to run the experiments, and for each GPU we set the batch size to 2, so the total batch size is 2 × 4 = 8.
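
For illustration, a minimal sketch of that arithmetic (the variable names follow the build.py snippet above; the 4-GPU count is the paper's setup):

# Under distributed training, each of the num_gpus processes builds its own
# loader with batch_size = samples_per_gpu, so the effective (global) batch
# size is the product of the two.
samples_per_gpu = 2   # per-GPU batch size, as set when dist is True
num_gpus = 4          # GPU cards used in the experiments

effective_batch_size = num_gpus * samples_per_gpu
assert effective_batch_size == 8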

@Mark1Dong
Author

Thanks very much.
