
ContBatchNorm3d #17

Open
EricKani opened this issue Jun 6, 2018 · 7 comments

Comments


EricKani commented Jun 6, 2018

Hi,
I can't get your ContBatchNorm3d code to run. It gives me the error "AttributeError: 'super' object has no attribute '_check_input_dim'".
I'd like to know why the code is written this way. What happens if I use F.batch_norm directly?
Thank you very much.
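For context, the class's forward boils down to a direct F.batch_norm call with training pinned to True; a minimal sketch of that direct call (shapes and statistics here are illustrative, not from the repo):

```python
import torch
import torch.nn.functional as F

x = torch.randn(2, 4, 8, 8, 8)   # a (N, C, D, H, W) volumetric batch
running_mean = torch.zeros(4)    # one entry per channel
running_var = torch.ones(4)

# training=True normalizes with the batch's own statistics and
# updates running_mean/running_var in place.
y = F.batch_norm(x, running_mean, running_var, training=True)
print(tuple(y.shape))  # (2, 4, 8, 8, 8)
```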

@qianqianCDQ

> Hi,
> I can't get your ContBatchNorm3d code to run. It gives me the error "AttributeError: 'super' object has no attribute '_check_input_dim'".
> I'd like to know why the code is written this way. What happens if I use F.batch_norm directly?
> Thank you very much.

I also ran into this problem today. Did you manage to solve it?


jphdotam commented May 31, 2020

You can substitute the following in:

import torch.nn as nn
import torch.nn.functional as F

class ContBatchNorm3d(nn.modules.batchnorm._BatchNorm):
    def _check_input_dim(self, input):
        # Expect a 5D (N, C, D, H, W) volumetric batch.
        if input.dim() != 5:
            raise ValueError('expected 5D input (got {}D input)'
                             .format(input.dim()))
        # super(ContBatchNorm3d, self)._check_input_dim(input)

    def forward(self, input):
        self._check_input_dim(input)
        # Note: training is hard-coded to True, so batch statistics are
        # used even when the module is in eval() mode.
        return F.batch_norm(
            input, self.running_mean, self.running_var, self.weight, self.bias,
            True, self.momentum, self.eps)

@rayryeng

@jphdotam Thank you for the fix; I can confirm it works. Strange that this issue has been open for 2.5 years and the fix has never been committed to the repo.

@jphdotam

Dear @rayryeng, I later found this network performed very poorly and could barely fit the data. I ended up rolling my own 3D U-Net, which worked MUCH better with fewer parameters: https://github.com/jphdotam/Unet3D


rayryeng commented Jan 28, 2021

@jphdotam Thanks! I'm in the middle of training something right now, but I'll take a look at yours once I finish these experiments.

Edit: oh wow, fresh off the press! Thanks for sharing it with the community!

@chaoyan1037

@jphdotam Why is batch_norm always run in training mode in ContBatchNorm3d?
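For what it's worth, the question can be made concrete with F.batch_norm itself: training=True normalizes with the current batch's statistics, while training=False normalizes with the stored running statistics. A small sketch (the input distribution here is made deliberately non-standard to show the difference):

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)
x = torch.randn(2, 3, 4, 4, 4) * 5 + 10   # deliberately far from N(0, 1)

# training=True: normalize with the batch's own mean/var -> output ~ N(0, 1)
y_train = F.batch_norm(x, torch.zeros(3), torch.ones(3), training=True)

# training=False: normalize with the stored running stats (zero mean,
# unit variance here) -> the input passes through almost unchanged
y_eval = F.batch_norm(x, torch.zeros(3), torch.ones(3), training=False)

print(abs(y_train.mean().item()) < 1e-3)       # True
print(torch.allclose(y_eval, x, atol=1e-3))    # True
```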


bsolano commented Oct 11, 2023

Hi,

I had the same issue. The problem is that the superclass of ContBatchNorm3d is _BatchNorm, and _BatchNorm does not have a _check_input_dim method. I don't think commenting out that call is the right fix. IMHO the inheritance should be changed to BatchNorm3d, with the now-duplicated dimension check commented out instead, like this:

import torch.nn as nn
import torch.nn.functional as F

class ContBatchNorm3d(nn.modules.batchnorm.BatchNorm3d):
    def _check_input_dim(self, input):
        # BatchNorm3d already enforces 5D input, so the explicit
        # check here is redundant.
        # if input.dim() != 5:
        #     raise ValueError('expected 5D input (got {}D input)'
        #                      .format(input.dim()))
        super()._check_input_dim(input)

    def forward(self, input):
        self._check_input_dim(input)
        # As in the original repo: training is fixed to True.
        return F.batch_norm(
            input, self.running_mean, self.running_var, self.weight, self.bias,
            True, self.momentum, self.eps)
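A quick way to check that the inheritance-based fix behaves as intended. This is a self-contained sketch, not the repo's code: the redundant override is dropped for brevity, since BatchNorm3d's own _check_input_dim already rejects non-5D input.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ContBatchNorm3d(nn.BatchNorm3d):
    def forward(self, input):
        self._check_input_dim(input)  # inherited: raises on non-5D input
        # training fixed to True, matching the repo's original intent.
        return F.batch_norm(
            input, self.running_mean, self.running_var, self.weight, self.bias,
            True, self.momentum, self.eps)

bn = ContBatchNorm3d(3)
out5 = bn(torch.randn(1, 3, 2, 2, 2))   # 5D input: accepted
try:
    bn(torch.randn(1, 3, 2, 2))          # 4D input: rejected
    msg = None
except ValueError as e:
    msg = str(e)
print(tuple(out5.shape), msg)
```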
