Hi, isn't `bn_value` here just `bn3.weight`? Why does it still have to be added to `residual`?

```
residual += self.bn_value.cuda()
residual.index_add_(1, self.index.cuda(), out)
residual = self.relu(residual)
```
From your earlier explanation, `bn_value` is the BN layer's output for a zero input. Then why isn't `bn_value` subtracted instead? Is it because the BN layer already produced this output before pruning, so it has to be added back after pruning?
Hi, I don't understand this operation either. Did you find an answer? @Emily0219
I believe it works like this: `bn_value` holds the BN values for the pruned positions, whereas `self.bn3` has the same size as `conv3` (it only keeps values for the remaining filters). `bn_value` is the output that the pruned part would produce for a zero input, so it has to be added back.
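The key point is that a BatchNorm layer in eval mode maps a zero input to a nonzero per-channel constant, `weight * (0 - running_mean) / sqrt(running_var + eps) + bias`, so removing a filter silently drops that constant unless it is added back. A minimal sketch (with made-up running stats, not the repo's actual values) demonstrating this:

```python
import torch
import torch.nn as nn

# A BN layer with non-trivial (hypothetical) statistics and affine params.
bn = nn.BatchNorm2d(4)
bn.running_mean.fill_(0.5)
bn.running_var.fill_(2.0)
bn.weight.data.fill_(1.5)
bn.bias.data.fill_(0.25)
bn.eval()  # use running stats, as at inference time

# Feed an all-zero input: a pruned filter's contribution is exactly this.
zero = torch.zeros(1, 4, 2, 2)
out = bn(zero)

# BN(0) = weight * (0 - running_mean) / sqrt(running_var + eps) + bias
expected = bn.weight * (0 - bn.running_mean) / torch.sqrt(bn.running_var + bn.eps) + bn.bias

print(out[0, :, 0, 0])                          # nonzero per-channel constant
print(torch.allclose(out[0, :, 0, 0], expected))  # True
```

This constant is what `bn_value` stores for the pruned channels: adding it to `residual` before `index_add_` restores the contribution those channels made in the unpruned network.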