This repository has been archived by the owner on Feb 14, 2024. It is now read-only.
Hi, thanks for your great project. I just wanted to point out a potential issue with the implementation of the PartialConv function here, which is easily spotted if you run the following:
```python
import torch
import torch.nn as nn

size = (1, 1, 10, 10)
X = torch.ones(size)  # input layer
Y = torch.ones(size)  # mask layer (all elements are good to go)

# Standard convolution with manually set kernel weights
convH0 = nn.Conv2d(1, 1, 3, 1, 1, bias=False)
with torch.no_grad():
    convH0.weight = nn.Parameter(torch.FloatTensor([[[[ 0.2273,  0.1403, -1.0889],
                                                      [-0.0351, -0.2992,  0.2029],
                                                      [ 0.0280,  0.2878,  0.5101]]]]))
output0 = convH0(X)  # result from the standard convolution

# PartialConv (from net.py) with its weights set equal to the conv layer's
PConv = PartialConv(1, 1, 3, 1, 1, bias=False)
with torch.no_grad():
    PConv.input_conv.weight = nn.Parameter(torch.FloatTensor([[[[ 0.2273,  0.1403, -1.0889],
                                                                [-0.0351, -0.2992,  0.2029],
                                                                [ 0.0280,  0.2878,  0.5101]]]]))
output1, mask1 = PConv(X, Y)  # result from the partial convolution layer
```
I would expect the result of both operations to be the same. However, `output1 = output0 / 9`! The cause of the error lies in the following line (`net.py`, line 87 at commit 7f0aa4d), where `mask_sum` is a tensor mostly filled with the value 9. In the original paper, that corresponds to the sum(M) in the denominator. But what is missing is the sum(1) numerator, which should cancel this factor of 9 again. I think it can be fixed by computing the appropriate scale factor in the `__init__` part of `PartialConv` and then applying it in the `forward` call.
These changes (assuming a square kernel; otherwise I suppose you could compute `self.sumI` from the product of the kernel weight dimensions, or something like that) also correctly fix the results when holes are present. That is, it would then be fully in line with the original paper. I don't know how big the effect on training will be, but it could be non-zero.
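The exact snippets of the proposed fix did not survive in this issue, so here is a sketch of what the corrected layer could look like. The `sumI` name follows the text above, but the class layout is my assumption rather than the repository's exact code, and bias handling is omitted for brevity:

```python
import torch
import torch.nn as nn

class PartialConv(nn.Module):
    """Partial convolution with the sum(1)/sum(M) scaling from the paper."""
    def __init__(self, in_ch, out_ch, kernel_size, stride=1, padding=0, bias=False):
        super().__init__()
        self.input_conv = nn.Conv2d(in_ch, out_ch, kernel_size, stride, padding, bias=bias)
        # Fixed all-ones kernel that computes sum(M) per window
        self.mask_conv = nn.Conv2d(in_ch, out_ch, kernel_size, stride, padding, bias=False)
        nn.init.constant_(self.mask_conv.weight, 1.0)
        for p in self.mask_conv.parameters():
            p.requires_grad = False
        # sum(1): number of elements in one kernel window
        # (square kernel assumed; in general, the product of the weight dims)
        self.sumI = in_ch * kernel_size * kernel_size

    def forward(self, x, mask):
        output = self.input_conv(x * mask)
        with torch.no_grad():
            mask_sum = self.mask_conv(mask)  # sum(M) per window
        no_update = mask_sum == 0            # windows that saw only holes
        mask_sum = mask_sum.masked_fill(no_update, 1.0)
        # The fix: scale by sum(1)/sum(M) instead of 1/sum(M)
        output = output * (self.sumI / mask_sum)
        output = output.masked_fill(no_update, 0.0)
        new_mask = torch.ones_like(mask_sum).masked_fill(no_update, 0.0)
        return output, new_mask
```

With a hole-free mask and no padding (so every window lies fully inside the image), this matches the plain convolution exactly; with zero padding, border windows get the additional 9/sum(M) rescaling, which is the paper's intended behavior.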
Oops. I only just now see that this is the same as issue #44 ! Well, this time with some more background then.