
RuntimeError: "reflection_pad1d_out_template" not implemented for 'Short' : when using separate(...) method #119

Open
deepakpawade opened this issue May 25, 2022 · 7 comments


@deepakpawade

estimates = separate(audio=mix_torch, targets=['podcasts'], model_str_or_path='../scripts/open-unmix-512', device='cuda', rate = rate )
Stack trace:
RuntimeError Traceback (most recent call last)
~\AppData\Local\Temp/ipykernel_14992/3837081952.py in
----> 1 estimates = separate(audio=mix_torch,
2 targets=['podcasts'],
3 model_str_or_path='../scripts/open-unmix-512',
4 device='cuda',
5 rate = rate

d:\InterferenceSeperation\umx_demo\openunmix\predict.py in separate(audio, rate, model_str_or_path, targets, niter, residual, wiener_win_len, aggregate_dict, separator, device, filterbank)
76
77 # getting the separated signals
---> 78 estimates = separator(audio)
79 estimates = separator.to_dict(estimates, aggregate_dict=aggregate_dict)
80 return estimates

c:\Users\deepdesk\AppData\Local\Programs\Python\Python39\lib\site-packages\torch\nn\modules\module.py in _call_impl(self, *input, **kwargs)
1100 if not (self._backward_hooks or self._forward_hooks or self._forward_pre_hooks or _global_backward_hooks
1101 or _global_forward_hooks or _global_forward_pre_hooks):
-> 1102 return forward_call(*input, **kwargs)
1103 # Do not call functions when jit is used
1104 full_backward_hooks, non_full_backward_hooks = [], []

c:\Users\deepdesk\AppData\Local\Programs\Python\Python39\lib\site-packages\openunmix\model.py in forward(self, audio)
256 # getting the STFT of mix:
257 # (nb_samples, nb_channels, nb_bins, nb_frames, 2)
--> 258 mix_stft = self.stft(audio)
259 X = self.complexnorm(mix_stft)
260

c:\Users\deepdesk\AppData\Local\Programs\Python\Python39\lib\site-packages\torch\nn\modules\module.py in _call_impl(self, *input, **kwargs)
1100 if not (self._backward_hooks or self._forward_hooks or self._forward_pre_hooks or _global_backward_hooks
1101 or _global_forward_hooks or _global_forward_pre_hooks):
-> 1102 return forward_call(*input, **kwargs)
1103 # Do not call functions when jit is used
1104 full_backward_hooks, non_full_backward_hooks = [], []

c:\Users\deepdesk\AppData\Local\Programs\Python\Python39\lib\site-packages\openunmix\transforms.py in forward(self, x)
97 x = x.view(-1, shape[-1])
98
---> 99 complex_stft = torch.stft(
100 x,
101 n_fft=self.n_fft,

c:\Users\deepdesk\AppData\Local\Programs\Python\Python39\lib\site-packages\torch\functional.py in stft(input, n_fft, hop_length, win_length, window, center, pad_mode, normalized, onesided, return_complex)
568 extended_shape = [1] * (3 - signal_dim) + list(input.size())
569 pad = int(n_fft // 2)
--> 570 input = F.pad(input.view(extended_shape), [pad, pad], pad_mode)
571 input = input.view(input.shape[-signal_dim:])
572 return _VF.stft(input, n_fft, hop_length, win_length, window, # type: ignore[attr-defined]

c:\Users\deepdesk\AppData\Local\Programs\Python\Python39\lib\site-packages\torch\nn\functional.py in _pad(input, pad, mode, value)
4177 if len(pad) == 2 and (input.dim() == 2 or input.dim() == 3):
4178 if mode == "reflect":
-> 4179 return torch._C._nn.reflection_pad1d(input, pad)
4180 elif mode == "replicate":
4181 return torch._C._nn.replication_pad1d(input, pad)

RuntimeError: "reflection_pad1d_out_template" not implemented for 'Short'

python 3.9.7
torch 1.10.1+cu113
torchaudio 0.10.1+cu113
torchvision 0.11.2+cu113
cuda 11.7.r11.7
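A likely workaround, assuming the error comes from an integer-typed waveform (PyTorch's reflection padding, which `torch.stft` applies when `center=True`, is not implemented for integer dtypes such as int16/'Short'), is to cast the mix to float32 before calling `separate(...)`. The `mix_torch` tensor below is a stand-in for the actual audio in the report:

```python
import torch

# Stand-in for the actual mix in the report (int16 PCM is a common
# source of the 'Short' dtype that reflection_pad1d rejects).
mix_torch = torch.zeros(2, 44100, dtype=torch.int16)

# Cast to float32 and rescale from the int16 range to [-1.0, 1.0]
# before passing the waveform to separate(...):
mix_float = mix_torch.to(torch.float32) / 32768.0
print(mix_float.dtype)  # torch.float32
```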

@deepakpawade deepakpawade changed the title RuntimeError: "reflection_pad1d_out_template" not implemented for 'Short' : when using separate method RuntimeError: "reflection_pad1d_out_template" not implemented for 'Short' : when using separate(...) method May 25, 2022
@deepakpawade
Author

Do I need to install CUDA 11.3?

@faroit
Member

faroit commented May 25, 2022

@deepakpawade the current master version doesn't support 1.10 yet. The tests still run on torch 1.9. See #112

@deepakpawade
Author

deepakpawade commented May 26, 2022

> @deepakpawade the current master version doesn't support 1.10 yet. The tests still run on torch 1.9. See #112

@faroit I was having compatibility issues with 1.9.0 + CUDA 10.x in other libraries, so I installed 1.9.1 with CUDA 11.1 and still got the same error. Is it strictly dependent on 1.9.0?
cuda 11.1
torch 1.9.1+cu111
torchaudio 0.9.1
torchvision 0.10.1+cu111

@deepakpawade
Author

deepakpawade commented May 26, 2022

Also, can we do this in a different way, without using the separate(...) method or torch?

@QinHsiu

QinHsiu commented Sep 18, 2023

I have the same problem:
RuntimeError: "reflection_pad1d_out_template" not implemented for 'Long'
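Both errors appear to share the same root cause: the reflection padding that `torch.stft` applies internally is only implemented for floating-point tensors, so an int16 ('Short') or int64 ('Long') waveform fails at the same spot. A minimal reproduction sketch of the failure and the fix:

```python
import torch
import torch.nn.functional as F

# An integer-typed waveform, as in this report ('Long' = int64).
x = torch.zeros(1, 1, 16, dtype=torch.int64)

try:
    # Same reflection_pad1d code path that torch.stft hits.
    F.pad(x, [4, 4], mode="reflect")
except RuntimeError as e:
    print(e)  # "... not implemented for 'Long'"

# Casting to a floating-point dtype makes the padding succeed.
y = F.pad(x.float(), [4, 4], mode="reflect")
print(y.shape)  # torch.Size([1, 1, 24])
```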

@faroit
Member

faroit commented Apr 17, 2024

@deepakpawade can this be closed?

@deepakpawade
Author

deepakpawade commented Jun 1, 2024

Yes please. @faroit
