A bug about python main_SemanticKITTI.py #22

Open
ksama777 opened this issue Aug 1, 2022 · 4 comments

@ksama777

ksama777 commented Aug 1, 2022

I hit the following error when running python main_SemanticKITTI.py:

**** EPOCH 000 ****
2022-08-01 17:14:40.759397
/media/wx/HDD/DQ/RandLA-Net-pytorch-master/RandLANet.py:265: UserWarning: torch.range is deprecated and will be removed in a future release because its behavior is inconsistent with Python's range builtin. Instead, use torch.arange, which produces values in [start, end).
reducing_list = torch.range(0, cfg.num_classes).long().cuda()
Traceback (most recent call last):
  File "main_SemanticKITTI.py", line 205, in <module>
    train(start_epoch)
  File "main_SemanticKITTI.py", line 185, in train
    train_one_epoch()
  File "main_SemanticKITTI.py", line 105, in train_one_epoch
    loss, end_points = compute_loss(end_points, cfg)
  File "/media/wx/HDD/DQ/RandLA-Net-pytorch-master/RandLANet.py", line 270, in compute_loss
    loss = get_loss(valid_logits, valid_labels, cfg.class_weights)
  File "/media/wx/HDD/DQ/RandLA-Net-pytorch-master/RandLANet.py", line 282, in get_loss
    output_loss = criterion(logits, labels)
  File "/home/wx/anaconda3/envs/RandLA-Net-pytorch-master/lib/python3.7/site-packages/torch/nn/modules/module.py", line 1102, in _call_impl
    return forward_call(*input, **kwargs)
  File "/home/wx/anaconda3/envs/RandLA-Net-pytorch-master/lib/python3.7/site-packages/torch/nn/modules/loss.py", line 1152, in forward
    label_smoothing=self.label_smoothing)
  File "/home/wx/anaconda3/envs/RandLA-Net-pytorch-master/lib/python3.7/site-packages/torch/nn/functional.py", line 2846, in cross_entropy
    return torch._C._nn.cross_entropy_loss(input, target, weight, _Reduction.get_enum(reduction), ignore_index, label_smoothing)
RuntimeError: weight tensor should be defined either for all 18 classes or no classes but got weight tensor of shape: [1, 19]

Can you help me?
Thanks a lot.
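
For anyone hitting the same trace: nn.CrossEntropyLoss only accepts a 1-D weight tensor with one entry per class, so a [1, 19] weight is rejected as 2-D. A minimal standalone repro of that shape rule (the class count here is just for illustration, not taken from the repo's config):

```python
import torch
import torch.nn as nn

num_classes = 19
logits = torch.randn(8, num_classes)          # [N, C] logits
labels = torch.randint(0, num_classes, (8,))  # [N] integer labels

bad_weight = torch.ones(1, num_classes)   # 2-D [1, 19] -> RuntimeError like the one above
good_weight = torch.ones(num_classes)     # 1-D [19]    -> accepted

criterion = nn.CrossEntropyLoss(weight=good_weight)
loss = criterion(logits, labels)          # works once the weight is 1-D
```

The unrelated UserWarning can be silenced by replacing torch.range(0, cfg.num_classes) with torch.arange(0, cfg.num_classes + 1); torch.arange excludes the endpoint, so the + 1 keeps the same values.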

@mzyCSDN

mzyCSDN commented Oct 10, 2022

I have the same problem. How did you solve it?

@ksama777
Author

ksama777 commented Oct 10, 2022 via email

@Sylva-Lin

> I have the same problem. How did you solve it?

Just change [1, 19] to [19], and it works!
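
One way to apply that, assuming class_weights reaches get_loss as a [1, 19] tensor (a sketch, not the repo's exact code):

```python
import torch
import torch.nn as nn

def get_loss(logits, labels, class_weights):
    # Flatten the weights to 1-D ([19]) before building the criterion;
    # reshape(-1) is a no-op if they are already 1-D.
    weights = torch.as_tensor(class_weights, dtype=torch.float32,
                              device=logits.device).reshape(-1)
    criterion = nn.CrossEntropyLoss(weight=weights)
    return criterion(logits, labels)
```

Equivalently, the tensor can be fixed where it is created instead of where it is consumed, which is what the next comment does.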

@MohamedMansour89

Hi,

In helper_tool.py I changed this line, which returns a 2-D array:

return np.expand_dims(ce_label_weight, axis=0)

to this one, which returns a 1-D array:

return ce_label_weight

Hope it helps you.
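
In other words, this moves the fix to where the weights are created: the weight helper in helper_tool.py should return a 1-D array. A sketch of the change in context (signature and weighting formula paraphrased, not copied from the repo; only the return shape matters here):

```python
import numpy as np

def get_class_weights(num_per_class):
    # Inverse-frequency style weighting; the exact formula in helper_tool.py
    # may differ -- the point is only the shape of what gets returned.
    frequency = num_per_class / float(np.sum(num_per_class))
    ce_label_weight = 1.0 / (frequency + 0.02)
    # return np.expand_dims(ce_label_weight, axis=0)  # old: shape (1, 19), breaks CrossEntropyLoss
    return ce_label_weight                            # new: shape (19,), accepted
```

With a 1-D (19,) array, the tensor built from it already has the shape CrossEntropyLoss expects.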
