Device = cuda not working in Colab #1606
-
While attempting to run with the Colab GPU, torch.cuda correctly reports the device name, but the body model itself raises a "no CUDA detected" error and falls back to the CPU. Could you advise on the CUDA version in Colab, or on a fix in the inference file? I have tried everything in the inference file with no result.
-
Hi @Alisoltan82, which tutorial did you refer to?
-
This may be because the weights were stored with a certain device that isn't the exact one you're using. Can you try:

```python
model = config.get_parsed_content("network").to("cpu")
model.load_state_dict(torch.load(model_path, map_location="cpu"))
model = model.to(device)
```
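A minimal, self-contained sketch of the same pattern, using a plain `torch.nn.Linear` as a hypothetical stand-in for the bundle's network (in the real bundle the model comes from `config.get_parsed_content("network")`): `map_location="cpu"` remaps any GPU-stored tensors onto the CPU at load time, so the checkpoint loads even when CUDA is unavailable, and the model is moved to the actual device only afterwards.

```python
import torch
import torch.nn as nn

# Hypothetical stand-in for the bundle's network.
model = nn.Linear(4, 2)

# Save a checkpoint; in the reported issue this was produced on a GPU machine.
torch.save(model.state_dict(), "model.pt")

# map_location="cpu" remaps any CUDA-stored tensors onto the CPU, so loading
# succeeds even where torch.cuda.is_available() is False (e.g. a CPU-only Colab).
state = torch.load("model.pt", map_location="cpu")
model.load_state_dict(state)

# Only then move the model to whatever device is actually available.
device = "cuda" if torch.cuda.is_available() else "cpu"
model = model.to(device)
```

Loading onto the CPU first and calling `.to(device)` afterwards avoids the mismatch between the device the weights were saved on and the device available at inference time.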