Hi, we have a simple CNN that trains fine on the CPU, but since we installed a new GPU (TITAN X) we want to train it on the GPU instead, and that gives us wrong results.
This is the code for training on the CPU; it works fine.
This is the result; I made it display the average loss every 100 iterations.
I then modified it to run on the GPU; these are the changes I made.
It is indeed much faster, but the result is completely wrong: you can see the loss for the first 100-300 iterations jumps to 3.4+, and it never converges; the loss stays at 3.3-3.4 until the end of training.
Did I miss something? How can I make the training work on the GPU? Thanks.
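Since the original snippets are not included here and the framework is not stated, below is a minimal sketch of the usual CPU-to-GPU conversion, assuming PyTorch. The model, optimizer, and hyperparameters are placeholders for illustration, not the code from this report. The two points the comments flag (creating the optimizer after moving the model, and moving both inputs and targets to the same device) are the most common causes of a loss that stays flat after switching to GPU.

```python
# Minimal sketch of moving a training loop from CPU to GPU (assuming PyTorch).
# The model, optimizer and shapes below are hypothetical stand-ins.
import torch
import torch.nn as nn
import torch.optim as optim

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

model = nn.Sequential(                      # stand-in for the original CNN
    nn.Conv2d(3, 16, 3, padding=1),
    nn.ReLU(),
    nn.AdaptiveAvgPool2d(1),
    nn.Flatten(),
    nn.Linear(16, 10),
)

# Move the model to the GPU *before* creating the optimizer, so the optimizer
# references the CUDA parameters instead of stale CPU copies.
model = model.to(device)
optimizer = optim.SGD(model.parameters(), lr=0.01)
criterion = nn.CrossEntropyLoss()

def train_step(inputs, targets):
    # Inputs and labels must live on the same device as the model; forgetting
    # to move one of them is a common source of wrong results when porting a
    # CPU training loop to the GPU.
    inputs = inputs.to(device)
    targets = targets.to(device)

    optimizer.zero_grad()
    outputs = model(inputs)
    loss = criterion(outputs, targets)
    loss.backward()
    optimizer.step()
    return loss.item()
```

If the original code builds a flattened parameter view or constructs the optimizer before converting the model to CUDA (as older Torch-style code sometimes does), the updates may be applied to stale CPU storage, which would match the symptom of a loss that never decreases.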