Very High MSE Loss #2
Comments
You need to find the minima of the function given in the dataset. What you have done instead is reduce the MSE loss between the fitted NN and the target values, which changes the objective function altogether. Also, scaling plays a big role in NN convergence. Try scaling the values.
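The thread does not show how the scaling would be done; a minimal sketch with NumPy, using hypothetical stand-in data with 12 features to match the posted network, might look like this:

```python
import numpy as np

# Hypothetical data: 12 features per sample, large-magnitude targets
# (the contest dataset itself is not shown in the thread).
rng = np.random.default_rng(0)
X = rng.uniform(-1e3, 1e3, size=(256, 12))
y = (X ** 2).sum(axis=1)

# Standardize features and targets to zero mean, unit variance before
# training; unscaled targets of this magnitude produce huge MSE values.
X_mean, X_std = X.mean(axis=0), X.std(axis=0)
y_mean, y_std = y.mean(), y.std()
X_scaled = (X - X_mean) / X_std
y_scaled = (y - y_mean) / y_std

# Predictions made in scaled space are mapped back afterwards:
# y_pred = y_pred_scaled * y_std + y_mean
```

The same transform must be stored and reapplied at inference time, otherwise the model sees inputs on a different scale than it was trained on.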
ah ok
hey, is it okay to estimate the function and then find its minima? But it may not correspond to the actual minima. Can you give hints on this? To get a better estimate, maybe I'll use gradient-free black-box optimization (ZO-AdaMM), which eliminates the need to compute the gradient of the function with respect to its input
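ZO-AdaMM itself is more involved, but its core idea, replacing exact gradients with zeroth-order estimates, can be sketched with central finite differences (the quadratic test function below is purely illustrative, not the contest objective):

```python
import numpy as np

def zo_gradient(f, x, mu=1e-4):
    """Zeroth-order gradient estimate via central differences:
    g_i ~= (f(x + mu*e_i) - f(x - mu*e_i)) / (2*mu),
    so only function evaluations are needed, never an analytic gradient."""
    g = np.zeros_like(x, dtype=float)
    for i in range(x.size):
        e = np.zeros_like(x, dtype=float)
        e[i] = mu
        g[i] = (f(x + e) - f(x - e)) / (2 * mu)
    return g

# Illustrative objective: f(x) = sum(x^2), whose true gradient is 2x.
f = lambda x: float(np.sum(x ** 2))
x = np.array([3.0, -1.0])
g = zo_gradient(f, x)
```

ZO-AdaMM feeds estimates like this (built from random perturbation directions rather than one coordinate at a time) into an Adam-style update.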
Yes, you can use any method. HINT: look at other easy methods that will not require curve fitting.
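One easy method that requires no curve fitting at all, assuming the dataset already tabulates inputs alongside their function values, is to take the argmin of the recorded values directly (the arrays below are hypothetical stand-ins for the actual dataset):

```python
import numpy as np

# Hypothetical stand-in for the dataset: rows of 12-dimensional inputs
# plus a column of recorded function values.
rng = np.random.default_rng(1)
inputs = rng.uniform(-5, 5, size=(1000, 12))
values = (inputs ** 2).sum(axis=1)  # assumed objective, for illustration

# No model, no gradients: the sampled minimum is simply the row with
# the smallest recorded value.
best_idx = int(np.argmin(values))
best_input, best_value = inputs[best_idx], values[best_idx]
```

This only recovers the best point among the samples, not the true continuous minimum, but it is a sensible baseline before fitting anything.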
Epoch [95/100], Loss: 15795335555002138624.0000
Epoch [96/100], Loss: 15947571735960748032.0000
Epoch [97/100], Loss: 22230783709444833280.0000
Epoch [98/100], Loss: 24243408957763223552.0000
Epoch [99/100], Loss: 20029352523428003840.0000
Epoch [100/100], Loss: 26956084463393570816.0000
Mean Squared Error: 20261864048330539008.0000
I trained a simple neural network with four linear layers, ReLU, a dropout layer, and the Adam optimizer
import torch
import torch.nn as nn

class Net(nn.Module):
    def __init__(self):  # was "init"; PyTorch modules need __init__
        super().__init__()
        self.fc1 = nn.Linear(12, 128)  # Linear must be qualified as nn.Linear
        self.fc2 = nn.Linear(128, 64)
        self.fc3 = nn.Linear(64, 32)
        self.fc4 = nn.Linear(32, 1)
        self.dropout = nn.Dropout(0.2)

    def forward(self, x):  # reconstructed from the description above
        x = self.dropout(torch.relu(self.fc1(x)))
        x = self.dropout(torch.relu(self.fc2(x)))
        return self.fc4(torch.relu(self.fc3(x)))
The error seems to be very high.
Something is unique about the dataset.
A custom model tailored to this dataset is required.