
全连接网络 (Fully connected network) #28

Open

yanglu1994 opened this issue Oct 11, 2017 · 3 comments

@yanglu1994

In the fully connected network, why does the loss function compare tf.matmul(hidden, fc2_weight) + fc2_biases directly with the labels, instead of comparing after applying softmax and argmax?

@CreatCodeBuild (Owner)

Could you point me to the specific line of code? It has been a year, so I have forgotten some of the details.

@yanglu1994 (Author) commented Oct 13, 2017

[The return value of model is `return tf.matmul(hidden, fc2_weight) + fc2_biases`]

```python
logits = model(self.tf_train_samples)
with tf.name_scope('loss'):
    self.loss = tf.reduce_mean(tf.nn.softmax_cross_entropy_with_logits(logits, self.tf_train_labels))
    self.loss += self.apply_regularization(_lambda=5e-4)
```

The label set is one-hot, so a value of 1 can be read as the probability of the image belonging to that class being 1. But when the loss is computed, the model's raw output is compared with the labels directly. Why not add a softmax layer first, to convert the output into probabilities before comparing?

@CreatCodeBuild (Owner)

The function name should make its purpose clear: tf.nn.softmax_cross_entropy_with_logits(logits, self.tf_train_labels). The softmax is applied inside the function, so the raw logits are exactly what it expects.
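
To make this concrete: the op fuses the softmax and the cross-entropy into one numerically stable call, so adding your own softmax layer before it would apply softmax twice. A minimal sketch of the equivalence, assuming the TF 1.x API this repo uses (the logits and labels values here are made up for illustration):

```python
import tensorflow as tf  # TF 1.x

# Hypothetical logits (raw model output) and one-hot labels.
logits = tf.constant([[2.0, 1.0, 0.1]])
labels = tf.constant([[1.0, 0.0, 0.0]])

# Fused op: softmax + cross-entropy in one numerically stable step.
fused = tf.nn.softmax_cross_entropy_with_logits(labels=labels, logits=logits)

# The same computation written out by hand: softmax first, then cross-entropy.
probs = tf.nn.softmax(logits)
manual = -tf.reduce_sum(labels * tf.log(probs), axis=1)

with tf.Session() as sess:
    print(sess.run([fused, manual]))  # both print approximately [0.417]
```

As for argmax: it is not differentiable, so it cannot appear inside a training loss. It is only used at evaluation time, to turn logits into predicted class indices.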
