
What are logits, softmax and softmax_cross_entropy_with_logits?

Posted by: admin May 25, 2018

Questions:

I was going through the tensorflow API docs here. In the tensorflow documentation, they use a keyword called logits. What is it? A lot of methods in the API docs are written like

tf.nn.softmax(logits, name=None)

If these logits are just Tensors, why keep a different name like logits?

Another thing is that there are two methods I could not differentiate between:

tf.nn.softmax(logits, name=None)
tf.nn.softmax_cross_entropy_with_logits(logits, labels, name=None)

What is the difference between them? The docs are not clear to me. I know what tf.nn.softmax does, but not the other. An example would be really helpful.
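
To make the question concrete, this is roughly how I would call both of them (TF 1.x style; the logits and one-hot labels below are just made-up values):

import tensorflow as tf

# Made-up raw scores (logits) for a batch of 2 examples and 3 classes.
logits = tf.constant([[2.0, 1.0, 0.1],
                      [0.5, 2.5, 0.3]])
# Made-up one-hot labels for the same batch.
labels = tf.constant([[1.0, 0.0, 0.0],
                      [0.0, 1.0, 0.0]])

# This one I understand: each row of scores becomes probabilities summing to 1.
probs = tf.nn.softmax(logits)

# This is the combined op I am asking about.
loss = tf.nn.softmax_cross_entropy_with_logits(labels=labels, logits=logits)

with tf.Session() as sess:
    print(sess.run(probs))  # shape (2, 3): per-class probabilities
    print(sess.run(loss))   # shape (2,): one value per example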

Answers: