All the models in TensorNets return a softmax `tf.Tensor`, while the models in tensorflow/models (also the backbone of tensorflow/hub) return a logits `tf.Tensor` (the values before softmax). That is because most regular TensorFlow loss APIs take logits as an argument. As a result, the following three losses are possible:

- `tf.losses.softmax_cross_entropy(onehot_labels=outputs, logits=model)`,
- `tf.nn.softmax_cross_entropy_with_logits_v2(labels=outputs, logits=model.logits)`,
- `tf.losses.softmax_cross_entropy(onehot_labels=outputs, logits=model.logits)`,

where `model` is any model function in TensorNets (e.g., `model = nets.MobileNet50(inputs)`) and `model.logits` is equivalent to `model.get_outputs()[-2]`. The following is a comparison of the three losses mentioned above, and a TL;DR.
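To see why the distinction matters, here is a minimal NumPy sketch (not the TensorFlow calls themselves) of what a logits-based cross-entropy loss computes. The values and helper names are illustrative assumptions; the point is that passing a softmax output where logits are expected applies softmax twice and silently changes the loss:

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over the last axis.
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def xent_from_logits(onehot_labels, logits):
    # Cross-entropy as TF's logits-based losses compute it:
    # softmax is applied internally to `logits`.
    return -(onehot_labels * np.log(softmax(logits))).sum(axis=-1)

logits = np.array([2.0, 1.0, 0.1])  # raw pre-softmax model outputs
labels = np.array([1.0, 0.0, 0.0])  # one-hot ground truth

# Correct: feed raw logits (what `model.logits` provides in TensorNets,
# or what tensorflow/models returns directly).
correct = xent_from_logits(labels, logits)

# Incorrect: feed the softmax output itself as if it were logits
# (what happens when a TensorNets `model` tensor is passed directly
# to a logits-expecting loss). Softmax gets applied twice, which
# flattens the distribution and inflates the loss here.
double_softmax = xent_from_logits(labels, softmax(logits))

print(correct, double_softmax)
```

The double-softmax variant still runs without error, which is what makes the mistake easy to miss: the loss is merely a different, less sharply peaked function of the model outputs, so training degrades quietly rather than failing.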