`#include <nn_ops.h>`
Computes softmax activations.
For each batch index i and class j:

`softmax[i, j] = exp(logits[i, j]) / sum_k(exp(logits[i, k]))`
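For example, a row of logits `[1, 2, 3]` maps to probabilities of roughly `[0.09, 0.24, 0.67]`, since exp(1) + exp(2) + exp(3) ≈ 30.19 and each exponential is divided by that sum.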
Arguments:

- scope: A Scope object.
- logits: 2-D with shape `[batch_size, num_classes]`.

Returns:

- `Output`: Same shape as `logits`.

| Constructors and Destructors | |
|---|---|
| `Softmax(const ::tensorflow::Scope & scope, ::tensorflow::Input logits)` | |

| Public attributes | |
|---|---|
| `softmax` | `::tensorflow::Output` |

| Public functions | |
|---|---|
| `node() const` | `::tensorflow::Node *` |
| `operator::tensorflow::Input() const` | |
| `operator::tensorflow::Output() const` | |
Public attributes:

- `::tensorflow::Output softmax` — the output tensor produced by the op.

Public functions:

- `Softmax(const ::tensorflow::Scope & scope, ::tensorflow::Input logits)` — builds the op in the given scope.
- `::tensorflow::Node * node() const` — returns the underlying graph node.
- `operator::tensorflow::Input() const` — lets the op be passed wherever an `Input` is expected.
- `operator::tensorflow::Output() const` — lets the op be passed wherever an `Output` is expected.
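A minimal usage sketch follows. It assumes a standard TensorFlow C++ client build (headers and linking set up as in the C++ API guide); the logits values are illustrative only.

```c++
// Minimal sketch: build and run a Softmax op with the C++ client API.
// Assumes a standard TensorFlow C++ build; input values are illustrative.
#include "tensorflow/cc/client/client_session.h"
#include "tensorflow/cc/ops/standard_ops.h"
#include "tensorflow/core/framework/tensor.h"

#include <iostream>
#include <vector>

int main() {
  tensorflow::Scope root = tensorflow::Scope::NewRootScope();

  // 2-D logits with shape [batch_size = 2, num_classes = 3].
  auto logits = tensorflow::ops::Const(
      root, {{1.0f, 2.0f, 3.0f}, {1.0f, 1.0f, 1.0f}});

  // Each row of the result is normalized to sum to 1.
  auto softmax_op = tensorflow::ops::Softmax(root, logits);

  tensorflow::ClientSession session(root);
  std::vector<tensorflow::Tensor> outputs;
  // The op converts implicitly to ::tensorflow::Output, so it can be
  // fetched directly; Run reports errors via Status.
  TF_CHECK_OK(session.Run({softmax_op}, &outputs));

  // Expected (approximately): [[0.09, 0.24, 0.67], [0.33, 0.33, 0.33]].
  std::cout << outputs[0].matrix<float>() << std::endl;
  return 0;
}
```

Passing `softmax_op` directly to `Run` relies on the `operator::tensorflow::Output()` conversion documented above; fetching the `softmax` attribute explicitly is equivalent.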