giagrad.Tensor.softmax
- Tensor.softmax(axis) → Tensor
Applies the Softmax function to every 1-D slice defined by axis. See Softmax.

The elements of the n-dimensional output Tensor will lie in the range \([0, 1]\) and sum to \(1\) for each 1-D slice defined by axis.

Softmax for a one-dimensional slice is defined as:
\[\text{Softmax}(x_i) = \frac{\exp(x_i)}{\sum_j \exp(x_j)}\]

- Parameters:
axis (int) – The dimension along which Softmax will be computed (so every slice along axis will sum to 1).
Examples
>>> t = Tensor.empty(2, 3).uniform(-1, 1)
>>> t
tensor: [[ 0.27639335  0.7524293   0.69203097]
         [ 0.37772807 -0.9291505  -0.80418533]]
>>> t.softmax(axis=1)
tensor: [[0.24242324 0.390224   0.36735278]
         [0.6339727  0.17159334 0.19443396]] fn: Softmax(axis=1)
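For intuition, the formula can be reproduced with a plain NumPy sketch; the softmax_reference helper below is illustrative only and is not part of the giagrad API. Shifting each slice by its maximum leaves the result unchanged while keeping exp() from overflowing.

import numpy as np

def softmax_reference(x, axis):
    # Plain NumPy version of the formula above (illustrative sketch,
    # not the giagrad implementation). Subtracting the per-slice max
    # does not change the result but avoids overflow for large inputs.
    shifted = x - x.max(axis=axis, keepdims=True)
    e = np.exp(shifted)
    return e / e.sum(axis=axis, keepdims=True)

x = np.array([[ 0.27639335,  0.7524293 ,  0.69203097],
              [ 0.37772807, -0.9291505 , -0.80418533]])
out = softmax_reference(x, axis=1)
print(out)              # matches the tensor values above, up to rounding
print(out.sum(axis=1))  # each 1-D slice sums to 1 -> [1. 1.]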