giagrad.Tensor.softplus#

Tensor.softplus(beta=1.0, limit=20.0) → Tensor[source]#

Applies the Softplus function element-wise. See Softplus.

For numerical stability, the implementation reverts to a linear function when \(data_i \times \text{beta} > \text{limit}\).

\[out_i = \frac{1}{\text{beta}} \cdot \log(1 + \exp(\text{beta} \times data_i))\]
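Equivalently, with the stability cutoff made explicit (a restatement of the two rules above; the linear branch is taken to be the identity, which is consistent with the example below):

\[out_i = \begin{cases} \frac{1}{\text{beta}} \cdot \log(1 + \exp(\text{beta} \times data_i)) & \text{if } \text{beta} \times data_i \le \text{limit} \\ data_i & \text{otherwise} \end{cases}\]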
Parameters:
  • beta (float) – The \(\beta\) value for the Softplus formulation.

  • limit (float) – When \(data_i \times \text{beta}\) exceeds this value, the output reverts to a linear function.

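As a rough reference (a NumPy sketch of the piecewise rule above, not giagrad's actual implementation; the name softplus_reference is hypothetical):

>>> import numpy as np
>>> def softplus_reference(data, beta=1.0, limit=20.0):
...     scaled = beta * data
...     # cap the exponent so the smooth branch never overflows
...     smooth = np.log1p(np.exp(np.minimum(scaled, limit))) / beta
...     # revert to the identity where beta * data exceeds limit
...     return np.where(scaled > limit, data, smooth)

With beta=5 and limit=1 this reproduces the values shown in the example below.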
Examples

>>> t = Tensor.empty(2, 3).uniform(-1, 1)
>>> t
tensor: [[ 0.54631704 -0.703394    0.85786563]
         [-0.24458279  0.23733494 -0.32190484]]
>>> t.softplus(beta=5, limit=1)
tensor: [[0.54631704 0.00585142 0.85786563]
         [0.05160499 0.23733494 0.03646144]] fn: Softplus(lim=1, alpha=5)