giagrad.Tensor.relu#
- Tensor.relu() → Tensor [source]#
Applies the Rectified Linear Unit (ReLU) function element-wise. See ReLU.
\[out_i = \max(0, data_i)\]
Examples
>>> t = Tensor.empty(2, 3).uniform(-1, 1)
>>> t
tensor: [[ 0.96863234  0.64852756 -0.52318954]
         [-0.18809071 -0.48402452  0.86754996]]
>>> t.relu()
tensor: [[0.96863234 0.64852756 0.        ]
         [0.         0.         0.86754996]] fn: ReLU
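The element-wise rule \(out_i = \max(0, data_i)\) can be sketched with plain NumPy, using `np.maximum` as a stand-in for giagrad's Tensor internals (this is an illustrative assumption, not giagrad's actual implementation):

```python
import numpy as np

# Illustrative sketch of ReLU: out_i = max(0, data_i), applied element-wise.
# NumPy is used here as a stand-in; giagrad's Tensor wraps similar array math.
def relu(data: np.ndarray) -> np.ndarray:
    # np.maximum broadcasts the scalar 0 against every element,
    # zeroing out negatives and passing positives through unchanged.
    return np.maximum(0, data)

data = np.array([[ 0.96863234,  0.64852756, -0.52318954],
                 [-0.18809071, -0.48402452,  0.86754996]])
print(relu(data))
```

Negative entries become exactly 0 while non-negative entries are left untouched, matching the doctest output above.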