giagrad.Tensor.leakyrelu#
Tensor.leakyrelu(neg_slope=0.01) → Tensor [source]#
Creates a new Tensor by applying the Leaky Rectified Linear Unit (Leaky ReLU) function to data. See Leaky ReLU.
\[\begin{split}out_i = \begin{cases} data_i & \text{if } data_i > 0 \\ \text{neg\_slope} \times data_i & \text{if } data_i \leq 0 \end{cases}\end{split}\]
- Parameters:
neg_slope (float) – Controls the angle of the negative slope (which only affects negative input values).
Examples
>>> t = Tensor.empty(2, 3, requires_grad=True).uniform(-1, 1)
>>> t
tensor: [[-0.83589154  0.8874637  -0.465633  ]
         [-0.5879877   0.22095676 -0.0592072 ]]
>>> d = t.leakyrelu(neg_slope=3)
>>> d
tensor: [[-2.5076747   0.8874637  -1.396899  ]
         [-1.7639632   0.22095676 -0.17762159]] fn: LeakyReLU(neg_slope=3)
>>> d.backward()
>>> t.grad
array([[3., 1., 3.],
       [3., 1., 3.]], dtype=float32)
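For intuition, the formula above and its derivative can be written directly with NumPy. This is a minimal illustrative sketch, not the giagrad implementation; the helper names are hypothetical.

import numpy as np

def leakyrelu(data, neg_slope=0.01):
    # out_i = data_i if data_i > 0, else neg_slope * data_i
    return np.where(data > 0, data, neg_slope * data)

def leakyrelu_grad(data, neg_slope=0.01):
    # derivative w.r.t. data_i: 1 where data_i > 0, neg_slope elsewhere
    return np.where(data > 0, 1.0, neg_slope).astype(np.float32)

x = np.array([[-0.5, 0.9], [0.2, -0.1]], dtype=np.float32)
print(leakyrelu(x, neg_slope=3))       # negative entries scaled by 3
print(leakyrelu_grad(x, neg_slope=3))  # gradient pattern matches t.grad above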