giagrad.Tensor.xavier_normal
Tensor.xavier_normal(gain=1.0) → Tensor [source]
Fills Tensor data with values using Xavier normal initialization, also known as Glorot normal initialization.
This method is described in Understanding the difficulty of training deep feedforward neural networks - Glorot, X. & Bengio, Y. (2010), using a normal distribution. Tensor data will have values sampled from \(\mathcal{N}(0, \sigma^2)\) where
\[\sigma = \text{gain} \times \sqrt{\frac{2}{\text{fan_in} + \text{fan_out}}}\]
Parameters:
gain (float) – an optional scaling factor.
Examples
>>> from giagrad import Tensor, calculate_gain
>>> Tensor.empty(3, 5).xavier_normal(gain=calculate_gain('relu'))
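For reference, the standard deviation produced by the formula above for the (3, 5) tensor in the example can be worked out by hand. The sketch below is illustrative only, not the library's implementation; it assumes the common convention that calculate_gain('relu') equals \(\sqrt{2}\), and treats the two dimensions of the (3, 5) tensor as fan_in and fan_out (only their sum enters the formula, so their order does not matter).

>>> import math
>>> gain = math.sqrt(2.0)        # assumed value of calculate_gain('relu')
>>> fan_in, fan_out = 5, 3       # the two dimensions of a (3, 5) tensor
>>> sigma = gain * math.sqrt(2.0 / (fan_in + fan_out))
>>> round(sigma, 4)
0.7071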