Normalization
We define two kinds of normalization: component and norm.
Definition
component
component normalization refers to tensors whose components each have values of order 1. More precisely, the second moment of each component is 1: \(\langle x_i^2 \rangle = 1\).
Examples:
[1.0, -1.0, -1.0, 1.0]
[1.0, 1.0, 1.0, 1.0]
the mean does not need to be zero: [0.0, 2.0, 0.0, 0.0]
this is still fine because \(\|x\|^2 = n\)
torch.randn(10)
tensor([ 1.4346, -0.3785, 1.4520, -0.8914, -1.1336, 0.6574, -0.5302, -0.1507,
-1.0169, 2.2482])
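A quick empirical check of the definition (a sketch in plain PyTorch; the sample size of 100,000 is arbitrary): the components of `torch.randn` have unit variance, so the tensor is component-normalized by construction.

```python
import torch

# Each component of a standard normal tensor satisfies E[x_i^2] = 1,
# so torch.randn produces component-normalized tensors.
x = torch.randn(100_000)
second_moment = x.pow(2).mean()
print(second_moment)  # close to 1.0
```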
norm
norm normalization refers to tensors whose norm is close to 1.
Examples:
[0.5, -0.5, -0.5, 0.5]
[0.5, 0.5, 0.5, 0.5]
the mean does not need to be zero: [0.0, 1.0, 0.0, 0.0]
torch.randn(10) / 10**0.5
tensor([-0.0563, 0.3049, 0.8694, -0.4810, 0.1809, -0.0580, 0.2061, 0.0978,
0.5951, 0.1251])
There is just a factor of \(\sqrt{n}\) between the two normalizations.
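The \(\sqrt{n}\) relation can be checked directly (a sketch; `n = 10` matches the examples above): dividing a component-normalized tensor by \(\sqrt{n}\) yields a norm-normalized one, since \(\|x\|^2\) concentrates around \(n\).

```python
import torch

n = 10
x = torch.randn(n)   # component-normalized: E[x_i^2] = 1, so E[||x||^2] = n
y = x / n**0.5       # norm-normalized:      E[||y||^2] = 1

print(x.norm())  # fluctuates around sqrt(10) ≈ 3.16
print(y.norm())  # fluctuates around 1
```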
Motivation
Assuming that the weight distribution obeys \(\langle w_i \rangle = 0\) and \(\langle w_i w_j \rangle = \sigma^2 \delta_{ij}\), and that \(w\) is independent of \(x\), the first two moments of \(x \cdot w\) (and therefore its mean and variance) are functions of the second moment of \(x\) only:

\(\langle x \cdot w \rangle = \sum_i \langle x_i \rangle \langle w_i \rangle = 0\)

\(\langle (x \cdot w)^2 \rangle = \sum_{ij} \langle x_i x_j \rangle \langle w_i w_j \rangle = \sigma^2 \sum_i \langle x_i^2 \rangle\)
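A Monte Carlo check of these moments (a sketch; the sizes `n = 10`, `samples = 100_000` and the value `sigma = 0.3` are arbitrary choices):

```python
import torch

# With <w_i> = 0, <w_i w_j> = sigma^2 * delta_ij, and w independent of x,
# we expect <x.w> = 0 and <(x.w)^2> = sigma^2 * sum_i <x_i^2> = sigma^2 * n
# for component-normalized x.
n, samples, sigma = 10, 100_000, 0.3
x = torch.randn(samples, n)          # component-normalized inputs
w = sigma * torch.randn(samples, n)  # weights with variance sigma^2
xw = (x * w).sum(dim=1)

print(xw.mean())         # close to 0
print(xw.pow(2).mean())  # close to sigma^2 * n = 0.9
```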
Testing
You can use e3nn.util.test.assert_normalized
to check whether a function or module is normalized at initialization:
from e3nn.util.test import assert_normalized
from e3nn import o3
assert_normalized(o3.Linear("10x0e", "10x0e"))