layer_norm
- ivy.layer_norm(x, normalized_idxs, /, *, scale=None, offset=None, eps=1e-05, new_std=1.0, out=None)
Apply Layer Normalization over a mini-batch of inputs.
- Parameters:
  - x (Union[Array, NativeArray]) – Input array.
  - normalized_idxs (List[int]) – Indices to apply the normalization to.
  - scale (Optional[Union[Array, NativeArray]], default: None) – Learnable gamma variables for elementwise post-multiplication. Default is None.
  - offset (Optional[Union[Array, NativeArray]], default: None) – Learnable beta variables for elementwise post-addition. Default is None.
  - eps (float, default: 1e-05) – Small constant to add to the denominator. Default is 1e-05.
  - new_std (float, default: 1.0) – The standard deviation of the new normalized values. Default is 1.0.
  - out (Optional[Array], default: None) – Optional output array, for writing the result to. It must have a shape that the inputs broadcast to.
- Return type:
Array
- Returns:
ret – The layer after applying layer normalization.
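For reference, the parameters above map directly onto the underlying computation: statistics are taken over normalized_idxs, the normalized values are rescaled to new_std, and scale (gamma) and offset (beta) are applied elementwise afterwards. The snippet below is a minimal sketch of that computation, not the backend implementation; in particular, the exact placement of eps within the denominator is an assumption and may differ across backends.

>>> import ivy
>>> def layer_norm_reference(x, normalized_idxs, *, scale=None, offset=None,
...                          eps=1e-05, new_std=1.0):
...     # statistics are computed over the axes listed in normalized_idxs
...     mean = ivy.mean(x, axis=normalized_idxs, keepdims=True)
...     var = ivy.var(x, axis=normalized_idxs, keepdims=True)
...     # normalize, then rescale to the requested standard deviation
...     ret = (x - mean) / ivy.sqrt(var + eps) * new_std
...     # gamma (scale) is applied by post-multiplication,
...     # beta (offset) by post-addition
...     if scale is not None:
...         ret = ret * scale
...     if offset is not None:
...         ret = ret + offset
...     return ret

Applied to the first example below, layer_norm_reference(ivy.array([[1.0, 2.0], [3.0, 4.0]]), [0, 1], new_std=2.0) reproduces the ivy.layer_norm output up to rounding.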
Examples
With ivy.Array input:

>>> x = ivy.array([[1.0, 2.0], [3.0, 4.0]])
>>> y = ivy.layer_norm(x, [0, 1], new_std=2.0)
>>> print(y)
ivy.array([[-2.68 , -0.894],
           [ 0.894, 2.68 ]])

>>> x = ivy.array([[1., 2., 3.], [4., 5., 6.]])
>>> y = ivy.zeros((2, 3))
>>> ivy.layer_norm(x, [0], out=y)
>>> print(y)
ivy.array([[-1., -1., -1.],
           [ 1., 1., 1.]])

>>> x = ivy.array([[0.0976, -0.3452, 1.2740],
...                [0.1047, 0.5886, 1.2732],
...                [0.7696, -1.7024, -2.2518]])
>>> y = ivy.layer_norm(x, [0, 1], eps=0.001,
...                    new_std=1.5, scale=0.5, offset=[0.5, 0.02, 0.1])
>>> print(y)
ivy.array([[ 0.826, -0.178, 0.981 ],
           [ 0.831, 0.421, 0.981 ],
           [ 1.26 , -1.05 , -1.28 ]])

With a mix of ivy.Array and ivy.Container inputs:

>>> x = ivy.array([[1., 2., 3.], [4., 5., 6.]])
>>> normalized_idxs = ivy.Container({'a': [0], 'b': [1]})
>>> y = ivy.layer_norm(x, normalized_idxs, new_std=1.25, offset=0.2)
>>> print(y)
{
    a: ivy.array([[-1.25, -1.25, -1.25],
                  [1.25, 1.25, 1.25]]),
    b: ivy.array([[-1.53, 0., 1.53],
                  [-1.53, 0., 1.53]])
}

With one ivy.Container input:

>>> x = ivy.Container({'a': ivy.array([7., 10., 12.]),
...                    'b': ivy.array([[1., 2., 3.], [4., 5., 6.]])})
>>> normalized_idxs = [0]
>>> y = ivy.layer_norm(x, normalized_idxs, eps=1.25, scale=0.3)
>>> print(y)
{
    a: ivy.array([-0.34198591, 0.04274819, 0.29923761]),
    b: ivy.array([[-0.24053511, -0.24053511, -0.24053511],
                  [0.24053511, 0.24053511, 0.24053511]])
}
With multiple ivy.Container inputs:

>>> x = ivy.Container(a=ivy.array([7.0, 10.0, 12.0]),
...                   b=ivy.array([[1.0, 2.0, 3.0], [4.0, 5.0, 6.0]]))
>>> normalized_idxs = ivy.Container(a=[0], b=[1])
>>> new_std = ivy.Container(a=1.25, b=1.5)
>>> bias = ivy.Container(a=[0.2, 0.5, 0.7], b=0.3)
>>> y = ivy.layer_norm(x, normalized_idxs, new_std=new_std, offset=0.2)
>>> print(y)
{
    a: ivy.array([-1.62, 0.203, 1.42]),
    b: ivy.array([[-1.84, 0., 1.84],
                  [-1.84, 0., 1.84]])
}

Both the description and the type hints above assume an array input for simplicity, but this function is nestable, and therefore also accepts ivy.Container instances in place of any of the arguments.
- Array.layer_norm(self, normalized_idxs, /, *, scale=None, offset=None, eps=1e-05, new_std=1.0, out=None)
ivy.Array instance method variant of ivy.layer_norm. This method simply wraps the function, and so the docstring for ivy.layer_norm also applies to this method with minimal changes.
- Parameters:
  - self (Array) – Input array.
  - normalized_idxs (List[int]) – Indices to apply the normalization to.
  - scale (Optional[Union[Array, NativeArray]], default: None) – Learnable gamma variables for elementwise post-multiplication. Default is None.
  - offset (Optional[Union[Array, NativeArray]], default: None) – Learnable beta variables for elementwise post-addition. Default is None.
  - eps (float, default: 1e-05) – Small constant to add to the denominator. Default is 1e-05.
  - new_std (float, default: 1.0) – The standard deviation of the new normalized values. Default is 1.0.
  - out (Optional[Array], default: None) – Optional output array, for writing the result to. It must have a shape that the inputs broadcast to.
- Return type:
Array
- Returns:
ret – The layer after applying layer normalization.
Examples
>>> x = ivy.array([[0.0976, -0.3452, 1.2740],
...                [0.1047, 0.5886, 1.2732],
...                [0.7696, -1.7024, -2.2518]])
>>> norm = x.layer_norm([0, 1], eps=0.001,
...                     new_std=1.5, scale=0.5, offset=[0.5, 0.02, 0.1])
>>> print(norm)
ivy.array([[ 0.826, -0.178, 0.981 ],
           [ 0.831, 0.421, 0.981 ],
           [ 1.26 , -1.05 , -1.28 ]])
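Since the method simply wraps ivy.layer_norm, the method call and the functional call are interchangeable; a minimal illustration (the variable names are only for this example):

>>> x = ivy.array([[1., 2., 3.], [4., 5., 6.]])
>>> y_method = x.layer_norm([0], new_std=2.0)
>>> y_func = ivy.layer_norm(x, [0], new_std=2.0)

Here y_method and y_func hold the same values.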
- Container.layer_norm(self, normalized_idxs, /, *, scale=None, offset=None, eps=1e-05, new_std=1.0, out=None)
ivy.Container instance method variant of ivy.layer_norm. This method simply wraps the function, and so the docstring for ivy.layer_norm also applies to this method with minimal changes.
- Parameters:
  - self (Union[Array, NativeArray, Container]) – Input container.
  - normalized_idxs (List[Union[int, Container]]) – Indices to apply the normalization to.
  - scale (Optional[Union[Array, NativeArray, Container]], default: None) – Learnable gamma variables for elementwise post-multiplication. Default is None.
  - offset (Optional[Union[Array, NativeArray, Container]], default: None) – Learnable beta variables for elementwise post-addition. Default is None.
  - eps (Union[float, Container], default: 1e-05) – Small constant to add to the denominator. Default is 1e-05.
  - new_std (Union[float, Container], default: 1.0) – The standard deviation of the new normalized values. Default is 1.0.
  - out (Optional[Union[Array, Container]], default: None) – Optional output container, for writing the result to. It must have a shape that the inputs broadcast to.
- Return type:
Container
- Returns:
ret – The layer after applying layer normalization.
Examples
With one ivy.Container input:

>>> x = ivy.Container({'a': ivy.array([7., 10., 12.]),
...                    'b': ivy.array([[1., 2., 3.], [4., 5., 6.]])})
>>> normalized_idxs = [0]
>>> norm = x.layer_norm(normalized_idxs, eps=1.25, scale=0.3)
>>> print(norm)
{
    a: ivy.array([-0.34198591, 0.04274819, 0.29923761]),
    b: ivy.array([[-0.24053511, -0.24053511, -0.24053511],
                  [0.24053511, 0.24053511, 0.24053511]])
}

With multiple ivy.Container inputs:

>>> x = ivy.Container({'a': ivy.array([7., 10., 12.]),
...                    'b': ivy.array([[1., 2., 3.], [4., 5., 6.]])})
>>> normalized_idxs = ivy.Container({'a': [0], 'b': [1]})
>>> new_std = ivy.Container({'a': 1.25, 'b': 1.5})
>>> bias = ivy.Container({'a': [0.2, 0.5, 0.7], 'b': 0.3})
>>> norm = x.layer_norm(normalized_idxs, new_std=new_std, offset=1)
>>> print(norm)
{
    a: ivy.array([-1.62221265, 0.20277636, 1.41943574]),
    b: ivy.array([[-1.83710337, 0., 1.83710337],
                  [-1.83710337, 0., 1.83710337]])
}
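Because this method is nestable, it maps ivy.layer_norm over each leaf array of the container. The sketch below is only an illustration of that per-leaf equivalence (the norm_a and norm_b names are illustrative):

>>> x = ivy.Container(a=ivy.array([7., 10., 12.]),
...                   b=ivy.array([[1., 2., 3.], [4., 5., 6.]]))
>>> norm = x.layer_norm([0])
>>> norm_a = ivy.layer_norm(x.a, [0])  # same values as norm.a
>>> norm_b = ivy.layer_norm(x.b, [0])  # same values as norm.b

Here norm.a matches norm_a and norm.b matches norm_b.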