Activations#
- ivy.celu(x, /, *, alpha=1.0, complex_mode='jax', out=None)[source]#
Apply the Continuously Differentiable Exponential Linear Unit (CELU) activation function to each element of the input.
- Parameters:
  - x (Union[Array, NativeArray]) – Input array.
  - alpha (float, default: 1.0) – The alpha value (negative slope) for the CELU formulation. Default is 1.0.
  - complex_mode (Literal['split', 'magnitude', 'jax'], default: 'jax') – optional specifier for how to handle complex data types. See ivy.func_wrapper.handle_complex_input for more detail.
  - out (Optional[Array], default: None) – optional output array, for writing the result to. It must have a shape that the inputs broadcast to.
- Return type:
- Returns:
ret – The input array with celu applied element-wise.
Examples
With ivy.Array input:

>>> x = ivy.array([0.39, -0.85])
>>> y = ivy.celu(x)
>>> y
ivy.array([ 0.39, -0.57])

With ivy.Container input:

>>> x = ivy.Container(a=ivy.array([0.39, -0.85]), b=ivy.array([1., -0.2]))
>>> y = ivy.celu(x)
>>> y
{
    a: ivy.array([0.38999999, -0.57]),
    b: ivy.array([1., -0.18])
}
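For reference, CELU is commonly defined as max(0, x) + min(0, alpha * (exp(x / alpha) - 1)). The following is a minimal NumPy sketch of that standard formula (not Ivy's actual implementation), which reproduces the example values above:

```python
import numpy as np

def celu(x, alpha=1.0):
    # Identity for positive inputs; smooth exponential saturation
    # towards -alpha for negative inputs.
    x = np.asarray(x, dtype=np.float64)
    return np.maximum(0.0, x) + np.minimum(0.0, alpha * (np.exp(x / alpha) - 1.0))

print(celu([0.39, -0.85]))  # ~[0.39, -0.5726]
```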
- ivy.elu(x, /, *, alpha=1.0, out=None)[source]#
Apply the exponential linear unit (ELU) function element-wise.
- Parameters:
- Return type:
- Returns:
ret – The input array with elu applied element-wise.
Examples
With ivy.Array input:

>>> x = ivy.array([0.39, -0.85])
>>> y = ivy.elu(x)
>>> print(y)
ivy.array([ 0.38999999, -0.57258511])

>>> x = ivy.array([1.5, 0.7, -2.4])
>>> y = ivy.zeros(3)
>>> ivy.elu(x, out=y)
>>> print(y)
ivy.array([ 1.5, 0.69999999, -0.90928203])

>>> x = ivy.array([[1.1, 2.2, 3.3],
...                [-4.4, -5.5, -6.6]])
>>> ivy.elu(x, out=x)
>>> print(x)
ivy.array([[ 1.10000002, 2.20000005, 3.29999995],
           [-0.98772264, -0.99591321, -0.99863964]])

With ivy.Container input:

>>> x = ivy.Container(a=ivy.array([0.0, -1.2]), b=ivy.array([0.4, -0.2]))
>>> x = ivy.elu(x, out=x)
>>> print(x)
{
    a: ivy.array([0., -0.69880581]),
    b: ivy.array([0.40000001, -0.18126924])
}
- ivy.hardshrink(x, /, *, lambd=0.5, out=None)[source]#
Apply the hardshrink function element-wise.
- Parameters:
- Return type:
- Returns:
ret – an array containing the hardshrink activation of each element in x.
Examples
With ivy.Array input:

>>> x = ivy.array([-1.0, 1.0, 2.0])
>>> y = ivy.hardshrink(x)
>>> print(y)
ivy.array([-1., 1., 2.])

>>> x = ivy.array([-1.0, 1.0, 2.0])
>>> y = x.hardshrink()
>>> print(y)
ivy.array([-1., 1., 2.])

>>> x = ivy.array([[-1.3, 3.8, 2.1], [1.7, 4.2, -6.6]])
>>> y = ivy.hardshrink(x)
>>> print(y)
ivy.array([[-1.29999995, 3.79999995, 2.0999999 ],
           [ 1.70000005, 4.19999981, -6.5999999 ]])
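Hardshrink is the standard shrinkage rule: pass x through where |x| > lambd, and output zero inside the band. A NumPy sketch of this standard definition (not Ivy's actual implementation):

```python
import numpy as np

def hardshrink(x, lambd=0.5):
    # Zero out values inside the [-lambd, lambd] band; pass the rest through.
    x = np.asarray(x, dtype=np.float64)
    return np.where(np.abs(x) > lambd, x, 0.0)

print(hardshrink([-1.0, 0.3, 2.0]))  # 0.3 lies inside the band, so it becomes 0
```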
- ivy.hardsilu(x, /, *, out=None)[source]#
Apply the hardsilu/hardswish function element-wise.
- Parameters:
- Return type:
- Returns:
an array containing the output of the hardsilu/hardswish function applied to each element in x.
Examples
With ivy.Array input:

>>> x = ivy.array([1., 2., 3.])
>>> y = ivy.hardsilu(x)
>>> print(y)
ivy.array([0.66666669, 1.66666663, 3.])

>>> x = ivy.array([-2.1241, 1.4897, 4.4090])
>>> y = ivy.zeros(3)
>>> ivy.hardsilu(x, out=y)
>>> print(y)
ivy.array([-0.31008321, 1.1147176 , 4.40899992])

With ivy.Container input:

>>> x = ivy.Container(a=ivy.array([-0.5, -1, 0]), b=ivy.array([0.5, 1., 2]))
>>> y = ivy.hardsilu(x)
>>> print(y)
{
    a: ivy.array([-0.20833333, -0.33333334, 0.]),
    b: ivy.array([0.29166666, 0.66666669, 1.66666663])
}
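Hardswish is usually defined as x * relu6(x + 3) / 6, i.e. x scaled by a piecewise-linear approximation of the sigmoid. A NumPy sketch of that standard formula (not Ivy's actual implementation) matches the values above:

```python
import numpy as np

def hardsilu(x):
    # hardswish: x times clip(x + 3, 0, 6) / 6, a cheap sigmoid surrogate.
    x = np.asarray(x, dtype=np.float64)
    return x * np.clip(x + 3.0, 0.0, 6.0) / 6.0

print(hardsilu([1.0, 2.0, 3.0]))  # ~[0.6667, 1.6667, 3.0]
```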
- ivy.hardtanh(x, /, *, max_val=1, min_val=-1, out=None)[source]#
Apply the hardtanh unit function element-wise.
- Parameters:
  - x (Union[Array, NativeArray]) – Input array.
  - min_val (float, default: -1) – minimum value of the linear region range. Default: -1.
  - max_val (float, default: 1) – maximum value of the linear region range. Default: 1.
  - out (Optional[Array], default: None) – optional output array, for writing the result to. It must have a shape that the inputs broadcast to.
- Return type:
- Returns:
ret – The input array with hardtanh applied element-wise.
Examples
With ivy.Array input:

>>> x = ivy.array([0.39, -0.85])
>>> y = ivy.hardtanh(x)
>>> print(y)
ivy.array([ 0.39, -0.85])

>>> x = ivy.array([1.5, 0.7, -2.4])
>>> y = ivy.zeros(3)
>>> ivy.hardtanh(x, out=y)
>>> print(y)
ivy.array([ 1., 0.7, -1.])

>>> x = ivy.array([[1.1, 2.2, 3.3], [-0.4, 0.5, -6.6]])
>>> ivy.hardtanh(x, out=x)
>>> print(x)
ivy.array([[ 1., 1., 1.],
           [-0.4, 0.5, -1.]])

With ivy.Container input:

>>> x = ivy.Container(a=ivy.array([0.0, -1.2]), b=ivy.array([0.4, -0.2]))
>>> x = ivy.hardtanh(x, out=x)
>>> print(x)
{
    a: ivy.array([0., -1.]),
    b: ivy.array([0.4, -0.2])
}
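Hardtanh is simply a clamp of the input to the [min_val, max_val] linear region. A one-line NumPy sketch of this standard behaviour (not Ivy's actual implementation):

```python
import numpy as np

def hardtanh(x, min_val=-1.0, max_val=1.0):
    # Linear inside [min_val, max_val], clamped to the bounds outside it.
    return np.clip(np.asarray(x, dtype=np.float64), min_val, max_val)

print(hardtanh([1.5, 0.7, -2.4]))  # values outside [-1, 1] are clamped
```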
- ivy.logit(x, /, *, eps=None, complex_mode='jax', out=None)[source]#
Compute the logit of x.
logit(x) = log(x / (1 - x)).
- Parameters:
  - x (Union[float, int, Array]) – Input data.
  - eps (Optional[float], default: None) – When eps is None, the function outputs NaN where x < 0 or x > 1, and inf or -inf where x = 1 or x = 0, respectively. Otherwise, x is clamped to [eps, 1 - eps].
  - complex_mode (Literal['split', 'magnitude', 'jax'], default: 'jax') – optional specifier for how to handle complex data types. See ivy.func_wrapper.handle_complex_input for more detail.
  - out (Optional[Array], default: None) – Optional output array.
- Return type:
- Returns:
ret – Array containing elementwise logits of x.
Examples
>>> x = ivy.array([1, 0, 0.9])
>>> z = ivy.logit(x)
>>> print(z)
ivy.array([ inf, -inf, 2.19722438])

>>> x = ivy.array([1, 2, -0.9])
>>> z = ivy.logit(x, eps=0.2)
>>> print(z)
ivy.array([ 1.38629448, 1.38629448, -1.38629436])
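The two examples above follow directly from the stated formula log(x / (1 - x)) and the eps clamping rule. A NumPy sketch of that behaviour (not Ivy's actual implementation):

```python
import numpy as np

def logit(x, eps=None):
    # log(x / (1 - x)); with eps set, x is first clamped to [eps, 1 - eps]
    # so the result is always finite.
    x = np.asarray(x, dtype=np.float64)
    if eps is not None:
        x = np.clip(x, eps, 1.0 - eps)
    with np.errstate(divide="ignore"):
        return np.log(x / (1.0 - x))

print(logit(np.array([1.0, 0.0, 0.9])))            # [inf, -inf, ~2.1972]
print(logit(np.array([1.0, 2.0, -0.9]), eps=0.2))  # clamped, all finite
```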
- ivy.logsigmoid(input, /, *, complex_mode='jax', out=None)[source]#
Apply the log-sigmoid function element-wise.
logsigmoid(x) = log(1 / (1 + exp(-x))).
- Parameters:
  - input (Union[NativeArray, Array]) – Input array.
  - complex_mode (Literal['split', 'magnitude', 'jax'], default: 'jax') – optional specifier for how to handle complex data types. See ivy.func_wrapper.handle_complex_input for more detail.
  - out (Optional[Array], default: None) – optional output array, for writing the result to.
- Return type:
- Returns:
Array with same shape as input with Log-sigmoid applied to every element.
Examples
With ivy.Array input:

>>> x = ivy.array([-1., 0., 1.])
>>> z = x.logsigmoid()
>>> print(z)
ivy.array([-1.31326175, -0.69314718, -0.31326169])

>>> x = ivy.array([1.5, 0.7, -2.4])
>>> z = x.logsigmoid()
>>> print(z)
ivy.array([-0.20141329, -0.40318608, -2.48683619])

With ivy.Container input:

>>> x = ivy.Container(a=ivy.array([1.0, -1.2]), b=ivy.array([0.2, 0.6]))
>>> x = ivy.logsigmoid(x)
>>> print(x)
{
    a: ivy.array([-0.31326169, -1.46328247]),
    b: ivy.array([-0.59813893, -0.43748799])
}
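Evaluating log(1 / (1 + exp(-x))) literally can overflow for large negative x, so frameworks typically use the algebraically equivalent form min(x, 0) - log1p(exp(-|x|)). A NumPy sketch of this standard stable formulation (not necessarily how Ivy computes it internally):

```python
import numpy as np

def logsigmoid(x):
    # Numerically stable log(sigmoid(x)): min(x, 0) - log1p(exp(-|x|))
    # avoids overflow in exp() for large |x|.
    x = np.asarray(x, dtype=np.float64)
    return np.minimum(x, 0.0) - np.log1p(np.exp(-np.abs(x)))

print(logsigmoid([-1.0, 0.0, 1.0]))  # ~[-1.3133, -0.6931, -0.3133]
```

Note that logsigmoid(-1000.0) returns roughly -1000 here, whereas the naive formula would overflow inside exp().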
- ivy.prelu(x, slope, /, *, out=None)[source]#
PReLU takes an input array x and a slope array, and produces an output array where f(x) = slope * x for x < 0 and f(x) = x for x >= 0 is applied element-wise. This operator supports unidirectional broadcasting (the slope array must be unidirectionally broadcastable to the input array x).
- Parameters:
- Return type:
- Returns:
ret – Array containing Parametrized relu values.
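The piecewise rule above can be sketched in a few lines of NumPy (not Ivy's actual implementation), with the slope broadcasting against x:

```python
import numpy as np

def prelu(x, slope):
    # f(x) = slope * x where x < 0, x otherwise; slope broadcasts against x.
    x = np.asarray(x, dtype=np.float64)
    return np.where(x < 0, np.asarray(slope, dtype=np.float64) * x, x)

print(prelu([-2.0, 3.0], 0.1))  # negative entries scaled by the slope
```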
- ivy.relu6(x, /, *, complex_mode='jax', out=None)[source]#
Apply the rectified linear unit 6 function element-wise.
- Parameters:
  - x (Union[Array, NativeArray]) – input array
  - complex_mode (Literal['split', 'magnitude', 'jax'], default: 'jax') – optional specifier for how to handle complex data types. See ivy.func_wrapper.handle_complex_input for more detail.
  - out (Optional[Array], default: None) – optional output array, for writing the result to. It must have a shape that the inputs broadcast to.
- Return type:
- Returns:
ret – an array containing the rectified linear unit 6 activation of each element in x.
Examples
With ivy.Array input:

>>> x = ivy.array([-1., 0., 1., 2., 3., 4., 5., 6., 7.])
>>> y = ivy.relu6(x)
>>> print(y)
ivy.array([0., 0., 1., 2., 3., 4., 5., 6., 6.])

>>> x = ivy.array([-1., 0., 1., 2., 3., 4., 5., 6., 7.])
>>> y = ivy.zeros(9)
>>> ivy.relu6(x, out=y)
>>> print(y)
ivy.array([0., 0., 1., 2., 3., 4., 5., 6., 6.])
- ivy.scaled_tanh(x, /, *, alpha=1.7159, beta=0.67, out=None)[source]#
Compute the scaled hyperbolic tangent (tanh) activation.
The scaled tanh activation function is defined as: out = alpha * tanh(beta * x)
- Parameters:
  - x (Union[Array, NativeArray]) – input array.
  - alpha (float, default: 1.7159) – The scaling parameter for the output. Determines the amplitude of the tanh function. Default: 1.7159.
  - beta (float, default: 0.67) – The scaling parameter for the input. Determines the slope of the tanh function. Default: 0.67.
  - out (Optional[Array], default: None) – optional output array, for writing the result to. It must have a shape that the inputs broadcast to.
- Return type:
- Returns:
ret – The input array after applying the scaled tanh activation.
Examples
With ivy.Array input:

>>> x = ivy.array([22.])
>>> y = ivy.scaled_tanh(x)
>>> y
ivy.array([1.71589994])

>>> x = ivy.array([4.0, 7.0])
>>> y = ivy.scaled_tanh(x, alpha=1.2, beta=5)
>>> y
ivy.array([1.20000005, 1.20000005])

With ivy.Container input:

>>> x = ivy.Container(a=ivy.array([1.2, -1.2]), b=ivy.array([4.4, -2.2]))
>>> y = ivy.scaled_tanh(x)
>>> y
{
    a: ivy.array([1.14324772, -1.14324772]),
    b: ivy.array([1.70648694, -1.54488957])
}

>>> x = ivy.Container(a=ivy.array([1.2]), b=ivy.array([4.4]))
>>> y = ivy.scaled_tanh(x, alpha=0.2, beta=0.5)
>>> y
{
    a: ivy.array([0.10740992]),
    b: ivy.array([0.19514863])
}
- ivy.selu(x, /, *, out=None)[source]#
Apply the scaled exponential linear unit function element-wise.
- Parameters:
- Return type:
- Returns:
ret – an array containing the scaled exponential linear unit activation of each element in x.
Examples
With ivy.Array input:

>>> x = ivy.array([-1., 0., 1., 2., 3., 4., 5., 6., 7.])
>>> y = ivy.selu(x)
>>> print(y)
ivy.array([-1.11133075, 0. , 1.05070102, 2.10140204, 3.15210295,
           4.20280409, 5.25350523, 6.30420589, 7.35490704])

>>> x = ivy.array([-1., 0., 1., 2., 3., 4., 5., 6., 7.])
>>> y = ivy.zeros(9)
>>> ivy.selu(x, out=y)
>>> print(y)
ivy.array([-1.11133075, 0. , 1.05070102, 2.10140204, 3.15210295,
           4.20280409, 5.25350523, 6.30420589, 7.35490704])

With ivy.Container input:

>>> x = ivy.Container(a=ivy.array([-3., -2., -1., 0., 1., 2., 3., 4., 5.]),
...                   b=ivy.array([1., 2., 3., 4., 5., 6., 7., 8., 9.]))
>>> x = ivy.selu(x, out=x)
>>> print(x)
{
    a: ivy.array([-1.6705687, -1.52016652, -1.11133075, 0., 1.05070102,
                  2.10140204, 3.15210295, 4.20280409, 5.25350523]),
    b: ivy.array([1.05070102, 2.10140204, 3.15210295, 4.20280409, 5.25350523,
                  6.30420589, 7.35490704, 8.40560818, 9.45630932])
}
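SELU is conventionally ELU scaled by two fixed constants (from Klambauer et al.'s self-normalizing networks), which is why selu(1.) above is ~1.0507 rather than 1. A NumPy sketch using those commonly quoted constants (not Ivy's actual implementation):

```python
import numpy as np

# Standard SELU constants, as used by most frameworks.
SELU_ALPHA = 1.6732632423543772
SELU_SCALE = 1.0507009873554805

def selu(x):
    # scale * x for positives, scale * alpha * (exp(x) - 1) for negatives.
    x = np.asarray(x, dtype=np.float64)
    return SELU_SCALE * np.where(x > 0, x, SELU_ALPHA * (np.exp(x) - 1.0))

print(selu([-1.0, 0.0, 1.0]))  # ~[-1.1113, 0.0, 1.0507]
```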
- ivy.silu(x, /, *, out=None)[source]#
Apply the silu function element-wise.
- Parameters:
- Return type:
- Returns:
ret – an array containing the silu activation of each element in x.
Examples
With ivy.Array input:

>>> x = ivy.array([-1.0, 1.0, 2.0])
>>> y = ivy.silu(x)
>>> print(y)
ivy.array([-0.2689, 0.7310, 1.7615])

>>> x = ivy.array([-1.0, 1.0, 2.0])
>>> y = x.silu()
>>> print(y)
ivy.array([-0.2689, 0.7310, 1.7615])

>>> x = ivy.array([[-1.3, 3.8, 2.1], [1.7, 4.2, -6.6]])
>>> y = ivy.silu(x)
>>> print(y)
ivy.array([[-0.2784, 3.7168, 1.8708],
           [ 1.4374, 4.1379, -0.0089]])
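SiLU (also called swish) is standardly defined as x * sigmoid(x), which simplifies to x / (1 + exp(-x)). A NumPy sketch of that formula (not Ivy's actual implementation) reproduces the values above:

```python
import numpy as np

def silu(x):
    # x * sigmoid(x), written as x / (1 + exp(-x))
    x = np.asarray(x, dtype=np.float64)
    return x / (1.0 + np.exp(-x))

print(silu([-1.0, 1.0, 2.0]))  # ~[-0.2689, 0.7311, 1.7616]
```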
- ivy.softshrink(x, /, *, lambd=0.5, out=None)[source]#
Apply the softshrink function element-wise.
- Parameters:
- Return type:
- Returns:
ret – an array containing the softshrink activation of each element in x.
Examples
With ivy.Array input:

>>> x = ivy.array([-1.0, 1.0, 2.0])
>>> y = ivy.softshrink(x)
>>> print(y)
ivy.array([-0.5, 0.5, 1.5])

>>> x = ivy.array([-1.0, 1.0, 2.0])
>>> y = x.softshrink()
>>> print(y)
ivy.array([-0.5, 0.5, 1.5])

>>> x = ivy.array([[-1.3, 3.8, 2.1], [1.7, 4.2, -6.6]])
>>> y = ivy.softshrink(x)
>>> print(y)
ivy.array([[-0.79999995, 3.29999995, 1.59999991],
           [ 1.20000005, 3.69999981, -6.0999999 ]])
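Unlike hardshrink, which passes out-of-band values through unchanged, softshrink also shifts them towards zero by lambd. A NumPy sketch of the standard definition (not Ivy's actual implementation):

```python
import numpy as np

def softshrink(x, lambd=0.5):
    # x - lambd above the band, x + lambd below it, zero inside it.
    x = np.asarray(x, dtype=np.float64)
    return np.where(x > lambd, x - lambd, np.where(x < -lambd, x + lambd, 0.0))

print(softshrink([-1.0, 1.0, 2.0]))  # shrunk by 0.5 towards zero
```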
- ivy.stanh(x, /, *, alpha=1.7159, beta=0.67, out=None)[source]#
Compute the scaled hyperbolic tangent (tanh) activation.
The scaled tanh activation function is defined as: out = alpha * tanh(beta * x)
- Parameters:
  - x (Union[Array, NativeArray]) – input array.
  - alpha (float, default: 1.7159) – The scaling parameter for the output. Determines the amplitude of the tanh function. Default: 1.7159.
  - beta (float, default: 0.67) – The scaling parameter for the input. Determines the slope of the tanh function. Default: 0.67.
  - out (Optional[Array], default: None) – optional output array, for writing the result to. It must have a shape that the inputs broadcast to.
- Return type:
- Returns:
ret – The input array after applying the scaled tanh activation.
Examples
With ivy.Array input:

>>> x = ivy.array([22.])
>>> y = ivy.scaled_tanh(x)
>>> y
ivy.array([1.71589994])

>>> x = ivy.array([4.0, 7.0])
>>> y = ivy.scaled_tanh(x, alpha=1.2, beta=5)
>>> y
ivy.array([1.20000005, 1.20000005])

With ivy.Container input:

>>> x = ivy.Container(a=ivy.array([1.2, -1.2]), b=ivy.array([4.4, -2.2]))
>>> y = ivy.scaled_tanh(x)
>>> y
{
    a: ivy.array([1.14324772, -1.14324772]),
    b: ivy.array([1.70648694, -1.54488957])
}

>>> x = ivy.Container(a=ivy.array([1.2]), b=ivy.array([4.4]))
>>> y = ivy.scaled_tanh(x, alpha=0.2, beta=0.5)
>>> y
{
    a: ivy.array([0.10740992]),
    b: ivy.array([0.19514863])
}
- ivy.tanhshrink(x, /, *, out=None)[source]#
Apply the tanhshrink function element-wise.
- Parameters:
- Return type:
- Returns:
ret – an array containing the tanhshrink activation of each element in x.
Examples
With ivy.Array input:

>>> x = ivy.array([-1.0, 1.0, 2.0])
>>> y = ivy.tanhshrink(x)
>>> print(y)
ivy.array([-0.23840582, 0.23840582, 1.03597236])

>>> x = ivy.array([-1.0, 1.0, 2.0])
>>> y = x.tanhshrink()
>>> print(y)
ivy.array([-0.23840582, 0.23840582, 1.03597236])

>>> x = ivy.array([[-1.3, 3.8, 2.1], [1.7, 4.2, -6.6]])
>>> y = ivy.tanhshrink(x)
>>> print(y)
ivy.array([[-0.43827677, 2.80100036, 1.12954807],
           [ 0.76459098, 3.20044947, -5.60000372]])
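Tanhshrink is standardly defined as x - tanh(x): near zero for small inputs, and approaching x minus/plus 1 for large |x|. A NumPy sketch of that formula (not Ivy's actual implementation):

```python
import numpy as np

def tanhshrink(x):
    # Subtract the tanh of the input from the input itself.
    x = np.asarray(x, dtype=np.float64)
    return x - np.tanh(x)

print(tanhshrink([-1.0, 1.0, 2.0]))  # ~[-0.2384, 0.2384, 1.0360]
```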
- ivy.threshold(x, /, *, threshold, value, out=None)[source]#
Apply the threshold function element-wise.
- Parameters:
- Return type:
- Returns:
ret – an array containing the threshold activation of each element in x.
Examples
With ivy.Array input:

>>> x = ivy.array([-1.0, 1.0, 2.0])
>>> y = ivy.threshold(x, value=0.0, threshold=1.5)
>>> print(y)
ivy.array([0., 0., 2.])

>>> x = ivy.array([-1.0, 1.0, 2.0])
>>> y = x.threshold(value=0.0, threshold=1.5)
>>> print(y)
ivy.array([0., 0., 2.])

>>> x = ivy.array([[-1.3, 3.8, 2.1], [1.7, 4.2, -6.6]])
>>> y = ivy.threshold(x, value=0.0, threshold=1.5)
>>> print(y)
ivy.array([[0. , 3.79999995, 2.0999999 ],
           [1.70000005, 4.19999981, 0. ]])
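The threshold rule keeps each element that exceeds the threshold and substitutes the given value everywhere else. A NumPy sketch of this standard behaviour (not Ivy's actual implementation):

```python
import numpy as np

def threshold(x, *, threshold, value):
    # Keep x where it exceeds the threshold, otherwise substitute `value`.
    x = np.asarray(x, dtype=np.float64)
    return np.where(x > threshold, x, value)

print(threshold([-1.0, 1.0, 2.0], threshold=1.5, value=0.0))  # only 2.0 survives
```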
- ivy.thresholded_relu(x, /, *, threshold=0, out=None)[source]#
Apply the rectified linear unit function with custom threshold.
- Parameters:
- Return type:
- Returns:
ret – an array containing the rectified linear unit activation of each element in x, with custom threshold.
Examples
With ivy.Array input:

>>> x = ivy.array([-1., 0., 1.])
>>> y = ivy.thresholded_relu(x, threshold=0.5)
>>> print(y)
ivy.array([0., 0., 1.])

>>> x = ivy.array([1.5, 0.7, -2.4])
>>> y = ivy.zeros(3)
>>> ivy.thresholded_relu(x, threshold=1, out=y)
>>> print(y)
ivy.array([ 1.5, 0., 0.])

With ivy.Container input:

>>> x = ivy.Container(a=ivy.array([1.0, -1.2]), b=ivy.array([0.2, 0.6]))
>>> x = ivy.thresholded_relu(x, threshold=0.5)
>>> print(x)
{
    a: ivy.array([1., 0.]),
    b: ivy.array([0., 0.6])
}
This should have given you an overview of the activations submodule. If you have any questions, please feel free to reach out on our Discord!