Activations
Collection of Ivy neural network activations as stateful classes.
-
class ivy.stateful.activations.ELU(*args, **kwargs)[source]
Bases: Module
-
__init__(alpha=1.0)[source]
Apply the ELU activation function.
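The formula this class wraps can be sketched in pure scalar Python (the standard ELU definition, not the Ivy implementation, which operates on arrays):

```python
import math

def elu(x, alpha=1.0):
    # ELU: identity for positive inputs, alpha * (exp(x) - 1) otherwise
    return x if x > 0 else alpha * (math.exp(x) - 1.0)
```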
-
class ivy.stateful.activations.GEGLU(*args, **kwargs)[source]
Bases: Module
-
__init__()[source]
Apply the GEGLU activation function.
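GEGLU gates one half of the input with a GELU of the other half. A scalar-list sketch of the standard formulation, assuming the input is split in half along its last axis (the exact split convention is an implementation detail of Ivy, so treat this as illustrative):

```python
import math

def gelu(x):
    # exact GELU: x * Phi(x), with Phi the standard normal CDF
    return x * 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def geglu(v):
    # split the input in half: one half passes through linearly,
    # the other half gates it via GELU
    half = len(v) // 2
    x, gate = v[:half], v[half:]
    return [xi * gelu(gi) for xi, gi in zip(x, gate)]
```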
-
class ivy.stateful.activations.GELU(*, approximate=False, complex_mode='jax')[source]
Bases: Module
-
__init__(*, approximate=False, complex_mode='jax')[source]
Apply the GELU activation function.
- Parameters:
approximate (bool, default: False) – whether to use the gelu approximation algorithm or the exact formulation.
complex_mode (Literal['split', 'magnitude', 'jax'], default: 'jax') – Specifies how to handle complex input. See ivy.func_wrapper.handle_complex_input for more detail.
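The difference between the two settings of approximate can be sketched in scalar Python (the exact Gaussian-CDF form versus the common tanh approximation; a sketch of the math, not the Ivy implementation):

```python
import math

def gelu(x, approximate=False):
    if approximate:
        # tanh approximation of GELU
        return 0.5 * x * (1.0 + math.tanh(math.sqrt(2.0 / math.pi) * (x + 0.044715 * x ** 3)))
    # exact formulation via the Gaussian CDF
    return x * 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))
```

The two forms agree closely for moderate inputs, which is why the approximation is often used for speed.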
-
class ivy.stateful.activations.Hardswish(complex_mode='jax')[source]
Bases: Module
-
__init__(complex_mode='jax')[source]
Apply the HARDSWISH activation function.
- Parameters:
complex_mode (Literal['split', 'magnitude', 'jax'], default: 'jax') – Specifies how to handle complex input. See ivy.func_wrapper.handle_complex_input for more detail.
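Hardswish is a piecewise-polynomial approximation of SiLU. A scalar sketch of the standard formula (not the Ivy implementation):

```python
def hardswish(x):
    # hardswish(x) = x * ReLU6(x + 3) / 6
    return x * min(max(x + 3.0, 0.0), 6.0) / 6.0
```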
-
class ivy.stateful.activations.LeakyReLU(alpha=0.2, complex_mode='jax')[source]
Bases: Module
-
__init__(alpha=0.2, complex_mode='jax')[source]
Apply the LEAKY RELU activation function.
- Parameters:
alpha (float, default: 0.2) – Negative slope for ReLU.
complex_mode (Literal['split', 'magnitude', 'jax'], default: 'jax') – Specifies how to handle complex input. See ivy.func_wrapper.handle_complex_input for more detail.
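The role of alpha can be sketched in scalar Python (the standard leaky ReLU definition, not the Ivy implementation):

```python
def leaky_relu(x, alpha=0.2):
    # identity for positive inputs, alpha-scaled otherwise
    return x if x > 0 else alpha * x
```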
-
class ivy.stateful.activations.LogSigmoid(complex_mode='jax')[source]
Bases: Module
-
__init__(complex_mode='jax')[source]
Apply the LogSigmoid activation function.
- Parameters:
complex_mode (Literal['split', 'magnitude', 'jax'], default: 'jax') – Specifies how to handle complex input. See ivy.func_wrapper.handle_complex_input for more detail.
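LogSigmoid computes log(sigmoid(x)). A numerically stable scalar sketch of that formula (a common stabilization trick, not necessarily how Ivy implements it):

```python
import math

def log_sigmoid(x):
    # log(sigmoid(x)) = -log(1 + exp(-x)); branch on the sign of x
    # so the exp() argument is always non-positive (avoids overflow)
    return -math.log1p(math.exp(-x)) if x >= 0 else x - math.log1p(math.exp(x))
```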
-
class ivy.stateful.activations.LogSoftmax(axis=-1, complex_mode='jax')[source]
Bases: Module
-
__init__(axis=-1, complex_mode='jax')[source]
Apply the LOG SOFTMAX activation function.
- Parameters:
axis (Optional[int], default: -1) – The dimension log_softmax would be performed on.
complex_mode (Literal['split', 'magnitude', 'jax'], default: 'jax') – optional specifier for how to handle complex data types. See ivy.func_wrapper.handle_complex_input for more detail.
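A list-based sketch of the log-softmax formula with the usual max-subtraction stabilization (a sketch of the math over one axis, not the Ivy implementation):

```python
import math

def log_softmax(xs):
    # subtract the max for numerical stability, then x - log(sum(exp(x)))
    m = max(xs)
    lse = m + math.log(sum(math.exp(x - m) for x in xs))
    return [x - lse for x in xs]
```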
-
class ivy.stateful.activations.Logit(*args, **kwargs)[source]
Bases: Module
-
__init__(eps=None, complex_mode='jax')[source]
Apply the LOGIT activation function.
- Parameters:
eps (default: None) – The epsilon value for the logit formation.
complex_mode (default: 'jax') – optional specifier for how to handle complex data types. See ivy.func_wrapper.handle_complex_input for more detail.
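Logit is the inverse of the sigmoid, log(x / (1 - x)). A scalar sketch, assuming eps clips the input into [eps, 1 - eps] before the log (a common convention; hedged, since the source does not spell out the clipping behavior):

```python
import math

def logit(x, eps=None):
    # optionally clip x away from 0 and 1 so the log stays finite
    if eps is not None:
        x = min(max(x, eps), 1.0 - eps)
    return math.log(x / (1.0 - x))
```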
-
class ivy.stateful.activations.Mish(*args, **kwargs)[source]
Bases: Module
-
__init__()[source]
Apply the MISH activation function.
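The Mish formula can be sketched in scalar Python (the standard definition x * tanh(softplus(x)), not the Ivy implementation):

```python
import math

def mish(x):
    # mish(x) = x * tanh(softplus(x)), with softplus(x) = log(1 + exp(x))
    return x * math.tanh(math.log1p(math.exp(x)))
```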
-
class ivy.stateful.activations.PReLU(*args, **kwargs)[source]
Bases: Module
-
__init__(slope)[source]
Apply the PRELU activation function.
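PReLU is a leaky ReLU whose negative slope is a learnable parameter. A scalar sketch of the formula (in Ivy the slope is a trainable array, not a plain float as here):

```python
def prelu(x, slope):
    # identity for positive inputs; the learnable slope scales negatives
    return x if x > 0 else slope * x
```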
-
class ivy.stateful.activations.ReLU(complex_mode='jax')[source]
Bases: Module
-
__init__(complex_mode='jax')[source]
Apply the RELU activation function.
- Parameters:
complex_mode (Literal['split', 'magnitude', 'jax'], default: 'jax') – Specifies how to handle complex input. See ivy.func_wrapper.handle_complex_input for more detail.
-
class ivy.stateful.activations.ReLU6(complex_mode='jax')[source]
Bases: Module
-
__init__(complex_mode='jax')[source]
Apply the RELU6 activation function.
- Parameters:
complex_mode (Literal['split', 'magnitude', 'jax'], default: 'jax') – Specifies how to handle complex input. See ivy.func_wrapper.handle_complex_input for more detail.
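ReLU6 is a ReLU clamped from above at 6. A scalar sketch of the formula (not the Ivy implementation):

```python
def relu6(x):
    # clamp the input to the range [0, 6]
    return min(max(x, 0.0), 6.0)
```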
-
class ivy.stateful.activations.SeLU(*args, **kwargs)[source]
Bases: Module
-
__init__()[source]
Apply the SELU activation function.
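SELU is a scaled ELU with fixed constants chosen for self-normalization. A scalar sketch using the standard published constants (not the Ivy implementation):

```python
import math

def selu(x, alpha=1.6732632423543772, scale=1.0507009873554805):
    # scale * (x if x > 0 else alpha * (exp(x) - 1))
    return scale * (x if x > 0 else alpha * (math.exp(x) - 1.0))
```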
-
class ivy.stateful.activations.SiLU(*args, **kwargs)[source]
Bases: Module
-
__init__()[source]
Apply the SiLU activation function.
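SiLU (also known as swish) multiplies the input by its sigmoid. A scalar sketch of the formula (not the Ivy implementation):

```python
import math

def silu(x):
    # silu(x) = x * sigmoid(x)
    return x / (1.0 + math.exp(-x))
```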
-
class ivy.stateful.activations.Sigmoid(complex_mode='jax')[source]
Bases: Module
-
__init__(complex_mode='jax')[source]
Apply the SIGMOID activation function.
- Parameters:
complex_mode (Literal['split', 'magnitude', 'jax'], default: 'jax') – Specifies how to handle complex input. See ivy.func_wrapper.handle_complex_input for more detail.
-
class ivy.stateful.activations.Softmax(axis=-1, complex_mode='jax')[source]
Bases: Module
-
__init__(axis=-1, complex_mode='jax')[source]
Apply the SOFTMAX activation function.
- Parameters:
axis (int, default: -1) – The axis along which softmax is applied.
complex_mode (Literal['split', 'magnitude', 'jax'], default: 'jax') – Specifies how to handle complex input. See ivy.func_wrapper.handle_complex_input for more detail.
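A list-based sketch of the softmax formula over one axis, with the usual max-subtraction for numerical stability (a sketch of the math, not the Ivy implementation):

```python
import math

def softmax(xs):
    # shift by the max so exp() never overflows, then normalize
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]
```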
-
class ivy.stateful.activations.Softplus(*args, **kwargs)[source]
Bases: Module
-
__init__(beta=1.0, threshold=None)[source]
Apply the SOFTPLUS activation function.
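A scalar sketch of the softplus formula with its beta and threshold parameters, assuming PyTorch-style semantics where the function reverts to linear once beta * x exceeds the threshold (hedged: the source does not document these parameters, so this is illustrative):

```python
import math

def softplus(x, beta=1.0, threshold=None):
    # softplus(x) = (1 / beta) * log(1 + exp(beta * x));
    # above the threshold the curve is effectively linear, so return x
    if threshold is not None and x * beta > threshold:
        return x
    return math.log1p(math.exp(beta * x)) / beta
```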
-
class ivy.stateful.activations.Tanh(complex_mode='jax')[source]
Bases: Module
-
__init__(complex_mode='jax')[source]
Apply the TANH activation function.
- Parameters:
complex_mode (Literal['split', 'magnitude', 'jax'], default: 'jax') – Specifies how to handle complex input. See ivy.func_wrapper.handle_complex_input for more detail.
This should have given you an overview of the activations submodule. If you have any questions, please feel free to reach out on our Discord!