Losses#
Collection of Ivy loss functions.
- ivy.binary_cross_entropy(true, pred, /, *, from_logits=False, epsilon=0.0, reduction='mean', pos_weight=None, axis=None, out=None)[source]#
Compute the binary cross entropy loss.
- Parameters:
  true (Union[Array, NativeArray]) – input array containing true labels.
  pred (Union[Array, NativeArray]) – input array containing predicted labels.
  from_logits (bool, default: False) – whether pred is expected to be a logits tensor. By default, we assume that pred encodes a probability distribution.
  epsilon (float, default: 0.0) – a float in [0.0, 1.0] specifying the amount of smoothing when calculating the loss. If epsilon is 0, no smoothing will be applied. Default: 0.0.
  reduction (str, default: 'mean') – 'none': no reduction will be applied to the output. 'mean': the output will be averaged. 'sum': the output will be summed. Default: 'mean'.
  pos_weight (Optional[Union[Array, NativeArray]], default: None) – a weight for positive examples. Must be an array with length equal to the number of classes.
  axis (Optional[int], default: None) – axis along which to compute the cross-entropy.
  out (Optional[Array], default: None) – optional output array, for writing the result to. It must have a shape that the inputs broadcast to.
- Return type:
  Array
- Returns:
ret – The binary cross entropy between the given distributions.
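For reference, with from_logits=False, epsilon=0, and no pos_weight, the per-element loss follows the standard binary cross-entropy definition below; 'mean' reduction then averages over all elements (ivy's epsilon smoothing and pos_weight handling modify this accordingly):

\[
\ell_i = -\bigl[\, y_i \log \hat{y}_i + (1 - y_i) \log (1 - \hat{y}_i) \,\bigr],
\qquad
\mathcal{L}_{\text{mean}} = \frac{1}{N} \sum_{i=1}^{N} \ell_i
\]

where \(y_i\) are the true labels and \(\hat{y}_i\) the predicted probabilities. The first example below evaluates to exactly this quantity.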
Examples
With ivy.Array input:

>>> x = ivy.array([0, 1, 0, 0])
>>> y = ivy.array([0.2, 0.8, 0.3, 0.8])
>>> z = ivy.binary_cross_entropy(x, y)
>>> print(z)
ivy.array(0.60309976)
>>> x = ivy.array([[0, 1, 1, 0]])
>>> y = ivy.array([[2.6, 6.2, 3.7, 5.3]])
>>> z = ivy.binary_cross_entropy(x, y, reduction='mean')
>>> print(z)
ivy.array(7.6666193)
>>> x = ivy.array([[0, 1, 1, 0]])
>>> y = ivy.array([[2.6, 6.2, 3.7, 5.3]])
>>> pos_weight = ivy.array([1, 2, 3, 4])
>>> z = ivy.binary_cross_entropy(x, y, pos_weight=pos_weight, from_logits=True)
>>> print(z)
ivy.array(2.01348412)
>>> x = ivy.array([[0, 1, 1, 0]])
>>> y = ivy.array([[2.6, 6.2, 3.7, 5.3]])
>>> pos_weight = ivy.array([1, 2, 3, 4])
>>> z = ivy.binary_cross_entropy(x, y, pos_weight=pos_weight, from_logits=True, reduction='sum', axis=1)
>>> print(z)
ivy.array([8.05393649])
>>> x = ivy.array([[0, 1, 1, 0]])
>>> y = ivy.array([[2.6, 6.2, 3.7, 5.3]])
>>> z = ivy.binary_cross_entropy(x, y, reduction='none', epsilon=0.5)
>>> print(z)
ivy.array([[11.49992943, 3.83330965, 3.83330965, 11.49992943]])
>>> x = ivy.array([[0, 1, 0, 0]])
>>> y = ivy.array([[0.6, 0.2, 0.7, 0.3]])
>>> z = ivy.binary_cross_entropy(x, y, epsilon=1e-3)
>>> print(z)
ivy.array(1.02136981)
With ivy.NativeArray input:

>>> x = ivy.native_array([0, 1, 0, 1])
>>> y = ivy.native_array([0.2, 0.7, 0.2, 0.6])
>>> z = ivy.binary_cross_entropy(x, y)
>>> print(z)
ivy.array(0.32844672)
With a mix of ivy.Array and ivy.NativeArray inputs:

>>> x = ivy.array([0, 0, 1, 1])
>>> y = ivy.native_array([0.1, 0.2, 0.8, 0.6])
>>> z = ivy.binary_cross_entropy(x, y)
>>> print(z)
ivy.array(0.26561815)
With ivy.Container input:

>>> x = ivy.Container(a=ivy.array([1, 0, 0]), b=ivy.array([0, 0, 1]))
>>> y = ivy.Container(a=ivy.array([0.6, 0.2, 0.3]), b=ivy.array([0.8, 0.2, 0.2]))
>>> z = ivy.binary_cross_entropy(x, y)
>>> print(z)
{
    a: ivy.array(0.36354783),
    b: ivy.array(1.14733934)
}
With a mix of ivy.Array and ivy.Container inputs:

>>> x = ivy.array([1, 1, 0])
>>> y = ivy.Container(a=ivy.array([0.7, 0.8, 0.2]))
>>> z = ivy.binary_cross_entropy(x, y)
>>> print(z)
{
    a: ivy.array(0.26765382)
}
Instance Method Examples

Using ivy.Array instance method:

>>> x = ivy.array([1, 0, 0, 0])
>>> y = ivy.array([0.8, 0.2, 0.2, 0.2])
>>> z = x.binary_cross_entropy(y)
>>> print(z)
ivy.array(0.22314337)
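As a quick sanity check, the instance-method result above can be reproduced with plain NumPy using the standard formula (assuming from_logits=False, no smoothing, and 'mean' reduction; this is a cross-check, not ivy's internal implementation):

>>> import numpy as np
>>> x = np.array([1, 0, 0, 0], dtype=float)  # true labels
>>> y = np.array([0.8, 0.2, 0.2, 0.2])       # predicted probabilities
>>> bce = -np.mean(x * np.log(y) + (1 - x) * np.log(1 - y))
>>> print(round(float(bce), 6))
0.223144

This agrees with ivy.array(0.22314337) up to float32 precision.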
- ivy.cross_entropy(true, pred, /, *, axis=None, epsilon=1e-07, reduction='mean', out=None)[source]#
Compute cross-entropy between predicted and true discrete distributions.
- Parameters:
  true (Union[Array, NativeArray]) – input array containing true labels.
  pred (Union[Array, NativeArray]) – input array containing the predicted labels.
  axis (Optional[int], default: None) – the axis along which to compute the cross-entropy. If axis is -1, the cross-entropy will be computed along the last dimension. Default: None.
  epsilon (float, default: 1e-07) – a float in [0.0, 1.0] specifying the amount of smoothing when calculating the loss. If epsilon is 0, no smoothing will be applied. Default: 1e-7.
  reduction (str, default: 'mean') – 'none': no reduction will be applied to the output. 'mean': the output will be averaged. 'sum': the output will be summed. Default: 'mean'.
  out (Optional[Array], default: None) – optional output array, for writing the result to. It must have a shape that the inputs broadcast to.
- Return type:
  Array
- Returns:
ret – The cross-entropy loss between the given distributions.
Examples
>>> x = ivy.array([0, 0, 1, 0])
>>> y = ivy.array([0.25, 0.25, 0.25, 0.25])
>>> print(ivy.cross_entropy(x, y))
ivy.array(0.34657359)
>>> z = ivy.array([0.1, 0.1, 0.7, 0.1])
>>> print(ivy.cross_entropy(x, z))
ivy.array(0.08916873)
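With 'mean' reduction and a negligible epsilon, the result is simply -(1/N) * sum(true * log(pred)). A minimal NumPy cross-check of the first example above (a sketch for verification, not ivy's internal implementation):

>>> import numpy as np
>>> x = np.array([0, 0, 1, 0], dtype=float)
>>> y = np.array([0.25, 0.25, 0.25, 0.25])
>>> print(round(float(-np.mean(x * np.log(y))), 6))
0.346574

This matches ivy.array(0.34657359) above.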
- ivy.sparse_cross_entropy(true, pred, /, *, axis=-1, epsilon=1e-07, reduction='mean', out=None)[source]#
Compute the sparse cross-entropy loss between predicted probabilities and integer class labels.
- Parameters:
  true (Union[Array, NativeArray]) – input array containing the true labels (integer class indices).
  pred (Union[Array, NativeArray]) – input array containing the predicted labels (probabilities).
  axis (int, default: -1) – the axis along which to compute the cross-entropy. If axis is -1, the cross-entropy will be computed along the last dimension. Default: -1.
  epsilon (float, default: 1e-07) – a float in [0.0, 1.0] specifying the amount of smoothing when calculating the loss. If epsilon is 0, no smoothing will be applied. Default: 1e-7.
  reduction (str, default: 'mean') – 'none': no reduction will be applied to the output. 'mean': the output will be averaged. 'sum': the output will be summed. Default: 'mean'.
  out (Optional[Array], default: None) – optional output array, for writing the result to. It must have a shape that the inputs broadcast to.
- Return type:
  Array
- Returns:
ret – The sparse cross-entropy loss between the given distributions.
Examples
With ivy.Array input:

>>> x = ivy.array([2])
>>> y = ivy.array([0.1, 0.1, 0.7, 0.1])
>>> print(ivy.sparse_cross_entropy(x, y))
ivy.array([0.08916873])
>>> x = ivy.array([3])
>>> y = ivy.array([0.1, 0.1, 0.7, 0.1])
>>> print(ivy.cross_entropy(x, y))
ivy.array(5.44832274)
>>> x = ivy.array([2, 3])
>>> y = ivy.array([0.1, 0.1])
>>> print(ivy.cross_entropy(x, y))
ivy.array(5.75646281)
With ivy.NativeArray input:

>>> x = ivy.native_array([4])
>>> y = ivy.native_array([0.1, 0.2, 0.1, 0.1, 0.5])
>>> print(ivy.sparse_cross_entropy(x, y))
ivy.array([0.13862944])
With ivy.Container input:

>>> x = ivy.Container(a=ivy.array([4]))
>>> y = ivy.Container(a=ivy.array([0.1, 0.2, 0.1, 0.1, 0.5]))
>>> print(ivy.sparse_cross_entropy(x, y))
{
    a: ivy.array([0.13862944])
}
With a mix of ivy.Array and ivy.NativeArray inputs:

>>> x = ivy.array([0])
>>> y = ivy.native_array([0.1, 0.2, 0.6, 0.1])
>>> print(ivy.sparse_cross_entropy(x, y))
ivy.array([0.57564628])
With a mix of ivy.Array and ivy.Container inputs:

>>> x = ivy.array([0])
>>> y = ivy.Container(a=ivy.array([0.1, 0.2, 0.6, 0.1]))
>>> print(ivy.sparse_cross_entropy(x, y))
{
    a: ivy.array([0.57564628])
}
Instance Method Examples

With ivy.Array input:

>>> x = ivy.array([2])
>>> y = ivy.array([0.1, 0.1, 0.7, 0.1])
>>> print(x.sparse_cross_entropy(y))
ivy.array([0.08916873])
With ivy.Container input:

>>> x = ivy.Container(a=ivy.array([2]))
>>> y = ivy.Container(a=ivy.array([0.1, 0.1, 0.7, 0.1]))
>>> print(x.sparse_cross_entropy(y))
{
    a: ivy.array([0.08916873])
}
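As a sanity check, sparse cross-entropy over class indices matches dense cross-entropy over the corresponding one-hot labels. A minimal NumPy sketch reproducing the instance-method example above (assuming 'mean' reduction and negligible epsilon; for verification only):

>>> import numpy as np
>>> y = np.array([0.1, 0.1, 0.7, 0.1])
>>> one_hot = np.eye(4)[2]  # class index 2 -> [0., 0., 1., 0.]
>>> print(round(float(-np.mean(one_hot * np.log(y))), 6))
0.089169

This matches ivy.array([0.08916873]) above.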
- ivy.ssim_loss(true, pred, out=None)[source]#
Calculate the Structural Similarity Index (SSIM) loss between two images.
- Parameters:
  true (ivy.Array) – A 4D image array of shape (batch_size, channels, height, width).
  pred (ivy.Array) – A 4D image array of shape (batch_size, channels, height, width).
- Return type:
  Array
- Returns:
ivy.Array: The SSIM loss, measuring the similarity between the two images.
Examples
With ivy.Array input:

>>> import ivy
>>> x = ivy.ones((5, 3, 28, 28))
>>> y = ivy.zeros((5, 3, 28, 28))
>>> ivy.ssim_loss(x, y)
ivy.array(0.99989986)
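For reference, SSIM between two image patches x and y is conventionally defined as below, with the loss typically taken as one minus the mean SSIM over the image. This is the standard definition; the window size and stability constants C1, C2 used internally by ivy.ssim_loss are implementation details:

\[
\mathrm{SSIM}(x, y) = \frac{(2 \mu_x \mu_y + C_1)(2 \sigma_{xy} + C_2)}{(\mu_x^2 + \mu_y^2 + C_1)(\sigma_x^2 + \sigma_y^2 + C_2)}
\]

Two constant images with different values, as in the example above, have near-zero SSIM, hence a loss close to 1.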
- ivy.wasserstein_loss_discriminator(p_real, p_fake, out=None)[source]#
Compute the Wasserstein loss for the discriminator (critic).
- Parameters:
  p_real (ivy.Array) – discriminator (critic) scores for real samples.
  p_fake (ivy.Array) – discriminator (critic) scores for generated samples.
- Return type:
  Array
- Returns:
ivy.Array: Wasserstein loss for the discriminator.
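No example is given above, so here is a minimal NumPy sketch of the conventional WGAN critic loss, mean(D(fake)) - mean(D(real)). This is the standard formulation, not necessarily ivy's exact reduction, and the score values below are made up for illustration:

>>> import numpy as np
>>> p_real = np.array([2.0, 3.0])  # critic scores for real samples
>>> p_fake = np.array([0.5, 1.5])  # critic scores for generated samples
>>> print(p_fake.mean() - p_real.mean())
-1.5

The critic is trained to make this quantity as negative as possible, i.e. to separate the scores of real and generated samples.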
- ivy.wasserstein_loss_generator(pred_fake, out=None)[source]#
Compute the Wasserstein loss for the generator.
- Parameters:
  pred_fake (ivy.Array) – discriminator (critic) scores for generated samples.
- Return type:
  Array
- Returns:
ivy.Array: Wasserstein loss for the generator.
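Correspondingly, the standard WGAN generator loss is the negated mean critic score on generated samples (again the conventional formulation with made-up scores, not necessarily ivy's exact reduction):

>>> import numpy as np
>>> pred_fake = np.array([0.5, 1.5])  # critic scores for generated samples
>>> print(-pred_fake.mean())
-1.0

The generator is trained to minimize this, i.e. to raise the critic's scores on its samples.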
This should hopefully have given you an overview of the losses submodule. If you have any questions, please feel free to reach out on our Discord!