poisson_nll_loss#
- ivy.poisson_nll_loss(input, target, *, log_input=True, full=False, eps=1e-08, reduction='mean')[source]#
Compute the Poisson Negative Log Likelihood Loss.
This function calculates the negative log likelihood loss between the `input` and `target` under the assumption that the target follows a Poisson distribution. By default, the loss is not the exact loss, but the loss minus a constant term [log(z!)]. This omission does not affect optimization but can be significant for relative loss comparisons. Stirling's approximation is used to approximate the log-factorial term when `full` is set to True.
- Parameters:
    - input (Union[Array, NativeArray]) – Expectation of the underlying Poisson distribution.
    - target (Union[Array, NativeArray]) – Random sample from the Poisson distribution described by the input.
    - log_input (bool, default: True) – If True, the loss is computed as \(exp(input) - target * input\). If False, the loss is computed as \(input - target * log(input + eps)\). Default is True.
    - full (bool, default: False) – Whether to compute the full loss, i.e., to add the Stirling approximation term \(target * log(target) - target + 0.5 * log(2 * pi * target)\). Default is False.
    - eps (float, default: 1e-08) – Small value to prevent evaluation of log(0) when log_input is False. Default is 1e-8.
    - reduction (str, default: 'mean') – Specifies the reduction applied to the output. Options are ‘none’, ‘mean’, or ‘sum’. ‘none’: no reduction will be applied. ‘mean’: the output will be averaged. ‘sum’: the output will be summed. Default is ‘mean’.
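As a plain-Python sketch (not the ivy implementation), the Stirling approximation term added when `full=True` can be checked against the exact log-factorial it stands in for; note that frameworks such as PyTorch add this term only where `target > 1`:

```python
import math

def stirling_term(target):
    # Stirling approximation of log(target!), as stated above:
    # target * log(target) - target + 0.5 * log(2 * pi * target)
    return target * math.log(target) - target + 0.5 * math.log(2 * math.pi * target)

# For target = 6, the approximation is close to the exact log(6!) = log(720):
approx = stirling_term(6.0)
exact = math.log(math.factorial(6))
print(round(approx, 4), round(exact, 4))  # ≈ 6.5654 vs 6.5793
```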
- Return type:
    Array
- Returns:
    ret – The Poisson Negative Log Likelihood Loss: an array of the same shape as the input when reduction is ‘none’, otherwise a scalar.
- Raises:
ValueError – If the input and target tensors do not have the same shape.
Examples
>>> input_tensor = ivy.array([1, 2, 3, 4], dtype=ivy.float64)
>>> target_tensor = ivy.array([2, 2, 2, 2], dtype=ivy.float64)
>>> loss = ivy.poisson_nll_loss(input_tensor, target_tensor, log_input=False)
>>> print(loss)
ivy.array(0.91097307)
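The printed value can be reproduced with plain Python, assuming the `log_input=False` formula and `reduction='mean'` behavior described above (a sketch, not the ivy implementation):

```python
import math

inp = [1.0, 2.0, 3.0, 4.0]
tgt = [2.0, 2.0, 2.0, 2.0]
eps = 1e-8

# log_input=False branch: loss_i = input_i - target_i * log(input_i + eps)
losses = [x - t * math.log(x + eps) for x, t in zip(inp, tgt)]
mean_loss = sum(losses) / len(losses)
print(round(mean_loss, 4))  # ≈ 0.911, matching ivy.array(0.91097307) above
```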
- Array.poisson_nll_loss(self, target, *, log_input=True, full=False, eps=1e-08, reduction='mean')[source]#
Compute the Poisson Negative Log Likelihood Loss.
This function calculates the negative log likelihood loss between the `input` and `target` under the assumption that the target follows a Poisson distribution. By default, the loss is not the exact loss, but the loss minus a constant term [log(z!)]. This omission does not affect optimization but can be significant for relative loss comparisons. Stirling's approximation is used to approximate the log-factorial term when `full` is set to True.
- Parameters:
    - input – Expectation of the underlying Poisson distribution.
    - target (Union[Array, NativeArray]) – Random sample from the Poisson distribution described by the input.
    - log_input (bool, default: True) – If True, the loss is computed as \(exp(input) - target * input\). If False, the loss is computed as \(input - target * log(input + eps)\). Default is True.
    - full (bool, default: False) – Whether to compute the full loss, i.e., to add the Stirling approximation term \(target * log(target) - target + 0.5 * log(2 * pi * target)\). Default is False.
    - eps (float, default: 1e-08) – Small value to prevent evaluation of log(0) when log_input is False. Default is 1e-8.
    - reduction (str, default: 'mean') – Specifies the reduction applied to the output. Options are ‘none’, ‘mean’, or ‘sum’. ‘none’: no reduction will be applied. ‘mean’: the output will be averaged. ‘sum’: the output will be summed. Default is ‘mean’.
- Return type:
Array
- Returns:
ret – The Poisson Negative Log Likelihood Loss: an array of the same shape as the input when reduction is ‘none’, otherwise a scalar.
- Raises:
ValueError – If the input and target tensors do not have the same shape.
Examples
>>> input_tensor = ivy.array([1, 2, 3, 4], dtype=ivy.float64)
>>> target_tensor = ivy.array([2, 2, 2, 2], dtype=ivy.float64)
>>> loss = input_tensor.poisson_nll_loss(target_tensor, log_input=True)
>>> print(loss)
ivy.array(16.1977562)
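As with the `log_input=False` case, this result follows directly from the `log_input=True` formula stated above; a plain-Python sketch (not the ivy implementation):

```python
import math

inp = [1.0, 2.0, 3.0, 4.0]
tgt = [2.0, 2.0, 2.0, 2.0]

# log_input=True branch: loss_i = exp(input_i) - target_i * input_i
losses = [math.exp(x) - t * x for x, t in zip(inp, tgt)]
mean_loss = sum(losses) / len(losses)
print(round(mean_loss, 4))  # ≈ 16.1978, matching ivy.array(16.1977562) above
```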
- Container.poisson_nll_loss(self, target, *, log_input=True, full=False, eps=1e-08, reduction='mean', key_chains=None, to_apply=True, prune_unapplied=False, map_sequences=False)[source]#
ivy.Container instance method variant of ivy.poisson_nll_loss. This method simply wraps the function, and so the docstring for ivy.poisson_nll_loss also applies to this method with minimal changes.
- Parameters:
    - self (Union[Container, Array, NativeArray]) – input array or container containing input labels.
    - target (Union[Container, Array, NativeArray]) – input array or container containing the target labels.
    - log_input (Union[bool, Container], default: True) – If True, the loss is computed as \(exp(input) - target * input\). If False, the loss is computed as \(input - target * log(input + eps)\). Default is True.
    - full (Union[bool, Container], default: False) – Whether to compute the full loss, i.e., to add the Stirling approximation term \(target * log(target) - target + 0.5 * log(2 * pi * target)\). Default is False.
    - eps (Union[float, Container], default: 1e-08) – Small value to prevent evaluation of log(0) when log_input is False. Default is 1e-8.
    - reduction (Union[str, Container], default: 'mean') – Specifies the reduction applied to the output. Options are ‘none’, ‘mean’, or ‘sum’. ‘none’: no reduction will be applied. ‘mean’: the output will be averaged. ‘sum’: the output will be summed. Default is ‘mean’.
    - key_chains (Optional[Union[List[str], Dict[str, str], Container]], default: None) – The key-chains to apply or not apply the method to. Default is None.
    - to_apply (Union[bool, Container], default: True) – If True, the method will be applied to key_chains, otherwise key_chains will be skipped. Default is True.
    - prune_unapplied (Union[bool, Container], default: False) – Whether to prune key_chains for which the function was not applied. Default is False.
    - map_sequences (Union[bool, Container], default: False) – Whether to also map method to sequences (lists, tuples). Default is False.
- Return type:
Container
- Returns:
ret – The Poisson Negative Log Likelihood Loss computed for each leaf of the container: an array of the same shape as the input when reduction is ‘none’, otherwise a scalar per leaf.
- Raises:
ValueError – If the input and target tensors do not have the same shape.
Examples
>>> x = ivy.Container(a=ivy.array([[1, 0, 2]], dtype=ivy.float32),
...                   b=ivy.array([[3, 2, 1]], dtype=ivy.float32))
>>> y = ivy.Container(a=ivy.array([[0.6, 0.2, 0.3]], dtype=ivy.float32),
...                   b=ivy.array([[0.8, 0.2, 0.2]], dtype=ivy.float32))
>>> z = x.poisson_nll_loss(y)
>>> print(z)
{
    a: ivy.array(3.30244565),
    b: ivy.array(9.06429195)
}