Creates a criterion that optimizes a two-class classification logistic loss between input tensor \(x\) and target tensor \(y\) (containing 1 or -1).
nn_soft_margin_loss(reduction = "mean")

reduction: (string, optional) Specifies the reduction to apply to the output:
'none' | 'mean' | 'sum'. 'none': no reduction will be applied,
'mean': the sum of the output will be divided by the number of
elements in the output, 'sum': the output will be summed. Note: size_average
and reduce are in the process of being deprecated, and in the meantime,
specifying either of those two args will override reduction. Default: 'mean'
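As a quick illustration of the three reduction modes, here is a minimal sketch assuming the torch R package is available (tensor shapes and values are arbitrary):

```r
library(torch)

input  <- torch_randn(3, 5)
target <- torch_randn(3, 5)$sign()  # entries in {-1, 1}, as the loss expects

nn_soft_margin_loss(reduction = "none")(input, target)  # element-wise loss, shape (3, 5)
nn_soft_margin_loss(reduction = "mean")(input, target)  # scalar: mean over all 15 elements
nn_soft_margin_loss(reduction = "sum")(input, target)   # scalar: sum over all 15 elements
```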
Input: \((*)\), where \(*\) means any number of dimensions
Target: \((*)\), same shape as the input
Output: scalar. If reduction is 'none', then same shape as the input
$$ \mbox{loss}(x, y) = \sum_i \frac{\log(1 + \exp(-y[i]*x[i]))}{\mbox{x.nelement}()} $$
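As a rough check of the formula above, the default 'mean' reduction should agree with averaging \(\log(1 + \exp(-y[i] x[i]))\) over all elements; a minimal sketch, assuming the torch R package:

```r
library(torch)

input  <- torch_randn(4, requires_grad = TRUE)
target <- torch_tensor(c(1, -1, 1, -1))

loss_fn <- nn_soft_margin_loss()   # reduction = "mean" by default
loss    <- loss_fn(input, target)

# The same quantity computed directly from the formula
manual <- torch_log(1 + torch_exp(-target * input))$mean()

loss$item()      # matches manual$item() up to floating-point error
manual$item()

loss$backward()  # gradients flow back to `input` via input$grad
```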