
Source: Pythonic Biologist
This article is about 1,900 words long; estimated reading time 8 minutes.
TensorFlow and PyTorch are quite similar; this article introduces loss functions using PyTorch as an example.
19 Types of Loss Functions
1. L1 Loss L1Loss
Calculates the absolute difference between output and target.
torch.nn.L1Loss(reduction='mean')
Parameters:
reduction – three values: none (no reduction), mean (returns the average of losses), sum (returns the sum of losses). Default: mean.
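A minimal usage sketch (the tensor values are illustrative, not from the article):

```python
import torch

loss_fn = torch.nn.L1Loss(reduction='mean')
pred = torch.tensor([1.0, 2.0, 3.0])
target = torch.tensor([1.0, 0.0, 5.0])
loss = loss_fn(pred, target)  # mean of |0|, |2|, |2| = 4/3
```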
2. Mean Squared Error Loss MSELoss
Calculates the mean squared error between each element of the output and the target.
torch.nn.MSELoss(reduction='mean')
Parameters:
reduction – three values: none (no reduction), mean (returns the average of losses), sum (returns the sum of losses). Default: mean.
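Called the same way as L1Loss; a short sketch with illustrative values:

```python
import torch

loss_fn = torch.nn.MSELoss(reduction='mean')
pred = torch.tensor([1.0, 2.0, 3.0])
target = torch.tensor([1.0, 0.0, 5.0])
loss = loss_fn(pred, target)  # mean of 0, 4, 4 = 8/3
```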

3. Cross-Entropy Loss CrossEntropyLoss
Combines LogSoftmax and NLLLoss in one class; useful when training a classification problem with C classes.
torch.nn.CrossEntropyLoss(weight=None, ignore_index=-100, reduction='mean')
Parameters:
weight (Tensor, optional) – custom weights for each category. Must be a Tensor of length C.
ignore_index (int, optional) – sets a target value that will be ignored and will not affect the gradient of the input.
reduction – three values: none (no reduction), mean (returns the average of losses), sum (returns the sum of losses). Default: mean.
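Note that the input is raw logits, not probabilities, and the target is a class index. A small sketch with illustrative values:

```python
import torch

loss_fn = torch.nn.CrossEntropyLoss()
logits = torch.tensor([[2.0, 1.0, 0.1]])  # raw scores; no softmax applied by the caller
target = torch.tensor([0])                # index of the correct class
loss = loss_fn(logits, target)            # -log softmax(logits)[0] ≈ 0.417
```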
4. KL Divergence Loss KLDivLoss
Calculates the Kullback-Leibler divergence between the input (expected as log-probabilities) and the target distribution.
torch.nn.KLDivLoss(reduction='mean')
Parameters:
reduction – three values: none (no reduction), mean (returns the average of losses), sum (returns the sum of losses). Default: mean.
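A sketch with two small illustrative distributions; note the input must be log-probabilities, and reduction='sum' is used here so the result equals the textbook KL value:

```python
import torch

p = torch.tensor([0.4, 0.6])   # target distribution
q = torch.tensor([0.5, 0.5])   # model distribution
loss_fn = torch.nn.KLDivLoss(reduction='sum')
loss = loss_fn(q.log(), p)     # sum of p * (log p - log q) ≈ 0.0201
```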
5. Binary Cross-Entropy Loss BCELoss
Calculates the binary cross-entropy between the output and the target; the output must already be probabilities in [0, 1] (e.g. passed through a Sigmoid).
torch.nn.BCELoss(weight=None, reduction='mean')
Parameters:
weight (Tensor, optional) – custom weights for each batch element's loss. Must be a Tensor of length "nbatch".
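A minimal sketch with illustrative probabilities:

```python
import torch

loss_fn = torch.nn.BCELoss()
prob = torch.tensor([0.8, 0.3])    # must already be probabilities in [0, 1]
target = torch.tensor([1.0, 0.0])
loss = loss_fn(prob, target)       # mean of -log(0.8) and -log(1 - 0.3) ≈ 0.290
```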
6. Binary Cross-Entropy with Logits BCEWithLogitsLoss
Combines a Sigmoid layer and BCELoss in a single class, which is more numerically stable than using the two separately.
torch.nn.BCEWithLogitsLoss(weight=None, reduction='mean', pos_weight=None)
Parameters:
weight (Tensor, optional) – custom weights for each batch element's loss. Must be a Tensor of length "nbatch".
pos_weight (Tensor, optional) – a weight for positive examples, with length equal to the number of classes.
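A sketch showing that it matches Sigmoid followed by BCELoss (illustrative values):

```python
import torch

logits = torch.tensor([1.5, -0.5])
target = torch.tensor([1.0, 0.0])
# BCEWithLogitsLoss fuses the sigmoid and the BCE computation in one stable step
a = torch.nn.BCEWithLogitsLoss()(logits, target)
b = torch.nn.BCELoss()(torch.sigmoid(logits), target)
```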
7. Margin Ranking Loss
torch.nn.MarginRankingLoss(margin=0.0, reduction='mean')

For each pair of samples, the loss is max(0, -y * (x1 - x2) + margin), where label y = 1 means x1 should be ranked higher than x2, and y = -1 the reverse.
Parameters:
margin: default value 0
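A minimal sketch with illustrative scores:

```python
import torch

loss_fn = torch.nn.MarginRankingLoss(margin=0.5)
x1 = torch.tensor([1.0])
x2 = torch.tensor([0.8])
y = torch.tensor([1.0])    # y = 1: x1 should be ranked higher than x2
loss = loss_fn(x1, x2, y)  # max(0, -(1.0 - 0.8) + 0.5) = 0.3
```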
8. Hinge Embedding Loss
torch.nn.HingeEmbeddingLoss(margin=1.0, reduction='mean')
For each instance in a mini-batch, the loss is x_n when the label y_n = 1, and max(0, margin - x_n) when y_n = -1. It is typically used to measure whether two inputs are similar or dissimilar.
Parameters:
margin: default value 1
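A sketch of both label branches (illustrative values):

```python
import torch

loss_fn = torch.nn.HingeEmbeddingLoss(margin=1.0)
x = torch.tensor([0.5, 0.3])
y = torch.tensor([1.0, -1.0])  # label 1 keeps x as-is; label -1 applies the hinge
loss = loss_fn(x, y)           # mean of 0.5 and max(0, 1 - 0.3) = 0.6
```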
9. Multi-Label Classification Loss MultiLabelMarginLoss
torch.nn.MultiLabelMarginLoss(reduction='mean')
Optimizes a multi-class multi-label hinge loss (margin-based loss) between input x and target y, where the targets are given as class indices padded with -1.
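A sketch of the target encoding; the value follows from summing the hinge over every (target class, non-target class) pair and dividing by the number of classes:

```python
import torch

loss_fn = torch.nn.MultiLabelMarginLoss()
x = torch.tensor([[0.1, 0.2, 0.4, 0.8]])
# the first -1 terminates the list of target classes: classes 3 and 0 are the labels
y = torch.tensor([[3, 0, -1, -1]])
loss = loss_fn(x, y)  # (0.4 + 0.6 + 1.1 + 1.3) / 4 = 0.85
```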
10. Smooth L1 Loss SmoothL1Loss
torch.nn.SmoothL1Loss(reduction='mean')
Uses a squared term when the absolute element-wise error falls below 1, and an L1 term otherwise. It is less sensitive to outliers than MSELoss (this is also known as the Huber loss).
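A sketch exercising both branches (illustrative values):

```python
import torch

loss_fn = torch.nn.SmoothL1Loss()
pred = torch.tensor([0.5, 2.0])
target = torch.tensor([0.0, 0.0])
# |0.5| < 1 uses the squared branch 0.5 * 0.5**2 = 0.125;
# |2.0| >= 1 uses the L1 branch 2.0 - 0.5 = 1.5
loss = loss_fn(pred, target)  # mean = 0.8125
```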
11. Logistic Loss for 2-Class SoftMarginLoss
torch.nn.SoftMarginLoss(reduction='mean')
Optimizes a two-class classification logistic loss between input x and target y, with labels taking the values 1 or -1.
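A minimal sketch; with a zero score the loss reduces to log 2:

```python
import torch

loss_fn = torch.nn.SoftMarginLoss()
x = torch.tensor([0.0])
y = torch.tensor([1.0])  # labels are +1 or -1
loss = loss_fn(x, y)     # log(1 + exp(-y * x)) = log 2 ≈ 0.693
```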
12. Multi-Label One-Versus-All Loss MultiLabelSoftMarginLoss
torch.nn.MultiLabelSoftMarginLoss(weight=None, reduction='mean')
Optimizes a multi-label one-versus-all loss based on max-entropy between input x and a multi-hot target y.
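A sketch showing that for a single sample it coincides with element-wise binary cross-entropy on logits (illustrative values):

```python
import torch

x = torch.tensor([[1.0, -1.0]])  # one sample, two labels
y = torch.tensor([[1.0, 0.0]])   # multi-hot targets
a = torch.nn.MultiLabelSoftMarginLoss()(x, y)
b = torch.nn.BCEWithLogitsLoss()(x, y)
```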
13. Cosine Loss CosineEmbeddingLoss
torch.nn.CosineEmbeddingLoss(margin=0.0, reduction='mean')
Used for measuring whether two inputs are similar or dissimilar, using the cosine distance; commonly used for learning nonlinear embeddings or in semi-supervised learning.
Parameters:
margin: default value 0
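A sketch with two orthogonal vectors (cosine similarity 0), so the similar-pair loss is 1 - cos = 1:

```python
import torch

loss_fn = torch.nn.CosineEmbeddingLoss(margin=0.0)
x1 = torch.tensor([[1.0, 0.0]])
x2 = torch.tensor([[0.0, 1.0]])  # orthogonal to x1
y = torch.tensor([1.0])          # y = 1 asks the pair to be similar
loss = loss_fn(x1, x2, y)        # 1 - cos(x1, x2) = 1.0
```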
14. Hinge Loss for Multi-Class MultiMarginLoss
torch.nn.MultiMarginLoss(p=1, margin=1.0, weight=None, reduction='mean')
Parameters:
p: 1 or 2, default value: 1
margin: default value 1
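A sketch of the per-class hinge (illustrative scores):

```python
import torch

loss_fn = torch.nn.MultiMarginLoss(p=1, margin=1.0)
x = torch.tensor([[0.1, 0.2, 0.7]])
y = torch.tensor([1])  # the correct class has score 0.2
# (max(0, 1 - 0.2 + 0.1) + max(0, 1 - 0.2 + 0.7)) / 3 = (0.9 + 1.5) / 3 = 0.8
loss = loss_fn(x, y)
```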
15. Triplet Loss TripletMarginLoss

torch.nn.TripletMarginLoss(margin=1.0, p=2.0, eps=1e-06, swap=False, reduction='mean')
Computed over triplets of anchor a, positive p, and negative n examples: loss(a, p, n) = max(d(a, p) - d(a, n) + margin, 0), where d(x, y) = ||x - y||_p.
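A sketch in which the negative is already far enough away, so the hinge is inactive (illustrative embeddings):

```python
import torch

loss_fn = torch.nn.TripletMarginLoss(margin=1.0, p=2.0)
anchor = torch.tensor([[0.0, 0.0]])
positive = torch.tensor([[0.0, 1.0]])  # L2 distance 1 from the anchor
negative = torch.tensor([[0.0, 3.0]])  # L2 distance 3 from the anchor
loss = loss_fn(anchor, positive, negative)  # max(1 - 3 + 1, 0) = 0
```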
16. Connectionist Temporal Classification Loss CTCLoss
Calculates the loss between a continuous (unsegmented) time series and a target sequence, e.g. for speech or handwriting recognition where the alignment between input frames and output labels is unknown.
torch.nn.CTCLoss(blank=0, reduction='mean')
Parameters:
blank (int, optional) – index of the blank label. Default: 0.
reduction – three values: none (no reduction), mean (returns the average of losses), sum (returns the sum of losses). Default: mean.
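A shape-focused sketch: the input is (T, N, C) log-probabilities, and per-sequence lengths are passed explicitly. The random scores are illustrative, so no exact value is asserted, only that the negative log-likelihood is non-negative:

```python
import torch

torch.manual_seed(0)
T, N, C = 4, 1, 3  # input length, batch size, number of classes (class 0 is the blank)
log_probs = torch.randn(T, N, C).log_softmax(2)
targets = torch.tensor([[1, 2]])       # target label sequence per batch element
input_lengths = torch.tensor([T])
target_lengths = torch.tensor([2])
loss = torch.nn.CTCLoss(blank=0)(log_probs, targets, input_lengths, target_lengths)
```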
17. Negative Log-Likelihood Loss NLLLoss
Useful for training a classification problem with C classes; expects log-probabilities as input (e.g. from a LogSoftmax layer).
torch.nn.NLLLoss(weight=None, ignore_index=-100, reduction='mean')
Parameters:
weight (Tensor, optional) – custom weights for each category. Must be a Tensor of length C.
ignore_index (int, optional) – sets a target value that will be ignored and will not affect the gradient of the input.
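A sketch showing the standard pairing with log_softmax, which reproduces CrossEntropyLoss (illustrative values):

```python
import torch

logits = torch.tensor([[2.0, 1.0, 0.1]])
target = torch.tensor([0])
# NLLLoss expects log-probabilities, so apply log_softmax first
a = torch.nn.NLLLoss()(torch.log_softmax(logits, dim=1), target)
b = torch.nn.CrossEntropyLoss()(logits, target)  # does the log_softmax internally
```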
18. Negative Log-Likelihood Loss for Images NLLLoss2d
The negative log-likelihood loss for pixel-wise targets, e.g. semantic segmentation. Note that NLLLoss2d is deprecated in recent PyTorch versions; NLLLoss itself now accepts higher-dimensional inputs.
torch.nn.NLLLoss2d(weight=None, ignore_index=-100, reduction='mean')
Parameters:
weight (Tensor, optional) – custom weights for each category. Must be a Tensor of length C.
reduction – three values: none (no reduction), mean (returns the average of losses), sum (returns the sum of losses). Default: mean.
19. Poisson Negative Log-Likelihood Loss PoissonNLLLoss
The negative log-likelihood loss for a target that follows a Poisson distribution.
torch.nn.PoissonNLLLoss(log_input=True, full=False, eps=1e-08, reduction='mean')
Parameters:
log_input (bool, optional) – if True, the loss is computed as exp(input) - target * input; if False, as input - target * log(input + eps).
full (bool, optional) – whether to compute the full loss, including the Stirling approximation term.
eps (float, optional) – small value to avoid evaluating log(0) when log_input=False. Default: 1e-8.
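A sketch of the log_input=True branch (illustrative values):

```python
import torch

loss_fn = torch.nn.PoissonNLLLoss(log_input=True, full=False)
log_rate = torch.tensor([0.0])   # log of the predicted Poisson rate, so rate = 1
target = torch.tensor([2.0])     # observed count
loss = loss_fn(log_rate, target) # exp(0) - 2 * 0 = 1.0
```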
Editor: Huang Jiyan
Proofreader: Lin Yilin