Abstract
In this paper, we propose a novel adaptive loss function that enhances deep learning performance in classification tasks. Specifically, we redefine the cross-entropy loss to address class-level noise conditions effectively, including the challenging problem of class imbalance. Our approach introduces aggregation operators into the loss to improve classification accuracy. The rationale behind the proposed method is to iteratively up-weight the class-level components of the loss function that exhibit the largest errors. To achieve this, we employ the ordered weighted average (OWA) operator and combine it with an adaptive scheme for gradient-based learning. Our main finding is that the proposed method outperforms commonly used loss functions, such as the standard cross-entropy and the focal loss, across a variety of binary and multiclass classification tasks. Furthermore, we study the influence of the hyperparameters associated with the OWA operators and propose a default configuration that performs well across different experimental settings.
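To make the aggregation step concrete, the sketch below shows one plausible way to combine per-class cross-entropy terms with an OWA operator, which sorts its arguments in descending order before applying a fixed weight vector so that the largest class-level errors receive the largest weights. It assumes PyTorch; the function `owa_cross_entropy`, the per-class averaging, and the example weight vector are illustrative choices, not the paper's exact formulation (which additionally adapts the weights during gradient-based training).

```python
# Illustrative sketch of an OWA-weighted cross-entropy (hypothetical, not the paper's implementation).
import torch
import torch.nn.functional as F

def owa_cross_entropy(logits, targets, owa_weights):
    """Aggregate class-level cross-entropy terms with an OWA operator.

    logits:      (batch, num_classes) raw scores
    targets:     (batch,) integer class labels
    owa_weights: (num_classes,) non-negative weights summing to 1,
                 listed from largest to smallest so that bigger weights
                 are assigned to the classes with bigger losses
    """
    num_classes = logits.size(1)
    log_probs = F.log_softmax(logits, dim=1)
    one_hot = F.one_hot(targets, num_classes).float()

    # Mean cross-entropy contribution of each class over the batch
    # (classes absent from the batch contribute zero).
    per_class_loss = -(one_hot * log_probs).sum(dim=0) / one_hot.sum(dim=0).clamp(min=1)

    # OWA aggregation: sort the class-level losses in descending order and
    # take the dot product with the weight vector, up-weighting larger errors.
    sorted_losses, _ = torch.sort(per_class_loss, descending=True)
    return (owa_weights * sorted_losses).sum()
```

For instance, `owa_weights = torch.tensor([0.5, 0.3, 0.2])` in a three-class problem would place half of the total weight on the worst-performing class in each batch, whereas uniform weights recover the ordinary class-averaged cross-entropy.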
| Original language | English |
|---|---|
| Article number | 111022 |
| Journal | Knowledge-Based Systems |
| Volume | 280 |
| DOIs | |
| State | Published - 25 Nov 2023 |
Bibliographical note
Funding Information: The authors gratefully acknowledge financial support from ANID PIA/PUENTE AFB220003 and FONDECYT-Chile, grants 1200221, 11200007, 1201403, and 1230694. The authors would like to thank the anonymous reviewers for their valuable comments and suggestions for improving the quality of the paper.
Publisher Copyright:
© 2023 Elsevier B.V.
Keywords
- Class-imbalance classification
- Deep learning
- Loss functions
- OWA operators