A closed-form reduction of multi-class cost-sensitive learning to weighted multi-class learning

Fen Xia, Yanwu Yang, Liang Zhou, Fuxin Li, Min Cai, Daniel D. Zeng

Research output: Contribution to journal › Article › peer-review


Abstract

In cost-sensitive learning, misclassification costs can vary across classes. This paper investigates an approach that reduces multi-class cost-sensitive learning to a standard classification task, based on the data space expansion technique developed by Abe et al.; for binary classification tasks, the reduction coincides with Elkan's. Under this reduction, a cost-sensitive learning problem can be solved as a standard 0/1-loss classification problem on a new distribution determined by the cost matrix. We also propose a new weighting mechanism for solving the reduced classification problem, based on a theorem stating that the empirical loss on independent and identically distributed samples from the new distribution is essentially the same as the loss on the expanded weighted training set. Experimental results on several synthetic and benchmark datasets show that our weighting approach is more effective than existing representative approaches for cost-sensitive learning.
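To make the reduction concrete, below is a minimal sketch of the data space expansion idea: each training example is replicated once per candidate label, with a weight derived from the cost matrix, and the expanded weighted set is fed to any standard classifier that accepts per-sample weights. The specific weight used here, max_j C[y, j] - C[y, k], is one plausible instantiation for illustration and is not necessarily the paper's closed form; the function name and dataset are hypothetical.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def expand_cost_sensitive(X, y, cost_matrix):
    """Expand each example (x, y) into one weighted copy per class.

    Illustrative sketch (assumed weighting): an example whose true
    class is y receives, for every candidate label k, the weight
    max_j C[y, j] - C[y, k], so labels that avoid high cost get
    larger weights. Zero-weight copies are dropped.
    """
    n_classes = cost_matrix.shape[1]
    X_exp, y_exp, w_exp = [], [], []
    for x, yi in zip(X, y):
        row = cost_matrix[yi]
        for k in range(n_classes):
            w = row.max() - row[k]
            if w > 0:
                X_exp.append(x)
                y_exp.append(k)
                w_exp.append(w)
    return np.array(X_exp), np.array(y_exp), np.array(w_exp)

# Usage: any weight-aware learner can consume the expanded set,
# e.g. scikit-learn's LogisticRegression via sample_weight.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 5))          # toy features
y = rng.integers(0, 3, size=100)           # toy labels in {0, 1, 2}
C = np.array([[0., 1., 4.],
              [2., 0., 1.],
              [5., 1., 0.]])               # C[i, j]: cost of predicting j when truth is i
X_e, y_e, w_e = expand_cost_sensitive(X, y, C)
clf = LogisticRegression(max_iter=1000).fit(X_e, y_e, sample_weight=w_e)
```

Minimizing weighted 0/1 loss on this expanded set stands in for minimizing expected cost under the cost-matrix-induced distribution, which is the substance of the reduction described in the abstract.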

Original language: English (US)
Pages (from-to): 1572-1581
Number of pages: 10
Journal: Pattern Recognition
Volume: 42
Issue number: 7
DOIs
State: Published - Jul 1 2009

Keywords

  • Classification
  • Cost-sensitive learning
  • Statistical learning theory
  • Supervised learning

ASJC Scopus subject areas

  • Software
  • Signal Processing
  • Computer Vision and Pattern Recognition
  • Artificial Intelligence

