A closed-form reduction of multi-class cost-sensitive learning to weighted multi-class learning

Fen Xia, Yan wu Yang, Liang Zhou, Fuxin Li, Min Cai, Dajun Zeng

Research output: Contribution to journal › Article

14 Citations (Scopus)

Abstract

In cost-sensitive learning, misclassification costs can vary across classes. This paper investigates an approach that reduces multi-class cost-sensitive learning to a standard classification task, based on the data space expansion technique developed by Abe et al.; on binary classification tasks the reduction coincides with Elkan's. With this reduction, a cost-sensitive learning problem can be solved as a standard 0/1-loss classification problem on a new distribution determined by the cost matrix. We also propose a new weighting mechanism for the reduced standard classification problem, based on a theorem stating that the empirical loss on independent and identically distributed samples from the new distribution is essentially the same as the loss on the expanded weighted training set. Experimental results on several synthetic and benchmark datasets show that our weighting approach is more effective than existing representative approaches for cost-sensitive learning.
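To make the reduction concrete, below is a minimal sketch of the expansion-plus-weighting idea, assuming an Abe-et-al.-style rule in which each training example is copied once per candidate label and the copy for label k is weighted by the cost it avoids relative to the worst prediction for the true class (w_k = max_j C[y, j] - C[y, k]). The cost matrix, the synthetic data, and the helper name expand_with_weights are illustrative assumptions, not the paper's exact formulation.

import numpy as np
from sklearn.tree import DecisionTreeClassifier

def expand_with_weights(X, y, cost_matrix):
    """Expand each example (x, y) into one weighted copy per candidate
    label k. Assumed weighting rule (data-space-expansion style):
    w_k = max_j C[y, j] - C[y, k], i.e. the cost avoided by predicting k."""
    n_classes = cost_matrix.shape[0]
    X_exp, y_exp, w_exp = [], [], []
    for x, true_label in zip(X, y):
        row = cost_matrix[true_label]
        for k in range(n_classes):
            w = row.max() - row[k]   # avoided cost of predicting k
            if w > 0:                # zero-weight copies carry no information
                X_exp.append(x)
                y_exp.append(k)
                w_exp.append(w)
    return np.array(X_exp), np.array(y_exp), np.array(w_exp)

# Toy 3-class cost matrix: C[i, j] = cost of predicting j when the truth is i.
C = np.array([[0.0, 1.0, 4.0],
              [2.0, 0.0, 1.0],
              [5.0, 1.0, 0.0]])

# Synthetic data with class-dependent means, for illustration only.
rng = np.random.default_rng(0)
y = rng.integers(0, 3, size=300)
X = rng.normal(size=(300, 2)) + y[:, None]

X_exp, y_exp, w_exp = expand_with_weights(X, y, C)

# Any learner that accepts per-instance weights can consume the expanded set.
clf = DecisionTreeClassifier(max_depth=4).fit(X_exp, y_exp, sample_weight=w_exp)

Note that for a binary problem with zero cost on correct predictions, this rule keeps only one copy per example, labeled with its true class and weighted by its misclassification cost, which is exactly the Elkan-style cost weighting the abstract refers to.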

Original language: English (US)
Pages (from-to): 1572-1581
Number of pages: 10
Journal: Pattern Recognition
Volume: 42
Issue number: 7
DOI: 10.1016/j.patcog.2008.12.011
State: Published - Jul 2009

Keywords

  • Classification
  • Cost-sensitive learning
  • Statistical learning theory
  • Supervised learning

ASJC Scopus subject areas

  • Software
  • Artificial Intelligence
  • Computer Vision and Pattern Recognition
  • Signal Processing

Cite this

A closed-form reduction of multi-class cost-sensitive learning to weighted multi-class learning. / Xia, Fen; Yang, Yan wu; Zhou, Liang; Li, Fuxin; Cai, Min; Zeng, Dajun.

In: Pattern Recognition, Vol. 42, No. 7, July 2009, pp. 1572-1581. DOI: 10.1016/j.patcog.2008.12.011
