Error limiting reductions between classification tasks

Abstract

We introduce a reduction-based model for analyzing supervised learning tasks. We use this model to devise a new reduction from multi-class cost-sensitive classification to binary classification with the following guarantee: If the learned binary classifier has error rate at most ε, then the cost-sensitive classifier has cost at most 2ε times the expected sum of costs of all possible labels. Since cost-sensitive classification can embed any bounded-loss, finite-choice supervised learning task, this result shows that any such task can be solved using a binary classification oracle. Finally, we present experimental results showing that our new reduction outperforms existing algorithms for multi-class cost-sensitive learning.
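
In symbols, the stated guarantee can be written as follows (the names b, h, and c(y) are introduced here only for illustration and are not fixed by this abstract): with b the learned binary classifier and h the induced cost-sensitive classifier,

    \mathrm{err}(b) \le \varepsilon \;\Longrightarrow\; \mathrm{cost}(h) \le 2\varepsilon \cdot \mathbb{E}_{(x,\,c)}\!\left[ \sum_{y} c(y) \right],

where the expectation is over examples (x, c) drawn from the underlying distribution and c(y) is the cost incurred for predicting label y on that example.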

Date

01 Dec 2005

Publication

ICML 2005

Authors
