Building ensembles of classifiers for loss minimization

D Margineantu - Computing Science and Statistics, 1999 - Citeseer
Abstract
One of the most active areas of research in supervised learning has been the study of methods for constructing good ensembles of classifiers, that is, a set of classifiers whose individual decisions are combined to increase the overall accuracy of classifying new examples. In many applications, classifiers are required to minimize an asymmetric loss function rather than the raw misclassification rate. In this paper, we present approaches to modifying existing methods for constructing ensembles to incorporate arbitrary loss functions. We compare the performance of the new algorithms with traditional ensemble learners, with MetaCost, a novel method for cost-sensitive learning, and with single decision tree classifiers. We evaluated the algorithms on multi-class data sets from the UC Irvine ML Repository.
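The abstract does not spell out the loss-minimizing decision rule, but the standard one underlying cost-sensitive ensemble methods such as MetaCost is to predict the class with the lowest expected loss under the ensemble's class-probability estimates. The sketch below is a generic illustration of that rule only, not the paper's specific ensemble-construction algorithms; the function name, array shapes, and cost matrix are hypothetical.

```python
import numpy as np

def cost_sensitive_predict(prob_matrix, cost_matrix):
    """Pick, for each example, the class with minimum expected loss.

    prob_matrix: (n_examples, n_classes) estimated class probabilities,
                 e.g. averaged votes from an ensemble of classifiers.
    cost_matrix: (n_classes, n_classes) where entry C[i, j] is the loss
                 incurred by predicting class j when the true class is i.
    """
    # Expected loss of predicting class j for example x:
    #   sum_i P(i | x) * C[i, j]
    expected_loss = prob_matrix @ cost_matrix
    return expected_loss.argmin(axis=1)

# Hypothetical 3-class example with an asymmetric cost matrix:
probs = np.array([[0.6, 0.3, 0.1],
                  [0.2, 0.5, 0.3]])
costs = np.array([[0,  1, 5],
                  [1,  0, 1],
                  [10, 1, 0]])  # calling class 2 "class 0" is very costly
print(cost_sensitive_predict(probs, costs))  # -> [1 1]
```

Note that for the first example the most probable class (0) is not chosen: the asymmetric costs make class 1 the lower-risk prediction, which is exactly the behavior that distinguishes loss minimization from minimizing the raw misclassification rate.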