Boosting (Freund & Schapire 1996; Schapire & Singer 1998) is one of the most important recent developments in classification methodology. The performance of many classification algorithms can often be dramatically improved by sequentially applying them to reweighted versions of the input data and taking a weighted majority vote of the sequence of classifiers thereby produced. We show that this seemingly mysterious phenomenon can be understood in terms of well-known statistical principles, namely additive modeling and maximum likelihood. For the two-class problem, boosting can be viewed as an approximation to additive modeling on the logistic scale using maximum Bernoulli likelihood as a criterion. We develop more direct approximations and show that they exhibit nearly identical results to boosting. Direct multi-class generalizations based on multinomial likelihood are derived that perform comparably to other recently proposed multi-class generalizations of boosting in most situations, and far better in some. We suggest a minor modification to boosting that can reduce computation, often by factors of 10 to 50. Finally, we apply these insights to produce an alternative formulation of boosting decision trees. This approach, based on best-first truncated tree induction, often yields better performance and can provide interpretable descriptions of the aggregate decision rule. It is also much faster computationally, making it more suitable for large-scale data-mining applications.
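The reweighting-and-voting scheme the abstract describes can be sketched in a few lines. Below is a minimal, illustrative implementation of discrete AdaBoost with decision stumps on one-dimensional data (the stump learner, function names, and toy data are our own, not from the paper); note how the final classifier is the sign of an additive model F(x) = sum of alpha_m * f_m(x), which is the link to additive modeling that the paper develops.

```python
import math

def stump_predict(threshold, sign, x):
    # A decision stump: predict +1/-1 depending on which side of the threshold x falls.
    return sign if x > threshold else -sign

def fit_stump(X, y, w):
    # Exhaustively pick the stump minimizing weighted training error.
    best = None
    for t in sorted(set(X)):
        for sign in (+1, -1):
            err = sum(wi for xi, yi, wi in zip(X, y, w)
                      if stump_predict(t, sign, xi) != yi)
            if best is None or err < best[0]:
                best = (err, t, sign)
    return best

def adaboost(X, y, rounds=10):
    # Sequentially refit on reweighted data; keep (weight, stump) pairs for the vote.
    n = len(X)
    w = [1.0 / n] * n
    ensemble = []
    for _ in range(rounds):
        err, t, sign = fit_stump(X, y, w)
        err = max(err, 1e-10)          # avoid log(0) on a perfect fit
        if err >= 0.5:                 # weak learner no better than chance: stop
            break
        alpha = 0.5 * math.log((1 - err) / err)  # classifier weight in the vote
        ensemble.append((alpha, t, sign))
        # Up-weight misclassified points, down-weight correct ones, renormalize.
        w = [wi * math.exp(-alpha * yi * stump_predict(t, sign, xi))
             for xi, yi, wi in zip(X, y, w)]
        z = sum(w)
        w = [wi / z for wi in w]
    return ensemble

def predict(ensemble, x):
    # Weighted majority vote = sign of the additive model F(x).
    F = sum(alpha * stump_predict(t, sign, x) for alpha, t, sign in ensemble)
    return 1 if F >= 0 else -1
```

A single stump is a weak classifier, yet the weighted combination can fit the training data exactly when the reweighting repeatedly concentrates mass on the hard points; the paper's contribution is interpreting this loop as stagewise fitting of an additive logistic model.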
[1] Peter E. Hart, et al. Nearest neighbor pattern classification, 1967, IEEE Trans. Inf. Theory.
[2] M. West. An experimental comparison of three methods of correcting themes to improve sentence structure of seventh grade pupils in large classes, 1967.
[3] J. Friedman, et al. Projection Pursuit Regression, 1981.
[4] Yoav Freund, et al. Boosting a weak learning algorithm by majority, 1995, COLT '90.
[5] Stéphane Mallat, et al. Matching pursuits with time-frequency dictionaries, 1993, IEEE Trans. Signal Process.
[6] R. Tibshirani, et al. Flexible Discriminant Analysis by Optimal Scoring, 1994.
[7] Umesh V. Vazirani, et al. An Introduction to Computational Learning Theory, 1994.
[8] Yoav Freund, et al. Experiments with a New Boosting Algorithm, 1996, ICML.
[9] Yoram Singer, et al. Improved Boosting Algorithms Using Confidence-rated Predictions, 1998, COLT '98.
[10] R. Tibshirani, et al. Classification by Pairwise Coupling, 1998.
[11] Leo Breiman, et al. Prediction Games and Arcing Algorithms, 1999, Neural Computation.