Boosting at Start

R.L. Milidiú and J.C. Duarte (Brazil)

Keywords

Machine Learning, Ensemble Algorithms, Boosting, BAS.

Abstract

Boosting is a machine learning technique that combines several weak classifiers to improve overall accuracy. At each iteration, the algorithm changes the weights of the examples and builds an additional classifier. A simple voting scheme among these classifiers defines the final classification of a new instance. A well-known boosting algorithm is AdaBoost. At each iteration, it changes the example weights in order to focus the new classifier on examples that are hard for the previous one. A uniform distribution is initially assigned to the examples, but there is no guarantee that this is the best choice for the initial distribution. Here, we propose a novel boosting algorithm, called Boosting At Start (BAS), that generalizes AdaBoost by allowing any initial weight distribution. We also present a scheme for determining an initial weight distribution. We examine the performance of the proposed approach on some problem instances from the UCI Machine Learning Repository. Our empirical findings indicate that using extra knowledge to build a non-uniform initial weight distribution over the examples results in better learning.
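The abstract does not give BAS's update rules or its scheme for choosing the initial distribution. As an illustration of the point being generalized, below is a minimal sketch of the standard AdaBoost loop with decision stumps, where the initial weight distribution is left as a parameter (uniform by default, as in AdaBoost; any normalized distribution otherwise). All function names and the stump learner are illustrative choices, not the paper's implementation.

```python
import math

def stump_predict(x, threshold, polarity):
    # Decision stump on a single feature: returns +1 or -1
    return polarity if x >= threshold else -polarity

def fit_stump(xs, ys, ws):
    # Exhaustively pick the stump minimizing weighted training error
    best = (0.0, 1, float("inf"))
    for threshold in sorted(set(xs)):
        for polarity in (1, -1):
            err = sum(w for x, y, w in zip(xs, ys, ws)
                      if stump_predict(x, threshold, polarity) != y)
            if err < best[2]:
                best = (threshold, polarity, err)
    return best

def boost(xs, ys, n_rounds=10, init_weights=None):
    # AdaBoost-style loop; init_weights generalizes the uniform start,
    # which is the degree of freedom BAS exploits.
    n = len(xs)
    if init_weights is None:
        ws = [1.0 / n] * n
    else:
        total = sum(init_weights)
        ws = [w / total for w in init_weights]
    ensemble = []
    for _ in range(n_rounds):
        threshold, polarity, err = fit_stump(xs, ys, ws)
        err = min(max(err, 1e-10), 1 - 1e-10)  # guard against log(0)
        alpha = 0.5 * math.log((1 - err) / err)
        ensemble.append((alpha, threshold, polarity))
        # Reweight: misclassified examples gain weight for the next round
        ws = [w * math.exp(-alpha * y * stump_predict(x, threshold, polarity))
              for x, y, w in zip(xs, ys, ws)]
        total = sum(ws)
        ws = [w / total for w in ws]
    return ensemble

def predict(ensemble, x):
    # Weighted vote of all stumps
    score = sum(a * stump_predict(x, t, p) for a, t, p in ensemble)
    return 1 if score >= 0 else -1
```

Passing a non-uniform `init_weights` (for instance, weights derived from prior knowledge about which examples are hard) changes which stump is fit first and how the ensemble develops, while `init_weights=None` recovers the usual AdaBoost behavior.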
