LocBoost is a boosting meta-algorithm that combines weak classifiers using a locality-based approach, proposed by Ron Meir, Ran El-Yaniv and Shai Ben-David of the Technion.
LocBoost classifies an instance by combining locally weighted mixtures of probabilistic classifiers; in other words, it combines a number of easy-to-compute classifiers, each of which is an "expert" in a designated subset of all the instances, into one strong classifier.
The algorithm is incremental, in the sense that each
iteration takes the previous results and improves them.
The basic flow of each iteration is:
1. Try to locate the area that contains the most classification errors, and guess the weight that should be given to a classifier that is an expert in this area.
2. Using the parameters of this area and the weight of the classifiers as an initial guess, maximize the log-likelihood of the combined classifier that will be created after adding the new weak classifier to the ones already used.
3. Add the new weak classifier, with the optimized parameters, to the set of classifiers already in use.
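The steps above can be sketched in code. This is an illustrative toy version on 1-D data, not the paper's exact procedure: the Gaussian locality function, the constant expert, and the grid search over the mixing weight are all simplifying assumptions standing in for the paper's parametric localization and log-likelihood optimization.

```python
import numpy as np

def gate(x, center, width):
    # Localized weight function: a Gaussian bump around `center`.
    # The Gaussian form is an assumption of this sketch; the paper
    # allows other parametric families of localization functions.
    return np.exp(-((x - center) ** 2) / (2 * width ** 2))

def locboost_iteration(x, y, p_prev, width=0.5):
    """One LocBoost-style iteration on 1-D data (illustrative sketch).

    x      : (n,) inputs
    y      : (n,) labels in {0, 1}
    p_prev : (n,) current P(y=1 | x) of the combined classifier
    Returns the updated P(y=1 | x) after adding one local expert.
    """
    # Step 1: locate the region with the most classification errors.
    # The error-weighted mean of the inputs serves as the initial
    # guess for the new expert's region.
    err = np.abs(y - p_prev)
    center = np.sum(err * x) / np.sum(err)
    g = gate(x, center, width)

    # The new weak expert: here simply the best constant probabilistic
    # classifier for the located region (any easy-to-compute
    # classifier could play this role).
    expert_p = np.clip(np.sum(g * y) / np.sum(g), 1e-6, 1 - 1e-6)

    # Step 2: choose the mixing weight `alpha` that maximizes the
    # log-likelihood of the combined classifier; a crude grid search
    # stands in for the paper's optimization step.
    best_alpha, best_ll = 0.0, -np.inf
    for alpha in np.linspace(0.0, 1.0, 101):
        mix = alpha * g
        p = np.clip((1 - mix) * p_prev + mix * expert_p, 1e-9, 1 - 1e-9)
        ll = np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))
        if ll > best_ll:
            best_alpha, best_ll = alpha, ll

    # Step 3: the new expert joins the combination with its weight.
    mix = best_alpha * g
    return (1 - mix) * p_prev + mix * expert_p
```

Because the grid search includes alpha = 0 (keep the previous classifier unchanged), each iteration can only improve, never worsen, the training log-likelihood.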
A classification for an instance is created by combining the individual classifications of each weak classifier that was part of the process, multiplied by their respective (normalized) weights.
The classification is confidence-based: the decision for an instance is given as the probability that the instance belongs to a class.
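This weighted combination can be sketched as follows. The function names and the list-of-callables interface are hypothetical, chosen only to illustrate how normalized locality weights mix the experts' probabilistic outputs into one confidence value.

```python
import numpy as np

def combined_predict(x, experts, gates):
    """Combine weak probabilistic classifiers with normalized local weights.

    experts : list of callables, each returning its P(y=1 | x) estimate.
    gates   : list of callables, each giving its expert's (unnormalized)
              locality weight at x.  Both are illustrative, assumed APIs.
    Returns the combined confidence P(y=1 | x).
    """
    w = np.array([g(x) for g in gates], dtype=float)
    w /= w.sum()                        # normalize the locality weights
    p = np.array([e(x) for e in experts], dtype=float)
    return float(np.dot(w, p))          # weighted mixture of confidences
```

For example, with one expert confident near x = 1 and another near x = -1, a query at x = 1 is dominated by the first expert's output, while a query at x = -1 is dominated by the second's.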
Much more about LocBoost, and the flow of each phase of the iteration, can be found in the paper.