Why do we use gradient boosting?
Gradient boosting is a type of machine learning boosting. It relies on the intuition that the best possible next model, when combined with the previous models, minimizes the overall prediction error. The key idea is to set the target outcomes for this next model so as to minimize that error.
Boosting is an ensemble learning method used to reduce bias error during prediction.
It first defines a loss function (i.e., a prediction error such as L2 loss or hinge loss) to be minimized during model fitting. It then adaptively fits a sequence of models, each time paying more attention to the data points poorly predicted by the previous model: a new weak learner is fit to minimize the loss function and is added to the previous model with a weight. The final model produced by boosting is a weighted sum of weak learners.
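To make the re-weighting idea concrete, here is a minimal sketch of AdaBoost, a classic boosting algorithm, written from scratch with NumPy. It uses decision stumps as weak learners on 1-D data; the function and variable names are illustrative, not from any particular library.

```python
import numpy as np

def fit_stump(X, y, w):
    # Exhaustively search thresholds and polarities for the decision stump
    # that minimizes the *weighted* classification error under weights w.
    best = None
    for thr in np.unique(X):
        for polarity in (1, -1):
            pred = np.where(polarity * (X - thr) >= 0, 1, -1)
            err = np.sum(w[pred != y])
            if best is None or err < best[0]:
                best = (err, thr, polarity)
    return best

def adaboost(X, y, n_rounds=5):
    n = len(X)
    w = np.full(n, 1.0 / n)            # start with uniform sample weights
    ensemble = []                      # list of (alpha, threshold, polarity)
    for _ in range(n_rounds):
        err, thr, pol = fit_stump(X, y, w)
        err = max(err, 1e-10)          # guard against division by zero
        alpha = 0.5 * np.log((1 - err) / err)  # weight of this weak learner
        pred = np.where(pol * (X - thr) >= 0, 1, -1)
        w *= np.exp(-alpha * y * pred) # up-weight the misclassified points
        w /= w.sum()
        ensemble.append((alpha, thr, pol))
    return ensemble

def adaboost_predict(ensemble, X):
    # The final model is a weighted (alpha) sum of weak learners.
    score = sum(a * np.where(p * (X - t) >= 0, 1, -1) for a, t, p in ensemble)
    return np.sign(score)
```

The `w *= np.exp(...)` line is the "pay more attention" step: points the current stump gets wrong receive larger weights for the next round.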
Gradient boosting is a type of boosting that can solve both regression and classification problems. It aims to produce a weighted sum of weak learners by iteratively adding them with a method similar to gradient descent. At each iteration, gradient boosting first fits the weak learner to the negative of the gradient of the loss function, then finds the optimal step size, and finally adds the weak learner to the previous model (i.e., combines it with the previous weak learners) with a weight; the weights of the previous weak learners do not change during this process.
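This loop can be sketched from scratch for regression with the L2 loss, where the negative gradient is simply the residual vector. The sketch below uses regression stumps as weak learners and a fixed shrinkage factor in place of a full line search for the step size; all names are illustrative.

```python
import numpy as np

def fit_residual_stump(X, r):
    # Regression stump fit to the residuals r: choose the split that
    # minimizes squared error; each leaf predicts its mean residual.
    best = None
    for thr in np.unique(X)[:-1]:
        left, right = r[X <= thr], r[X > thr]
        sse = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        if best is None or sse < best[0]:
            best = (sse, thr, left.mean(), right.mean())
    return best[1:]                    # (threshold, left value, right value)

def stump_predict(stump, X):
    thr, lval, rval = stump
    return np.where(X <= thr, lval, rval)

def gradient_boost(X, y, n_iter=100, lr=0.3):
    f0 = y.mean()                      # start from a constant model
    F = np.full(len(X), f0)
    stumps = []
    for _ in range(n_iter):
        residuals = y - F              # negative gradient of 1/2 * (y - F)^2
        stump = fit_residual_stump(X, residuals)
        F = F + lr * stump_predict(stump, X)  # earlier learners keep their weights
        stumps.append(stump)
    return f0, lr, stumps

def gb_predict(model, X):
    f0, lr, stumps = model
    return f0 + sum(lr * stump_predict(s, X) for s in stumps)
```

Note that each new stump is added on top of the frozen ensemble; only the current prediction `F` is updated, matching the description above.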
Because each weak learner is fit to the negative of the gradient of the cost function, the model is modified in a direction that reduces the loss; the gradient descent procedure is then repeated on the updated model. The process stops after a given number of iterations, or when the accuracy of the prediction on a validation set no longer shows significant improvement. Gradient boosting applies to a wide range of loss functions, as long as the loss function is differentiable, and it works with weak learners such as classification and regression trees; using decision trees as the weak learner results in gradient tree boosting.
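Assuming scikit-learn is available, gradient tree boosting with validation-based early stopping can be sketched as follows; the dataset here is synthetic and the hyperparameter values are arbitrary choices for illustration.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.RandomState(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X).ravel() + rng.normal(scale=0.1, size=200)

# validation_fraction / n_iter_no_change enable early stopping: training
# halts once the held-out score stops improving for n_iter_no_change rounds.
model = GradientBoostingRegressor(
    n_estimators=500,        # upper bound on boosting iterations
    learning_rate=0.1,
    max_depth=2,             # shallow trees as weak learners
    validation_fraction=0.2,
    n_iter_no_change=5,
).fit(X, y)
```

After fitting, `model.n_estimators_` reports how many trees were actually built, which will be below the `n_estimators` cap whenever early stopping triggers.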