Bagging and Boosting Methods

# Introduction

The previous labs each dealt with a single classifier. Every classifier has its own characteristics and works well on certain kinds of data; in practice, however, the data at hand may not suit any single one of them, and applying one classifier alone can give low classification accuracy. Ensemble learning was proposed to address this problem: by combining multiple weak classifiers, it improves classification performance. This chapter explains some of the most classic ensemble-learning algorithms in detail: Bagging Tree and Random Forest on the Bagging side, and AdaBoost and Gradient Boosting Decision Tree (GBDT) on the Boosting side.

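As a minimal sketch of the claim above, the snippet below trains a single decision tree alongside the four ensembles named in this chapter and compares their test accuracy. It assumes Python with scikit-learn, which this introduction does not specify; the synthetic dataset and all parameter values are illustrative assumptions, not the lab's actual setup.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import (BaggingClassifier, RandomForestClassifier,
                              AdaBoostClassifier, GradientBoostingClassifier)

# Synthetic binary classification data (illustrative only).
X, y = make_classification(n_samples=1000, n_features=20, n_informative=8,
                           random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3,
                                                    random_state=0)

# A single decision tree versus the ensembles covered in this chapter.
# BaggingClassifier and AdaBoostClassifier use their default decision-tree
# base estimators here.
models = {
    "Single decision tree": DecisionTreeClassifier(random_state=0),
    "Bagging tree": BaggingClassifier(n_estimators=100, random_state=0),
    "Random forest": RandomForestClassifier(n_estimators=100, random_state=0),
    "AdaBoost": AdaBoostClassifier(n_estimators=100, random_state=0),
    "GBDT": GradientBoostingClassifier(n_estimators=100, random_state=0),
}

for name, model in models.items():
    model.fit(X_train, y_train)
    print(f"{name}: test accuracy = {model.score(X_test, y_test):.3f}")
```

On data like this, the ensembles typically score noticeably higher than the single tree, which is the motivation for the methods studied in this chapter.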
