771: Gradient Boosting: XGBoost, LightGBM and CatBoost, with Kirill Eremenko

Published: April 2, 2024, 11 a.m.

Kirill Eremenko joins Jon Krohn for another exclusive, in-depth teaser for a new course just released on the SuperDataScience platform, "Machine Learning Level 2". Kirill walks listeners through why decision trees and random forests are so valuable for businesses, and he offers hands-on walkthroughs of the three leading gradient-boosting algorithms today: XGBoost, LightGBM, and CatBoost.

This episode is brought to you by Ready Tensor, where innovation meets reproducibility, and by Data Universe, the out-of-this-world data conference. Interested in sponsoring a SuperDataScience Podcast episode? Visit passionfroot.me/superdatascience for sponsorship information.

In this episode you will learn:
• All about decision trees [09:17]
• All about ensemble models [21:43]
• All about AdaBoost [36:47]
• All about gradient boosting [45:52]
• Gradient boosting for classification problems [59:54]
• Advantages of XGBoost [1:03:51]
• LightGBM [1:17:06]
• CatBoost [1:32:07]

Additional materials: www.superdatascience.com/771
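
For listeners who want to try the three libraries discussed in the episode, below is a minimal sketch (not taken from the course or the episode) that trains each one on the same synthetic dataset. The dataset shape and hyperparameters are illustrative assumptions, not recommendations.

# Minimal sketch: fitting XGBoost, LightGBM, and CatBoost classifiers on one
# synthetic dataset. Hyperparameters here are placeholder values for illustration.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score
from xgboost import XGBClassifier
from lightgbm import LGBMClassifier
from catboost import CatBoostClassifier

# Synthetic binary-classification data (assumed sizes, purely for demonstration)
X, y = make_classification(n_samples=2000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42
)

models = {
    "XGBoost": XGBClassifier(n_estimators=200, learning_rate=0.1, max_depth=4),
    "LightGBM": LGBMClassifier(n_estimators=200, learning_rate=0.1, num_leaves=31),
    "CatBoost": CatBoostClassifier(iterations=200, learning_rate=0.1, depth=4, verbose=0),
}

for name, model in models.items():
    model.fit(X_train, y_train)   # each library builds a boosted ensemble of trees
    preds = model.predict(X_test)
    print(f"{name}: test accuracy = {accuracy_score(y_test, preds):.3f}")

Each classifier exposes the same fit/predict interface, so swapping between the three libraries is largely a matter of changing the constructor and its parameters; the episode covers when each library's internal design (e.g. LightGBM's leaf-wise growth, CatBoost's categorical-feature handling) makes it the better choice.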