Applied Math And Analysis Seminar
Monday, January 28, 2019, 12:00pm, LSRC D106
Haihao Lu (MIT)
Gradient Boosting Machines: Structural Insights and Improved Algorithms
Abstract:
The gradient boosting machine (GBM) is one of the most successful supervised learning algorithms and has been the dominant method in many data science competitions, including those hosted on Kaggle and the KDD Cup. In spite of this practical success, there remains a large gap between practice and theoretical understanding. In this line of research, we show that GBM can be interpreted both as a greedy coordinate descent method in the coefficient space and as a mirror descent method in the "pseudo-residual" space. Armed with this structural insight, we develop two new algorithms for classification in the context of GBM: (i) the Random-then-Greedy Gradient Boosting Machine (RtGBM), which lowers the cost per iteration and achieves improved performance in both theory and practice; and (ii) the Accelerated Gradient Boosting Machine (AGBM), which attains the faster convergence of accelerated first-order schemes, again both in theory and in practice. Both algorithms are currently being incorporated by Google into the TensorFlow Boosted Trees software.
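
As a concrete illustration of the random-then-greedy idea, below is a minimal sketch in Python for squared-loss regression, where the pool of weak learners consists of decision stumps on single features: each boosting round samples a random subset of features and then greedily keeps the stump that best fits the current pseudo-residuals. All names (rt_gbm, fit_stump, n_candidates, shrinkage) and the squared-loss setup are illustrative assumptions, not the authors' implementation or the TensorFlow Boosted Trees code.

    # Sketch of a random-then-greedy boosting round: sample a random subset of
    # weak learners (here, stumps on single features), then greedily pick the
    # one that best fits the pseudo-residuals. Hypothetical names throughout.
    import numpy as np

    def fit_stump(x, r):
        """Fit a one-split decision stump to pseudo-residuals r along feature x.
        Returns (threshold, left_value, right_value, sum_squared_error)."""
        best = None
        for thr in np.unique(x):
            left, right = r[x <= thr], r[x > thr]
            if len(left) == 0 or len(right) == 0:
                continue
            lv, rv = left.mean(), right.mean()
            sse = ((left - lv) ** 2).sum() + ((right - rv) ** 2).sum()
            if best is None or sse < best[3]:
                best = (thr, lv, rv, sse)
        return best

    def rt_gbm(X, y, n_rounds=50, n_candidates=3, shrinkage=0.1, seed=0):
        """Random-then-greedy boosting for squared loss: each round, sample a
        random subset of features, fit a stump on each, and keep the best."""
        rng = np.random.default_rng(seed)
        n, d = X.shape
        pred = np.zeros(n)
        ensemble = []
        for _ in range(n_rounds):
            r = y - pred  # pseudo-residuals for squared loss
            candidates = rng.choice(d, size=min(n_candidates, d), replace=False)
            best_j, best_stump = None, None
            for j in candidates:  # greedy step over the random subset
                stump = fit_stump(X[:, j], r)
                if stump and (best_stump is None or stump[3] < best_stump[3]):
                    best_j, best_stump = j, stump
            if best_stump is None:  # no valid split among candidates
                continue
            thr, lv, rv, _ = best_stump
            pred += shrinkage * np.where(X[:, best_j] <= thr, lv, rv)
            ensemble.append((best_j, thr, shrinkage * lv, shrinkage * rv))
        return ensemble, pred

    # Toy usage: y depends on the first feature only.
    rng = np.random.default_rng(1)
    X = rng.normal(size=(200, 10))
    y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)
    _, pred = rt_gbm(X, y)
    print("train MSE:", float(((y - pred) ** 2).mean()))

In this sketch, n_candidates interpolates between purely random selection (subset of size 1) and fully greedy coordinate descent (all features), which is how the random-then-greedy rule lowers the per-iteration cost relative to scanning the entire pool of weak learners.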
