Historical Gradient Boosting Machine

13 pages • Published: September 17, 2018

Abstract

We introduce the Historical Gradient Boosting Machine with the objective of improving the convergence speed of gradient boosting. Our approach is analyzed from the perspective of numerical optimization in function space and considers gradients from previous steps, which have rarely been exploited by traditional methods. To better exploit the guiding effect of historical gradient information, we incorporate both the accumulated previous gradients and the current gradient into the computation of the descent direction in function space. By fitting the descent direction given by our algorithm, the weak learner enjoys the advantages of historical gradients, which mitigate the greediness of the steepest descent direction. Experimental results show that our approach improves the convergence speed of gradient boosting without a significant decrease in accuracy.

Keyphrases: decision tree, gradient boosting machine, historical gradient

In: Daniel Lee, Alexander Steen and Toby Walsh (editors). GCAI-2018. 4th Global Conference on Artificial Intelligence, vol 55, pages 68-80.
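The abstract describes the core idea (blending accumulated past gradients with the current gradient to form the descent direction that each weak learner fits) but not the exact update rule. The Python sketch below illustrates one plausible momentum-style reading of that idea under a squared-error loss; the `beta` decay factor and the use of scikit-learn's `DecisionTreeRegressor` as the weak learner are assumptions made for illustration, not the paper's actual algorithm.

```python
# Illustrative sketch only: the exact historical-gradient update of the paper
# is not given in the abstract, so the momentum-style accumulation below
# (parameter `beta` and the `accumulated` buffer) is an assumption.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def historical_gradient_boost(X, y, n_rounds=100, lr=0.1, beta=0.9, max_depth=3):
    """Squared-error boosting where each tree fits a blend of the
    current negative gradient and the accumulated previous gradients."""
    pred = np.full(len(y), y.mean())     # initial constant model
    accumulated = np.zeros(len(y))       # running accumulation of past gradients
    trees = []
    for _ in range(n_rounds):
        grad = y - pred                              # negative gradient of squared loss
        accumulated = beta * accumulated + grad      # fold in historical gradients
        tree = DecisionTreeRegressor(max_depth=max_depth)
        tree.fit(X, accumulated)                     # weak learner fits the blended direction
        pred += lr * tree.predict(X)
        trees.append(tree)
    return trees, y.mean(), lr

def boosted_predict(trees, init, lr, X):
    out = np.full(X.shape[0], init)
    for tree in trees:
        out += lr * tree.predict(X)
    return out

# Small usage example on synthetic data.
rng = np.random.RandomState(0)
X = rng.rand(200, 5)
y = 2.0 * X[:, 0] + np.sin(3 * X[:, 1])
trees, init, lr = historical_gradient_boost(X, y)
print("train MSE:", np.mean((boosted_predict(trees, init, lr, X) - y) ** 2))
```

Setting `beta = 0` in this sketch reduces the loop to ordinary gradient boosting, which makes it easy to isolate the effect of the historical term when experimenting.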