XGBoost stands for eXtreme Gradient Boosting. A big brother of the earlier AdaBoost, XGBoost is a supervised learning algorithm that uses an ensemble of gradient-boosted decision trees. For those unfamiliar with boosting algorithms, here's a 2-minute explanation video and a written tutorial. Although XGBoost often performs well in predictive tasks, the training process can be quite time-consuming, as with other tree ensemble methods such as random forest.
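To make this concrete, here is a minimal sketch of fitting such a gradient-boosted tree ensemble with the xgboost R package, using the agaricus (mushroom) demo data that ships with the package; the nrounds value is an arbitrary illustration, not a tuned setting.

library(xgboost)

# bundled demo data: classify mushrooms as edible vs. poisonous
data(agaricus.train, package = "xgboost")

# fit a small gradient-boosted tree ensemble
bst <- xgboost(
  data      = agaricus.train$data,   # sparse feature matrix
  label     = agaricus.train$label,  # 0/1 outcome
  nrounds   = 10,                    # boosting iterations (arbitrary here)
  objective = "binary:logistic",     # log-loss objective for a binary target
  verbose   = 0
)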
In a recent blog post, Analytics Vidhya compares the inner workings as well as the predictive accuracy of XGBoost to an up-and-coming boosting algorithm: LightGBM. The post demonstrates a stepwise implementation of both algorithms in Python. The main conclusion of the comparison: although the algorithms are comparable in terms of predictive performance, LightGBM is much faster to train. With continuously increasing data volumes, LightGBM therefore seems the way forward.
Laurae also benchmarked LightGBM against XGBoost on a Bosch dataset, and her results show that, on average, LightGBM (with binning) is between 11x and 15x faster than XGBoost (without binning).

However, the differences shrink as more threads are used, due to thread inefficiencies: idle time increases because threads are not assigned their next task quickly enough.
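Both libraries expose the thread count as a parameter, so this scaling behaviour is easy to probe yourself. A rough sketch, reusing the agaricus demo data from the earlier example; the thread counts and nrounds are illustrative:

library(xgboost)
data(agaricus.train, package = "xgboost")

# refit the same model with increasing thread counts and time each run
for (threads in c(1, 2, 4, 8)) {
  elapsed <- system.time(
    xgboost(data = agaricus.train$data, label = agaricus.train$label,
            nrounds = 50, objective = "binary:logistic",
            nthread = threads, verbose = 0)
  )["elapsed"]
  cat(threads, "thread(s):", elapsed, "seconds\n")
}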
LightGBM is also available in R:
# install the R package straight from the LightGBM GitHub repository (requires devtools)
devtools::install_github("Microsoft/LightGBM", subdir = "R-package")
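Once installed, the R workflow closely mirrors xgboost's. A minimal sketch with the lightgbm package's own agaricus demo data; the parameter values are illustrative only:

library(lightgbm)

# the package ships the same agaricus demo data as xgboost
data(agaricus.train, package = "lightgbm")

# wrap features and labels in LightGBM's native data structure
dtrain <- lgb.Dataset(data = agaricus.train$data, label = agaricus.train$label)

# train a small boosted ensemble with a binary log-loss objective
model <- lgb.train(
  params  = list(objective = "binary", num_threads = 2),  # illustrative settings
  data    = dtrain,
  nrounds = 10
)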
Neil Schneider tested the three gradient boosting algorithms in R (GBM, xgboost, and lightGBM) and sums up their (dis)advantages:

GBM has no specific advantages, but its disadvantages include no early stopping, slower training, and decreased accuracy.

xgboost has proven successful on Kaggle and, though traditionally slower than lightGBM, its tree_method = 'hist' option (histogram binning) provides a significant speed-up (see the sketch after this list).

lightGBM offers training efficiency, low memory usage, high accuracy, parallel learning, corporate support, and scalability. However, its newness is its main disadvantage: there is little community support.
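Two of these points translate directly into code: histogram binning in xgboost is switched on with tree_method = 'hist', and early stopping (which GBM lacks) is a one-argument addition. A sketch using the classic xgboost R interface and its bundled demo data; the round cap and patience values are arbitrary illustrations:

library(xgboost)

# bundled demo data; replace with your own train/validation split
data(agaricus.train, package = "xgboost")
data(agaricus.test,  package = "xgboost")
dtrain <- xgb.DMatrix(agaricus.train$data, label = agaricus.train$label)
dvalid <- xgb.DMatrix(agaricus.test$data,  label = agaricus.test$label)

bst <- xgb.train(
  params = list(
    objective   = "binary:logistic",
    tree_method = "hist"          # histogram binning: the speed-up noted above
  ),
  data                  = dtrain,
  nrounds               = 500,    # generous cap; early stopping trims it
  watchlist             = list(valid = dvalid),
  early_stopping_rounds = 10      # stop once validation loss stalls for 10 rounds
)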