LightGBM vs XGBoost
Learn the differences between two popular gradient boosting models, LightGBM and XGBoost.
1. What is gradient boosting?
2. What is XGBoost?
3. What is LightGBM?
4. What are the similarities?
5. What are the differences?
6. When should I use them?
7. Which is better?
What is gradient boosting?
Both of these models are gradient boosting models, so let's have a quick catch-up on what this means.
Gradient boosting is a machine learning technique in which many weak learners, typically shallow decision trees, are trained iteratively and combined into a single highly performant model. The trees are trained sequentially: each new tree fits the errors left by the ensemble so far, gradually minimising the loss function.
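To make this concrete, here is a minimal from-scratch sketch of gradient boosting for squared-error regression using scikit-learn decision trees. It is illustrative only; the dataset, learning rate, and tree depth are arbitrary choices, not anything either library prescribes:

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.tree import DecisionTreeRegressor

X, y = make_regression(n_samples=500, n_features=10, noise=10.0, random_state=0)

learning_rate = 0.1
prediction = np.full(y.shape, y.mean())   # start from a constant baseline
trees = []

for _ in range(100):
    residuals = y - prediction            # errors of the ensemble so far
    tree = DecisionTreeRegressor(max_depth=3, random_state=0)
    tree.fit(X, residuals)                # next weak learner targets the residuals
    prediction += learning_rate * tree.predict(X)
    trees.append(tree)

print("final training MSE:", np.mean((y - prediction) ** 2))
```

Each pass shrinks the remaining error a little, which is exactly the sequential, loss-minimising behaviour described above.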

What is XGBoost?
XGBoost is a gradient boosting machine learning algorithm that can be used for classification and regression problems.
Like all gradient boosting models, it is an ensemble model that trains a series of decision trees sequentially, but it grows each tree level-wise (i.e. horizontally): every level of a tree is filled in before the tree grows deeper. With this strategy each individual tree stays fairly shallow, and by default the model compensates with a large number of trees.
XGBoost is designed to be a general all-purpose gradient boosting model which performs well out-of-the-box for most datasets.
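As a quick illustration, here is a minimal sketch using XGBoost's scikit-learn wrapper. The parameter values shown roughly match the library defaults and simply make the "many fairly shallow, level-wise trees" setup explicit; the dataset is an arbitrary choice:

```python
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Many trees, each capped at a modest depth (level-wise growth)
model = XGBClassifier(n_estimators=100, max_depth=6)
model.fit(X_train, y_train)
print("test accuracy:", model.score(X_test, y_test))
```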
What is LightGBM?
LightGBM is an open-source gradient boosting framework developed by Microsoft for classification and regression problems.
It is also an ensemble method that trains a series of decision trees sequentially, but it grows them leaf-wise (i.e. vertically): at each step it splits the leaf with the largest loss reduction, so trees end up with many leaves while the total number of trees stays relatively low. This approach creates a highly performant boosting model whilst being fast to train.
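Here is the equivalent minimal sketch using LightGBM's scikit-learn wrapper; num_leaves=31 and max_depth=-1 (unbounded depth) are the library defaults, shown explicitly to highlight the leaf-wise strategy described above:

```python
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from lightgbm import LGBMClassifier

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Leaf-wise growth: cap the number of leaves, leave depth unbounded (-1)
model = LGBMClassifier(n_estimators=100, num_leaves=31, max_depth=-1)
model.fit(X_train, y_train)
print("test accuracy:", model.score(X_test, y_test))
```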
What are the similarities between LightGBM and XGBoost?
- Model framework. Both use gradient boosting to train many weak decision trees in an ensemble model
- Performance. Both models perform very well out of the box with default parameters on most datasets
- Use case. Both can be used for classification and regression, and both expose scikit-learn compatible interfaces (see the sketch after this list)
- Datasets. Both are designed to scale to large datasets
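One practical consequence of these similarities is that the two models are drop-in replacements for one another. A minimal sketch, where the dataset and cross-validation setup are arbitrary choices:

```python
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from xgboost import XGBClassifier
from lightgbm import LGBMClassifier

X, y = load_breast_cancer(return_X_y=True)

# Identical evaluation loop for both models, thanks to the shared interface
for model in (XGBClassifier(), LGBMClassifier()):
    scores = cross_val_score(model, X, y, cv=5)
    print(type(model).__name__, round(scores.mean(), 3))
```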
What are the differences between LightGBM and XGBoost?
- Training time. LightGBM is often faster to train, thanks to its use of Gradient-based One-Side Sampling (GOSS) and Exclusive Feature Bundling (EFB)
- Memory use. LightGBM typically uses considerably less memory than XGBoost during training
- Tree structure. XGBoost grows its trees level-wise, whilst LightGBM grows them leaf-wise
- Overfitting. Because leaf-wise growth produces deeper trees, LightGBM can be more prone to overfitting the training dataset (the sketch after this list shows the usual controls)
- Hyper-parameter tuning. XGBoost exposes more parameters that can be tuned to improve performance further
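To make the tree-structure and overfitting points concrete, here is a short sketch of the parameters that control each growth strategy. The first two lines show the library defaults; the last line is an illustrative, not prescribed, regularised setting:

```python
from xgboost import XGBClassifier
from lightgbm import LGBMClassifier

# Level-wise: tree depth is the main cap (XGBoost default max_depth=6)
level_wise = XGBClassifier(max_depth=6)

# Leaf-wise: leaf count is the cap, depth is unbounded (LightGBM defaults)
leaf_wise = LGBMClassifier(num_leaves=31, max_depth=-1)

# The usual first defence against LightGBM overfitting: fewer leaves
# and/or an explicit depth limit (values here are illustrative)
leaf_wise_regularised = LGBMClassifier(num_leaves=15, max_depth=6)
```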
When should I use XGBoost or LightGBM?
The choice between the two models depends on your project constraints:
- Use LightGBM when memory availability is a constraint or you need a shorter training time, whether that's due to a large dataset or the need to have fast feedback.
- Use XGBoost if memory and training time are not limiting factors for your project.
XGBoost vs LightGBM, which is better?
XGBoost is generally considered the better choice of the two, and is likely the most commonly used gradient boosting model in real-world use cases. This is largely due to its strong performance out of the box, its resistance to overfitting, its many parameter tuning options, and the wide array of documentation available online.
Related articles
What is a baseline machine learning model?
Model choice
XGBoost vs Random Forest, which is better?
Catboost vs XGBoost, which is better?
Catboost vs LightGBM, which is better?