Gradient Descent - What is the difference between XGBoost and gradient descent?


Exceeding Gradient Boosting with XGBoost. XGBoost is a more regularized variant of Gradient Boosting: it improves model generalization through advanced regularization (both L1 and L2 penalties on the tree weights). Compared with plain Gradient Boosting, XGBoost typically produces better results. It is also extremely fast and can be trained in parallel across clusters.
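To make the distinction concrete, here is a minimal sketch of the core gradient boosting idea: each new tree is fit to the residuals (the negative gradient of squared-error loss) of the current ensemble. XGBoost follows this same scheme but adds regularization on top of it, exposed through its `reg_alpha` (L1) and `reg_lambda` (L2) parameters. The synthetic data, tree depth, and round count below are illustrative choices, not prescriptions.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + rng.normal(0, 0.1, size=200)

# Gradient boosting with squared-error loss: start from the mean
# prediction, then repeatedly fit a shallow tree to the residuals
# (the negative gradient) and add a damped version of its output.
pred = np.full_like(y, y.mean())
learning_rate = 0.1
for _ in range(50):
    residuals = y - pred
    tree = DecisionTreeRegressor(max_depth=2).fit(X, residuals)
    pred += learning_rate * tree.predict(X)

mse_boosted = np.mean((y - pred) ** 2)
mse_single = np.mean(
    (y - DecisionTreeRegressor(max_depth=2).fit(X, y).predict(X)) ** 2
)
print(mse_boosted < mse_single)  # the boosted ensemble fits far better
```

In XGBoost the analogous model would be `xgboost.XGBRegressor(n_estimators=50, learning_rate=0.1, reg_alpha=..., reg_lambda=...)`, where the two regularization terms shrink leaf weights and help prevent the overfitting that an unregularized boosting loop like the one above is prone to.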