XGBoost (Extreme Gradient Boosting) is an optimized, open-source implementation of gradient boosting machines. Unlike a single decision tree, XGBoost builds an ensemble by iteratively adding trees that correct the errors of the trees before them, a procedure equivalent to gradient descent in function space. Each new tree fits the gradient of the loss with respect to the current predictions, which for squared-error loss is simply the residuals (prediction errors) of the accumulated ensemble, so accuracy improves round by round. The implementation prioritizes speed and regularization, featuring tree pruning, GPU acceleration, parallel processing, and built-in handling of missing values. XGBoost excels on tabular, structured data with millions of records and dozens to hundreds of features. It is widely used in finance (credit scoring, fraud detection), e-commerce (ranking, recommendation), healthcare (risk prediction), and competitive machine learning (Kaggle competitions).
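The residual-fitting loop described above can be sketched in a few dozen lines of plain Python. This is an illustrative toy using one-split "stumps" and squared-error loss, not XGBoost itself: the real library adds second-order gradients, regularization terms, pruning, and far more sophisticated split finding. All function names here (`fit_stump`, `boost`) are invented for the sketch.

```python
def fit_stump(xs, residuals):
    """Find the single threshold split that minimizes squared error.

    A stand-in for a full decision tree: real gradient boosting grows
    deeper trees, but the boosting logic is the same.
    """
    best = None
    for t in sorted(set(xs)):
        left = [r for x, r in zip(xs, residuals) if x <= t]
        right = [r for x, r in zip(xs, residuals) if x > t]
        if not left or not right:
            continue
        lm = sum(left) / len(left)    # leaf value: mean residual on each side
        rm = sum(right) / len(right)
        err = (sum((r - lm) ** 2 for r in left)
               + sum((r - rm) ** 2 for r in right))
        if best is None or err < best[0]:
            best = (err, t, lm, rm)
    _, t, lm, rm = best
    return lambda x: lm if x <= t else rm

def boost(xs, ys, n_rounds=20, lr=0.5):
    """Additive ensemble: each new stump fits the current residuals.

    For squared-error loss, the negative gradient of the loss with
    respect to the predictions *is* the residual vector, which is why
    'fit the residuals' and 'gradient descent' coincide here.
    """
    pred = [0.0] * len(xs)
    stumps = []
    for _ in range(n_rounds):
        residuals = [y - p for y, p in zip(ys, pred)]
        stump = fit_stump(xs, residuals)
        stumps.append(stump)
        # Shrink each stump's contribution by the learning rate.
        pred = [p + lr * stump(x) for p, x in zip(pred, xs)]
    return lambda x: sum(lr * s(x) for s in stumps)

# Toy data: y is a step function of x.
xs = [1, 2, 3, 4, 5, 6, 7, 8]
ys = [1.0, 1.0, 1.0, 1.0, 5.0, 5.0, 5.0, 5.0]
model = boost(xs, ys)
```

After 20 rounds the ensemble's predictions converge close to the targets, because each stump removes a fixed fraction of the remaining residual. With the actual library, the equivalent call would be along the lines of `xgboost.XGBRegressor().fit(X, y)`, with the ensemble mechanics handled internally.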