
Hands-On Gradient Boosting with XGBoost and scikit-learn

"I used only XGBoost (tried others but none of them performed well enough to end up in my ensemble)."
– Qingchen Wang, Kaggle Winner
(https://www.cnblogs.com/yymn/p/4847130.html)
In this section, we will investigate Kaggle competitions: a brief history of how they arose, how they are structured, and the importance of a hold-out/test set as distinguished from a validation set.
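The distinction between a hold-out/test set and a validation set can be sketched with scikit-learn's `train_test_split`. The data below is a toy array invented for illustration; the two-stage split itself is a standard pattern, not a procedure taken from this section.

```python
import numpy as np
from sklearn.model_selection import train_test_split

# Toy data: 50 samples with 2 features (hypothetical, for illustration only)
X = np.arange(100).reshape(50, 2)
y = np.arange(50)

# First split: carve off the hold-out/test set, which stays untouched
# until the final evaluation of the chosen model.
X_trainval, X_test, y_trainval, y_test = train_test_split(
    X, y, test_size=0.2, random_state=2)

# Second split: carve a validation set out of the remaining data;
# this set is used repeatedly for model selection and hyperparameter tuning.
X_train, X_val, y_train, y_val = train_test_split(
    X_trainval, y_trainval, test_size=0.25, random_state=2)

print(len(X_train), len(X_val), len(X_test))  # 30 10 10
```

Because the validation set is consulted many times during tuning, scores on it drift optimistic; the hold-out/test set, seen only once, gives the honest estimate of generalization.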
XGBoost built its reputation as the leading machine learning algorithm through its unparalleled success in Kaggle competitions. It has often won outright, and it has also appeared in winning ensembles alongside deep learning models such as neural networks. A sample list of XGBoost Kaggle competition winners appears on the Distributed (Deep) Machine Learning Community web page at https://github.com/dmlc/xgboost/tree/master/demo#machine-learning-challenge-winning-solutions. For...