Hands-On Gradient Boosting with XGBoost and scikit-learn
It's time to combine all the components of this chapter to improve upon the 78% score obtained through cross-validation.
As you know, there is no one-size-fits-all approach to hyperparameter fine-tuning. One approach is to input all hyperparameter ranges into RandomizedSearchCV. A more systematic approach is to tackle hyperparameters one at a time, using the best results for subsequent iterations. All approaches have advantages and limitations. Regardless of strategy, it's essential to try multiple variations and to make adjustments as the data comes in.
Using a systematic approach, we add one hyperparameter at a time, aggregating results along the way.
Even though an n_estimators value of 2 gave the best result, it's worth trying a range with the grid_search function, which uses cross-validation:
grid_search(params={'n_estimators':[2, 25, 50, 75, 100]})
The output is as...