The base learner dart is similar to gbtree in the sense that both are gradient boosted trees. The primary difference is that dart removes trees (a technique called dropout) during each round of boosting. In this section, we will apply and compare the base learner dart to other base learners in regression and classification problems. Let's see how dart performs on the Diabetes dataset:
First, redefine X and y using load_diabetes as before (importing it if you have not already):

from sklearn.datasets import load_diabetes

X, y = load_diabetes(return_X_y=True)
To use dart as the XGBoost base learner, set the XGBRegressor parameter booster='dart' inside the regression_model function:

regression_model(XGBRegressor(booster='dart', objective='reg:squarederror'))
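The regression_model function was defined earlier in the chapter. If you are reading this section on its own, the following is a minimal sketch of what such a helper might look like, assuming it cross-validates the model and returns the mean root mean squared error; the cv=5 fold count here is an assumption, not taken from the text:

import numpy as np
from sklearn.model_selection import cross_val_score
from xgboost import XGBRegressor

def regression_model(model):
    # Sketch of the helper assumed above: score with negative MSE
    # (scikit-learn's convention) over 5-fold cross-validation
    scores = cross_val_score(model, X, y,
                             scoring='neg_mean_squared_error', cv=5)
    # Flip the sign and take the square root to get RMSE per fold
    rmse = np.sqrt(-scores)
    # Report the mean RMSE across folds
    return rmse.mean()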
The score is as follows:

65.96444746130739

The dart base learner gives the same result as the gbtree base learner down to two decimal places. The similarity of results is on account of the small dataset and the success of the gbtree...
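Note that dart also exposes dropout-specific hyperparameters that gbtree does not, such as rate_drop (the fraction of trees dropped each boosting round) and one_drop (which forces at least one tree to be dropped). As an illustrative sketch only, with arbitrary values not taken from the text:

# Illustrative only: dart's dropout behavior can be tuned via
# rate_drop, one_drop, sample_type, normalize_type, and skip_drop.
# The values below are arbitrary examples, not recommendations.
regression_model(XGBRegressor(booster='dart',
                              objective='reg:squarederror',
                              rate_drop=0.1,  # drop 10% of trees per round
                              one_drop=1))    # always drop at least one tree

On a dataset this small, dropout may have little effect, which is consistent with dart matching gbtree here.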