First steps with XGBoost (implementation in less than 5 minutes)

Fernando Marcos Wittmann
2 min readNov 3, 2018

In this post, we will implement an XGBoost classifier and compare it with other scikit-learn estimators. I want to keep this post as short and objective as possible. You will first need to install XGBoost. It takes just one line in the terminal:

$ pip install xgboost

Next, open IPython and copy and paste this code, which compares several sklearn classifiers (reference):
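If the embedded code does not render for you, here is a minimal sketch of the same idea, based on scikit-learn's classifier comparison example: fit a handful of classifiers on a toy dataset. For brevity it prints test accuracy instead of plotting decision boundaries, and it omits the heavier estimators from the full example.

```python
# Minimal sketch of the sklearn classifier comparison: train several
# classifiers on a toy dataset and print each one's test accuracy.
from sklearn.datasets import make_moons
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier, AdaBoostClassifier
from sklearn.naive_bayes import GaussianNB

names = ["Nearest Neighbors", "Linear SVM", "RBF SVM",
         "Decision Tree", "Random Forest", "AdaBoost", "Naive Bayes"]
classifiers = [
    KNeighborsClassifier(3),
    SVC(kernel="linear", C=0.025),
    SVC(gamma=2, C=1),
    DecisionTreeClassifier(max_depth=5),
    RandomForestClassifier(max_depth=5, n_estimators=10, max_features=1),
    AdaBoostClassifier(),
    GaussianNB(),
]

# Toy dataset, split into train and test sets
X, y = make_moons(noise=0.3, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.4, random_state=42)

for name, clf in zip(names, classifiers):
    clf.fit(X_train, y_train)
    print(f"{name}: {clf.score(X_test, y_test):.2f}")
```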

The output will look something like this:

Finally, replace the last estimator of the previous example with XGBClassifier. You only need to change three lines:

from xgboost import XGBClassifier  # <-- Import XGBoost

h = .02  # step size in the mesh
names = ["Nearest Neighbors", "Linear SVM", "RBF SVM", "Gaussian Process",
         "Decision Tree", "Random Forest", "Neural Net", "AdaBoost",
         "Naive Bayes", "XGBoost"]  # <-- Replace the last name
classifiers = [
    KNeighborsClassifier(3),
    SVC(kernel="linear", C=0.025),
    SVC(gamma=2, C=1),
    GaussianProcessClassifier(1.0 * RBF(1.0)),
    DecisionTreeClassifier(max_depth=5),
    RandomForestClassifier(max_depth=5, n_estimators=10, max_features=1),
    MLPClassifier(alpha=1),
    AdaBoostClassifier(),
    GaussianNB(),
    XGBClassifier()]  # <-- Replace QDA with XGBClassifier

The output should look similar to the following:

Done! XGBoost has been installed, initialized, and compared with some of scikit-learn's most popular estimators! :)


Fernando Marcos Wittmann

Head of Data Science @ Awari | Machine Learning Expert | E-Learning