XGBoost & LightGBM

Track your trees with W&B.

The wandb library includes special callbacks for the two most popular libraries for training gradient boosting machines: XGBoost and LightGBM. You can also use the generic logging features of Weights & Biases to track large experiments, like hyperparameter sweeps.
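
For anything the callbacks don't cover, the generic logging API takes only a few calls. The snippet below is a minimal sketch; the project name, config values, and metric names are placeholders for illustration, not part of the integration.

import wandb

# Start a run; the project name here is a placeholder
run = wandb.init(project="my-gbm-experiments", config={"learning_rate": 0.1})

# Log any metric you compute yourself, keyed by name
wandb.log({"val_auc": 0.93})

# Mark the run finished once training is done
wandb.finish()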

XGBoost
from wandb.xgboost import wandb_callback
import xgboost as xgb

...

# Add the W&B callback to log metrics for each boosting round
bst = xgb.train(param, train_data, num_round, watchlist,
                callbacks=[wandb_callback()])
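
Filling in the elided setup, an end-to-end run might look like the sketch below. The dataset, project name, and parameter values are assumptions chosen for illustration; note that the callback logs into the active run, so call wandb.init before training.

import wandb
import xgboost as xgb
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from wandb.xgboost import wandb_callback

# Placeholder dataset, chosen only to make the sketch runnable
X, y = load_breast_cancer(return_X_y=True)
X_train, X_val, y_train, y_val = train_test_split(X, y, random_state=0)

# Placeholder project name and parameters
wandb.init(project="xgb-demo", config={"max_depth": 3, "eta": 0.1})

dtrain = xgb.DMatrix(X_train, label=y_train)
dval = xgb.DMatrix(X_val, label=y_val)

# Metrics for each (DMatrix, name) pair in evals are reported per round
bst = xgb.train(
    {"max_depth": 3, "eta": 0.1, "objective": "binary:logistic", "eval_metric": "auc"},
    dtrain,
    num_boost_round=50,
    evals=[(dtrain, "train"), (dval, "validation")],
    callbacks=[wandb_callback()],
)

wandb.finish()
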
LightGBM
from wandb.lightgbm import wandb_callback
import lightgbm as lgb

...

# Add the W&B callback to log metrics for each validation set
gbm = lgb.train(params, train_data,
                num_boost_round=20,
                valid_sets=[valid_data],
                valid_names=['validation'],
                callbacks=[wandb_callback()])
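
As with XGBoost, the callback logs into whatever run is active, so the elided setup needs a wandb.init call and LightGBM Dataset objects. A sketch of that setup, assuming feature matrices X_train/X_val and labels from your own pipeline, with a placeholder project name and parameters:

import wandb
import lightgbm as lgb

# Placeholder project name and parameters, chosen for illustration
wandb.init(project="lgbm-demo")
params = {"objective": "binary", "metric": "auc", "learning_rate": 0.1}

# The validation set shares bin boundaries with the training set via reference=
train_data = lgb.Dataset(X_train, label=y_train)
valid_data = lgb.Dataset(X_val, label=y_val, reference=train_data)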

Looking for code examples? Check out our repository of examples on GitHub.

Tuning your hyperparameters with Sweeps

Getting the best performance out of your models requires tuning hyperparameters, like tree depth and learning rate. Weights & Biases includes Sweeps, a powerful toolkit for configuring, orchestrating, and analyzing large hyperparameter search experiments.
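
As an illustration, a sweep can be defined and launched directly from Python. The sketch below reuses the imports and DMatrix objects from the XGBoost sketch above; the search space, project name, and metric key are placeholder assumptions, not the notebook's exact setup.

import wandb

# Hypothetical search space over two common gradient-boosting hyperparameters
sweep_config = {
    "method": "random",
    "metric": {"name": "validation-auc", "goal": "maximize"},  # key assumed to match the logged eval metric
    "parameters": {
        "max_depth": {"values": [3, 5, 7, 9]},
        "learning_rate": {"min": 0.01, "max": 0.3},
    },
}

def train():
    # Each agent invocation starts a run whose config holds the sampled values
    with wandb.init() as run:
        params = {"max_depth": run.config.max_depth,
                  "eta": run.config.learning_rate,
                  "objective": "binary:logistic", "eval_metric": "auc"}
        xgb.train(params, dtrain, num_boost_round=50,
                  evals=[(dval, "validation")],
                  callbacks=[wandb_callback()])

sweep_id = wandb.sweep(sweep_config, project="xgb-sweeps-demo")
wandb.agent(sweep_id, function=train, count=20)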

To learn more about these tools and see an example of how to use Sweeps with XGBoost, check out this interactive Colab notebook.

tl;dr: trees outperform linear learners on this classification dataset.