The wandb library includes special callbacks for both of the most popular libraries for training gradient-boosted machines: XGBoost and LightGBM. It's also easy to use the generic logging features of Weights & Biases to track large experiments, like hyperparameter sweeps.
```python
from wandb.xgboost import wandb_callback
import xgboost as xgb

...

bst = xgb.train(param, train_data, num_round, watchlist,
                callbacks=[wandb_callback()])
```
```python
from wandb.lightgbm import wandb_callback
import lightgbm as lgb

...

gbm = lgb.train(params, train_data,
                num_boost_round=20,
                valid_sets=[valid_data],
                valid_names=["validation"],
                callbacks=[wandb_callback()])
```
Getting maximum performance out of these models requires tuning hyperparameters, like tree depth and learning rate. Weights & Biases includes Sweeps, a powerful toolkit for configuring, orchestrating, and analyzing large hyperparameter search experiments.
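As a minimal sketch of how the callback and Sweeps fit together, the snippet below defines a small Bayesian sweep over XGBoost's tree depth and learning rate and launches it with a W&B agent. The parameter ranges, the project name, and the `validation-rmse` metric name (which assumes the watchlist labels the validation set `"validation"`) are illustrative, and `train_data` and `watchlist` are assumed to be defined as in the XGBoost example above.

```python
import wandb
import xgboost as xgb
from wandb.xgboost import wandb_callback

# Illustrative sweep configuration: Bayesian search over two hyperparameters.
sweep_config = {
    "method": "bayes",
    "metric": {"name": "validation-rmse", "goal": "minimize"},
    "parameters": {
        "max_depth": {"values": [3, 6, 9]},
        "learning_rate": {"min": 0.01, "max": 0.3},
    },
}

def train():
    # Each agent invocation starts a fresh run; the sweep controller
    # supplies hyperparameter values via wandb.config.
    run = wandb.init()
    param = {
        "max_depth": wandb.config.max_depth,
        "eta": wandb.config.learning_rate,
        "objective": "reg:squarederror",
    }
    # train_data and watchlist are assumed to be defined as above.
    bst = xgb.train(param, train_data, 20, watchlist,
                    callbacks=[wandb_callback()])
    run.finish()

sweep_id = wandb.sweep(sweep_config, project="gbm-sweeps")
wandb.agent(sweep_id, function=train, count=20)
```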