Ray Tune is a scalable hyperparameter tuning library. We're adding support for Ray Tune to W&B Sweeps, which makes it easy to launch runs on many machines and visualize results in a central place.
Also check out the Ray Tune integrations for W&B for a feature-complete, out-of-the-box solution that leverages both Ray Tune and W&B!
This feature is in beta! We love feedback, and we really appreciate hearing from folks who are experimenting with our Sweeps product.
Here's a quick example:
```python
import wandb
from wandb.sweeps.config import tune
from wandb.sweeps.config.tune.suggest.hyperopt import HyperOptSearch
from wandb.sweeps.config.hyperopt import hp

tune_config = tune.run(
    "train.py",
    search_alg=HyperOptSearch(
        dict(
            width=hp.uniform("width", 0, 20),
            height=hp.uniform("height", -100, 100),
            activation=hp.choice("activation", ["relu", "tanh"]),
        ),
        metric="mean_loss",
        mode="min",
    ),
    num_samples=10,
)

# Save sweep as yaml config file
tune_config.save("sweep-hyperopt.yaml")

# Create the sweep
wandb.sweep(tune_config)
```
See full example on GitHub →
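Once the sweep exists, agents execute the individual runs. Here is a minimal sketch of launching one from Python, reusing the `tune_config` object from the example above; equivalently, you can run `wandb agent <sweep_id>` from a terminal on any machine:

```python
import wandb

# wandb.sweep returns the ID of the newly created sweep
sweep_id = wandb.sweep(tune_config)  # tune_config from the example above

# Start an agent that pulls hyperparameter sets from the sweep server
# and launches train.py with them; run this on additional machines
# to parallelize the search
wandb.agent(sweep_id)
```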
| HyperOpt Feature | Support   |
| ---------------- | --------- |
| hp.choice        | Supported |
| hp.randint       | Planned   |
| hp.pchoice       | Planned   |
| hp.uniform       | Supported |
| hp.uniformint    | Planned   |
| hp.quniform      | Planned   |
| hp.loguniform    | Supported |
| hp.qloguniform   | Planned   |
| hp.normal        | Planned   |
| hp.qnormal       | Planned   |
| hp.lognormal     | Planned   |
| hp.qlognormal    | Planned   |
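For reference, the three supported distributions look like this when combined in a single search space (a sketch with illustrative parameter names and ranges, not taken from the example above):

```python
from wandb.sweeps.config.hyperopt import hp

# Illustrative search space using only the currently supported
# distributions; parameter names and ranges are arbitrary
search_space = dict(
    # uniform float in [0.0, 1.0]
    dropout=hp.uniform("dropout", 0.0, 1.0),
    # bounds are in log space: samples fall in [exp(-10), exp(0)]
    learning_rate=hp.loguniform("learning_rate", -10, 0),
    # categorical choice over a fixed list
    optimizer=hp.choice("optimizer", ["adam", "sgd"]),
)
```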
By default, Tune schedules runs in serial order. You can also specify a custom scheduling algorithm that can stop runs early or perturb parameters; see the plain Ray Tune sketch after the table below. Read more in the Tune docs →
| Scheduler | Support       |
| --------- | ------------- |
|           | Investigating |
|           | Planned       |
|           | Investigating |
|           | Investigating |
|           | Investigating |
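Until scheduler support lands here, the equivalent in plain Ray Tune looks like the sketch below, using Ray Tune's own `AsyncHyperBandScheduler` (ASHA); the trainable function and metric name are illustrative:

```python
from ray import tune
from ray.tune.schedulers import AsyncHyperBandScheduler

# Illustrative trainable that reports a metric for the scheduler to act on
def train_fn(config):
    for step in range(100):
        mean_loss = (config["width"] - 10) ** 2 / (step + 1)
        tune.report(mean_loss=mean_loss)

# ASHA stops poorly performing trials early instead of running
# every trial to completion
tune.run(
    train_fn,
    config={"width": tune.uniform(0, 20)},
    scheduler=AsyncHyperBandScheduler(metric="mean_loss", mode="min"),
    num_samples=10,
)
```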