If you're already using `wandb.init`, `wandb.config`, and `wandb.log` in your project, start here!
If you have an existing W&B project, it's easy to start optimizing your models with hyperparameter sweeps. This guide walks through the steps with a working example; you can check out the results in this W&B Dashboard. The code is from this example, which trains a PyTorch convolutional neural network to classify images from the Fashion-MNIST dataset.
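For reference, here is a minimal sketch of what a script instrumented with those three calls looks like. The project name, hyperparameter names, and the stand-in training loop are illustrative placeholders, not code from the example repo:

```python
import random

import wandb

# Defaults here are overridden by the sweep agent at launch time.
run = wandb.init(
    project="fashion-mnist-sweep-demo",  # placeholder project name
    config={"learning_rate": 0.01, "batch_size": 64, "epochs": 5},
)
config = wandb.config

for epoch in range(config.epochs):
    # Stand-in for a real PyTorch training step.
    loss = 1.0 / (epoch + 1) + random.random() * 0.05
    wandb.log({"epoch": epoch, "loss": loss})

run.finish()
```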
1. Create a project
Manually run a first baseline to check that W&B logging is working properly. You'll download this simple example model, train it for a few minutes, and watch the run appear in the web dashboard.
- Clone this repo: `git clone https://github.com/wandb/examples.git`
- Open this example: `cd examples/pytorch/pytorch-cnn-fashion`
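Then kick off the baseline run from inside that directory. Assuming the example's entry point is named `train.py` (check the repo for the actual file name), the invocation would be something like:

```bash
# Run the baseline once so the project has a logged run to build a sweep from.
python train.py
```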
2. Create a sweep

The auto-generated config guesses values to sweep over based on the runs you've already completed. Edit the config to specify the ranges of hyperparameters you want to try. When you launch the sweep, it starts a new process on our hosted W&B sweep server. This centralized service coordinates the agents, the machines that run the training jobs.
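As a sketch, an edited sweep config for this example might look like the following. The file name, parameter names, and ranges here are illustrative, not copied from the repo:

```yaml
# sweep.yaml -- illustrative config; adjust the entry point, names, and ranges
program: train.py        # assumed entry point of the training script
method: bayes            # search strategy: grid, random, or bayes
metric:
  name: loss
  goal: minimize
parameters:
  learning_rate:
    min: 0.0001
    max: 0.1
  batch_size:
    values: [32, 64, 128]
  epochs:
    value: 5
```

Initializing the sweep from the CLI prints a sweep ID, which agents use to connect:

```bash
wandb sweep sweep.yaml
```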
3. Launch agents
Next, launch an agent locally. You can launch up to 20 agents on different machines in parallel to distribute the work and finish the sweep more quickly. Each agent prints out the set of parameters it's trying next.
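For example, using the sweep ID printed by `wandb sweep` (the entity, project, and ID below are placeholders):

```bash
# Run one agent per machine; repeat on other machines to parallelize.
wandb agent your-entity/your-project/SWEEP_ID
```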