A W&B Sweep combines a strategy for exploring hyperparameter values with the code that evaluates them. The strategy can be as simple as trying every option or as complex as Bayesian Optimization and Hyperband (BOHB).
Define a sweep configuration either in a Python dictionary or a YAML file. How you define your sweep configuration depends on how you want to manage your sweep.
The following guide describes how to format your sweep configuration. See Sweep configuration options for a comprehensive list of top-level sweep configuration keys.
Basic structure
Both sweep configuration format options (YAML and Python dictionary) utilize key-value pairs and nested structures.
Use top-level keys within your sweep configuration to define qualities of your sweep search, such as the name of the sweep (name key), the parameters to search through (parameters key), the methodology to search the parameter space (method key), and more.
For example, the following code snippets show the same sweep configuration defined within a YAML file and within a Python dictionary. Within the sweep configuration there are five top-level keys specified: program, name, method, metric, and parameters.
Define a sweep configuration in a YAML file if you want to manage sweeps interactively from the command line (CLI).
program: train.py
name: sweepdemo
method: bayes
metric:
  goal: minimize
  name: validation_loss
parameters:
  learning_rate:
    min: 0.0001
    max: 0.1
  batch_size:
    values: [16, 32, 64]
  epochs:
    values: [5, 10, 15]
  optimizer:
    values: ["adam", "sgd"]
Define a sweep in a Python dictionary data structure if you define your training algorithm in a Python script or notebook. The following code snippet stores a sweep configuration in a variable named sweep_configuration:
sweep_configuration = {
    "name": "sweepdemo",
    "method": "bayes",
    "metric": {"goal": "minimize", "name": "validation_loss"},
    "parameters": {
        "learning_rate": {"min": 0.0001, "max": 0.1},
        "batch_size": {"values": [16, 32, 64]},
        "epochs": {"values": [5, 10, 15]},
        "optimizer": {"values": ["adam", "sgd"]},
    },
}
Within the top-level parameters key, the following keys are nested: learning_rate, batch_size, epochs, and optimizer. For each of the nested keys you specify, you can provide one or more values, a distribution, a probability, and more. For more information, see the parameters section in Sweep configuration options.
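As a minimal sketch of how you might launch this sweep from a script or notebook (the project name and the stub train function are illustrative assumptions):

import wandb

def train():
    # Stub training function; a real one would build a model from
    # run.config values and log the metric named in the sweep config.
    with wandb.init() as run:
        run.log({"validation_loss": 0.5})

# Register the sweep with W&B; this returns a sweep ID.
sweep_id = wandb.sweep(sweep=sweep_configuration, project="my-first-sweep")

# Start an agent that executes train for a handful of trials.
wandb.agent(sweep_id, function=train, count=5)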
Double nested parameters
Sweep configurations support nested parameters. To delineate a nested parameter, use an additional parameters key under the top-level parameter name. Sweep configs support multi-level nesting.
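For example, the following sketch nests learning_rate and momentum under an optimizer parameter (the parameter names and values are illustrative):

parameters:
  optimizer:
    parameters:
      learning_rate:
        values: [0.01, 0.001]
      momentum:
        value: 0.9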
Specify a probability distribution for your random variables if you use a Bayesian or random hyperparameter search. For each hyperparameter:
- Create a top-level parameters key in your sweep config.
- Within the parameters key, nest the following:
  - Specify the name of the hyperparameter you want to optimize.
  - Specify the distribution you want to use for the distribution key. Nest the distribution key-value pair underneath the hyperparameter name (see the example after this list).
  - Specify one or more values to explore. The value (or values) should be in line with the distribution key.
  - (Optional) Use an additional parameters key under the top-level parameter name to delineate a nested parameter.
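For example, the following fragment samples dropout from a uniform distribution (the bounds are illustrative):

parameters:
  dropout:
    distribution: uniform
    min: 0.1
    max: 0.5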
Nested parameters defined in a sweep configuration overwrite keys specified in a W&B run configuration.
For example, suppose you initialize a W&B run with the following configuration in a train.py Python script. Next, you define a sweep configuration in a dictionary called sweep_configuration. You then pass the sweep config dictionary to wandb.sweep to initialize the sweep.
import wandb

def main():
    run = wandb.init(config={"nested_param": {"manual_key": 1}})

sweep_configuration = {
    "top_level_param": 0,
    "nested_param": {
        "learning_rate": 0.01,
        "double_nested_param": {"x": 0.9, "y": 0.8},
    },
}

# Initialize sweep by passing in config.
sweep_id = wandb.sweep(sweep=sweep_configuration, project="<project>")

# Start sweep job.
wandb.agent(sweep_id, function=main, count=4)
The nested_param.manual_key that is passed when the W&B run is initialized is not accessible. The wandb.Run.config only possesses the key-value pairs that are defined in the sweep configuration dictionary.
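To illustrate, inside main you could inspect the config as follows (a sketch based on the example above):

def main():
    run = wandb.init(config={"nested_param": {"manual_key": 1}})
    # Keys defined in the sweep configuration are present:
    print(run.config["nested_param"]["learning_rate"])  # 0.01
    # "manual_key" is not: run.config["nested_param"]["manual_key"]
    # raises a KeyError because the sweep configuration replaced the
    # nested_param value passed to wandb.init.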
Sweep configuration template
The following template shows how you can configure parameters and specify search constraints. Replace hyperparameter_name with the name of your hyperparameter, and replace any values enclosed in <>.
program: <insert>
method: <insert>
parameters:
  hyperparameter_name0:
    value: 0
  hyperparameter_name1:
    values: [0, 0, 0]
  hyperparameter_name:
    distribution: <insert>
    value: <insert>
  hyperparameter_name2:
    distribution: <insert>
    min: <insert>
    max: <insert>
    q: <insert>
  hyperparameter_name3:
    distribution: <insert>
    values:
      - <list_of_values>
      - <list_of_values>
      - <list_of_values>
early_terminate:
  type: hyperband
  s: 0
  eta: 0
  max_iter: 0
command:
  - ${Command macro}
  - ${Command macro}
  - ${Command macro}
  - ${Command macro}
To express a numeric value using scientific notation, add the YAML !!float tag, which casts the value to a floating point number. For example, min: !!float 1e-5. See Command example.
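For instance, within a parameter definition (the bounds are illustrative):

parameters:
  learning_rate:
    distribution: log_uniform_values
    min: !!float 1e-5
    max: !!float 1e-1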
Sweep configuration examples
program: train.py
method: random
metric:
  goal: minimize
  name: loss
parameters:
  batch_size:
    distribution: q_log_uniform_values
    max: 256
    min: 32
    q: 8
  dropout:
    values: [0.3, 0.4, 0.5]
  epochs:
    value: 1
  fc_layer_size:
    values: [128, 256, 512]
  learning_rate:
    distribution: uniform
    max: 0.1
    min: 0
  optimizer:
    values: ["adam", "sgd"]
sweep_config = {
    "method": "random",
    "metric": {"goal": "minimize", "name": "loss"},
    "parameters": {
        "batch_size": {
            "distribution": "q_log_uniform_values",
            "max": 256,
            "min": 32,
            "q": 8,
        },
        "dropout": {"values": [0.3, 0.4, 0.5]},
        "epochs": {"value": 1},
        "fc_layer_size": {"values": [128, 256, 512]},
        "learning_rate": {"distribution": "uniform", "max": 0.1, "min": 0},
        "optimizer": {"values": ["adam", "sgd"]},
    },
}
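A minimal sketch of a training function that could consume these parameters when run by a sweep agent (the loss computation is a placeholder):

import wandb

def train():
    with wandb.init() as run:
        cfg = run.config  # the agent injects the sampled hyperparameters here
        for epoch in range(cfg["epochs"]):
            # Placeholder for a real training step that would use
            # cfg["learning_rate"], cfg["batch_size"], and cfg["fc_layer_size"].
            loss = 1.0 / (epoch + 1)
            run.log({"loss": loss})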
Bayes hyperband example
program: train.py
method: bayes
metric:
  goal: minimize
  name: val_loss
parameters:
  dropout:
    values: [0.15, 0.2, 0.25, 0.3, 0.4]
  hidden_layer_size:
    values: [96, 128, 148]
  layer_1_size:
    values: [10, 12, 14, 16, 18, 20]
  layer_2_size:
    values: [24, 28, 32, 36, 40, 44]
  learn_rate:
    values: [0.001, 0.01, 0.003]
  decay:
    values: [!!float 1e-5, !!float 1e-6, !!float 1e-7]
  momentum:
    values: [0.8, 0.9, 0.95]
  epochs:
    value: 27
early_terminate:
  type: hyperband
  s: 2
  eta: 3
  max_iter: 27
The following examples show how to specify either a minimum or maximum number of iterations for early_terminate:
The brackets for this example are [3, 3*eta, 3*eta*eta, 3*eta*eta*eta], which equals [3, 9, 27, 81], assuming the default eta of 3.
early_terminate:
  type: hyperband
  min_iter: 3
The brackets for this example are [27/eta, 27/eta/eta], which equals [9, 3].
early_terminate:
  type: hyperband
  max_iter: 27
  s: 2
Macro and custom command arguments example
For more complex command line arguments, you can use macros to pass environment variables, the Python interpreter, and additional arguments. W&B supports predefined macros and custom command line arguments that you can specify in your sweep configuration.
For example, the following sweep configuration (sweep.yaml) defines a command that runs a Python script (run.py) with the ${env}, ${interpreter}, and ${program} macros replaced with the appropriate values when the sweep runs. The --batch_size=${batch_size} and --optimizer=${optimizer} arguments use custom macros to pass the values of the batch_size and optimizer parameters defined in the sweep configuration, while --test=True passes a literal value.
program: run.py
method: random
metric:
  name: validation_loss
parameters:
  learning_rate:
    min: 0.0001
    max: 0.1
  batch_size:
    values: [16, 32, 64]
  optimizer:
    values: ["adam", "sgd"]
command:
  - ${env}
  - ${interpreter}
  - ${program}
  - "--batch_size=${batch_size}"
  - "--optimizer=${optimizer}"
  - "--test=True"
The associated Python script (run.py) can then parse these command line arguments using the argparse module.
# run.py
import argparse

import wandb

parser = argparse.ArgumentParser()
parser.add_argument('--batch_size', type=int)
parser.add_argument('--optimizer', type=str, choices=['adam', 'sgd'], required=True)
# str2bool is a helper defined in the Boolean arguments section below.
parser.add_argument('--test', type=str2bool, default=False)
args = parser.parse_args()
# args.batch_size, args.optimizer, and args.test are now available to the training code.

# Initialize a W&B Run
with wandb.init(project='test-project') as run:
    run.log({'validation_loss': 1})
See the Command macros section in Sweep configuration options for a list of predefined macros you can use in your sweep configuration.
Boolean arguments
The argparse module does not support boolean arguments by default. To define a boolean argument, you can use the action parameter or use a custom function to convert the string representation of the boolean value to a boolean type.
As an example, you can use the following code snippet to define a boolean argument. Pass store_true or store_false to the action parameter of add_argument().
import wandb
import argparse
parser = argparse.ArgumentParser()
parser.add_argument('--test', action='store_true')
args = parser.parse_args()
args.test # This will be True if --test is passed, otherwise False
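Note that a store_true flag takes no value on the command line, so a sweep command that passes an explicit value such as --test=True causes an argparse error. The custom conversion function described next handles that form.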
You can also define a custom function to convert the string representation of the boolean value to a boolean type. For example, the following code snippet defines the str2bool function, which converts a string to a boolean value.
def str2bool(v: str) -> bool:
    """Convert a string to a boolean. This is required because
    argparse does not support boolean arguments by default.
    """
    if isinstance(v, bool):
        return v
    return v.lower() in ('yes', 'true', 't', '1')
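A quick sketch of how this parses, assuming str2bool from the snippet above is in scope (the inline argument list stands in for a real command line):

import argparse

parser = argparse.ArgumentParser()
parser.add_argument('--test', type=str2bool, default=False)

# Simulate the sweep command passing --test=True.
args = parser.parse_args(['--test=True'])
assert args.test is True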