W&B provides first-class support for PyTorch. To automatically log gradients and store the network topology, call `wandb.watch` and pass in your PyTorch model.
```python
import wandb
import torch.nn.functional as F

wandb.init(config=args)

# Magic
wandb.watch(model)

model.train()
for batch_idx, (data, target) in enumerate(train_loader):
    optimizer.zero_grad()
    output = model(data)
    loss = F.nll_loss(output, target)
    loss.backward()
    optimizer.step()
    if batch_idx % args.log_interval == 0:
        wandb.log({"loss": loss})
```
Gradients, metrics, and the graph won't be logged until `wandb.log` is called after a forward and backward pass.
See this Colab notebook for an end-to-end example of integrating wandb with PyTorch, including a video tutorial. You can also find more examples in our example projects section.
By default the hook only logs gradients.
| Arguments | Options |
| --- | --- |
| `log` | `"gradients"`, `"parameters"`, `"all"`, or `None` (default `"gradients"`): what the hook logs |
| `log_freq` | integer (default 1000): the number of steps between logging gradients |
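To illustrate how these arguments fit into a training script, here is a minimal sketch. The `should_log` and `watch_model` helpers are ours for illustration, not part of the W&B API; only the `wandb.watch` call and its `log`/`log_freq` arguments come from the table above.

```python
def should_log(step, log_interval):
    """True on steps where metrics should be sent to wandb.log."""
    return step % log_interval == 0


def watch_model(model, log="all", log_freq=100):
    """Register gradient/parameter hooks by wrapping wandb.watch."""
    import wandb  # deferred so this helper can be defined without wandb installed

    wandb.watch(model, log=log, log_freq=log_freq)


# In a training loop with log_interval = 10, steps 0, 10, 20, ... would call
# wandb.log, while watch itself captures histograms every `log_freq`
# forward/backward passes.
```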
You can pass PyTorch tensors with image data into `wandb.Image`, and torchvision utils will be used to log them automatically.

To log images and view them in the Media panel, use the following syntax:

```python
wandb.log({"examples": [wandb.Image(i) for i in images]})
```
If you need to track multiple models in the same script, you can call `wandb.watch()` on each model separately.
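As a minimal sketch of that pattern, the `watch_all` helper and its `watch_fn` parameter below are illustrative, not part of the W&B API; in practice you would pass `wandb.watch` as `watch_fn`:

```python
def watch_all(models, watch_fn, log="gradients", log_freq=1000):
    """Call watch_fn (typically wandb.watch) once per model.

    Each model gets its own hook, so gradients for, say, a generator and a
    discriminator show up side by side in the same run.
    """
    for model in models:
        watch_fn(model, log=log, log_freq=log_freq)


# Typical usage in a training script:
#   import wandb
#   wandb.init(project="my-project")
#   watch_all([generator, discriminator], wandb.watch)
```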
We've created a few examples for you to see how the integration works:

- Run in Google Colab: A simple notebook example to get you started
- Example on GitHub: MNIST example in a Python script
- W&B Dashboard: View results on W&B