Advanced Configuration
How to configure the W&B Local Server installation
Your W&B Local Server comes up ready to use on boot. Several advanced configuration options are also available on the /system-admin page of your server once it's up and running. You can email [email protected] to request a trial license that enables more users and teams.
The following sections describe how to manually configure your local instance. When possible, we suggest using our existing Terraform to configure your instance.

Configuration as code

All configuration settings can be set via the UI. If you would rather manage these configuration options as code, you can set the following environment variables:
LICENSE: Your wandb/local license
MYSQL: The MySQL connection string
BUCKET: The S3 / GCS bucket for storing data
BUCKET_QUEUE: The SQS / Google Pub/Sub queue for object creation events
NOTIFICATIONS_QUEUE: The SQS queue on which to publish run events
AWS_REGION: The AWS region where your bucket lives
HOST: The FQDN of your instance, e.g. https://my.domain.net
AUTH0_DOMAIN: The Auth0 domain of your tenant
AUTH0_CLIENT_ID: The Auth0 client ID of your application
SLACK_CLIENT_ID: The client ID of the Slack application you want to use for alerts
SLACK_SECRET: The secret of the Slack application you want to use for alerts
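For example, these variables can be exported before starting the server. All values below are placeholders, not real credentials:

```shell
# Hypothetical values -- substitute your own license, database, bucket, and host.
export LICENSE="XXXXXXXX"
export MYSQL="mysql://wandb:[email protected]:3306/wandb"
export BUCKET="s3://wandb-files-example"
export BUCKET_QUEUE="sqs://wandb-file-events-example"
export AWS_REGION="us-east-1"
export HOST="https://wandb.example.com"
```

When running the wandb/local Docker image, the same settings can be passed through to the container with `docker run -e LICENSE -e MYSQL ...` flags.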

Authentication

By default, a W&B Local Server runs with manual user management, supporting up to 4 users. Licensed versions of wandb/local also unlock SSO. W&B can configure an Auth0 tenant for you with any identity provider Auth0 supports, such as Okta, G Suite, Active Directory, etc. Just reach out to your account executive to schedule a setup call with one of our engineers. If you already use Auth0 or want to manage your own tenant, you can follow the instructions below.
Your server supports any authentication provider supported by Auth0. After creating an Auth0 app, you'll need to configure your Auth0 callbacks to the host of your W&B Server. By default, the server supports http from the public or private IP address provided by the host. You can also configure a DNS hostname and SSL certificate if you choose.
    Set the Callback URLs to http(s)://YOUR-W&B-SERVER-HOST/login and http(s)://YOUR-W&B-SERVER-HOST/oidc/callback
    Set the Logout URL to http(s)://YOUR-W&B-SERVER-HOST
    Set the Allowed Web Origin to http(s)://YOUR-W&B-SERVER-HOST
Auth0 Settings
Save the Client ID and domain from your Auth0 app.
Then, navigate to the W&B settings page at http(s)://YOUR-W&B-SERVER-HOST/system-admin. Choose "Enable SSO" option, and fill in the Client ID and domain from your Auth0 app.
Finally, press "Update settings".
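If you prefer configuration as code, the same SSO settings can be supplied via the AUTH0_DOMAIN and AUTH0_CLIENT_ID environment variables described above. The values below are placeholders:

```shell
# Placeholder Auth0 tenant domain and client ID -- use your own app's values.
export AUTH0_DOMAIN="my-tenant.us.auth0.com"
export AUTH0_CLIENT_ID="abc123ExampleClientId"
```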

File Storage

By default, a W&B Enterprise Server saves files to a local data disk with a capacity that you set when you provision your instance. To support limitless file storage, you may configure your server to use an external cloud file storage bucket with an S3-compatible API.
You should always specify the bucket you're using with the BUCKET environment variable. This removes the need for a persistent volume as all settings can then be persisted to your bucket.

Amazon Web Services

To use an AWS S3 bucket as the file storage backend for W&B, you'll need to create a bucket, along with an SQS queue configured to receive object creation notifications from that bucket. Your instance will need permissions to read from this queue.
Create an SQS Queue
First, create an SQS Standard Queue. Add a permission for all principals for the SendMessage, ReceiveMessage, ChangeMessageVisibility, DeleteMessage, and GetQueueUrl actions. (If you'd like, you can lock this down further with an advanced policy document.)
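If you prefer the command line, the queue setup can be sketched with the AWS CLI. The queue name, region, and account ID below are placeholders; note that unlike the console steps above, add-permission grants a single account rather than all principals, which is usually what you want in production:

```shell
# Create a standard SQS queue (name and region are placeholders).
aws sqs create-queue --queue-name wandb-file-events --region us-east-1

# Allow the listed account to work with the queue.
aws sqs add-permission \
  --queue-url https://sqs.us-east-1.amazonaws.com/123456789012/wandb-file-events \
  --label wandb-access \
  --aws-account-ids 123456789012 \
  --actions SendMessage ReceiveMessage ChangeMessageVisibility DeleteMessage GetQueueUrl
```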
Create an S3 Bucket and Bucket Notifications
Then, create an S3 bucket. Under the bucket properties page in the console, in the "Events" section of "Advanced Settings", click "Add notification", and configure all object creation events to be sent to the SQS Queue you configured earlier.
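The bucket and its notification configuration can also be sketched with the AWS CLI. Bucket name, region, and queue ARN are placeholders:

```shell
# Create the bucket (for regions other than us-east-1 you must also
# pass --create-bucket-configuration LocationConstraint=<region>).
aws s3api create-bucket --bucket wandb-files-example --region us-east-1

# Route all object-creation events to the SQS queue created earlier.
aws s3api put-bucket-notification-configuration \
  --bucket wandb-files-example \
  --notification-configuration '{
    "QueueConfigurations": [{
      "QueueArn": "arn:aws:sqs:us-east-1:123456789012:wandb-file-events",
      "Events": ["s3:ObjectCreated:*"]
    }]
  }'
```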
Enterprise file storage settings
Enable CORS access. Your CORS configuration should look like the following:

<?xml version="1.0" encoding="UTF-8"?>
<CORSConfiguration xmlns="http://s3.amazonaws.com/doc/2006-03-01/">
<CORSRule>
    <AllowedOrigin>http://YOUR-W&B-SERVER-IP</AllowedOrigin>
    <AllowedMethod>GET</AllowedMethod>
    <AllowedMethod>PUT</AllowedMethod>
    <AllowedHeader>*</AllowedHeader>
</CORSRule>
</CORSConfiguration>
Grant Permissions to Node Running W&B
The node on which W&B Local is running must be configured to permit access to S3 and SQS. Depending on the type of server deployment you've opted for, you may need to add the following policy statements to your node role:

{
  "Statement": [
    {
      "Sid": "",
      "Effect": "Allow",
      "Action": "s3:*",
      "Resource": [
        "arn:aws:s3:::<WANDB_BUCKET>",
        "arn:aws:s3:::<WANDB_BUCKET>/*"
      ]
    },
    {
      "Sid": "",
      "Effect": "Allow",
      "Action": "sqs:*",
      "Resource": "arn:aws:sqs:<REGION>:<ACCOUNT>:<WANDB_QUEUE>"
    }
  ]
}

Note the /* resource: object-level S3 actions apply to the objects in the bucket, not the bucket itself, so both ARNs are needed.
Configure W&B Server
Finally, navigate to the W&B settings page at http(s)://YOUR-W&B-SERVER-HOST/system-admin. Enable the "Use an external file storage backend" option, and fill in the S3 bucket, region, and SQS queue in the following format:
    File Storage Bucket: s3://<bucket-name>
    File Storage Region (AWS only): <region>
    Notification Subscription: sqs://<queue-name>
Press "Update settings" to apply the new settings.
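Equivalently, these settings can be supplied as environment variables, as described in the configuration-as-code section above. The values below are placeholders:

```shell
# Placeholder bucket, region, and queue -- substitute your own resources.
export BUCKET="s3://wandb-files-example"
export AWS_REGION="us-east-1"
export BUCKET_QUEUE="sqs://wandb-file-events"
```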

Google Cloud Platform

To use a GCP Storage bucket as a file storage backend for W&B, you'll need to create a bucket, along with a pubsub topic and subscription configured to receive object creation messages from that bucket.
Create Pubsub Topic and Subscription
Navigate to Pub/Sub > Topics in the GCP Console, and click "Create topic". Choose a name and create a topic.
Then click "Create subscription" in the subscriptions table at the bottom of the page. Choose a name, and make sure Delivery Type is set to "Pull". Click "Create".
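The same topic and subscription can be created from the command line with gcloud. The names below are placeholders:

```shell
# Create the topic (name is a placeholder).
gcloud pubsub topics create wandb-file-events

# Create a subscription on that topic; pull delivery is the default.
gcloud pubsub subscriptions create wandb-file-events-sub \
  --topic wandb-file-events
```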
Make sure the service account or account that your instance is running as has access to this subscription.
Create Storage Bucket
Navigate to Storage > Browser in the GCP Console, and click "Create bucket". Make sure to choose "Standard" storage class.
Make sure the service account or account that your instance is running as has access to this bucket.
Create Pubsub Notification
Creating a notification stream from the storage bucket to the Pub/Sub topic unfortunately cannot be done in the console; it must be done from the command line. Make sure you have gsutil installed and are logged into the correct GCP project, then run the following:
gcloud pubsub topics list # list names of topics for reference
gsutil ls # list names of buckets for reference

# create bucket notification
gsutil notification create -t <TOPIC-NAME> -f json gs://<BUCKET-NAME>
Add Signing Permissions
To create signed file URLs, your W&B instance also needs the iam.serviceAccounts.signBlob permission in GCP. You can add it by adding the Service Account Token Creator role to the service account or IAM member that your instance is running as.
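The role can be granted with gcloud; the project ID and service account email below are placeholders:

```shell
# Grant the Service Account Token Creator role (includes
# iam.serviceAccounts.signBlob) to the instance's service account.
gcloud projects add-iam-policy-binding my-project \
  --member="serviceAccount:[email protected]" \
  --role="roles/iam.serviceAccountTokenCreator"
```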
Configure W&B Server
Finally, navigate to the W&B settings page at http(s)://YOUR-W&B-SERVER-HOST/system-admin. Enable the "Use an external file storage backend" option, and fill in the GCS bucket and Pub/Sub subscription in the following format:
    File Storage Bucket: gs://<bucket-name>
    File Storage Region: leave blank
    Notification Subscription: pubsub:/<project-name>/<topic-name>/<subscription-name>
Press "Update settings" to apply the new settings.
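As with AWS, these settings can instead be supplied as environment variables. The project and resource names below are placeholders, using the subscription format shown above:

```shell
# Placeholder GCS bucket and Pub/Sub subscription path.
export BUCKET="gs://wandb-files-example"
export BUCKET_QUEUE="pubsub:/my-project/wandb-file-events/wandb-file-events-sub"
```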

Azure

To use an Azure blob container as the file storage for W&B, you'll need to create a storage account (if you don't already have one you want to use), create a blob container and a queue within that storage account, and then create an event subscription that sends "blob created" notifications to the queue from the blob container.

Create a Storage Account

If you have a storage account you want to use already, you can skip this step.
Navigate to Storage Accounts > Add in the Azure portal. Select an Azure subscription, and select any resource group or create a new one. Enter a name for your storage account.
Azure storage account setup
Click Review and Create, and then, on the summary screen, click Create:
Azure storage account details review

Creating the blob container

Go to Storage Accounts in the Azure portal, and click on your new storage account. In the storage account dashboard, click on Blob service > Containers in the menu:
Create a new container, and set it to Private:
Go to Settings > CORS > Blob service, and enter the IP of your wandb server as an allowed origin, with allowed methods GET and PUT, and all headers allowed and exposed, then save your CORS settings.

Creating the Queue

Go to Queue service > Queues in your storage account, and create a new Queue:
Go to Events in your storage account, and create an event subscription:
Give the event subscription the Event Schema "Event Grid Schema", filter to only the "Blob Created" event type, set the Endpoint Type to Storage Queues, and then select the storage account/queue as the endpoint.
In the Filters tab, enable subject filtering for subjects beginning with /blobServices/default/containers/your-blob-container-name/blobs/
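These steps can also be sketched with the Azure CLI. The storage account, resource group, container, and queue names below are all placeholders:

```shell
# Create the private blob container and the queue (names are placeholders).
az storage container create --account-name wandbstorage --name wandb-files
az storage queue create --account-name wandbstorage --name wandb-file-events

# Send "Blob Created" events from the container to the queue.
SA_ID=$(az storage account show --name wandbstorage \
  --resource-group my-group --query id --output tsv)
az eventgrid event-subscription create \
  --name wandb-file-events-sub \
  --source-resource-id "$SA_ID" \
  --endpoint-type storagequeue \
  --endpoint "$SA_ID/queueservices/default/queues/wandb-file-events" \
  --included-event-types Microsoft.Storage.BlobCreated \
  --subject-begins-with /blobServices/default/containers/wandb-files/blobs/
```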

Configure W&B Server

Go to Settings > Access keys in your storage account, click "Show keys", and then copy either key1 > Key or key2 > Key. Set this key on your W&B server as the environment variable AZURE_STORAGE_KEY.
Finally, navigate to the W&B settings page at http(s)://YOUR-W&B-SERVER-HOST/system-admin. Enable the "Use an external file storage backend" option, and fill in the blob container and queue in the following format:
    File Storage Bucket: az://<storage-account-name>/<blob-container-name>
    Notification Subscription: az://<storage-account-name>/<queue-name>
Press "Update settings" to apply the new settings.
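As with the other backends, these settings can instead be supplied as environment variables. The key and names below are placeholders:

```shell
# Placeholder storage key, container path, and queue path.
export AZURE_STORAGE_KEY="placeholder-key"
export BUCKET="az://wandbstorage/wandb-files"
export BUCKET_QUEUE="az://wandbstorage/wandb-file-events"
```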

Slack

In order to integrate your local W&B installation with Slack, you'll need to create a suitable Slack application.

Creating the Slack application

Visit https://api.slack.com/apps and select Create New App in the top right.
You can name it whatever you like, but what's important is to select the same Slack workspace as the one you intend to use for alerts.

Configuring the Slack application

Now that we have a Slack application ready, we need to authorize it for use as an OAuth bot. Select OAuth & Permissions in the left sidebar.
Under Scopes, supply the bot with the incoming_webhook scope.
Finally, configure the Redirect URL to point to your W&B installation. You should use the same value as what you set Frontend Host to in your local system settings. You can specify multiple URLs if you have different DNS mappings to your instance.
Hit Save URLs once finished.
To further secure your Slack application and prevent abuse, you can specify an IP range under Restrict API Token Usage, whitelisting the IP or IP range of your W&B instance(s).

Register your Slack application with W&B

Navigate to the System Settings page of your W&B instance. Check the box to enable a custom Slack application:
You'll need to supply your Slack application's client ID and secret, which you can find in the Basic Information tab.
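Alternatively, as with the other settings on this page, the client ID and secret can be supplied as the SLACK_CLIENT_ID and SLACK_SECRET environment variables. The values below are placeholders:

```shell
# Placeholder Slack app credentials from the Basic Information tab.
export SLACK_CLIENT_ID="1234567890.0987654321"
export SLACK_SECRET="placeholder-secret"
```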
That's it! You can now verify that everything is working by setting up a Slack integration in the W&B app. Visit this page for more detailed information.