You can start a local server with wandb server start. However, several advanced configuration options are available using the /system-admin page on your server once it's up and running. You can email [email protected] to request a trial license to enable more users and teams.

If you have a DNS entry configured for your wandb server instance, you can run the following command so the instance knows the host address it is served from:
wandb server -e HOST=http://<HOST>:<PORT>
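If you run the wandb/local container directly with Docker instead of through the CLI helper, the same setting can be passed as an environment variable. This is a minimal sketch; the image name, port, and volume reflect the common wandb/local defaults and should be adjusted to your deployment.

```bash
# Run the wandb/local image directly, setting the externally visible host.
# <HOST> and <PORT> are placeholders for your DNS name and port.
docker run --rm -d -v wandb:/vol -p 8080:8080 \
  -e HOST=http://<HOST>:<PORT> \
  --name wandb-local wandb/local
```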
Once the server is running, you can log in to it from the wandb client. Here are various ways to perform this action:
wandb login --host=<HOST>:<PORT>
wandb.login(host="<HOST>:<PORT>")
export WANDB_BASE_URL=<HOST>:<PORT>
export WANDB_API_KEY=<API-KEY>
To configure SSO, set the authorized callback URL in your identity provider to:
http(s)://YOUR-W&B-HOST/oidc/callback
and the allowed logout URL to:
http(s)://YOUR-W&B-HOST
W&B server requires an OIDC issuer that serves a discovery document at:
$OIDC_ISSUER/.well-known/openid-configuration
For example, when using AWS Cognito you can generate your issuer URL by appending your User Pool ID to the Cognito IDP URL from the User Pools > App Integration tab:
https://cognito-idp.$REGION.amazonaws.com/$USER_POOL_ID
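To confirm the issuer is set up correctly before wiring it into W&B, you can fetch the discovery document directly. A minimal sketch; the region and pool ID below are placeholders.

```bash
# The issuer must serve its OIDC discovery document at this well-known path.
export OIDC_ISSUER=https://cognito-idp.<REGION>.amazonaws.com/<USER_POOL_ID>
curl -s "$OIDC_ISSUER/.well-known/openid-configuration"
```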
You can provide your OIDC issuer and client details on the /system-admin page of your instance or via environment variables, and SSO will be configured.

If you're unable to log in to your instance after enabling SSO, you can restart it with the LOCAL_RESTORE=true environment variable set. This will output a temporary password to the container's logs and disable SSO. Once you've resolved any issues with SSO, you must remove that environment variable to enable SSO again.
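If you deployed with Docker, the restore flow looks roughly like the sketch below; the container name, port, and volume are assumptions based on the common wandb/local run command.

```bash
# Restart the container with LOCAL_RESTORE=true, then read the temporary
# password from the container logs. Remove the variable again afterwards.
docker rm -f wandb-local
docker run --rm -d -v wandb:/vol -p 8080:8080 \
  -e LOCAL_RESTORE=true \
  --name wandb-local wandb/local
docker logs wandb-local
```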
To use an external S3 bucket for file storage, you also need an SQS queue that receives object-creation notifications from the bucket, and the instance running your W&B server must be allowed to perform the SendMessage, ReceiveMessage, ChangeMessageVisibility, DeleteMessage, and GetQueueUrl actions on that queue. If you'd like, you can further lock this down using an advanced policy document; for instance, a policy that grants SQS access with a single statement is sketched below.
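A minimal sketch of such a policy, written as a file you could attach with the AWS CLI; the queue ARN, account ID, and role name are placeholders, and your own policy may need different principals or conditions.

```bash
# Write a single-statement policy allowing only the SQS actions the
# W&B server needs, scoped to one queue (placeholders throughout).
cat > wandb-sqs-policy.json <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "sqs:SendMessage",
        "sqs:ReceiveMessage",
        "sqs:ChangeMessageVisibility",
        "sqs:DeleteMessage",
        "sqs:GetQueueUrl"
      ],
      "Resource": "arn:aws:sqs:<region>:<account-id>:<queue-name>"
    }
  ]
}
EOF
aws iam put-role-policy --role-name <wandb-instance-role> \
  --policy-name wandb-sqs-access \
  --policy-document file://wandb-sqs-policy.json
```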
Once the bucket and queue are in place, navigate to http(s)://YOUR-W&B-SERVER-HOST/system-admin. Enable the "Use an external file storage backend" option, and fill in the S3 bucket, region, and SQS queue in the following format:
s3://<bucket-name>
<region>
sqs://<queue-name>
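For reference, the AWS-side resources can be created and wired together roughly as follows. This is a sketch under assumed placeholder names; the queue's access policy must also permit s3.amazonaws.com to send messages, which is omitted here.

```bash
# Create the bucket and queue, then have the bucket publish object-creation
# events to the queue. All names, regions, and IDs are placeholders.
# (Omit --create-bucket-configuration when <region> is us-east-1.)
aws s3api create-bucket --bucket <bucket-name> --region <region> \
  --create-bucket-configuration LocationConstraint=<region>
aws sqs create-queue --queue-name <queue-name>

cat > notification.json <<'EOF'
{
  "QueueConfigurations": [
    {
      "QueueArn": "arn:aws:sqs:<region>:<account-id>:<queue-name>",
      "Events": ["s3:ObjectCreated:*"]
    }
  ]
}
EOF
aws s3api put-bucket-notification-configuration \
  --bucket <bucket-name> \
  --notification-configuration file://notification.json
```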
To use an external Google Cloud Storage bucket instead, make sure you have gsutil installed and are logged into the correct GCP project, then create the bucket, Pub/Sub topic, and bucket notification along the lines of the sketch below.
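A minimal sketch, assuming both gsutil and gcloud are installed and authenticated, with placeholder names throughout:

```bash
# Create the storage bucket, a Pub/Sub topic and subscription, and a bucket
# notification that publishes object changes to the topic.
gsutil mb gs://<bucket-name>
gcloud pubsub topics create <topic-name>
gcloud pubsub subscriptions create <subscription-name> --topic <topic-name>
gsutil notification create -t <topic-name> -f json gs://<bucket-name>
```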
Your instance also needs the iam.serviceAccounts.signBlob permission in GCP. You can add it by granting the Service Account Token Creator role to the service account or IAM member that your instance is running as. Then navigate to http(s)://YOUR-W&B-SERVER-HOST/system-admin. Enable the "Use an external file storage backend" option, and fill in the GCS bucket and Pub/Sub subscription in the following format:
gs://<bucket-name>
pubsub:/<project-name>/<topic-name>/<subscription-name>
To use external Azure Blob Storage, you need a storage account with a blob container and a queue. In the storage account's CORS settings for the Blob service, add your W&B host as an allowed origin with the allowed methods GET and PUT, and all headers allowed and exposed, then save your CORS settings. Then create an event subscription that sends Blob Created events to the queue, using a subject filter that begins with:
/blobServices/default/containers/your-blob-container-name/blobs/
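These steps can also be scripted with the Azure CLI. The sketch below is one way to do it under assumed placeholder names and resource IDs; verify the values against your own subscription before running it.

```bash
# Allow the W&B app to GET/PUT blobs from the browser, then route
# Blob Created events for the container into the queue.
# All <...> values and the resource IDs are placeholders.
az storage cors add --services b --methods GET PUT \
  --origins "http(s)://YOUR-W&B-HOST" \
  --allowed-headers '*' --exposed-headers '*' --max-age 3600 \
  --account-name <storage-account-name>

az eventgrid event-subscription create \
  --name wandb-blob-created \
  --source-resource-id "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.Storage/storageAccounts/<storage-account-name>" \
  --endpoint-type storagequeue \
  --endpoint "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.Storage/storageAccounts/<storage-account-name>/queueservices/default/queues/<queue-name>" \
  --included-event-types Microsoft.Storage.BlobCreated \
  --subject-begins-with "/blobServices/default/containers/<blob-container-name>/blobs/"
```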
Provide the storage account's access key to your instance in the AZURE_STORAGE_KEY environment variable. Then navigate to http(s)://YOUR-W&B-SERVER-HOST/system-admin. Enable the "Use an external file storage backend" option, and fill in the blob container and queue in the following format:
az://<storage-account-name>/<blob-container-name>
az://<storage-account-name>/<queue-name>