Mirror of https://github.com/ultralytics/yolov5.git (synced 2025-06-03 14:49:29 +08:00)
Copy wandb param dict before training to avoid overwrites (#7317)
* Copy wandb param dict before training to avoid overwrites. Copy the hyperparameter dict retrieved from the wandb configuration before passing it to `train()`. Training overwrites parameters in the dictionary (e.g. scaling the obj/box/cls gains), which causes the values reported in wandb to diverge from the input values. This makes it hard to reproduce a run and also throws off wandb's Bayesian sweep algorithm.

* Cleanup

Co-authored-by: Glenn Jocher <glenn.jocher@ultralytics.com>
This commit is contained in:
parent: 245d6459a9
commit: a88a81469a
@@ -16,8 +16,8 @@ from utils.torch_utils import select_device

 def sweep():
     wandb.init()
-    # Get hyp dict from sweep agent
-    hyp_dict = vars(wandb.config).get("_items")
+    # Get hyp dict from sweep agent. Copy because train() modifies parameters which confused wandb.
+    hyp_dict = vars(wandb.config).get("_items").copy()

     # Workaround: get necessary opt args
     opt = parse_opt(known=True)
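The one-line `.copy()` matters because Python dicts are passed by reference: anything `train()` mutates in place is also changed in the caller's dict, which is what wandb later reports. A minimal sketch of the aliasing bug (the `train()` stand-in below is hypothetical, not the actual YOLOv5 function):

```python
def train(hyp):
    # Stand-in for YOLOv5's train(): it scales some gains in place,
    # mutating the very dict object the caller handed over.
    hyp["box"] = hyp["box"] * 3.0


# Stands in for the dict retrieved from wandb.config "_items".
config = {"box": 0.05, "obj": 1.0}

# Without a copy, the sweep's proposed value is silently overwritten.
train(config)
mutated_box = config["box"]  # 0.15, not the 0.05 the sweep proposed

# With a shallow copy, the original dict survives intact.
config = {"box": 0.05, "obj": 1.0}
train(config.copy())
preserved_box = config["box"]  # still 0.05
```

A shallow copy suffices here because the overwritten values are top-level scalars; nested mutable values would still be shared and would need `copy.deepcopy()`.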