Created using Colaboratory
parent 08c8c3e00a
commit e5991c9867
@@ -414,7 +414,7 @@
 "import utils\n",
 "display = utils.notebook_init() # checks"
 ],
-"execution_count": 1,
+"execution_count": null,
 "outputs": [
 {
 "output_type": "stream",
@@ -466,7 +466,7 @@
 "!python detect.py --weights yolov5s.pt --img 640 --conf 0.25 --source data/images\n",
 "#display.Image(filename='runs/detect/exp/zidane.jpg', width=600)"
 ],
-"execution_count": 2,
+"execution_count": null,
 "outputs": [
 {
 "output_type": "stream",
@@ -546,7 +546,7 @@
 "torch.hub.download_url_to_file('https://ultralytics.com/assets/coco2017val.zip', 'tmp.zip')\n",
 "!unzip -q tmp.zip -d ../datasets && rm tmp.zip"
 ],
-"execution_count": 3,
+"execution_count": null,
 "outputs": [
 {
 "output_type": "display_data",
@@ -577,7 +577,7 @@
 "# Run YOLOv5x on COCO val\n",
 "!python val.py --weights yolov5x.pt --data coco.yaml --img 640 --iou 0.65 --half"
 ],
-"execution_count": 4,
+"execution_count": null,
 "outputs": [
 {
 "output_type": "stream",
@@ -737,7 +737,7 @@
 "# Train YOLOv5s on COCO128 for 3 epochs\n",
 "!python train.py --img 640 --batch 16 --epochs 3 --data coco128.yaml --weights yolov5s.pt --cache"
 ],
-"execution_count": 5,
+"execution_count": null,
 "outputs": [
 {
 "output_type": "stream",
@@ -917,13 +917,14 @@
 "id": "DLI1JmHU7B0l"
 },
 "source": [
-"## Weights & Biases Logging 🌟 NEW\n",
+"## Weights & Biases Logging\n",
 "\n",
-"[Weights & Biases](https://wandb.ai/site?utm_campaign=repo_yolo_notebook) (W&B) is now integrated with YOLOv5 for real-time visualization and cloud logging of training runs. This allows for better run comparison and introspection, as well improved visibility and collaboration for teams. To enable W&B `pip install wandb`, and then train normally (you will be guided through setup on first use). \n",
+"[Weights & Biases](https://wandb.ai/site?utm_campaign=repo_yolo_notebook) (W&B) is integrated with YOLOv5 for real-time visualization and cloud logging of training runs. This allows for better run comparison and introspection, as well improved visibility and collaboration for teams. To enable W&B `pip install wandb`, and then train normally (you will be guided through setup on first use). \n",
 "\n",
-"During training you will see live updates at [https://wandb.ai/home](https://wandb.ai/home?utm_campaign=repo_yolo_notebook), and you can create and share detailed [Reports](https://wandb.ai/glenn-jocher/yolov5_tutorial/reports/YOLOv5-COCO128-Tutorial-Results--VmlldzozMDI5OTY) of your results. For more information see the [YOLOv5 Weights & Biases Tutorial](https://github.com/ultralytics/yolov5/issues/1289). \n",
+"During training you will see live updates at [https://wandb.ai/home](https://wandb.ai/home?utm_campaign=repo_yolo_notebook), and you can create and share detailed [Reports](https://wandb.ai/glenn-jocher/yolov5_tutorial/reports/YOLOv5-COCO128-Tutorial-Results--VmlldzozMDI5OTY) of your results. For more information see the [YOLOv5 Weights & Biases Tutorial](https://github.com/ultralytics/yolov5/issues/1289). \n",
 "\n",
-"<p align=\"left\"><img width=\"900\" alt=\"Weights & Biases dashboard\" src=\"https://user-images.githubusercontent.com/26833433/135390767-c28b050f-8455-4004-adb0-3b730386e2b2.png\"></p>"
+"<a href=\"https://wandb.ai/glenn-jocher/yolov5_tutorial\">\n",
+"<img alt=\"Weights & Biases dashboard\" src=\"https://user-images.githubusercontent.com/26833433/182482859-288a9622-4661-48db-99de-650d1dead5c6.jpg\" width=\"1280\"/></a>"
 ]
 },
 {
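For reference, the W&B workflow described in the changed markdown above amounts to installing the package, logging in once, and then training normally. A minimal sketch (not part of this commit; it assumes a Colab/Jupyter session with the YOLOv5 repo cloned and a W&B account):

```python
# Sketch only: enable Weights & Biases logging before training (assumption,
# not taken from this diff). On first use wandb.login() prompts for an API key.
%pip install -q wandb

import wandb
wandb.login()  # one-time interactive setup

# Train as usual; with wandb installed, the run is logged automatically
# and a link to the live dashboard is printed in the training output.
!python train.py --img 640 --batch 16 --epochs 3 --data coco128.yaml --weights yolov5s.pt --cache
```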
@@ -934,16 +935,11 @@
 "source": [
 "## Local Logging\n",
 "\n",
-"All results are logged by default to `runs/train`, with a new experiment directory created for each new training as `runs/train/exp2`, `runs/train/exp3`, etc. View train and val jpgs to see mosaics, labels, predictions and augmentation effects. Note an Ultralytics **Mosaic Dataloader** is used for training (shown below), which combines 4 images into 1 mosaic during training.\n",
+"All results are logged by default to `runs/train`, with a new experiment directory created for each new training as `runs/train/exp2`, `runs/train/exp3`, etc. View train and val statistics, mosaics, labels, predictions and augmentations, as well as metrics and charts including Precision-Recall curves and Confusion Matrices. \n",
 "\n",
-"> <img src=\"https://user-images.githubusercontent.com/26833433/131255960-b536647f-7c61-4f60-bbc5-cb2544d71b2a.jpg\" width=\"700\"> \n",
-"`train_batch0.jpg` shows train batch 0 mosaics and labels\n",
+"A **Mosaic Dataloader** is used for training (shown in train*.jpg images), which combines 4 images into 1 mosaic during training.\n",
 "\n",
-"> <img src=\"https://user-images.githubusercontent.com/26833433/131256748-603cafc7-55d1-4e58-ab26-83657761aed9.jpg\" width=\"700\"> \n",
-"`test_batch0_labels.jpg` shows val batch 0 labels\n",
-"\n",
-"> <img src=\"https://user-images.githubusercontent.com/26833433/131256752-3f25d7a5-7b0f-4bb3-ab78-46343c3800fe.jpg\" width=\"700\"> \n",
-"`test_batch0_pred.jpg` shows val batch 0 _predictions_\n",
+"<img alt=\"Local logging results\" src=\"https://user-images.githubusercontent.com/26833433/182486932-628e5bb0-cdea-4581-be1d-66f5e4ddada4.jpg\" width=\"1280\"/>\n",
 "\n",
 "Training results are automatically logged to [Tensorboard](https://www.tensorflow.org/tensorboard) and [CSV](https://github.com/ultralytics/yolov5/pull/4148) as `results.csv`, which is plotted as `results.png` (below) after training completes. You can also plot any `results.csv` file manually:\n",
 "\n",
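The last context line above ends with "You can also plot any `results.csv` file manually:"; the code cell that follows in the notebook is outside this hunk. As a rough illustration only (the path and the column names are assumptions checked against a typical YOLOv5 `results.csv`, not taken from this diff), manual plotting with pandas looks roughly like:

```python
# Sketch, not from this diff: plot one metric from a YOLOv5 results.csv.
# Column names such as 'metrics/mAP_0.5' are assumed; check the header row
# of your own file before relying on them.
import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv("runs/train/exp/results.csv")
df.columns = df.columns.str.strip()  # headers are padded with spaces

plt.plot(df["epoch"], df["metrics/mAP_0.5"], label="mAP@0.5")
plt.xlabel("epoch")
plt.ylabel("mAP@0.5")
plt.legend()
plt.savefig("results_mAP.png")
```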