Welcome to the guide on integrating [Ultralytics YOLOv5](https://github.com/ultralytics/yolov5) with [Comet](https://www.comet.com/site/)! Comet provides powerful tools for experiment tracking, model management, and visualization, enhancing your [machine learning](https://www.ultralytics.com/glossary/machine-learning-ml) workflow. This document details how to leverage Comet to monitor training, log results, manage datasets, and optimize hyperparameters for your YOLOv5 models.
[Comet](https://www.comet.com/site/) builds tools that help data scientists, engineers, and team leaders accelerate and optimize machine learning and [deep learning](https://www.ultralytics.com/glossary/deep-learning-dl) models.
Track and visualize model metrics in real-time, save your [hyperparameters](https://docs.ultralytics.com/guides/hyperparameter-tuning/), datasets, and model checkpoints, and visualize your model predictions with Comet Custom Panels! Comet ensures you never lose track of your work and makes it easy to share results and collaborate across teams of all sizes. Find more information in the [Comet Documentation](https://www.comet.com/docs/v2/).
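To get started, install the Comet client library, configure your credentials, and launch a standard YOLOv5 training run. The snippet below is a minimal sketch that assumes you are working from a cloned YOLOv5 repository with its requirements installed; the placeholder credentials and training arguments are illustrative only.

```bash
# Install the Comet client library inside your YOLOv5 environment
pip install comet_ml

# Configure Comet credentials via environment variables (placeholders shown)
export COMET_API_KEY=<Your Comet API Key>
export COMET_PROJECT_NAME=<Your Comet Project Name>  # optional

# Start a standard YOLOv5 training run; Comet logging is picked up automatically
python train.py --img 640 --batch 16 --epochs 5 --data coco128.yaml --weights yolov5s.pt
```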
That's it! Comet automatically logs hyperparameters, command-line arguments, and training/validation metrics. Visualize and analyze your runs in the Comet UI. For more details on training, see the [Ultralytics Training documentation](https://docs.ultralytics.com/modes/train/).
- **[View Example Run on Comet](https://www.comet.com/examples/comet-example-yolov5/a0e29e0e9b984e4a822db2a62d0cb357?experiment-tab=chart&showOutliers=true&smoothing=0&transformY=smoothing&xAxis=step&utm_source=yolov5&utm_medium=partner&utm_campaign=partner_yolov5_2022&utm_content=github_readme)**
Run the example yourself using this [Google Colab](https://colab.research.google.com/) Notebook:
[Open in Colab](https://colab.research.google.com/github/comet-ml/comet-examples/blob/master/integrations/model-training/yolov5/notebooks/Comet_and_YOLOv5.ipynb)
- **Performance:** [mAP@0.5](https://www.ultralytics.com/glossary/mean-average-precision-map), mAP@0.5:0.95 (Validation). Learn more about these metrics in the [YOLO Performance Metrics guide](https://docs.ultralytics.com/guides/yolo-performance-metrics/).
- **[Precision](https://www.ultralytics.com/glossary/precision) and [Recall](https://www.ultralytics.com/glossary/recall):** Computed on the validation data.
- **[Confusion Matrix](https://www.ultralytics.com/glossary/confusion-matrix):** Model predictions on validation data, useful for understanding classification performance ([Wikipedia definition](https://en.wikipedia.org/wiki/Confusion_matrix)).
Model checkpoint logging to Comet is disabled by default. Enable it using the `--save-period` argument during training. This saves checkpoints to Comet at the specified epoch interval.
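For example, the sketch below saves a checkpoint to Comet at the end of every epoch; the remaining training arguments are illustrative.

```bash
# Log a model checkpoint to Comet after every epoch
python train.py \
  --img 640 \
  --batch 16 \
  --epochs 5 \
  --data coco128.yaml \
  --weights yolov5s.pt \
  --save-period 1
```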
Checkpoints will appear in the "Assets & Artifacts" tab of your Comet experiment. Learn more about model management in the [Comet Model Registry documentation](https://www.comet.com/docs/v2/guides/model-registry/).
By default, model predictions (images, ground truth labels, [bounding boxes](https://www.ultralytics.com/glossary/bounding-box)) for the validation set are logged. Control the logging frequency using the `--bbox_interval` argument, which specifies logging every Nth batch per epoch.
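For instance, the following sketch logs prediction images every second batch of each epoch; the other arguments are illustrative.

```bash
# Log validation predictions (images, labels, bounding boxes) every 2nd batch
python train.py \
  --img 640 \
  --batch 16 \
  --epochs 5 \
  --data coco128.yaml \
  --weights yolov5s.pt \
  --bbox_interval 2
```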
Visualize predictions using Comet's Object Detection Custom Panel. See an [example project using the Panel here](https://www.comet.com/examples/comet-example-yolov5?shareable=YcwMiJaZSXfcEXpGOHDD12vA1&utm_source=yolov5&utm_medium=partner&utm_campaign=partner_yolov5_2022&utm_content=github_readme).
Upload your dataset using the `--upload_dataset` flag. Ensure your dataset follows the structure described in the [Ultralytics Datasets documentation](https://docs.ultralytics.com/datasets/) and the dataset config [YAML](https://www.ultralytics.com/glossary/yaml) file matches the format of `coco128.yaml` (see the [COCO128 dataset docs](https://docs.ultralytics.com/datasets/detect/coco128/)).
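For example, the sketch below uploads the dataset referenced by the data config as a Comet Artifact when training starts; the other arguments are illustrative.

```bash
# Upload the dataset to Comet as an Artifact alongside the training run
python train.py \
  --img 640 \
  --batch 16 \
  --epochs 5 \
  --data coco128.yaml \
  --weights yolov5s.pt \
  --upload_dataset
```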
If a training run is interrupted (e.g., due to connection issues), you can resume it using the `--resume` flag with the Comet Run Path (`comet://<your_workspace>/<your_project>/<experiment_id>`).
This restores the model state, hyperparameters, arguments, and downloads necessary Artifacts, continuing logging to the existing Comet Experiment. Learn more about [resuming runs in the Comet documentation](https://www.comet.com/docs/v2/guides/experiment-logging/resume-experiment/).
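A resumed run looks roughly like this, where the placeholder run path is copied from the interrupted experiment's page in the Comet UI:

```bash
# Resume an interrupted run from its Comet Run Path (placeholder shown)
python train.py --resume "comet://<your_workspace>/<your_project>/<experiment_id>"
```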
YOLOv5 integrates with the [Comet Optimizer](https://www.comet.com/docs/v2/guides/hyperparameter-optimization/) for easy hyperparameter sweeps and visualization. This helps in finding the best set of parameters for your model, a process often referred to as [Hyperparameter Tuning](https://docs.ultralytics.com/guides/hyperparameter-tuning/).
Create a [JSON](https://www.ultralytics.com/glossary/json) configuration file defining the sweep parameters, search strategy, and objective metric. An example is provided at `utils/loggers/comet/optimizer_config.json`.
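With the configuration file in place, the sweep can be launched as sketched below, either as a single process via the integration's `hpo.py` script or with parallel workers via the `comet optimizer` CLI. The paths and flags shown follow the current YOLOv5 repository layout and may change between versions.

```bash
# Run the sweep in a single process using the bundled hpo.py script
python utils/loggers/comet/hpo.py \
  --comet_optimizer_config "utils/loggers/comet/optimizer_config.json"

# Or run the sweep with multiple parallel workers via the Comet Optimizer CLI
comet optimizer -j <num_workers> utils/loggers/comet/hpo.py \
  utils/loggers/comet/optimizer_config.json
```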
Replace `<num_workers>` with the desired number of parallel processes.
### Visualizing HPO Results
Comet offers various visualizations for analyzing sweep results, such as parallel coordinate plots and parameter importance plots. Explore a [project with a completed sweep here](https://www.comet.com/examples/comet-example-yolov5/view/PrlArHGuuhDTKC1UuBmTtOSXD/panels?utm_source=yolov5&utm_medium=partner&utm_campaign=partner_yolov5_2022&utm_content=github_readme).
Contributions to enhance the YOLOv5-Comet integration are welcome! Please see the [Ultralytics Contributing Guide](https://docs.ultralytics.com/help/contributing/) for more information on how to get involved. Thank you for helping improve this integration!