Commit Graph

307 Commits (84a8099b7523120847070af13ded776143248b62)

Author SHA1 Message Date
Glenn Jocher bcd452c482 replace random_affine() with random_perspective()
Signed-off-by: Glenn Jocher <glenn.jocher@ultralytics.com>
2020-07-31 15:53:52 -07:00
Liu Changyu c020875b17
PyTorch 1.6.0 update with native AMP (#573)
* PyTorch now has native Automatic Mixed Precision (AMP) training.

* Fixed inconsistent code indentation

* Fixed inconsistent code indentation

* Mixed precision training is turned on by default
2020-07-31 10:52:45 -07:00
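A minimal sketch of the native torch.cuda.amp pattern this PR adopts (the model, data and loss below are illustrative placeholders, a CUDA device is assumed, and this is not the exact train.py diff):

    import torch
    import torch.nn as nn

    model = nn.Linear(10, 1).cuda()                    # stand-in for the real model
    optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
    scaler = torch.cuda.amp.GradScaler()               # scales the loss to avoid FP16 gradient underflow

    for _ in range(3):                                 # stand-in for the epoch/batch loop
        x, y = torch.randn(8, 10, device='cuda'), torch.randn(8, 1, device='cuda')
        optimizer.zero_grad()
        with torch.cuda.amp.autocast():                # forward pass runs in mixed precision
            loss = nn.functional.mse_loss(model(x), y)
        scaler.scale(loss).backward()                  # backward on the scaled loss
        scaler.step(optimizer)                         # unscales gradients, then optimizer.step()
        scaler.update()                                # adjusts the scale factor for the next step
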
Laughing 4e2b9ecc7e
LR --resume repeat bug fix (#565) 2020-07-30 10:48:20 -07:00
AlexWang1900 a209a32019
Fix bug #541 #542 (#545)
* fix #541 #542

* Update train.py

Co-authored-by: Glenn Jocher <glenn.jocher@ultralytics.com>
2020-07-28 18:31:01 -07:00
NanoCode012 7f8471eaeb
--notest bug fix (#518)
* Fix missing results_file and fi when --notest is passed

* Update train.py

Reverting previous changes and removing functionality from the 'if not opt.notest or final_epoch:  # Calculate mAP' block.

Co-authored-by: Glenn Jocher <glenn.jocher@ultralytics.com>
2020-07-25 10:24:39 -07:00
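For context, a hedged sketch of the control flow the quoted condition guards, with SimpleNamespace standing in for the parsed options and a print standing in for the actual test.test(...) call:

    from types import SimpleNamespace

    opt = SimpleNamespace(notest=False)            # stand-in for the parsed CLI options
    epochs = 3
    for epoch in range(epochs):
        final_epoch = epoch + 1 == epochs
        if not opt.notest or final_epoch:          # Calculate mAP (the line quoted above)
            print(f'epoch {epoch}: running validation')   # placeholder for test.test(...)
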
Glenn Jocher 9da56b62dd
v2.0 Release (#491)
Signed-off-by: Glenn Jocher <glenn.jocher@ultralytics.com>
2020-07-23 15:34:23 -07:00
Glenn Jocher 5e970d45c4
Update train.py (#462) 2020-07-22 12:32:03 -07:00
Glenn Jocher 3edc38f603 update train.py gsutil bucket fix (#463) 2020-07-21 23:25:33 -07:00
Glenn Jocher 776555771f update train.py gsutil bucket fix (#463) 2020-07-21 23:21:36 -07:00
Glenn Jocher b569ed6d6b pretrained model loading bug fix (#450)
Signed-off-by: Glenn Jocher <glenn.jocher@ultralytics.com>
2020-07-19 22:12:55 -07:00
yzchen 4102fcc9a7
[WIP] Feature/ddp fixed (#401)
* Squashed commit of the following:

commit d738487089e41c22b3b1cd73aa7c1c40320a6ebf
Author: NanoCode012 <kevinvong@rocketmail.com>
Date:   Tue Jul 14 17:33:38 2020 +0700

    Adding world_size

    Reduce calls to torch.distributed. For use in create_dataloader.

commit e742dd9619d29306c7541821238d3d7cddcdc508
Author: yizhi.chen <chenyzsjtu@outlook.com>
Date:   Tue Jul 14 15:38:48 2020 +0800

    Make SyncBN a choice

commit e90d4004387e6103fecad745f8cbc2edc918e906
Merge: 5bf8beb cd90360
Author: yzchen <Chenyzsjtu@gmail.com>
Date:   Tue Jul 14 15:32:10 2020 +0800

    Merge pull request #6 from NanoCode012/patch-5

    Update train.py

commit cd9036017e7f8bd519a8b62adab0f47ea67f4962
Author: NanoCode012 <kevinvong@rocketmail.com>
Date:   Tue Jul 14 13:39:29 2020 +0700

    Update train.py

    Remove redundant `opt.` prefix.

commit 5bf8bebe8873afb18b762fe1f409aca116fac073
Merge: c9558a9 a1c8406
Author: yizhi.chen <chenyzsjtu@outlook.com>
Date:   Tue Jul 14 14:09:51 2020 +0800

    Merge branch 'master' of https://github.com/ultralytics/yolov5 into feature/DDP_fixed

commit c9558a9b51547febb03d9c1ca42e2ef0fc15bb31
Author: yizhi.chen <chenyzsjtu@outlook.com>
Date:   Tue Jul 14 13:51:34 2020 +0800

    Add device allocation for loss compute

commit 4f08c692fb5e943a89e0ee354ef6c80a50eeb28d
Author: yizhi.chen <chenyzsjtu@outlook.com>
Date:   Thu Jul 9 11:16:27 2020 +0800

    Revert drop_last

commit 1dabe33a5a223b758cc761fc8741c6224205a34b
Merge: a1ce9b1 4b8450b
Author: yizhi.chen <chenyzsjtu@outlook.com>
Date:   Thu Jul 9 11:15:49 2020 +0800

    Merge branch 'feature/DDP_fixed' of https://github.com/MagicFrogSJTU/yolov5 into feature/DDP_fixed

commit a1ce9b1e96b71d7fcb9d3e8143013eb8cebe5e27
Author: yizhi.chen <chenyzsjtu@outlook.com>
Date:   Thu Jul 9 11:15:21 2020 +0800

    fix lr warning

commit 4b8450b46db76e5e58cd95df965d4736077cfb0e
Merge: b9a50ae 02c63ef
Author: yzchen <Chenyzsjtu@gmail.com>
Date:   Wed Jul 8 21:24:24 2020 +0800

    Merge pull request #4 from NanoCode012/patch-4

    Add drop_last for multi gpu

commit 02c63ef81cf98b28b10344fe2cce08a03b143941
Author: NanoCode012 <kevinvong@rocketmail.com>
Date:   Wed Jul 8 10:08:30 2020 +0700

    Add drop_last for multi gpu

commit b9a50aed48ab1536f94d49269977e2accd67748f
Merge: ec2dc6c 121d90b
Author: yizhi.chen <chenyzsjtu@outlook.com>
Date:   Tue Jul 7 19:48:04 2020 +0800

    Merge branch 'master' of https://github.com/ultralytics/yolov5 into feature/DDP_fixed

commit ec2dc6cc56de43ddff939e14c450672d0fbf9b3d
Merge: d0326e3 82a6182
Author: yizhi.chen <chenyzsjtu@outlook.com>
Date:   Tue Jul 7 19:34:31 2020 +0800

    Merge branch 'feature/DDP_fixed' of https://github.com/MagicFrogSJTU/yolov5 into feature/DDP_fixed

commit d0326e398dfeeeac611ccc64198d4fe91b7aa969
Author: yizhi.chen <chenyzsjtu@outlook.com>
Date:   Tue Jul 7 19:31:24 2020 +0800

    Add SyncBN

commit 82a6182b3ad0689a4432b631b438004e5acb3b74
Merge: 96fa40a 050b2a5
Author: yzchen <Chenyzsjtu@gmail.com>
Date:   Tue Jul 7 19:21:01 2020 +0800

    Merge pull request #1 from NanoCode012/patch-2

    Convert BatchNorm to SyncBatchNorm

commit 050b2a5a79a89c9405854d439a1f70f892139b1c
Author: NanoCode012 <kevinvong@rocketmail.com>
Date:   Tue Jul 7 12:38:14 2020 +0700

    Add cleanup for process_group

commit 2aa330139f3cc1237aeb3132245ed7e5d6da1683
Author: NanoCode012 <kevinvong@rocketmail.com>
Date:   Tue Jul 7 12:07:40 2020 +0700

    Remove apex.parallel. Use torch.nn.parallel

    For future compatibility

commit 77c8e27e603bea9a69e7647587ca8d509dc1990d
Author: NanoCode012 <kevinvong@rocketmail.com>
Date:   Tue Jul 7 01:54:39 2020 +0700

    Convert BatchNorm to SyncBatchNorm

commit 96fa40a3a925e4ffd815fe329e1b5181ec92adc8
Author: yizhi.chen <chenyzsjtu@outlook.com>
Date:   Mon Jul 6 21:53:56 2020 +0800

    Fix the dataset inconsistency problem

commit 16e7c269d062c8d16c4d4ff70cc80fd87935dc95
Author: yizhi.chen <chenyzsjtu@outlook.com>
Date:   Mon Jul 6 11:34:03 2020 +0800

    Add loss multiplication to preserve the single-process performance

commit e83805563065ffd2e38f85abe008fc662cc17909
Merge: 625bb49 3bdea3f
Author: yizhi.chen <chenyzsjtu@outlook.com>
Date:   Fri Jul 3 20:56:30 2020 +0800

    Merge branch 'master' of https://github.com/ultralytics/yolov5 into feature/DDP_fixed

commit 625bb49f4e52d781143fea0af36d14e5be8b040c
Author: yizhi.chen <chenyzsjtu@outlook.com>
Date:   Thu Jul 2 22:45:15 2020 +0800

    DDP established

* Squashed commit of the following:

commit 94147314e559a6bdd13cb9de62490d385c27596f
Merge: 65157e2 37acbdc
Author: yizhi.chen <chenyzsjtu@outlook.com>
Date:   Thu Jul 16 14:00:17 2020 +0800

    Merge branch 'master' of https://github.com/ultralytics/yolov5 into feature/DDP_fixed

commit 37acbdc0b6
Author: Glenn Jocher <glenn.jocher@ultralytics.com>
Date:   Wed Jul 15 20:03:41 2020 -0700

    update test.py --save-txt

commit b8c2da4a0d
Author: Glenn Jocher <glenn.jocher@ultralytics.com>
Date:   Wed Jul 15 20:00:48 2020 -0700

    update test.py --save-txt

commit 65157e2fc97d371bc576e18b424e130eb3026917
Author: yizhi.chen <chenyzsjtu@outlook.com>
Date:   Wed Jul 15 16:44:13 2020 +0800

    Revert the README.md removal

commit 1c802bfa503623661d8617ca3f259835d27c5345
Merge: cd55b44 0f3b8bb
Author: yizhi.chen <chenyzsjtu@outlook.com>
Date:   Wed Jul 15 16:43:38 2020 +0800

    Merge branch 'feature/DDP_fixed' of https://github.com/MagicFrogSJTU/yolov5 into feature/DDP_fixed

commit cd55b445c4dcd8003ff4b0b46b64adf7c16e5ce7
Author: yizhi.chen <chenyzsjtu@outlook.com>
Date:   Wed Jul 15 16:42:33 2020 +0800

    fix the DDP performance deterioration bug.

commit 0f3b8bb1fae5885474ba861bbbd1924fb622ee93
Author: Glenn Jocher <glenn.jocher@ultralytics.com>
Date:   Wed Jul 15 00:28:53 2020 -0700

    Delete README.md

commit f5921ba1e35475f24b062456a890238cb7a3cf94
Merge: 85ab2f3 bd3fdbb
Author: yizhi.chen <chenyzsjtu@outlook.com>
Date:   Wed Jul 15 11:20:17 2020 +0800

    Merge branch 'feature/DDP_fixed' of https://github.com/MagicFrogSJTU/yolov5 into feature/DDP_fixed

commit bd3fdbbf1b08ef87931eef49fa8340621caa7e87
Author: Glenn Jocher <glenn.jocher@ultralytics.com>
Date:   Tue Jul 14 18:38:20 2020 -0700

    Update README.md

commit c1a97a7767ccb2aa9afc7a5e72fd159e7c62ec02
Merge: 2bf86b8 f796708
Author: Glenn Jocher <glenn.jocher@ultralytics.com>
Date:   Tue Jul 14 18:36:53 2020 -0700

    Merge branch 'master' into feature/DDP_fixed

commit 2bf86b892fa2fd712f6530903a0d9b8533d7447a
Author: NanoCode012 <kevinvong@rocketmail.com>
Date:   Tue Jul 14 22:18:15 2020 +0700

    Fixed world_size not found when called from test

commit 85ab2f38cdda28b61ad15a3a5a14c3aafb620dc8
Merge: 5a19011 c8357ad
Author: yizhi.chen <chenyzsjtu@outlook.com>
Date:   Tue Jul 14 22:19:58 2020 +0800

    Merge branch 'feature/DDP_fixed' of https://github.com/MagicFrogSJTU/yolov5 into feature/DDP_fixed

commit 5a19011949398d06e744d8d5521ab4e6dfa06ab7
Author: yizhi.chen <chenyzsjtu@outlook.com>
Date:   Tue Jul 14 22:19:15 2020 +0800

    Add assertion for <=2 gpus DDP

commit c8357ad5b15a0e6aeef4d7fe67ca9637f7322a4d
Merge: e742dd9 787582f
Author: yzchen <Chenyzsjtu@gmail.com>
Date:   Tue Jul 14 22:10:02 2020 +0800

    Merge pull request #8 from MagicFrogSJTU/NanoCode012-patch-1

    Modify number of dataloaders' workers

commit 787582f97251834f955ef05a77072b8c673a8397
Author: NanoCode012 <kevinvong@rocketmail.com>
Date:   Tue Jul 14 20:38:58 2020 +0700

    Fixed issue with single gpu not having world_size

commit 63648925288d63a21174a4dd28f92dbfebfeb75a
Author: NanoCode012 <kevinvong@rocketmail.com>
Date:   Tue Jul 14 19:16:15 2020 +0700

    Add assert message for clarification

    Clarify why assertion was thrown to users

commit 69364d6050e048d0d8834e0f30ce84da3f6a13f3
Author: NanoCode012 <kevinvong@rocketmail.com>
Date:   Tue Jul 14 17:36:48 2020 +0700

    Changed number of workers check

(The remaining commits in this second squash list, from d738487 "Adding world_size" through 625bb49 "DDP established", duplicate the first squash list above and are omitted here.)

* Fixed destroy_process_group in DP mode

* Update torch_utils.py

* Update utils.py

Revert build_targets() to current master.

* Update datasets.py

* Fixed world_size attribute not found

Co-authored-by: NanoCode012 <kevinvong@rocketmail.com>
Co-authored-by: Glenn Jocher <glenn.jocher@ultralytics.com>
2020-07-19 12:33:30 -07:00
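Several of the squashed commits above (converting BatchNorm to SyncBatchNorm, adding world_size, cleaning up the process group) follow the standard torch.distributed pattern; a minimal sketch assuming a launcher such as torch.distributed.launch/torchrun that sets the RANK/WORLD_SIZE/LOCAL_RANK environment variables, not the exact train.py code:

    import os
    import torch
    import torch.distributed as dist
    import torch.nn as nn

    local_rank = int(os.environ.get('LOCAL_RANK', 0))
    torch.cuda.set_device(local_rank)
    dist.init_process_group(backend='nccl')                  # one process per GPU

    model = nn.Sequential(nn.Conv2d(3, 8, 3), nn.BatchNorm2d(8)).cuda()
    model = nn.SyncBatchNorm.convert_sync_batchnorm(model)   # "Convert BatchNorm to SyncBatchNorm"
    model = nn.parallel.DistributedDataParallel(model, device_ids=[local_rank])

    world_size = dist.get_world_size()                       # "Adding world_size", e.g. for create_dataloader

    # ... training loop ...

    dist.destroy_process_group()                             # "Add cleanup for process_group"
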
Glenn Jocher 07493a715c update train.py class count assertion #424
Signed-off-by: Glenn Jocher <glenn.jocher@ultralytics.com>
2020-07-19 01:57:42 -07:00
Glenn Jocher 912c06a67c update train.py class count assertion #424 2020-07-19 01:51:41 -07:00
Glenn Jocher 65857adf52 update train.py ckpt loading 2020-07-18 14:34:12 -07:00
Glenn Jocher 03489aaafb comment tb_writer.add_hparams(hyp, {}) 2020-07-15 10:38:32 -07:00
Glenn Jocher 1e94bcf3d2
Merge pull request #384 from jancio/master
Log hyperparameters in tensorboard
2020-07-14 12:32:27 -07:00
Glenn Jocher 120d40c06a
Update train.py
This updates the PR to a one-liner to minimize additions. Perhaps we can include opt in the future but let's start with this for now.
2020-07-14 12:32:08 -07:00
Janko Ondras 38acc5f3c5 Fix img_size naming in hyperparameters logging 2020-07-14 08:48:46 +02:00
Glenn Jocher a1c8406af3 EMA and non_blocking=True 2020-07-13 20:19:10 -07:00
Glenn Jocher 140d84cca1 comment updates 2020-07-13 12:17:52 -07:00
Janko Ondras e558963244 Log hyperparameters in tensorboard
Log both hyperparameters and command line options in tensorboard.
2020-07-13 12:55:43 +02:00
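A minimal sketch of the tensorboard hyperparameter logging these commits discuss, using the tb_writer.add_hparams(hyp, {}) call named above (the hyp values and log_dir are illustrative placeholders):

    from torch.utils.tensorboard import SummaryWriter

    hyp = {'lr0': 0.01, 'momentum': 0.937, 'weight_decay': 5e-4}   # illustrative hyperparameters
    tb_writer = SummaryWriter(log_dir='runs/exp')
    tb_writer.add_hparams(hyp, {})   # log hyperparameters with no associated metrics
    tb_writer.close()
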
Glenn Jocher 01a73ec08e multi-gpu ckpt filesize bug fix #253 2020-07-11 12:39:27 -07:00
Glenn Jocher a586751904 multi-gpu ckpt filesize bug fix #253 2020-07-11 12:35:21 -07:00
Glenn Jocher 5de4e25d68 update tensorboard metric 2020-07-11 11:39:02 -07:00
Glenn Jocher 98fc483abc train.py results.txt to bucket bug fix 2020-07-11 09:31:53 -07:00
Glenn Jocher 1c13e67b33 evolution bug fix #346 2020-07-09 22:08:42 -07:00
Glenn Jocher e16e9e43e1 new nc=len(names) check 2020-07-09 17:10:43 -07:00
Glenn Jocher cb527d3af9 new nc=len(names) check 2020-07-09 17:03:12 -07:00
Glenn Jocher 603ea0bfdc update log_dir to runs/exp #107 2020-07-09 15:58:07 -07:00
Glenn Jocher 72d5b58b9a disable LR plot to suppress warning message 2020-07-09 15:16:57 -07:00
Glenn Jocher 24c5a941f0 --resume EMA fix #292 2020-07-09 15:09:06 -07:00
Alex Stoken 9d631408a2
Move hyp and opt yaml save to top of train()
Fixes bug where scaled values were saved in hyp.yaml, which would cause continuity issues with --resume
2020-07-09 16:18:55 -05:00
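A hedged sketch of the fix described here: write hyp.yaml and opt.yaml at the start of train(), before any values are scaled, so --resume sees the originals (the file names follow commit messages elsewhere in this log; the rest is illustrative):

    import os
    import yaml
    from types import SimpleNamespace

    def train(hyp, opt, log_dir='runs/exp'):
        os.makedirs(log_dir, exist_ok=True)
        with open(os.path.join(log_dir, 'hyp.yaml'), 'w') as f:
            yaml.dump(hyp, f, sort_keys=False)                 # "don't sort keys"
        with open(os.path.join(log_dir, 'opt.yaml'), 'w') as f:
            yaml.dump(vars(opt), f, sort_keys=False)           # "Turn opt into dictionary before sending it to yaml"
        # ... scaling of hyp values and the training loop happen after this point ...

    train({'lr0': 0.01}, SimpleNamespace(epochs=300, batch_size=16))
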
Glenn Jocher bf6f41567a hyperparameter printout update 2020-07-08 17:21:00 -07:00
Glenn Jocher dc5e18390a
Merge branch 'master' into advanced_logging 2020-07-08 17:01:19 -07:00
Glenn Jocher 6b134d93c5
Update train.py 2020-07-08 16:58:13 -07:00
Glenn Jocher 16f6834486 update train.py and experimental.py 2020-07-08 14:23:34 -07:00
Alex Stoken 52bac22f09 Add in --resume functionality with option to specify path or to get most recent run 2020-07-07 10:42:28 -05:00
Alex Stoken f517ba81c7
Merge branch 'master' into advanced_logging 2020-07-06 16:52:11 -05:00
Laughing 956511dafd
fix LR bug 2020-07-05 15:08:24 +08:00
Glenn Jocher bb3c346916 model.yaml nc inherited from dataset.yaml 2020-07-04 17:51:54 -07:00
Glenn Jocher df224a0d8f EMA bug fix #279 2020-07-03 11:56:14 -07:00
Glenn Jocher 3bdea3f697 strip_optimizer() bug fix #253 2020-07-02 21:24:26 -07:00
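A hedged sketch of what a strip_optimizer-style utility does, i.e. drop the optimizer state from a saved checkpoint so the file shrinks (the 'optimizer' key and path are assumptions; the repository's actual implementation may differ):

    import torch

    def strip_optimizer(path='last.pt'):            # placeholder checkpoint path
        ckpt = torch.load(path, map_location='cpu')
        ckpt['optimizer'] = None                    # assumed key; optimizer state is most of the file size
        torch.save(ckpt, path)
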
Glenn Jocher e02a189a3a
Merge pull request #245 from yxNONG/patch-2
Unify the checkpoint of single and multi GPU
2020-07-02 12:05:46 -07:00
Glenn Jocher 597ed4ce63
Update train.py 2020-07-02 12:00:55 -07:00
Glenn Jocher 13f69777a6 typo fix 2020-07-02 09:26:03 -07:00
yxNONG 1aa2b67933
Update train.py 2020-07-02 13:51:52 +08:00
Glenn Jocher 86784cfdbf --resume bug fix #252 2020-06-30 21:43:53 -07:00
Glenn Jocher ad4c22cbfe --resume bug fix #187 2020-06-30 16:16:29 -07:00
Glenn Jocher 3b16c865f0 assert --epochs 2020-06-30 14:08:08 -07:00
yxNONG cdb9bde181
Unify the checkpoint of single and multi GPU
Save model.hyp etc. to the checkpoint when using multi-GPU training
2020-06-30 19:06:28 +08:00
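A hedged sketch of the idea in this change: when the model is wrapped in DataParallel or DDP, attributes such as hyp live on model.module, so the checkpoint should be built from the unwrapped module (the checkpoint keys here are illustrative, not the exact train.py dict):

    import torch
    import torch.nn as nn

    def save_checkpoint(model, path='last.pt'):
        m = model.module if hasattr(model, 'module') else model   # unwrap DataParallel/DDP
        ckpt = {'model': m.state_dict(), 'hyp': getattr(m, 'hyp', None)}
        torch.save(ckpt, path)

    model = nn.DataParallel(nn.Linear(4, 2))
    model.module.hyp = {'lr0': 0.01}
    save_checkpoint(model)
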
Glenn Jocher b203c9b7ff update train.py incompatible model message fix #222 2020-06-29 12:45:25 -07:00
Glenn Jocher 37e13f8846 update mosaic border 2020-06-27 13:50:15 -07:00
Alex Stoken e18e6811dc
Merge branch 'master' into advanced_logging 2020-06-27 10:13:03 -05:00
Glenn Jocher 22fb2b0c25 refactor dataloader 2020-06-26 18:56:13 -07:00
Glenn Jocher 256a3e89d2 small dataset bug fix #140 2020-06-25 17:52:56 -07:00
Glenn Jocher b50fdf16af model.names multi-GPU bug fix #94 2020-06-24 22:22:13 -07:00
Alex Stoken de191655e4 Fix yaml saving (don't sort keys), reorder --opt keys, fix hyp dict accessor bug 2020-06-24 17:21:54 -05:00
Alex Stoken 2d396bea00 Fix bug in --help from percent sign in help string 2020-06-24 16:57:12 -05:00
Glenn Jocher b8557f87e3 add stride to datasets.py 2020-06-24 13:02:27 -07:00
Alex Stoken 611aacf1bf Turn opt into dictionary before sending it to yaml 2020-06-24 10:49:08 -05:00
Alex Stoken bc4ef4861b Default optimizer SGD 2020-06-24 10:07:43 -05:00
Alex Stoken 7abf202cad Move all optimizer settings to 'hyp.yaml', integrate proper momentum with Adam optimizer 2020-06-24 10:03:21 -05:00
Alex Stoken 7edbf6570e Fix help message for cfg files 2020-06-24 09:45:57 -05:00
Alex Stoken d64ad0fbf3 Remove --resume functionality and related checks/logic. 2020-06-24 09:17:27 -05:00
Glenn Jocher 6c1b87a42e update google_utils import 2020-06-22 23:00:23 -07:00
Glenn Jocher 1f1917ef56 remove fast, add merge 2020-06-21 13:37:11 -07:00
Alex Stoken e572bb0803 Add plot_results save location to log_dir 2020-06-21 09:36:28 -05:00
Lornatang 899f1d4bde Fix DDP bug in single process multiple device use cases 2020-06-20 13:00:03 +08:00
Glenn Jocher cdf1eac9f7
Merge pull request #107 from Lornatang/fix-reference-bugs
fix reference bug
2020-06-19 14:51:41 -07:00
Glenn Jocher cce95e744d backbone as FP16, save default to FP32 2020-06-18 00:13:18 -07:00
Glenn Jocher d9b64c27c2 save ckpt in FP16 #119 2020-06-17 22:34:13 -07:00
Glenn Jocher 9fdb0fbacf AutoAnchor bug fix #117 2020-06-17 19:51:15 -07:00
Alex Stoken c8152c81a6 Syntax fixes 2020-06-17 16:32:13 -05:00
Alex Stoken 9b7386f603 Add save_dir arg to test.test, use arg as location for saving batch jpgs 2020-06-17 16:08:46 -05:00
Alex Stoken 945307beba Add save_dir to plot_lr_scheduler and plot_labels
Set save_dir = log_dir in train.py
2020-06-17 16:03:18 -05:00
Alex Stoken 3b2b330872 Move results.txt from weights/ to log_dir 2020-06-17 15:55:45 -05:00
Alex Stoken ade023cff2 Fix hyp file read-in and dict update.
Add example of hyp yaml
2020-06-17 10:59:20 -05:00
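A minimal sketch of the hyp-yaml workflow these commits describe: defaults defined in code, then updated from a user-supplied yaml file when one is given (the file name and keys are placeholders):

    import yaml

    hyp = {'lr0': 0.01, 'momentum': 0.937, 'weight_decay': 5e-4}   # built-in defaults
    hyp_file = 'hyp.custom.yaml'                                   # placeholder for the hyp yaml parser argument
    try:
        with open(hyp_file) as f:
            hyp.update(yaml.safe_load(f) or {})                    # override defaults with provided keys
    except FileNotFoundError:
        pass                                                       # no yaml given: keep the defaults
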
Lornatang 2368603484 fix reference bug
In torch==1.5 the API import path changed. It does not interrupt the program's operation, but it is an implicit error and may throw an exception in later versions.
2020-06-17 09:56:26 +08:00
Glenn Jocher 8db51c7002 tb_writer bug fix 2020-06-16 16:05:28 -07:00
Alex Stoken 5f2eeba233 remove old print statements 2020-06-16 17:09:39 -05:00
Glenn Jocher afe1df385b dist.destroy_process_group() bug fix 2020-06-16 15:08:14 -07:00
Alex Stoken 333f678b37 add logic to update the default hyp dict with the provided yaml 2020-06-16 16:36:20 -05:00
Alex Stoken a448c3bcd7 add logic for resuming and getting hyp for resume run 2020-06-16 16:30:12 -05:00
Alex Stoken 25e51bcec7 add util function to get most recent last.pt file
added logic in train.py __main__ to handle resuming from a run
2020-06-16 15:50:27 -05:00
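A hedged sketch of the "most recent last.pt" utility described here: search the runs directory recursively for last*.pt checkpoints and return the newest one (the runs/ layout is assumed from other commits in this log):

    import glob
    import os

    def get_latest_run(search_dir='runs'):
        """Return the most recently created last*.pt under search_dir, or '' if none exist."""
        ckpts = glob.glob(f'{search_dir}/**/last*.pt', recursive=True)
        return max(ckpts, key=os.path.getctime) if ckpts else ''
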
Alex Stoken 490f1e7b9c add save_dir arg to plot_lr_scheduler, default to current dir.
Uncomment plot_lr_scheduler in train() and pass log_dir as save location
2020-06-16 15:13:03 -05:00
Alex Stoken 4418809cf5 change weights dir (wdir) to be unique to each run, under log_dir 2020-06-16 15:09:51 -05:00
Alex Stoken d9f446cd81 add save yaml of opt and hyp to tensorboard log_dir in train() 2020-06-16 15:06:13 -05:00
Alex Stoken a85e6d0fc0 add parser arg for hyp yaml file 2020-06-16 14:53:32 -05:00
Glenn Jocher 5a50491fa1 check_anchors bug fix 2020-06-16 10:36:35 -07:00
Glenn Jocher 05b8ee5ca4 check_anchors() bug fix #102 2020-06-16 10:34:16 -07:00
Glenn Jocher ec81c7b5f2 check_anchors() bug fix #90 2020-06-16 10:14:04 -07:00
Glenn Jocher 8b26e89006 AutoAnchor bug fix #72 2020-06-16 00:53:34 -07:00
Glenn Jocher bdd9fee841 update fast mode 2020-06-16 00:29:54 -07:00
Glenn Jocher 1c0b6236e3 update fast mode 2020-06-16 00:11:29 -07:00
Glenn Jocher 915b1481fc default check_git_status() to True 2020-06-15 16:18:46 -07:00
Glenn Jocher 14523bb030 FP16 to FP32 ckpt load 2020-06-15 13:18:39 -07:00
Glenn Jocher c5966abba8 glob search bug fix #77 2020-06-15 12:08:57 -07:00
Glenn Jocher 31f3310029 assert best possible recall > 0.9 before training 2020-06-13 15:05:41 -07:00
Lauritzen Kasper Primdal c3d4d321d3 Ensures weights/ dir exists
Allows train.py to be run outside of yolov5/ directory.
2020-06-13 14:44:23 +02:00
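The corresponding change is essentially a guarded directory creation; a short sketch with wdir as a placeholder for the run's weights path:

    import os

    wdir = 'weights'                    # placeholder for the run's weights directory
    os.makedirs(wdir, exist_ok=True)    # safe whether or not train.py is run from the repo root
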
Glenn Jocher 099e6f5ebd --img-size stride-multiple verification 2020-06-12 22:10:46 -07:00
Glenn Jocher 22d6088205 speed-reproducibility fix #17 2020-06-05 13:07:09 -07:00
Glenn Jocher 55ca5c74d2 multi-scale fix #16 2020-06-05 12:57:16 -07:00
Glenn Jocher 7c2832cd49 assert equal model and dataset classes 2020-06-04 17:21:22 -07:00
Glenn Jocher 11121e39ed updates 2020-06-04 16:52:07 -07:00
Glenn Jocher eb97b2e413 NMS fast mode 2020-06-03 13:02:59 -07:00
Glenn Jocher ce36905358 updates 2020-05-30 00:12:45 -07:00
Glenn Jocher 1e84a23f38 initial commit 2020-05-29 17:04:54 -07:00