Commit Graph

4980 Commits (b0877289f4181e9c7195895396ffa5cf19493cad)

Author SHA1 Message Date
gaotingquan b0877289f4 disable promote kernel for amp training
compatible with Paddle 2.5 and older versions.
ref: https://github.com/PaddlePaddle/PaddleClas/pull/2798
2023-05-25 11:58:05 +08:00
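
A minimal sketch of the compatibility guard this entry describes, assuming the knob is the `use_promote` keyword of `paddle.amp.auto_cast` (the keyword name comes from newer Paddle releases, not from this log):

```python
import inspect

import paddle

def amp_guard(level="O1"):
    kwargs = {"enable": True, "level": level}
    # `use_promote` (assumed name) only exists in newer Paddle; probing the
    # signature keeps Paddle 2.5 and older working instead of raising TypeError.
    if "use_promote" in inspect.signature(paddle.amp.auto_cast).parameters:
        kwargs["use_promote"] = False
    return paddle.amp.auto_cast(**kwargs)

with amp_guard():
    pass  # forward pass under AMP
```
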
gaotingquan 162f013ebe fix: minimize() doesn't support parameter_list of type dict
there are diffs between step()+update() and minimize().
this will be fixed in https://github.com/PaddlePaddle/Paddle/pull/53773.
2023-05-25 11:58:05 +08:00
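
A sketch contrasting the two scaler paths this entry refers to; the model and loss are placeholders, only the `paddle.amp.GradScaler` calls matter:

```python
import paddle

model = paddle.nn.Linear(4, 4)
opt = paddle.optimizer.Momentum(parameters=model.parameters())
scaler = paddle.amp.GradScaler()

with paddle.amp.auto_cast():
    loss = model(paddle.rand([2, 4])).mean()
scaled = scaler.scale(loss)
scaled.backward()

# Path A (one call) reportedly rejects parameters passed as dicts (parameter
# groups) and can differ slightly from path B:
#     scaler.minimize(opt, scaled)

# Path B: explicit step, then update the loss scale.
scaler.step(opt)
scaler.update()
opt.clear_grad()
```
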
gaotingquan 8b218b01ac refactor amp auto_cast context manager & loss scaler 2023-05-25 11:58:05 +08:00
gaotingquan f884f28853 refactor amp 2023-05-25 11:58:05 +08:00
Yang Nie b2cb417842 add-tinynet-tipc-configs 2023-05-23 19:33:25 +08:00
gaotingquan b3678234fe fix bug when update_freq > iter_per_epoch 2023-05-17 15:19:13 +08:00
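
The log doesn't show the fix itself; one plausible guard, sketched with toy values, is to force an optimizer step at the end of the epoch so accumulated gradients can't spill past it:

```python
import paddle

model = paddle.nn.Linear(4, 1)
opt = paddle.optimizer.SGD(parameters=model.parameters())
iter_per_epoch, update_freq = 5, 8  # update_freq > iter_per_epoch

for i in range(iter_per_epoch):
    loss = model(paddle.rand([2, 4])).mean()
    loss.backward()
    # Without the second clause the optimizer would never step this epoch.
    if (i + 1) % update_freq == 0 or (i + 1) == iter_per_epoch:
        opt.step()
        opt.clear_grad()
```
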
gaotingquan 377950865c getargspec -> getfullargspec
getargspec doesn't support param annotations
2023-05-17 15:19:13 +08:00
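
A small illustration of why the swap is needed: `inspect.getargspec` raises ValueError on annotated parameters (and is removed in Python 3.11), while `inspect.getfullargspec` handles them:

```python
import inspect

def f(x: int, y: str = "a") -> str:  # annotated parameters
    return y * x

# inspect.getargspec(f) raises ValueError here; getfullargspec returns the
# annotations alongside the positional args.
spec = inspect.getfullargspec(f)
print(spec.args)         # ['x', 'y']
print(spec.annotations)  # {'x': int, 'y': str, 'return': str}
```
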
gaotingquan 7e0207e5a4 fix resize param of tipc infer config 2023-05-17 15:19:13 +08:00
gaotingquan bb831c3baa code style 2023-05-17 15:19:13 +08:00
gaotingquan 70a784ce52 fix model name to MobileViTV3 2023-05-17 15:19:13 +08:00
gaotingquan a3e9e99fa0 revert: fix bs 2023-05-17 15:19:13 +08:00
gaotingquan 1770e14e28 rename: v3 -> V3 2023-05-17 15:19:13 +08:00
gaotingquan f42d6b6204 fix name: w24 -> W24 2023-05-17 15:19:13 +08:00
gaotingquan 07b9162bc0 fix pretrained url 2023-05-17 15:19:13 +08:00
gaotingquan a1fa19cd29 rename: v3 -> V3 2023-05-17 15:19:13 +08:00
gaotingquan 2091a59ff5 fix reference url 2023-05-17 15:19:13 +08:00
gaotingquan 890f77411a fix bs and unset update_freq to adapt to 8 gpus 2023-05-17 15:19:13 +08:00
gaotingquan fc9c59c4b1 update pretrained url 2023-05-17 15:19:13 +08:00
gaotingquan fe692cb84e specify pillow version >= 9.0.0
different versions (such as 8.3.2 and 9.x) result in diffs in eval top-1 acc
2023-05-12 21:25:07 +08:00
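
The commit only pins the requirement; an illustrative runtime guard with the same intent might look like:

```python
import PIL

# Illustrative guard: Pillow 8.x and 9.x resize differently enough to move
# eval top-1 accuracy, so fail fast on old versions.
if int(PIL.__version__.split(".")[0]) < 9:
    raise RuntimeError(f"Pillow>=9.0.0 required, found {PIL.__version__}")
```
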
zhangyubo0722 0566e8d20c add_Swinv2_readme 2023-05-12 15:41:21 +08:00
zhangyubo0722 32e692175b add_Swinv2_readme 2023-05-12 15:41:21 +08:00
zhangyubo0722 7cf861cbbe add_Swinv2_readme 2023-05-12 15:41:21 +08:00
zhangyubo0722 9cdc5a20c7 add_Swinv2_readme 2023-05-12 15:41:21 +08:00
gaotingquan 7eaf619c5a revert bs fix 2023-05-10 21:09:49 +08:00
gaotingquan 925ae7a51d install nvidia-dali-cuda11.0 2023-05-10 21:09:49 +08:00
gaotingquan b2dc1788ea fix python3.7 -> python3.10 2023-05-10 21:09:49 +08:00
gaotingquan 0de2f6e989 rm fp32 2023-05-10 21:09:49 +08:00
gaotingquan 6262bc5247 add the r50 w/o dali, nhwc 2023-05-10 21:09:49 +08:00
gaotingquan 0368acac64 rm the useless bs config 2023-05-10 21:09:49 +08:00
gaotingquan 3bf37e9c33 add r50(bs256) w/o dali 2023-05-10 21:09:49 +08:00
gaotingquan 8a52343ff4 add r50 nhwc 2023-05-10 21:09:49 +08:00
gaotingquan 3873ae07bc rename: ResNet50 -> ResNet50_NHWC_dali 2023-05-10 21:09:49 +08:00
gaotingquan 2808fb702f add r50 AMPO2 ultra 2023-05-10 21:09:49 +08:00
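
The r50 entries above add benchmark configs combining DALI input pipelines, NHWC layout, and AMP O2. A hedged sketch of the Paddle-level AMP O2 decoration those configs likely enable (DALI and NHWC are config-level toggles not shown here; run on GPU):

```python
import paddle
from paddle.vision.models import resnet50

model = resnet50()
opt = paddle.optimizer.Momentum(parameters=model.parameters())

# O2 converts parameters to float16 and keeps float32 master weights in the
# optimizer; auto_cast then runs the forward pass in low precision.
model, opt = paddle.amp.decorate(models=model, optimizers=opt, level="O2")

with paddle.amp.auto_cast(level="O2"):
    loss = model(paddle.rand([8, 3, 224, 224])).mean()
```
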
Yang Nie b66ee6384b fix RMSProp one_dim_param_no_weight_decay 2023-05-06 19:04:37 +08:00
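
A hedged sketch of what `one_dim_param_no_weight_decay` does: 1-D parameters (biases, norm scales/offsets) go into a group with weight decay disabled, passed to RMSProp as parameter groups:

```python
import paddle

model = paddle.vision.models.resnet50()
decay, no_decay = [], []
for p in model.parameters():
    # 1-D parameters are biases and norm scales/offsets.
    (no_decay if len(p.shape) == 1 else decay).append(p)

opt = paddle.optimizer.RMSProp(
    learning_rate=0.01,
    weight_decay=1e-4,
    parameters=[
        {"params": decay},
        {"params": no_decay, "weight_decay": 0.0},
    ],
)
```
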
Yang Nie c351dac67e add tinynet 2023-05-06 19:04:37 +08:00
duanyanhui f8fdc5fd98 update npu inference api 2023-05-05 17:29:53 +08:00
zhangting2020 e7bef51f9e fix data dtype for amp training 2023-04-26 18:40:20 +08:00
kangguangli 731006f1fc set seed by configs 2023-04-25 17:39:55 +08:00
kangguangli ee36c40dd3 modify flags 2023-04-25 17:39:55 +08:00
kangguangli 293a216a0b fix random seed 2023-04-25 17:39:55 +08:00
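
A minimal sketch of config-driven seeding for the three entries above; the config key and helper name are assumptions, not code from these commits:

```python
import random

import numpy as np
import paddle

def set_seed(config):
    seed = config.get("seed")  # assumed config key
    if seed is not None:
        paddle.seed(seed)
        np.random.seed(seed)
        random.seed(seed)

set_seed({"seed": 42})
```
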
zh-hike d7bd275379 update foundation_vit from EVA_vit_huge to EVA_vit_giant 2023-04-23 10:16:08 +08:00
Yang Nie 0af4680f86 set num_workers 8 2023-04-19 21:21:06 +08:00
Yang Nie cdd3c3a05c clear type hint 2023-04-19 21:21:06 +08:00
Yang Nie f6ac4a6187 fix typo 2023-04-19 21:21:06 +08:00
Yang Nie 692204eee6 fix code style 2023-04-19 21:21:06 +08:00
Yang Nie e7ad3909c8 update configs for 8gpus 2023-04-19 21:21:06 +08:00
Yang Nie deb8e98779 rename v2 to V2 2023-04-19 21:21:06 +08:00
Yang Nie be6a22be18 add MobileViTv2 2023-04-19 21:21:06 +08:00
gaotingquan 9f621279b8 fix infer output 2023-04-17 20:28:40 +08:00
gaotingquan 73f4d8e4ce to avoid causing issues for models that don't set no_weight_decay.
there seems to be a diff in the optimizer between passing [] and [{"params": ...}, {"params": ...}] as parameters
2023-04-12 20:55:38 +08:00
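
A sketch of the two parameter-passing styles this entry contrasts; the idea is to fall back to a plain list when no parameter opts out of weight decay:

```python
import paddle

model = paddle.nn.Linear(4, 4)
params = list(model.parameters())
no_decay = set()  # parameter names from the model's no_weight_decay, if any

if no_decay:
    parameters = [
        {"params": [p for p in params if p.name not in no_decay]},
        {"params": [p for p in params if p.name in no_decay],
         "weight_decay": 0.0},
    ]
else:
    # Plain list: sidesteps the diff seen with [{"params": ...}] groups.
    parameters = params

opt = paddle.optimizer.AdamW(parameters=parameters, weight_decay=0.05)
```
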