mirror of https://github.com/PaddlePaddle/PaddleClas.git synced 2025-06-03 21:55:06 +08:00

34 Commits

Author SHA1 Message Date
Nyakku Shigure
cc24ead0ad
Use inspect.getfullargspec instead of deprecated inspect.getargspec () 2024-06-20 20:16:36 +08:00
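The deprecation the commit above works around is a standard-library change: inspect.getargspec was deprecated and then removed in Python 3.11, with inspect.getfullargspec as the drop-in replacement. A minimal sketch of the migration (the function here is hypothetical, not taken from the repo):

```python
import inspect

def example(a, b=1, *args, **kwargs):  # hypothetical function for illustration
    pass

# Deprecated and removed in Python 3.11:
#     spec = inspect.getargspec(example)
# Replacement, which also reports keyword-only args and annotations:
spec = inspect.getfullargspec(example)
print(spec.args)      # ['a', 'b']
print(spec.varargs)   # 'args'
print(spec.varkw)     # 'kwargs'
print(spec.defaults)  # (1,)
```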
sky
54767fdda4
fix adamwdl bug () 2024-03-04 14:18:29 +08:00
Tingquan Gao
0675971d0b
rm fluid api () 2023-08-11 15:51:42 +08:00
Tingquan Gao
bb1596b8c3
compatibility with python 3.11 () 2023-07-20 10:50:52 +08:00
gaotingquan
042d1e7ef8 fix layer key name for dynamic lr in adamwdl optimizer 2023-05-26 15:40:48 +08:00
Yang Nie
b66ee6384b fix RMSProp one_dim_param_no_weight_decay 2023-05-06 19:04:37 +08:00
Yang Nie
c351dac67e add tinynet 2023-05-06 19:04:37 +08:00
gaotingquan
73f4d8e4ce avoid causing issues for models that do not set no_weight_decay.
there seems to be a difference in optimizer behavior between passing a plain [] parameter list and passing [{"params":}, {"params":}] parameter groups
2023-04-12 20:55:38 +08:00
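For context on the `[]` vs `[{"params":}, {"params":}]` distinction noted in the commit above: Paddle optimizers accept either a flat list of parameters or a list of parameter-group dicts, and only the grouped form allows per-group settings such as disabling weight decay for 1-D parameters. A hedged sketch of the grouped form (the toy model and hyperparameters are made up for illustration, not PaddleClas code):

```python
import paddle

model = paddle.nn.Sequential(
    paddle.nn.Linear(8, 8),
    paddle.nn.LayerNorm(8),
)

# Flat form: one shared weight_decay applies to every parameter.
#     opt = paddle.optimizer.AdamW(learning_rate=1e-3,
#                                  parameters=model.parameters(),
#                                  weight_decay=1e-2)

# Grouped form: 1-D params (biases, norm scales) get no weight decay.
decay, no_decay = [], []
for p in model.parameters():
    (no_decay if len(p.shape) == 1 else decay).append(p)

opt = paddle.optimizer.AdamW(
    learning_rate=1e-3,
    parameters=[
        {"params": decay},
        {"params": no_decay, "weight_decay": 0.0},
    ],
    weight_decay=1e-2,
)
```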
Yang Nie
f36ffbc492 fix 2023-04-10 15:02:54 +08:00
Yang Nie
beca8b2c1b add mobilenext
add cooldown config

update optimizer

fix ParamAttr & update test_tipc

fix tipc

update tipc config

remove docs of `_make_divisible`

refactor the implementation of "no weight decay"

fix model name

remove cooldown config
2023-04-05 00:41:19 +08:00
dolcexu
c779cc14eb adamwdl fix 2023-03-02 19:20:18 +08:00
dongshuilong
944763d7a5 add fixmatch 2022-10-25 12:04:22 +08:00
zengshao0622
7b50ce6585 merge CAE 2022-09-08 08:11:25 +00:00
HydrogenSulfate
43410aa852 update config, add amp eval for retrieval 2022-06-21 23:03:56 +08:00
lubin
509f4d77e3 modify the cifar10 dataset format 2022-02-28 11:47:42 +00:00
lubin
cef3cb25d9 update optimizer and some comments 2022-02-28 08:02:51 +00:00
lubin
2507be1a51 add deephash configs and dch algorithm 2022-02-23 11:50:39 +00:00
zhangbo9674
558f03d684 refine code 2021-12-21 06:30:13 +00:00
zhangbo9674
28061f537c refine optimizer init logic 2021-12-21 06:28:13 +00:00
zhangbo9674
b54ee04491 Accelerate dynamic graph amp training 2021-12-20 06:36:56 +00:00
gaotingquan
7dcb2d4fd0 fix: raise exception
raise an exception when no_weight_decay is used with AdamW in static graph mode
2021-09-30 10:48:36 +00:00
gaotingquan
c7aeec28e2 fix: support static graph 2021-09-30 06:57:17 +00:00
gaotingquan
079434dc5f feat: add AdamW 2021-09-01 08:07:48 +00:00
littletomatodonkey
0bc5b54d19
fix opt () 2021-07-15 15:19:41 +08:00
littletomatodonkey
9d9cd3719e
add static training ()
* add static training

* fix typo

* add se fp16

* rm note

* fix loader

* fix cfg
2021-07-15 10:30:07 +08:00
littletomatodonkey
487c797230
fix optimizer builder () 2021-05-28 10:39:51 +08:00
huangxu96
4e43ec6995 new usage of amp training. ()
* new usage of amp training.

* change the usage of amp and pure fp16 training.

* modified code according to review comments
2021-02-26 09:25:54 +00:00
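Several of the commits above touch AMP and pure-fp16 training. As a rough sketch of the dynamic-graph AMP pattern they refer to (a toy model and optimizer chosen for illustration; the exact arguments used in PaddleClas may differ):

```python
import paddle

model = paddle.nn.Linear(16, 16)
optimizer = paddle.optimizer.Momentum(learning_rate=0.1,
                                      parameters=model.parameters())
scaler = paddle.amp.GradScaler(init_loss_scaling=1024)

data = paddle.randn([4, 16])
with paddle.amp.auto_cast():        # run the forward pass in mixed precision
    loss = model(data).mean()

scaled = scaler.scale(loss)         # scale the loss to avoid fp16 underflow
scaled.backward()
scaler.minimize(optimizer, scaled)  # unscale grads, step, update loss scaling
optimizer.clear_grad()
```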
littletomatodonkey
6a5f4626d7
add static running in dygraph ()
* add static running in dygraph
2020-11-18 09:48:56 +08:00
littletomatodonkey
7ef474169d
polish API to Paddle 2.0-rc ()
polish code and docs to adapt to Paddle 2.0-rc
2020-10-30 00:20:48 +08:00
littletomatodonkey
4773a2177e fix opt 2020-09-17 13:24:33 +00:00
littletomatodonkey
b8a7d186d7 fix optimizer and regularizer 2020-09-15 09:43:19 +00:00
littletomatodonkey
a43aac3250 fix optimizer 2020-09-03 08:39:14 +00:00
WuHaobo
ed2f71ca68 dygraph optimizer 2020-06-07 15:39:26 +08:00
WuHaobo
9f39da8859 Init PaddleClas 2020-04-09 02:16:30 +08:00