| Name | Last commit message | Last commit date |
|---|---|---|
| layers/ | Halo, bottleneck attn, lambda layer additions and cleanup along w/ experimental model defs | 2021-10-06 16:32:48 -07:00 |
| pruned/ | … | |
| __init__.py | Add ConvMixer | 2021-10-09 21:09:51 -04:00 |
| beit.py | Add BeiT 'finetuned' 1k weights and pretrained 22k weights, pretraining specific (masked) model excluded for now | 2021-09-13 16:38:23 -07:00 |
| byoanet.py | Update lambda_resnet26rpt weights to 78.9, add better halonet26t weights at 79.1 with tweak to attention dim | 2021-10-08 17:44:13 -07:00 |
| byobnet.py | regnetz model default cfg tweaks | 2021-10-06 21:14:59 -07:00 |
| cait.py | Refactoring, cleanup, improved test coverage. | 2021-06-12 16:40:02 -07:00 |
| coat.py | Refactoring, cleanup, improved test coverage. | 2021-06-12 16:40:02 -07:00 |
| convit.py | Merge branch 'master' into cleanup_xla_model_fixes | 2021-06-12 23:19:25 -07:00 |
| convmixer.py | Add ConvMixer | 2021-10-09 21:09:51 -04:00 |
| crossvit.py | A few more crossvit tweaks, fix training w/ no_weight_decay names, add crop option for scaling, adjust default crop_pct for large img size to 1.0 for better results | 2021-09-13 14:17:34 -07:00 |
| cspnet.py | … | |
| densenet.py | … | |
| dla.py | Refactoring, cleanup, improved test coverage. | 2021-06-12 16:40:02 -07:00 |
| dpn.py | Refactoring, cleanup, improved test coverage. | 2021-06-12 16:40:02 -07:00 |
| efficientnet.py | make it possible to provide norm_layer via create_model | 2021-09-21 10:19:04 +01:00 |
| efficientnet_blocks.py | Remove dead code line from efficientnet | 2021-09-30 21:54:42 -07:00 |
| efficientnet_builder.py | Able to use other attn layer in EfficientNet now. Create test ECA + GC B0 configs. Make ECA more configurable. | 2021-05-30 12:47:02 -07:00 |
| factory.py | … | |
| features.py | … | |
| ghostnet.py | Fix a few issues loading pretrained vit/bit npz weights w/ num_classes=0 __init__ arg. Missed a few other small classifier handling detail on Mlp, GhostNet, Levit. Should fix #713 | 2021-06-22 23:16:05 -07:00 |
| gluon_resnet.py | … | |
| gluon_xception.py | … | |
| hardcorenas.py | Bring EfficientNet SE layer in line with others, pull se_ratio outside of blocks. Allows swapping w/ other attn layers. | 2021-05-29 23:41:38 -07:00 |
| helpers.py | support bits checkpoints in avg/load | 2021-10-03 17:31:22 -07:00 |
| hrnet.py | … | |
| hub.py | … | |
| inception_resnet_v2.py | … | |
| inception_v3.py | … | |
| inception_v4.py | … | |
| levit.py | Fix a few issues loading pretrained vit/bit npz weights w/ num_classes=0 __init__ arg. Missed a few other small classifier handling detail on Mlp, GhostNet, Levit. Should fix #713 | 2021-06-22 23:16:05 -07:00 |
| mlp_mixer.py | Add gMLP-S weights, 79.6 top-1 | 2021-06-23 10:40:30 -07:00 |
| mobilenetv3.py | Refactoring, cleanup, improved test coverage. | 2021-06-12 16:40:02 -07:00 |
| nasnet.py | … | |
| nest.py | Use bicubic interpolation in resize_pos_embed() | 2021-07-12 10:38:31 -07:00 |
| nfnet.py | Refactoring, cleanup, improved test coverage. | 2021-06-12 16:40:02 -07:00 |
| pit.py | Refactoring, cleanup, improved test coverage. | 2021-06-12 16:40:02 -07:00 |
| pnasnet.py | … | |
| registry.py | Refactoring, cleanup, improved test coverage. | 2021-06-12 16:40:02 -07:00 |
| regnet.py | Add Gather-Excite and Global Context attn modules. Refactor existing SE-like attn for consistency and refactor byob/byoanet for less redundancy. | 2021-05-27 18:03:29 -07:00 |
| res2net.py | … | |
| resnest.py | Add non-local and BAT attention. Merge attn and self-attn factories into one. Add attention references to README. Add mlp 'mode' to ECA. | 2021-05-31 13:18:11 -07:00 |
| resnet.py | Change crop ratio on correct resnet50 variant. | 2021-10-04 22:31:08 -07:00 |
| resnetv2.py | Clean a1/a2/3 rsb _0 checkpoints properly, fix v2 loading. | 2021-10-04 16:46:00 -07:00 |
| rexnet.py | Post merge cleanup | 2021-06-07 14:38:30 -07:00 |
| selecsls.py | … | |
| senet.py | … | |
| sknet.py | Remove min channels for SelectiveKernel, divisor should cover cases well enough. | 2021-05-31 15:38:56 -07:00 |
| swin_transformer.py | Refactoring, cleanup, improved test coverage. | 2021-06-12 16:40:02 -07:00 |
| tnt.py | Expand scope of testing for non-std vision transformer / mlp models. Some related cleanup and create fn cleanup for all vision transformer and mlp models. More CoaT weights. | 2021-05-24 21:13:26 -07:00 |
| tresnet.py | Add Gather-Excite and Global Context attn modules. Refactor existing SE-like attn for consistency and refactor byob/byoanet for less redundancy. | 2021-05-27 18:03:29 -07:00 |
| twins.py | Refactoring, cleanup, improved test coverage. | 2021-06-12 16:40:02 -07:00 |
| vgg.py | … | |
| visformer.py | Fix a few issues loading pretrained vit/bit npz weights w/ num_classes=0 __init__ arg. Missed a few other small classifier handling detail on Mlp, GhostNet, Levit. Should fix #713 | 2021-06-22 23:16:05 -07:00 |
| vision_transformer.py | Fix silly typo | 2021-08-27 09:22:20 -07:00 |
| vision_transformer_hybrid.py | AugReg release | 2021-06-20 17:46:06 -07:00 |
| vovnet.py | … | |
| xception.py | … | |
| xception_aligned.py | … | |
| xcit.py | Allow act_layer switch for xcit, fix in_chans for some variants | 2021-07-12 13:27:29 -07:00 |