Ross Wightman 925e102982 Update attention / self-attn based models from a series of experiments:
* remove dud attention modules; involution and my swin attention adaptation don't seem worth keeping
* add or update several new 26- and 50-layer ResNe(X)t variants that were used in experiments (see the usage sketch below)
* remove models associated with dead-end or uninteresting experiment results
* weights coming soon...
2021-08-20 16:13:11 -07:00
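A minimal sketch of how the new ResNe(X)t variants mentioned above would typically be enumerated and instantiated through timm's factory API. The wildcard pattern and the `resnet26t` model name are illustrative assumptions, not a list of the variants added in this commit; check `timm.list_models()` against your installed version for the actual names.

```python
# Sketch only: model names below are assumptions, not the commit's exact additions.
import timm
import torch

# Enumerate 26-layer ResNe(X)t-style variants available in the installed timm version.
print(timm.list_models('*resne*26*'))

# Build one variant (resnet26t assumed to exist) and run a dummy forward pass.
model = timm.create_model('resnet26t', pretrained=False, num_classes=1000)
model.eval()
with torch.no_grad():
    out = model(torch.randn(1, 3, 224, 224))
print(out.shape)  # torch.Size([1, 1000])
```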