mmpretrain/configs/itpn
Yixiao Fang e4c4a81b56
[Feature] Support iTPN and HiViT (#1584)
Co-authored-by: 田运杰 <48153283+sunsmarterjie@users.noreply.github.com>
2023-05-26 12:08:34 +08:00
README.md
itpn-clip-b_hivit-base-p16_8xb256-amp-coslr-300e_in1k.py
itpn-clip-b_hivit-base-p16_8xb256-amp-coslr-800e_in1k.py
itpn-pixel_hivit-base-p16_8xb512-amp-coslr-400e_in1k.py
itpn-pixel_hivit-base-p16_8xb512-amp-coslr-800e_in1k.py
itpn-pixel_hivit-base-p16_8xb512-amp-coslr-1600e_in1k.py
itpn-pixel_hivit-large-p16_8xb512-amp-coslr-400e_in1k.py
itpn-pixel_hivit-large-p16_8xb512-amp-coslr-800e_in1k.py
itpn-pixel_hivit-large-p16_8xb512-amp-coslr-1600e_in1k.py
metafile.yml


iTPN

Integrally Pre-Trained Transformer Pyramid Networks

Abstract

In this paper, we present an integral pre-training framework based on masked image modeling (MIM). We advocate for pre-training the backbone and neck jointly so that the transfer gap between MIM and downstream recognition tasks is minimal. We make two technical contributions. First, we unify the reconstruction and recognition necks by inserting a feature pyramid into the pre-training stage. Second, we complement masked image modeling (MIM) with masked feature modeling (MFM), which offers multi-stage supervision to the feature pyramid. The pre-trained models, termed integrally pre-trained transformer pyramid networks (iTPNs), serve as powerful foundation models for visual recognition. In particular, the base/large-level iTPN achieves an 86.2%/87.8% top-1 accuracy on ImageNet-1K, a 53.2%/55.6% box AP on COCO object detection with the 1x training schedule using Mask R-CNN, and a 54.7%/57.7% mIoU on ADE20K semantic segmentation using UPerHead -- all these results set new records. Our work inspires the community to work on unifying upstream pre-training and downstream fine-tuning tasks. Code and the pre-trained models will be released at https://github.com/sunsmarterjie/iTPN.
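The joint objective sketched in the abstract combines a pixel reconstruction loss from MIM with per-stage feature losses from MFM over the feature pyramid. The toy below illustrates that weighted-sum structure in plain Python; it is not the mmpretrain implementation, and the function names, loss choice (MSE), and weight are illustrative assumptions.

```python
# Toy sketch of the iTPN joint objective: a masked-image-modeling (MIM)
# reconstruction loss plus masked-feature-modeling (MFM) losses, one per
# feature-pyramid stage. All names, numbers, and weights are illustrative.

def mse(pred, target):
    """Mean squared error over two equal-length vectors."""
    return sum((p - t) ** 2 for p, t in zip(pred, target)) / len(pred)

def itpn_joint_loss(pixel_pred, pixel_target, stage_preds, stage_targets,
                    mfm_weight=1.0):
    """MIM pixel loss + weighted sum of per-stage MFM feature losses."""
    loss_mim = mse(pixel_pred, pixel_target)
    loss_mfm = sum(mse(p, t) for p, t in zip(stage_preds, stage_targets))
    return loss_mim + mfm_weight * loss_mfm

# Tiny worked example with made-up numbers.
pixels_hat, pixels = [0.5, 0.0], [1.0, 0.0]   # masked-patch reconstruction
feats_hat = [[1.0, 1.0], [0.0, 0.0]]          # predictions for two pyramid stages
feats     = [[1.0, 1.0], [0.0, 1.0]]          # target features per stage
total = itpn_joint_loss(pixels_hat, pixels, feats_hat, feats)  # 0.125 + 0.5
```

The point is only that the pyramid receives supervision at every stage, not just at the final reconstruction head.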

How to use it?

Train/Test Command

Prepare your dataset according to the docs.

Train:

python tools/train.py configs/itpn/itpn-pixel_hivit-base-p16_8xb512-amp-coslr-800e_in1k.py
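The `8xb512` in the config name encodes the intended setting of 8 GPUs with a batch size of 512 each. For multi-GPU training, a typical invocation, assuming the standard MMEngine-style `dist_train.sh` launcher shipped with mmpretrain and run from the repository root, would be:

```shell
# Distributed pre-training on 8 GPUs (matches the 8xb512 batch setting
# encoded in the config name).
bash tools/dist_train.sh \
    configs/itpn/itpn-pixel_hivit-base-p16_8xb512-amp-coslr-800e_in1k.py 8
```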

Models and results

Pretrained models

| Model | Params (M) | Flops (G) | Config | Download |
| :---- | ---------: | --------: | :----- | :------- |
| itpn-clip-b_hivit-base-p16_8xb256-amp-coslr-800e_in1k | 233.00 | 18.47 | config | N/A |
| itpn-pixel_hivit-base-p16_8xb512-amp-coslr-800e_in1k | 103.00 | 18.47 | config | N/A |
| itpn-pixel_hivit-large-p16_8xb512-amp-coslr-800e_in1k | 314.00 | 63.98 | config | N/A |

Citation

@article{tian2022integrally,
  title={Integrally Pre-Trained Transformer Pyramid Networks},
  author={Tian, Yunjie and Xie, Lingxi and Wang, Zhaozhi and Wei, Longhui and Zhang, Xiaopeng and Jiao, Jianbin and Wang, Yaowei and Tian, Qi and Ye, Qixiang},
  journal={arXiv preprint arXiv:2211.12735},
  year={2022}
}