Official PyTorch implementation of SegFormer
 
 

SegFormer: Simple and Efficient Design for Semantic Segmentation with Transformers

This repository contains the PyTorch training code, evaluation code, and pretrained models for SegFormer.

SegFormer is a simple, efficient and powerful semantic segmentation method, as shown in Figure 1.

We use MMSegmentation v0.13.0 as the codebase.

Figure 1: Performance of SegFormer-B0 to SegFormer-B5.

Install

For install and data preparation, please refer to the guidelines in MMSegmentation v0.13.0.

Other requirements:

```shell
pip install timm==0.3.2
```

Evaluation

Download the trained weights.

Example: evaluate SegFormer-B1 on ADE20K:

```shell
# single-gpu testing
python tools/test.py local_configs/segformer/B1/segformer.b1.512x512.ade.160k.py /path/to/checkpoint_file

# multi-gpu testing
./tools/dist_test.sh local_configs/segformer/B1/segformer.b1.512x512.ade.160k.py /path/to/checkpoint_file <GPU_NUM>

# multi-gpu, multi-scale testing
./tools/dist_test.sh local_configs/segformer/B1/segformer.b1.512x512.ade.160k.py /path/to/checkpoint_file <GPU_NUM> --aug-test
```
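The --aug-test flag enables multi-scale test-time augmentation: the model is run on several rescaled copies of each image, the class probabilities are resized back to the original resolution, and the results are averaged before taking the per-pixel argmax. A minimal, self-contained numpy sketch of that idea with a dummy model (illustrative only, not mmseg's actual implementation; `resize_nn` and `dummy_model` are hypothetical helpers):

```python
import numpy as np

def resize_nn(prob, out_h, out_w):
    """Nearest-neighbour resize of a (C, H, W) array (hypothetical helper)."""
    c, h, w = prob.shape
    rows = np.arange(out_h) * h // out_h
    cols = np.arange(out_w) * w // out_w
    return prob[:, rows[:, None], cols[None, :]]

def multi_scale_inference(model, image, scales=(0.5, 1.0, 1.5)):
    """Average class probabilities predicted at several input scales."""
    orig_h, orig_w = image.shape[1:]           # image: (C, H, W)
    acc = None
    for s in scales:
        h, w = max(1, int(orig_h * s)), max(1, int(orig_w * s))
        scaled = resize_nn(image, h, w)        # rescale the input
        prob = model(scaled)                   # (num_classes, h, w) probabilities
        prob = resize_nn(prob, orig_h, orig_w) # back to original resolution
        acc = prob if acc is None else acc + prob
    return (acc / len(scales)).argmax(axis=0)  # final per-pixel labels

def dummy_model(x):
    """Stand-in for a segmentation network: fixed softmax over 4 classes."""
    num_classes = 4
    h, w = x.shape[1:]
    logits = np.stack([np.full((h, w), float(k)) for k in range(num_classes)])
    return np.exp(logits) / np.exp(logits).sum(axis=0, keepdims=True)

img = np.random.rand(3, 32, 32)
seg = multi_scale_inference(dummy_model, img)
print(seg.shape)  # (32, 32)
```

In the real codebase the resizing is done with proper interpolation and the flip augmentation is averaged in as well; the sketch only shows the multi-scale averaging structure.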

Training

Download the backbone weights pretrained on ImageNet-1K, and put them in a folder named pretrained/.

Example: train SegFormer-B1 on ADE20K:

```shell
# single-gpu training
python tools/train.py local_configs/segformer/B1/segformer.b1.512x512.ade.160k.py

# multi-gpu training
./tools/dist_train.sh local_configs/segformer/B1/segformer.b1.512x512.ade.160k.py <GPU_NUM>
```
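For reference, dist_*.sh launchers in MMSegmentation-style codebases are typically thin wrappers around torch.distributed.launch, starting one process per GPU. A hypothetical dry-run sketch of that pattern (the `echo` only prints the command; the actual script in this repository may differ):

```shell
# Example values; <GPU_NUM> in the commands above plays the role of GPUS.
CONFIG=local_configs/segformer/B1/segformer.b1.512x512.ade.160k.py
GPUS=8

# Dry run: print the distributed launch command instead of executing it.
echo python -m torch.distributed.launch --nproc_per_node="$GPUS" \
    tools/train.py "$CONFIG" --launcher pytorch
```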

License

Please check the LICENSE file. SegFormer may only be used non-commercially, i.e., for research or evaluation purposes. For business inquiries, please contact researchinquiries@nvidia.com.

Citing SegFormer

```bibtex
@article{xie2021segformer,
  title={SegFormer: Simple and Efficient Design for Semantic Segmentation with Transformers},
  author={Xie, Enze and Wang, Wenhai and Yu, Zhiding and Anandkumar, Anima and Alvarez, Jose M and Luo, Ping},
  journal={arXiv preprint arXiv:2105.15203},
  year={2021}
}
```