Mirror of https://github.com/open-mmlab/mmsegmentation.git, synced 2025-06-03 22:03:48 +08:00
## Motivation

The original version of Visual Attention Network (VAN) can be found at https://github.com/Visual-Attention-Network/VAN-Segmentation. This PR adds support for Visual Attention Network (VAN).

## Modification

Added a single folder, `mmsegmentation/projects/van/`, containing 13 configs in total. Performance is basically aligned with the original (not every config has been run).

## Use cases (Optional)

Before running, you may need to download the pretrained models from https://cloud.tsinghua.edu.cn/d/0100f0cea37d41ba8d08/ and move them to the folder `mmsegmentation/pretrained/`, e.g. `mmsegmentation/pretrained/van_b2.pth`. After that, run the following commands:

```shell
cd mmsegmentation
bash tools/dist_train.sh projects/van/configs/van/van-b2_pre1k_upernet_4xb2-160k_ade20k-512x512.py 4
```

---------

Co-authored-by: xiexinch <xiexinch@outlook.com>
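For reference, a config in this project would typically point the backbone at the downloaded weights via an `init_cfg` dict, following the usual MMSegmentation convention. A hedged sketch only — the exact fields and file layout in `van-b2_pre1k_upernet_4xb2-160k_ade20k-512x512.py` may differ, so check the actual config under `projects/van/configs/van/`:

```python
# Hypothetical config fragment (assumed field names, standard
# MMSegmentation convention): load the VAN-B2 weights downloaded to
# mmsegmentation/pretrained/ into the backbone at init time.
model = dict(
    backbone=dict(
        init_cfg=dict(
            type='Pretrained',
            checkpoint='pretrained/van_b2.pth')))
```

The `Pretrained` init type tells the runner to load the checkpoint into the backbone before training starts, which is why the weights must be in place before launching `dist_train.sh`.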
# Projects
The OpenMMLab ecosystem can only grow through the contributions of the community. Everyone is welcome to post their implementations of great ideas in this folder! If you wish to start your own project, please go through the example project for best practices. For common questions about projects, please read our FAQ.
## External Projects
The following are selected external projects released by the community that use MMSegmentation:
- SegNeXt: Rethinking Convolutional Attention Design for Semantic Segmentation
- Vision Transformer Adapter for Dense Predictions
- UniFormer: Unifying Convolution and Self-attention for Visual Recognition
- Multi-Scale High-Resolution Vision Transformer for Semantic Segmentation
- ViTAE: Vision Transformer Advanced by Exploring Intrinsic Inductive Bias
- DAFormer: Improving Network Architectures and Training Strategies for Domain-Adaptive Semantic Segmentation
- MPViT: Multi-Path Vision Transformer for Dense Prediction
- TopFormer: Token Pyramid Transformer for Mobile Semantic Segmentation
Note: These projects are supported and maintained by their own contributors. The core maintainers of MMSegmentation only ensure that the results were reproducible and the code quality met its claims at the time each project was submitted; they are not responsible for future maintenance.