* update LoveDA dataset API
* fix lint errors in dataset_prepare.md
* fix lint errors in loveda.py
* fix lint errors in loveda.py
* fix lint errors in dataset_prepare.md
* fix lint errors in dataset_prepare.md
* checked with isort and yapf
* checked with isort and yapf
* checked with isort and yapf
* Revert "checked with isort and yapf"
This reverts commit 686a51d9
* Revert "checked with isort and yapf"
This reverts commit b877e121bb.
* Revert "revised lint errors in dataset_prepare.md"
This reverts commit 2289e27c
* Revert "checked with isort and yapf"
This reverts commit 159db2f8
* Revert "checked with isort and yapf"
This reverts commit 159db2f8
* add configs & fix bugs
* update new branch
* upload models & logs and add format-only
* change pretrained model path of HRNet
* fix the errors in dataset_prepare.md
* fix the errors in dataset_prepare.md and configs in loveda.py
* change the description in docs_zh-CN/dataset_prepare.md
* use init_cfg
* fix test coverage
* add pseudo LoveDA dataset
* Update docs/dataset_prepare.md
Co-authored-by: Junjun2016 <hejunjun@sjtu.edu.cn>
* Update docs_zh-CN/dataset_prepare.md
Co-authored-by: Junjun2016 <hejunjun@sjtu.edu.cn>
* Delete unused lines of unittest and add docs
* add dataset conversion .py file
* add downloading links from zenodo
* adjust the position of LoveDA and Cityscapes in the docs
Co-authored-by: MengzhangLI <mcmong@pku.edu.cn>
Co-authored-by: Junjun2016 <hejunjun@sjtu.edu.cn>
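For context on the LoveDA commits above, here is a minimal sketch of what the dataset registration in mmseg/datasets/loveda.py could look like, assuming it follows the CustomDataset pattern used by the other datasets; the class list, palette, and suffixes below are illustrative assumptions, not copied from the merged code.

```python
# Illustrative sketch only -- class names, palette and suffixes are assumptions.
from .builder import DATASETS
from .custom import CustomDataset


@DATASETS.register_module()
class LoveDADataset(CustomDataset):
    """LoveDA remote sensing dataset.

    ``reduce_zero_label=True`` maps label 0 to the ignore index, which is
    the usual way mmsegmentation handles a 'no-data' label.
    """

    CLASSES = ('background', 'building', 'road', 'water', 'barren', 'forest',
               'agricultural')

    PALETTE = [[255, 255, 255], [255, 0, 0], [255, 255, 0], [0, 0, 255],
               [159, 129, 183], [0, 255, 0], [255, 195, 128]]

    def __init__(self, **kwargs):
        super(LoveDADataset, self).__init__(
            img_suffix='.png',
            seg_map_suffix='.png',
            reduce_zero_label=True,
            **kwargs)
```

Once such a class is registered, a dataset config only needs to set `dataset_type = 'LoveDADataset'` and reuse the standard training and test pipelines; the `--format-only` option mentioned above would then export predictions in the format expected by the evaluation server.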
* Add support for Pascal Context 59 classes (#459)
* Create PascalContextDataset59 class in mmseg/datasets/pascal_context.py;
* Set reduce_zero_label=True for train_pipeline and PascalContextDataset59;
* Add some configs for Pascal-Context 59 classes training and testing;
* Try to solve the problem of "fence(IoU)=nan grass(IoU)=0";
* Continue (1): Try to solve the problem of "fence(IoU)=nan grass(IoU)=0";
* ignore files and folders named tempxxx;
* Continue (2): Try to solve the problem of "fence(IoU)=nan grass(IoU)=0";
* Modify the calculation of IoU;
* Modify the CLASSES order of PascalContextDataset;
* Add "fcn", "deeplabv3", "deeplabv3+", "pspnet" config file for model training based on PascalContextDataset59;
Add some ignore items in ".gitignore";
* fix the bug "test_cfg specified in both outer field and model field " of pspnet config file;
* Clean up unnecessary code;
* Add weights link, config link, log link and evaluation results for PascalContextDataset59 to README.md
* Add command line argument "-p | --port"; this argument changes the port used when transmitting data to a distributed machine.
* Remove redundant config files;
* Remove the "-p|--port" command line argument;
Co-authored-by: Jiarui XU <xvjiarui0826@gmail.com>
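To illustrate the reduce_zero_label change described in the commits above, a hypothetical excerpt of a 59-class Pascal Context dataset config is shown below; the data root, scales, and crop size are placeholders rather than the values in the actual configs/_base_ file.

```python
# Hypothetical excerpt from a Pascal Context 59-class dataset config.
# The key point is reduce_zero_label=True in LoadAnnotations, which remaps
# label 0 (background) to the ignore index so only 59 classes are evaluated.
dataset_type = 'PascalContextDataset59'
data_root = 'data/VOCdevkit/VOC2010/'  # placeholder path
img_norm_cfg = dict(
    mean=[123.675, 116.28, 103.53], std=[58.395, 57.12, 57.375], to_rgb=True)
train_pipeline = [
    dict(type='LoadImageFromFile'),
    dict(type='LoadAnnotations', reduce_zero_label=True),
    dict(type='Resize', img_scale=(520, 520), ratio_range=(0.5, 2.0)),
    dict(type='RandomCrop', crop_size=(480, 480), cat_max_ratio=0.75),
    dict(type='RandomFlip', prob=0.5),
    dict(type='Normalize', **img_norm_cfg),
    dict(type='Pad', size=(480, 480), pad_val=0, seg_pad_val=255),
    dict(type='DefaultFormatBundle'),
    dict(type='Collect', keys=['img', 'gt_semantic_seg']),
]
```

With label 0 folded into the ignore index this way, background pixels no longer contribute to per-class IoU, which is the usual remedy for the "fence(IoU)=nan" style of problem noted in the commits.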
* add more configs
* add more configs
* fixed backbone type
* fixed deeplabv3+ channels
* add r101
* update link
* change resnet18 link
* update aug test
* add inference time
* add memory usage
* Add Pascal Context to mmsegmentation
* Add benchmark result to Pascal Context
* fix mmcv version
* fix code syntax
* fix code syntax again
* Update mmseg/models/segmentors/encoder_decoder.py
update hint
Co-authored-by: Jerry Jiarui XU <xvjiarui0826@gmail.com>
* update comment
* fix pascal context model path
* fix model path mistake again
* fix model path mistake again
* fix model path mistakes again
Co-authored-by: Jerry Jiarui XU <xvjiarui0826@gmail.com>
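Finally, the "aug test" entry above refers to multi-scale flip testing; a rough sketch of such a test pipeline is shown below, where the base scale and ratios are illustrative rather than the benchmark settings.

```python
# Illustrative test-time augmentation pipeline (multi-scale + horizontal flip).
test_pipeline = [
    dict(type='LoadImageFromFile'),
    dict(
        type='MultiScaleFlipAug',
        img_scale=(2048, 512),  # placeholder base scale
        img_ratios=[0.5, 0.75, 1.0, 1.25, 1.5, 1.75],
        flip=True,
        transforms=[
            dict(type='Resize', keep_ratio=True),
            dict(type='RandomFlip'),
            dict(
                type='Normalize',
                mean=[123.675, 116.28, 103.53],
                std=[58.395, 57.12, 57.375],
                to_rgb=True),
            dict(type='ImageToTensor', keys=['img']),
            dict(type='Collect', keys=['img']),
        ])
]
```

At test time the EncoderDecoder's aug_test path averages the segmentation logits over these augmented views before taking the argmax, which is what the encoder_decoder.py hint update above relates to.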