# Dummy ResNet Wrapper

This is an example README for community `projects/`. We have provided detailed explanations for each field in the form of HTML comments, which are visible when you read the source of this README file. If you wish to submit your project to our main repository, all the fields in this README are mandatory so that others can understand what you have achieved in this implementation. For more details, read our [contribution guide](https://mmocr.readthedocs.io/en/dev-1.x/notes/contribution_guide.html) or approach us in [Discussions](https://github.com/open-mmlab/mmocr/discussions).

## Description

This project implements a dummy ResNet wrapper, which literally does nothing new but prints "hello world" during initialization.

## Usage

### Prerequisites

- Python 3.7
- PyTorch 1.6 or higher
- [MIM](https://github.com/open-mmlab/mim)
- [MMOCR](https://github.com/open-mmlab/mmocr)

All the commands below rely on the correct configuration of `PYTHONPATH`, which should point to the project's directory so that Python can locate the module files. In the `example_project/` root directory, run the following to add the current directory to `PYTHONPATH`:

```shell
# Linux
export PYTHONPATH=`pwd`:$PYTHONPATH
# Windows PowerShell
$env:PYTHONPATH=Get-Location
```

### Training commands

In MMOCR's root directory, run the following command to train the model:

```bash
mim train mmocr configs/dbnet_dummy-resnet_fpnc_1200e_icdar2015.py --work-dir work_dirs/dummy_mae/
```

To train on multiple GPUs, e.g.
8 GPUs, run the following command:

```bash
mim train mmocr configs/dbnet_dummy-resnet_fpnc_1200e_icdar2015.py --work-dir work_dirs/dummy_mae/ --launcher pytorch --gpus 8
```

### Testing commands

In MMOCR's root directory, run the following command to test the model:

```bash
mim test mmocr configs/dbnet_dummy-resnet_fpnc_1200e_icdar2015.py --work-dir work_dirs/dummy_mae/ --checkpoint ${CHECKPOINT_PATH}
```

## Results

|                              Method                               |  Backbone   | Pretrained Model |  Training set   |    Test set    | #epoch | Test size | Precision | Recall | Hmean  |         Download         |
| :---------------------------------------------------------------: | :---------: | :--------------: | :-------------: | :------------: | :----: | :-------: | :-------: | :----: | :----: | :----------------------: |
| [DBNet_dummy](configs/dbnet_dummy-resnet_fpnc_1200e_icdar2015.py) | DummyResNet |        -         | ICDAR2015 Train | ICDAR2015 Test |  1200  |    736    |  0.8853   | 0.7583 | 0.8169 | [model](<>) \| [log](<>) |

## Citation

```bibtex
@software{MMOCR_Contributors_OpenMMLab_Text_Detection_2020,
  author = {{MMOCR Contributors}},
  license = {Apache-2.0},
  month = {8},
  title = {{OpenMMLab Text Detection, Recognition and Understanding Toolbox}},
  url = {https://github.com/open-mmlab/mmocr},
  version = {0.3.0},
  year = {2020}
}
```

## Checklist

Here is a checklist illustrating a usual development workflow of a successful project, which also serves as an overview of this project's progress.

- [ ] Milestone 1: PR-ready, and acceptable to be one of the `projects/`.
  - [ ] Finish the code
  - [ ] Basic docstrings & proper citation
  - [ ] Test-time correctness
  - [ ] A full README
- [ ] Milestone 2: Indicates a successful model implementation.
  - [ ] Training-time correctness
- [ ] Milestone 3: Good to be a part of our core package!
  - [ ] Type hints and docstrings
  - [ ] Unit tests
  - [ ] Code polishing
  - [ ] Metafile.yml
  - [ ] Move your modules into the core package following the codebase's file hierarchy structure.
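The "dummy wrapper" described in the Description section can be sketched in plain Python. In the actual project the base class would be the real ResNet backbone and the class would be registered with MMOCR's model registry; the stand-in base class below (`_ResNetStandIn`) is a hypothetical placeholder so the example runs without MMOCR installed.

```python
class _ResNetStandIn:
    """Stand-in for the real ResNet backbone (illustrative only)."""

    def __init__(self, depth: int = 50):
        self.depth = depth


class DummyResNet(_ResNetStandIn):
    """Wrapper that adds nothing new: it only greets during initialization."""

    def __init__(self, **kwargs):
        print('hello world')  # the wrapper's sole addition
        super().__init__(**kwargs)


# Constructing the wrapper behaves exactly like the base class,
# apart from the printed greeting.
backbone = DummyResNet(depth=50)
```

In MMOCR itself, such a wrapper would additionally carry a `@MODELS.register_module()` decorator so that the config file can refer to it by name (e.g. `type='DummyResNet'`).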