Mirror of https://github.com/open-mmlab/mmselfsup.git (synced 2025-06-03 14:59:38 +08:00)
[Fix]: Fix empty link bug (#14)
parent 43ba700ad2
commit 0425effc40
@@ -6,7 +6,7 @@
 ## Overview of `Pipeline`
 
-`DataSource` and `Pipeline` are two important components in `Dataset`. We have introduced `DataSource` in [add_new_dataset](./new_dataset.md). The `Pipeline` is responsible for applying a series of data augmentations to images, such as random flip.
+`DataSource` and `Pipeline` are two important components in `Dataset`. We have introduced `DataSource` in [add_new_dataset](./1_new_dataset.md). The `Pipeline` is responsible for applying a series of data augmentations to images, such as random flip.
 
 Here is a config example of `Pipeline` for `SimCLR` training:
@@ -79,7 +79,7 @@ Some common hooks are not registered through `custom_hooks`, they are
 | `EvalHook`      | LOW (70)      |
 | `LoggerHook(s)` | VERY_LOW (90) |
 
-`OptimizerHook`, `MomentumUpdaterHook` and `LrUpdaterHook` have been introduced in [schedule strategy](./schedule.md). `IterTimerHook` is used to record elapsed time and does not support modification.
+`OptimizerHook`, `MomentumUpdaterHook` and `LrUpdaterHook` have been introduced in [schedule strategy](./4_schedule.md). `IterTimerHook` is used to record elapsed time and does not support modification.
 
 Here we reveal how to customize `CheckpointHook`, `LoggerHooks`, and `EvalHook`.
@@ -94,13 +94,13 @@ Remarks:
 ## Detection
 
-Here we prefer to use MMDetection to do the detection task. First, make sure you have installed `MIM`, which is also a project of OpenMMLab. Please refer to [MIM](https://github.com/open-mmlab/mim/blob/main/docs/installation.md) for installation. Or simply from pypi
+Here we prefer to use MMDetection to do the detection task. First, make sure you have installed [MIM](https://github.com/open-mmlab/mim), which is also a project of OpenMMLab.
 
 ```shell
 pip install openmim
 ```
 
 It is very easy to install the package.
-Besides, please refer to MMDet for [installation](https://github.com/open-mmlab/mmdetection/blob/master/docs/get_started.md) and [data preparation](https://github.com/open-mmlab/mmdetection/blob/master/docs/1_exist_data_model.md)
+Besides, please refer to MMDet for [installation](https://github.com/open-mmlab/mmdetection/blob/master/docs/en/get_started.md) and [data preparation](https://github.com/open-mmlab/mmdetection/blob/master/docs/en/1_exist_data_model.md)
 
 After installation, you can run MMDet with a simple command
 ```shell

@@ -127,7 +127,8 @@ bash run.sh ${DET_CFG} ${OUTPUT_FILE}
 ```
 ## Segmentation
 
-For semantic segmentation task, we are using MMSegmentation. First, make sure you have installed `MIM`, which is also a project of OpenMMLab. Please refer to [MIM](https://github.com/open-mmlab/mim/blob/main/docs/installation.md) for installation. Or simply from pypi
+For semantic segmentation task, we are using MMSegmentation. First, make sure you have installed [MIM](https://github.com/open-mmlab/mim), which is also a project of OpenMMLab.
 
 ```shell
 pip install openmim
 ```
@@ -1,8 +1,16 @@
 # Tutorial 1: Adding New Dataset
 
+In this tutorial, we introduce the basic steps to create your customized dataset:
+
+- [Tutorial 1: Adding New Dataset](#tutorial-1-adding-new-dataset)
+  - [An example of customized dataset](#an-example-of-customized-dataset)
+  - [Creating the `DataSource`](#creating-the-datasource)
+  - [Creating the `Dataset`](#creating-the-dataset)
+  - [Modify config file](#modify-config-file)
+
 If your algorithm does not need any customized dataset, you can use these off-the-shelf datasets under [datasets](../../mmselfsup/datasets). But to use these existing datasets, you have to convert your dataset to an existing dataset format.
 
-## An example of customized dataset
+### An example of customized dataset
 
 Assuming the format of your dataset's annotation file is:
@@ -1,8 +1,12 @@
 # Tutorial 2: Customize Data Pipelines
 
+- [Tutorial 2: Customize Data Pipelines](#tutorial-2-customize-data-pipelines)
+  - [Overview of `Pipeline`](#overview-of-pipeline)
+  - [Creating new augmentations in `Pipeline`](#creating-new-augmentations-in-pipeline)
+
 ## Overview of `Pipeline`
 
-`DataSource` and `Pipeline` are two important components in `Dataset`. We have introduced `DataSource` in [add_new_dataset](./new_dataset.md). The `Pipeline` is responsible for applying a series of data augmentations to images, such as random flip.
+`DataSource` and `Pipeline` are two important components in `Dataset`. We have introduced `DataSource` in [add_new_dataset](./1_new_dataset.md). The `Pipeline` is responsible for applying a series of data augmentations to images, such as random flip.
 
 Here is a config example of `Pipeline` for `SimCLR` training:
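The example itself lies outside this hunk; as an illustrative stand-in, a SimCLR-style pipeline is a list of transform configs applied in order. The transform names and parameter values below are examples, not the exact config shipped in this repo:

```python
# Illustrative SimCLR-style augmentation pipeline in config form.
# Each entry is a dict whose `type` names a registered transform;
# the `Dataset` builds and applies them in order.
train_pipeline = [
    dict(type='RandomResizedCrop', size=224),
    dict(type='RandomHorizontalFlip'),
    dict(
        type='RandomAppliedTrans',  # apply the wrapped transforms with probability p
        transforms=[
            dict(type='ColorJitter',
                 brightness=0.8, contrast=0.8, saturation=0.8, hue=0.2)
        ],
        p=0.8),
    dict(type='RandomGrayscale', p=0.2),
    dict(type='GaussianBlur', sigma_min=0.1, sigma_max=2.0),
]
```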
@@ -1,5 +1,11 @@
 # Tutorial 3: Adding New Modules
 
+- [Tutorial 3: Adding New Modules](#tutorial-3-adding-new-modules)
+  - [Add new backbone](#add-new-backbone)
+  - [Add new necks](#add-new-necks)
+  - [Add new loss](#add-new-loss)
+  - [Combine all](#combine-all)
+
 In the self-supervised learning domain, each model can be divided into the following four parts:
 
 - backbone: used to extract the image's features
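These parts are plugged in through a registry. The minimal, self-contained stand-in below sketches the mechanism; it is a simplified illustration, not mmcv's actual `Registry` class, and `ToyBackbone` is a hypothetical module:

```python
# Minimal sketch of the registry pattern used to plug in new modules.
class Registry:
    def __init__(self, name):
        self.name = name
        self._module_dict = {}

    def register_module(self):
        # Decorator that records a class under its own name.
        def _register(cls):
            self._module_dict[cls.__name__] = cls
            return cls
        return _register

    def build(self, cfg):
        # `type` selects the class; remaining keys become constructor args.
        cfg = dict(cfg)
        cls = self._module_dict[cfg.pop('type')]
        return cls(**cfg)

BACKBONES = Registry('backbone')

@BACKBONES.register_module()
class ToyBackbone:  # hypothetical backbone for illustration
    def __init__(self, depth=18):
        self.depth = depth

model = BACKBONES.build(dict(type='ToyBackbone', depth=50))
```

With this pattern, adding a new backbone, neck, or loss is just defining a class and decorating it; configs then refer to it by name.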
@@ -1,5 +1,17 @@
 # Tutorial 4: Customize Schedule
 
+- [Tutorial 4: Customize Schedule](#tutorial-4-customize-schedule)
+  - [Customize optimizer supported by Pytorch](#customize-optimizer-supported-by-pytorch)
+  - [Customize learning rate schedules](#customize-learning-rate-schedules)
+    - [Learning rate decay](#learning-rate-decay)
+    - [Warmup strategy](#warmup-strategy)
+  - [Customize momentum schedules](#customize-momentum-schedules)
+  - [Parameter-wise configuration](#parameter-wise-configuration)
+  - [Gradient clipping and gradient accumulation](#gradient-clipping-and-gradient-accumulation)
+    - [Gradient clipping](#gradient-clipping)
+    - [Gradient accumulation](#gradient-accumulation)
+  - [Customize self-implemented optimizer](#customize-self-implemented-optimizer)
+
 In this tutorial, we will introduce how to construct optimizers, customize learning rate and momentum schedules, use parameter-wise configuration, apply gradient clipping and gradient accumulation, and customize self-implemented optimization methods for the project.
 
 ## Customize optimizer supported by Pytorch
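The body of this section lies past the hunk boundary. As a sketch of the idea: the optimizer is declared as a config dict whose `type` matches an optimizer class name in `torch.optim`, and the remaining keys become constructor arguments (values below are illustrative):

```python
# Illustrative MMCV-style optimizer config; values are examples.
optimizer = dict(type='SGD', lr=0.01, momentum=0.9, weight_decay=1e-4)

# Conceptually, the runner resolves it roughly like this:
cfg = dict(optimizer)
optim_name = cfg.pop('type')  # 'SGD' -> torch.optim.SGD
# torch.optim.SGD(model.parameters(), **cfg) is then constructed internally.
```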
@@ -143,7 +155,7 @@ Here is an example:
 
 ```py
 data = dict(imgs_per_gpu=64)
-optimizer_config = dict(type="GradientCumulativeOptimizerHook", cumulative_iters=4)
+optimizer_config = dict(type="DistOptimizerHook", update_interval=4)
 ```
 
 This indicates that during training, back-propagation is performed every 4 iters, and the above is equivalent to:
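As a quick arithmetic check of this equivalence (the equivalent config itself lies past the hunk boundary):

```python
# Gradient accumulation trades per-iteration batch size for update
# frequency: the effective batch per optimizer step stays the same.
imgs_per_gpu = 64
cumulative_iters = 4
effective_batch = imgs_per_gpu * cumulative_iters  # images per optimizer step
assert effective_batch == 256
```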
@@ -1,5 +1,18 @@
 # Tutorial 5: Customize Runtime Settings
 
+- [Tutorial 5: Customize Runtime Settings](#tutorial-5-customize-runtime-settings)
+  - [Customize Workflow](#customize-workflow)
+  - [Hooks](#hooks)
+    - [default training hooks](#default-training-hooks)
+      - [CheckpointHook](#checkpointhook)
+      - [LoggerHooks](#loggerhooks)
+      - [EvalHook](#evalhook)
+    - [Use other implemented hooks](#use-other-implemented-hooks)
+    - [Customize self-implemented hooks](#customize-self-implemented-hooks)
+      - [1. Implement a new hook](#1-implement-a-new-hook)
+      - [2. Import the new hook](#2-import-the-new-hook)
+      - [3. Modify the config](#3-modify-the-config)
+
 In this tutorial, we will introduce how to customize the workflow and hooks when running your own settings for the project.
 
 ## Customize Workflow
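The section body lies past the hunk boundary; for reference, a typical workflow setting in MMCV-style configs looks like this (illustrative):

```python
# Run training only: ('train', 1) means one epoch of the 'train' phase
# per cycle. A ('val', 1) entry could be appended to interleave
# validation epochs.
workflow = [('train', 1)]
```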
@@ -38,17 +51,17 @@ The custom hooks are registered through custom_hooks. Generally, they are hooks
 
 Priority list
 
-| Level | Value |
-|:--:|:--:|
-| HIGHEST | 0 |
-| VERY_HIGH | 10 |
-| HIGH | 30 |
-| ABOVE_NORMAL | 40 |
-| NORMAL(default) | 50 |
-| BELOW_NORMAL | 60 |
-| LOW | 70 |
-| VERY_LOW | 90 |
-| LOWEST | 100 |
+| Level           | Value |
+| :-------------: | :---: |
+| HIGHEST         | 0     |
+| VERY_HIGH       | 10    |
+| HIGH            | 30    |
+| ABOVE_NORMAL    | 40    |
+| NORMAL(default) | 50    |
+| BELOW_NORMAL    | 60    |
+| LOW             | 70    |
+| VERY_LOW        | 90    |
+| LOWEST          | 100   |
 
 The priority determines the execution order of the hooks. Before training, the log will print out the execution order of the hooks at each stage to facilitate debugging.
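For example, a custom hook can be registered with an explicit priority drawn from the table above (`MyHook` here is a hypothetical hook name, not one shipped with the project):

```python
# `MyHook` is a placeholder for a hook class registered in HOOKS;
# the priority string selects a level from the priority table.
custom_hooks = [
    dict(type='MyHook', priority='ABOVE_NORMAL'),  # 40: runs before NORMAL (50) hooks
]
```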
@@ -56,17 +69,17 @@ The priority determines the execution order of the hooks. Before training, the l
 
 Some common hooks are not registered through `custom_hooks`, they are:
 
-| Hooks | Priority |
-|:--:|:--:|
-| `LrUpdaterHook` | VERY_HIGH (10) |
-| `MomentumUpdaterHook` | HIGH (30) |
-| `OptimizerHook` | ABOVE_NORMAL (40) |
-| `CheckpointHook` | NORMAL (50) |
-| `IterTimerHook` | LOW (70) |
-| `EvalHook` | LOW (70) |
-| `LoggerHook(s)` | VERY_LOW (90) |
+| Hooks                 | Priority          |
+| :-------------------: | :---------------: |
+| `LrUpdaterHook`       | VERY_HIGH (10)    |
+| `MomentumUpdaterHook` | HIGH (30)         |
+| `OptimizerHook`       | ABOVE_NORMAL (40) |
+| `CheckpointHook`      | NORMAL (50)       |
+| `IterTimerHook`       | LOW (70)          |
+| `EvalHook`            | LOW (70)          |
+| `LoggerHook(s)`       | VERY_LOW (90)     |
 
-`OptimizerHook`, `MomentumUpdaterHook` and `LrUpdaterHook` have been introduced in [schedule strategy](./schedule.md). `IterTimerHook` is used to record elapsed time and does not support modification.
+`OptimizerHook`, `MomentumUpdaterHook` and `LrUpdaterHook` have been introduced in [schedule strategy](./4_schedule.md). `IterTimerHook` is used to record elapsed time and does not support modification.
 
 Here we reveal how to customize `CheckpointHook`, `LoggerHooks`, and `EvalHook`.
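These three are typically configured through dedicated config fields rather than `custom_hooks`. An illustrative sketch follows; the field names follow common MMCV-style conventions and may differ across versions:

```python
# CheckpointHook: save every 10 epochs, keep only the last 3 checkpoints.
checkpoint_config = dict(interval=10, max_keep_ckpts=3)

# LoggerHooks: log every 50 iterations to text and TensorBoard.
log_config = dict(
    interval=50,
    hooks=[
        dict(type='TextLoggerHook'),
        dict(type='TensorboardLoggerHook'),
    ])

# EvalHook: run evaluation every 10 epochs.
evaluation = dict(interval=10)
```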