fix distillation doc (#5660)
parent 6b1bc4ccaf
commit d9e3832b3a
doc
@@ -10,6 +10,7 @@
* [2.1 Start Training](#21-----)
* [2.2 Resume Training](#22-----)
* [2.3 Training with a New Backbone](#23---backbone---)
* [2.4 Training with Knowledge Distillation](#24---distill---)
- [3. Evaluation and Prediction](#3--------)
* [3.1 Metric Evaluation](#31-----)
* [3.2 Test Detection Results](#32-------)
@@ -182,6 +183,15 @@ args1: args1

**Note**: To replace other modules of the network, refer to the [documentation](./add_new_algorithm.md).

<a name="24---distill---"></a>

## 2.4 Training with Knowledge Distillation

PaddleOCR supports training detection models based on knowledge distillation. For more details, refer to the [knowledge distillation documentation](./knowledge_distillation.md).
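As an illustration of what such a setup involves, distillation configs typically wrap a frozen Teacher and a trainable Student under a single Architecture section. The sketch below is illustrative only; the field names and model choices are assumptions modeled on typical PaddleOCR distillation configs, and the linked knowledge distillation doc is authoritative:

```yaml
# Illustrative sketch of a detection distillation Architecture section.
# Field names are assumptions; see ./knowledge_distillation.md for real configs.
Architecture:
  name: DistillationModel
  algorithm: Distillation
  model_type: det
  Models:
    Teacher:                 # frozen model that supplies soft targets
      freeze_params: true
      algorithm: DB
      Backbone:
        name: ResNet
        layers: 18
    Student:                 # smaller model actually being trained
      freeze_params: false
      algorithm: DB
      Backbone:
        name: MobileNetV3
        scale: 0.5
```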

<a name="3--------"></a>

# 3. Evaluation and Prediction
@@ -11,6 +11,7 @@
- [2.1 Data Augmentation](#数据增强)
- [2.2 General Model Training](#通用模型训练)
- [2.3 Multi-language Model Training](#多语言模型训练)
- [2.4 Training with Knowledge Distillation](#知识蒸馏训练)
- [3 Evaluation](#评估)
- [4 Prediction](#预测)
- [5 Convert to Inference Model and Test](#Inference)
@@ -368,6 +369,13 @@ Eval:
label_file_list: ["./train_data/french_val.txt"]
...
```

<a name="知识蒸馏训练"></a>

### 2.4 Training with Knowledge Distillation

PaddleOCR supports training text recognition models based on knowledge distillation. For more details, refer to the [knowledge distillation documentation](./knowledge_distillation.md).

<a name="评估"></a>

## 3 Evaluation
@@ -9,6 +9,7 @@ This section uses the icdar2015 dataset as an example to introduce the training,
* [2.1 Start Training](#21-start-training)
* [2.2 Load Trained Model and Continue Training](#22-load-trained-model-and-continue-training)
* [2.3 Training with New Backbone](#23-training-with-new-backbone)
* [2.4 Training with Knowledge Distillation](#24)
- [3. Evaluation and Test](#3-evaluation-and-test)
* [3.1 Evaluation](#31-evaluation)
* [3.2 Test](#32-test)
@@ -174,6 +175,11 @@ After adding the four-part modules of the network, you only need to configure th

**NOTE**: More details about replacing the Backbone and other modules can be found in this [doc](add_new_algorithm_en.md).

### 2.4 Training with Knowledge Distillation

Knowledge distillation is supported in PaddleOCR for the text detection training process. For more details, please refer to the [doc](./knowledge_distillation_en.md).

## 3. Evaluation and Test

### 3.1 Evaluation
@@ -10,6 +10,7 @@
- [2.1 Data Augmentation](#Data_Augmentation)
- [2.2 General Training](#Training)
- [2.3 Multi-language Training](#Multi_language)
- [2.4 Training with Knowledge Distillation](#kd)

- [3. Evaluation](#EVALUATION)
@@ -361,6 +362,12 @@ Eval:
...
```

<a name="kd"></a>

### 2.4 Training with Knowledge Distillation

Knowledge distillation is supported in PaddleOCR for the text recognition training process. For more details, please refer to the [doc](./knowledge_distillation_en.md).
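As a rough sketch of what the referenced configs contain, a recognition distillation run pairs a teacher and a student of the same task type. Everything below is an assumption based on the shape of typical PaddleOCR distillation configs; consult the linked doc for the actual files shipped with your PaddleOCR version:

```yaml
# Illustrative only; layout and names are assumptions based on typical
# PaddleOCR distillation configs (see ./knowledge_distillation_en.md).
Architecture:
  name: DistillationModel
  algorithm: Distillation
  model_type: rec
  Models:
    Teacher:
      freeze_params: true    # teacher only provides soft labels
      algorithm: CRNN
    Student:
      freeze_params: false   # student is optimized against GT + teacher outputs
      algorithm: CRNN
```

Training is then started the same way as a regular run, e.g. `python3 tools/train.py -c <distillation_config>.yml`, with the command shape assumed from the general training sections of this doc.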

<a name="EVALUATION"></a>

## 3. Evaluation