Optimize doc, test=document_fix (#642)
parent 7f3f44c105
commit 9a0d8af0ab
docs/en/tutorials
docs/zh_CN/tutorials
@@ -171,14 +171,15 @@ python3 tools/train.py -c ./configs/quick_start/ResNet50_vd_ssld_random_erasin
With data augmentation, the accuracy improves by 1.27\% to 96.27\%.
* Save the trained ResNet50_vd model as a pretrained model for the next chapter.
### Distillation
* The ResNet50_vd model trained in the previous chapter will be used as the teacher model to train the student model. First, save the model to the specified directory with the following command:
```shell
cp -r output/ResNet50_vd/19/ ./pretrained/flowers102_R50_vd_final/
```
* Use `extra_list.txt` as unlabeled data. Note:
    * The samples in `extra_list.txt` and `val_list.txt` have no intersection, so `extra_list.txt` can safely be used to enlarge the distillation training data.
    * Although the extra images are labeled, the label information is not used in the source code, so this is still unlabeled distillation.
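The no-intersection requirement above can be checked mechanically before training. A minimal sketch, assuming each list file holds one `image_path label` entry per line (the file format here is an assumption for illustration, not taken from the tutorial):

```python
# Sketch: check that the extra (unlabeled) list and the validation list
# share no samples. Assumes one "image_path label" entry per line; the
# format is a hypothetical illustration, not a documented one.
def load_paths(list_file):
    with open(list_file) as f:
        return {line.split()[0] for line in f if line.strip()}

def has_overlap(extra_file, val_file):
    # True if any image path appears in both lists
    return bool(load_paths(extra_file) & load_paths(val_file))
```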
@@ -166,14 +166,13 @@ python3 tools/train.py -c ./configs/quick_start/ResNet50_vd_ssld_random_erasi
The final accuracy on the flowers102 validation set is 0.9627; data augmentation improves the model accuracy by a further 1.27\%.
* To try the knowledge distillation part of `Section 3.6`, first save the trained ResNet50_vd pretrained model to a suitable location, to serve as the pretrained teacher model during distillation. The script is shown below.
### 3.6 Trying Knowledge Distillation
* This section trains the MobileNetV3_large_x1_0 model with knowledge distillation, using the ResNet50_vd model trained in `Section 3.5` as the teacher model. First, save the ResNet50_vd model obtained in `Section 3.5` to the specified directory with the following script.
```shell
cp -r output/ResNet50_vd/best_model/ ./pretrained/flowers102_R50_vd_final/
```
* Distill the model on the flowers102 dataset. To further improve model accuracy, `extra_list.txt` serves as unlabeled data. A few points to note:
    * The samples in `extra_list.txt` and `val_list.txt` have no intersection, so they can be used to enlarge the training data for the distillation task.
    * Although the labeled images from `extra_list.txt` are introduced, the code does not use the label information, so this can still be regarded as unlabeled distillation.
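As a sketch of how the extra data could be folded into the distillation training set, the following helper concatenates the labeled train list with `extra_list.txt` into one list file. The file names and the one-entry-per-line layout are assumptions used only for illustration; since the distillation code ignores labels, mixing labeled and extra entries is harmless:

```python
# Sketch: build a combined train list for distillation by appending the
# extra unlabeled entries to the labeled train list. File names and the
# one-entry-per-line layout are assumptions used only for illustration.
def merge_lists(train_file, extra_file, out_file):
    entries = []
    for name in (train_file, extra_file):
        with open(name) as f:
            entries += [line.strip() for line in f if line.strip()]
    with open(out_file, "w") as f:
        f.write("\n".join(entries) + "\n")
    return len(entries)  # number of samples in the combined list
```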