polish 2022 s1 faq

pull/2128/head
HydrogenSulfate 2022-07-06 11:25:33 +08:00
parent 14ad4ee0ce
commit e79b0ee2c2
2 changed files with 2 additions and 2 deletions


@@ -37,7 +37,7 @@
data_format=data_format)
```
2. Manually set stop_gradient=True for the frozen layer, please refer to [this link](https://github.com/RainFrost1/PaddleClas/blob/24e968b8d9f7d9e2309e713cbf2afe8fda9deacd/ppcls/engine/train/train_idml.py#L40-L66). After using this method, the layer from loss to stop_gradient stops, that is, the weight of the previous layer is also fixed
2. Manually set stop_gradient=True for the layer to be frozen; refer to [this link](https://github.com/RainFrost1/PaddleClas/blob/24e968b8d9f7d9e2309e713cbf2afe8fda9deacd/ppcls/engine/train/train_idml.py#L40-L66). With this method, once the gradient is back-propagated to the point where stop_gradient=True is set, backpropagation stops there, so the weights of all preceding layers are fixed as well, as in the sketch below.
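A minimal runnable sketch of this behavior (the layer names are hypothetical, not the code from the link above):
```python
import paddle

frozen = paddle.nn.Linear(8, 8)  # layers up to the cut point
head = paddle.nn.Linear(8, 4)    # layers that should keep training

x = paddle.randn([2, 8])
feat = frozen(x)
feat.stop_gradient = True        # backpropagation will stop at this tensor
loss = head(feat).mean()
loss.backward()                  # only `head` receives gradients; `frozen` stays fixed
```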
3. After loss.backward() and before optimizer.step(), call the clear_gradients() method of nn.Layer (or clear_gradient() of paddle.Tensor). Calling it on the layers or parameters to be frozen does not affect the backpropagation of the loss. The following code clears the gradients of a layer, or the gradient of a single parameter tensor of a layer:
```python
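# Illustrative sketch only: the original snippet is truncated in this diff view,
# and the layer below is a hypothetical example.
import paddle

linear = paddle.nn.Linear(3, 3)
loss = linear(paddle.randn([4, 3])).mean()
loss.backward()

linear.clear_gradients()        # clear the gradients of the whole layer
linear.weight.clear_gradient()  # or clear the gradient of a single parameter tensor
```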


@@ -38,7 +38,7 @@
data_format=data_format)
```
2. Manually set stop_gradient=True for the frozen layer; refer to [this link](https://github.com/RainFrost1/PaddleClas/blob/24e968b8d9f7d9e2309e713cbf2afe8fda9deacd/ppcls/engine/train/train_idml.py#L40-L66). After using this method, the layers from the loss to the stop_gradient layer stop, that is, the weights of the previous layers are also fixed
2. Manually set stop_gradient=True for the layer to be frozen; refer to [this link](https://github.com/RainFrost1/PaddleClas/blob/24e968b8d9f7d9e2309e713cbf2afe8fda9deacd/ppcls/engine/train/train_idml.py#L40-L66). With this method, once the gradient is back-propagated to the point where stop_gradient=True is set, backpropagation stops there, so the weights of all preceding layers are fixed as well.
3. After loss.backward() and before optimizer.step(), call the clear_gradients() method of nn.Layer (or clear_gradient() of paddle.Tensor). Calling it on the layers or parameters to be frozen does not affect the backpropagation of the loss. The following code clears the gradients of a layer, or the gradient of a single parameter tensor of a layer:
```python