[cherry-pick] update paddle2onnx doc (#14051)
parent ee1aa57e52
commit 023d02d3f3
docs/ppocr/infer_deploy
@@ -92,6 +92,12 @@ After execution, the ONNX model will be saved in `./inference/det_onnx/`, `./inf
In addition, the following models do not currently support conversion to ONNX models: NRTR, SAR, RARE, SRN.
If you need to optimize the exported ONNX model, we recommend using `onnxslim`:
```bash linenums="1"
pip install onnxslim
onnxslim model.onnx slim.onnx
```
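As a quick sanity check, the slimmed model can be loaded back with ONNX Runtime to confirm it still parses and to inspect its input signature. This is a minimal sketch, assuming `onnxruntime` is installed and the `slim.onnx` file name from the command above:

```bash linenums="1"
# Optional check: load the slimmed model and print its input names and shapes.
# Assumes onnxruntime is installed (pip install onnxruntime).
python -c "import onnxruntime as ort; s = ort.InferenceSession('slim.onnx'); print([(i.name, i.shape) for i in s.get_inputs()])"
```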
## 3. Prediction
Taking the English OCR model as an example, run the following commands to predict with **ONNXRuntime**:
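The hunk is cut off before the command itself. As a hedged illustration only (the `tools/infer/predict_system.py` entry point, the `--use_onnx` flag, and the exact model and image paths below are assumptions based on typical PaddleOCR usage, not taken from this diff), such a call usually looks like:

```bash linenums="1"
# Hedged sketch, not from this diff: run the detection + classification +
# recognition pipeline on the exported ONNX models with ONNXRuntime.
python3 tools/infer/predict_system.py --use_gpu=False --use_onnx=True \
    --det_model_dir=./inference/det_onnx/model.onnx \
    --rec_model_dir=./inference/rec_onnx/model.onnx \
    --cls_model_dir=./inference/cls_onnx/model.onnx \
    --image_dir=./doc/imgs_en/img_12.jpg
```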
@@ -97,6 +97,12 @@ paddle2onnx --model_dir ./inference/ch_ppocr_mobile_v2.0_cls_infer \
--input_shape_dict "{'x': [-1,3,-1,-1]}"
```
If you need to optimize the exported ONNX model, we recommend using `onnxslim`:
```bash linenums="1"
pip install onnxslim
onnxslim model.onnx slim.onnx
```
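For the models exported earlier in this document, `onnxslim` would be pointed at the actual exported files rather than a generic `model.onnx`. A minimal sketch, assuming the exported detection model lives at `./inference/det_onnx/model.onnx` (the file name inside that directory is an assumption):

```bash linenums="1"
# Slim the exported detection model; the directory comes from the export step
# earlier in this document, the model.onnx file name is an assumption.
onnxslim ./inference/det_onnx/model.onnx ./inference/det_onnx/model_slim.onnx
```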
## 3. Inference and Prediction
Taking the Chinese OCR model as an example, run the following command to predict with ONNXRuntime:
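The hunk again cuts off before the command, which follows the same shape as the English sketch above. Before running it, it can help to confirm that ONNX Runtime is importable and see which execution providers are available; a minimal check, assuming the `onnxruntime` package:

```bash linenums="1"
# Print the installed ONNX Runtime version and its execution providers
# (e.g. CPUExecutionProvider) before running the prediction script.
python -c "import onnxruntime as ort; print(ort.__version__, ort.get_available_providers())"
```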