update backend docs (#319)
commit f76ac64cfe (parent 9d4d52078b)

@@ -5,10 +5,14 @@
#### Install ncnn

- Download VulkanTools for the compilation of ncnn.

```bash
wget https://sdk.lunarg.com/sdk/download/1.2.176.1/linux/vulkansdk-linux-x86_64-1.2.176.1.tar.gz?Human=true -O vulkansdk-linux-x86_64-1.2.176.1.tar.gz
tar -xf vulkansdk-linux-x86_64-1.2.176.1.tar.gz
export VULKAN_SDK=$(pwd)/1.2.176.1/x86_64
export LD_LIBRARY_PATH=$VULKAN_SDK/lib:$LD_LIBRARY_PATH
```

- Check your gcc version.
You should ensure your gcc satisfies `gcc >= 6`.
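
For example, you can print the installed compiler version (assuming `gcc` is on your `PATH`):

```bash
gcc --version  # the reported major version should be 6 or higher
```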

@@ -27,17 +31,18 @@ You should ensure your gcc satisfies `gcc >= 6`.
- <font color=red>Make install</font> ncnn library

```bash
cd ncnn
export NCNN_DIR=$(pwd)
git submodule update --init
mkdir -p build && cd build
cmake -DNCNN_VULKAN=ON -DNCNN_SYSTEM_GLSLANG=ON -DNCNN_BUILD_EXAMPLES=ON -DNCNN_PYTHON=ON -DNCNN_BUILD_TOOLS=ON -DNCNN_BUILD_BENCHMARK=ON -DNCNN_BUILD_TESTS=ON ..
make install
```

- Install pyncnn module

```bash
cd ${NCNN_DIR} # To NCNN root directory
cd python
pip install -e .
```
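
To sanity-check the editable install, you can try importing the module (a minimal check; it only confirms that pyncnn resolves):

```bash
python -c "import ncnn"  # exits silently if the module is importable
```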

#### Build custom ops

@@ -46,8 +51,7 @@ Some custom ops are created to support models in OpenMMLab, the custom ops can b

```bash
cd ${MMDEPLOY_DIR}
mkdir -p build && cd build
cmake -DMMDEPLOY_TARGET_BACKENDS=ncnn ..
make -j$(nproc)
```

@@ -55,7 +59,7 @@ make -j$(nproc)

If you haven't installed ncnn in the default path, please pass the `-Dncnn_DIR` flag to cmake.

```bash
cmake -DMMDEPLOY_TARGET_BACKENDS=ncnn -Dncnn_DIR=${NCNN_DIR}/build/install/lib/cmake/ncnn ..
make -j$(nproc)
```
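
You can confirm that the directory passed to `-Dncnn_DIR` contains the CMake package files (a quick check, assuming the `make install` step above):

```bash
ls ${NCNN_DIR}/build/install/lib/cmake/ncnn  # expect ncnnConfig.cmake (or ncnn-config.cmake) here
```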

@@ -44,10 +44,9 @@ Note:

```bash
cd ${MMDEPLOY_DIR} # To MMDeploy root directory
mkdir -p build && cd build
cmake -DMMDEPLOY_TARGET_BACKENDS=ort -DONNXRUNTIME_DIR=${ONNXRUNTIME_DIR} ..
make -j$(nproc)
```
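
Here `${ONNXRUNTIME_DIR}` must point at your ONNX Runtime package. If the runtime library cannot be found when you run the built ops, you may also need it on the loader path (a sketch; the `lib` subdirectory layout is an assumption about your ONNX Runtime package):

```bash
export LD_LIBRARY_PATH=${ONNXRUNTIME_DIR}/lib:$LD_LIBRARY_PATH
```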

### How to convert a model

@@ -73,8 +72,8 @@ make -j10

Take the custom operator `roi_align` for example.

1. Create a `roi_align` directory in the ONNX Runtime source directory `${MMDEPLOY_DIR}/csrc/backend_ops/onnxruntime/`
2. Add the header and source files into the `roi_align` directory `${MMDEPLOY_DIR}/csrc/backend_ops/onnxruntime/roi_align/`
3. Add a unit test into `tests/test_ops/test_ops.py`; it can be run as shown below.
   Check [here](../../../tests/test_ops/test_ops.py) for examples.
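
Once the op is added and the project rebuilt, the new test can be run on its own (a sketch; the `-k` filter assumes the test names contain `roi_align`):

```bash
cd ${MMDEPLOY_DIR}
pytest tests/test_ops/test_ops.py -k roi_align
```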

@@ -63,8 +63,8 @@ The table below lists the models that are guaranteed to be exportable to OpenVIN

| Faster R-CNN + DCN | `configs/dcn/faster_rcnn_r50_fpn_dconv_c3-c5_1x_coco.py` | Y |
| VFNet | `configs/vfnet/vfnet_r50_fpn_1x_coco.py` | Y |

Notes:

- Custom operations from OpenVINO use the domain `org.openvinotoolkit`.
- For faster inference in OpenVINO, the RoiAlign operation in the Faster-RCNN, Mask-RCNN, Cascade-RCNN, and Cascade-Mask-RCNN models is replaced with the [ExperimentalDetectronROIFeatureExtractor](https://docs.openvinotoolkit.org/latest/openvino_docs_ops_detection_ExperimentalDetectronROIFeatureExtractor_6.html) operation in the ONNX graph.

@@ -25,9 +25,8 @@ Please install TensorRT 8 follow [install-guide](https://docs.nvidia.com/deeplea

Some custom ops are created to support models in OpenMMLab, and the custom ops can be built as follows:

```bash
cd ${MMDEPLOY_DIR} # To MMDeploy root directory
mkdir -p build && cd build
cmake -DMMDEPLOY_TARGET_BACKENDS=trt ..
make -j$(nproc)
```
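
If TensorRT is not installed in a default system location, you may need to tell CMake where to find it, for example (the `-DTENSORRT_DIR` flag is an assumption here; check the project's CMake options for the exact name):

```bash
cmake -DMMDEPLOY_TARGET_BACKENDS=trt -DTENSORRT_DIR=${TENSORRT_DIR} ..
make -j$(nproc)
```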

@@ -73,10 +72,6 @@ test_pipeline = [

data = dict(
    samples_per_gpu=2,
    workers_per_gpu=2,
    val=dict(
        type=dataset_type,
        ann_file=data_root + 'val_annotations.json',