From f76ac64cfe645aa039f1ee702a9c4e64b2d85d5d Mon Sep 17 00:00:00 2001 From: RunningLeon Date: Thu, 23 Dec 2021 13:23:39 +0800 Subject: [PATCH] update backend docs (#319) --- docs/en/backends/ncnn.md | 18 +++++++++++------- docs/en/backends/onnxruntime.md | 9 ++++----- docs/en/backends/openvino.md | 2 +- docs/en/backends/tensorrt.md | 9 ++------- 4 files changed, 18 insertions(+), 20 deletions(-) diff --git a/docs/en/backends/ncnn.md b/docs/en/backends/ncnn.md index 2ae258a88..d752904dd 100644 --- a/docs/en/backends/ncnn.md +++ b/docs/en/backends/ncnn.md @@ -5,10 +5,14 @@ #### Install ncnn - Download VulkanTools for the compilation of ncnn. + ```bash wget https://sdk.lunarg.com/sdk/download/1.2.176.1/linux/vulkansdk-linux-x86_64-1.2.176.1.tar.gz?Human=true -O vulkansdk-linux-x86_64-1.2.176.1.tar.gz tar -xf vulkansdk-linux-x86_64-1.2.176.1.tar.gz export VULKAN_SDK=$(pwd)/1.2.176.1/x86_64 + export LD_LIBRARY_PATH=$VULKAN_SDK/lib:$LD_LIBRARY_PATH + ``` + - Check your gcc version. You should ensure your gcc satisfies `gcc >= 6`. @@ -27,17 +31,18 @@ You should ensure your gcc satisfies `gcc >= 6`. - Make install ncnn library ```bash cd ncnn + export NCNN_DIR=$(pwd) git submodule update --init - mkdir build - cd build + mkdir -p build && cd build cmake -DNCNN_VULKAN=ON -DNCNN_SYSTEM_GLSLANG=ON -DNCNN_BUILD_EXAMPLES=ON -DNCNN_PYTHON=ON -DNCNN_BUILD_TOOLS=ON -DNCNN_BUILD_BENCHMARK=ON -DNCNN_BUILD_TESTS=ON .. make install ``` - Install pyncnn module ```bash - cd ncnn/python - pip install . + cd ${NCNN_DIR} # To NCNN root directory + cd python + pip install -e . ``` #### Build custom ops @@ -46,8 +51,7 @@ Some custom ops are created to support models in OpenMMLab, the custom ops can b ```bash cd ${MMDEPLOY_DIR} -mkdir build -cd build +mkdir -p build && cd build cmake -DMMDEPLOY_TARGET_BACKENDS=ncnn .. make -j$(nproc) ``` @@ -55,7 +59,7 @@ make -j$(nproc) If you haven't installed NCNN in the default path, please add `-Dncnn_DIR` flag in cmake. 
```bash - cmake -DMMDEPLOY_TARGET_BACKENDS=ncnn -Dncnn_DIR={path/of/ncnn}/build/install/lib/cmake/ncnn .. + cmake -DMMDEPLOY_TARGET_BACKENDS=ncnn -Dncnn_DIR=${NCNN_DIR}/build/install/lib/cmake/ncnn .. make -j$(nproc) ``` diff --git a/docs/en/backends/onnxruntime.md b/docs/en/backends/onnxruntime.md index c878281f2..a47569d80 100644 --- a/docs/en/backends/onnxruntime.md +++ b/docs/en/backends/onnxruntime.md @@ -44,10 +44,9 @@ Note: ```bash cd ${MMDEPLOY_DIR} # To MMDeploy root directory -mkdir build -cd build +mkdir -p build && cd build cmake -DMMDEPLOY_TARGET_BACKENDS=ort -DONNXRUNTIME_DIR=${ONNXRUNTIME_DIR} .. -make -j10 +make -j$(nproc) ``` ### How to convert a model @@ -73,8 +72,8 @@ make -j10 Take custom operator `roi_align` for example. -1. Create a `roi_align` directory in ONNX Runtime source directory `backend_ops/onnxruntime/` -2. Add header and source file into `roi_align` directory `backend_ops/onnxruntime/roi_align/` +1. Create a `roi_align` directory in ONNX Runtime source directory `${MMDEPLOY_DIR}/csrc/backend_ops/onnxruntime/` +2. Add header and source file into `roi_align` directory `${MMDEPLOY_DIR}/csrc/backend_ops/onnxruntime/roi_align/` 3. Add unit test into `tests/test_ops/test_ops.py` Check [here](../../../tests/test_ops/test_ops.py) for examples. diff --git a/docs/en/backends/openvino.md b/docs/en/backends/openvino.md index f8ac75f28..2c8d5c2d2 100644 --- a/docs/en/backends/openvino.md +++ b/docs/en/backends/openvino.md @@ -63,8 +63,8 @@ The table below lists the models that are guaranteed to be exportable to OpenVIN | Faster R-CNN + DCN | `configs/dcn/faster_rcnn_r50_fpn_dconv_c3-c5_1x_coco.py` | Y | | VFNet | `configs/vfnet/vfnet_r50_fpn_1x_coco.py` | Y | - Notes: + - Custom operations from OpenVINO use the domain `org.openvinotoolkit`. 
- For faster work in OpenVINO in the Faster-RCNN, Mask-RCNN, Cascade-RCNN, Cascade-Mask-RCNN models the RoiAlign operation is replaced with the [ExperimentalDetectronROIFeatureExtractor](https://docs.openvinotoolkit.org/latest/openvino_docs_ops_detection_ExperimentalDetectronROIFeatureExtractor_6.html) operation in the ONNX graph. diff --git a/docs/en/backends/tensorrt.md b/docs/en/backends/tensorrt.md index 2bce59e4e..fbe3c754f 100644 --- a/docs/en/backends/tensorrt.md +++ b/docs/en/backends/tensorrt.md @@ -25,9 +25,8 @@ Please install TensorRT 8 follow [install-guide](https://docs.nvidia.com/deeplea Some custom ops are created to support models in OpenMMLab, and the custom ops can be built as follow: ```bash -cd ${MMDEPLOY_DIR} -mkdir build -cd build +cd ${MMDEPLOY_DIR} # To MMDeploy root directory +mkdir -p build && cd build cmake -DMMDEPLOY_TARGET_BACKENDS=trt .. make -j$(nproc) ``` @@ -73,10 +72,6 @@ test_pipeline = [ data = dict( samples_per_gpu=2, workers_per_gpu=2, - train=dict( - type=dataset_type, - ann_file=data_root + 'train_annotations.json', - pipeline=train_pipeline), val=dict( type=dataset_type, ann_file=data_root + 'val_annotations.json',
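The patch applies the same out-of-source build recipe to every backend doc: `mkdir -p build && cd build`, then `cmake` with a `-DMMDEPLOY_TARGET_BACKENDS` flag, then `make -j$(nproc)`. That recurring recipe can be sketched as one reusable function; `MMDEPLOY_DIR` and the backend name here are illustrative placeholders, not values fixed by the patch:

```shell
#!/usr/bin/env bash
# Hedged sketch of the build recipe this patch standardizes across the docs.
set -euo pipefail

build_backend() {
  local src_dir=$1 backend=$2   # backend: e.g. ncnn, ort, trt
  cd "$src_dir"
  # mkdir -p makes the step idempotent, so re-running the docs' commands
  # no longer fails on an existing build/ directory
  mkdir -p build && cd build
  cmake -DMMDEPLOY_TARGET_BACKENDS="$backend" ..
  # $(nproc) scales the build to the available cores, replacing the
  # hard-coded -j10 that the patch removes from the onnxruntime doc
  make -j"$(nproc)"
}

# Usage (assuming MMDEPLOY_DIR points at an MMDeploy checkout):
#   build_backend "$MMDEPLOY_DIR" ncnn
```

Defining this as a function rather than a straight-line script mirrors how the patch repeats the same three commands per backend while varying only the `-DMMDEPLOY_TARGET_BACKENDS` value.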