# TensorRT Plugins for custom operators in MMCV (Experimental)
<!-- TOC -->

- [TensorRT Plugins for custom operators in MMCV (Experimental)](#tensorrt-plugins-for-custom-operators-in-mmcv-experimental)
  - [Introduction](#introduction)
  - [List of TensorRT plugins supported in MMCV](#list-of-tensorrt-plugins-supported-in-mmcv)
  - [How to build TensorRT plugins in MMCV](#how-to-build-tensorrt-plugins-in-mmcv)
    - [Prerequisite](#prerequisite)
    - [Build on Linux](#build-on-linux)
  - [Create TensorRT engine and run inference in Python](#create-tensorrt-engine-and-run-inference-in-python)
  - [How to add a TensorRT plugin for custom op in MMCV](#how-to-add-a-tensorrt-plugin-for-custom-op-in-mmcv)
    - [Main procedures](#main-procedures)
    - [Reminders](#reminders)
  - [Known Issues](#known-issues)
  - [References](#references)

<!-- TOC -->
## Introduction
**NVIDIA TensorRT** is a software development kit (SDK) for high-performance inference of deep learning models. It includes a deep learning inference optimizer and a runtime that delivers low latency and high throughput for deep learning inference applications. Please check its [developer's website](https://developer.nvidia.com/tensorrt) for more information.
To ease the deployment of trained models with custom operators from `mmcv.ops` using TensorRT, a series of TensorRT plugins are included in MMCV.
## How to build TensorRT plugins in MMCV

### Prerequisite

For detailed information on installing TensorRT from a tar archive, please refer to [NVIDIA's website](https://docs.nvidia.com/deeplearning/tensorrt/archives/tensorrt-721/install-guide/index.html#installing-tar).
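As a rough sketch, a tar installation might look like the following. The archive name, the TensorRT 7.2.1.6 / CUDA 10.2 combination, and the Python 3.7 wheel are assumptions here; substitute the files you actually downloaded. The `TENSORRT_DIR` environment variable is what MMCV's build script uses to locate the TensorRT headers and libraries.

```bash
cd ~/Downloads
# Unpack the TensorRT tar archive (file name depends on your download)
tar -xvzf TensorRT-7.2.1.6.Ubuntu-16.04.x86_64-gnu.cuda-10.2.cudnn8.0.tar.gz
# Point the MMCV build and the dynamic linker at the unpacked SDK
export TENSORRT_DIR=$(pwd)/TensorRT-7.2.1.6
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$TENSORRT_DIR/lib
# Install the TensorRT Python bindings shipped inside the archive
pip install $TENSORRT_DIR/python/tensorrt-7.2.1.6-cp37-none-linux_x86_64.whl
```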
### Build on Linux
```bash
cd mmcv # to MMCV root directory
MMCV_WITH_OPS=1 MMCV_WITH_TRT=1 pip install -e .
```
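Once the build completes, you can check from Python that the plugin library was compiled and can be loaded; `is_tensorrt_plugin_loaded` is exposed by `mmcv.tensorrt` for this purpose:

```python
from mmcv.tensorrt import is_tensorrt_plugin_loaded

# Returns True only if the MMCV TensorRT plugin library was built and is loadable
assert is_tensorrt_plugin_loaded(), 'Rebuild MMCV with MMCV_WITH_TRT=1'
```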
## Create TensorRT engine and run inference in Python
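Below is a minimal sketch of the workflow this section covers: converting an ONNX model into a TensorRT engine and running inference through MMCV's Python utilities (`onnx2trt`, `save_trt_engine`, and `TRTWrapper` from `mmcv.tensorrt`). The `sample.onnx` file, with a single input named `input` and a single output named `output`, is a hypothetical placeholder for a model you have exported beforehand.

```python
import onnx
import torch
from mmcv.tensorrt import TRTWrapper, onnx2trt, save_trt_engine

onnx_file = 'sample.onnx'  # hypothetical ONNX model exported beforehand
trt_file = 'sample.trt'
onnx_model = onnx.load(onnx_file)

# Example input; TensorRT needs min/optimal/max shapes for each input
inputs = torch.rand(1, 3, 224, 224).cuda()
opt_shape_dict = {
    'input': [list(inputs.shape), list(inputs.shape), list(inputs.shape)]
}

# Build the engine from the ONNX graph with a 1 GiB workspace
max_workspace_size = 1 << 30
trt_engine = onnx2trt(
    onnx_model, opt_shape_dict, max_workspace_size=max_workspace_size)

# Serialize the engine to disk for later reuse
save_trt_engine(trt_engine, trt_file)

# Run inference: the wrapper maps named inputs/outputs to engine bindings
trt_model = TRTWrapper(trt_file, ['input'], ['output'])
with torch.no_grad():
    trt_outputs = trt_model({'input': inputs})
    output = trt_outputs['output']
```

The three shape entries per input correspond to the minimum, optimal, and maximum shapes of the TensorRT optimization profile; using the same shape for all three, as above, builds a static-shape engine.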