**Note**: ${MODEL_STORE} needs to be an absolute path to a folder.
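For example, assuming your archived models live in a local `model-store/` directory (the directory name here is only an illustration), you could resolve it to an absolute path like this:

```shell
export MODEL_STORE=$(realpath ./model-store)
```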
## 2. Build `mmcls-serve` docker image
```shell
docker build -t mmcls-serve:latest docker/serve/
```
## 3. Run `mmcls-serve`
Check the official docs for [running TorchServe with docker](https://github.com/pytorch/serve/blob/master/docker/README.md#running-torchserve-in-a-production-docker-environment).
In order to run on GPU, you need to install [nvidia-docker](https://docs.nvidia.com/datacenter/cloud-native/container-toolkit/install-guide.html). You can omit the `--gpus` argument to run on CPU instead.
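As a starting point, the sketch below runs the image built above with TorchServe's default ports (8080 for inference, 8081 for management, 8082 for metrics) and mounts `${MODEL_STORE}` into the default model-store location of the TorchServe base image; the CPU limit and GPU device ID are placeholders to adjust for your setup.

```shell
docker run --rm \
  --cpus 8 \
  --gpus device=0 \
  -p8080:8080 -p8081:8081 -p8082:8082 \
  --mount type=bind,source=$MODEL_STORE,target=/home/model-server/model-store \
  mmcls-serve:latest
```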