update docs

gh-pages
KaiyangZhou 2019-08-23 22:37:55 +01:00
parent cf226b63e6
commit e1610fee6a
33 changed files with 2746 additions and 2163 deletions


@@ -185,65 +185,65 @@
<div class="section" id="conferences">
<h2>Conferences<a class="headerlink" href="#conferences" title="Permalink to this headline"></a></h2>
<ul class="simple">
-<li><p><strong><a class="reference external" href="#cvpr-2019">CVPR 2019</a></strong></p></li>
-<li><p><strong><a class="reference external" href="#aaai-2019">AAAI 2019</a></strong></p></li>
-<li><p><strong><a class="reference external" href="#neurips-2018">NeurIPS 2018</a></strong></p></li>
-<li><p><strong><a class="reference external" href="#eccv-2018">ECCV 2018</a></strong></p></li>
-<li><p><strong><a class="reference external" href="#cvpr-2018">CVPR 2018</a></strong></p></li>
-<li><p><strong><a class="reference external" href="#arxiv">ArXiv</a></strong></p></li>
+<li><strong><a class="reference external" href="#cvpr-2019">CVPR 2019</a></strong></li>
+<li><strong><a class="reference external" href="#aaai-2019">AAAI 2019</a></strong></li>
+<li><strong><a class="reference external" href="#neurips-2018">NeurIPS 2018</a></strong></li>
+<li><strong><a class="reference external" href="#eccv-2018">ECCV 2018</a></strong></li>
+<li><strong><a class="reference external" href="#cvpr-2018">CVPR 2018</a></strong></li>
+<li><strong><a class="reference external" href="#arxiv">ArXiv</a></strong></li>
</ul>
<div class="section" id="cvpr-2019">
<h3>CVPR 2019<a class="headerlink" href="#cvpr-2019" title="Permalink to this headline"></a></h3>
<ul class="simple">
-<li><p>Joint Discriminative and Generative Learning for Person Re-identification. [<a class="reference external" href="https://arxiv.org/abs/1904.07223">paper</a>][<a class="reference external" href="https://github.com/NVlabs/DG-Net">code</a>]</p></li>
-<li><p>Invariance Matters: Exemplar Memory for Domain Adaptive Person Re-identification. [<a class="reference external" href="https://arxiv.org/abs/1904.01990">paper</a>][<a class="reference external" href="https://github.com/zhunzhong07/ECN">code</a>]</p></li>
-<li><p>Dissecting Person Re-identification from the Viewpoint of Viewpoint. [<a class="reference external" href="https://arxiv.org/abs/1812.02162">paper</a>][<a class="reference external" href="https://github.com/sxzrt/Dissecting-Person-Re-ID-from-the-Viewpoint-of-Viewpoint">code</a>]</p></li>
-<li><p>Unsupervised Person Re-identification by Soft Multilabel Learning. [<a class="reference external" href="https://arxiv.org/abs/1903.06325">paper</a>][<a class="reference external" href="https://github.com/KovenYu/MAR">code</a>]</p></li>
-<li><p>Patch-based Discriminative Feature Learning for Unsupervised Person Re-identification. [<a class="reference external" href="https://kovenyu.com/publication/2019-cvpr-pedal/">paper</a>][<a class="reference external" href="https://github.com/QizeYang/PAUL">code</a>]</p></li>
+<li>Joint Discriminative and Generative Learning for Person Re-identification. [<a class="reference external" href="https://arxiv.org/abs/1904.07223">paper</a>][<a class="reference external" href="https://github.com/NVlabs/DG-Net">code</a>]</li>
+<li>Invariance Matters: Exemplar Memory for Domain Adaptive Person Re-identification. [<a class="reference external" href="https://arxiv.org/abs/1904.01990">paper</a>][<a class="reference external" href="https://github.com/zhunzhong07/ECN">code</a>]</li>
+<li>Dissecting Person Re-identification from the Viewpoint of Viewpoint. [<a class="reference external" href="https://arxiv.org/abs/1812.02162">paper</a>][<a class="reference external" href="https://github.com/sxzrt/Dissecting-Person-Re-ID-from-the-Viewpoint-of-Viewpoint">code</a>]</li>
+<li>Unsupervised Person Re-identification by Soft Multilabel Learning. [<a class="reference external" href="https://arxiv.org/abs/1903.06325">paper</a>][<a class="reference external" href="https://github.com/KovenYu/MAR">code</a>]</li>
+<li>Patch-based Discriminative Feature Learning for Unsupervised Person Re-identification. [<a class="reference external" href="https://kovenyu.com/publication/2019-cvpr-pedal/">paper</a>][<a class="reference external" href="https://github.com/QizeYang/PAUL">code</a>]</li>
</ul>
</div>
<div class="section" id="aaai-2019">
<h3>AAAI 2019<a class="headerlink" href="#aaai-2019" title="Permalink to this headline"></a></h3>
<ul class="simple">
-<li><p>Spatial and Temporal Mutual Promotion for Video-based Person Re-identification. [<a class="reference external" href="https://arxiv.org/abs/1812.10305">paper</a>][<a class="reference external" href="https://github.com/yolomax/person-reid-lib">code</a>]</p></li>
-<li><p>Spatial-Temporal Person Re-identification. [<a class="reference external" href="https://arxiv.org/abs/1812.03282">paper</a>][<a class="reference external" href="https://github.com/Wanggcong/Spatial-Temporal-Re-identification">code</a>]</p></li>
-<li><p>Horizontal Pyramid Matching for Person Re-identification. [<a class="reference external" href="https://arxiv.org/abs/1804.05275">paper</a>][<a class="reference external" href="https://github.com/OasisYang/HPM">code</a>]</p></li>
-<li><p>Backbone Can Not be Trained at Once: Rolling Back to Pre-trained Network for Person Re-identification. [<a class="reference external" href="https://arxiv.org/abs/1901.06140">paper</a>][<a class="reference external" href="https://github.com/youngminPIL/rollback">code</a>]</p></li>
-<li><p>A Bottom-Up Clustering Approach to Unsupervised Person Re-identification. [<a class="reference external" href="https://vana77.github.io/vana77.github.io/images/AAAI19.pdf">paper</a>][<a class="reference external" href="https://github.com/vana77/Bottom-up-Clustering-Person-Re-identification">code</a>]</p></li>
+<li>Spatial and Temporal Mutual Promotion for Video-based Person Re-identification. [<a class="reference external" href="https://arxiv.org/abs/1812.10305">paper</a>][<a class="reference external" href="https://github.com/yolomax/person-reid-lib">code</a>]</li>
+<li>Spatial-Temporal Person Re-identification. [<a class="reference external" href="https://arxiv.org/abs/1812.03282">paper</a>][<a class="reference external" href="https://github.com/Wanggcong/Spatial-Temporal-Re-identification">code</a>]</li>
+<li>Horizontal Pyramid Matching for Person Re-identification. [<a class="reference external" href="https://arxiv.org/abs/1804.05275">paper</a>][<a class="reference external" href="https://github.com/OasisYang/HPM">code</a>]</li>
+<li>Backbone Can Not be Trained at Once: Rolling Back to Pre-trained Network for Person Re-identification. [<a class="reference external" href="https://arxiv.org/abs/1901.06140">paper</a>][<a class="reference external" href="https://github.com/youngminPIL/rollback">code</a>]</li>
+<li>A Bottom-Up Clustering Approach to Unsupervised Person Re-identification. [<a class="reference external" href="https://vana77.github.io/vana77.github.io/images/AAAI19.pdf">paper</a>][<a class="reference external" href="https://github.com/vana77/Bottom-up-Clustering-Person-Re-identification">code</a>]</li>
</ul>
</div>
<div class="section" id="neurips-2018">
<h3>NeurIPS 2018<a class="headerlink" href="#neurips-2018" title="Permalink to this headline"></a></h3>
<ul class="simple">
-<li><p>FD-GAN: Pose-guided Feature Distilling GAN for Robust Person Re-identification. [<a class="reference external" href="https://arxiv.org/abs/1810.02936">paper</a>][<a class="reference external" href="https://github.com/yxgeee/FD-GAN">code</a>]</p></li>
+<li>FD-GAN: Pose-guided Feature Distilling GAN for Robust Person Re-identification. [<a class="reference external" href="https://arxiv.org/abs/1810.02936">paper</a>][<a class="reference external" href="https://github.com/yxgeee/FD-GAN">code</a>]</li>
</ul>
</div>
<div class="section" id="eccv-2018">
<h3>ECCV 2018<a class="headerlink" href="#eccv-2018" title="Permalink to this headline"></a></h3>
<ul class="simple">
-<li><p>Generalizing A Person Retrieval Model Hetero- and Homogeneously. [<a class="reference external" href="http://openaccess.thecvf.com/content_ECCV_2018/papers/Zhun_Zhong_Generalizing_A_Person_ECCV_2018_paper.pdf">paper</a>][<a class="reference external" href="https://github.com/zhunzhong07/HHL">code</a>]</p></li>
-<li><p>Pose-Normalized Image Generation for Person Re-identification. [<a class="reference external" href="https://arxiv.org/abs/1712.02225">paper</a>][<a class="reference external" href="https://github.com/naiq/PN_GAN">code</a>]</p></li>
+<li>Generalizing A Person Retrieval Model Hetero- and Homogeneously. [<a class="reference external" href="http://openaccess.thecvf.com/content_ECCV_2018/papers/Zhun_Zhong_Generalizing_A_Person_ECCV_2018_paper.pdf">paper</a>][<a class="reference external" href="https://github.com/zhunzhong07/HHL">code</a>]</li>
+<li>Pose-Normalized Image Generation for Person Re-identification. [<a class="reference external" href="https://arxiv.org/abs/1712.02225">paper</a>][<a class="reference external" href="https://github.com/naiq/PN_GAN">code</a>]</li>
</ul>
</div>
<div class="section" id="cvpr-2018">
<h3>CVPR 2018<a class="headerlink" href="#cvpr-2018" title="Permalink to this headline"></a></h3>
<ul class="simple">
-<li><p>Camera Style Adaptation for Person Re-Identification. [<a class="reference external" href="https://arxiv.org/abs/1711.10295">paper</a>][<a class="reference external" href="https://github.com/zhunzhong07/CamStyle">code</a>]</p></li>
-<li><p>Deep Group-Shuffling Random Walk for Person Re-Identification. [<a class="reference external" href="https://arxiv.org/abs/1807.11178">paper</a>][<a class="reference external" href="https://github.com/YantaoShen/kpm_rw_person_reid">code</a>]</p></li>
-<li><p>End-to-End Deep Kronecker-Product Matching for Person Re-identification. [<a class="reference external" href="https://arxiv.org/abs/1807.11182">paper</a>][<a class="reference external" href="https://github.com/YantaoShen/kpm_rw_person_reid">code</a>]</p></li>
-<li><p>Features for Multi-Target Multi-Camera Tracking and Re-Identification. [<a class="reference external" href="https://arxiv.org/abs/1803.10859">paper</a>][<a class="reference external" href="https://github.com/ergysr/DeepCC">code</a>]</p></li>
-<li><p>Group Consistent Similarity Learning via Deep CRF for Person Re-Identification. [<a class="reference external" href="http://openaccess.thecvf.com/content_cvpr_2018/papers/Chen_Group_Consistent_Similarity_CVPR_2018_paper.pdf">paper</a>][<a class="reference external" href="https://github.com/dapengchen123/crf_affinity">code</a>]</p></li>
-<li><p>Harmonious Attention Network for Person Re-Identification. [<a class="reference external" href="https://arxiv.org/abs/1802.08122">paper</a>][<a class="reference external" href="https://github.com/KaiyangZhou/deep-person-reid">code</a>]</p></li>
-<li><p>Human Semantic Parsing for Person Re-Identification. [<a class="reference external" href="https://arxiv.org/abs/1804.00216">paper</a>][<a class="reference external" href="https://github.com/emrahbasaran/SPReID">code</a>]</p></li>
-<li><p>Multi-Level Factorisation Net for Person Re-Identification. [<a class="reference external" href="https://arxiv.org/abs/1803.09132">paper</a>][<a class="reference external" href="https://github.com/KaiyangZhou/deep-person-reid">code</a>]</p></li>
-<li><p>Resource Aware Person Re-identification across Multiple Resolutions. [<a class="reference external" href="https://arxiv.org/abs/1805.08805">paper</a>][<a class="reference external" href="https://github.com/mileyan/DARENet">code</a>]</p></li>
-<li><p>Exploit the Unknown Gradually: One-Shot Video-Based Person Re-Identification by Stepwise Learning. [<a class="reference external" href="https://yu-wu.net/pdf/CVPR2018_Exploit-Unknown-Gradually.pdf">paper</a>][<a class="reference external" href="https://github.com/Yu-Wu/Exploit-Unknown-Gradually">code</a>]</p></li>
+<li>Camera Style Adaptation for Person Re-Identification. [<a class="reference external" href="https://arxiv.org/abs/1711.10295">paper</a>][<a class="reference external" href="https://github.com/zhunzhong07/CamStyle">code</a>]</li>
+<li>Deep Group-Shuffling Random Walk for Person Re-Identification. [<a class="reference external" href="https://arxiv.org/abs/1807.11178">paper</a>][<a class="reference external" href="https://github.com/YantaoShen/kpm_rw_person_reid">code</a>]</li>
+<li>End-to-End Deep Kronecker-Product Matching for Person Re-identification. [<a class="reference external" href="https://arxiv.org/abs/1807.11182">paper</a>][<a class="reference external" href="https://github.com/YantaoShen/kpm_rw_person_reid">code</a>]</li>
+<li>Features for Multi-Target Multi-Camera Tracking and Re-Identification. [<a class="reference external" href="https://arxiv.org/abs/1803.10859">paper</a>][<a class="reference external" href="https://github.com/ergysr/DeepCC">code</a>]</li>
+<li>Group Consistent Similarity Learning via Deep CRF for Person Re-Identification. [<a class="reference external" href="http://openaccess.thecvf.com/content_cvpr_2018/papers/Chen_Group_Consistent_Similarity_CVPR_2018_paper.pdf">paper</a>][<a class="reference external" href="https://github.com/dapengchen123/crf_affinity">code</a>]</li>
+<li>Harmonious Attention Network for Person Re-Identification. [<a class="reference external" href="https://arxiv.org/abs/1802.08122">paper</a>][<a class="reference external" href="https://github.com/KaiyangZhou/deep-person-reid">code</a>]</li>
+<li>Human Semantic Parsing for Person Re-Identification. [<a class="reference external" href="https://arxiv.org/abs/1804.00216">paper</a>][<a class="reference external" href="https://github.com/emrahbasaran/SPReID">code</a>]</li>
+<li>Multi-Level Factorisation Net for Person Re-Identification. [<a class="reference external" href="https://arxiv.org/abs/1803.09132">paper</a>][<a class="reference external" href="https://github.com/KaiyangZhou/deep-person-reid">code</a>]</li>
+<li>Resource Aware Person Re-identification across Multiple Resolutions. [<a class="reference external" href="https://arxiv.org/abs/1805.08805">paper</a>][<a class="reference external" href="https://github.com/mileyan/DARENet">code</a>]</li>
+<li>Exploit the Unknown Gradually: One-Shot Video-Based Person Re-Identification by Stepwise Learning. [<a class="reference external" href="https://yu-wu.net/pdf/CVPR2018_Exploit-Unknown-Gradually.pdf">paper</a>][<a class="reference external" href="https://github.com/Yu-Wu/Exploit-Unknown-Gradually">code</a>]</li>
</ul>
</div>
<div class="section" id="arxiv">
<h3>ArXiv<a class="headerlink" href="#arxiv" title="Permalink to this headline"></a></h3>
<ul class="simple">
-<li><p>Revisiting Temporal Modeling for Video-based Person ReID. [<a class="reference external" href="https://arxiv.org/abs/1805.02104">paper</a>][<a class="reference external" href="https://github.com/jiyanggao/Video-Person-ReID">code</a>]</p></li>
+<li>Revisiting Temporal Modeling for Video-based Person ReID. [<a class="reference external" href="https://arxiv.org/abs/1805.02104">paper</a>][<a class="reference external" href="https://github.com/jiyanggao/Video-Person-ReID">code</a>]</li>
</ul>
</div>
</div>


@@ -182,12 +182,12 @@
<h1>Model Zoo<a class="headerlink" href="#model-zoo" title="Permalink to this headline"></a></h1>
<p>In general,</p>
<ul class="simple">
-<li><p>results are presented in the format of <em>&lt;Rank-1 (mAP)&gt;</em>, unless specified otherwise.</p></li>
-<li><p>when computing model size and FLOPs, only layers that are used at test time are considered (see <code class="docutils literal notranslate"><span class="pre">torchreid.utils.compute_model_complexity</span></code>).</p></li>
-<li><p>asterisk (*) means the model is trained from scratch.</p></li>
-<li><p><code class="docutils literal notranslate"><span class="pre">combineall=True</span></code> means all images in the dataset are used for model training.</p></li>
-<li><p>for the cuhk03 dataset, we use the 767/700 split by <a class="reference external" href="https://arxiv.org/abs/1701.08398">Zhong et al. CVPR17</a>.</p></li>
-<li><p><a class="reference external" href="https://arxiv.org/abs/1512.00567">label smoothing regularizer</a> is used in the softmax loss.</p></li>
+<li>results are presented in the format of <em>&lt;Rank-1 (mAP)&gt;</em>, unless specified otherwise.</li>
+<li>when computing model size and FLOPs, only layers that are used at test time are considered (see <code class="docutils literal notranslate"><span class="pre">torchreid.utils.compute_model_complexity</span></code>).</li>
+<li>asterisk (*) means the model is trained from scratch.</li>
+<li><code class="docutils literal notranslate"><span class="pre">combineall=True</span></code> means all images in the dataset are used for model training.</li>
+<li>for the cuhk03 dataset, we use the 767/700 split by <a class="reference external" href="https://arxiv.org/abs/1701.08398">Zhong et al. CVPR17</a>.</li>
+<li><a class="reference external" href="https://arxiv.org/abs/1512.00567">label smoothing regularizer</a> is used in the softmax loss.</li>
</ul>
<div class="section" id="imagenet-pretrained-models">
<h2>ImageNet pretrained models<a class="headerlink" href="#imagenet-pretrained-models" title="Permalink to this headline"></a></h2>


@@ -227,7 +227,7 @@
        prefix: string
        matched: bool
    """
-    if isinstance(src, tuple) or isinstance(src, list):
+    if isinstance(src, (tuple, list)):
        if prefix == 'gallery':
            suffix = 'TRUE' if matched else 'FALSE'
            dst = osp.join(dst, prefix + '_top' + str(rank).zfill(3)) + '_' + suffix

@@ -65,7 +65,7 @@ We provide a tool in ``torchreid.utils.model_complexity.py`` to automatically co
# count flops for all layers including ReLU and BatchNorm
utils.compute_model_complexity(model, (1, 3, 256, 128), verbose=True, only_conv_linear=False)
-It is worth noting that (1) this function only provides an estimate of the theoretical time complexity rather than the actual running time which depends on implementations and hardware, and (2) the FLOPs is only counted for layers that are used at test time. This means that redundant layers such as person ID classification layer will be ignored as it is discarded when doing feature extraction. Note that the inference graph depends on how you construct the computations in ``forward()``.
+Note that (1) this function only provides an estimate of the theoretical time complexity rather than the actual running time, which depends on implementations and hardware; (2) the FLOPs are only counted for layers that are used at test time. This means that redundant layers such as the person ID classification layer will be ignored. The inference graph depends on how you define the computations in ``forward()``.
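
A minimal usage sketch (assuming, as the snippet above suggests, that the function takes the model and an input size and returns the parameter count and FLOPs; the model construction is illustrative):

.. code-block:: python

    import torchreid
    from torchreid import utils

    # build any model; resnet50 with 751 classes is just an example
    model = torchreid.models.build_model(name='resnet50', num_classes=751)
    # input size is (batch, channels, height, width)
    num_params, flops = utils.compute_model_complexity(model, (1, 3, 256, 128))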
Combine multiple datasets
-------------------------
@@ -117,7 +117,7 @@ This can be easily done by setting ``combineall=True`` when instantiating a data
combineall=True # it's me, here
)
-More specifically, with ``combineall=False``, you would get
+More specifically, with ``combineall=False``, you will get
.. code-block:: none
@@ -130,7 +130,7 @@ More specifically, with ``combineall=False``, you would get
gallery | 751 | 15913 | 6
---------------------------------------
-with ``combineall=True``, you would get
+with ``combineall=True``, you will get
.. code-block:: none
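
A minimal sketch of the instantiation that the ``combineall=True`` snippet above comes from (the ``root`` and ``sources`` values are illustrative):

.. code-block:: python

    import torchreid

    datamanager = torchreid.data.ImageDataManager(
        root='reid-data',
        sources='market1501',
        combineall=True  # merge train, query and gallery into the training set
    )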
@@ -168,9 +168,9 @@ Please refer to :ref:`torchreid_optim` for more details.
Do two-stepped transfer learning
-------------------------------------
-To prevent the pretrained layers to be damaged by harmful gradients back-propagated from randomly initialized layers, one can adopt the *two-stepped transfer learning strategy* presented in `Deep Transfer Learning for Person Re-identification <https://arxiv.org/abs/1611.05244>`_. The basic idea is to pretrain the randomly initialized layers for few epochs while keeping the base layers frozen before training all layers end-to-end.
+To prevent the pretrained layers from being damaged by harmful gradients back-propagated from randomly initialized layers, one can adopt the *two-stepped transfer learning strategy* presented in `Deep Transfer Learning for Person Re-identification <https://arxiv.org/abs/1611.05244>`_. The basic idea is to pretrain the randomly initialized layers for a few epochs while keeping the base layers frozen, before training all layers end-to-end.
-This has been implemented in ``Engine.train()`` (see :ref:`torchreid_engine`). The arguments to enable this feature are ``fixbase_epoch`` and ``open_layers``. Intuitively, ``fixbase_epoch`` denotes the number of epochs to keep the base layers frozen; ``open_layers`` means which layers are open for training.
+This has been implemented in ``Engine.train()`` (see :ref:`torchreid_engine`). The arguments related to this feature are ``fixbase_epoch`` and ``open_layers``. Intuitively, ``fixbase_epoch`` denotes the number of epochs to keep the base layers frozen; ``open_layers`` specifies which layers are open for training.
For example, say you want to pretrain the classification layer named "classifier" in ResNet50 for 5 epochs before training all layers, you can do
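A minimal sketch of how these two arguments might be passed (the ``save_dir`` and ``max_epoch`` values are illustrative):

.. code-block:: python

    engine.run(
        save_dir='log/resnet50',
        max_epoch=60,
        fixbase_epoch=5,          # keep the base layers frozen for 5 epochs
        open_layers='classifier'  # only the classifier is trained during warm-up
    )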
@@ -198,18 +198,18 @@ You can load a trained model using :code:`torchreid.utils.load_pretrained_weight
Visualize learning curves with tensorboard
--------------------------------------------
-The ``SummaryWriter()`` for tensorboard will be automatically initialized in ``engine.run()`` when you are training your model. Therefore, you do not need to do extra jobs. After the training is done, the ``*tf.events*`` file will be saved in ``save_dir``. Then, you just call ``tensorboard --logdir=your_save_dir`` in your terminal and visit ``http://localhost:6006/`` in your web browser. See `pytorch tensorboard <https://pytorch.org/docs/stable/tensorboard.html>`_ for further information.
+The ``SummaryWriter()`` for tensorboard is automatically initialized in ``engine.run()`` when you are training your model, so no extra work is needed on your part. After training is done, the ``*tf.events*`` file is saved in ``save_dir``. You can then call ``tensorboard --logdir=your_save_dir`` in your terminal and visit ``http://localhost:6006/`` in a web browser. See `pytorch tensorboard <https://pytorch.org/docs/stable/tensorboard.html>`_ for further information.
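
For example (assuming training wrote its events file to ``log/resnet50``, an illustrative path):

.. code-block:: shell

    tensorboard --logdir=log/resnet50
    # then open http://localhost:6006/ in a browser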
Visualize ranked results
-------------------------
-Ranked images can be visualized by setting ``visrank`` to True in ``engine.run()``. ``visrank_topk`` determines the top-k images to be visualized (Default is ``visrank_topk=10``). Note that ``visrank`` can only be used in test mode, i.e. ``test_only=True`` in ``engine.run()``. The images will be saved under ``save_dir/visrank_DATASETNAME`` where each image sketches the ranked list given a query. An example is shown below. Red and green denote incorrect and correct matches respectively.
+Ranked images can be visualized by setting ``visrank`` to true in ``engine.run()``. ``visrank_topk`` determines the top-k images to be visualized (default is ``visrank_topk=10``). Note that ``visrank`` can only be used in test mode, i.e. ``test_only=True`` in ``engine.run()``. The images will be saved under ``save_dir/visrank_DATASETNAME``, where each image contains the top-k ranked list given a query. An example is shown below, with a sketch of the corresponding call after it. Red and green denote incorrect and correct matches respectively.
.. image:: figures/ranked_results.jpg
:width: 800px
:align: center
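
A minimal sketch using the Python API described above (``visrank``, ``visrank_topk`` and ``test_only`` are the documented arguments; ``save_dir`` is illustrative):

.. code-block:: python

    engine.run(
        save_dir='log/resnet50',
        test_only=True,   # visrank only works in test mode
        visrank=True,     # saves images under save_dir/visrank_DATASETNAME
        visrank_topk=10   # number of top-ranked images per query
    )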
-An example command line using ``scripts/main.py`` is
+An example command for ``scripts/main.py`` is
.. code-block:: shell
@@ -235,7 +235,7 @@ To understand where the CNN focuses on to extract features for ReID, you can vis
:align: center
-An example command line using ``scripts/main.py`` is
+An example command for ``scripts/main.py`` is
.. code-block:: shell
@@ -331,4 +331,4 @@ Use your own dataset
Design your own Engine
------------------------
-A new Engine should be designed if you have your own loss function. The base Engine class ``torchreid.engine.Engine`` has implemented some generic methods which you want to inherit to avoid re-writing. Please refer to the source code for more details. You are suggested to see how ``ImageSoftmaxEngine`` and ``ImageTripletEngine`` are constructed (also ``VideoSoftmaxEngine`` and ``VideoTripletEngine``). All you need to implement might be just a ``train()`` function.
+A new Engine should be designed if you have your own loss function. The base Engine class ``torchreid.engine.Engine`` implements some generic methods which you can inherit to avoid re-writing them. Please refer to the source code for more details. It is recommended to look at how ``ImageSoftmaxEngine`` and ``ImageTripletEngine`` are constructed (also ``VideoSoftmaxEngine`` and ``VideoTripletEngine``). All you might need to implement is just a ``train()`` function.
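
A minimal sketch of such a subclass (the constructor signature and the custom loss are assumptions for illustration; check the base class for the exact interface):

.. code-block:: python

    import torchreid

    class MyEngine(torchreid.engine.Engine):
        """Engine with a custom loss; only ``train()`` is overridden."""

        def __init__(self, datamanager, model, optimizer, my_criterion):
            super(MyEngine, self).__init__(datamanager, model, optimizer)
            self.my_criterion = my_criterion

        def train(self, epoch, max_epoch, **kwargs):
            # iterate over self.train_loader, compute my_criterion and
            # update the model, mirroring ImageSoftmaxEngine.train()
            ...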


@@ -231,16 +231,6 @@ a.headerlink {
visibility: hidden;
}
-a.brackets:before,
-span.brackets > a:before{
-content: "[";
-}
-a.brackets:after,
-span.brackets > a:after {
-content: "]";
-}
h1:hover > a.headerlink,
h2:hover > a.headerlink,
h3:hover > a.headerlink,
@@ -289,12 +279,6 @@ img.align-center, .figure.align-center, object.align-center {
margin-right: auto;
}
-img.align-default, .figure.align-default {
-display: block;
-margin-left: auto;
-margin-right: auto;
-}
.align-left {
text-align: left;
}
@@ -303,10 +287,6 @@ img.align-default, .figure.align-default {
text-align: center;
}
-.align-default {
-text-align: center;
-}
.align-right {
text-align: right;
}
@@ -378,11 +358,6 @@ table.align-center {
margin-right: auto;
}
-table.align-default {
-margin-left: auto;
-margin-right: auto;
-}
table caption span.caption-number {
font-style: italic;
}
@@ -416,16 +391,6 @@ table.citation td {
border-bottom: none;
}
-th > p:first-child,
-td > p:first-child {
-margin-top: 0px;
-}
-th > p:last-child,
-td > p:last-child {
-margin-bottom: 0px;
-}
/* -- figures --------------------------------------------------------------- */
div.figure {
@@ -495,58 +460,11 @@ ol.upperroman {
list-style: upper-roman;
}
-li > p:first-child {
-margin-top: 0px;
-}
-li > p:last-child {
-margin-bottom: 0px;
-}
-dl.footnote > dt,
-dl.citation > dt {
-float: left;
-}
-dl.footnote > dd,
-dl.citation > dd {
-margin-bottom: 0em;
-}
-dl.footnote > dd:after,
-dl.citation > dd:after {
-content: "";
-clear: both;
-}
-dl.field-list {
-display: grid;
-grid-template-columns: fit-content(30%) auto;
-}
-dl.field-list > dt {
-font-weight: bold;
-word-break: break-word;
-padding-left: 0.5em;
-padding-right: 5px;
-}
-dl.field-list > dt:after {
-content: ":";
-}
-dl.field-list > dd {
-padding-left: 0.5em;
-margin-top: 0em;
-margin-left: 0em;
-margin-bottom: 0em;
-}
dl {
margin-bottom: 15px;
}
-dd > p:first-child {
+dd p {
margin-top: 0px;
}
@@ -619,12 +537,6 @@ dl.glossary dt {
font-style: oblique;
}
-.classifier:before {
-font-style: normal;
-margin: 0.5em;
-content: ":";
-}
abbr, acronym {
border-bottom: dotted 1px;
cursor: help;

BIN _static/comment.png (new file, 641 B)

@@ -87,13 +87,14 @@ jQuery.fn.highlightText = function(text, className) {
node.nextSibling));
node.nodeValue = val.substr(0, pos);
if (isInSVG) {
-var bbox = span.getBBox();
var rect = document.createElementNS("http://www.w3.org/2000/svg", "rect");
+var bbox = node.parentElement.getBBox();
rect.x.baseVal.value = bbox.x;
rect.y.baseVal.value = bbox.y;
rect.width.baseVal.value = bbox.width;
rect.height.baseVal.value = bbox.height;
rect.setAttribute('class', className);
var parentOfText = node.parentNode.parentNode;
addItems.push({
"parent": node.parentNode,
"target": rect});


@@ -6,5 +6,5 @@ var DOCUMENTATION_OPTIONS = {
FILE_SUFFIX: '.html',
HAS_SOURCE: true,
SOURCELINK_SUFFIX: '.txt',
-NAVIGATION_WITH_KEYS: false
+NAVIGATION_WITH_KEYS: false,
};

BIN _static/down.png (new file, 202 B)

File diff suppressed because it is too large

_static/jquery.js (vendored): diff suppressed because one or more lines are too long

@@ -36,10 +36,8 @@ if (!Scorer) {
// query found in title
title: 15,
-partialTitle: 7,
// query found in terms
-term: 5,
-partialTerm: 2
+term: 5
};
}
@@ -58,14 +56,6 @@
_queued_query : null,
_pulse_status : -1,
-htmlToText : function(htmlString) {
-var htmlElement = document.createElement('span');
-htmlElement.innerHTML = htmlString;
-$(htmlElement).find('.headerlink').remove();
-docContent = $(htmlElement).find('[role=main]')[0];
-return docContent.textContent || docContent.innerText;
-},
init : function() {
var params = $.getQueryParameters();
if (params.q) {
@@ -130,7 +120,7 @@
this.out = $('#search-results');
this.title = $('<h2>' + _('Searching') + '</h2>').appendTo(this.out);
this.dots = $('<span></span>').appendTo(this.title);
-this.status = $('<p class="search-summary">&nbsp;</p>').appendTo(this.out);
+this.status = $('<p style="display: none"></p>').appendTo(this.out);
this.output = $('<ul class="search"/>').appendTo(this.out);
$('#search-progress').text(_('Preparing search...'));
@@ -269,7 +259,11 @@
displayNextItem();
});
} else if (DOCUMENTATION_OPTIONS.HAS_SOURCE) {
-$.ajax({url: DOCUMENTATION_OPTIONS.URL_ROOT + item[0] + DOCUMENTATION_OPTIONS.FILE_SUFFIX,
+var suffix = DOCUMENTATION_OPTIONS.SOURCELINK_SUFFIX;
+if (suffix === undefined) {
+suffix = '.txt';
+}
+$.ajax({url: DOCUMENTATION_OPTIONS.URL_ROOT + '_sources/' + item[5] + (item[5].slice(-suffix.length) === suffix ? '' : suffix),
dataType: "text",
complete: function(jqxhr, textstatus) {
var data = jqxhr.responseText;
@@ -319,13 +313,12 @@
for (var prefix in objects) {
for (var name in objects[prefix]) {
var fullname = (prefix ? prefix + '.' : '') + name;
-var fullnameLower = fullname.toLowerCase()
-if (fullnameLower.indexOf(object) > -1) {
+if (fullname.toLowerCase().indexOf(object) > -1) {
var score = 0;
-var parts = fullnameLower.split('.');
+var parts = fullname.split('.');
// check for different match types: exact matches of full name or
// "last name" (i.e. last dotted part)
-if (fullnameLower == object || parts[parts.length - 1] == object) {
+if (fullname == object || parts[parts.length - 1] == object) {
score += Scorer.objNameMatch;
// matches in last name
} else if (parts[parts.length - 1].indexOf(object) > -1) {
@@ -392,19 +385,6 @@
{files: terms[word], score: Scorer.term},
{files: titleterms[word], score: Scorer.title}
];
-// add support for partial matches
-if (word.length > 2) {
-for (var w in terms) {
-if (w.match(word) && !terms[word]) {
-_o.push({files: terms[w], score: Scorer.partialTerm})
-}
-}
-for (var w in titleterms) {
-if (w.match(word) && !titleterms[word]) {
-_o.push({files: titleterms[w], score: Scorer.partialTitle})
-}
-}
-}
// no match but word was a required one
if ($u.every(_o, function(o){return o.files === undefined;})) {
@@ -444,12 +424,8 @@
var valid = true;
// check if all requirements are matched
-var filteredTermCount = // as search terms with length < 3 are discarded: ignore
-searchterms.filter(function(term){return term.length > 2}).length
-if (
-fileMap[file].length != searchterms.length &&
-fileMap[file].length != filteredTermCount
-) continue;
+if (fileMap[file].length != searchterms.length)
+continue;
// ensure that none of the excluded terms is in the search result
for (i = 0; i < excluded.length; i++) {
@@ -480,8 +456,7 @@
* words. the first one is used to find the occurrence, the
* latter for highlighting it.
*/
-makeSearchSummary : function(htmlText, keywords, hlwords) {
-var text = Search.htmlToText(htmlText);
+makeSearchSummary : function(text, keywords, hlwords) {
var textLower = text.toLowerCase();
var start = 0;
$.each(keywords, function() {

BIN _static/up.png (new file, 203 B)

@@ -0,0 +1,808 @@
/*
* websupport.js
* ~~~~~~~~~~~~~
*
* sphinx.websupport utilities for all documentation.
*
* :copyright: Copyright 2007-2019 by the Sphinx team, see AUTHORS.
* :license: BSD, see LICENSE for details.
*
*/
(function($) {
$.fn.autogrow = function() {
return this.each(function() {
var textarea = this;
$.fn.autogrow.resize(textarea);
$(textarea)
.focus(function() {
textarea.interval = setInterval(function() {
$.fn.autogrow.resize(textarea);
}, 500);
})
.blur(function() {
clearInterval(textarea.interval);
});
});
};
$.fn.autogrow.resize = function(textarea) {
var lineHeight = parseInt($(textarea).css('line-height'), 10);
var lines = textarea.value.split('\n');
var columns = textarea.cols;
var lineCount = 0;
$.each(lines, function() {
lineCount += Math.ceil(this.length / columns) || 1;
});
var height = lineHeight * (lineCount + 1);
$(textarea).css('height', height);
};
})(jQuery);
(function($) {
var comp, by;
function init() {
initEvents();
initComparator();
}
function initEvents() {
$(document).on("click", 'a.comment-close', function(event) {
event.preventDefault();
hide($(this).attr('id').substring(2));
});
$(document).on("click", 'a.vote', function(event) {
event.preventDefault();
handleVote($(this));
});
$(document).on("click", 'a.reply', function(event) {
event.preventDefault();
openReply($(this).attr('id').substring(2));
});
$(document).on("click", 'a.close-reply', function(event) {
event.preventDefault();
closeReply($(this).attr('id').substring(2));
});
$(document).on("click", 'a.sort-option', function(event) {
event.preventDefault();
handleReSort($(this));
});
$(document).on("click", 'a.show-proposal', function(event) {
event.preventDefault();
showProposal($(this).attr('id').substring(2));
});
$(document).on("click", 'a.hide-proposal', function(event) {
event.preventDefault();
hideProposal($(this).attr('id').substring(2));
});
$(document).on("click", 'a.show-propose-change', function(event) {
event.preventDefault();
showProposeChange($(this).attr('id').substring(2));
});
$(document).on("click", 'a.hide-propose-change', function(event) {
event.preventDefault();
hideProposeChange($(this).attr('id').substring(2));
});
$(document).on("click", 'a.accept-comment', function(event) {
event.preventDefault();
acceptComment($(this).attr('id').substring(2));
});
$(document).on("click", 'a.delete-comment', function(event) {
event.preventDefault();
deleteComment($(this).attr('id').substring(2));
});
$(document).on("click", 'a.comment-markup', function(event) {
event.preventDefault();
toggleCommentMarkupBox($(this).attr('id').substring(2));
});
}
/**
* Set comp, which is a comparator function used for sorting and
* inserting comments into the list.
*/
function setComparator() {
// If the first three letters are "asc", sort in ascending order
// and remove the prefix.
if (by.substring(0,3) == 'asc') {
var i = by.substring(3);
comp = function(a, b) { return a[i] - b[i]; };
} else {
// Otherwise sort in descending order.
comp = function(a, b) { return b[by] - a[by]; };
}
// Reset link styles and format the selected sort option.
$('a.sel').attr('href', '#').removeClass('sel');
$('a.by' + by).removeAttr('href').addClass('sel');
}
/**
* Create a comp function. If the user has preferences stored in
* the sortBy cookie, use those, otherwise use the default.
*/
function initComparator() {
by = 'rating'; // Default to sort by rating.
// If the sortBy cookie is set, use that instead.
if (document.cookie.length > 0) {
var start = document.cookie.indexOf('sortBy=');
if (start != -1) {
start = start + 7;
var end = document.cookie.indexOf(";", start);
if (end == -1) {
end = document.cookie.length;
by = unescape(document.cookie.substring(start, end));
}
}
}
setComparator();
}
/**
* Show a comment div.
*/
function show(id) {
$('#ao' + id).hide();
$('#ah' + id).show();
var context = $.extend({id: id}, opts);
var popup = $(renderTemplate(popupTemplate, context)).hide();
popup.find('textarea[name="proposal"]').hide();
popup.find('a.by' + by).addClass('sel');
var form = popup.find('#cf' + id);
form.submit(function(event) {
event.preventDefault();
addComment(form);
});
$('#s' + id).after(popup);
popup.slideDown('fast', function() {
getComments(id);
});
}
/**
* Hide a comment div.
*/
function hide(id) {
$('#ah' + id).hide();
$('#ao' + id).show();
var div = $('#sc' + id);
div.slideUp('fast', function() {
div.remove();
});
}
/**
* Perform an ajax request to get comments for a node
* and insert the comments into the comments tree.
*/
function getComments(id) {
$.ajax({
type: 'GET',
url: opts.getCommentsURL,
data: {node: id},
success: function(data, textStatus, request) {
var ul = $('#cl' + id);
var speed = 100;
$('#cf' + id)
.find('textarea[name="proposal"]')
.data('source', data.source);
if (data.comments.length === 0) {
ul.html('<li>No comments yet.</li>');
ul.data('empty', true);
} else {
// If there are comments, sort them and put them in the list.
var comments = sortComments(data.comments);
speed = data.comments.length * 100;
appendComments(comments, ul);
ul.data('empty', false);
}
$('#cn' + id).slideUp(speed + 200);
ul.slideDown(speed);
},
error: function(request, textStatus, error) {
showError('Oops, there was a problem retrieving the comments.');
},
dataType: 'json'
});
}
/**
* Add a comment via ajax and insert the comment into the comment tree.
*/
function addComment(form) {
var node_id = form.find('input[name="node"]').val();
var parent_id = form.find('input[name="parent"]').val();
var text = form.find('textarea[name="comment"]').val();
var proposal = form.find('textarea[name="proposal"]').val();
if (text == '') {
showError('Please enter a comment.');
return;
}
// Disable the form that is being submitted.
form.find('textarea,input').attr('disabled', 'disabled');
// Send the comment to the server.
$.ajax({
type: "POST",
url: opts.addCommentURL,
dataType: 'json',
data: {
node: node_id,
parent: parent_id,
text: text,
proposal: proposal
},
success: function(data, textStatus, error) {
// Reset the form.
if (node_id) {
hideProposeChange(node_id);
}
form.find('textarea')
.val('')
.add(form.find('input'))
.removeAttr('disabled');
var ul = $('#cl' + (node_id || parent_id));
if (ul.data('empty')) {
$(ul).empty();
ul.data('empty', false);
}
insertComment(data.comment);
var ao = $('#ao' + node_id);
ao.find('img').attr({'src': opts.commentBrightImage});
if (node_id) {
// if this was a "root" comment, remove the commenting box
// (the user can get it back by reopening the comment popup)
$('#ca' + node_id).slideUp();
}
},
error: function(request, textStatus, error) {
form.find('textarea,input').removeAttr('disabled');
showError('Oops, there was a problem adding the comment.');
}
});
}
/**
* Recursively append comments to the main comment list and children
* lists, creating the comment tree.
*/
function appendComments(comments, ul) {
$.each(comments, function() {
var div = createCommentDiv(this);
ul.append($(document.createElement('li')).html(div));
appendComments(this.children, div.find('ul.comment-children'));
// To avoid stagnating data, don't store the comments children in data.
this.children = null;
div.data('comment', this);
});
}
/**
* After adding a new comment, it must be inserted in the correct
* location in the comment tree.
*/
function insertComment(comment) {
var div = createCommentDiv(comment);
// To avoid stagnating data, don't store the comments children in data.
comment.children = null;
div.data('comment', comment);
var ul = $('#cl' + (comment.node || comment.parent));
var siblings = getChildren(ul);
var li = $(document.createElement('li'));
li.hide();
// Determine where in the parents children list to insert this comment.
for(var i=0; i < siblings.length; i++) {
if (comp(comment, siblings[i]) <= 0) {
$('#cd' + siblings[i].id)
.parent()
.before(li.html(div));
li.slideDown('fast');
return;
}
}
// If we get here, this comment rates lower than all the others,
// or it is the only comment in the list.
ul.append(li.html(div));
li.slideDown('fast');
}
function acceptComment(id) {
$.ajax({
type: 'POST',
url: opts.acceptCommentURL,
data: {id: id},
success: function(data, textStatus, request) {
$('#cm' + id).fadeOut('fast');
$('#cd' + id).removeClass('moderate');
},
error: function(request, textStatus, error) {
showError('Oops, there was a problem accepting the comment.');
}
});
}
function deleteComment(id) {
$.ajax({
type: 'POST',
url: opts.deleteCommentURL,
data: {id: id},
success: function(data, textStatus, request) {
var div = $('#cd' + id);
if (data == 'delete') {
// Moderator mode: remove the comment and all children immediately
div.slideUp('fast', function() {
div.remove();
});
return;
}
// User mode: only mark the comment as deleted
div
.find('span.user-id:first')
.text('[deleted]').end()
.find('div.comment-text:first')
.text('[deleted]').end()
.find('#cm' + id + ', #dc' + id + ', #ac' + id + ', #rc' + id +
', #sp' + id + ', #hp' + id + ', #cr' + id + ', #rl' + id)
.remove();
var comment = div.data('comment');
comment.username = '[deleted]';
comment.text = '[deleted]';
div.data('comment', comment);
},
error: function(request, textStatus, error) {
showError('Oops, there was a problem deleting the comment.');
}
});
}
function showProposal(id) {
$('#sp' + id).hide();
$('#hp' + id).show();
$('#pr' + id).slideDown('fast');
}
function hideProposal(id) {
$('#hp' + id).hide();
$('#sp' + id).show();
$('#pr' + id).slideUp('fast');
}
function showProposeChange(id) {
$('#pc' + id).hide();
$('#hc' + id).show();
var textarea = $('#pt' + id);
textarea.val(textarea.data('source'));
$.fn.autogrow.resize(textarea[0]);
textarea.slideDown('fast');
}
function hideProposeChange(id) {
$('#hc' + id).hide();
$('#pc' + id).show();
var textarea = $('#pt' + id);
textarea.val('').removeAttr('disabled');
textarea.slideUp('fast');
}
function toggleCommentMarkupBox(id) {
$('#mb' + id).toggle();
}
/** Handle when the user clicks on a sort by link. */
function handleReSort(link) {
var classes = link.attr('class').split(/\s+/);
for (var i=0; i<classes.length; i++) {
if (classes[i] != 'sort-option') {
by = classes[i].substring(2);
}
}
setComparator();
// Save/update the sortBy cookie.
var expiration = new Date();
expiration.setDate(expiration.getDate() + 365);
document.cookie= 'sortBy=' + escape(by) +
';expires=' + expiration.toUTCString();
$('ul.comment-ul').each(function(index, ul) {
var comments = getChildren($(ul), true);
comments = sortComments(comments);
appendComments(comments, $(ul).empty());
});
}
/**
* Function to process a vote when a user clicks an arrow.
*/
function handleVote(link) {
if (!opts.voting) {
showError("You'll need to login to vote.");
return;
}
var id = link.attr('id');
if (!id) {
// Didn't click on one of the voting arrows.
return;
}
// If it is an unvote, the new vote value is 0,
// Otherwise it's 1 for an upvote, or -1 for a downvote.
var value = 0;
if (id.charAt(1) != 'u') {
value = id.charAt(0) == 'u' ? 1 : -1;
}
// The data to be sent to the server.
var d = {
comment_id: id.substring(2),
value: value
};
// Swap the vote and unvote links.
link.hide();
$('#' + id.charAt(0) + (id.charAt(1) == 'u' ? 'v' : 'u') + d.comment_id)
.show();
// The div the comment is displayed in.
var div = $('div#cd' + d.comment_id);
var data = div.data('comment');
// If this is not an unvote, and the other vote arrow has
// already been pressed, unpress it.
if ((d.value !== 0) && (data.vote === d.value * -1)) {
$('#' + (d.value == 1 ? 'd' : 'u') + 'u' + d.comment_id).hide();
$('#' + (d.value == 1 ? 'd' : 'u') + 'v' + d.comment_id).show();
}
// Update the comments rating in the local data.
data.rating += (data.vote === 0) ? d.value : (d.value - data.vote);
data.vote = d.value;
div.data('comment', data);
// Change the rating text.
div.find('.rating:first')
.text(data.rating + ' point' + (data.rating == 1 ? '' : 's'));
// Send the vote information to the server.
$.ajax({
type: "POST",
url: opts.processVoteURL,
data: d,
error: function(request, textStatus, error) {
showError('Oops, there was a problem casting that vote.');
}
});
}
/**
* Open a reply form used to reply to an existing comment.
*/
function openReply(id) {
// Swap out the reply link for the hide link
$('#rl' + id).hide();
$('#cr' + id).show();
// Add the reply li to the children ul.
var div = $(renderTemplate(replyTemplate, {id: id})).hide();
$('#cl' + id)
.prepend(div)
// Setup the submit handler for the reply form.
.find('#rf' + id)
.submit(function(event) {
event.preventDefault();
addComment($('#rf' + id));
closeReply(id);
})
.find('input[type=button]')
.click(function() {
closeReply(id);
});
div.slideDown('fast', function() {
$('#rf' + id).find('textarea').focus();
});
}
/**
* Close the reply form opened with openReply.
*/
function closeReply(id) {
// Remove the reply div from the DOM.
$('#rd' + id).slideUp('fast', function() {
$(this).remove();
});
// Swap out the hide link for the reply link
$('#cr' + id).hide();
$('#rl' + id).show();
}
/**
* Recursively sort a tree of comments using the comp comparator.
*/
function sortComments(comments) {
comments.sort(comp);
$.each(comments, function() {
this.children = sortComments(this.children);
});
return comments;
}
/**
* Get the children comments from a ul. If recursive is true,
* recursively include childrens' children.
*/
function getChildren(ul, recursive) {
var children = [];
ul.children().children("[id^='cd']")
.each(function() {
var comment = $(this).data('comment');
if (recursive)
comment.children = getChildren($(this).find('#cl' + comment.id), true);
children.push(comment);
});
return children;
}
/** Create a div to display a comment in. */
function createCommentDiv(comment) {
if (!comment.displayed && !opts.moderator) {
return $('<div class="moderate">Thank you! Your comment will show up '
+ 'once it is has been approved by a moderator.</div>');
}
// Prettify the comment rating.
comment.pretty_rating = comment.rating + ' point' +
(comment.rating == 1 ? '' : 's');
// Make a class (for displaying not yet moderated comments differently)
comment.css_class = comment.displayed ? '' : ' moderate';
// Create a div for this comment.
var context = $.extend({}, opts, comment);
var div = $(renderTemplate(commentTemplate, context));
// If the user has voted on this comment, highlight the correct arrow.
if (comment.vote) {
var direction = (comment.vote == 1) ? 'u' : 'd';
div.find('#' + direction + 'v' + comment.id).hide();
div.find('#' + direction + 'u' + comment.id).show();
}
if (opts.moderator || comment.text != '[deleted]') {
div.find('a.reply').show();
if (comment.proposal_diff)
div.find('#sp' + comment.id).show();
if (opts.moderator && !comment.displayed)
div.find('#cm' + comment.id).show();
if (opts.moderator || (opts.username == comment.username))
div.find('#dc' + comment.id).show();
}
return div;
}
/**
* A simple template renderer. Placeholders such as <%id%> are replaced
* by context['id'] with items being escaped. Placeholders such as <#id#>
* are not escaped.
*/
function renderTemplate(template, context) {
var esc = $(document.createElement('div'));
function handle(ph, escape) {
var cur = context;
$.each(ph.split('.'), function() {
cur = cur[this];
});
return escape ? esc.text(cur || "").html() : cur;
}
return template.replace(/<([%#])([\w\.]*)\1>/g, function() {
return handle(arguments[2], arguments[1] == '%' ? true : false);
});
}
/** Flash an error message briefly. */
function showError(message) {
$(document.createElement('div')).attr({'class': 'popup-error'})
.append($(document.createElement('div'))
.attr({'class': 'error-message'}).text(message))
.appendTo('body')
.fadeIn("slow")
.delay(2000)
.fadeOut("slow");
}
/** Add a link the user uses to open the comments popup. */
$.fn.comment = function() {
return this.each(function() {
var id = $(this).attr('id').substring(1);
var count = COMMENT_METADATA[id];
var title = count + ' comment' + (count == 1 ? '' : 's');
var image = count > 0 ? opts.commentBrightImage : opts.commentImage;
var addcls = count == 0 ? ' nocomment' : '';
$(this)
.append(
$(document.createElement('a')).attr({
href: '#',
'class': 'sphinx-comment-open' + addcls,
id: 'ao' + id
})
.append($(document.createElement('img')).attr({
src: image,
alt: 'comment',
title: title
}))
.click(function(event) {
event.preventDefault();
show($(this).attr('id').substring(2));
})
)
.append(
$(document.createElement('a')).attr({
href: '#',
'class': 'sphinx-comment-close hidden',
id: 'ah' + id
})
.append($(document.createElement('img')).attr({
src: opts.closeCommentImage,
alt: 'close',
title: 'close'
}))
.click(function(event) {
event.preventDefault();
hide($(this).attr('id').substring(2));
})
);
});
};
var opts = {
processVoteURL: '/_process_vote',
addCommentURL: '/_add_comment',
getCommentsURL: '/_get_comments',
acceptCommentURL: '/_accept_comment',
deleteCommentURL: '/_delete_comment',
commentImage: '/static/_static/comment.png',
closeCommentImage: '/static/_static/comment-close.png',
loadingImage: '/static/_static/ajax-loader.gif',
commentBrightImage: '/static/_static/comment-bright.png',
upArrow: '/static/_static/up.png',
downArrow: '/static/_static/down.png',
upArrowPressed: '/static/_static/up-pressed.png',
downArrowPressed: '/static/_static/down-pressed.png',
voting: false,
moderator: false
};
if (typeof COMMENT_OPTIONS != "undefined") {
opts = jQuery.extend(opts, COMMENT_OPTIONS);
}
var popupTemplate = '\
<div class="sphinx-comments" id="sc<%id%>">\
<p class="sort-options">\
Sort by:\
<a href="#" class="sort-option byrating">best rated</a>\
<a href="#" class="sort-option byascage">newest</a>\
<a href="#" class="sort-option byage">oldest</a>\
</p>\
<div class="comment-header">Comments</div>\
<div class="comment-loading" id="cn<%id%>">\
loading comments... <img src="<%loadingImage%>" alt="" /></div>\
<ul id="cl<%id%>" class="comment-ul"></ul>\
<div id="ca<%id%>">\
<p class="add-a-comment">Add a comment\
(<a href="#" class="comment-markup" id="ab<%id%>">markup</a>):</p>\
<div class="comment-markup-box" id="mb<%id%>">\
reStructured text markup: <i>*emph*</i>, <b>**strong**</b>, \
<code>``code``</code>, \
code blocks: <code>::</code> and an indented block after blank line</div>\
<form method="post" id="cf<%id%>" class="comment-form" action="">\
<textarea name="comment" cols="80"></textarea>\
<p class="propose-button">\
<a href="#" id="pc<%id%>" class="show-propose-change">\
Propose a change &#9657;\
</a>\
<a href="#" id="hc<%id%>" class="hide-propose-change">\
Propose a change &#9663;\
</a>\
</p>\
<textarea name="proposal" id="pt<%id%>" cols="80"\
spellcheck="false"></textarea>\
<input type="submit" value="Add comment" />\
<input type="hidden" name="node" value="<%id%>" />\
<input type="hidden" name="parent" value="" />\
</form>\
</div>\
</div>';
var commentTemplate = '\
<div id="cd<%id%>" class="sphinx-comment<%css_class%>">\
<div class="vote">\
<div class="arrow">\
<a href="#" id="uv<%id%>" class="vote" title="vote up">\
<img src="<%upArrow%>" />\
</a>\
<a href="#" id="uu<%id%>" class="un vote" title="vote up">\
<img src="<%upArrowPressed%>" />\
</a>\
</div>\
<div class="arrow">\
<a href="#" id="dv<%id%>" class="vote" title="vote down">\
<img src="<%downArrow%>" id="da<%id%>" />\
</a>\
<a href="#" id="du<%id%>" class="un vote" title="vote down">\
<img src="<%downArrowPressed%>" />\
</a>\
</div>\
</div>\
<div class="comment-content">\
<p class="tagline comment">\
<span class="user-id"><%username%></span>\
<span class="rating"><%pretty_rating%></span>\
<span class="delta"><%time.delta%></span>\
</p>\
<div class="comment-text comment"><#text#></div>\
<p class="comment-opts comment">\
<a href="#" class="reply hidden" id="rl<%id%>">reply &#9657;</a>\
<a href="#" class="close-reply" id="cr<%id%>">reply &#9663;</a>\
<a href="#" id="sp<%id%>" class="show-proposal">proposal &#9657;</a>\
<a href="#" id="hp<%id%>" class="hide-proposal">proposal &#9663;</a>\
<a href="#" id="dc<%id%>" class="delete-comment hidden">delete</a>\
<span id="cm<%id%>" class="moderation hidden">\
<a href="#" id="ac<%id%>" class="accept-comment">accept</a>\
</span>\
</p>\
<pre class="proposal" id="pr<%id%>">\
<#proposal_diff#>\
</pre>\
<ul class="comment-children" id="cl<%id%>"></ul>\
</div>\
<div class="clearleft"></div>\
</div>\
</div>';
var replyTemplate = '\
<li>\
<div class="reply-div" id="rd<%id%>">\
<form id="rf<%id%>">\
<textarea name="comment" cols="80"></textarea>\
<input type="submit" value="Add reply" />\
<input type="button" value="Cancel" />\
<input type="hidden" name="parent" value="<%id%>" />\
<input type="hidden" name="node" value="" />\
</form>\
</div>\
</li>';
$(document).ready(function() {
init();
});
})(jQuery);
$(document).ready(function() {
// add comment anchors for all paragraphs that are commentable
$('.sphinx-has-comment').comment();
// highlight search words in search results
$("div.context").each(function() {
var params = $.getQueryParameters();
var terms = (params.q) ? params.q[0].split(/\s+/) : [];
var result = $(this);
$.each(terms, function() {
result.highlightText(this.toLowerCase(), 'highlighted');
});
});
// directly open comment window if requested
var anchor = document.location.hash;
if (anchor.substring(0, 9) == '#comment-') {
$('#ao' + anchor.substring(9)).click();
document.location.hash = '#s' + anchor.substring(9);
}
});


@@ -196,39 +196,37 @@
<p>Suppose you want to store the reid data in a directory called “path/to/reid-data/”; you then need to specify the <code class="docutils literal notranslate"><span class="pre">root</span></code> as <em>root=path/to/reid-data/</em> when initializing <code class="docutils literal notranslate"><span class="pre">DataManager</span></code>. Below we use <code class="docutils literal notranslate"><span class="pre">$REID</span></code> to denote “path/to/reid-data”.</p>
<p>Please refer to <a class="reference internal" href="pkg/data.html#torchreid-data"><span class="std std-ref">torchreid.data</span></a> for details regarding the arguments.</p>
<div class="admonition note">
<p class="admonition-title">Note</p>
<p>Dataset with a <span class="math notranslate nohighlight">\(\dagger\)</span> superscript means that the process is automated, so you can directly call the dataset in <code class="docutils literal notranslate"><span class="pre">DataManager</span></code> (which automatically downloads the dataset and organizes the data structure). However, we also provide a way below to help the manual setup in case the automation fails.</p>
<p class="first admonition-title">Note</p>
<p class="last">Dataset with a <span class="math notranslate nohighlight">\(\dagger\)</span> superscript means that the process is automated, so you can directly call the dataset in <code class="docutils literal notranslate"><span class="pre">DataManager</span></code> (which automatically downloads the dataset and organizes the data structure). However, we also provide a way below to help the manual setup in case the automation fails.</p>
</div>
<div class="admonition note">
<p class="admonition-title">Note</p>
<p>The keys to use specific datasets are enclosed in the parantheses beside the datasets.</p>
<p class="first admonition-title">Note</p>
<p class="last">The keys to use specific datasets are enclosed in the parantheses beside the datasets.</p>
</div>
<div class="admonition note">
<p class="admonition-title">Note</p>
<p>You are suggested to use the provided names for dataset folders such as “market1501” for Market1501 and “dukemtmcreid” for DukeMTMC-reID when doing the manual setup, otherwise you need to modify the source code accordingly (i.e. the <code class="docutils literal notranslate"><span class="pre">dataset_dir</span></code> attribute).</p>
<p class="first admonition-title">Note</p>
<p class="last">You are suggested to use the provided names for dataset folders such as “market1501” for Market1501 and “dukemtmcreid” for DukeMTMC-reID when doing the manual setup, otherwise you need to modify the source code accordingly (i.e. the <code class="docutils literal notranslate"><span class="pre">dataset_dir</span></code> attribute).</p>
</div>
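<p>If you must keep a non-default folder name, the attribute to change is <code class="docutils literal notranslate"><span class="pre">dataset_dir</span></code> on the dataset class. A hypothetical sketch (assuming the class is importable from <code class="docutils literal notranslate"><span class="pre">torchreid.data.datasets.image</span></code>; the folder name is made up):</p>
<div class="highlight-python notranslate"><div class="highlight"><pre><span></span>from torchreid.data.datasets.image import Market1501

# Hypothetical: point Market1501 at a custom folder name under $REID
# instead of the default "market1501" (normally done by editing the source).
Market1501.dataset_dir = 'my-market-folder'
</pre></div>
</div>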
<div class="contents local topic" id="contents">
<ul class="simple">
<li><p><a class="reference internal" href="#image-datasets" id="id1">Image Datasets</a></p>
<ul>
<li><p><a class="reference internal" href="#market1501-dagger-market1501" id="id2">Market1501 <span class="math notranslate nohighlight">\(^\dagger\)</span> (<code class="docutils literal notranslate"><span class="pre">market1501</span></code>)</a></p></li>
<li><p><a class="reference internal" href="#cuhk03-cuhk03" id="id3">CUHK03 (<code class="docutils literal notranslate"><span class="pre">cuhk03</span></code>)</a></p></li>
<li><p><a class="reference internal" href="#dukemtmc-reid-dagger-dukemtmcreid" id="id4">DukeMTMC-reID <span class="math notranslate nohighlight">\(^\dagger\)</span> (<code class="docutils literal notranslate"><span class="pre">dukemtmcreid</span></code>)</a></p></li>
<li><p><a class="reference internal" href="#msmt17-msmt17" id="id5">MSMT17 (<code class="docutils literal notranslate"><span class="pre">msmt17</span></code>)</a></p></li>
<li><p><a class="reference internal" href="#viper-dagger-viper" id="id6">VIPeR <span class="math notranslate nohighlight">\(^\dagger\)</span> (<code class="docutils literal notranslate"><span class="pre">viper</span></code>)</a></p></li>
<li><p><a class="reference internal" href="#grid-dagger-grid" id="id7">GRID <span class="math notranslate nohighlight">\(^\dagger\)</span> (<code class="docutils literal notranslate"><span class="pre">grid</span></code>)</a></p></li>
<li><p><a class="reference internal" href="#cuhk01-cuhk01" id="id8">CUHK01 (<code class="docutils literal notranslate"><span class="pre">cuhk01</span></code>)</a></p></li>
<li><p><a class="reference internal" href="#sensereid-sensereid" id="id9">SenseReID (<code class="docutils literal notranslate"><span class="pre">sensereid</span></code>)</a></p></li>
<li><p><a class="reference internal" href="#qmul-ilids-dagger-ilids" id="id10">QMUL-iLIDS <span class="math notranslate nohighlight">\(^\dagger\)</span> (<code class="docutils literal notranslate"><span class="pre">ilids</span></code>)</a></p></li>
<li><p><a class="reference internal" href="#prid-prid" id="id11">PRID (<code class="docutils literal notranslate"><span class="pre">prid</span></code>)</a></p></li>
<li><a class="reference internal" href="#image-datasets" id="id1">Image Datasets</a><ul>
<li><a class="reference internal" href="#market1501-dagger-market1501" id="id2">Market1501 <span class="math notranslate nohighlight">\(^\dagger\)</span> (<code class="docutils literal notranslate"><span class="pre">market1501</span></code>)</a></li>
<li><a class="reference internal" href="#cuhk03-cuhk03" id="id3">CUHK03 (<code class="docutils literal notranslate"><span class="pre">cuhk03</span></code>)</a></li>
<li><a class="reference internal" href="#dukemtmc-reid-dagger-dukemtmcreid" id="id4">DukeMTMC-reID <span class="math notranslate nohighlight">\(^\dagger\)</span> (<code class="docutils literal notranslate"><span class="pre">dukemtmcreid</span></code>)</a></li>
<li><a class="reference internal" href="#msmt17-msmt17" id="id5">MSMT17 (<code class="docutils literal notranslate"><span class="pre">msmt17</span></code>)</a></li>
<li><a class="reference internal" href="#viper-dagger-viper" id="id6">VIPeR <span class="math notranslate nohighlight">\(^\dagger\)</span> (<code class="docutils literal notranslate"><span class="pre">viper</span></code>)</a></li>
<li><a class="reference internal" href="#grid-dagger-grid" id="id7">GRID <span class="math notranslate nohighlight">\(^\dagger\)</span> (<code class="docutils literal notranslate"><span class="pre">grid</span></code>)</a></li>
<li><a class="reference internal" href="#cuhk01-cuhk01" id="id8">CUHK01 (<code class="docutils literal notranslate"><span class="pre">cuhk01</span></code>)</a></li>
<li><a class="reference internal" href="#sensereid-sensereid" id="id9">SenseReID (<code class="docutils literal notranslate"><span class="pre">sensereid</span></code>)</a></li>
<li><a class="reference internal" href="#qmul-ilids-dagger-ilids" id="id10">QMUL-iLIDS <span class="math notranslate nohighlight">\(^\dagger\)</span> (<code class="docutils literal notranslate"><span class="pre">ilids</span></code>)</a></li>
<li><a class="reference internal" href="#prid-prid" id="id11">PRID (<code class="docutils literal notranslate"><span class="pre">prid</span></code>)</a></li>
</ul>
</li>
<li><p><a class="reference internal" href="#video-datasets" id="id12">Video Datasets</a></p>
<ul>
<li><p><a class="reference internal" href="#mars-mars" id="id13">MARS (<code class="docutils literal notranslate"><span class="pre">mars</span></code>)</a></p></li>
<li><p><a class="reference internal" href="#ilids-vid-dagger-ilidsvid" id="id14">iLIDS-VID <span class="math notranslate nohighlight">\(^\dagger\)</span> (<code class="docutils literal notranslate"><span class="pre">ilidsvid</span></code>)</a></p></li>
<li><p><a class="reference internal" href="#prid2011-prid2011" id="id15">PRID2011 (<code class="docutils literal notranslate"><span class="pre">prid2011</span></code>)</a></p></li>
<li><p><a class="reference internal" href="#dukemtmc-videoreid-dagger-dukemtmcvidreid" id="id16">DukeMTMC-VideoReID <span class="math notranslate nohighlight">\(^\dagger\)</span> (<code class="docutils literal notranslate"><span class="pre">dukemtmcvidreid</span></code>)</a></p></li>
<li><a class="reference internal" href="#video-datasets" id="id12">Video Datasets</a><ul>
<li><a class="reference internal" href="#mars-mars" id="id13">MARS (<code class="docutils literal notranslate"><span class="pre">mars</span></code>)</a></li>
<li><a class="reference internal" href="#ilids-vid-dagger-ilidsvid" id="id14">iLIDS-VID <span class="math notranslate nohighlight">\(^\dagger\)</span> (<code class="docutils literal notranslate"><span class="pre">ilidsvid</span></code>)</a></li>
<li><a class="reference internal" href="#prid2011-prid2011" id="id15">PRID2011 (<code class="docutils literal notranslate"><span class="pre">prid2011</span></code>)</a></li>
<li><a class="reference internal" href="#dukemtmc-videoreid-dagger-dukemtmcvidreid" id="id16">DukeMTMC-VideoReID <span class="math notranslate nohighlight">\(^\dagger\)</span> (<code class="docutils literal notranslate"><span class="pre">dukemtmcvidreid</span></code>)</a></li>
</ul>
</li>
</ul>
@ -238,9 +236,9 @@
<div class="section" id="market1501-dagger-market1501">
<h3><a class="toc-backref" href="#id2">Market1501 <span class="math notranslate nohighlight">\(^\dagger\)</span> (<code class="docutils literal notranslate"><span class="pre">market1501</span></code>)</a><a class="headerlink" href="#market1501-dagger-market1501" title="Permalink to this headline"></a></h3>
<ul class="simple">
<li><p>Create a directory named “market1501” under <code class="docutils literal notranslate"><span class="pre">$REID</span></code>.</p></li>
<li><p>Download the dataset to “market1501” from <a class="reference external" href="http://www.liangzheng.org/Project/project_reid.html">http://www.liangzheng.org/Project/project_reid.html</a> and extract the files.</p></li>
<li><p>The data structure should look like</p></li>
<li>Create a directory named “market1501” under <code class="docutils literal notranslate"><span class="pre">$REID</span></code>.</li>
<li>Download the dataset to “market1501” from <a class="reference external" href="http://www.liangzheng.org/Project/project_reid.html">http://www.liangzheng.org/Project/project_reid.html</a> and extract the files.</li>
<li>The data structure should look like</li>
</ul>
<div class="highlight-none notranslate"><div class="highlight"><pre><span></span>market1501/
Market-1501-v15.09.15/
@ -250,16 +248,16 @@
</pre></div>
</div>
<ul class="simple">
<li><p>To use the extra 500K distractors (i.e. Market1501 + 500K), go to the <strong>Market-1501+500k Dataset</strong> section at <a class="reference external" href="http://www.liangzheng.org/Project/project_reid.html">http://www.liangzheng.org/Project/project_reid.html</a>, download the zip file &#8220;distractors_500k.zip&#8221; and extract it under &#8220;market1501/Market-1501-v15.09.15&#8221;. The argument to use these 500K distractors is <code class="docutils literal notranslate"><span class="pre">market1501_500k</span></code> in <code class="docutils literal notranslate"><span class="pre">ImageDataManager</span></code>, as shown in the sketch below.</p></li>
<li>To use the extra 500K distractors (i.e. Market1501 + 500K), go to the <strong>Market-1501+500k Dataset</strong> section at <a class="reference external" href="http://www.liangzheng.org/Project/project_reid.html">http://www.liangzheng.org/Project/project_reid.html</a>, download the zip file &#8220;distractors_500k.zip&#8221; and extract it under &#8220;market1501/Market-1501-v15.09.15&#8221;. The argument to use these 500K distractors is <code class="docutils literal notranslate"><span class="pre">market1501_500k</span></code> in <code class="docutils literal notranslate"><span class="pre">ImageDataManager</span></code>, as shown in the sketch below.</li>
</ul>
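<p>A minimal sketch of enabling the distractors (the <code class="docutils literal notranslate"><span class="pre">root</span></code> path is illustrative):</p>
<div class="highlight-python notranslate"><div class="highlight"><pre><span></span>import torchreid

# Enable the 500K distractors for Market1501; other arguments keep their defaults
datamanager = torchreid.data.ImageDataManager(
    root='path/to/reid-data',
    sources='market1501',
    market1501_500k=True
)
</pre></div>
</div>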
</div>
<div class="section" id="cuhk03-cuhk03">
<h3><a class="toc-backref" href="#id3">CUHK03 (<code class="docutils literal notranslate"><span class="pre">cuhk03</span></code>)</a><a class="headerlink" href="#cuhk03-cuhk03" title="Permalink to this headline"></a></h3>
<ul class="simple">
<li><p>Create a folder named “cuhk03” under <code class="docutils literal notranslate"><span class="pre">$REID</span></code>.</p></li>
<li><p>Download the dataset to “cuhk03/” from <a class="reference external" href="http://www.ee.cuhk.edu.hk/~xgwang/CUHK_identification.html">http://www.ee.cuhk.edu.hk/~xgwang/CUHK_identification.html</a> and extract “cuhk03_release.zip”, resulting in “cuhk03/cuhk03_release/”.</p></li>
<li><p>Download the new split (767/700) from <a class="reference external" href="https://github.com/zhunzhong07/person-re-ranking/tree/master/evaluation/data/CUHK03">person-re-ranking</a>. What you need are “cuhk03_new_protocol_config_detected.mat” and “cuhk03_new_protocol_config_labeled.mat”. Put these two mat files under “cuhk03/”.</p></li>
<li><p>The data structure should look like</p></li>
<li>Create a folder named “cuhk03” under <code class="docutils literal notranslate"><span class="pre">$REID</span></code>.</li>
<li>Download the dataset to “cuhk03/” from <a class="reference external" href="http://www.ee.cuhk.edu.hk/~xgwang/CUHK_identification.html">http://www.ee.cuhk.edu.hk/~xgwang/CUHK_identification.html</a> and extract “cuhk03_release.zip”, resulting in “cuhk03/cuhk03_release/”.</li>
<li>Download the new split (767/700) from <a class="reference external" href="https://github.com/zhunzhong07/person-re-ranking/tree/master/evaluation/data/CUHK03">person-re-ranking</a>. What you need are “cuhk03_new_protocol_config_detected.mat” and “cuhk03_new_protocol_config_labeled.mat”. Put these two mat files under “cuhk03/”.</li>
<li>The data structure should look like</li>
</ul>
<div class="highlight-none notranslate"><div class="highlight"><pre><span></span>cuhk03/
cuhk03_release/
@ -268,19 +266,19 @@
</pre></div>
</div>
<ul class="simple">
<li><p>In the default mode, we load data using the new split (767/700). If you want to use the original 20 splits (1367/100), please set <code class="docutils literal notranslate"><span class="pre">cuhk03_classic_split</span></code> to True in <code class="docutils literal notranslate"><span class="pre">ImageDataManager</span></code>. As the CMC is computed differently from Market1501 for the 1367/100 split (see <a class="reference external" href="http://www.ee.cuhk.edu.hk/~xgwang/CUHK_identification.html">here</a>), you need to enable <code class="docutils literal notranslate"><span class="pre">use_metric_cuhk03</span></code> in <code class="docutils literal notranslate"><span class="pre">ImageDataManager</span></code> to activate the <em>single-gallery-shot</em> metric for fair comparison with methods that adopt the old splits (in which case mAP need not be reported). In addition, we support both <em>labeled</em> and <em>detected</em> modes. The default mode loads <em>detected</em> images; enable <code class="docutils literal notranslate"><span class="pre">cuhk03_labeled</span></code> in <code class="docutils literal notranslate"><span class="pre">ImageDataManager</span></code> if you want to train and test on <em>labeled</em> images (see the sketch after this list).</p></li>
<li>In the default mode, we load data using the new split (767/700). If you want to use the original 20 splits (1367/100), please set <code class="docutils literal notranslate"><span class="pre">cuhk03_classic_split</span></code> to True in <code class="docutils literal notranslate"><span class="pre">ImageDataManager</span></code>. As the CMC is computed differently from Market1501 for the 1367/100 split (see <a class="reference external" href="http://www.ee.cuhk.edu.hk/~xgwang/CUHK_identification.html">here</a>), you need to enable <code class="docutils literal notranslate"><span class="pre">use_metric_cuhk03</span></code> in <code class="docutils literal notranslate"><span class="pre">ImageDataManager</span></code> to activate the <em>single-gallery-shot</em> metric for fair comparison with methods that adopt the old splits (in which case mAP need not be reported). In addition, we support both <em>labeled</em> and <em>detected</em> modes. The default mode loads <em>detected</em> images; enable <code class="docutils literal notranslate"><span class="pre">cuhk03_labeled</span></code> in <code class="docutils literal notranslate"><span class="pre">ImageDataManager</span></code> if you want to train and test on <em>labeled</em> images (see the sketch after this list).</li>
</ul>
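<p>Put together, a sketch of the classic-split configuration described above (all flags belong to <code class="docutils literal notranslate"><span class="pre">ImageDataManager</span></code>; the <code class="docutils literal notranslate"><span class="pre">root</span></code> path is illustrative):</p>
<div class="highlight-python notranslate"><div class="highlight"><pre><span></span>import torchreid

# CUHK03 classic-split setting: 1367/100 split,
# single-gallery-shot CMC, and labeled (rather than detected) images
datamanager = torchreid.data.ImageDataManager(
    root='path/to/reid-data',
    sources='cuhk03',
    cuhk03_classic_split=True,
    use_metric_cuhk03=True,
    cuhk03_labeled=True
)
</pre></div>
</div>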
<div class="admonition note">
<p class="admonition-title">Note</p>
<p>The code will extract the images in &#8220;cuhk-03.mat&#8221; and save them under &#8220;cuhk03/images_detected&#8221; and &#8220;cuhk03/images_labeled&#8221;. Four json files will also be generated automatically, i.e. &#8220;splits_classic_detected.json&#8221;, &#8220;splits_classic_labeled.json&#8221;, &#8220;splits_new_detected.json&#8221; and &#8220;splits_new_labeled.json&#8221;. If the parent path of <code class="docutils literal notranslate"><span class="pre">$REID</span></code> is changed, these json files should be deleted manually; the code will then regenerate them to match the new path.</p>
<p class="first admonition-title">Note</p>
<p class="last">The code will extract images in “cuhk-03.mat” and save them under “cuhk03/images_detected” and “cuhk03/images_labeled”. Also, four json files will be automatically generated, i.e. “splits_classic_detected.json”, “splits_classic_labeled.json”, “splits_new_detected.json” and “splits_new_labeled.json”. If the parent path of <code class="docutils literal notranslate"><span class="pre">$REID</span></code> is changed, these json files should be manually deleted. The code can automatically generate new json files to match the new path.</p>
</div>
</div>
<div class="section" id="dukemtmc-reid-dagger-dukemtmcreid">
<h3><a class="toc-backref" href="#id4">DukeMTMC-reID <span class="math notranslate nohighlight">\(^\dagger\)</span> (<code class="docutils literal notranslate"><span class="pre">dukemtmcreid</span></code>)</a><a class="headerlink" href="#dukemtmc-reid-dagger-dukemtmcreid" title="Permalink to this headline"></a></h3>
<ul class="simple">
<li><p>Create a directory called “dukemtmc-reid” under <code class="docutils literal notranslate"><span class="pre">$REID</span></code>.</p></li>
<li><p>Download “DukeMTMC-reID” from <a class="reference external" href="http://vision.cs.duke.edu/DukeMTMC/">http://vision.cs.duke.edu/DukeMTMC/</a> and extract it under “dukemtmc-reid”.</p></li>
<li><p>The data structure should look like</p></li>
<li>Create a directory called “dukemtmc-reid” under <code class="docutils literal notranslate"><span class="pre">$REID</span></code>.</li>
<li>Download “DukeMTMC-reID” from <a class="reference external" href="http://vision.cs.duke.edu/DukeMTMC/">http://vision.cs.duke.edu/DukeMTMC/</a> and extract it under “dukemtmc-reid”.</li>
<li>The data structure should look like</li>
</ul>
<div class="highlight-none notranslate"><div class="highlight"><pre><span></span>dukemtmc-reid/
DukeMTMC-reID/
@ -294,9 +292,9 @@
<div class="section" id="msmt17-msmt17">
<h3><a class="toc-backref" href="#id5">MSMT17 (<code class="docutils literal notranslate"><span class="pre">msmt17</span></code>)</a><a class="headerlink" href="#msmt17-msmt17" title="Permalink to this headline"></a></h3>
<ul class="simple">
<li><p>Create a directory called “msmt17” under <code class="docutils literal notranslate"><span class="pre">$REID</span></code>.</p></li>
<li><p>Download the dataset from <a class="reference external" href="http://www.pkuvmc.com/publications/msmt17.html">http://www.pkuvmc.com/publications/msmt17.html</a> to “msmt17” and extract the files.</p></li>
<li><p>The data structure should look like</p></li>
<li>Create a directory called “msmt17” under <code class="docutils literal notranslate"><span class="pre">$REID</span></code>.</li>
<li>Download the dataset from <a class="reference external" href="http://www.pkuvmc.com/publications/msmt17.html">http://www.pkuvmc.com/publications/msmt17.html</a> to “msmt17” and extract the files.</li>
<li>The data structure should look like</li>
</ul>
<div class="highlight-none notranslate"><div class="highlight"><pre><span></span>msmt17/
MSMT17_V1/ # or MSMT17_V2
@ -312,8 +310,8 @@
<div class="section" id="viper-dagger-viper">
<h3><a class="toc-backref" href="#id6">VIPeR <span class="math notranslate nohighlight">\(^\dagger\)</span> (<code class="docutils literal notranslate"><span class="pre">viper</span></code>)</a><a class="headerlink" href="#viper-dagger-viper" title="Permalink to this headline"></a></h3>
<ul class="simple">
<li><p>The download link is <a class="reference external" href="http://users.soe.ucsc.edu/~manduchi/VIPeR.v1.0.zip">http://users.soe.ucsc.edu/~manduchi/VIPeR.v1.0.zip</a>.</p></li>
<li><p>Organize the dataset in a folder named “viper” as follows</p></li>
<li>The download link is <a class="reference external" href="http://users.soe.ucsc.edu/~manduchi/VIPeR.v1.0.zip">http://users.soe.ucsc.edu/~manduchi/VIPeR.v1.0.zip</a>.</li>
<li>Organize the dataset in a folder named “viper” as follows</li>
</ul>
<div class="highlight-none notranslate"><div class="highlight"><pre><span></span>viper/
VIPeR/
@ -325,8 +323,8 @@
<div class="section" id="grid-dagger-grid">
<h3><a class="toc-backref" href="#id7">GRID <span class="math notranslate nohighlight">\(^\dagger\)</span> (<code class="docutils literal notranslate"><span class="pre">grid</span></code>)</a><a class="headerlink" href="#grid-dagger-grid" title="Permalink to this headline"></a></h3>
<ul class="simple">
<li><p>The download link is <a class="reference external" href="http://personal.ie.cuhk.edu.hk/~ccloy/files/datasets/underground_reid.zip">http://personal.ie.cuhk.edu.hk/~ccloy/files/datasets/underground_reid.zip</a>.</p></li>
<li><p>Organize the dataset in a folder named “grid” as follows</p></li>
<li>The download link is <a class="reference external" href="http://personal.ie.cuhk.edu.hk/~ccloy/files/datasets/underground_reid.zip">http://personal.ie.cuhk.edu.hk/~ccloy/files/datasets/underground_reid.zip</a>.</li>
<li>Organize the dataset in a folder named “grid” as follows</li>
</ul>
<div class="highlight-none notranslate"><div class="highlight"><pre><span></span>grid/
underground_reid/
@ -339,10 +337,10 @@
<div class="section" id="cuhk01-cuhk01">
<h3><a class="toc-backref" href="#id8">CUHK01 (<code class="docutils literal notranslate"><span class="pre">cuhk01</span></code>)</a><a class="headerlink" href="#cuhk01-cuhk01" title="Permalink to this headline"></a></h3>
<ul class="simple">
<li><p>Create a folder named “cuhk01” under <code class="docutils literal notranslate"><span class="pre">$REID</span></code>.</p></li>
<li><p>Download “CUHK01.zip” from <a class="reference external" href="http://www.ee.cuhk.edu.hk/~xgwang/CUHK_identification.html">http://www.ee.cuhk.edu.hk/~xgwang/CUHK_identification.html</a> and place it under “cuhk01/”.</p></li>
<li><p>The code can automatically extract the files, or you can do it yourself.</p></li>
<li><p>The data structure should look like</p></li>
<li>Create a folder named “cuhk01” under <code class="docutils literal notranslate"><span class="pre">$REID</span></code>.</li>
<li>Download “CUHK01.zip” from <a class="reference external" href="http://www.ee.cuhk.edu.hk/~xgwang/CUHK_identification.html">http://www.ee.cuhk.edu.hk/~xgwang/CUHK_identification.html</a> and place it under “cuhk01/”.</li>
<li>The code can automatically extract the files, or you can do it yourself.</li>
<li>The data structure should look like</li>
</ul>
<div class="highlight-none notranslate"><div class="highlight"><pre><span></span>cuhk01/
campus/
@ -352,9 +350,9 @@
<div class="section" id="sensereid-sensereid">
<h3><a class="toc-backref" href="#id9">SenseReID (<code class="docutils literal notranslate"><span class="pre">sensereid</span></code>)</a><a class="headerlink" href="#sensereid-sensereid" title="Permalink to this headline"></a></h3>
<ul class="simple">
<li><p>Create “sensereid” under <code class="docutils literal notranslate"><span class="pre">$REID</span></code>.</p></li>
<li><p>Download the dataset from this <a class="reference external" href="https://drive.google.com/file/d/0B56OfSrVI8hubVJLTzkwV2VaOWM/view">link</a> and extract it to “sensereid”.</p></li>
<li><p>Organize the data to be like</p></li>
<li>Create “sensereid” under <code class="docutils literal notranslate"><span class="pre">$REID</span></code>.</li>
<li>Download the dataset from this <a class="reference external" href="https://drive.google.com/file/d/0B56OfSrVI8hubVJLTzkwV2VaOWM/view">link</a> and extract it to “sensereid”.</li>
<li>Organize the data to be like</li>
</ul>
<div class="highlight-none notranslate"><div class="highlight"><pre><span></span>sensereid/
SenseReID/
@ -366,8 +364,8 @@
<div class="section" id="qmul-ilids-dagger-ilids">
<h3><a class="toc-backref" href="#id10">QMUL-iLIDS <span class="math notranslate nohighlight">\(^\dagger\)</span> (<code class="docutils literal notranslate"><span class="pre">ilids</span></code>)</a><a class="headerlink" href="#qmul-ilids-dagger-ilids" title="Permalink to this headline"></a></h3>
<ul class="simple">
<li><p>Create a folder named “ilids” under <code class="docutils literal notranslate"><span class="pre">$REID</span></code>.</p></li>
<li><p>Download the dataset from <a class="reference external" href="http://www.eecs.qmul.ac.uk/~jason/data/i-LIDS_Pedestrian.tgz">http://www.eecs.qmul.ac.uk/~jason/data/i-LIDS_Pedestrian.tgz</a> and organize it to look like</p></li>
<li>Create a folder named “ilids” under <code class="docutils literal notranslate"><span class="pre">$REID</span></code>.</li>
<li>Download the dataset from <a class="reference external" href="http://www.eecs.qmul.ac.uk/~jason/data/i-LIDS_Pedestrian.tgz">http://www.eecs.qmul.ac.uk/~jason/data/i-LIDS_Pedestrian.tgz</a> and organize it to look like</li>
</ul>
<div class="highlight-none notranslate"><div class="highlight"><pre><span></span>ilids/
i-LIDS_Pedestrian/
@ -378,9 +376,9 @@
<div class="section" id="prid-prid">
<h3><a class="toc-backref" href="#id11">PRID (<code class="docutils literal notranslate"><span class="pre">prid</span></code>)</a><a class="headerlink" href="#prid-prid" title="Permalink to this headline"></a></h3>
<ul class="simple">
<li><p>Create a directory named “prid2011” under <code class="docutils literal notranslate"><span class="pre">$REID</span></code>.</p></li>
<li><p>Download the dataset from <a class="reference external" href="https://www.tugraz.at/institute/icg/research/team-bischof/lrs/downloads/PRID11/">https://www.tugraz.at/institute/icg/research/team-bischof/lrs/downloads/PRID11/</a> and extract it under “prid2011”.</p></li>
<li><p>The data structure should end up with</p></li>
<li>Create a directory named “prid2011” under <code class="docutils literal notranslate"><span class="pre">$REID</span></code>.</li>
<li>Download the dataset from <a class="reference external" href="https://www.tugraz.at/institute/icg/research/team-bischof/lrs/downloads/PRID11/">https://www.tugraz.at/institute/icg/research/team-bischof/lrs/downloads/PRID11/</a> and extract it under “prid2011”.</li>
<li>The data structure should end up with</li>
</ul>
<div class="highlight-none notranslate"><div class="highlight"><pre><span></span>prid2011/
prid_2011/
@ -395,11 +393,11 @@
<div class="section" id="mars-mars">
<h3><a class="toc-backref" href="#id13">MARS (<code class="docutils literal notranslate"><span class="pre">mars</span></code>)</a><a class="headerlink" href="#mars-mars" title="Permalink to this headline"></a></h3>
<ul class="simple">
<li><p>Create “mars/” under <code class="docutils literal notranslate"><span class="pre">$REID</span></code>.</p></li>
<li><p>Download the dataset from <a class="reference external" href="http://www.liangzheng.com.cn/Project/project_mars.html">http://www.liangzheng.com.cn/Project/project_mars.html</a> and place it in “mars/”.</p></li>
<li><p>Extract “bbox_train.zip” and “bbox_test.zip”.</p></li>
<li><p>Download the split metadata from <a class="reference external" href="https://github.com/liangzheng06/MARS-evaluation/tree/master/info">https://github.com/liangzheng06/MARS-evaluation/tree/master/info</a> and put “info/” in “mars/”.</p></li>
<li><p>The data structure should end up with</p></li>
<li>Create “mars/” under <code class="docutils literal notranslate"><span class="pre">$REID</span></code>.</li>
<li>Download the dataset from <a class="reference external" href="http://www.liangzheng.com.cn/Project/project_mars.html">http://www.liangzheng.com.cn/Project/project_mars.html</a> and place it in “mars/”.</li>
<li>Extract “bbox_train.zip” and “bbox_test.zip”.</li>
<li>Download the split metadata from <a class="reference external" href="https://github.com/liangzheng06/MARS-evaluation/tree/master/info">https://github.com/liangzheng06/MARS-evaluation/tree/master/info</a> and put “info/” in “mars/”.</li>
<li>The data structure should end up with</li>
</ul>
<div class="highlight-none notranslate"><div class="highlight"><pre><span></span>mars/
bbox_test/
@ -411,9 +409,9 @@
<div class="section" id="ilids-vid-dagger-ilidsvid">
<h3><a class="toc-backref" href="#id14">iLIDS-VID <span class="math notranslate nohighlight">\(^\dagger\)</span> (<code class="docutils literal notranslate"><span class="pre">ilidsvid</span></code>)</a><a class="headerlink" href="#ilids-vid-dagger-ilidsvid" title="Permalink to this headline"></a></h3>
<ul class="simple">
<li><p>Create “ilids-vid” under <code class="docutils literal notranslate"><span class="pre">$REID</span></code>.</p></li>
<li><p>Download the dataset from <a class="reference external" href="http://www.eecs.qmul.ac.uk/~xiatian/downloads_qmul_iLIDS-VID_ReID_dataset.html">http://www.eecs.qmul.ac.uk/~xiatian/downloads_qmul_iLIDS-VID_ReID_dataset.html</a> to “ilids-vid”.</p></li>
<li><p>Organize the data structure to match</p></li>
<li>Create “ilids-vid” under <code class="docutils literal notranslate"><span class="pre">$REID</span></code>.</li>
<li>Download the dataset from <a class="reference external" href="http://www.eecs.qmul.ac.uk/~xiatian/downloads_qmul_iLIDS-VID_ReID_dataset.html">http://www.eecs.qmul.ac.uk/~xiatian/downloads_qmul_iLIDS-VID_ReID_dataset.html</a> to “ilids-vid”.</li>
<li>Organize the data structure to match</li>
</ul>
<div class="highlight-none notranslate"><div class="highlight"><pre><span></span>ilids-vid/
i-LIDS-VID/
@ -424,10 +422,10 @@
<div class="section" id="prid2011-prid2011">
<h3><a class="toc-backref" href="#id15">PRID2011 (<code class="docutils literal notranslate"><span class="pre">prid2011</span></code>)</a><a class="headerlink" href="#prid2011-prid2011" title="Permalink to this headline"></a></h3>
<ul class="simple">
<li><p>Create a directory named “prid2011” under <code class="docutils literal notranslate"><span class="pre">$REID</span></code>.</p></li>
<li><p>Download the dataset from <a class="reference external" href="https://www.tugraz.at/institute/icg/research/team-bischof/lrs/downloads/PRID11/">https://www.tugraz.at/institute/icg/research/team-bischof/lrs/downloads/PRID11/</a> and extract it under “prid2011”.</p></li>
<li><p>Download the split created by <em>iLIDS-VID</em> from <a class="reference external" href="http://www.eecs.qmul.ac.uk/~kz303/deep-person-reid/datasets/prid2011/splits_prid2011.json">here</a> and put it under &#8220;prid2011/&#8221;. Following the standard protocol, only the 178 identities whose sequences are longer than a threshold are used.</p></li>
<li><p>The data structure should end up with</p></li>
<li>Create a directory named “prid2011” under <code class="docutils literal notranslate"><span class="pre">$REID</span></code>.</li>
<li>Download the dataset from <a class="reference external" href="https://www.tugraz.at/institute/icg/research/team-bischof/lrs/downloads/PRID11/">https://www.tugraz.at/institute/icg/research/team-bischof/lrs/downloads/PRID11/</a> and extract it under “prid2011”.</li>
<li>Download the split created by <em>iLIDS-VID</em> from <a class="reference external" href="http://www.eecs.qmul.ac.uk/~kz303/deep-person-reid/datasets/prid2011/splits_prid2011.json">here</a> and put it under &#8220;prid2011/&#8221;. Following the standard protocol, only the 178 identities whose sequences are longer than a threshold are used.</li>
<li>The data structure should end up with</li>
</ul>
<div class="highlight-none notranslate"><div class="highlight"><pre><span></span>prid2011/
splits_prid2011.json
@ -440,9 +438,9 @@
<div class="section" id="dukemtmc-videoreid-dagger-dukemtmcvidreid">
<h3><a class="toc-backref" href="#id16">DukeMTMC-VideoReID <span class="math notranslate nohighlight">\(^\dagger\)</span> (<code class="docutils literal notranslate"><span class="pre">dukemtmcvidreid</span></code>)</a><a class="headerlink" href="#dukemtmc-videoreid-dagger-dukemtmcvidreid" title="Permalink to this headline"></a></h3>
<ul class="simple">
<li><p>Create “dukemtmc-vidreid” under <code class="docutils literal notranslate"><span class="pre">$REID</span></code>.</p></li>
<li><p>Download “DukeMTMC-VideoReID” from <a class="reference external" href="http://vision.cs.duke.edu/DukeMTMC/">http://vision.cs.duke.edu/DukeMTMC/</a> and unzip the file to “dukemtmc-vidreid/”.</p></li>
<li><p>The data structure should look like</p></li>
<li>Create “dukemtmc-vidreid” under <code class="docutils literal notranslate"><span class="pre">$REID</span></code>.</li>
<li>Download “DukeMTMC-VideoReID” from <a class="reference external" href="http://vision.cs.duke.edu/DukeMTMC/">http://vision.cs.duke.edu/DukeMTMC/</a> and unzip the file to “dukemtmc-vidreid/”.</li>
<li>The data structure should look like</li>
</ul>
<div class="highlight-none notranslate"><div class="highlight"><pre><span></span>dukemtmc-vidreid/
DukeMTMC-VideoReID/

View File

@ -177,23 +177,23 @@
<div class="section" id="image-reid">
<h2>Image ReID<a class="headerlink" href="#image-reid" title="Permalink to this headline"></a></h2>
<ul class="simple">
<li><p><strong>Market1501</strong>, <strong>DukeMTMC-reID</strong>, <strong>CUHK03 (767/700 split)</strong> and <strong>MSMT17</strong> have a fixed split, so keeping <code class="docutils literal notranslate"><span class="pre">split_id=0</span></code> is fine.</p></li>
<li><p><strong>CUHK03 (classic split)</strong> has 20 fixed splits, so vary <code class="docutils literal notranslate"><span class="pre">split_id</span></code> from 0 to 19.</p></li>
<li><p><strong>VIPeR</strong> contains 632 identities, each with 2 images captured under two camera views. Evaluation should be done over 10 random splits. Each split randomly divides the 632 identities into 316 training identities (632 images) and 316 test identities (632 images). Note that each random split contains two sub-splits: one uses camera-A as query and camera-B as gallery, while the other uses camera-B as query and camera-A as gallery. Thus, 20 splits in total are generated, with <code class="docutils literal notranslate"><span class="pre">split_id</span></code> ranging from 0 to 19. Models can be trained on <code class="docutils literal notranslate"><span class="pre">split_id=[0,2,4,6,8,10,12,14,16,18]</span></code> (because <code class="docutils literal notranslate"><span class="pre">split_id=0</span></code> and <code class="docutils literal notranslate"><span class="pre">split_id=1</span></code> share the same training set, and so on). At test time, models trained on <code class="docutils literal notranslate"><span class="pre">split_id=0</span></code> can be directly evaluated on <code class="docutils literal notranslate"><span class="pre">split_id=1</span></code>, models trained on <code class="docutils literal notranslate"><span class="pre">split_id=2</span></code> on <code class="docutils literal notranslate"><span class="pre">split_id=3</span></code>, and so on.</p></li>
<li><p><strong>CUHK01</strong> is similar to VIPeR in the split generation.</p></li>
<li><p><strong>GRID</strong>, <strong>iLIDS</strong> and <strong>PRID</strong> have 10 random splits, so evaluation should be done by varying <code class="docutils literal notranslate"><span class="pre">split_id</span></code> from 0 to 9 (see the sketch after this list).</p></li>
<li><p><strong>SenseReID</strong> has no training images and is used for evaluation only.</p></li>
<li><strong>Market1501</strong>, <strong>DukeMTMC-reID</strong>, <strong>CUHK03 (767/700 split)</strong> and <strong>MSMT17</strong> have a fixed split, so keeping <code class="docutils literal notranslate"><span class="pre">split_id=0</span></code> is fine.</li>
<li><strong>CUHK03 (classic split)</strong> has 20 fixed splits, so vary <code class="docutils literal notranslate"><span class="pre">split_id</span></code> from 0 to 19.</li>
<li><strong>VIPeR</strong> contains 632 identities, each with 2 images captured under two camera views. Evaluation should be done over 10 random splits. Each split randomly divides the 632 identities into 316 training identities (632 images) and 316 test identities (632 images). Note that each random split contains two sub-splits: one uses camera-A as query and camera-B as gallery, while the other uses camera-B as query and camera-A as gallery. Thus, 20 splits in total are generated, with <code class="docutils literal notranslate"><span class="pre">split_id</span></code> ranging from 0 to 19. Models can be trained on <code class="docutils literal notranslate"><span class="pre">split_id=[0,2,4,6,8,10,12,14,16,18]</span></code> (because <code class="docutils literal notranslate"><span class="pre">split_id=0</span></code> and <code class="docutils literal notranslate"><span class="pre">split_id=1</span></code> share the same training set, and so on). At test time, models trained on <code class="docutils literal notranslate"><span class="pre">split_id=0</span></code> can be directly evaluated on <code class="docutils literal notranslate"><span class="pre">split_id=1</span></code>, models trained on <code class="docutils literal notranslate"><span class="pre">split_id=2</span></code> on <code class="docutils literal notranslate"><span class="pre">split_id=3</span></code>, and so on.</li>
<li><strong>CUHK01</strong> is similar to VIPeR in the split generation.</li>
<li><strong>GRID</strong>, <strong>iLIDS</strong> and <strong>PRID</strong> have 10 random splits, so evaluation should be done by varying <code class="docutils literal notranslate"><span class="pre">split_id</span></code> from 0 to 9 (see the sketch after this list).</li>
<li><strong>SenseReID</strong> has no training images and is used for evaluation only.</li>
</ul>
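<p>A sketch of the evaluation loop over random splits, using GRID for illustration; the same pattern applies to iLIDS and PRID, and to VIPeR and CUHK01 with the even/odd pairing described above (paths and dataset key are illustrative):</p>
<div class="highlight-python notranslate"><div class="highlight"><pre><span></span>import torchreid

# Evaluate over all 10 random splits and average the results
for split_id in range(10):
    datamanager = torchreid.data.ImageDataManager(
        root='path/to/reid-data',
        sources='grid',
        split_id=split_id
    )
    # ... build the model and engine, then train/test on this split
</pre></div>
</div>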
<div class="admonition note">
<p class="admonition-title">Note</p>
<p>The <code class="docutils literal notranslate"><span class="pre">split_id</span></code> argument is defined in <code class="docutils literal notranslate"><span class="pre">ImageDataManager</span></code> and <code class="docutils literal notranslate"><span class="pre">VideoDataManager</span></code>. Please refer to <a class="reference internal" href="pkg/data.html#torchreid-data"><span class="std std-ref">torchreid.data</span></a>.</p>
<p class="first admonition-title">Note</p>
<p class="last">The <code class="docutils literal notranslate"><span class="pre">split_id</span></code> argument is defined in <code class="docutils literal notranslate"><span class="pre">ImageDataManager</span></code> and <code class="docutils literal notranslate"><span class="pre">VideoDataManager</span></code>. Please refer to <a class="reference internal" href="pkg/data.html#torchreid-data"><span class="std std-ref">torchreid.data</span></a>.</p>
</div>
</div>
<div class="section" id="video-reid">
<h2>Video ReID<a class="headerlink" href="#video-reid" title="Permalink to this headline"></a></h2>
<ul class="simple">
<li><p><strong>MARS</strong> and <strong>DukeMTMC-VideoReID</strong> have a single fixed split, so using <code class="docutils literal notranslate"><span class="pre">split_id=0</span></code> is fine.</p></li>
<li><p><strong>iLIDS-VID</strong> and <strong>PRID2011</strong> have 10 predefined splits, so evaluation should be done by varying <code class="docutils literal notranslate"><span class="pre">split_id</span></code> from 0 to 9 (see the sketch after this list).</p></li>
<li><strong>MARS</strong> and <strong>DukeMTMC-VideoReID</strong> have a single fixed split, so using <code class="docutils literal notranslate"><span class="pre">split_id=0</span></code> is fine.</li>
<li><strong>iLIDS-VID</strong> and <strong>PRID2011</strong> have 10 predefined splits, so evaluation should be done by varying <code class="docutils literal notranslate"><span class="pre">split_id</span></code> from 0 to 9 (see the sketch after this list).</li>
</ul>
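<p>A sketch of the same loop for video datasets with 10 predefined splits, using iLIDS-VID for illustration (paths are illustrative):</p>
<div class="highlight-python notranslate"><div class="highlight"><pre><span></span>import torchreid

# iLIDS-VID / PRID2011: vary split_id from 0 to 9
for split_id in range(10):
    datamanager = torchreid.data.VideoDataManager(
        root='path/to/reid-data',
        sources='ilidsvid',
        split_id=split_id
    )
    # ... train/test on this split
</pre></div>
</div>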
</div>
</div>

View File

@ -398,9 +398,9 @@
</li>
</ul></td>
<td style="width: 33%; vertical-align: top;"><ul>
<li><a href="pkg/data.html#torchreid.data.datamanager.DataManager.num_train_cams">num_train_cams() (torchreid.data.datamanager.DataManager property)</a>
<li><a href="pkg/data.html#torchreid.data.datamanager.DataManager.num_train_cams">num_train_cams (torchreid.data.datamanager.DataManager attribute)</a>
</li>
<li><a href="pkg/data.html#torchreid.data.datamanager.DataManager.num_train_pids">num_train_pids() (torchreid.data.datamanager.DataManager property)</a>
<li><a href="pkg/data.html#torchreid.data.datamanager.DataManager.num_train_pids">num_train_pids (torchreid.data.datamanager.DataManager attribute)</a>
</li>
</ul></td>
</tr></table>

View File

@ -172,18 +172,18 @@
<p>Torchreid is a library built on <a class="reference external" href="https://pytorch.org/">PyTorch</a> for deep-learning person re-identification.</p>
<p>It features:</p>
<ul class="simple">
<li><p>multi-GPU training</p></li>
<li><p>support for both image- and video-reid</p></li>
<li><p>end-to-end training and evaluation</p></li>
<li><p>incredibly easy preparation of reid datasets</p></li>
<li><p>multi-dataset training</p></li>
<li><p>cross-dataset evaluation</p></li>
<li><p>standard protocol used by most research papers</p></li>
<li><p>highly extensible (easy to add models, datasets, training methods, etc.)</p></li>
<li><p>implementations of state-of-the-art deep reid models</p></li>
<li><p>access to pretrained reid models</p></li>
<li><p>advanced training techniques</p></li>
<li><p>visualization tools (tensorboard, ranks, etc.)</p></li>
<li>multi-GPU training</li>
<li>support for both image- and video-reid</li>
<li>end-to-end training and evaluation</li>
<li>incredibly easy preparation of reid datasets</li>
<li>multi-dataset training</li>
<li>cross-dataset evaluation</li>
<li>standard protocol used by most research papers</li>
<li>highly extensible (easy to add models, datasets, training methods, etc.)</li>
<li>implementations of state-of-the-art deep reid models</li>
<li>access to pretrained reid models</li>
<li>advanced training techniques</li>
<li>visualization tools (tensorboard, ranks, etc.)</li>
</ul>
<p>Documentation: <a class="reference external" href="https://kaiyangzhou.github.io/deep-person-reid/">https://kaiyangzhou.github.io/deep-person-reid/</a>.</p>
<p>Code: <a class="reference external" href="https://github.com/KaiyangZhou/deep-person-reid">https://github.com/KaiyangZhou/deep-person-reid</a>.</p>
@ -192,13 +192,13 @@
<h2>Installation<a class="headerlink" href="#installation" title="Permalink to this headline"></a></h2>
<p>We recommend using <a class="reference external" href="https://www.anaconda.com/distribution/">conda</a> to manage the packages.</p>
<ol class="arabic simple">
<li><p>Clone <code class="docutils literal notranslate"><span class="pre">deep-person-reid</span></code> to your preferred directory.</p></li>
<li>Clone <code class="docutils literal notranslate"><span class="pre">deep-person-reid</span></code> to your preferred directory.</li>
</ol>
<div class="highlight-bash notranslate"><div class="highlight"><pre><span></span>$ git clone https://github.com/KaiyangZhou/deep-person-reid.git
</pre></div>
</div>
<ol class="arabic simple" start="2">
<li><p>Create a conda environment (the default name is <code class="docutils literal notranslate"><span class="pre">torchreid</span></code>).</p></li>
<li>Create a conda environment (the default name is <code class="docutils literal notranslate"><span class="pre">torchreid</span></code>).</li>
</ol>
<div class="highlight-bash notranslate"><div class="highlight"><pre><span></span>$ <span class="nb">cd</span> deep-person-reid/
$ conda env create -f environment.yml
@ -207,19 +207,19 @@ $ conda activate torchreid
</div>
<p>Do check whether <code class="docutils literal notranslate"><span class="pre">which</span> <span class="pre">python</span></code> and <code class="docutils literal notranslate"><span class="pre">which</span> <span class="pre">pip</span></code> point to the right path.</p>
<ol class="arabic simple" start="3">
<li><p>Install tensorboard.</p></li>
<li>Install tensorboard.</li>
</ol>
<div class="highlight-bash notranslate"><div class="highlight"><pre><span></span>$ pip install tb-nightly
</pre></div>
</div>
<ol class="arabic simple" start="4">
<li><p>Install PyTorch and torchvision (select the proper CUDA version for your machine).</p></li>
<li>Install PyTorch and torchvision (select the proper CUDA version for your machine).</li>
</ol>
<div class="highlight-bash notranslate"><div class="highlight"><pre><span></span>$ conda install pytorch torchvision <span class="nv">cudatoolkit</span><span class="o">=</span><span class="m">9</span>.0 -c pytorch
</pre></div>
</div>
<ol class="arabic simple" start="5">
<li><p>Install <code class="docutils literal notranslate"><span class="pre">torchreid</span></code>.</p></li>
<li>Install <code class="docutils literal notranslate"><span class="pre">torchreid</span></code>.</li>
</ol>
<div class="highlight-bash notranslate"><div class="highlight"><pre><span></span>$ python setup.py develop
</pre></div>
@ -228,13 +228,13 @@ $ conda activate torchreid
<div class="section" id="get-started-30-seconds-to-torchreid">
<h2>Get started: 30 seconds to Torchreid<a class="headerlink" href="#get-started-30-seconds-to-torchreid" title="Permalink to this headline"></a></h2>
<ol class="arabic simple">
<li><p>Import <code class="docutils literal notranslate"><span class="pre">torchreid</span></code></p></li>
<li>Import <code class="docutils literal notranslate"><span class="pre">torchreid</span></code></li>
</ol>
<div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="kn">import</span> <span class="nn">torchreid</span>
</pre></div>
</div>
<ol class="arabic simple" start="2">
<li><p>Load data manager</p></li>
<li>Load data manager</li>
</ol>
<div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">datamanager</span> <span class="o">=</span> <span class="n">torchreid</span><span class="o">.</span><span class="n">data</span><span class="o">.</span><span class="n">ImageDataManager</span><span class="p">(</span>
<span class="n">root</span><span class="o">=</span><span class="s1">&#39;reid-data&#39;</span><span class="p">,</span>
@ -270,7 +270,7 @@ $ conda activate torchreid
</pre></div>
</div>
<ol class="arabic simple" start="4">
<li><p>Build engine</p></li>
<li>Build engine</li>
</ol>
<div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">engine</span> <span class="o">=</span> <span class="n">torchreid</span><span class="o">.</span><span class="n">engine</span><span class="o">.</span><span class="n">ImageSoftmaxEngine</span><span class="p">(</span>
<span class="n">datamanager</span><span class="p">,</span>
@ -282,7 +282,7 @@ $ conda activate torchreid
</pre></div>
</div>
<ol class="arabic simple" start="5">
<li><p>Run training and test</p></li>
<li>Run training and test</li>
</ol>
<div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">engine</span><span class="o">.</span><span class="n">run</span><span class="p">(</span>
<span class="n">save_dir</span><span class="o">=</span><span class="s1">&#39;log/resnet50&#39;</span><span class="p">,</span>
@ -321,25 +321,25 @@ $ conda activate torchreid
<div class="section" id="image-reid-datasets">
<h3>Image-reid datasets<a class="headerlink" href="#image-reid-datasets" title="Permalink to this headline"></a></h3>
<ul class="simple">
<li><p><a class="reference external" href="https://www.cv-foundation.org/openaccess/content_iccv_2015/papers/Zheng_Scalable_Person_Re-Identification_ICCV_2015_paper.pdf">Market1501</a></p></li>
<li><p><a class="reference external" href="https://www.cv-foundation.org/openaccess/content_cvpr_2014/papers/Li_DeepReID_Deep_Filter_2014_CVPR_paper.pdf">CUHK03</a></p></li>
<li><p><a class="reference external" href="https://arxiv.org/abs/1701.07717">DukeMTMC-reID</a></p></li>
<li><p><a class="reference external" href="https://arxiv.org/abs/1711.08565">MSMT17</a></p></li>
<li><p><a class="reference external" href="http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.331.7285&amp;rep=rep1&amp;type=pdf">VIPeR</a></p></li>
<li><p><a class="reference external" href="http://www.eecs.qmul.ac.uk/~txiang/publications/LoyXiangGong_cvpr_2009.pdf">GRID</a></p></li>
<li><p><a class="reference external" href="http://www.ee.cuhk.edu.hk/~xgwang/papers/liZWaccv12.pdf">CUHK01</a></p></li>
<li><p><a class="reference external" href="http://openaccess.thecvf.com/content_cvpr_2017/papers/Zhao_Spindle_Net_Person_CVPR_2017_paper.pdf">SenseReID</a></p></li>
<li><p><a class="reference external" href="http://www.eecs.qmul.ac.uk/~sgg/papers/ZhengGongXiang_BMVC09.pdf">QMUL-iLIDS</a></p></li>
<li><p><a class="reference external" href="https://pdfs.semanticscholar.org/4c1b/f0592be3e535faf256c95e27982db9b3d3d3.pdf">PRID</a></p></li>
<li><a class="reference external" href="https://www.cv-foundation.org/openaccess/content_iccv_2015/papers/Zheng_Scalable_Person_Re-Identification_ICCV_2015_paper.pdf">Market1501</a></li>
<li><a class="reference external" href="https://www.cv-foundation.org/openaccess/content_cvpr_2014/papers/Li_DeepReID_Deep_Filter_2014_CVPR_paper.pdf">CUHK03</a></li>
<li><a class="reference external" href="https://arxiv.org/abs/1701.07717">DukeMTMC-reID</a></li>
<li><a class="reference external" href="https://arxiv.org/abs/1711.08565">MSMT17</a></li>
<li><a class="reference external" href="http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.331.7285&amp;rep=rep1&amp;type=pdf">VIPeR</a></li>
<li><a class="reference external" href="http://www.eecs.qmul.ac.uk/~txiang/publications/LoyXiangGong_cvpr_2009.pdf">GRID</a></li>
<li><a class="reference external" href="http://www.ee.cuhk.edu.hk/~xgwang/papers/liZWaccv12.pdf">CUHK01</a></li>
<li><a class="reference external" href="http://openaccess.thecvf.com/content_cvpr_2017/papers/Zhao_Spindle_Net_Person_CVPR_2017_paper.pdf">SenseReID</a></li>
<li><a class="reference external" href="http://www.eecs.qmul.ac.uk/~sgg/papers/ZhengGongXiang_BMVC09.pdf">QMUL-iLIDS</a></li>
<li><a class="reference external" href="https://pdfs.semanticscholar.org/4c1b/f0592be3e535faf256c95e27982db9b3d3d3.pdf">PRID</a></li>
</ul>
</div>
<div class="section" id="video-reid-datasets">
<h3>Video-reid datasets<a class="headerlink" href="#video-reid-datasets" title="Permalink to this headline"></a></h3>
<ul class="simple">
<li><p><a class="reference external" href="http://www.liangzheng.org/1320.pdf">MARS</a></p></li>
<li><p><a class="reference external" href="https://www.eecs.qmul.ac.uk/~sgg/papers/WangEtAl_ECCV14.pdf">iLIDS-VID</a></p></li>
<li><p><a class="reference external" href="https://pdfs.semanticscholar.org/4c1b/f0592be3e535faf256c95e27982db9b3d3d3.pdf">PRID2011</a></p></li>
<li><p><a class="reference external" href="http://openaccess.thecvf.com/content_cvpr_2018/papers/Wu_Exploit_the_Unknown_CVPR_2018_paper.pdf">DukeMTMC-VideoReID</a></p></li>
<li><a class="reference external" href="http://www.liangzheng.org/1320.pdf">MARS</a></li>
<li><a class="reference external" href="https://www.eecs.qmul.ac.uk/~sgg/papers/WangEtAl_ECCV14.pdf">iLIDS-VID</a></li>
<li><a class="reference external" href="https://pdfs.semanticscholar.org/4c1b/f0592be3e535faf256c95e27982db9b3d3d3.pdf">PRID2011</a></li>
<li><a class="reference external" href="http://openaccess.thecvf.com/content_cvpr_2018/papers/Wu_Exploit_the_Unknown_CVPR_2018_paper.pdf">DukeMTMC-VideoReID</a></li>
</ul>
</div>
</div>
@ -348,42 +348,42 @@ $ conda activate torchreid
<div class="section" id="imagenet-classification-models">
<h3>ImageNet classification models<a class="headerlink" href="#imagenet-classification-models" title="Permalink to this headline"></a></h3>
<ul class="simple">
<li><p><a class="reference external" href="https://arxiv.org/abs/1512.03385">ResNet</a></p></li>
<li><p><a class="reference external" href="https://arxiv.org/abs/1611.05431">ResNeXt</a></p></li>
<li><p><a class="reference external" href="https://arxiv.org/abs/1709.01507">SENet</a></p></li>
<li><p><a class="reference external" href="https://arxiv.org/abs/1608.06993">DenseNet</a></p></li>
<li><p><a class="reference external" href="https://arxiv.org/abs/1602.07261">Inception-ResNet-V2</a></p></li>
<li><p><a class="reference external" href="https://arxiv.org/abs/1602.07261">Inception-V4</a></p></li>
<li><p><a class="reference external" href="https://arxiv.org/abs/1610.02357">Xception</a></p></li>
<li><a class="reference external" href="https://arxiv.org/abs/1512.03385">ResNet</a></li>
<li><a class="reference external" href="https://arxiv.org/abs/1611.05431">ResNeXt</a></li>
<li><a class="reference external" href="https://arxiv.org/abs/1709.01507">SENet</a></li>
<li><a class="reference external" href="https://arxiv.org/abs/1608.06993">DenseNet</a></li>
<li><a class="reference external" href="https://arxiv.org/abs/1602.07261">Inception-ResNet-V2</a></li>
<li><a class="reference external" href="https://arxiv.org/abs/1602.07261">Inception-V4</a></li>
<li><a class="reference external" href="https://arxiv.org/abs/1610.02357">Xception</a></li>
</ul>
</div>
<div class="section" id="lightweight-models">
<h3>Lightweight models<a class="headerlink" href="#lightweight-models" title="Permalink to this headline"></a></h3>
<ul class="simple">
<li><p><a class="reference external" href="https://arxiv.org/abs/1707.07012">NASNet</a></p></li>
<li><p><a class="reference external" href="https://arxiv.org/abs/1801.04381">MobileNetV2</a></p></li>
<li><p><a class="reference external" href="https://arxiv.org/abs/1707.01083">ShuffleNet</a></p></li>
<li><p><a class="reference external" href="https://arxiv.org/abs/1807.11164">ShuffleNetV2</a></p></li>
<li><p><a class="reference external" href="https://arxiv.org/abs/1602.07360">SqueezeNet</a></p></li>
<li><a class="reference external" href="https://arxiv.org/abs/1707.07012">NASNet</a></li>
<li><a class="reference external" href="https://arxiv.org/abs/1801.04381">MobileNetV2</a></li>
<li><a class="reference external" href="https://arxiv.org/abs/1707.01083">ShuffleNet</a></li>
<li><a class="reference external" href="https://arxiv.org/abs/1807.11164">ShuffleNetV2</a></li>
<li><a class="reference external" href="https://arxiv.org/abs/1602.07360">SqueezeNet</a></li>
</ul>
</div>
<div class="section" id="reid-specific-models">
<h3>ReID-specific models<a class="headerlink" href="#reid-specific-models" title="Permalink to this headline"></a></h3>
<ul class="simple">
<li><p><a class="reference external" href="https://arxiv.org/abs/1709.05165">MuDeep</a></p></li>
<li><p><a class="reference external" href="https://arxiv.org/abs/1711.08106">ResNet-mid</a></p></li>
<li><p><a class="reference external" href="https://arxiv.org/abs/1802.08122">HACNN</a></p></li>
<li><p><a class="reference external" href="https://arxiv.org/abs/1711.09349">PCB</a></p></li>
<li><p><a class="reference external" href="https://arxiv.org/abs/1803.09132">MLFN</a></p></li>
<li><p><a class="reference external" href="https://arxiv.org/abs/1905.00953">OSNet</a></p></li>
<li><a class="reference external" href="https://arxiv.org/abs/1709.05165">MuDeep</a></li>
<li><a class="reference external" href="https://arxiv.org/abs/1711.08106">ResNet-mid</a></li>
<li><a class="reference external" href="https://arxiv.org/abs/1802.08122">HACNN</a></li>
<li><a class="reference external" href="https://arxiv.org/abs/1711.09349">PCB</a></li>
<li><a class="reference external" href="https://arxiv.org/abs/1803.09132">MLFN</a></li>
<li><a class="reference external" href="https://arxiv.org/abs/1905.00953">OSNet</a></li>
</ul>
</div>
</div>
<div class="section" id="losses">
<h2>Losses<a class="headerlink" href="#losses" title="Permalink to this headline"></a></h2>
<ul class="simple">
<li><p><a class="reference external" href="https://www.cv-foundation.org/openaccess/content_cvpr_2016/papers/Szegedy_Rethinking_the_Inception_CVPR_2016_paper.pdf">Softmax (cross entropy loss with label smoothing)</a></p></li>
<li><p><a class="reference external" href="https://arxiv.org/abs/1703.07737">Triplet (hard example mining triplet loss)</a></p></li>
<li><a class="reference external" href="https://www.cv-foundation.org/openaccess/content_cvpr_2016/papers/Szegedy_Rethinking_the_Inception_CVPR_2016_paper.pdf">Softmax (cross entropy loss with label smoothing)</a></li>
<li><a class="reference external" href="https://arxiv.org/abs/1703.07737">Triplet (hard example mining triplet loss)</a></li>
</ul>
</div>
<div class="section" id="citation">
@ -408,8 +408,8 @@ $ conda activate torchreid
<div class="section" id="indices-and-tables">
<h1>Indices and tables<a class="headerlink" href="#indices-and-tables" title="Permalink to this headline"></a></h1>
<ul class="simple">
<li><p><a class="reference internal" href="genindex.html"><span class="std std-ref">Index</span></a></p></li>
<li><p><a class="reference internal" href="py-modindex.html"><span class="std std-ref">Module Index</span></a></p></li>
<li><a class="reference internal" href="genindex.html"><span class="std std-ref">Index</span></a></li>
<li><a class="reference internal" href="py-modindex.html"><span class="std std-ref">Module Index</span></a></li>
</ul>
</div>

Binary file not shown.

File diff suppressed because it is too large Load Diff

View File

@ -179,70 +179,78 @@
<h2>Base Engine<a class="headerlink" href="#base-engine" title="Permalink to this headline"></a></h2>
<dl class="class">
<dt id="torchreid.engine.engine.Engine">
<em class="property">class </em><code class="sig-prename descclassname">torchreid.engine.engine.</code><code class="sig-name descname">Engine</code><span class="sig-paren">(</span><em class="sig-param">datamanager</em>, <em class="sig-param">model</em>, <em class="sig-param">optimizer=None</em>, <em class="sig-param">scheduler=None</em>, <em class="sig-param">use_cpu=False</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/engine/engine.html#Engine"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.engine.engine.Engine" title="Permalink to this definition"></a></dt>
<em class="property">class </em><code class="descclassname">torchreid.engine.engine.</code><code class="descname">Engine</code><span class="sig-paren">(</span><em>datamanager</em>, <em>model</em>, <em>optimizer=None</em>, <em>scheduler=None</em>, <em>use_cpu=False</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/engine/engine.html#Engine"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.engine.engine.Engine" title="Permalink to this definition"></a></dt>
<dd><p>A generic base Engine class for both image- and video-reid.</p>
<dl class="field-list simple">
<dt class="field-odd">Parameters</dt>
<dd class="field-odd"><ul class="simple">
<li><p><strong>datamanager</strong> (<a class="reference internal" href="data.html#torchreid.data.datamanager.DataManager" title="torchreid.data.datamanager.DataManager"><em>DataManager</em></a>) an instance of <code class="docutils literal notranslate"><span class="pre">torchreid.data.ImageDataManager</span></code>
or <code class="docutils literal notranslate"><span class="pre">torchreid.data.VideoDataManager</span></code>.</p></li>
<li><p><strong>model</strong> (<em>nn.Module</em>) model instance.</p></li>
<li><p><strong>optimizer</strong> (<em>Optimizer</em>) an Optimizer.</p></li>
<li><p><strong>scheduler</strong> (<em>LRScheduler</em><em>, </em><em>optional</em>) if None, no learning rate decay will be performed.</p></li>
<li><p><strong>use_cpu</strong> (<em>bool</em><em>, </em><em>optional</em>) use cpu. Default is False.</p></li>
<table class="docutils field-list" frame="void" rules="none">
<col class="field-name" />
<col class="field-body" />
<tbody valign="top">
<tr class="field-odd field"><th class="field-name">Parameters:</th><td class="field-body"><ul class="first last simple">
<li><strong>datamanager</strong> (<a class="reference internal" href="data.html#torchreid.data.datamanager.DataManager" title="torchreid.data.datamanager.DataManager"><em>DataManager</em></a>) an instance of <code class="docutils literal notranslate"><span class="pre">torchreid.data.ImageDataManager</span></code>
or <code class="docutils literal notranslate"><span class="pre">torchreid.data.VideoDataManager</span></code>.</li>
<li><strong>model</strong> (<em>nn.Module</em>) model instance.</li>
<li><strong>optimizer</strong> (<em>Optimizer</em>) an Optimizer.</li>
<li><strong>scheduler</strong> (<em>LRScheduler</em><em>, </em><em>optional</em>) if None, no learning rate decay will be performed.</li>
<li><strong>use_cpu</strong> (<em>bool</em><em>, </em><em>optional</em>) use cpu. Default is False.</li>
</ul>
</dd>
</dl>
</td>
</tr>
</tbody>
</table>
<dl class="method">
<dt id="torchreid.engine.engine.Engine.run">
<code class="sig-name descname">run</code><span class="sig-paren">(</span><em class="sig-param">save_dir='log', max_epoch=0, start_epoch=0, fixbase_epoch=0, open_layers=None, start_eval=0, eval_freq=-1, test_only=False, print_freq=10, dist_metric='euclidean', normalize_feature=False, visrank=False, visrank_topk=10, use_metric_cuhk03=False, ranks=[1, 5, 10, 20], rerank=False, visactmap=False</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/engine/engine.html#Engine.run"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.engine.engine.Engine.run" title="Permalink to this definition"></a></dt>
<code class="descname">run</code><span class="sig-paren">(</span><em>save_dir='log', max_epoch=0, start_epoch=0, fixbase_epoch=0, open_layers=None, start_eval=0, eval_freq=-1, test_only=False, print_freq=10, dist_metric='euclidean', normalize_feature=False, visrank=False, visrank_topk=10, use_metric_cuhk03=False, ranks=[1, 5, 10, 20], rerank=False, visactmap=False</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/engine/engine.html#Engine.run"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.engine.engine.Engine.run" title="Permalink to this definition"></a></dt>
<dd><p>A unified pipeline for training and evaluating a model.</p>
<dl class="field-list simple">
<dt class="field-odd">Parameters</dt>
<dd class="field-odd"><ul class="simple">
<li><p><strong>save_dir</strong> (<em>str</em>) directory to save model.</p></li>
<li><p><strong>max_epoch</strong> (<em>int</em>) maximum epoch.</p></li>
<li><p><strong>start_epoch</strong> (<em>int</em><em>, </em><em>optional</em>) starting epoch. Default is 0.</p></li>
<li><p><strong>fixbase_epoch</strong> (<em>int</em><em>, </em><em>optional</em>) number of epochs to train <code class="docutils literal notranslate"><span class="pre">open_layers</span></code> (new layers)
<table class="docutils field-list" frame="void" rules="none">
<col class="field-name" />
<col class="field-body" />
<tbody valign="top">
<tr class="field-odd field"><th class="field-name">Parameters:</th><td class="field-body"><ul class="first last simple">
<li><strong>save_dir</strong> (<em>str</em>) directory to save model.</li>
<li><strong>max_epoch</strong> (<em>int</em>) maximum epoch.</li>
<li><strong>start_epoch</strong> (<em>int</em><em>, </em><em>optional</em>) starting epoch. Default is 0.</li>
<li><strong>fixbase_epoch</strong> (<em>int</em><em>, </em><em>optional</em>) number of epochs to train <code class="docutils literal notranslate"><span class="pre">open_layers</span></code> (new layers)
while keeping base layers frozen. Default is 0. <code class="docutils literal notranslate"><span class="pre">fixbase_epoch</span></code> is counted
in <code class="docutils literal notranslate"><span class="pre">max_epoch</span></code>.</p></li>
<li><p><strong>open_layers</strong> (<em>str</em><em> or </em><em>list</em><em>, </em><em>optional</em>) layers (attribute names) open for training.</p></li>
<li><p><strong>start_eval</strong> (<em>int</em><em>, </em><em>optional</em>) from which epoch to start evaluation. Default is 0.</p></li>
<li><p><strong>eval_freq</strong> (<em>int</em><em>, </em><em>optional</em>) evaluation frequency. Default is -1 (meaning evaluation
is only performed at the end of training).</p></li>
<li><p><strong>test_only</strong> (<em>bool</em><em>, </em><em>optional</em>) if True, only runs evaluation on test datasets.
Default is False.</p></li>
<li><p><strong>print_freq</strong> (<em>int</em><em>, </em><em>optional</em>) print frequency. Default is 10.</p></li>
<li><p><strong>dist_metric</strong> (<em>str</em><em>, </em><em>optional</em>) distance metric used to compute distance matrix
between query and gallery. Default is “euclidean”.</p></li>
<li><p><strong>normalize_feature</strong> (<em>bool</em><em>, </em><em>optional</em>) performs L2 normalization on feature vectors before
computing feature distance. Default is False.</p></li>
<li><p><strong>visrank</strong> (<em>bool</em><em>, </em><em>optional</em>) visualizes ranked results. Default is False. It is recommended to
in <code class="docutils literal notranslate"><span class="pre">max_epoch</span></code>.</li>
<li><strong>open_layers</strong> (<em>str</em><em> or </em><em>list</em><em>, </em><em>optional</em>) layers (attribute names) open for training.</li>
<li><strong>start_eval</strong> (<em>int</em><em>, </em><em>optional</em>) from which epoch to start evaluation. Default is 0.</li>
<li><strong>eval_freq</strong> (<em>int</em><em>, </em><em>optional</em>) evaluation frequency. Default is -1 (meaning evaluation
is only performed at the end of training).</li>
<li><strong>test_only</strong> (<em>bool</em><em>, </em><em>optional</em>) if True, only runs evaluation on test datasets.
Default is False.</li>
<li><strong>print_freq</strong> (<em>int</em><em>, </em><em>optional</em>) print frequency. Default is 10.</li>
<li><strong>dist_metric</strong> (<em>str</em><em>, </em><em>optional</em>) distance metric used to compute distance matrix
between query and gallery. Default is “euclidean”.</li>
<li><strong>normalize_feature</strong> (<em>bool</em><em>, </em><em>optional</em>) performs L2 normalization on feature vectors before
computing feature distance. Default is False.</li>
<li><strong>visrank</strong> (<em>bool</em><em>, </em><em>optional</em>) visualizes ranked results. Default is False. It is recommended to
enable <code class="docutils literal notranslate"><span class="pre">visrank</span></code> when <code class="docutils literal notranslate"><span class="pre">test_only</span></code> is True. The ranked images will be saved to
“save_dir/visrank_dataset”, e.g. “save_dir/visrank_market1501”.</p></li>
<li><p><strong>visrank_topk</strong> (<em>int</em><em>, </em><em>optional</em>) top-k ranked images to be visualized. Default is 10.</p></li>
<li><p><strong>use_metric_cuhk03</strong> (<em>bool</em><em>, </em><em>optional</em>) use single-gallery-shot setting for cuhk03.
Default is False. This should be enabled when using cuhk03 classic split.</p></li>
<li><p><strong>ranks</strong> (<em>list</em><em>, </em><em>optional</em>) cmc ranks to be computed. Default is [1, 5, 10, 20].</p></li>
<li><p><strong>rerank</strong> (<em>bool</em><em>, </em><em>optional</em>) uses person re-ranking (by Zhong et al. CVPR17).
Default is False. This is only enabled when test_only=True.</p></li>
<li><p><strong>visactmap</strong> (<em>bool</em><em>, </em><em>optional</em>) visualizes activation maps. Default is False.</p></li>
“save_dir/visrank_dataset”, e.g. “save_dir/visrank_market1501”.</li>
<li><strong>visrank_topk</strong> (<em>int</em><em>, </em><em>optional</em>) top-k ranked images to be visualized. Default is 10.</li>
<li><strong>use_metric_cuhk03</strong> (<em>bool</em><em>, </em><em>optional</em>) use single-gallery-shot setting for cuhk03.
Default is False. This should be enabled when using cuhk03 classic split.</li>
<li><strong>ranks</strong> (<em>list</em><em>, </em><em>optional</em>) cmc ranks to be computed. Default is [1, 5, 10, 20].</li>
<li><strong>rerank</strong> (<em>bool</em><em>, </em><em>optional</em>) uses person re-ranking (by Zhong et al. CVPR17).
Default is False. This is only enabled when test_only=True.</li>
<li><strong>visactmap</strong> (<em>bool</em><em>, </em><em>optional</em>) visualizes activation maps. Default is False.</li>
</ul>
</dd>
</dl>
</td>
</tr>
</tbody>
</table>
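<p>For illustration, a typical call mixes a handful of these arguments; the values below are arbitrary choices, not defaults taken from the source:</p>
<div class="highlight-python notranslate"><div class="highlight"><pre><span></span># `engine` is assumed to be a concrete subclass instance,
# e.g. an ImageSoftmaxEngine built from a datamanager, model and optimizer
engine.run(
    save_dir='log/resnet50',
    max_epoch=60,
    eval_freq=10,        # evaluate every 10 epochs
    print_freq=10,
    dist_metric='euclidean',
    test_only=False
)
</pre></div>
</div>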
</dd></dl>
<dl class="method">
<dt id="torchreid.engine.engine.Engine.test">
<code class="sig-name descname">test</code><span class="sig-paren">(</span><em class="sig-param">epoch, testloader, dist_metric='euclidean', normalize_feature=False, visrank=False, visrank_topk=10, save_dir='', use_metric_cuhk03=False, ranks=[1, 5, 10, 20], rerank=False</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/engine/engine.html#Engine.test"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.engine.engine.Engine.test" title="Permalink to this definition"></a></dt>
<code class="descname">test</code><span class="sig-paren">(</span><em>epoch, testloader, dist_metric='euclidean', normalize_feature=False, visrank=False, visrank_topk=10, save_dir='', use_metric_cuhk03=False, ranks=[1, 5, 10, 20], rerank=False</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/engine/engine.html#Engine.test"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.engine.engine.Engine.test" title="Permalink to this definition"></a></dt>
<dd><p>Tests model on target datasets.</p>
<div class="admonition note">
<p class="admonition-title">Note</p>
<p>This function is called within <code class="docutils literal notranslate"><span class="pre">run()</span></code>.</p>
<p class="first admonition-title">Note</p>
<p class="last">This function has been called in <code class="docutils literal notranslate"><span class="pre">run()</span></code>.</p>
</div>
<div class="admonition note">
<p class="admonition-title">Note</p>
<p>The test pipeline implemented in this function suits both image- and
<p class="first admonition-title">Note</p>
<p class="last">The test pipeline implemented in this function suits both image- and
video-reid. In general, a subclass of Engine only needs to re-implement
<code class="docutils literal notranslate"><span class="pre">_extract_features()</span></code> and <code class="docutils literal notranslate"><span class="pre">_parse_data_for_eval()</span></code> (most of the time),
though this is not strictly required. Please refer to the source code for more details.</p>
@@ -251,7 +259,7 @@ but not a must. Please refer to the source code for more details.</p>
<dl class="method">
<dt id="torchreid.engine.engine.Engine.train">
<code class="sig-name descname">train</code><span class="sig-paren">(</span><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/engine/engine.html#Engine.train"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.engine.engine.Engine.train" title="Permalink to this definition"></a></dt>
<code class="descname">train</code><span class="sig-paren">(</span><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/engine/engine.html#Engine.train"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.engine.engine.Engine.train" title="Permalink to this definition"></a></dt>
<dd><p>Performs training on source datasets for one epoch.</p>
<p>This will be called every epoch in <code class="docutils literal notranslate"><span class="pre">run()</span></code>, e.g.</p>
<div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="k">for</span> <span class="n">epoch</span> <span class="ow">in</span> <span class="nb">range</span><span class="p">(</span><span class="n">start_epoch</span><span class="p">,</span> <span class="n">max_epoch</span><span class="p">):</span>
@@ -259,21 +267,22 @@ but not a must. Please refer to the source code for more details.</p>
</pre></div>
</div>
<div class="admonition note">
<p class="admonition-title">Note</p>
<p>This must be implemented in subclasses.</p>
<p class="first admonition-title">Note</p>
<p class="last">This must be implemented in subclasses.</p>
</div>
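<p>A minimal (hypothetical) subclass sketch of what such a <code class="docutils literal notranslate"><span class="pre">train()</span></code> implementation looks like; the attribute <code class="docutils literal notranslate"><span class="pre">self.criterion</span></code> is assumed to be set in <code class="docutils literal notranslate"><span class="pre">__init__</span></code>, and real engines additionally handle frozen layers, metering and logging:</p>
<div class="highlight-python notranslate"><div class="highlight"><pre><span></span>class MyEngine(torchreid.engine.Engine):

    def train(self, epoch, max_epoch, trainloader, **kwargs):
        self.model.train()
        for batch_idx, data in enumerate(trainloader):
            imgs, pids = self._parse_data_for_train(data)
            outputs = self.model(imgs)
            loss = self.criterion(outputs, pids)   # criterion assumed set by the subclass
            self.optimizer.zero_grad()
            loss.backward()
            self.optimizer.step()
</pre></div>
</div>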
</dd></dl>
<dl class="method">
<dt id="torchreid.engine.engine.Engine.visactmap">
<code class="sig-name descname">visactmap</code><span class="sig-paren">(</span><em class="sig-param">testloader</em>, <em class="sig-param">save_dir</em>, <em class="sig-param">width</em>, <em class="sig-param">height</em>, <em class="sig-param">print_freq</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/engine/engine.html#Engine.visactmap"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.engine.engine.Engine.visactmap" title="Permalink to this definition"></a></dt>
<code class="descname">visactmap</code><span class="sig-paren">(</span><em>testloader</em>, <em>save_dir</em>, <em>width</em>, <em>height</em>, <em>print_freq</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/engine/engine.html#Engine.visactmap"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.engine.engine.Engine.visactmap" title="Permalink to this definition"></a></dt>
<dd><p>Visualizes CNN activation maps to see where the CNN focuses when extracting features.</p>
<p>This function takes as input the query images of the target datasets.</p>
<dl class="simple">
<dt>Reference:</dt><dd><ul class="simple">
<li><p>Zagoruyko and Komodakis. Paying more attention to attention: Improving the
performance of convolutional neural networks via attention transfer. ICLR, 2017</p></li>
<li><p>Zhou et al. Omni-Scale Feature Learning for Person Re-Identification. ICCV, 2019.</p></li>
<dl class="docutils">
<dt>Reference:</dt>
<dd><ul class="first last simple">
<li>Zagoruyko and Komodakis. Paying more attention to attention: Improving the
performance of convolutional neural networks via attention transfer. ICLR, 2017</li>
<li>Zhou et al. Omni-Scale Feature Learning for Person Re-Identification. ICCV, 2019.</li>
</ul>
</dd>
</dl>
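<p>A sketch of invoking it directly (in normal use it is triggered through <code class="docutils literal notranslate"><span class="pre">run(visactmap=True)</span></code>; the loader variable and sizes below are illustrative):</p>
<div class="highlight-python notranslate"><div class="highlight"><pre><span></span># testloader is assumed to come from the datamanager's test loaders
engine.visactmap(
    testloader,
    save_dir='log/actmap',
    width=128, height=256,   # common person re-id input resolution
    print_freq=10
)
</pre></div>
</div>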
@@ -286,21 +295,25 @@ performance of convolutional neural networks via attention transfer. ICLR, 2017<
<h2>Image Engines<a class="headerlink" href="#image-engines" title="Permalink to this headline"></a></h2>
<dl class="class">
<dt id="torchreid.engine.image.softmax.ImageSoftmaxEngine">
<em class="property">class </em><code class="sig-prename descclassname">torchreid.engine.image.softmax.</code><code class="sig-name descname">ImageSoftmaxEngine</code><span class="sig-paren">(</span><em class="sig-param">datamanager</em>, <em class="sig-param">model</em>, <em class="sig-param">optimizer</em>, <em class="sig-param">scheduler=None</em>, <em class="sig-param">use_cpu=False</em>, <em class="sig-param">label_smooth=True</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/engine/image/softmax.html#ImageSoftmaxEngine"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.engine.image.softmax.ImageSoftmaxEngine" title="Permalink to this definition"></a></dt>
<em class="property">class </em><code class="descclassname">torchreid.engine.image.softmax.</code><code class="descname">ImageSoftmaxEngine</code><span class="sig-paren">(</span><em>datamanager</em>, <em>model</em>, <em>optimizer</em>, <em>scheduler=None</em>, <em>use_cpu=False</em>, <em>label_smooth=True</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/engine/image/softmax.html#ImageSoftmaxEngine"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.engine.image.softmax.ImageSoftmaxEngine" title="Permalink to this definition"></a></dt>
<dd><p>Softmax-loss engine for image-reid.</p>
<dl class="field-list simple">
<dt class="field-odd">Parameters</dt>
<dd class="field-odd"><ul class="simple">
<li><p><strong>datamanager</strong> (<a class="reference internal" href="data.html#torchreid.data.datamanager.DataManager" title="torchreid.data.datamanager.DataManager"><em>DataManager</em></a>) an instance of <code class="docutils literal notranslate"><span class="pre">torchreid.data.ImageDataManager</span></code>
or <code class="docutils literal notranslate"><span class="pre">torchreid.data.VideoDataManager</span></code>.</p></li>
<li><p><strong>model</strong> (<em>nn.Module</em>) model instance.</p></li>
<li><p><strong>optimizer</strong> (<em>Optimizer</em>) an Optimizer.</p></li>
<li><p><strong>scheduler</strong> (<em>LRScheduler</em><em>, </em><em>optional</em>) if None, no learning rate decay will be performed.</p></li>
<li><p><strong>use_cpu</strong> (<em>bool</em><em>, </em><em>optional</em>) use cpu. Default is False.</p></li>
<li><p><strong>label_smooth</strong> (<em>bool</em><em>, </em><em>optional</em>) use label smoothing regularizer. Default is True.</p></li>
<table class="docutils field-list" frame="void" rules="none">
<col class="field-name" />
<col class="field-body" />
<tbody valign="top">
<tr class="field-odd field"><th class="field-name">Parameters:</th><td class="field-body"><ul class="first last simple">
<li><strong>datamanager</strong> (<a class="reference internal" href="data.html#torchreid.data.datamanager.DataManager" title="torchreid.data.datamanager.DataManager"><em>DataManager</em></a>) an instance of <code class="docutils literal notranslate"><span class="pre">torchreid.data.ImageDataManager</span></code>
or <code class="docutils literal notranslate"><span class="pre">torchreid.data.VideoDataManager</span></code>.</li>
<li><strong>model</strong> (<em>nn.Module</em>) model instance.</li>
<li><strong>optimizer</strong> (<em>Optimizer</em>) an Optimizer.</li>
<li><strong>scheduler</strong> (<em>LRScheduler</em><em>, </em><em>optional</em>) if None, no learning rate decay will be performed.</li>
<li><strong>use_cpu</strong> (<em>bool</em><em>, </em><em>optional</em>) use cpu. Default is False.</li>
<li><strong>label_smooth</strong> (<em>bool</em><em>, </em><em>optional</em>) use label smoothing regularizer. Default is True.</li>
</ul>
</dd>
</dl>
</td>
</tr>
</tbody>
</table>
<p>Examples:</p>
<div class="highlight-default notranslate"><div class="highlight"><pre><span></span><span class="kn">import</span> <span class="nn">torch</span>
<span class="kn">import</span> <span class="nn">torchreid</span>
@@ -338,7 +351,7 @@ or <code class="docutils literal notranslate"><span class="pre">torchreid.data.V
</div>
<dl class="method">
<dt id="torchreid.engine.image.softmax.ImageSoftmaxEngine.train">
<code class="sig-name descname">train</code><span class="sig-paren">(</span><em class="sig-param">epoch</em>, <em class="sig-param">max_epoch</em>, <em class="sig-param">trainloader</em>, <em class="sig-param">fixbase_epoch=0</em>, <em class="sig-param">open_layers=None</em>, <em class="sig-param">print_freq=10</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/engine/image/softmax.html#ImageSoftmaxEngine.train"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.engine.image.softmax.ImageSoftmaxEngine.train" title="Permalink to this definition"></a></dt>
<code class="descname">train</code><span class="sig-paren">(</span><em>epoch</em>, <em>max_epoch</em>, <em>trainloader</em>, <em>fixbase_epoch=0</em>, <em>open_layers=None</em>, <em>print_freq=10</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/engine/image/softmax.html#ImageSoftmaxEngine.train"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.engine.image.softmax.ImageSoftmaxEngine.train" title="Permalink to this definition"></a></dt>
<dd><p>Performs training on source datasets for one epoch.</p>
<p>This will be called every epoch in <code class="docutils literal notranslate"><span class="pre">run()</span></code>, e.g.</p>
<div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="k">for</span> <span class="n">epoch</span> <span class="ow">in</span> <span class="nb">range</span><span class="p">(</span><span class="n">start_epoch</span><span class="p">,</span> <span class="n">max_epoch</span><span class="p">):</span>
@@ -346,8 +359,8 @@ or <code class="docutils literal notranslate"><span class="pre">torchreid.data.V
</pre></div>
</div>
<div class="admonition note">
<p class="admonition-title">Note</p>
<p>This must be implemented in subclasses.</p>
<p class="first admonition-title">Note</p>
<p class="last">This must be implemented in subclasses.</p>
</div>
</dd></dl>
@@ -355,24 +368,28 @@ or <code class="docutils literal notranslate"><span class="pre">torchreid.data.V
<dl class="class">
<dt id="torchreid.engine.image.triplet.ImageTripletEngine">
<em class="property">class </em><code class="sig-prename descclassname">torchreid.engine.image.triplet.</code><code class="sig-name descname">ImageTripletEngine</code><span class="sig-paren">(</span><em class="sig-param">datamanager</em>, <em class="sig-param">model</em>, <em class="sig-param">optimizer</em>, <em class="sig-param">margin=0.3</em>, <em class="sig-param">weight_t=1</em>, <em class="sig-param">weight_x=1</em>, <em class="sig-param">scheduler=None</em>, <em class="sig-param">use_cpu=False</em>, <em class="sig-param">label_smooth=True</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/engine/image/triplet.html#ImageTripletEngine"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.engine.image.triplet.ImageTripletEngine" title="Permalink to this definition"></a></dt>
<em class="property">class </em><code class="descclassname">torchreid.engine.image.triplet.</code><code class="descname">ImageTripletEngine</code><span class="sig-paren">(</span><em>datamanager</em>, <em>model</em>, <em>optimizer</em>, <em>margin=0.3</em>, <em>weight_t=1</em>, <em>weight_x=1</em>, <em>scheduler=None</em>, <em>use_cpu=False</em>, <em>label_smooth=True</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/engine/image/triplet.html#ImageTripletEngine"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.engine.image.triplet.ImageTripletEngine" title="Permalink to this definition"></a></dt>
<dd><p>Triplet-loss engine for image-reid.</p>
<dl class="field-list simple">
<dt class="field-odd">Parameters</dt>
<dd class="field-odd"><ul class="simple">
<li><p><strong>datamanager</strong> (<a class="reference internal" href="data.html#torchreid.data.datamanager.DataManager" title="torchreid.data.datamanager.DataManager"><em>DataManager</em></a>) an instance of <code class="docutils literal notranslate"><span class="pre">torchreid.data.ImageDataManager</span></code>
or <code class="docutils literal notranslate"><span class="pre">torchreid.data.VideoDataManager</span></code>.</p></li>
<li><p><strong>model</strong> (<em>nn.Module</em>) model instance.</p></li>
<li><p><strong>optimizer</strong> (<em>Optimizer</em>) an Optimizer.</p></li>
<li><p><strong>margin</strong> (<em>float</em><em>, </em><em>optional</em>) margin for triplet loss. Default is 0.3.</p></li>
<li><p><strong>weight_t</strong> (<em>float</em><em>, </em><em>optional</em>) weight for triplet loss. Default is 1.</p></li>
<li><p><strong>weight_x</strong> (<em>float</em><em>, </em><em>optional</em>) weight for softmax loss. Default is 1.</p></li>
<li><p><strong>scheduler</strong> (<em>LRScheduler</em><em>, </em><em>optional</em>) if None, no learning rate decay will be performed.</p></li>
<li><p><strong>use_cpu</strong> (<em>bool</em><em>, </em><em>optional</em>) use cpu. Default is False.</p></li>
<li><p><strong>label_smooth</strong> (<em>bool</em><em>, </em><em>optional</em>) use label smoothing regularizer. Default is True.</p></li>
<table class="docutils field-list" frame="void" rules="none">
<col class="field-name" />
<col class="field-body" />
<tbody valign="top">
<tr class="field-odd field"><th class="field-name">Parameters:</th><td class="field-body"><ul class="first last simple">
<li><strong>datamanager</strong> (<a class="reference internal" href="data.html#torchreid.data.datamanager.DataManager" title="torchreid.data.datamanager.DataManager"><em>DataManager</em></a>) an instance of <code class="docutils literal notranslate"><span class="pre">torchreid.data.ImageDataManager</span></code>
or <code class="docutils literal notranslate"><span class="pre">torchreid.data.VideoDataManager</span></code>.</li>
<li><strong>model</strong> (<em>nn.Module</em>) model instance.</li>
<li><strong>optimizer</strong> (<em>Optimizer</em>) an Optimizer.</li>
<li><strong>margin</strong> (<em>float</em><em>, </em><em>optional</em>) margin for triplet loss. Default is 0.3.</li>
<li><strong>weight_t</strong> (<em>float</em><em>, </em><em>optional</em>) weight for triplet loss. Default is 1.</li>
<li><strong>weight_x</strong> (<em>float</em><em>, </em><em>optional</em>) weight for softmax loss. Default is 1.</li>
<li><strong>scheduler</strong> (<em>LRScheduler</em><em>, </em><em>optional</em>) if None, no learning rate decay will be performed.</li>
<li><strong>use_cpu</strong> (<em>bool</em><em>, </em><em>optional</em>) use cpu. Default is False.</li>
<li><strong>label_smooth</strong> (<em>bool</em><em>, </em><em>optional</em>) use label smoothing regularizer. Default is True.</li>
</ul>
</dd>
</dl>
</td>
</tr>
</tbody>
</table>
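<p>Conceptually, the engine minimizes a weighted sum of the two terms; a sketch of the usual convention (not the verbatim source):</p>
<div class="highlight-python notranslate"><div class="highlight"><pre><span></span># combined objective inside the training step (illustrative)
loss = weight_t * triplet_loss(features, pids) + weight_x * softmax_loss(outputs, pids)
</pre></div>
</div>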
<p>Examples:</p>
<div class="highlight-default notranslate"><div class="highlight"><pre><span></span><span class="kn">import</span> <span class="nn">torch</span>
<span class="kn">import</span> <span class="nn">torchreid</span>
@@ -413,7 +430,7 @@ or <code class="docutils literal notranslate"><span class="pre">torchreid.data.V
</div>
<dl class="method">
<dt id="torchreid.engine.image.triplet.ImageTripletEngine.train">
<code class="sig-name descname">train</code><span class="sig-paren">(</span><em class="sig-param">epoch</em>, <em class="sig-param">max_epoch</em>, <em class="sig-param">trainloader</em>, <em class="sig-param">fixbase_epoch=0</em>, <em class="sig-param">open_layers=None</em>, <em class="sig-param">print_freq=10</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/engine/image/triplet.html#ImageTripletEngine.train"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.engine.image.triplet.ImageTripletEngine.train" title="Permalink to this definition"></a></dt>
<code class="descname">train</code><span class="sig-paren">(</span><em>epoch</em>, <em>max_epoch</em>, <em>trainloader</em>, <em>fixbase_epoch=0</em>, <em>open_layers=None</em>, <em>print_freq=10</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/engine/image/triplet.html#ImageTripletEngine.train"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.engine.image.triplet.ImageTripletEngine.train" title="Permalink to this definition"></a></dt>
<dd><p>Performs training on source datasets for one epoch.</p>
<p>This will be called every epoch in <code class="docutils literal notranslate"><span class="pre">run()</span></code>, e.g.</p>
<div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="k">for</span> <span class="n">epoch</span> <span class="ow">in</span> <span class="nb">range</span><span class="p">(</span><span class="n">start_epoch</span><span class="p">,</span> <span class="n">max_epoch</span><span class="p">):</span>
@@ -421,8 +438,8 @@ or <code class="docutils literal notranslate"><span class="pre">torchreid.data.V
</pre></div>
</div>
<div class="admonition note">
<p class="admonition-title">Note</p>
<p>This must be implemented in subclasses.</p>
<p class="first admonition-title">Note</p>
<p class="last">This must be implemented in subclasses.</p>
</div>
</dd></dl>
@@ -433,23 +450,27 @@ or <code class="docutils literal notranslate"><span class="pre">torchreid.data.V
<h2>Video Engines<a class="headerlink" href="#video-engines" title="Permalink to this headline"></a></h2>
<dl class="class">
<dt id="torchreid.engine.video.softmax.VideoSoftmaxEngine">
<em class="property">class </em><code class="sig-prename descclassname">torchreid.engine.video.softmax.</code><code class="sig-name descname">VideoSoftmaxEngine</code><span class="sig-paren">(</span><em class="sig-param">datamanager</em>, <em class="sig-param">model</em>, <em class="sig-param">optimizer</em>, <em class="sig-param">scheduler=None</em>, <em class="sig-param">use_cpu=False</em>, <em class="sig-param">label_smooth=True</em>, <em class="sig-param">pooling_method='avg'</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/engine/video/softmax.html#VideoSoftmaxEngine"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.engine.video.softmax.VideoSoftmaxEngine" title="Permalink to this definition"></a></dt>
<em class="property">class </em><code class="descclassname">torchreid.engine.video.softmax.</code><code class="descname">VideoSoftmaxEngine</code><span class="sig-paren">(</span><em>datamanager</em>, <em>model</em>, <em>optimizer</em>, <em>scheduler=None</em>, <em>use_cpu=False</em>, <em>label_smooth=True</em>, <em>pooling_method='avg'</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/engine/video/softmax.html#VideoSoftmaxEngine"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.engine.video.softmax.VideoSoftmaxEngine" title="Permalink to this definition"></a></dt>
<dd><p>Softmax-loss engine for video-reid.</p>
<dl class="field-list simple">
<dt class="field-odd">Parameters</dt>
<dd class="field-odd"><ul class="simple">
<li><p><strong>datamanager</strong> (<a class="reference internal" href="data.html#torchreid.data.datamanager.DataManager" title="torchreid.data.datamanager.DataManager"><em>DataManager</em></a>) an instance of <code class="docutils literal notranslate"><span class="pre">torchreid.data.ImageDataManager</span></code>
or <code class="docutils literal notranslate"><span class="pre">torchreid.data.VideoDataManager</span></code>.</p></li>
<li><p><strong>model</strong> (<em>nn.Module</em>) model instance.</p></li>
<li><p><strong>optimizer</strong> (<em>Optimizer</em>) an Optimizer.</p></li>
<li><p><strong>scheduler</strong> (<em>LRScheduler</em><em>, </em><em>optional</em>) if None, no learning rate decay will be performed.</p></li>
<li><p><strong>use_cpu</strong> (<em>bool</em><em>, </em><em>optional</em>) use cpu. Default is False.</p></li>
<li><p><strong>label_smooth</strong> (<em>bool</em><em>, </em><em>optional</em>) use label smoothing regularizer. Default is True.</p></li>
<li><p><strong>pooling_method</strong> (<em>str</em><em>, </em><em>optional</em>) how to pool features for a tracklet.
Default is “avg” (average). Choices are [“avg”, “max”].</p></li>
<table class="docutils field-list" frame="void" rules="none">
<col class="field-name" />
<col class="field-body" />
<tbody valign="top">
<tr class="field-odd field"><th class="field-name">Parameters:</th><td class="field-body"><ul class="first last simple">
<li><strong>datamanager</strong> (<a class="reference internal" href="data.html#torchreid.data.datamanager.DataManager" title="torchreid.data.datamanager.DataManager"><em>DataManager</em></a>) an instance of <code class="docutils literal notranslate"><span class="pre">torchreid.data.ImageDataManager</span></code>
or <code class="docutils literal notranslate"><span class="pre">torchreid.data.VideoDataManager</span></code>.</li>
<li><strong>model</strong> (<em>nn.Module</em>) model instance.</li>
<li><strong>optimizer</strong> (<em>Optimizer</em>) an Optimizer.</li>
<li><strong>scheduler</strong> (<em>LRScheduler</em><em>, </em><em>optional</em>) if None, no learning rate decay will be performed.</li>
<li><strong>use_cpu</strong> (<em>bool</em><em>, </em><em>optional</em>) use cpu. Default is False.</li>
<li><strong>label_smooth</strong> (<em>bool</em><em>, </em><em>optional</em>) use label smoothing regularizer. Default is True.</li>
<li><strong>pooling_method</strong> (<em>str</em><em>, </em><em>optional</em>) how to pool features for a tracklet.
Default is “avg” (average). Choices are [“avg”, “max”].</li>
</ul>
</dd>
</dl>
</td>
</tr>
</tbody>
</table>
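<p>Tracklet pooling collapses frame-level features into a single clip-level feature; conceptually (a sketch, assuming frame features of shape <code class="docutils literal notranslate"><span class="pre">(batch, seq_len, dim)</span></code>):</p>
<div class="highlight-python notranslate"><div class="highlight"><pre><span></span># 'avg' pools over the temporal dimension; 'max' keeps the per-dim maximum
clip_feat = frame_feat.mean(dim=1)        # pooling_method='avg'
clip_feat = frame_feat.max(dim=1)[0]      # pooling_method='max'
</pre></div>
</div>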
<p>Examples:</p>
<div class="highlight-default notranslate"><div class="highlight"><pre><span></span><span class="kn">import</span> <span class="nn">torch</span>
<span class="kn">import</span> <span class="nn">torchreid</span>
@@ -492,26 +513,30 @@ Default is “avg” (average). Choices are [“avg”, “max”].</p></li>
<dl class="class">
<dt id="torchreid.engine.video.triplet.VideoTripletEngine">
<em class="property">class </em><code class="sig-prename descclassname">torchreid.engine.video.triplet.</code><code class="sig-name descname">VideoTripletEngine</code><span class="sig-paren">(</span><em class="sig-param">datamanager</em>, <em class="sig-param">model</em>, <em class="sig-param">optimizer</em>, <em class="sig-param">margin=0.3</em>, <em class="sig-param">weight_t=1</em>, <em class="sig-param">weight_x=1</em>, <em class="sig-param">scheduler=None</em>, <em class="sig-param">use_cpu=False</em>, <em class="sig-param">label_smooth=True</em>, <em class="sig-param">pooling_method='avg'</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/engine/video/triplet.html#VideoTripletEngine"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.engine.video.triplet.VideoTripletEngine" title="Permalink to this definition"></a></dt>
<em class="property">class </em><code class="descclassname">torchreid.engine.video.triplet.</code><code class="descname">VideoTripletEngine</code><span class="sig-paren">(</span><em>datamanager</em>, <em>model</em>, <em>optimizer</em>, <em>margin=0.3</em>, <em>weight_t=1</em>, <em>weight_x=1</em>, <em>scheduler=None</em>, <em>use_cpu=False</em>, <em>label_smooth=True</em>, <em>pooling_method='avg'</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/engine/video/triplet.html#VideoTripletEngine"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.engine.video.triplet.VideoTripletEngine" title="Permalink to this definition"></a></dt>
<dd><p>Triplet-loss engine for video-reid.</p>
<dl class="field-list simple">
<dt class="field-odd">Parameters</dt>
<dd class="field-odd"><ul class="simple">
<li><p><strong>datamanager</strong> (<a class="reference internal" href="data.html#torchreid.data.datamanager.DataManager" title="torchreid.data.datamanager.DataManager"><em>DataManager</em></a>) an instance of <code class="docutils literal notranslate"><span class="pre">torchreid.data.ImageDataManager</span></code>
or <code class="docutils literal notranslate"><span class="pre">torchreid.data.VideoDataManager</span></code>.</p></li>
<li><p><strong>model</strong> (<em>nn.Module</em>) model instance.</p></li>
<li><p><strong>optimizer</strong> (<em>Optimizer</em>) an Optimizer.</p></li>
<li><p><strong>margin</strong> (<em>float</em><em>, </em><em>optional</em>) margin for triplet loss. Default is 0.3.</p></li>
<li><p><strong>weight_t</strong> (<em>float</em><em>, </em><em>optional</em>) weight for triplet loss. Default is 1.</p></li>
<li><p><strong>weight_x</strong> (<em>float</em><em>, </em><em>optional</em>) weight for softmax loss. Default is 1.</p></li>
<li><p><strong>scheduler</strong> (<em>LRScheduler</em><em>, </em><em>optional</em>) if None, no learning rate decay will be performed.</p></li>
<li><p><strong>use_cpu</strong> (<em>bool</em><em>, </em><em>optional</em>) use cpu. Default is False.</p></li>
<li><p><strong>label_smooth</strong> (<em>bool</em><em>, </em><em>optional</em>) use label smoothing regularizer. Default is True.</p></li>
<li><p><strong>pooling_method</strong> (<em>str</em><em>, </em><em>optional</em>) how to pool features for a tracklet.
Default is “avg” (average). Choices are [“avg”, “max”].</p></li>
<table class="docutils field-list" frame="void" rules="none">
<col class="field-name" />
<col class="field-body" />
<tbody valign="top">
<tr class="field-odd field"><th class="field-name">Parameters:</th><td class="field-body"><ul class="first last simple">
<li><strong>datamanager</strong> (<a class="reference internal" href="data.html#torchreid.data.datamanager.DataManager" title="torchreid.data.datamanager.DataManager"><em>DataManager</em></a>) an instance of <code class="docutils literal notranslate"><span class="pre">torchreid.data.ImageDataManager</span></code>
or <code class="docutils literal notranslate"><span class="pre">torchreid.data.VideoDataManager</span></code>.</li>
<li><strong>model</strong> (<em>nn.Module</em>) model instance.</li>
<li><strong>optimizer</strong> (<em>Optimizer</em>) an Optimizer.</li>
<li><strong>margin</strong> (<em>float</em><em>, </em><em>optional</em>) margin for triplet loss. Default is 0.3.</li>
<li><strong>weight_t</strong> (<em>float</em><em>, </em><em>optional</em>) weight for triplet loss. Default is 1.</li>
<li><strong>weight_x</strong> (<em>float</em><em>, </em><em>optional</em>) weight for softmax loss. Default is 1.</li>
<li><strong>scheduler</strong> (<em>LRScheduler</em><em>, </em><em>optional</em>) if None, no learning rate decay will be performed.</li>
<li><strong>use_cpu</strong> (<em>bool</em><em>, </em><em>optional</em>) use cpu. Default is False.</li>
<li><strong>label_smooth</strong> (<em>bool</em><em>, </em><em>optional</em>) use label smoothing regularizer. Default is True.</li>
<li><strong>pooling_method</strong> (<em>str</em><em>, </em><em>optional</em>) how to pool features for a tracklet.
Default is “avg” (average). Choices are [“avg”, “max”].</li>
</ul>
</dd>
</dl>
</td>
</tr>
</tbody>
</table>
<p>Examples:</p>
<div class="highlight-default notranslate"><div class="highlight"><pre><span></span><span class="kn">import</span> <span class="nn">torch</span>
<span class="kn">import</span> <span class="nn">torchreid</span>


@@ -178,11 +178,11 @@
<span id="softmax"></span><h2>Softmax<a class="headerlink" href="#module-torchreid.losses.cross_entropy_loss" title="Permalink to this headline"></a></h2>
<dl class="class">
<dt id="torchreid.losses.cross_entropy_loss.CrossEntropyLoss">
<em class="property">class </em><code class="sig-prename descclassname">torchreid.losses.cross_entropy_loss.</code><code class="sig-name descname">CrossEntropyLoss</code><span class="sig-paren">(</span><em class="sig-param">num_classes</em>, <em class="sig-param">epsilon=0.1</em>, <em class="sig-param">use_gpu=True</em>, <em class="sig-param">label_smooth=True</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/losses/cross_entropy_loss.html#CrossEntropyLoss"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.losses.cross_entropy_loss.CrossEntropyLoss" title="Permalink to this definition"></a></dt>
<em class="property">class </em><code class="descclassname">torchreid.losses.cross_entropy_loss.</code><code class="descname">CrossEntropyLoss</code><span class="sig-paren">(</span><em>num_classes</em>, <em>epsilon=0.1</em>, <em>use_gpu=True</em>, <em>label_smooth=True</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/losses/cross_entropy_loss.html#CrossEntropyLoss"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.losses.cross_entropy_loss.CrossEntropyLoss" title="Permalink to this definition"></a></dt>
<dd><p>Cross entropy loss with label smoothing regularizer.</p>
<dl class="simple">
<dt>Reference:</dt><dd><p>Szegedy et al. Rethinking the Inception Architecture for Computer Vision. CVPR 2016.</p>
</dd>
<dl class="docutils">
<dt>Reference:</dt>
<dd>Szegedy et al. Rethinking the Inception Architecture for Computer Vision. CVPR 2016.</dd>
</dl>
<p>With label smoothing, the label <span class="math notranslate nohighlight">\(y\)</span> for a class is computed by</p>
<div class="math notranslate nohighlight">
@@ -191,29 +191,37 @@
\end{equation}\]</div>
<p>where <span class="math notranslate nohighlight">\(K\)</span> denotes the number of classes and <span class="math notranslate nohighlight">\(\epsilon\)</span> is a weight. When
<span class="math notranslate nohighlight">\(\epsilon = 0\)</span>, the loss function reduces to the normal cross entropy.</p>
<dl class="field-list simple">
<dt class="field-odd">Parameters</dt>
<dd class="field-odd"><ul class="simple">
<li><p><strong>num_classes</strong> (<em>int</em>) number of classes.</p></li>
<li><p><strong>epsilon</strong> (<em>float</em><em>, </em><em>optional</em>) weight. Default is 0.1.</p></li>
<li><p><strong>use_gpu</strong> (<em>bool</em><em>, </em><em>optional</em>) whether to use gpu devices. Default is True.</p></li>
<li><p><strong>label_smooth</strong> (<em>bool</em><em>, </em><em>optional</em>) whether to apply label smoothing. Default is True.</p></li>
<table class="docutils field-list" frame="void" rules="none">
<col class="field-name" />
<col class="field-body" />
<tbody valign="top">
<tr class="field-odd field"><th class="field-name">Parameters:</th><td class="field-body"><ul class="first last simple">
<li><strong>num_classes</strong> (<em>int</em>) number of classes.</li>
<li><strong>epsilon</strong> (<em>float</em><em>, </em><em>optional</em>) weight. Default is 0.1.</li>
<li><strong>use_gpu</strong> (<em>bool</em><em>, </em><em>optional</em>) whether to use gpu devices. Default is True.</li>
<li><strong>label_smooth</strong> (<em>bool</em><em>, </em><em>optional</em>) whether to apply label smoothing. Default is True.</li>
</ul>
</dd>
</dl>
</td>
</tr>
</tbody>
</table>
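<p>For instance, with <span class="math notranslate nohighlight">\(K = 5\)</span> and <span class="math notranslate nohighlight">\(\epsilon = 0.1\)</span>, the one-hot target <code class="docutils literal notranslate"><span class="pre">[0, 1, 0, 0, 0]</span></code> becomes <code class="docutils literal notranslate"><span class="pre">[0.02, 0.92, 0.02, 0.02, 0.02]</span></code>. A minimal usage sketch (assuming the class is re-exported from <code class="docutils literal notranslate"><span class="pre">torchreid.losses</span></code>, as the module path suggests):</p>
<div class="highlight-python notranslate"><div class="highlight"><pre><span></span>import torch
from torchreid.losses import CrossEntropyLoss

criterion = CrossEntropyLoss(num_classes=751, epsilon=0.1, use_gpu=False)

outputs = torch.randn(32, 751)           # raw logits (before softmax)
targets = torch.randint(0, 751, (32,))   # ground-truth label indices
loss = criterion(outputs, targets)
</pre></div>
</div>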
<dl class="method">
<dt id="torchreid.losses.cross_entropy_loss.CrossEntropyLoss.forward">
<code class="sig-name descname">forward</code><span class="sig-paren">(</span><em class="sig-param">inputs</em>, <em class="sig-param">targets</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/losses/cross_entropy_loss.html#CrossEntropyLoss.forward"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.losses.cross_entropy_loss.CrossEntropyLoss.forward" title="Permalink to this definition"></a></dt>
<dd><dl class="field-list simple">
<dt class="field-odd">Parameters</dt>
<dd class="field-odd"><ul class="simple">
<li><p><strong>inputs</strong> (<em>torch.Tensor</em>) prediction matrix (before softmax) with
shape (batch_size, num_classes).</p></li>
<li><p><strong>targets</strong> (<em>torch.LongTensor</em>) ground truth labels with shape (batch_size).
Each position contains the label index.</p></li>
<code class="descname">forward</code><span class="sig-paren">(</span><em>inputs</em>, <em>targets</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/losses/cross_entropy_loss.html#CrossEntropyLoss.forward"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.losses.cross_entropy_loss.CrossEntropyLoss.forward" title="Permalink to this definition"></a></dt>
<dd><table class="docutils field-list" frame="void" rules="none">
<col class="field-name" />
<col class="field-body" />
<tbody valign="top">
<tr class="field-odd field"><th class="field-name">Parameters:</th><td class="field-body"><ul class="first last simple">
<li><strong>inputs</strong> (<em>torch.Tensor</em>) prediction matrix (before softmax) with
shape (batch_size, num_classes).</li>
<li><strong>targets</strong> (<em>torch.LongTensor</em>) ground truth labels with shape (batch_size).
Each position contains the label index.</li>
</ul>
</dd>
</dl>
</td>
</tr>
</tbody>
</table>
</dd></dl>
</dd></dl>
@@ -223,29 +231,36 @@ Each position contains the label index.</p></li>
<span id="triplet"></span><h2>Triplet<a class="headerlink" href="#module-torchreid.losses.hard_mine_triplet_loss" title="Permalink to this headline"></a></h2>
<dl class="class">
<dt id="torchreid.losses.hard_mine_triplet_loss.TripletLoss">
<em class="property">class </em><code class="sig-prename descclassname">torchreid.losses.hard_mine_triplet_loss.</code><code class="sig-name descname">TripletLoss</code><span class="sig-paren">(</span><em class="sig-param">margin=0.3</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/losses/hard_mine_triplet_loss.html#TripletLoss"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.losses.hard_mine_triplet_loss.TripletLoss" title="Permalink to this definition"></a></dt>
<em class="property">class </em><code class="descclassname">torchreid.losses.hard_mine_triplet_loss.</code><code class="descname">TripletLoss</code><span class="sig-paren">(</span><em>margin=0.3</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/losses/hard_mine_triplet_loss.html#TripletLoss"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.losses.hard_mine_triplet_loss.TripletLoss" title="Permalink to this definition"></a></dt>
<dd><p>Triplet loss with hard positive/negative mining.</p>
<dl class="simple">
<dt>Reference:</dt><dd><p>Hermans et al. In Defense of the Triplet Loss for Person Re-Identification. arXiv:1703.07737.</p>
</dd>
<dl class="docutils">
<dt>Reference:</dt>
<dd>Hermans et al. In Defense of the Triplet Loss for Person Re-Identification. arXiv:1703.07737.</dd>
</dl>
<p>Imported from <a class="reference external" href="https://github.com/Cysu/open-reid/blob/master/reid/loss/triplet.py">https://github.com/Cysu/open-reid/blob/master/reid/loss/triplet.py</a>.</p>
<dl class="field-list simple">
<dt class="field-odd">Parameters</dt>
<dd class="field-odd"><p><strong>margin</strong> (<em>float</em><em>, </em><em>optional</em>) margin for triplet. Default is 0.3.</p>
</dd>
</dl>
<table class="docutils field-list" frame="void" rules="none">
<col class="field-name" />
<col class="field-body" />
<tbody valign="top">
<tr class="field-odd field"><th class="field-name">Parameters:</th><td class="field-body"><strong>margin</strong> (<em>float</em><em>, </em><em>optional</em>) margin for triplet. Default is 0.3.</td>
</tr>
</tbody>
</table>
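<p>A minimal usage sketch; note that hard mining presupposes several images per identity in the batch (e.g. a P×K sampler), otherwise no positive pairs exist:</p>
<div class="highlight-python notranslate"><div class="highlight"><pre><span></span>import torch
from torchreid.losses import TripletLoss

criterion = TripletLoss(margin=0.3)

# e.g. P=8 identities x K=4 instances = batch of 32
features = torch.randn(32, 2048)              # embedding vectors
pids = torch.arange(8).repeat_interleave(4)   # identity labels per sample
loss = criterion(features, pids)
</pre></div>
</div>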
<dl class="method">
<dt id="torchreid.losses.hard_mine_triplet_loss.TripletLoss.forward">
<code class="sig-name descname">forward</code><span class="sig-paren">(</span><em class="sig-param">inputs</em>, <em class="sig-param">targets</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/losses/hard_mine_triplet_loss.html#TripletLoss.forward"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.losses.hard_mine_triplet_loss.TripletLoss.forward" title="Permalink to this definition"></a></dt>
<dd><dl class="field-list simple">
<dt class="field-odd">Parameters</dt>
<dd class="field-odd"><ul class="simple">
<li><p><strong>inputs</strong> (<em>torch.Tensor</em>) feature matrix with shape (batch_size, feat_dim).</p></li>
<li><p><strong>targets</strong> (<em>torch.LongTensor</em>) ground truth labels with shape (batch_size).</p></li>
<code class="descname">forward</code><span class="sig-paren">(</span><em>inputs</em>, <em>targets</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/losses/hard_mine_triplet_loss.html#TripletLoss.forward"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.losses.hard_mine_triplet_loss.TripletLoss.forward" title="Permalink to this definition"></a></dt>
<dd><table class="docutils field-list" frame="void" rules="none">
<col class="field-name" />
<col class="field-body" />
<tbody valign="top">
<tr class="field-odd field"><th class="field-name">Parameters:</th><td class="field-body"><ul class="first last simple">
<li><strong>inputs</strong> (<em>torch.Tensor</em>) feature matrix with shape (batch_size, feat_dim).</li>
<li><strong>targets</strong> (<em>torch.LongTensor</em>) ground truth labels with shape (batch_size).</li>
</ul>
</dd>
</dl>
</td>
</tr>
</tbody>
</table>
</dd></dl>
</dd></dl>


@@ -179,26 +179,31 @@
<span id="distance"></span><h2>Distance<a class="headerlink" href="#module-torchreid.metrics.distance" title="Permalink to this headline"></a></h2>
<dl class="function">
<dt id="torchreid.metrics.distance.compute_distance_matrix">
<code class="sig-prename descclassname">torchreid.metrics.distance.</code><code class="sig-name descname">compute_distance_matrix</code><span class="sig-paren">(</span><em class="sig-param">input1</em>, <em class="sig-param">input2</em>, <em class="sig-param">metric='euclidean'</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/metrics/distance.html#compute_distance_matrix"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.metrics.distance.compute_distance_matrix" title="Permalink to this definition"></a></dt>
<code class="descclassname">torchreid.metrics.distance.</code><code class="descname">compute_distance_matrix</code><span class="sig-paren">(</span><em>input1</em>, <em>input2</em>, <em>metric='euclidean'</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/metrics/distance.html#compute_distance_matrix"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.metrics.distance.compute_distance_matrix" title="Permalink to this definition"></a></dt>
<dd><p>A wrapper function for computing a distance matrix.</p>
<dl class="field-list simple">
<dt class="field-odd">Parameters</dt>
<dd class="field-odd"><ul class="simple">
<li><p><strong>input1</strong> (<em>torch.Tensor</em>) 2-D feature matrix.</p></li>
<li><p><strong>input2</strong> (<em>torch.Tensor</em>) 2-D feature matrix.</p></li>
<li><p><strong>metric</strong> (<em>str</em><em>, </em><em>optional</em>) “euclidean” or “cosine”.
Default is “euclidean”.</p></li>
<table class="docutils field-list" frame="void" rules="none">
<col class="field-name" />
<col class="field-body" />
<tbody valign="top">
<tr class="field-odd field"><th class="field-name">Parameters:</th><td class="field-body"><ul class="first simple">
<li><strong>input1</strong> (<em>torch.Tensor</em>) 2-D feature matrix.</li>
<li><strong>input2</strong> (<em>torch.Tensor</em>) 2-D feature matrix.</li>
<li><strong>metric</strong> (<em>str</em><em>, </em><em>optional</em>) “euclidean” or “cosine”.
Default is “euclidean”.</li>
</ul>
</dd>
<dt class="field-even">Returns</dt>
<dd class="field-even"><p>distance matrix.</p>
</dd>
<dt class="field-odd">Return type</dt>
<dd class="field-odd"><p>torch.Tensor</p>
</dd>
</dl>
<dl>
<dt>Examples:</dt><dd><div class="highlight-default notranslate"><div class="highlight"><pre><span></span><span class="gp">&gt;&gt;&gt; </span><span class="kn">from</span> <span class="nn">torchreid</span> <span class="k">import</span> <span class="n">metrics</span>
</td>
</tr>
<tr class="field-even field"><th class="field-name">Returns:</th><td class="field-body"><p class="first">distance matrix.</p>
</td>
</tr>
<tr class="field-odd field"><th class="field-name">Return type:</th><td class="field-body"><p class="first last">torch.Tensor</p>
</td>
</tr>
</tbody>
</table>
<dl class="docutils">
<dt>Examples:</dt>
<dd><div class="first last highlight-default notranslate"><div class="highlight"><pre><span></span><span class="gp">&gt;&gt;&gt; </span><span class="kn">from</span> <span class="nn">torchreid</span> <span class="k">import</span> <span class="n">metrics</span>
<span class="gp">&gt;&gt;&gt; </span><span class="n">input1</span> <span class="o">=</span> <span class="n">torch</span><span class="o">.</span><span class="n">rand</span><span class="p">(</span><span class="mi">10</span><span class="p">,</span> <span class="mi">2048</span><span class="p">)</span>
<span class="gp">&gt;&gt;&gt; </span><span class="n">input2</span> <span class="o">=</span> <span class="n">torch</span><span class="o">.</span><span class="n">rand</span><span class="p">(</span><span class="mi">100</span><span class="p">,</span> <span class="mi">2048</span><span class="p">)</span>
<span class="gp">&gt;&gt;&gt; </span><span class="n">distmat</span> <span class="o">=</span> <span class="n">metrics</span><span class="o">.</span><span class="n">compute_distance_matrix</span><span class="p">(</span><span class="n">input1</span><span class="p">,</span> <span class="n">input2</span><span class="p">)</span>
@@ -211,42 +216,50 @@ Default is “euclidean”.</p></li>
<dl class="function">
<dt id="torchreid.metrics.distance.cosine_distance">
<code class="sig-prename descclassname">torchreid.metrics.distance.</code><code class="sig-name descname">cosine_distance</code><span class="sig-paren">(</span><em class="sig-param">input1</em>, <em class="sig-param">input2</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/metrics/distance.html#cosine_distance"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.metrics.distance.cosine_distance" title="Permalink to this definition"></a></dt>
<code class="descclassname">torchreid.metrics.distance.</code><code class="descname">cosine_distance</code><span class="sig-paren">(</span><em>input1</em>, <em>input2</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/metrics/distance.html#cosine_distance"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.metrics.distance.cosine_distance" title="Permalink to this definition"></a></dt>
<dd><p>Computes cosine distance.</p>
<dl class="field-list simple">
<dt class="field-odd">Parameters</dt>
<dd class="field-odd"><ul class="simple">
<li><p><strong>input1</strong> (<em>torch.Tensor</em>) 2-D feature matrix.</p></li>
<li><p><strong>input2</strong> (<em>torch.Tensor</em>) 2-D feature matrix.</p></li>
<table class="docutils field-list" frame="void" rules="none">
<col class="field-name" />
<col class="field-body" />
<tbody valign="top">
<tr class="field-odd field"><th class="field-name">Parameters:</th><td class="field-body"><ul class="first simple">
<li><strong>input1</strong> (<em>torch.Tensor</em>) 2-D feature matrix.</li>
<li><strong>input2</strong> (<em>torch.Tensor</em>) 2-D feature matrix.</li>
</ul>
</dd>
<dt class="field-even">Returns</dt>
<dd class="field-even"><p>distance matrix.</p>
</dd>
<dt class="field-odd">Return type</dt>
<dd class="field-odd"><p>torch.Tensor</p>
</dd>
</dl>
</td>
</tr>
<tr class="field-even field"><th class="field-name">Returns:</th><td class="field-body"><p class="first">distance matrix.</p>
</td>
</tr>
<tr class="field-odd field"><th class="field-name">Return type:</th><td class="field-body"><p class="first last">torch.Tensor</p>
</td>
</tr>
</tbody>
</table>
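<p>A minimal usage sketch (assuming the returned matrix follows the common convention where each entry equals 1 - cosine similarity between the corresponding rows; the description above does not state this explicitly):</p>
<div class="highlight-default notranslate"><div class="highlight"><pre><span></span>&gt;&gt;&gt; import torch
&gt;&gt;&gt; from torchreid.metrics.distance import cosine_distance
&gt;&gt;&gt; input1 = torch.rand(10, 2048)
&gt;&gt;&gt; input2 = torch.rand(100, 2048)
&gt;&gt;&gt; distmat = cosine_distance(input1, input2)
&gt;&gt;&gt; distmat.size()  # torch.Size([10, 100])
</pre></div>
</div>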
</dd></dl>
<dl class="function">
<dt id="torchreid.metrics.distance.euclidean_squared_distance">
<code class="sig-prename descclassname">torchreid.metrics.distance.</code><code class="sig-name descname">euclidean_squared_distance</code><span class="sig-paren">(</span><em class="sig-param">input1</em>, <em class="sig-param">input2</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/metrics/distance.html#euclidean_squared_distance"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.metrics.distance.euclidean_squared_distance" title="Permalink to this definition"></a></dt>
<code class="descclassname">torchreid.metrics.distance.</code><code class="descname">euclidean_squared_distance</code><span class="sig-paren">(</span><em>input1</em>, <em>input2</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/metrics/distance.html#euclidean_squared_distance"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.metrics.distance.euclidean_squared_distance" title="Permalink to this definition"></a></dt>
<dd><p>Computes euclidean squared distance.</p>
<dl class="field-list simple">
<dt class="field-odd">Parameters</dt>
<dd class="field-odd"><ul class="simple">
<li><p><strong>input1</strong> (<em>torch.Tensor</em>) 2-D feature matrix.</p></li>
<li><p><strong>input2</strong> (<em>torch.Tensor</em>) 2-D feature matrix.</p></li>
<table class="docutils field-list" frame="void" rules="none">
<col class="field-name" />
<col class="field-body" />
<tbody valign="top">
<tr class="field-odd field"><th class="field-name">Parameters:</th><td class="field-body"><ul class="first simple">
<li><strong>input1</strong> (<em>torch.Tensor</em>) – 2-D feature matrix.</li>
<li><strong>input2</strong> (<em>torch.Tensor</em>) – 2-D feature matrix.</li>
</ul>
</td>
</tr>
<tr class="field-even field"><th class="field-name">Returns:</th><td class="field-body"><p class="first">distance matrix.</p>
</td>
</tr>
<tr class="field-odd field"><th class="field-name">Return type:</th><td class="field-body"><p class="first last">torch.Tensor</p>
</td>
</tr>
</tbody>
</table>
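<p>A usage sketch; the cross-check against <code class="docutils literal notranslate"><span class="pre">torch.cdist</span></code> assumes the function returns squared L2 distances, which the description above implies but does not spell out:</p>
<div class="highlight-default notranslate"><div class="highlight"><pre><span></span>&gt;&gt;&gt; import torch
&gt;&gt;&gt; from torchreid.metrics.distance import euclidean_squared_distance
&gt;&gt;&gt; input1 = torch.rand(10, 2048)
&gt;&gt;&gt; input2 = torch.rand(100, 2048)
&gt;&gt;&gt; distmat = euclidean_squared_distance(input1, input2)  # (10, 100)
&gt;&gt;&gt; torch.allclose(distmat, torch.cdist(input1, input2).pow(2),
...                rtol=1e-3, atol=1e-3)  # expected True under this assumption
</pre></div>
</div>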
</dd></dl>
</div>
@@ -254,27 +267,32 @@ Default is “euclidean”.</p></li>
<span id="accuracy"></span><h2>Accuracy<a class="headerlink" href="#module-torchreid.metrics.accuracy" title="Permalink to this headline"></a></h2>
<dl class="function">
<dt id="torchreid.metrics.accuracy.accuracy">
<code class="sig-prename descclassname">torchreid.metrics.accuracy.</code><code class="sig-name descname">accuracy</code><span class="sig-paren">(</span><em class="sig-param">output</em>, <em class="sig-param">target</em>, <em class="sig-param">topk=(1</em>, <em class="sig-param">)</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/metrics/accuracy.html#accuracy"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.metrics.accuracy.accuracy" title="Permalink to this definition"></a></dt>
<code class="descclassname">torchreid.metrics.accuracy.</code><code class="descname">accuracy</code><span class="sig-paren">(</span><em>output</em>, <em>target</em>, <em>topk=(1</em>, <em>)</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/metrics/accuracy.html#accuracy"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.metrics.accuracy.accuracy" title="Permalink to this definition"></a></dt>
<dd><p>Computes the accuracy over the k top predictions for
the specified values of k.</p>
<dl class="field-list simple">
<dt class="field-odd">Parameters</dt>
<dd class="field-odd"><ul class="simple">
<li><p><strong>output</strong> (<em>torch.Tensor</em>) prediction matrix with shape (batch_size, num_classes).</p></li>
<li><p><strong>target</strong> (<em>torch.LongTensor</em>) ground truth labels with shape (batch_size).</p></li>
<li><p><strong>topk</strong> (<em>tuple</em><em>, </em><em>optional</em>) accuracy at top-k will be computed. For example,
topk=(1, 5) means accuracy at top-1 and top-5 will be computed.</p></li>
<table class="docutils field-list" frame="void" rules="none">
<col class="field-name" />
<col class="field-body" />
<tbody valign="top">
<tr class="field-odd field"><th class="field-name">Parameters:</th><td class="field-body"><ul class="first simple">
<li><strong>output</strong> (<em>torch.Tensor</em>) – prediction matrix with shape (batch_size, num_classes).</li>
<li><strong>target</strong> (<em>torch.LongTensor</em>) – ground truth labels with shape (batch_size).</li>
<li><strong>topk</strong> (<em>tuple</em><em>, </em><em>optional</em>) – accuracy at top-k will be computed. For example,
topk=(1, 5) means accuracy at top-1 and top-5 will be computed.</li>
</ul>
</td>
</tr>
<tr class="field-even field"><th class="field-name">Returns:</th><td class="field-body"><p class="first">accuracy at top-k.</p>
</td>
</tr>
<tr class="field-odd field"><th class="field-name">Return type:</th><td class="field-body"><p class="first last">list</p>
</td>
</tr>
</tbody>
</table>
<dl class="docutils">
<dt>Examples:</dt>
<dd><div class="first last highlight-default notranslate"><div class="highlight"><pre><span></span><span class="gp">&gt;&gt;&gt; </span><span class="kn">from</span> <span class="nn">torchreid</span> <span class="k">import</span> <span class="n">metrics</span>
<span class="gp">&gt;&gt;&gt; </span><span class="n">metrics</span><span class="o">.</span><span class="n">accuracy</span><span class="p">(</span><span class="n">output</span><span class="p">,</span> <span class="n">target</span><span class="p">)</span>
</pre></div>
</div>
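<p>A self-contained variant showing the optional <code class="docutils literal notranslate"><span class="pre">topk</span></code> argument; the random tensors and the 751-class setup are placeholders, and unpacking two values follows from the list return type documented above:</p>
<div class="highlight-default notranslate"><div class="highlight"><pre><span></span>&gt;&gt;&gt; import torch
&gt;&gt;&gt; from torchreid import metrics
&gt;&gt;&gt; output = torch.rand(32, 751)           # (batch_size, num_classes)
&gt;&gt;&gt; target = torch.randint(0, 751, (32,))  # (batch_size,)
&gt;&gt;&gt; acc1, acc5 = metrics.accuracy(output, target, topk=(1, 5))
</pre></div>
</div>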
@@ -287,29 +305,33 @@ topk=(1, 5) means accuracy at top-1 and top-5 will be computed.</p></li>
<span id="rank"></span><h2>Rank<a class="headerlink" href="#module-torchreid.metrics.rank" title="Permalink to this headline"></a></h2>
<dl class="function">
<dt id="torchreid.metrics.rank.evaluate_rank">
<code class="sig-prename descclassname">torchreid.metrics.rank.</code><code class="sig-name descname">evaluate_rank</code><span class="sig-paren">(</span><em class="sig-param">distmat</em>, <em class="sig-param">q_pids</em>, <em class="sig-param">g_pids</em>, <em class="sig-param">q_camids</em>, <em class="sig-param">g_camids</em>, <em class="sig-param">max_rank=50</em>, <em class="sig-param">use_metric_cuhk03=False</em>, <em class="sig-param">use_cython=True</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/metrics/rank.html#evaluate_rank"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.metrics.rank.evaluate_rank" title="Permalink to this definition"></a></dt>
<code class="descclassname">torchreid.metrics.rank.</code><code class="descname">evaluate_rank</code><span class="sig-paren">(</span><em>distmat</em>, <em>q_pids</em>, <em>g_pids</em>, <em>q_camids</em>, <em>g_camids</em>, <em>max_rank=50</em>, <em>use_metric_cuhk03=False</em>, <em>use_cython=True</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/metrics/rank.html#evaluate_rank"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.metrics.rank.evaluate_rank" title="Permalink to this definition"></a></dt>
<dd><p>Evaluates CMC rank.</p>
<dl class="field-list simple">
<dt class="field-odd">Parameters</dt>
<dd class="field-odd"><ul class="simple">
<li><p><strong>distmat</strong> (<em>numpy.ndarray</em>) distance matrix of shape (num_query, num_gallery).</p></li>
<li><p><strong>q_pids</strong> (<em>numpy.ndarray</em>) 1-D array containing person identities
of each query instance.</p></li>
<li><p><strong>g_pids</strong> (<em>numpy.ndarray</em>) 1-D array containing person identities
of each gallery instance.</p></li>
<li><p><strong>q_camids</strong> (<em>numpy.ndarray</em>) 1-D array containing camera views under
which each query instance is captured.</p></li>
<li><p><strong>g_camids</strong> (<em>numpy.ndarray</em>) 1-D array containing camera views under
which each gallery instance is captured.</p></li>
<li><p><strong>max_rank</strong> (<em>int</em><em>, </em><em>optional</em>) maximum CMC rank to be computed. Default is 50.</p></li>
<li><p><strong>use_metric_cuhk03</strong> (<em>bool</em><em>, </em><em>optional</em>) use single-gallery-shot setting for cuhk03.
Default is False. This should be enabled when using cuhk03 classic split.</p></li>
<li><p><strong>use_cython</strong> (<em>bool</em><em>, </em><em>optional</em>) use cython code for evaluation. Default is True.
<table class="docutils field-list" frame="void" rules="none">
<col class="field-name" />
<col class="field-body" />
<tbody valign="top">
<tr class="field-odd field"><th class="field-name">Parameters:</th><td class="field-body"><ul class="first last simple">
<li><strong>distmat</strong> (<em>numpy.ndarray</em>) – distance matrix of shape (num_query, num_gallery).</li>
<li><strong>q_pids</strong> (<em>numpy.ndarray</em>) – 1-D array containing person identities
of each query instance.</li>
<li><strong>g_pids</strong> (<em>numpy.ndarray</em>) – 1-D array containing person identities
of each gallery instance.</li>
<li><strong>q_camids</strong> (<em>numpy.ndarray</em>) – 1-D array containing camera views under
which each query instance is captured.</li>
<li><strong>g_camids</strong> (<em>numpy.ndarray</em>) – 1-D array containing camera views under
which each gallery instance is captured.</li>
<li><strong>max_rank</strong> (<em>int</em><em>, </em><em>optional</em>) – maximum CMC rank to be computed. Default is 50.</li>
<li><strong>use_metric_cuhk03</strong> (<em>bool</em><em>, </em><em>optional</em>) – use single-gallery-shot setting for cuhk03.
Default is False. This should be enabled when using cuhk03 classic split.</li>
<li><strong>use_cython</strong> (<em>bool</em><em>, </em><em>optional</em>) – use cython code for evaluation. Default is True.
This is highly recommended as the cython code can speed up the cmc computation
by more than 10x. This requires Cython to be installed.</li>
</ul>
</td>
</tr>
</tbody>
</table>
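<p>A runnable sketch with synthetic inputs. The two-element return value (CMC curve plus mAP) is an assumption based on common CMC evaluation code and is not documented above; <code class="docutils literal notranslate"><span class="pre">use_cython=False</span></code> keeps the example free of the compiled extension:</p>
<div class="highlight-default notranslate"><div class="highlight"><pre><span></span>&gt;&gt;&gt; import numpy as np
&gt;&gt;&gt; from torchreid.metrics.rank import evaluate_rank
&gt;&gt;&gt; num_q, num_g = 10, 100
&gt;&gt;&gt; distmat = np.random.rand(num_q, num_g)
&gt;&gt;&gt; q_pids = np.random.randint(0, 5, size=num_q)
&gt;&gt;&gt; g_pids = np.random.randint(0, 5, size=num_g)
&gt;&gt;&gt; q_camids = np.random.randint(0, 2, size=num_q)
&gt;&gt;&gt; g_camids = np.random.randint(0, 2, size=num_g)
&gt;&gt;&gt; cmc, mAP = evaluate_rank(distmat, q_pids, g_pids, q_camids, g_camids,
...                          max_rank=10, use_cython=False)
&gt;&gt;&gt; cmc[0]  # rank-1 matching rate
</pre></div>
</div>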
</dd></dl>
</div>

View File

@@ -180,26 +180,31 @@
<span id="interface"></span><h2>Interface<a class="headerlink" href="#module-torchreid.models.__init__" title="Permalink to this headline"></a></h2>
<dl class="function">
<dt id="torchreid.models.__init__.build_model">
<code class="sig-prename descclassname">torchreid.models.__init__.</code><code class="sig-name descname">build_model</code><span class="sig-paren">(</span><em class="sig-param">name</em>, <em class="sig-param">num_classes</em>, <em class="sig-param">loss='softmax'</em>, <em class="sig-param">pretrained=True</em>, <em class="sig-param">use_gpu=True</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/models/__init__.html#build_model"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.models.__init__.build_model" title="Permalink to this definition"></a></dt>
<code class="descclassname">torchreid.models.__init__.</code><code class="descname">build_model</code><span class="sig-paren">(</span><em>name</em>, <em>num_classes</em>, <em>loss='softmax'</em>, <em>pretrained=True</em>, <em>use_gpu=True</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/models/__init__.html#build_model"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.models.__init__.build_model" title="Permalink to this definition"></a></dt>
<dd><p>A function wrapper for building a model.</p>
<dl class="field-list simple">
<dt class="field-odd">Parameters</dt>
<dd class="field-odd"><ul class="simple">
<li><p><strong>name</strong> (<em>str</em>) model name.</p></li>
<li><p><strong>num_classes</strong> (<em>int</em>) number of training identities.</p></li>
<li><p><strong>loss</strong> (<em>str</em><em>, </em><em>optional</em>) loss function to optimize the model. Currently
supports “softmax” and “triplet”. Default is “softmax”.</p></li>
<li><p><strong>pretrained</strong> (<em>bool</em><em>, </em><em>optional</em>) whether to load ImageNet-pretrained weights.
Default is True.</p></li>
<li><p><strong>use_gpu</strong> (<em>bool</em><em>, </em><em>optional</em>) whether to use gpu. Default is True.</p></li>
<table class="docutils field-list" frame="void" rules="none">
<col class="field-name" />
<col class="field-body" />
<tbody valign="top">
<tr class="field-odd field"><th class="field-name">Parameters:</th><td class="field-body"><ul class="first simple">
<li><strong>name</strong> (<em>str</em>) – model name.</li>
<li><strong>num_classes</strong> (<em>int</em>) – number of training identities.</li>
<li><strong>loss</strong> (<em>str</em><em>, </em><em>optional</em>) – loss function to optimize the model. Currently
supports “softmax” and “triplet”. Default is “softmax”.</li>
<li><strong>pretrained</strong> (<em>bool</em><em>, </em><em>optional</em>) – whether to load ImageNet-pretrained weights.
Default is True.</li>
<li><strong>use_gpu</strong> (<em>bool</em><em>, </em><em>optional</em>) – whether to use gpu. Default is True.</li>
</ul>
</td>
</tr>
<tr class="field-even field"><th class="field-name">Returns:</th><td class="field-body"><p class="first last">nn.Module</p>
</td>
</tr>
</tbody>
</table>
<dl class="docutils">
<dt>Examples:</dt>
<dd><div class="first last highlight-default notranslate"><div class="highlight"><pre><span></span><span class="gp">&gt;&gt;&gt; </span><span class="kn">from</span> <span class="nn">torchreid</span> <span class="k">import</span> <span class="n">models</span>
<span class="gp">&gt;&gt;&gt; </span><span class="n">model</span> <span class="o">=</span> <span class="n">models</span><span class="o">.</span><span class="n">build_model</span><span class="p">(</span><span class="s1">&#39;resnet50&#39;</span><span class="p">,</span> <span class="mi">751</span><span class="p">,</span> <span class="n">loss</span><span class="o">=</span><span class="s1">&#39;softmax&#39;</span><span class="p">)</span>
</pre></div>
</div>
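<p>The optional arguments can be combined, e.g. a randomly initialized CPU model with the triplet loss head (both options are documented above):</p>
<div class="highlight-default notranslate"><div class="highlight"><pre><span></span>&gt;&gt;&gt; model = models.build_model(&#39;resnet50&#39;, 751, loss=&#39;triplet&#39;,
...                            pretrained=False, use_gpu=False)
</pre></div>
</div>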
@@ -209,10 +214,11 @@ Default is True.</p></li>
<dl class="function">
<dt id="torchreid.models.__init__.show_avai_models">
<code class="sig-prename descclassname">torchreid.models.__init__.</code><code class="sig-name descname">show_avai_models</code><span class="sig-paren">(</span><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/models/__init__.html#show_avai_models"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.models.__init__.show_avai_models" title="Permalink to this definition"></a></dt>
<code class="descclassname">torchreid.models.__init__.</code><code class="descname">show_avai_models</code><span class="sig-paren">(</span><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/models/__init__.html#show_avai_models"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.models.__init__.show_avai_models" title="Permalink to this definition"></a></dt>
<dd><p>Displays available models.</p>
<dl class="docutils">
<dt>Examples:</dt>
<dd><div class="first last highlight-default notranslate"><div class="highlight"><pre><span></span><span class="gp">&gt;&gt;&gt; </span><span class="kn">from</span> <span class="nn">torchreid</span> <span class="k">import</span> <span class="n">models</span>
<span class="gp">&gt;&gt;&gt; </span><span class="n">models</span><span class="o">.</span><span class="n">show_avai_models</span><span class="p">()</span>
</pre></div>
</div>
@@ -225,23 +231,25 @@ Default is True.</p></li>
<h2>ImageNet Classification Models<a class="headerlink" href="#imagenet-classification-models" title="Permalink to this headline"></a></h2>
<dl class="class">
<dt id="torchreid.models.resnet.ResNet">
<em class="property">class </em><code class="sig-prename descclassname">torchreid.models.resnet.</code><code class="sig-name descname">ResNet</code><span class="sig-paren">(</span><em class="sig-param">num_classes</em>, <em class="sig-param">loss</em>, <em class="sig-param">block</em>, <em class="sig-param">layers</em>, <em class="sig-param">zero_init_residual=False</em>, <em class="sig-param">groups=1</em>, <em class="sig-param">width_per_group=64</em>, <em class="sig-param">replace_stride_with_dilation=None</em>, <em class="sig-param">norm_layer=None</em>, <em class="sig-param">last_stride=2</em>, <em class="sig-param">fc_dims=None</em>, <em class="sig-param">dropout_p=None</em>, <em class="sig-param">**kwargs</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/models/resnet.html#ResNet"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.models.resnet.ResNet" title="Permalink to this definition"></a></dt>
<em class="property">class </em><code class="descclassname">torchreid.models.resnet.</code><code class="descname">ResNet</code><span class="sig-paren">(</span><em>num_classes</em>, <em>loss</em>, <em>block</em>, <em>layers</em>, <em>zero_init_residual=False</em>, <em>groups=1</em>, <em>width_per_group=64</em>, <em>replace_stride_with_dilation=None</em>, <em>norm_layer=None</em>, <em>last_stride=2</em>, <em>fc_dims=None</em>, <em>dropout_p=None</em>, <em>**kwargs</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/models/resnet.html#ResNet"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.models.resnet.ResNet" title="Permalink to this definition"></a></dt>
<dd><p>Residual network.</p>
<dl class="simple">
<dt>Reference:</dt><dd><ul class="simple">
<li><p>He et al. Deep Residual Learning for Image Recognition. CVPR 2016.</p></li>
<li><p>Xie et al. Aggregated Residual Transformations for Deep Neural Networks. CVPR 2017.</p></li>
<dl class="docutils">
<dt>Reference:</dt>
<dd><ul class="first last simple">
<li>He et al. Deep Residual Learning for Image Recognition. CVPR 2016.</li>
<li>Xie et al. Aggregated Residual Transformations for Deep Neural Networks. CVPR 2017.</li>
</ul>
</dd>
<dt>Public keys:</dt>
<dd><ul class="first last simple">
<li><code class="docutils literal notranslate"><span class="pre">resnet18</span></code>: ResNet18.</li>
<li><code class="docutils literal notranslate"><span class="pre">resnet34</span></code>: ResNet34.</li>
<li><code class="docutils literal notranslate"><span class="pre">resnet50</span></code>: ResNet50.</li>
<li><code class="docutils literal notranslate"><span class="pre">resnet101</span></code>: ResNet101.</li>
<li><code class="docutils literal notranslate"><span class="pre">resnet152</span></code>: ResNet152.</li>
<li><code class="docutils literal notranslate"><span class="pre">resnext50_32x4d</span></code>: ResNeXt50.</li>
<li><code class="docutils literal notranslate"><span class="pre">resnext101_32x8d</span></code>: ResNeXt101.</li>
<li><code class="docutils literal notranslate"><span class="pre">resnet50_fc512</span></code>: ResNet50 + FC.</li>
</ul>
</dd>
</dl>
@@ -249,19 +257,20 @@ Default is True.</p></li>
<dl class="class">
<dt id="torchreid.models.senet.SENet">
<em class="property">class </em><code class="sig-prename descclassname">torchreid.models.senet.</code><code class="sig-name descname">SENet</code><span class="sig-paren">(</span><em class="sig-param">num_classes</em>, <em class="sig-param">loss</em>, <em class="sig-param">block</em>, <em class="sig-param">layers</em>, <em class="sig-param">groups</em>, <em class="sig-param">reduction</em>, <em class="sig-param">dropout_p=0.2</em>, <em class="sig-param">inplanes=128</em>, <em class="sig-param">input_3x3=True</em>, <em class="sig-param">downsample_kernel_size=3</em>, <em class="sig-param">downsample_padding=1</em>, <em class="sig-param">last_stride=2</em>, <em class="sig-param">fc_dims=None</em>, <em class="sig-param">**kwargs</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/models/senet.html#SENet"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.models.senet.SENet" title="Permalink to this definition"></a></dt>
<em class="property">class </em><code class="descclassname">torchreid.models.senet.</code><code class="descname">SENet</code><span class="sig-paren">(</span><em>num_classes</em>, <em>loss</em>, <em>block</em>, <em>layers</em>, <em>groups</em>, <em>reduction</em>, <em>dropout_p=0.2</em>, <em>inplanes=128</em>, <em>input_3x3=True</em>, <em>downsample_kernel_size=3</em>, <em>downsample_padding=1</em>, <em>last_stride=2</em>, <em>fc_dims=None</em>, <em>**kwargs</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/models/senet.html#SENet"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.models.senet.SENet" title="Permalink to this definition"></a></dt>
<dd><p>Squeeze-and-excitation network.</p>
<dl class="simple">
<dt>Reference:</dt><dd><p>Hu et al. Squeeze-and-Excitation Networks. CVPR 2018.</p>
</dd>
<dt>Public keys:</dt><dd><ul class="simple">
<li><p><code class="docutils literal notranslate"><span class="pre">senet154</span></code>: SENet154.</p></li>
<li><p><code class="docutils literal notranslate"><span class="pre">se_resnet50</span></code>: ResNet50 + SE.</p></li>
<li><p><code class="docutils literal notranslate"><span class="pre">se_resnet101</span></code>: ResNet101 + SE.</p></li>
<li><p><code class="docutils literal notranslate"><span class="pre">se_resnet152</span></code>: ResNet152 + SE.</p></li>
<li><p><code class="docutils literal notranslate"><span class="pre">se_resnext50_32x4d</span></code>: ResNeXt50 (groups=32, width=4) + SE.</p></li>
<li><p><code class="docutils literal notranslate"><span class="pre">se_resnext101_32x4d</span></code>: ResNeXt101 (groups=32, width=4) + SE.</p></li>
<li><p><code class="docutils literal notranslate"><span class="pre">se_resnet50_fc512</span></code>: (ResNet50 + SE) + FC.</p></li>
<dl class="docutils">
<dt>Reference:</dt>
<dd>Hu et al. Squeeze-and-Excitation Networks. CVPR 2018.</dd>
<dt>Public keys:</dt>
<dd><ul class="first last simple">
<li><code class="docutils literal notranslate"><span class="pre">senet154</span></code>: SENet154.</li>
<li><code class="docutils literal notranslate"><span class="pre">se_resnet50</span></code>: ResNet50 + SE.</li>
<li><code class="docutils literal notranslate"><span class="pre">se_resnet101</span></code>: ResNet101 + SE.</li>
<li><code class="docutils literal notranslate"><span class="pre">se_resnet152</span></code>: ResNet152 + SE.</li>
<li><code class="docutils literal notranslate"><span class="pre">se_resnext50_32x4d</span></code>: ResNeXt50 (groups=32, width=4) + SE.</li>
<li><code class="docutils literal notranslate"><span class="pre">se_resnext101_32x4d</span></code>: ResNeXt101 (groups=32, width=4) + SE.</li>
<li><code class="docutils literal notranslate"><span class="pre">se_resnet50_fc512</span></code>: (ResNet50 + SE) + FC.</li>
</ul>
</dd>
</dl>
@@ -269,17 +278,18 @@ Default is True.</p></li>
<dl class="class">
<dt id="torchreid.models.densenet.DenseNet">
<em class="property">class </em><code class="sig-prename descclassname">torchreid.models.densenet.</code><code class="sig-name descname">DenseNet</code><span class="sig-paren">(</span><em class="sig-param">num_classes</em>, <em class="sig-param">loss</em>, <em class="sig-param">growth_rate=32</em>, <em class="sig-param">block_config=(6</em>, <em class="sig-param">12</em>, <em class="sig-param">24</em>, <em class="sig-param">16)</em>, <em class="sig-param">num_init_features=64</em>, <em class="sig-param">bn_size=4</em>, <em class="sig-param">drop_rate=0</em>, <em class="sig-param">fc_dims=None</em>, <em class="sig-param">dropout_p=None</em>, <em class="sig-param">**kwargs</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/models/densenet.html#DenseNet"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.models.densenet.DenseNet" title="Permalink to this definition"></a></dt>
<em class="property">class </em><code class="descclassname">torchreid.models.densenet.</code><code class="descname">DenseNet</code><span class="sig-paren">(</span><em>num_classes</em>, <em>loss</em>, <em>growth_rate=32</em>, <em>block_config=(6</em>, <em>12</em>, <em>24</em>, <em>16)</em>, <em>num_init_features=64</em>, <em>bn_size=4</em>, <em>drop_rate=0</em>, <em>fc_dims=None</em>, <em>dropout_p=None</em>, <em>**kwargs</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/models/densenet.html#DenseNet"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.models.densenet.DenseNet" title="Permalink to this definition"></a></dt>
<dd><p>Densely connected network.</p>
<dl class="simple">
<dt>Reference:</dt><dd><p>Huang et al. Densely Connected Convolutional Networks. CVPR 2017.</p>
</dd>
<dt>Public keys:</dt><dd><ul class="simple">
<li><p><code class="docutils literal notranslate"><span class="pre">densenet121</span></code>: DenseNet121.</p></li>
<li><p><code class="docutils literal notranslate"><span class="pre">densenet169</span></code>: DenseNet169.</p></li>
<li><p><code class="docutils literal notranslate"><span class="pre">densenet201</span></code>: DenseNet201.</p></li>
<li><p><code class="docutils literal notranslate"><span class="pre">densenet161</span></code>: DenseNet161.</p></li>
<li><p><code class="docutils literal notranslate"><span class="pre">densenet121_fc512</span></code>: DenseNet121 + FC.</p></li>
<dl class="docutils">
<dt>Reference:</dt>
<dd>Huang et al. Densely Connected Convolutional Networks. CVPR 2017.</dd>
<dt>Public keys:</dt>
<dd><ul class="first last simple">
<li><code class="docutils literal notranslate"><span class="pre">densenet121</span></code>: DenseNet121.</li>
<li><code class="docutils literal notranslate"><span class="pre">densenet169</span></code>: DenseNet169.</li>
<li><code class="docutils literal notranslate"><span class="pre">densenet201</span></code>: DenseNet201.</li>
<li><code class="docutils literal notranslate"><span class="pre">densenet161</span></code>: DenseNet161.</li>
<li><code class="docutils literal notranslate"><span class="pre">densenet121_fc512</span></code>: DenseNet121 + FC.</li>
</ul>
</dd>
</dl>
@@ -287,14 +297,15 @@ Default is True.</p></li>
<dl class="class">
<dt id="torchreid.models.inceptionresnetv2.InceptionResNetV2">
<em class="property">class </em><code class="sig-prename descclassname">torchreid.models.inceptionresnetv2.</code><code class="sig-name descname">InceptionResNetV2</code><span class="sig-paren">(</span><em class="sig-param">num_classes</em>, <em class="sig-param">loss='softmax'</em>, <em class="sig-param">**kwargs</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/models/inceptionresnetv2.html#InceptionResNetV2"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.models.inceptionresnetv2.InceptionResNetV2" title="Permalink to this definition"></a></dt>
<em class="property">class </em><code class="descclassname">torchreid.models.inceptionresnetv2.</code><code class="descname">InceptionResNetV2</code><span class="sig-paren">(</span><em>num_classes</em>, <em>loss='softmax'</em>, <em>**kwargs</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/models/inceptionresnetv2.html#InceptionResNetV2"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.models.inceptionresnetv2.InceptionResNetV2" title="Permalink to this definition"></a></dt>
<dd><p>Inception-ResNet-V2.</p>
<dl class="simple">
<dt>Reference:</dt><dd><p>Szegedy et al. Inception-v4, Inception-ResNet and the Impact of Residual
Connections on Learning. AAAI 2017.</p>
</dd>
<dt>Public keys:</dt><dd><ul class="simple">
<li><p><code class="docutils literal notranslate"><span class="pre">inceptionresnetv2</span></code>: Inception-ResNet-V2.</p></li>
<dl class="docutils">
<dt>Reference:</dt>
<dd>Szegedy et al. Inception-v4, Inception-ResNet and the Impact of Residual
Connections on Learning. AAAI 2017.</dd>
<dt>Public keys:</dt>
<dd><ul class="first last simple">
<li><code class="docutils literal notranslate"><span class="pre">inceptionresnetv2</span></code>: Inception-ResNet-V2.</li>
</ul>
</dd>
</dl>
@@ -302,14 +313,15 @@ Connections on Learning. AAAI 2017.</p>
<dl class="class">
<dt id="torchreid.models.inceptionv4.InceptionV4">
<em class="property">class </em><code class="sig-prename descclassname">torchreid.models.inceptionv4.</code><code class="sig-name descname">InceptionV4</code><span class="sig-paren">(</span><em class="sig-param">num_classes</em>, <em class="sig-param">loss</em>, <em class="sig-param">**kwargs</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/models/inceptionv4.html#InceptionV4"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.models.inceptionv4.InceptionV4" title="Permalink to this definition"></a></dt>
<em class="property">class </em><code class="descclassname">torchreid.models.inceptionv4.</code><code class="descname">InceptionV4</code><span class="sig-paren">(</span><em>num_classes</em>, <em>loss</em>, <em>**kwargs</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/models/inceptionv4.html#InceptionV4"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.models.inceptionv4.InceptionV4" title="Permalink to this definition"></a></dt>
<dd><p>Inception-v4.</p>
<dl class="simple">
<dt>Reference:</dt><dd><p>Szegedy et al. Inception-v4, Inception-ResNet and the Impact of Residual
Connections on Learning. AAAI 2017.</p>
</dd>
<dt>Public keys:</dt><dd><ul class="simple">
<li><p><code class="docutils literal notranslate"><span class="pre">inceptionv4</span></code>: InceptionV4.</p></li>
<dl class="docutils">
<dt>Reference:</dt>
<dd>Szegedy et al. Inception-v4, Inception-ResNet and the Impact of Residual
Connections on Learning. AAAI 2017.</dd>
<dt>Public keys:</dt>
<dd><ul class="first last simple">
<li><code class="docutils literal notranslate"><span class="pre">inceptionv4</span></code>: InceptionV4.</li>
</ul>
</dd>
</dl>
@@ -317,14 +329,15 @@ Connections on Learning. AAAI 2017.</p>
<dl class="class">
<dt id="torchreid.models.xception.Xception">
<em class="property">class </em><code class="sig-prename descclassname">torchreid.models.xception.</code><code class="sig-name descname">Xception</code><span class="sig-paren">(</span><em class="sig-param">num_classes</em>, <em class="sig-param">loss</em>, <em class="sig-param">fc_dims=None</em>, <em class="sig-param">dropout_p=None</em>, <em class="sig-param">**kwargs</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/models/xception.html#Xception"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.models.xception.Xception" title="Permalink to this definition"></a></dt>
<em class="property">class </em><code class="descclassname">torchreid.models.xception.</code><code class="descname">Xception</code><span class="sig-paren">(</span><em>num_classes</em>, <em>loss</em>, <em>fc_dims=None</em>, <em>dropout_p=None</em>, <em>**kwargs</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/models/xception.html#Xception"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.models.xception.Xception" title="Permalink to this definition"></a></dt>
<dd><p>Xception.</p>
<dl class="simple">
<dt>Reference:</dt><dd><p>Chollet. Xception: Deep Learning with Depthwise
Separable Convolutions. CVPR 2017.</p>
</dd>
<dt>Public keys:</dt><dd><ul class="simple">
<li><p><code class="docutils literal notranslate"><span class="pre">xception</span></code>: Xception.</p></li>
<dl class="docutils">
<dt>Reference:</dt>
<dd>Chollet. Xception: Deep Learning with Depthwise
Separable Convolutions. CVPR 2017.</dd>
<dt>Public keys:</dt>
<dd><ul class="first last simple">
<li><code class="docutils literal notranslate"><span class="pre">xception</span></code>: Xception.</li>
</ul>
</dd>
</dl>
@@ -335,14 +348,15 @@ Separable Convolutions. CVPR 2017.</p>
<h2>Lightweight Models<a class="headerlink" href="#lightweight-models" title="Permalink to this headline"></a></h2>
<dl class="class">
<dt id="torchreid.models.nasnet.NASNetAMobile">
<em class="property">class </em><code class="sig-prename descclassname">torchreid.models.nasnet.</code><code class="sig-name descname">NASNetAMobile</code><span class="sig-paren">(</span><em class="sig-param">num_classes</em>, <em class="sig-param">loss</em>, <em class="sig-param">stem_filters=32</em>, <em class="sig-param">penultimate_filters=1056</em>, <em class="sig-param">filters_multiplier=2</em>, <em class="sig-param">**kwargs</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/models/nasnet.html#NASNetAMobile"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.models.nasnet.NASNetAMobile" title="Permalink to this definition"></a></dt>
<em class="property">class </em><code class="descclassname">torchreid.models.nasnet.</code><code class="descname">NASNetAMobile</code><span class="sig-paren">(</span><em>num_classes</em>, <em>loss</em>, <em>stem_filters=32</em>, <em>penultimate_filters=1056</em>, <em>filters_multiplier=2</em>, <em>**kwargs</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/models/nasnet.html#NASNetAMobile"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.models.nasnet.NASNetAMobile" title="Permalink to this definition"></a></dt>
<dd><p>Neural Architecture Search (NAS).</p>
<dl class="simple">
<dt>Reference:</dt><dd><p>Zoph et al. Learning Transferable Architectures
for Scalable Image Recognition. CVPR 2018.</p>
</dd>
<dt>Public keys:</dt><dd><ul class="simple">
<li><p><code class="docutils literal notranslate"><span class="pre">nasnetamobile</span></code>: NASNet-A Mobile.</p></li>
<dl class="docutils">
<dt>Reference:</dt>
<dd>Zoph et al. Learning Transferable Architectures
for Scalable Image Recognition. CVPR 2018.</dd>
<dt>Public keys:</dt>
<dd><ul class="first last simple">
<li><code class="docutils literal notranslate"><span class="pre">nasnetamobile</span></code>: NASNet-A Mobile.</li>
</ul>
</dd>
</dl>
@@ -350,15 +364,16 @@ for Scalable Image Recognition. CVPR 2018.</p>
<dl class="class">
<dt id="torchreid.models.mobilenetv2.MobileNetV2">
<em class="property">class </em><code class="sig-prename descclassname">torchreid.models.mobilenetv2.</code><code class="sig-name descname">MobileNetV2</code><span class="sig-paren">(</span><em class="sig-param">num_classes</em>, <em class="sig-param">width_mult=1</em>, <em class="sig-param">loss='softmax'</em>, <em class="sig-param">fc_dims=None</em>, <em class="sig-param">dropout_p=None</em>, <em class="sig-param">**kwargs</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/models/mobilenetv2.html#MobileNetV2"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.models.mobilenetv2.MobileNetV2" title="Permalink to this definition"></a></dt>
<em class="property">class </em><code class="descclassname">torchreid.models.mobilenetv2.</code><code class="descname">MobileNetV2</code><span class="sig-paren">(</span><em>num_classes</em>, <em>width_mult=1</em>, <em>loss='softmax'</em>, <em>fc_dims=None</em>, <em>dropout_p=None</em>, <em>**kwargs</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/models/mobilenetv2.html#MobileNetV2"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.models.mobilenetv2.MobileNetV2" title="Permalink to this definition"></a></dt>
<dd><p>MobileNetV2.</p>
<dl class="simple">
<dt>Reference:</dt><dd><p>Sandler et al. MobileNetV2: Inverted Residuals and
Linear Bottlenecks. CVPR 2018.</p>
</dd>
<dt>Public keys:</dt><dd><ul class="simple">
<li><p><code class="docutils literal notranslate"><span class="pre">mobilenetv2_x1_0</span></code>: MobileNetV2 x1.0.</p></li>
<li><p><code class="docutils literal notranslate"><span class="pre">mobilenetv2_x1_4</span></code>: MobileNetV2 x1.4.</p></li>
<dl class="docutils">
<dt>Reference:</dt>
<dd>Sandler et al. MobileNetV2: Inverted Residuals and
Linear Bottlenecks. CVPR 2018.</dd>
<dt>Public keys:</dt>
<dd><ul class="first last simple">
<li><code class="docutils literal notranslate"><span class="pre">mobilenetv2_x1_0</span></code>: MobileNetV2 x1.0.</li>
<li><code class="docutils literal notranslate"><span class="pre">mobilenetv2_x1_4</span></code>: MobileNetV2 x1.4.</li>
</ul>
</dd>
</dl>
@@ -366,14 +381,15 @@ Linear Bottlenecks. CVPR 2018.</p>
<dl class="class">
<dt id="torchreid.models.shufflenet.ShuffleNet">
<em class="property">class </em><code class="sig-prename descclassname">torchreid.models.shufflenet.</code><code class="sig-name descname">ShuffleNet</code><span class="sig-paren">(</span><em class="sig-param">num_classes</em>, <em class="sig-param">loss='softmax'</em>, <em class="sig-param">num_groups=3</em>, <em class="sig-param">**kwargs</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/models/shufflenet.html#ShuffleNet"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.models.shufflenet.ShuffleNet" title="Permalink to this definition"></a></dt>
<em class="property">class </em><code class="descclassname">torchreid.models.shufflenet.</code><code class="descname">ShuffleNet</code><span class="sig-paren">(</span><em>num_classes</em>, <em>loss='softmax'</em>, <em>num_groups=3</em>, <em>**kwargs</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/models/shufflenet.html#ShuffleNet"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.models.shufflenet.ShuffleNet" title="Permalink to this definition"></a></dt>
<dd><p>ShuffleNet.</p>
<dl class="simple">
<dt>Reference:</dt><dd><p>Zhang et al. ShuffleNet: An Extremely Efficient Convolutional Neural
Network for Mobile Devices. CVPR 2018.</p>
</dd>
<dt>Public keys:</dt><dd><ul class="simple">
<li><p><code class="docutils literal notranslate"><span class="pre">shufflenet</span></code>: ShuffleNet (groups=3).</p></li>
<dl class="docutils">
<dt>Reference:</dt>
<dd>Zhang et al. ShuffleNet: An Extremely Efficient Convolutional Neural
Network for Mobile Devices. CVPR 2018.</dd>
<dt>Public keys:</dt>
<dd><ul class="first last simple">
<li><code class="docutils literal notranslate"><span class="pre">shufflenet</span></code>: ShuffleNet (groups=3).</li>
</ul>
</dd>
</dl>
@@ -381,16 +397,17 @@ Network for Mobile Devices. CVPR 2018.</p>
<dl class="class">
<dt id="torchreid.models.squeezenet.SqueezeNet">
<em class="property">class </em><code class="sig-prename descclassname">torchreid.models.squeezenet.</code><code class="sig-name descname">SqueezeNet</code><span class="sig-paren">(</span><em class="sig-param">num_classes</em>, <em class="sig-param">loss</em>, <em class="sig-param">version=1.0</em>, <em class="sig-param">fc_dims=None</em>, <em class="sig-param">dropout_p=None</em>, <em class="sig-param">**kwargs</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/models/squeezenet.html#SqueezeNet"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.models.squeezenet.SqueezeNet" title="Permalink to this definition"></a></dt>
<em class="property">class </em><code class="descclassname">torchreid.models.squeezenet.</code><code class="descname">SqueezeNet</code><span class="sig-paren">(</span><em>num_classes</em>, <em>loss</em>, <em>version=1.0</em>, <em>fc_dims=None</em>, <em>dropout_p=None</em>, <em>**kwargs</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/models/squeezenet.html#SqueezeNet"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.models.squeezenet.SqueezeNet" title="Permalink to this definition"></a></dt>
<dd><p>SqueezeNet.</p>
<dl class="simple">
<dt>Reference:</dt><dd><p>Iandola et al. SqueezeNet: AlexNet-level accuracy with 50x fewer parameters
and&lt; 0.5 MB model size. arXiv:1602.07360.</p>
</dd>
<dt>Public keys:</dt><dd><ul class="simple">
<li><p><code class="docutils literal notranslate"><span class="pre">squeezenet1_0</span></code>: SqueezeNet (version=1.0).</p></li>
<li><p><code class="docutils literal notranslate"><span class="pre">squeezenet1_1</span></code>: SqueezeNet (version=1.1).</p></li>
<li><p><code class="docutils literal notranslate"><span class="pre">squeezenet1_0_fc512</span></code>: SqueezeNet (version=1.0) + FC.</p></li>
<dl class="docutils">
<dt>Reference:</dt>
<dd>Iandola et al. SqueezeNet: AlexNet-level accuracy with 50x fewer parameters
and &lt;0.5 MB model size. arXiv:1602.07360.</dd>
<dt>Public keys:</dt>
<dd><ul class="first last simple">
<li><code class="docutils literal notranslate"><span class="pre">squeezenet1_0</span></code>: SqueezeNet (version=1.0).</li>
<li><code class="docutils literal notranslate"><span class="pre">squeezenet1_1</span></code>: SqueezeNet (version=1.1).</li>
<li><code class="docutils literal notranslate"><span class="pre">squeezenet1_0_fc512</span></code>: SqueezeNet (version=1.0) + FC.</li>
</ul>
</dd>
</dl>
@@ -398,16 +415,17 @@ and&lt; 0.5 MB model size. arXiv:1602.07360.</p>
<dl class="class">
<dt id="torchreid.models.shufflenetv2.ShuffleNetV2">
<em class="property">class </em><code class="sig-prename descclassname">torchreid.models.shufflenetv2.</code><code class="sig-name descname">ShuffleNetV2</code><span class="sig-paren">(</span><em class="sig-param">num_classes</em>, <em class="sig-param">loss</em>, <em class="sig-param">stages_repeats</em>, <em class="sig-param">stages_out_channels</em>, <em class="sig-param">**kwargs</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/models/shufflenetv2.html#ShuffleNetV2"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.models.shufflenetv2.ShuffleNetV2" title="Permalink to this definition"></a></dt>
<em class="property">class </em><code class="descclassname">torchreid.models.shufflenetv2.</code><code class="descname">ShuffleNetV2</code><span class="sig-paren">(</span><em>num_classes</em>, <em>loss</em>, <em>stages_repeats</em>, <em>stages_out_channels</em>, <em>**kwargs</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/models/shufflenetv2.html#ShuffleNetV2"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.models.shufflenetv2.ShuffleNetV2" title="Permalink to this definition"></a></dt>
<dd><p>ShuffleNetV2.</p>
<dl class="simple">
<dt>Reference:</dt><dd><p>Ma et al. ShuffleNet V2: Practical Guidelines for Efficient CNN Architecture Design. ECCV 2018.</p>
</dd>
<dt>Public keys:</dt><dd><ul class="simple">
<li><p><code class="docutils literal notranslate"><span class="pre">shufflenet_v2_x0_5</span></code>: ShuffleNetV2 x0.5.</p></li>
<li><p><code class="docutils literal notranslate"><span class="pre">shufflenet_v2_x1_0</span></code>: ShuffleNetV2 x1.0.</p></li>
<li><p><code class="docutils literal notranslate"><span class="pre">shufflenet_v2_x1_5</span></code>: ShuffleNetV2 x1.5.</p></li>
<li><p><code class="docutils literal notranslate"><span class="pre">shufflenet_v2_x2_0</span></code>: ShuffleNetV2 x2.0.</p></li>
<dl class="docutils">
<dt>Reference:</dt>
<dd>Ma et al. ShuffleNet V2: Practical Guidelines for Efficient CNN Architecture Design. ECCV 2018.</dd>
<dt>Public keys:</dt>
<dd><ul class="first last simple">
<li><code class="docutils literal notranslate"><span class="pre">shufflenet_v2_x0_5</span></code>: ShuffleNetV2 x0.5.</li>
<li><code class="docutils literal notranslate"><span class="pre">shufflenet_v2_x1_0</span></code>: ShuffleNetV2 x1.0.</li>
<li><code class="docutils literal notranslate"><span class="pre">shufflenet_v2_x1_5</span></code>: ShuffleNetV2 x1.5.</li>
<li><code class="docutils literal notranslate"><span class="pre">shufflenet_v2_x2_0</span></code>: ShuffleNetV2 x2.0.</li>
</ul>
</dd>
</dl>
@@ -418,14 +436,15 @@ and&lt; 0.5 MB model size. arXiv:1602.07360.</p>
<h2>ReID-specific Models<a class="headerlink" href="#reid-specific-models" title="Permalink to this headline"></a></h2>
<dl class="class">
<dt id="torchreid.models.mudeep.MuDeep">
<em class="property">class </em><code class="sig-prename descclassname">torchreid.models.mudeep.</code><code class="sig-name descname">MuDeep</code><span class="sig-paren">(</span><em class="sig-param">num_classes</em>, <em class="sig-param">loss='softmax'</em>, <em class="sig-param">**kwargs</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/models/mudeep.html#MuDeep"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.models.mudeep.MuDeep" title="Permalink to this definition"></a></dt>
<em class="property">class </em><code class="descclassname">torchreid.models.mudeep.</code><code class="descname">MuDeep</code><span class="sig-paren">(</span><em>num_classes</em>, <em>loss='softmax'</em>, <em>**kwargs</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/models/mudeep.html#MuDeep"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.models.mudeep.MuDeep" title="Permalink to this definition"></a></dt>
<dd><p>Multiscale deep neural network.</p>
<dl class="simple">
<dt>Reference:</dt><dd><p>Qian et al. Multi-scale Deep Learning Architectures
for Person Re-identification. ICCV 2017.</p>
</dd>
<dt>Public keys:</dt><dd><ul class="simple">
<li><p><code class="docutils literal notranslate"><span class="pre">mudeep</span></code>: Multiscale deep neural network.</p></li>
<dl class="docutils">
<dt>Reference:</dt>
<dd>Qian et al. Multi-scale Deep Learning Architectures
for Person Re-identification. ICCV 2017.</dd>
<dt>Public keys:</dt>
<dd><ul class="first last simple">
<li><code class="docutils literal notranslate"><span class="pre">mudeep</span></code>: Multiscale deep neural network.</li>
</ul>
</dd>
</dl>
@@ -433,14 +452,15 @@ for Person Re-identification. ICCV 2017.</p>
<dl class="class">
<dt id="torchreid.models.resnetmid.ResNetMid">
<em class="property">class </em><code class="sig-prename descclassname">torchreid.models.resnetmid.</code><code class="sig-name descname">ResNetMid</code><span class="sig-paren">(</span><em class="sig-param">num_classes</em>, <em class="sig-param">loss</em>, <em class="sig-param">block</em>, <em class="sig-param">layers</em>, <em class="sig-param">last_stride=2</em>, <em class="sig-param">fc_dims=None</em>, <em class="sig-param">**kwargs</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/models/resnetmid.html#ResNetMid"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.models.resnetmid.ResNetMid" title="Permalink to this definition"></a></dt>
<em class="property">class </em><code class="descclassname">torchreid.models.resnetmid.</code><code class="descname">ResNetMid</code><span class="sig-paren">(</span><em>num_classes</em>, <em>loss</em>, <em>block</em>, <em>layers</em>, <em>last_stride=2</em>, <em>fc_dims=None</em>, <em>**kwargs</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/models/resnetmid.html#ResNetMid"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.models.resnetmid.ResNetMid" title="Permalink to this definition"></a></dt>
<dd><p>Residual network + mid-level features.</p>
<dl class="simple">
<dt>Reference:</dt><dd><p>Yu et al. The Devil is in the Middle: Exploiting Mid-level Representations for
Cross-Domain Instance Matching. arXiv:1711.08106.</p>
</dd>
<dt>Public keys:</dt><dd><ul class="simple">
<li><p><code class="docutils literal notranslate"><span class="pre">resnet50mid</span></code>: ResNet50 + mid-level feature fusion.</p></li>
<dl class="docutils">
<dt>Reference:</dt>
<dd>Yu et al. The Devil is in the Middle: Exploiting Mid-level Representations for
Cross-Domain Instance Matching. arXiv:1711.08106.</dd>
<dt>Public keys:</dt>
<dd><ul class="first last simple">
<li><code class="docutils literal notranslate"><span class="pre">resnet50mid</span></code>: ResNet50 + mid-level feature fusion.</li>
</ul>
</dd>
</dl>
@@ -448,13 +468,14 @@ Cross-Domain Instance Matching. arXiv:1711.08106.</p>
<dl class="class">
<dt id="torchreid.models.hacnn.HACNN">
<em class="property">class </em><code class="sig-prename descclassname">torchreid.models.hacnn.</code><code class="sig-name descname">HACNN</code><span class="sig-paren">(</span><em class="sig-param">num_classes, loss='softmax', nchannels=[128, 256, 384], feat_dim=512, learn_region=True, use_gpu=True, **kwargs</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/models/hacnn.html#HACNN"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.models.hacnn.HACNN" title="Permalink to this definition"></a></dt>
<em class="property">class </em><code class="descclassname">torchreid.models.hacnn.</code><code class="descname">HACNN</code><span class="sig-paren">(</span><em>num_classes, loss='softmax', nchannels=[128, 256, 384], feat_dim=512, learn_region=True, use_gpu=True, **kwargs</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/models/hacnn.html#HACNN"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.models.hacnn.HACNN" title="Permalink to this definition"></a></dt>
<dd><p>Harmonious Attention Convolutional Neural Network.</p>
<dl class="simple">
<dt>Reference:</dt><dd><p>Li et al. Harmonious Attention Network for Person Re-identification. CVPR 2018.</p>
</dd>
<dt>Public keys:</dt><dd><ul class="simple">
<li><p><code class="docutils literal notranslate"><span class="pre">hacnn</span></code>: HACNN.</p></li>
<dl class="docutils">
<dt>Reference:</dt>
<dd>Li et al. Harmonious Attention Network for Person Re-identification. CVPR 2018.</dd>
<dt>Public keys:</dt>
<dd><ul class="first last simple">
<li><code class="docutils literal notranslate"><span class="pre">hacnn</span></code>: HACNN.</li>
</ul>
</dd>
</dl>
@@ -462,15 +483,16 @@ Cross-Domain Instance Matching. arXiv:1711.08106.</p>
<dl class="class">
<dt id="torchreid.models.pcb.PCB">
<em class="property">class </em><code class="sig-prename descclassname">torchreid.models.pcb.</code><code class="sig-name descname">PCB</code><span class="sig-paren">(</span><em class="sig-param">num_classes</em>, <em class="sig-param">loss</em>, <em class="sig-param">block</em>, <em class="sig-param">layers</em>, <em class="sig-param">parts=6</em>, <em class="sig-param">reduced_dim=256</em>, <em class="sig-param">nonlinear='relu'</em>, <em class="sig-param">**kwargs</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/models/pcb.html#PCB"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.models.pcb.PCB" title="Permalink to this definition"></a></dt>
<em class="property">class </em><code class="descclassname">torchreid.models.pcb.</code><code class="descname">PCB</code><span class="sig-paren">(</span><em>num_classes</em>, <em>loss</em>, <em>block</em>, <em>layers</em>, <em>parts=6</em>, <em>reduced_dim=256</em>, <em>nonlinear='relu'</em>, <em>**kwargs</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/models/pcb.html#PCB"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.models.pcb.PCB" title="Permalink to this definition"></a></dt>
<dd><p>Part-based Convolutional Baseline.</p>
<dl class="simple">
<dt>Reference:</dt><dd><p>Sun et al. Beyond Part Models: Person Retrieval with Refined
Part Pooling (and A Strong Convolutional Baseline). ECCV 2018.</p>
</dd>
<dt>Public keys:</dt><dd><ul class="simple">
<li><p><code class="docutils literal notranslate"><span class="pre">pcb_p4</span></code>: PCB with 4-part strips.</p></li>
<li><p><code class="docutils literal notranslate"><span class="pre">pcb_p6</span></code>: PCB with 6-part strips.</p></li>
<dl class="docutils">
<dt>Reference:</dt>
<dd>Sun et al. Beyond Part Models: Person Retrieval with Refined
Part Pooling (and A Strong Convolutional Baseline). ECCV 2018.</dd>
<dt>Public keys:</dt>
<dd><ul class="first last simple">
<li><code class="docutils literal notranslate"><span class="pre">pcb_p4</span></code>: PCB with 4-part strips.</li>
<li><code class="docutils literal notranslate"><span class="pre">pcb_p6</span></code>: PCB with 6-part strips.</li>
</ul>
</dd>
</dl>
@@ -478,14 +500,15 @@ Part Pooling (and A Strong Convolutional Baseline). ECCV 2018.</p>
<dl class="class">
<dt id="torchreid.models.mlfn.MLFN">
<em class="property">class </em><code class="sig-prename descclassname">torchreid.models.mlfn.</code><code class="sig-name descname">MLFN</code><span class="sig-paren">(</span><em class="sig-param">num_classes, loss='softmax', groups=32, channels=[64, 256, 512, 1024, 2048], embed_dim=1024, **kwargs</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/models/mlfn.html#MLFN"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.models.mlfn.MLFN" title="Permalink to this definition"></a></dt>
<em class="property">class </em><code class="descclassname">torchreid.models.mlfn.</code><code class="descname">MLFN</code><span class="sig-paren">(</span><em>num_classes, loss='softmax', groups=32, channels=[64, 256, 512, 1024, 2048], embed_dim=1024, **kwargs</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/models/mlfn.html#MLFN"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.models.mlfn.MLFN" title="Permalink to this definition"></a></dt>
<dd><p>Multi-Level Factorisation Net.</p>
<dl class="simple">
<dt>Reference:</dt><dd><p>Chang et al. Multi-Level Factorisation Net for
Person Re-Identification. CVPR 2018.</p>
</dd>
<dt>Public keys:</dt><dd><ul class="simple">
<li><p><code class="docutils literal notranslate"><span class="pre">mlfn</span></code>: MLFN (Multi-Level Factorisation Net).</p></li>
<dl class="docutils">
<dt>Reference:</dt>
<dd>Chang et al. Multi-Level Factorisation Net for
Person Re-Identification. CVPR 2018.</dd>
<dt>Public keys:</dt>
<dd><ul class="first last simple">
<li><code class="docutils literal notranslate"><span class="pre">mlfn</span></code>: MLFN (Multi-Level Factorisation Net).</li>
</ul>
</dd>
</dl>
</dd></dl>
<dl class="class">
<dt id="torchreid.models.osnet.OSNet">
<em class="property">class </em><code class="sig-prename descclassname">torchreid.models.osnet.</code><code class="sig-name descname">OSNet</code><span class="sig-paren">(</span><em class="sig-param">num_classes</em>, <em class="sig-param">blocks</em>, <em class="sig-param">layers</em>, <em class="sig-param">channels</em>, <em class="sig-param">feature_dim=512</em>, <em class="sig-param">loss='softmax'</em>, <em class="sig-param">IN=False</em>, <em class="sig-param">**kwargs</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/models/osnet.html#OSNet"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.models.osnet.OSNet" title="Permalink to this definition"></a></dt>
<em class="property">class </em><code class="descclassname">torchreid.models.osnet.</code><code class="descname">OSNet</code><span class="sig-paren">(</span><em>num_classes</em>, <em>blocks</em>, <em>layers</em>, <em>channels</em>, <em>feature_dim=512</em>, <em>loss='softmax'</em>, <em>IN=False</em>, <em>**kwargs</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/models/osnet.html#OSNet"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.models.osnet.OSNet" title="Permalink to this definition"></a></dt>
<dd><p>Omni-Scale Network.</p>
<dl class="simple">
<dt>Reference:</dt><dd><ul class="simple">
<li><p>Zhou et al. Omni-Scale Feature Learning for Person Re-Identification. ICCV, 2019.</p></li>
<dl class="docutils">
<dt>Reference:</dt>
<dd><ul class="first last simple">
<li>Zhou et al. Omni-Scale Feature Learning for Person Re-Identification. ICCV, 2019.</li>
</ul>
</dd>
</dl>
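<p>OSNet instances are obtained the same way. A minimal sketch (<code class="docutils literal notranslate"><span class="pre">osnet_x1_0</span></code> is assumed as the public key, since this excerpt does not list OSNet&#39;s keys):</p>
<div class="highlight-default notranslate"><div class="highlight"><pre><span></span><span class="gp">&gt;&gt;&gt; </span>import torchreid
<span class="gp">&gt;&gt;&gt; </span>model = torchreid.models.build_model(
<span class="gp">&gt;&gt;&gt; </span>    name=&#39;osnet_x1_0&#39;, num_classes=751, loss=&#39;softmax&#39;
<span class="gp">&gt;&gt;&gt; </span>)
</pre></div>
</div>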
</dd></dl>
<span id="optimizer"></span><h2>Optimizer<a class="headerlink" href="#module-torchreid.optim.optimizer" title="Permalink to this headline"></a></h2>
<dl class="function">
<dt id="torchreid.optim.optimizer.build_optimizer">
<code class="sig-prename descclassname">torchreid.optim.optimizer.</code><code class="sig-name descname">build_optimizer</code><span class="sig-paren">(</span><em class="sig-param">model</em>, <em class="sig-param">optim='adam'</em>, <em class="sig-param">lr=0.0003</em>, <em class="sig-param">weight_decay=0.0005</em>, <em class="sig-param">momentum=0.9</em>, <em class="sig-param">sgd_dampening=0</em>, <em class="sig-param">sgd_nesterov=False</em>, <em class="sig-param">rmsprop_alpha=0.99</em>, <em class="sig-param">adam_beta1=0.9</em>, <em class="sig-param">adam_beta2=0.99</em>, <em class="sig-param">staged_lr=False</em>, <em class="sig-param">new_layers=''</em>, <em class="sig-param">base_lr_mult=0.1</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/optim/optimizer.html#build_optimizer"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.optim.optimizer.build_optimizer" title="Permalink to this definition"></a></dt>
<code class="descclassname">torchreid.optim.optimizer.</code><code class="descname">build_optimizer</code><span class="sig-paren">(</span><em>model</em>, <em>optim='adam'</em>, <em>lr=0.0003</em>, <em>weight_decay=0.0005</em>, <em>momentum=0.9</em>, <em>sgd_dampening=0</em>, <em>sgd_nesterov=False</em>, <em>rmsprop_alpha=0.99</em>, <em>adam_beta1=0.9</em>, <em>adam_beta2=0.99</em>, <em>staged_lr=False</em>, <em>new_layers=''</em>, <em>base_lr_mult=0.1</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/optim/optimizer.html#build_optimizer"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.optim.optimizer.build_optimizer" title="Permalink to this definition"></a></dt>
<dd><p>A function wrapper for building an optimizer.</p>
<dl class="field-list simple">
<dt class="field-odd">Parameters</dt>
<dd class="field-odd"><ul class="simple">
<li><p><strong>model</strong> (<em>nn.Module</em>) model.</p></li>
<li><p><strong>optim</strong> (<em>str</em><em>, </em><em>optional</em>) optimizer. Default is “adam”.</p></li>
<li><p><strong>lr</strong> (<em>float</em><em>, </em><em>optional</em>) learning rate. Default is 0.0003.</p></li>
<li><p><strong>weight_decay</strong> (<em>float</em><em>, </em><em>optional</em>) weight decay (L2 penalty). Default is 5e-04.</p></li>
<li><p><strong>momentum</strong> (<em>float</em><em>, </em><em>optional</em>) momentum factor in sgd. Default is 0.9.</p></li>
<li><p><strong>sgd_dampening</strong> (<em>float</em><em>, </em><em>optional</em>) dampening for momentum. Default is 0.</p></li>
<li><p><strong>sgd_nesterov</strong> (<em>bool</em><em>, </em><em>optional</em>) enables Nesterov momentum. Default is False.</p></li>
<li><p><strong>rmsprop_alpha</strong> (<em>float</em><em>, </em><em>optional</em>) smoothing constant for rmsprop. Default is 0.99.</p></li>
<li><p><strong>adam_beta1</strong> (<em>float</em><em>, </em><em>optional</em>) beta-1 value in adam. Default is 0.9.</p></li>
<li><p><strong>adam_beta2</strong> (<em>float</em><em>, </em><em>optional</em>) beta-2 value in adam. Default is 0.99.</p></li>
<li><p><strong>staged_lr</strong> (<em>bool</em><em>, </em><em>optional</em>) uses different learning rates for base and new layers. Base
<table class="docutils field-list" frame="void" rules="none">
<col class="field-name" />
<col class="field-body" />
<tbody valign="top">
<tr class="field-odd field"><th class="field-name">Parameters:</th><td class="field-body"><ul class="first last simple">
<li><strong>model</strong> (<em>nn.Module</em>) model.</li>
<li><strong>optim</strong> (<em>str</em><em>, </em><em>optional</em>) optimizer. Default is “adam”.</li>
<li><strong>lr</strong> (<em>float</em><em>, </em><em>optional</em>) learning rate. Default is 0.0003.</li>
<li><strong>weight_decay</strong> (<em>float</em><em>, </em><em>optional</em>) weight decay (L2 penalty). Default is 5e-04.</li>
<li><strong>momentum</strong> (<em>float</em><em>, </em><em>optional</em>) momentum factor in sgd. Default is 0.9,</li>
<li><strong>sgd_dampening</strong> (<em>float</em><em>, </em><em>optional</em>) dampening for momentum. Default is 0.</li>
<li><strong>sgd_nesterov</strong> (<em>bool</em><em>, </em><em>optional</em>) enables Nesterov momentum. Default is False.</li>
<li><strong>rmsprop_alpha</strong> (<em>float</em><em>, </em><em>optional</em>) smoothing constant for rmsprop. Default is 0.99.</li>
<li><strong>adam_beta1</strong> (<em>float</em><em>, </em><em>optional</em>) beta-1 value in adam. Default is 0.9.</li>
<li><strong>adam_beta2</strong> (<em>float</em><em>, </em><em>optional</em>) beta-2 value in adam. Default is 0.99,</li>
<li><strong>staged_lr</strong> (<em>bool</em><em>, </em><em>optional</em>) uses different learning rates for base and new layers. Base
layers are pretrained layers while new layers are randomly initialized, e.g. the
identity classification layer. Enabling <code class="docutils literal notranslate"><span class="pre">staged_lr</span></code> can allow the base layers to
be trained with a smaller learning rate determined by <code class="docutils literal notranslate"><span class="pre">base_lr_mult</span></code>, while the new
layers will take the <code class="docutils literal notranslate"><span class="pre">lr</span></code>. Default is False.</p></li>
<li><p><strong>new_layers</strong> (<em>str</em><em> or </em><em>list</em>) attribute names in <code class="docutils literal notranslate"><span class="pre">model</span></code>. Default is empty.</p></li>
<li><p><strong>base_lr_mult</strong> (<em>float</em><em>, </em><em>optional</em>) learning rate multiplier for base layers. Default is 0.1.</p></li>
</ul>
</dd>
</dl>
<dl>
<dt>Examples:</dt><dd><div class="highlight-default notranslate"><div class="highlight"><pre><span></span><span class="gp">&gt;&gt;&gt; </span><span class="c1"># A normal optimizer can be built by</span>
<span class="gp">&gt;&gt;&gt; </span><span class="n">optimizer</span> <span class="o">=</span> <span class="n">torchreid</span><span class="o">.</span><span class="n">optim</span><span class="o">.</span><span class="n">build_optimizer</span><span class="p">(</span><span class="n">model</span><span class="p">,</span> <span class="n">optim</span><span class="o">=</span><span class="s1">&#39;sgd&#39;</span><span class="p">,</span> <span class="n">lr</span><span class="o">=</span><span class="mf">0.01</span><span class="p">)</span>
<span class="gp">&gt;&gt;&gt; </span><span class="c1"># If you want to use a smaller learning rate for pretrained layers</span>
<span class="gp">&gt;&gt;&gt; </span><span class="c1"># and the attribute name for the randomly initialized layer is &#39;classifier&#39;,</span>
<span id="lr-scheduler"></span><h2>LR Scheduler<a class="headerlink" href="#module-torchreid.optim.lr_scheduler" title="Permalink to this headline"></a></h2>
<dl class="function">
<dt id="torchreid.optim.lr_scheduler.build_lr_scheduler">
<code class="sig-prename descclassname">torchreid.optim.lr_scheduler.</code><code class="sig-name descname">build_lr_scheduler</code><span class="sig-paren">(</span><em class="sig-param">optimizer</em>, <em class="sig-param">lr_scheduler='single_step'</em>, <em class="sig-param">stepsize=1</em>, <em class="sig-param">gamma=0.1</em>, <em class="sig-param">max_epoch=1</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/optim/lr_scheduler.html#build_lr_scheduler"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.optim.lr_scheduler.build_lr_scheduler" title="Permalink to this definition"></a></dt>
<code class="descclassname">torchreid.optim.lr_scheduler.</code><code class="descname">build_lr_scheduler</code><span class="sig-paren">(</span><em>optimizer</em>, <em>lr_scheduler='single_step'</em>, <em>stepsize=1</em>, <em>gamma=0.1</em>, <em>max_epoch=1</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/optim/lr_scheduler.html#build_lr_scheduler"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.optim.lr_scheduler.build_lr_scheduler" title="Permalink to this definition"></a></dt>
<dd><p>A function wrapper for building a learning rate scheduler.</p>
<dl class="field-list simple">
<dt class="field-odd">Parameters</dt>
<dd class="field-odd"><ul class="simple">
<li><p><strong>optimizer</strong> (<em>Optimizer</em>) an Optimizer.</p></li>
<li><p><strong>lr_scheduler</strong> (<em>str</em><em>, </em><em>optional</em>) learning rate scheduler method. Default is “single_step”.</p></li>
<li><p><strong>stepsize</strong> (<em>int</em><em> or </em><em>list</em><em>, </em><em>optional</em>) step size to decay learning rate. When <code class="docutils literal notranslate"><span class="pre">lr_scheduler</span></code>
<table class="docutils field-list" frame="void" rules="none">
<col class="field-name" />
<col class="field-body" />
<tbody valign="top">
<tr class="field-odd field"><th class="field-name">Parameters:</th><td class="field-body"><ul class="first last simple">
<li><strong>optimizer</strong> (<em>Optimizer</em>) an Optimizer.</li>
<li><strong>lr_scheduler</strong> (<em>str</em><em>, </em><em>optional</em>) learning rate scheduler method. Default is single_step.</li>
<li><strong>stepsize</strong> (<em>int</em><em> or </em><em>list</em><em>, </em><em>optional</em>) step size to decay learning rate. When <code class="docutils literal notranslate"><span class="pre">lr_scheduler</span></code>
is “single_step”, <code class="docutils literal notranslate"><span class="pre">stepsize</span></code> should be an integer. When <code class="docutils literal notranslate"><span class="pre">lr_scheduler</span></code> is
“multi_step”, <code class="docutils literal notranslate"><span class="pre">stepsize</span></code> is a list. Default is 1.</p></li>
<li><p><strong>gamma</strong> (<em>float</em><em>, </em><em>optional</em>) decay rate. Default is 0.1.</p></li>
<li><p><strong>max_epoch</strong> (<em>int</em><em>, </em><em>optional</em>) maximum epoch (for cosine annealing). Default is 1.</p></li>
</ul>
</dd>
</dl>
<dl>
<dt>Examples:</dt><dd><div class="highlight-default notranslate"><div class="highlight"><pre><span></span><span class="gp">&gt;&gt;&gt; </span><span class="c1"># Decay the learning rate every 20 epochs.</span>
<span class="gp">&gt;&gt;&gt; </span><span class="n">scheduler</span> <span class="o">=</span> <span class="n">torchreid</span><span class="o">.</span><span class="n">optim</span><span class="o">.</span><span class="n">build_lr_scheduler</span><span class="p">(</span>
<span class="gp">&gt;&gt;&gt; </span> <span class="n">optimizer</span><span class="p">,</span> <span class="n">lr_scheduler</span><span class="o">=</span><span class="s1">&#39;single_step&#39;</span><span class="p">,</span> <span class="n">stepsize</span><span class="o">=</span><span class="mi">20</span>
<span class="gp">&gt;&gt;&gt; </span><span class="p">)</span>
<span id="average-meter"></span><h2>Average Meter<a class="headerlink" href="#module-torchreid.utils.avgmeter" title="Permalink to this headline"></a></h2>
<dl class="class">
<dt id="torchreid.utils.avgmeter.AverageMeter">
<em class="property">class </em><code class="sig-prename descclassname">torchreid.utils.avgmeter.</code><code class="sig-name descname">AverageMeter</code><a class="reference internal" href="../_modules/torchreid/utils/avgmeter.html#AverageMeter"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.utils.avgmeter.AverageMeter" title="Permalink to this definition"></a></dt>
<em class="property">class </em><code class="descclassname">torchreid.utils.avgmeter.</code><code class="descname">AverageMeter</code><a class="reference internal" href="../_modules/torchreid/utils/avgmeter.html#AverageMeter"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.utils.avgmeter.AverageMeter" title="Permalink to this definition"></a></dt>
<dd><p>Computes and stores the average and current value.</p>
<dl>
<dt>Examples:</dt><dd><div class="highlight-default notranslate"><div class="highlight"><pre><span></span><span class="gp">&gt;&gt;&gt; </span><span class="c1"># Initialize a meter to record loss</span>
<dl class="docutils">
<dt>Examples::</dt>
<dd><div class="first last highlight-default notranslate"><div class="highlight"><pre><span></span><span class="gp">&gt;&gt;&gt; </span><span class="c1"># Initialize a meter to record loss</span>
<span class="gp">&gt;&gt;&gt; </span><span class="n">losses</span> <span class="o">=</span> <span class="n">AverageMeter</span><span class="p">()</span>
<span class="gp">&gt;&gt;&gt; </span><span class="c1"># Update meter after every minibatch update</span>
<span class="gp">&gt;&gt;&gt; </span><span class="n">losses</span><span class="o">.</span><span class="n">update</span><span class="p">(</span><span class="n">loss_value</span><span class="p">,</span> <span class="n">batch_size</span><span class="p">)</span>
<span id="loggers"></span><h2>Loggers<a class="headerlink" href="#module-torchreid.utils.loggers" title="Permalink to this headline"></a></h2>
<dl class="class">
<dt id="torchreid.utils.loggers.Logger">
<em class="property">class </em><code class="sig-prename descclassname">torchreid.utils.loggers.</code><code class="sig-name descname">Logger</code><span class="sig-paren">(</span><em class="sig-param">fpath=None</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/utils/loggers.html#Logger"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.utils.loggers.Logger" title="Permalink to this definition"></a></dt>
<em class="property">class </em><code class="descclassname">torchreid.utils.loggers.</code><code class="descname">Logger</code><span class="sig-paren">(</span><em>fpath=None</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/utils/loggers.html#Logger"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.utils.loggers.Logger" title="Permalink to this definition"></a></dt>
<dd><p>Writes console output to external text file.</p>
<p>Imported from <a class="reference external" href="https://github.com/Cysu/open-reid/blob/master/reid/utils/logging.py">https://github.com/Cysu/open-reid/blob/master/reid/utils/logging.py</a></p>
<dl class="field-list simple">
<dt class="field-odd">Parameters</dt>
<dd class="field-odd"><p><strong>fpath</strong> (<em>str</em>) directory to save logging file.</p>
</dd>
</dl>
<dl>
<dt>Examples:</dt><dd><div class="highlight-default notranslate"><div class="highlight"><pre><span></span><span class="gp">&gt;&gt;&gt; </span><span class="kn">import</span> <span class="nn">sys</span>
<table class="docutils field-list" frame="void" rules="none">
<col class="field-name" />
<col class="field-body" />
<tbody valign="top">
<tr class="field-odd field"><th class="field-name">Parameters:</th><td class="field-body"><strong>fpath</strong> (<em>str</em>) directory to save logging file.</td>
</tr>
</tbody>
</table>
<dl class="docutils">
<dt>Examples::</dt>
<dd><div class="first last highlight-default notranslate"><div class="highlight"><pre><span></span><span class="gp">&gt;&gt;&gt; </span><span class="kn">import</span> <span class="nn">sys</span>
<span class="gp">&gt;&gt;&gt; </span><span class="kn">import</span> <span class="nn">os</span>
<span class="gp">&gt;&gt;&gt; </span><span class="kn">import</span> <span class="nn">os.path</span> <span class="k">as</span> <span class="nn">osp</span>
<span class="gp">&gt;&gt;&gt; </span><span class="kn">from</span> <span class="nn">torchreid.utils</span> <span class="k">import</span> <span class="n">Logger</span>
<dl class="class">
<dt id="torchreid.utils.loggers.RankLogger">
<em class="property">class </em><code class="sig-prename descclassname">torchreid.utils.loggers.</code><code class="sig-name descname">RankLogger</code><span class="sig-paren">(</span><em class="sig-param">sources</em>, <em class="sig-param">targets</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/utils/loggers.html#RankLogger"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.utils.loggers.RankLogger" title="Permalink to this definition"></a></dt>
<em class="property">class </em><code class="descclassname">torchreid.utils.loggers.</code><code class="descname">RankLogger</code><span class="sig-paren">(</span><em>sources</em>, <em>targets</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/utils/loggers.html#RankLogger"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.utils.loggers.RankLogger" title="Permalink to this definition"></a></dt>
<dd><p>Records the rank1 matching accuracy obtained for each
test dataset at specified evaluation steps and provides a function
to show the summarized results, which are convenient for analysis.</p>
<dl class="field-list simple">
<dt class="field-odd">Parameters</dt>
<dd class="field-odd"><ul class="simple">
<li><p><strong>sources</strong> (<em>str</em><em> or </em><em>list</em>) source dataset name(s).</p></li>
<li><p><strong>targets</strong> (<em>str</em><em> or </em><em>list</em>) target dataset name(s).</p></li>
<table class="docutils field-list" frame="void" rules="none">
<col class="field-name" />
<col class="field-body" />
<tbody valign="top">
<tr class="field-odd field"><th class="field-name">Parameters:</th><td class="field-body"><ul class="first last simple">
<li><strong>sources</strong> (<em>str</em><em> or </em><em>list</em>) source dataset name(s).</li>
<li><strong>targets</strong> (<em>str</em><em> or </em><em>list</em>) target dataset name(s).</li>
</ul>
</dd>
</dl>
<dl>
<dt>Examples:</dt><dd><div class="highlight-default notranslate"><div class="highlight"><pre><span></span><span class="gp">&gt;&gt;&gt; </span><span class="kn">from</span> <span class="nn">torchreid.utils</span> <span class="k">import</span> <span class="n">RankLogger</span>
<span class="gp">&gt;&gt;&gt; </span><span class="n">s</span> <span class="o">=</span> <span class="s1">&#39;market1501&#39;</span>
<span class="gp">&gt;&gt;&gt; </span><span class="n">t</span> <span class="o">=</span> <span class="s1">&#39;market1501&#39;</span>
<span class="gp">&gt;&gt;&gt; </span><span class="n">ranklogger</span> <span class="o">=</span> <span class="n">RankLogger</span><span class="p">(</span><span class="n">s</span><span class="p">,</span> <span class="n">t</span><span class="p">)</span>
</dl>
<dl class="method">
<dt id="torchreid.utils.loggers.RankLogger.show_summary">
<code class="sig-name descname">show_summary</code><span class="sig-paren">(</span><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/utils/loggers.html#RankLogger.show_summary"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.utils.loggers.RankLogger.show_summary" title="Permalink to this definition"></a></dt>
<code class="descname">show_summary</code><span class="sig-paren">(</span><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/utils/loggers.html#RankLogger.show_summary"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.utils.loggers.RankLogger.show_summary" title="Permalink to this definition"></a></dt>
<dd><p>Shows saved results.</p>
</dd></dl>
<dl class="method">
<dt id="torchreid.utils.loggers.RankLogger.write">
<code class="sig-name descname">write</code><span class="sig-paren">(</span><em class="sig-param">name</em>, <em class="sig-param">epoch</em>, <em class="sig-param">rank1</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/utils/loggers.html#RankLogger.write"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.utils.loggers.RankLogger.write" title="Permalink to this definition"></a></dt>
<code class="descname">write</code><span class="sig-paren">(</span><em>name</em>, <em>epoch</em>, <em>rank1</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/utils/loggers.html#RankLogger.write"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.utils.loggers.RankLogger.write" title="Permalink to this definition"></a></dt>
<dd><p>Writes result.</p>
<dl class="field-list simple">
<dt class="field-odd">Parameters</dt>
<dd class="field-odd"><ul class="simple">
<li><p><strong>name</strong> (<em>str</em>) dataset name.</p></li>
<li><p><strong>epoch</strong> (<em>int</em>) current epoch.</p></li>
<li><p><strong>rank1</strong> (<em>float</em>) rank1 result.</p></li>
<table class="docutils field-list" frame="void" rules="none">
<col class="field-name" />
<col class="field-body" />
<tbody valign="top">
<tr class="field-odd field"><th class="field-name">Parameters:</th><td class="field-body"><ul class="first last simple">
<li><strong>name</strong> (<em>str</em>) dataset name.</li>
<li><strong>epoch</strong> (<em>int</em>) current epoch.</li>
<li><strong>rank1</strong> (<em>float</em>) rank1 result.</li>
</ul>
</dd>
</dl>
</dd></dl>
</dd></dl>
<span id="generic-tools"></span><h2>Generic Tools<a class="headerlink" href="#module-torchreid.utils.tools" title="Permalink to this headline"></a></h2>
<dl class="function">
<dt id="torchreid.utils.tools.mkdir_if_missing">
<code class="sig-prename descclassname">torchreid.utils.tools.</code><code class="sig-name descname">mkdir_if_missing</code><span class="sig-paren">(</span><em class="sig-param">dirname</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/utils/tools.html#mkdir_if_missing"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.utils.tools.mkdir_if_missing" title="Permalink to this definition"></a></dt>
<code class="descclassname">torchreid.utils.tools.</code><code class="descname">mkdir_if_missing</code><span class="sig-paren">(</span><em>dirname</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/utils/tools.html#mkdir_if_missing"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.utils.tools.mkdir_if_missing" title="Permalink to this definition"></a></dt>
<dd><p>Creates dirname if it is missing.</p>
</dd></dl>
<dl class="function">
<dt id="torchreid.utils.tools.check_isfile">
<code class="sig-prename descclassname">torchreid.utils.tools.</code><code class="sig-name descname">check_isfile</code><span class="sig-paren">(</span><em class="sig-param">fpath</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/utils/tools.html#check_isfile"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.utils.tools.check_isfile" title="Permalink to this definition"></a></dt>
<code class="descclassname">torchreid.utils.tools.</code><code class="descname">check_isfile</code><span class="sig-paren">(</span><em>fpath</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/utils/tools.html#check_isfile"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.utils.tools.check_isfile" title="Permalink to this definition"></a></dt>
<dd><p>Checks if the given path is a file.</p>
<dl class="field-list simple">
<dt class="field-odd">Parameters</dt>
<dd class="field-odd"><p><strong>fpath</strong> (<em>str</em>) file path.</p>
</dd>
<dt class="field-even">Returns</dt>
<dd class="field-even"><p>bool</p>
</dd>
</dl>
<table class="docutils field-list" frame="void" rules="none">
<col class="field-name" />
<col class="field-body" />
<tbody valign="top">
<tr class="field-odd field"><th class="field-name">Parameters:</th><td class="field-body"><strong>fpath</strong> (<em>str</em>) file path.</td>
</tr>
<tr class="field-even field"><th class="field-name">Returns:</th><td class="field-body">bool</td>
</tr>
</tbody>
</table>
</dd></dl>
<dl class="function">
<dt id="torchreid.utils.tools.read_json">
<code class="sig-prename descclassname">torchreid.utils.tools.</code><code class="sig-name descname">read_json</code><span class="sig-paren">(</span><em class="sig-param">fpath</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/utils/tools.html#read_json"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.utils.tools.read_json" title="Permalink to this definition"></a></dt>
<code class="descclassname">torchreid.utils.tools.</code><code class="descname">read_json</code><span class="sig-paren">(</span><em>fpath</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/utils/tools.html#read_json"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.utils.tools.read_json" title="Permalink to this definition"></a></dt>
<dd><p>Reads json file from a path.</p>
</dd></dl>
<dl class="function">
<dt id="torchreid.utils.tools.write_json">
<code class="sig-prename descclassname">torchreid.utils.tools.</code><code class="sig-name descname">write_json</code><span class="sig-paren">(</span><em class="sig-param">obj</em>, <em class="sig-param">fpath</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/utils/tools.html#write_json"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.utils.tools.write_json" title="Permalink to this definition"></a></dt>
<code class="descclassname">torchreid.utils.tools.</code><code class="descname">write_json</code><span class="sig-paren">(</span><em>obj</em>, <em>fpath</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/utils/tools.html#write_json"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.utils.tools.write_json" title="Permalink to this definition"></a></dt>
<dd><p>Writes to a json file.</p>
</dd></dl>
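<p>The two JSON helpers form a simple round trip. A sketch (the path is illustrative, and both names are assumed to be re-exported from <code class="docutils literal notranslate"><span class="pre">torchreid.utils</span></code> like the other tools):</p>
<div class="highlight-default notranslate"><div class="highlight"><pre><span></span><span class="gp">&gt;&gt;&gt; </span>from torchreid.utils import read_json, write_json
<span class="gp">&gt;&gt;&gt; </span>write_json({&#39;epoch&#39;: 10, &#39;rank1&#39;: 0.5}, &#39;log/stats.json&#39;)
<span class="gp">&gt;&gt;&gt; </span>stats = read_json(&#39;log/stats.json&#39;)
</pre></div>
</div>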
<dl class="function">
<dt id="torchreid.utils.tools.download_url">
<code class="sig-prename descclassname">torchreid.utils.tools.</code><code class="sig-name descname">download_url</code><span class="sig-paren">(</span><em class="sig-param">url</em>, <em class="sig-param">dst</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/utils/tools.html#download_url"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.utils.tools.download_url" title="Permalink to this definition"></a></dt>
<code class="descclassname">torchreid.utils.tools.</code><code class="descname">download_url</code><span class="sig-paren">(</span><em>url</em>, <em>dst</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/utils/tools.html#download_url"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.utils.tools.download_url" title="Permalink to this definition"></a></dt>
<dd><p>Downloads a file from a URL to a destination path.</p>
<dl class="field-list simple">
<dt class="field-odd">Parameters</dt>
<dd class="field-odd"><ul class="simple">
<li><p><strong>url</strong> (<em>str</em>) url to download file.</p></li>
<li><p><strong>dst</strong> (<em>str</em>) destination path.</p></li>
<table class="docutils field-list" frame="void" rules="none">
<col class="field-name" />
<col class="field-body" />
<tbody valign="top">
<tr class="field-odd field"><th class="field-name">Parameters:</th><td class="field-body"><ul class="first last simple">
<li><strong>url</strong> (<em>str</em>) url to download file.</li>
<li><strong>dst</strong> (<em>str</em>) destination path.</li>
</ul>
</dd>
</dl>
</dd></dl>
<dl class="function">
<dt id="torchreid.utils.tools.read_image">
<code class="sig-prename descclassname">torchreid.utils.tools.</code><code class="sig-name descname">read_image</code><span class="sig-paren">(</span><em class="sig-param">path</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/utils/tools.html#read_image"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.utils.tools.read_image" title="Permalink to this definition"></a></dt>
<code class="descclassname">torchreid.utils.tools.</code><code class="descname">read_image</code><span class="sig-paren">(</span><em>path</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/utils/tools.html#read_image"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.utils.tools.read_image" title="Permalink to this definition"></a></dt>
<dd><p>Reads image from path using <code class="docutils literal notranslate"><span class="pre">PIL.Image</span></code>.</p>
<dl class="field-list simple">
<dt class="field-odd">Parameters</dt>
<dd class="field-odd"><p><strong>path</strong> (<em>str</em>) path to an image.</p>
</dd>
<dt class="field-even">Returns</dt>
<dd class="field-even"><p>PIL image</p>
</dd>
</dl>
<table class="docutils field-list" frame="void" rules="none">
<col class="field-name" />
<col class="field-body" />
<tbody valign="top">
<tr class="field-odd field"><th class="field-name">Parameters:</th><td class="field-body"><strong>path</strong> (<em>str</em>) path to an image.</td>
</tr>
<tr class="field-even field"><th class="field-name">Returns:</th><td class="field-body">PIL image</td>
</tr>
</tbody>
</table>
</dd></dl>
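<p>A one-line usage sketch (the image path is illustrative):</p>
<div class="highlight-default notranslate"><div class="highlight"><pre><span></span><span class="gp">&gt;&gt;&gt; </span>from torchreid.utils import read_image
<span class="gp">&gt;&gt;&gt; </span>img = read_image(&#39;data/query/0001_c1s1_0.jpg&#39;)  # a PIL image
</pre></div>
</div>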
<dl class="function">
<dt id="torchreid.utils.tools.collect_env_info">
<code class="sig-prename descclassname">torchreid.utils.tools.</code><code class="sig-name descname">collect_env_info</code><span class="sig-paren">(</span><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/utils/tools.html#collect_env_info"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.utils.tools.collect_env_info" title="Permalink to this definition"></a></dt>
<code class="descclassname">torchreid.utils.tools.</code><code class="descname">collect_env_info</code><span class="sig-paren">(</span><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/utils/tools.html#collect_env_info"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.utils.tools.collect_env_info" title="Permalink to this definition"></a></dt>
<dd><p>Returns env info as a string.</p>
<p>Code source: github.com/facebookresearch/maskrcnn-benchmark</p>
</dd></dl>
<span id="reid-tools"></span><h2>ReID Tools<a class="headerlink" href="#module-torchreid.utils.reidtools" title="Permalink to this headline"></a></h2>
<dl class="function">
<dt id="torchreid.utils.reidtools.visualize_ranked_results">
<code class="sig-prename descclassname">torchreid.utils.reidtools.</code><code class="sig-name descname">visualize_ranked_results</code><span class="sig-paren">(</span><em class="sig-param">distmat</em>, <em class="sig-param">dataset</em>, <em class="sig-param">data_type</em>, <em class="sig-param">width=128</em>, <em class="sig-param">height=256</em>, <em class="sig-param">save_dir=''</em>, <em class="sig-param">topk=10</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/utils/reidtools.html#visualize_ranked_results"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.utils.reidtools.visualize_ranked_results" title="Permalink to this definition"></a></dt>
<code class="descclassname">torchreid.utils.reidtools.</code><code class="descname">visualize_ranked_results</code><span class="sig-paren">(</span><em>distmat</em>, <em>dataset</em>, <em>data_type</em>, <em>width=128</em>, <em>height=256</em>, <em>save_dir=''</em>, <em>topk=10</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/utils/reidtools.html#visualize_ranked_results"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.utils.reidtools.visualize_ranked_results" title="Permalink to this definition"></a></dt>
<dd><p>Visualizes ranked results.</p>
<p>Supports both image-reid and video-reid.</p>
<p>For image-reid, ranks will be plotted in a single figure. For video-reid, ranks will be
saved in folders each containing a tracklet.</p>
<dl class="field-list simple">
<dt class="field-odd">Parameters</dt>
<dd class="field-odd"><ul class="simple">
<li><p><strong>distmat</strong> (<em>numpy.ndarray</em>) distance matrix of shape (num_query, num_gallery).</p></li>
<li><p><strong>dataset</strong> (<em>tuple</em>) a 2-tuple containing (query, gallery), each of which contains
tuples of (img_path(s), pid, camid).</p></li>
<li><p><strong>data_type</strong> (<em>str</em>) “image” or “video”.</p></li>
<li><p><strong>width</strong> (<em>int</em><em>, </em><em>optional</em>) resized image width. Default is 128.</p></li>
<li><p><strong>height</strong> (<em>int</em><em>, </em><em>optional</em>) resized image height. Default is 256.</p></li>
<li><p><strong>save_dir</strong> (<em>str</em>) directory to save output images.</p></li>
<li><p><strong>topk</strong> (<em>int</em><em>, </em><em>optional</em>) denoting top-k images in the rank list to be visualized.
Default is 10.</p></li>
<table class="docutils field-list" frame="void" rules="none">
<col class="field-name" />
<col class="field-body" />
<tbody valign="top">
<tr class="field-odd field"><th class="field-name">Parameters:</th><td class="field-body"><ul class="first last simple">
<li><strong>distmat</strong> (<em>numpy.ndarray</em>) distance matrix of shape (num_query, num_gallery).</li>
<li><strong>dataset</strong> (<em>tuple</em>) a 2-tuple containing (query, gallery), each of which contains
tuples of (img_path(s), pid, camid).</li>
<li><strong>data_type</strong> (<em>str</em>) “image” or “video”.</li>
<li><strong>width</strong> (<em>int</em><em>, </em><em>optional</em>) resized image width. Default is 128.</li>
<li><strong>height</strong> (<em>int</em><em>, </em><em>optional</em>) resized image height. Default is 256.</li>
<li><strong>save_dir</strong> (<em>str</em>) directory to save output images.</li>
<li><strong>topk</strong> (<em>int</em><em>, </em><em>optional</em>) denoting top-k images in the rank list to be visualized.
Default is 10.</li>
</ul>
</dd>
</dl>
</dd></dl>
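<p>A sketch of a typical call, assuming <code class="docutils literal notranslate"><span class="pre">distmat</span></code> and the <code class="docutils literal notranslate"><span class="pre">(query,</span> <span class="pre">gallery)</span></code> tuple have been prepared elsewhere:</p>
<div class="highlight-default notranslate"><div class="highlight"><pre><span></span><span class="gp">&gt;&gt;&gt; </span>from torchreid.utils import visualize_ranked_results
<span class="gp">&gt;&gt;&gt; </span># distmat: (num_query, num_gallery); dataset: (query, gallery)
<span class="gp">&gt;&gt;&gt; </span>visualize_ranked_results(
<span class="gp">&gt;&gt;&gt; </span>    distmat, dataset, &#39;image&#39;,
<span class="gp">&gt;&gt;&gt; </span>    save_dir=&#39;log/ranked_results&#39;, topk=10
<span class="gp">&gt;&gt;&gt; </span>)
</pre></div>
</div>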
</div>
<span id="torch-tools"></span><h2>Torch Tools<a class="headerlink" href="#module-torchreid.utils.torchtools" title="Permalink to this headline"></a></h2>
<dl class="function">
<dt id="torchreid.utils.torchtools.save_checkpoint">
<code class="sig-prename descclassname">torchreid.utils.torchtools.</code><code class="sig-name descname">save_checkpoint</code><span class="sig-paren">(</span><em class="sig-param">state</em>, <em class="sig-param">save_dir</em>, <em class="sig-param">is_best=False</em>, <em class="sig-param">remove_module_from_keys=False</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/utils/torchtools.html#save_checkpoint"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.utils.torchtools.save_checkpoint" title="Permalink to this definition"></a></dt>
<code class="descclassname">torchreid.utils.torchtools.</code><code class="descname">save_checkpoint</code><span class="sig-paren">(</span><em>state</em>, <em>save_dir</em>, <em>is_best=False</em>, <em>remove_module_from_keys=False</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/utils/torchtools.html#save_checkpoint"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.utils.torchtools.save_checkpoint" title="Permalink to this definition"></a></dt>
<dd><p>Saves checkpoint.</p>
<dl class="field-list simple">
<dt class="field-odd">Parameters</dt>
<dd class="field-odd"><ul class="simple">
<li><p><strong>state</strong> (<em>dict</em>) dictionary.</p></li>
<li><p><strong>save_dir</strong> (<em>str</em>) directory to save checkpoint.</p></li>
<li><p><strong>is_best</strong> (<em>bool</em><em>, </em><em>optional</em>) if True, this checkpoint will be copied and named
<code class="docutils literal notranslate"><span class="pre">model-best.pth.tar</span></code>. Default is False.</p></li>
<li><p><strong>remove_module_from_keys</strong> (<em>bool</em><em>, </em><em>optional</em>) whether to remove “module.”
from layer names. Default is False.</p></li>
<table class="docutils field-list" frame="void" rules="none">
<col class="field-name" />
<col class="field-body" />
<tbody valign="top">
<tr class="field-odd field"><th class="field-name">Parameters:</th><td class="field-body"><ul class="first last simple">
<li><strong>state</strong> (<em>dict</em>) dictionary.</li>
<li><strong>save_dir</strong> (<em>str</em>) directory to save checkpoint.</li>
<li><strong>is_best</strong> (<em>bool</em><em>, </em><em>optional</em>) if True, this checkpoint will be copied and named
<code class="docutils literal notranslate"><span class="pre">model-best.pth.tar</span></code>. Default is False.</li>
<li><strong>remove_module_from_keys</strong> (<em>bool</em><em>, </em><em>optional</em>) whether to remove “module.”
from layer names. Default is False.</li>
</ul>
</dd>
</dl>
<dl>
<dt>Examples:</dt><dd><div class="highlight-default notranslate"><div class="highlight"><pre><span></span><span class="gp">&gt;&gt;&gt; </span><span class="n">state</span> <span class="o">=</span> <span class="p">{</span>
<span class="gp">&gt;&gt;&gt; </span> <span class="s1">&#39;state_dict&#39;</span><span class="p">:</span> <span class="n">model</span><span class="o">.</span><span class="n">state_dict</span><span class="p">(),</span>
<span class="gp">&gt;&gt;&gt; </span> <span class="s1">&#39;epoch&#39;</span><span class="p">:</span> <span class="mi">10</span><span class="p">,</span>
<span class="gp">&gt;&gt;&gt; </span> <span class="s1">&#39;rank1&#39;</span><span class="p">:</span> <span class="mf">0.5</span><span class="p">,</span>
<dl class="function">
<dt id="torchreid.utils.torchtools.load_checkpoint">
<code class="sig-prename descclassname">torchreid.utils.torchtools.</code><code class="sig-name descname">load_checkpoint</code><span class="sig-paren">(</span><em class="sig-param">fpath</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/utils/torchtools.html#load_checkpoint"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.utils.torchtools.load_checkpoint" title="Permalink to this definition"></a></dt>
<code class="descclassname">torchreid.utils.torchtools.</code><code class="descname">load_checkpoint</code><span class="sig-paren">(</span><em>fpath</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/utils/torchtools.html#load_checkpoint"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.utils.torchtools.load_checkpoint" title="Permalink to this definition"></a></dt>
<dd><p>Loads checkpoint.</p>
<p><code class="docutils literal notranslate"><span class="pre">UnicodeDecodeError</span></code> can be well handled, which means
python2-saved files can be read from python3.</p>
<dl class="field-list simple">
<dt class="field-odd">Parameters</dt>
<dd class="field-odd"><p><strong>fpath</strong> (<em>str</em>) path to checkpoint.</p>
</dd>
<dt class="field-even">Returns</dt>
<dd class="field-even"><p>dict</p>
</dd>
</dl>
<dl>
<dt>Examples:</dt><dd><div class="highlight-default notranslate"><div class="highlight"><pre><span></span><span class="gp">&gt;&gt;&gt; </span><span class="kn">from</span> <span class="nn">torchreid.utils</span> <span class="k">import</span> <span class="n">load_checkpoint</span>
<table class="docutils field-list" frame="void" rules="none">
<col class="field-name" />
<col class="field-body" />
<tbody valign="top">
<tr class="field-odd field"><th class="field-name">Parameters:</th><td class="field-body"><strong>fpath</strong> (<em>str</em>) path to checkpoint.</td>
</tr>
<tr class="field-even field"><th class="field-name">Returns:</th><td class="field-body">dict</td>
</tr>
</tbody>
</table>
<dl class="docutils">
<dt>Examples::</dt>
<dd><div class="first last highlight-default notranslate"><div class="highlight"><pre><span></span><span class="gp">&gt;&gt;&gt; </span><span class="kn">from</span> <span class="nn">torchreid.utils</span> <span class="k">import</span> <span class="n">load_checkpoint</span>
<span class="gp">&gt;&gt;&gt; </span><span class="n">fpath</span> <span class="o">=</span> <span class="s1">&#39;log/my_model/model.pth.tar-10&#39;</span>
<span class="gp">&gt;&gt;&gt; </span><span class="n">checkpoint</span> <span class="o">=</span> <span class="n">load_checkpoint</span><span class="p">(</span><span class="n">fpath</span><span class="p">)</span>
</pre></div>
</div>
</dd>
</dl>
</dd></dl>
<dl class="function">
<dt id="torchreid.utils.torchtools.resume_from_checkpoint">
<code class="sig-prename descclassname">torchreid.utils.torchtools.</code><code class="sig-name descname">resume_from_checkpoint</code><span class="sig-paren">(</span><em class="sig-param">fpath</em>, <em class="sig-param">model</em>, <em class="sig-param">optimizer=None</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/utils/torchtools.html#resume_from_checkpoint"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.utils.torchtools.resume_from_checkpoint" title="Permalink to this definition"></a></dt>
<code class="descclassname">torchreid.utils.torchtools.</code><code class="descname">resume_from_checkpoint</code><span class="sig-paren">(</span><em>fpath</em>, <em>model</em>, <em>optimizer=None</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/utils/torchtools.html#resume_from_checkpoint"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.utils.torchtools.resume_from_checkpoint" title="Permalink to this definition"></a></dt>
<dd><p>Resumes training from a checkpoint.</p>
<p>This will load (1) model weights and (2) <code class="docutils literal notranslate"><span class="pre">state_dict</span></code>
of optimizer if <code class="docutils literal notranslate"><span class="pre">optimizer</span></code> is not None.</p>
<dl class="field-list simple">
<dt class="field-odd">Parameters</dt>
<dd class="field-odd"><ul class="simple">
<li><p><strong>fpath</strong> (<em>str</em>) path to checkpoint.</p></li>
<li><p><strong>model</strong> (<em>nn.Module</em>) model.</p></li>
<li><p><strong>optimizer</strong> (<em>Optimizer</em><em>, </em><em>optional</em>) an Optimizer.</p></li>
<table class="docutils field-list" frame="void" rules="none">
<col class="field-name" />
<col class="field-body" />
<tbody valign="top">
<tr class="field-odd field"><th class="field-name">Parameters:</th><td class="field-body"><ul class="first simple">
<li><strong>fpath</strong> (<em>str</em>) path to checkpoint.</li>
<li><strong>model</strong> (<em>nn.Module</em>) model.</li>
<li><strong>optimizer</strong> (<em>Optimizer</em><em>, </em><em>optional</em>) an Optimizer.</li>
</ul>
</dd>
<dt class="field-even">Returns</dt>
<dd class="field-even"><p>start_epoch.</p>
</dd>
<dt class="field-odd">Return type</dt>
<dd class="field-odd"><p>int</p>
</dd>
</dl>
<dl>
<dt>Examples:</dt><dd><div class="highlight-default notranslate"><div class="highlight"><pre><span></span><span class="gp">&gt;&gt;&gt; </span><span class="kn">from</span> <span class="nn">torchreid.utils</span> <span class="k">import</span> <span class="n">resume_from_checkpoint</span>
<span class="gp">&gt;&gt;&gt; </span><span class="n">fpath</span> <span class="o">=</span> <span class="s1">&#39;log/my_model/model.pth.tar-10&#39;</span>
<span class="gp">&gt;&gt;&gt; </span><span class="n">start_epoch</span> <span class="o">=</span> <span class="n">resume_from_checkpoint</span><span class="p">(</span><span class="n">fpath</span><span class="p">,</span> <span class="n">model</span><span class="p">,</span> <span class="n">optimizer</span><span class="p">)</span>
</pre></div>
</div>
</dd>
</dl>
</dd></dl>
<dl class="function">
<dt id="torchreid.utils.torchtools.open_all_layers">
<code class="sig-prename descclassname">torchreid.utils.torchtools.</code><code class="sig-name descname">open_all_layers</code><span class="sig-paren">(</span><em class="sig-param">model</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/utils/torchtools.html#open_all_layers"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.utils.torchtools.open_all_layers" title="Permalink to this definition"></a></dt>
<code class="descclassname">torchreid.utils.torchtools.</code><code class="descname">open_all_layers</code><span class="sig-paren">(</span><em>model</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/utils/torchtools.html#open_all_layers"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.utils.torchtools.open_all_layers" title="Permalink to this definition"></a></dt>
<dd><p>Opens all layers in model for training.</p>
<dl>
<dt>Examples:</dt><dd><div class="highlight-default notranslate"><div class="highlight"><pre><span></span><span class="gp">&gt;&gt;&gt; </span><span class="kn">from</span> <span class="nn">torchreid.utils</span> <span class="k">import</span> <span class="n">open_all_layers</span>
<dl class="docutils">
<dt>Examples::</dt>
<dd><div class="first last highlight-default notranslate"><div class="highlight"><pre><span></span><span class="gp">&gt;&gt;&gt; </span><span class="kn">from</span> <span class="nn">torchreid.utils</span> <span class="k">import</span> <span class="n">open_all_layers</span>
<span class="gp">&gt;&gt;&gt; </span><span class="n">open_all_layers</span><span class="p">(</span><span class="n">model</span><span class="p">)</span>
</pre></div>
</div>
</dd>
</dl>
</dd></dl>
<dl class="function">
<dt id="torchreid.utils.torchtools.open_specified_layers">
<code class="sig-prename descclassname">torchreid.utils.torchtools.</code><code class="sig-name descname">open_specified_layers</code><span class="sig-paren">(</span><em class="sig-param">model</em>, <em class="sig-param">open_layers</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/utils/torchtools.html#open_specified_layers"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.utils.torchtools.open_specified_layers" title="Permalink to this definition"></a></dt>
<code class="descclassname">torchreid.utils.torchtools.</code><code class="descname">open_specified_layers</code><span class="sig-paren">(</span><em>model</em>, <em>open_layers</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/utils/torchtools.html#open_specified_layers"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.utils.torchtools.open_specified_layers" title="Permalink to this definition"></a></dt>
<dd><p>Opens specified layers in model for training while keeping
other layers frozen.</p>
<dl class="field-list simple">
<dt class="field-odd">Parameters</dt>
<dd class="field-odd"><ul class="simple">
<li><p><strong>model</strong> (<em>nn.Module</em>) neural net model.</p></li>
<li><p><strong>open_layers</strong> (<em>str</em><em> or </em><em>list</em>) layers open for training.</p></li>
<table class="docutils field-list" frame="void" rules="none">
<col class="field-name" />
<col class="field-body" />
<tbody valign="top">
<tr class="field-odd field"><th class="field-name">Parameters:</th><td class="field-body"><ul class="first last simple">
<li><strong>model</strong> (<em>nn.Module</em>) neural net model.</li>
<li><strong>open_layers</strong> (<em>str</em><em> or </em><em>list</em>) layers open for training.</li>
</ul>
</dd>
</dl>
<dl>
<dt>Examples:</dt><dd><div class="highlight-default notranslate"><div class="highlight"><pre><span></span><span class="gp">&gt;&gt;&gt; </span><span class="kn">from</span> <span class="nn">torchreid.utils</span> <span class="k">import</span> <span class="n">open_specified_layers</span>
<span class="gp">&gt;&gt;&gt; </span><span class="c1"># Only model.classifier will be updated.</span>
<span class="gp">&gt;&gt;&gt; </span><span class="n">open_layers</span> <span class="o">=</span> <span class="s1">&#39;classifier&#39;</span>
<span class="gp">&gt;&gt;&gt; </span><span class="n">open_specified_layers</span><span class="p">(</span><span class="n">model</span><span class="p">,</span> <span class="n">open_layers</span><span class="p">)</span>
</pre></div>
</div>
</dd>
</dl>
</dd></dl>
<dl class="function">
<dt id="torchreid.utils.torchtools.count_num_param">
<code class="sig-prename descclassname">torchreid.utils.torchtools.</code><code class="sig-name descname">count_num_param</code><span class="sig-paren">(</span><em class="sig-param">model</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/utils/torchtools.html#count_num_param"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.utils.torchtools.count_num_param" title="Permalink to this definition"></a></dt>
<code class="descclassname">torchreid.utils.torchtools.</code><code class="descname">count_num_param</code><span class="sig-paren">(</span><em>model</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/utils/torchtools.html#count_num_param"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.utils.torchtools.count_num_param" title="Permalink to this definition"></a></dt>
<dd><p>Counts number of parameters in a model while ignoring <code class="docutils literal notranslate"><span class="pre">self.classifier</span></code>.</p>
<dl class="field-list simple">
<dt class="field-odd">Parameters</dt>
<dd class="field-odd"><p><strong>model</strong> (<em>nn.Module</em>) network model.</p>
</dd>
</dl>
<dl>
<dt>Examples::</dt><dd><div class="highlight-default notranslate"><div class="highlight"><pre><span></span><span class="gp">&gt;&gt;&gt; </span><span class="kn">from</span> <span class="nn">torchreid.utils</span> <span class="k">import</span> <span class="n">count_num_param</span>
<table class="docutils field-list" frame="void" rules="none">
<col class="field-name" />
<col class="field-body" />
<tbody valign="top">
<tr class="field-odd field"><th class="field-name">Parameters:</th><td class="field-body"><strong>model</strong> (<em>nn.Module</em>) network model.</td>
</tr>
</tbody>
</table>
<dl class="docutils">
<dt>Examples::</dt>
<dd><div class="first last highlight-default notranslate"><div class="highlight"><pre><span></span><span class="gp">&gt;&gt;&gt; </span><span class="kn">from</span> <span class="nn">torchreid.utils</span> <span class="k">import</span> <span class="n">count_num_param</span>
<span class="gp">&gt;&gt;&gt; </span><span class="n">model_size</span> <span class="o">=</span> <span class="n">count_num_param</span><span class="p">(</span><span class="n">model</span><span class="p">)</span>
</pre></div>
</div>
</dd>
</dl>
<div class="admonition warning">
<p class="admonition-title">Warning</p>
<p>This method is deprecated in favor of
<p class="first admonition-title">Warning</p>
<p class="last">This method is deprecated in favor of
<code class="docutils literal notranslate"><span class="pre">torchreid.utils.compute_model_complexity</span></code>.</p>
</div>
</dd></dl>
<dl class="function">
<dt id="torchreid.utils.torchtools.load_pretrained_weights">
<code class="sig-prename descclassname">torchreid.utils.torchtools.</code><code class="sig-name descname">load_pretrained_weights</code><span class="sig-paren">(</span><em class="sig-param">model</em>, <em class="sig-param">weight_path</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/utils/torchtools.html#load_pretrained_weights"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.utils.torchtools.load_pretrained_weights" title="Permalink to this definition"></a></dt>
<code class="descclassname">torchreid.utils.torchtools.</code><code class="descname">load_pretrained_weights</code><span class="sig-paren">(</span><em>model</em>, <em>weight_path</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/utils/torchtools.html#load_pretrained_weights"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.utils.torchtools.load_pretrained_weights" title="Permalink to this definition"></a></dt>
<dd><p>Loads pretrained weights to a model.</p>
<dl class="simple">
<dt>Features::</dt><dd><ul class="simple">
<li><p>Incompatible layers (unmatched in name or size) will be ignored.</p></li>
<li><p>Can automatically deal with keys containing “module.”.</p></li>
<dl class="docutils">
<dt>Features::</dt>
<dd><ul class="first last simple">
<li>Incompatible layers (unmatched in name or size) will be ignored.</li>
<li>Can automatically deal with keys containing “module.”.</li>
</ul>
</dd>
</dl>
<dl class="field-list simple">
<dt class="field-odd">Parameters</dt>
<dd class="field-odd"><ul class="simple">
<li><p><strong>model</strong> (<em>nn.Module</em>) network model.</p></li>
<li><p><strong>weight_path</strong> (<em>str</em>) path to pretrained weights.</p></li>
<table class="docutils field-list" frame="void" rules="none">
<col class="field-name" />
<col class="field-body" />
<tbody valign="top">
<tr class="field-odd field"><th class="field-name">Parameters:</th><td class="field-body"><ul class="first last simple">
<li><strong>model</strong> (<em>nn.Module</em>) network model.</li>
<li><strong>weight_path</strong> (<em>str</em>) path to pretrained weights.</li>
</ul>
</td>
</tr>
</tbody>
</table>
<dl class="docutils">
<dt>Examples::</dt>
<dd><div class="first last highlight-default notranslate"><div class="highlight"><pre><span></span><span class="gp">&gt;&gt;&gt; </span><span class="kn">from</span> <span class="nn">torchreid.utils</span> <span class="k">import</span> <span class="n">load_pretrained_weights</span>
<span class="gp">&gt;&gt;&gt; </span><span class="n">weight_path</span> <span class="o">=</span> <span class="s1">&#39;log/my_model/model-best.pth.tar&#39;</span>
<span class="gp">&gt;&gt;&gt; </span><span class="n">load_pretrained_weights</span><span class="p">(</span><span class="n">model</span><span class="p">,</span> <span class="n">weight_path</span><span class="p">)</span>
</pre></div>
@ -576,32 +631,37 @@ other layers frozen.</p>
<span class="target" id="module-torchreid.utils.model_complexity"></span><dl class="function">
<dt id="torchreid.utils.model_complexity.compute_model_complexity">
<code class="sig-prename descclassname">torchreid.utils.model_complexity.</code><code class="sig-name descname">compute_model_complexity</code><span class="sig-paren">(</span><em class="sig-param">model</em>, <em class="sig-param">input_size</em>, <em class="sig-param">verbose=False</em>, <em class="sig-param">only_conv_linear=True</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/utils/model_complexity.html#compute_model_complexity"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.utils.model_complexity.compute_model_complexity" title="Permalink to this definition"></a></dt>
<code class="descclassname">torchreid.utils.model_complexity.</code><code class="descname">compute_model_complexity</code><span class="sig-paren">(</span><em>model</em>, <em>input_size</em>, <em>verbose=False</em>, <em>only_conv_linear=True</em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/torchreid/utils/model_complexity.html#compute_model_complexity"><span class="viewcode-link">[source]</span></a><a class="headerlink" href="#torchreid.utils.model_complexity.compute_model_complexity" title="Permalink to this definition"></a></dt>
<dd><p>Returns the number of parameters and FLOPs.</p>
<div class="admonition note">
<p class="admonition-title">Note</p>
<p>(1) this function only provides an estimate of the theoretical time complexity
<p class="first admonition-title">Note</p>
<p class="last">(1) this function only provides an estimate of the theoretical time complexity
rather than the actual running time which depends on implementations and hardware,
and (2) the FLOPs is only counted for layers that are used at test time. This means
that redundant layers such as person ID classification layer will be ignored as it
is discarded when doing feature extraction. Note that the inference graph depends on
how you construct the computations in <code class="docutils literal notranslate"><span class="pre">forward()</span></code>.</p>
</div>
<dl class="field-list simple">
<dt class="field-odd">Parameters</dt>
<dd class="field-odd"><ul class="simple">
<li><p><strong>model</strong> (<em>nn.Module</em>) network model.</p></li>
<li><p><strong>input_size</strong> (<em>tuple</em>) input size, e.g. (1, 3, 256, 128).</p></li>
<li><p><strong>verbose</strong> (<em>bool</em><em>, </em><em>optional</em>) shows detailed complexity of
each module. Default is False.</p></li>
<li><p><strong>only_conv_linear</strong> (<em>bool</em><em>, </em><em>optional</em>) only considers convolution
<table class="docutils field-list" frame="void" rules="none">
<col class="field-name" />
<col class="field-body" />
<tbody valign="top">
<tr class="field-odd field"><th class="field-name">Parameters:</th><td class="field-body"><ul class="first last simple">
<li><strong>model</strong> (<em>nn.Module</em>) network model.</li>
<li><strong>input_size</strong> (<em>tuple</em>) input size, e.g. (1, 3, 256, 128).</li>
<li><strong>verbose</strong> (<em>bool</em><em>, </em><em>optional</em>) shows detailed complexity of
each module. Default is False.</li>
<li><strong>only_conv_linear</strong> (<em>bool</em><em>, </em><em>optional</em>) – only considers convolution
and linear layers when counting FLOPs. Default is True.
If set to False, FLOPs of all layers will be counted.</li>
</ul>
</td>
</tr>
</tbody>
</table>
<dl class="docutils">
<dt>Examples::</dt>
<dd><div class="first last highlight-default notranslate"><div class="highlight"><pre><span></span><span class="gp">&gt;&gt;&gt; </span><span class="kn">from</span> <span class="nn">torchreid</span> <span class="k">import</span> <span class="n">models</span><span class="p">,</span> <span class="n">utils</span>
<span class="gp">&gt;&gt;&gt; </span><span class="n">model</span> <span class="o">=</span> <span class="n">models</span><span class="o">.</span><span class="n">build_model</span><span class="p">(</span><span class="n">name</span><span class="o">=</span><span class="s1">&#39;resnet50&#39;</span><span class="p">,</span> <span class="n">num_classes</span><span class="o">=</span><span class="mi">1000</span><span class="p">)</span>
<span class="gp">&gt;&gt;&gt; </span><span class="n">num_params</span><span class="p">,</span> <span class="n">flops</span> <span class="o">=</span> <span class="n">utils</span><span class="o">.</span><span class="n">compute_model_complexity</span><span class="p">(</span><span class="n">model</span><span class="p">,</span> <span class="p">(</span><span class="mi">1</span><span class="p">,</span> <span class="mi">3</span><span class="p">,</span> <span class="mi">256</span><span class="p">,</span> <span class="mi">128</span><span class="p">),</span> <span class="n">verbose</span><span class="o">=</span><span class="kc">True</span><span class="p">)</span>
</pre></div>

File diff suppressed because one or more lines are too long

View File

@ -191,23 +191,23 @@
<h1>How-to<a class="headerlink" href="#how-to" title="Permalink to this headline"></a></h1>
<div class="contents local topic" id="contents">
<ul class="simple">
<li><p><a class="reference internal" href="#find-model-keys" id="id1">Find model keys</a></p></li>
<li><p><a class="reference internal" href="#show-available-models" id="id2">Show available models</a></p></li>
<li><p><a class="reference internal" href="#change-the-training-sampler" id="id3">Change the training sampler</a></p></li>
<li><p><a class="reference internal" href="#choose-an-optimizer-lr-scheduler" id="id4">Choose an optimizer/lr_scheduler</a></p></li>
<li><p><a class="reference internal" href="#resume-training" id="id5">Resume training</a></p></li>
<li><p><a class="reference internal" href="#compute-model-complexity" id="id6">Compute model complexity</a></p></li>
<li><p><a class="reference internal" href="#combine-multiple-datasets" id="id7">Combine multiple datasets</a></p></li>
<li><p><a class="reference internal" href="#do-cross-dataset-evaluation" id="id8">Do cross-dataset evaluation</a></p></li>
<li><p><a class="reference internal" href="#combine-train-query-and-gallery" id="id9">Combine train, query and gallery</a></p></li>
<li><p><a class="reference internal" href="#optimize-layers-with-different-learning-rates" id="id10">Optimize layers with different learning rates</a></p></li>
<li><p><a class="reference internal" href="#do-two-stepped-transfer-learning" id="id11">Do two-stepped transfer learning</a></p></li>
<li><p><a class="reference internal" href="#test-a-trained-model" id="id12">Test a trained model</a></p></li>
<li><p><a class="reference internal" href="#visualize-learning-curves-with-tensorboard" id="id13">Visualize learning curves with tensorboard</a></p></li>
<li><p><a class="reference internal" href="#visualize-ranked-results" id="id14">Visualize ranked results</a></p></li>
<li><p><a class="reference internal" href="#visualize-activation-maps" id="id15">Visualize activation maps</a></p></li>
<li><p><a class="reference internal" href="#use-your-own-dataset" id="id16">Use your own dataset</a></p></li>
<li><p><a class="reference internal" href="#design-your-own-engine" id="id17">Design your own Engine</a></p></li>
<li><a class="reference internal" href="#find-model-keys" id="id1">Find model keys</a></li>
<li><a class="reference internal" href="#show-available-models" id="id2">Show available models</a></li>
<li><a class="reference internal" href="#change-the-training-sampler" id="id3">Change the training sampler</a></li>
<li><a class="reference internal" href="#choose-an-optimizer-lr-scheduler" id="id4">Choose an optimizer/lr_scheduler</a></li>
<li><a class="reference internal" href="#resume-training" id="id5">Resume training</a></li>
<li><a class="reference internal" href="#compute-model-complexity" id="id6">Compute model complexity</a></li>
<li><a class="reference internal" href="#combine-multiple-datasets" id="id7">Combine multiple datasets</a></li>
<li><a class="reference internal" href="#do-cross-dataset-evaluation" id="id8">Do cross-dataset evaluation</a></li>
<li><a class="reference internal" href="#combine-train-query-and-gallery" id="id9">Combine train, query and gallery</a></li>
<li><a class="reference internal" href="#optimize-layers-with-different-learning-rates" id="id10">Optimize layers with different learning rates</a></li>
<li><a class="reference internal" href="#do-two-stepped-transfer-learning" id="id11">Do two-stepped transfer learning</a></li>
<li><a class="reference internal" href="#test-a-trained-model" id="id12">Test a trained model</a></li>
<li><a class="reference internal" href="#visualize-learning-curves-with-tensorboard" id="id13">Visualize learning curves with tensorboard</a></li>
<li><a class="reference internal" href="#visualize-ranked-results" id="id14">Visualize ranked results</a></li>
<li><a class="reference internal" href="#visualize-activation-maps" id="id15">Visualize activation maps</a></li>
<li><a class="reference internal" href="#use-your-own-dataset" id="id16">Use your own dataset</a></li>
<li><a class="reference internal" href="#design-your-own-engine" id="id17">Design your own Engine</a></li>
</ul>
</div>
<div class="section" id="find-model-keys">
@ -261,7 +261,7 @@
<span class="n">utils</span><span class="o">.</span><span class="n">compute_model_complexity</span><span class="p">(</span><span class="n">model</span><span class="p">,</span> <span class="p">(</span><span class="mi">1</span><span class="p">,</span> <span class="mi">3</span><span class="p">,</span> <span class="mi">256</span><span class="p">,</span> <span class="mi">128</span><span class="p">),</span> <span class="n">verbose</span><span class="o">=</span><span class="bp">True</span><span class="p">,</span> <span class="n">only_conv_linear</span><span class="o">=</span><span class="bp">False</span><span class="p">)</span>
</pre></div>
</div>
<p>Note that (1) this function only provides an estimate of the theoretical time complexity rather than the actual running time, which depends on implementations and hardware; (2) FLOPs are only counted for layers that are used at test time, so redundant layers such as the person ID classification layer will be ignored. The inference graph depends on how you define the computations in <code class="docutils literal notranslate"><span class="pre">forward()</span></code>.</p>
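<p>For instance, a hypothetical module like the following (the class and layer names are illustrative, not part of torchreid) returns features directly in eval mode, so its classifier contributes nothing to the test-time FLOPs count:</p>
<div class="highlight-python notranslate"><div class="highlight"><pre><span></span>import torch.nn as nn

class TinyReID(nn.Module):
    """Hypothetical model: the classifier is skipped at test time."""

    def __init__(self, num_classes=751):
        super(TinyReID, self).__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1),
            nn.ReLU(inplace=True),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(32, num_classes)

    def forward(self, x):
        v = self.backbone(x).flatten(1)
        if not self.training:
            return v  # feature extraction only; classifier is not traced
        return self.classifier(v)
</pre></div>
</div>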
</div>
<div class="section" id="combine-multiple-datasets">
<h2><a class="toc-backref" href="#id7">Combine multiple datasets</a><a class="headerlink" href="#combine-multiple-datasets" title="Permalink to this headline"></a></h2>
@ -305,7 +305,7 @@
<span class="p">)</span>
</pre></div>
</div>
<p>More specifically, with <code class="docutils literal notranslate"><span class="pre">combineall=False</span></code>, you will get</p>
<div class="highlight-none notranslate"><div class="highlight"><pre><span></span>=&gt; Loaded Market1501
----------------------------------------
subset | # ids | # images | # cameras
@ -316,7 +316,7 @@
---------------------------------------
</pre></div>
</div>
<p>with <code class="docutils literal notranslate"><span class="pre">combineall=True</span></code>, you will get</p>
<div class="highlight-none notranslate"><div class="highlight"><pre><span></span>=&gt; Loaded Market1501
----------------------------------------
subset | # ids | # images | # cameras
@ -348,8 +348,8 @@
</div>
<div class="section" id="do-two-stepped-transfer-learning">
<h2><a class="toc-backref" href="#id11">Do two-stepped transfer learning</a><a class="headerlink" href="#do-two-stepped-transfer-learning" title="Permalink to this headline"></a></h2>
<p>To prevent the pretrained layers from being damaged by harmful gradients back-propagated from randomly initialized layers, one can adopt the <em>two-stepped transfer learning strategy</em> presented in <a class="reference external" href="https://arxiv.org/abs/1611.05244">Deep Transfer Learning for Person Re-identification</a>. The basic idea is to pretrain the randomly initialized layers for a few epochs while keeping the base layers frozen, before training all layers end-to-end.</p>
<p>This has been implemented in <code class="docutils literal notranslate"><span class="pre">Engine.train()</span></code> (see <a class="reference internal" href="pkg/engine.html#torchreid-engine"><span class="std std-ref">torchreid.engine</span></a>). The arguments related to this feature are <code class="docutils literal notranslate"><span class="pre">fixbase_epoch</span></code> and <code class="docutils literal notranslate"><span class="pre">open_layers</span></code>. Intuitively, <code class="docutils literal notranslate"><span class="pre">fixbase_epoch</span></code> denotes the number of epochs during which the base layers are kept frozen; <code class="docutils literal notranslate"><span class="pre">open_layers</span></code> specifies which layers are open for training during those epochs.</p>
<p>For example, to pretrain the classification layer named “classifier” in ResNet50 for 5 epochs before training all layers, you can do</p>
<div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">engine</span><span class="o">.</span><span class="n">run</span><span class="p">(</span>
<span class="n">save_dir</span><span class="o">=</span><span class="s1">&#39;log/resnet50&#39;</span><span class="p">,</span>
@ -372,13 +372,13 @@
</div>
<div class="section" id="visualize-learning-curves-with-tensorboard">
<h2><a class="toc-backref" href="#id13">Visualize learning curves with tensorboard</a><a class="headerlink" href="#visualize-learning-curves-with-tensorboard" title="Permalink to this headline"></a></h2>
<p>The <code class="docutils literal notranslate"><span class="pre">SummaryWriter()</span></code> for tensorboard will be automatically initialized in <code class="docutils literal notranslate"><span class="pre">engine.run()</span></code> when you train your model, so no extra work is needed. After training is done, the <code class="docutils literal notranslate"><span class="pre">*tf.events*</span></code> file will be saved in <code class="docutils literal notranslate"><span class="pre">save_dir</span></code>. You can then run <code class="docutils literal notranslate"><span class="pre">tensorboard</span> <span class="pre">--logdir=your_save_dir</span></code> in your terminal and visit <code class="docutils literal notranslate"><span class="pre">http://localhost:6006/</span></code> in a web browser. See <a class="reference external" href="https://pytorch.org/docs/stable/tensorboard.html">pytorch tensorboard</a> for further information.</p>
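<p>For example, assuming the model was trained with <code class="docutils literal notranslate"><span class="pre">save_dir='log/resnet50'</span></code>, you would run</p>
<div class="highlight-shell notranslate"><div class="highlight"><pre><span></span>tensorboard --logdir<span class="o">=</span>log/resnet50</pre></div>
</div>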
</div>
<div class="section" id="visualize-ranked-results">
<h2><a class="toc-backref" href="#id14">Visualize ranked results</a><a class="headerlink" href="#visualize-ranked-results" title="Permalink to this headline"></a></h2>
<p>Ranked images can be visualized by setting <code class="docutils literal notranslate"><span class="pre">visrank</span></code> to True in <code class="docutils literal notranslate"><span class="pre">engine.run()</span></code>. <code class="docutils literal notranslate"><span class="pre">visrank_topk</span></code> determines the top-k images to be visualized (default is <code class="docutils literal notranslate"><span class="pre">visrank_topk=10</span></code>). Note that <code class="docutils literal notranslate"><span class="pre">visrank</span></code> can only be used in test mode, i.e. <code class="docutils literal notranslate"><span class="pre">test_only=True</span></code> in <code class="docutils literal notranslate"><span class="pre">engine.run()</span></code>. The images will be saved under <code class="docutils literal notranslate"><span class="pre">save_dir/visrank_DATASETNAME</span></code>, where each image contains the top-k ranked list for a query. An example is shown below; red and green denote incorrect and correct matches respectively.</p>
<a class="reference internal image-reference" href="_images/ranked_results.jpg"><img alt="_images/ranked_results.jpg" class="align-center" src="_images/ranked_results.jpg" style="width: 800px;" /></a>
<p>An example command for <code class="docutils literal notranslate"><span class="pre">scripts/main.py</span></code> is</p>
<div class="highlight-shell notranslate"><div class="highlight"><pre><span></span>python scripts/main.py <span class="se">\</span>
--root <span class="nv">$DATA</span> <span class="se">\</span>
-s market1501 <span class="se">\</span>
@ -397,7 +397,7 @@
<h2><a class="toc-backref" href="#id15">Visualize activation maps</a><a class="headerlink" href="#visualize-activation-maps" title="Permalink to this headline"></a></h2>
<p>To understand which image regions the CNN focuses on to extract features for ReID, you can visualize the activation maps as in <a class="reference external" href="https://arxiv.org/abs/1905.00953">OSNet</a>. This can be achieved by setting <code class="docutils literal notranslate"><span class="pre">visactmap=True</span></code> in <code class="docutils literal notranslate"><span class="pre">engine.run()</span></code> (<code class="docutils literal notranslate"><span class="pre">test_only</span></code> does not have to be True, as <code class="docutils literal notranslate"><span class="pre">visactmap</span></code> is independent of <code class="docutils literal notranslate"><span class="pre">test_only</span></code>; see the code for details). Images will be saved in <code class="docutils literal notranslate"><span class="pre">save_dir/actmap_DATASETNAME</span></code>. An example is shown below (from left to right: image, activation map, overlapped image).</p>
<a class="reference internal image-reference" href="_images/actmap.jpg"><img alt="_images/actmap.jpg" class="align-center" src="_images/actmap.jpg" style="width: 300px;" /></a>
<p>An example command for <code class="docutils literal notranslate"><span class="pre">scripts/main.py</span></code> is</p>
<div class="highlight-shell notranslate"><div class="highlight"><pre><span></span>python scripts/main.py <span class="se">\</span>
--root <span class="nv">$DATA</span> <span class="se">\</span>
-s market1501 <span class="se">\</span>
@ -410,14 +410,14 @@
</pre></div>
</div>
<div class="admonition note">
<p class="admonition-title">Note</p>
<p>In order to visualize activation maps, the CNN needs to output the last convolutional feature maps at eval mode. See <code class="docutils literal notranslate"><span class="pre">torchreid/models/osnet.py</span></code> for example.</p>
<p class="first admonition-title">Note</p>
<p class="last">In order to visualize activation maps, the CNN needs to output the last convolutional feature maps at eval mode. See <code class="docutils literal notranslate"><span class="pre">torchreid/models/osnet.py</span></code> for example.</p>
</div>
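<p>A minimal sketch of this pattern is shown below (the module and layer names are hypothetical; the actual mechanism in <code class="docutils literal notranslate"><span class="pre">osnet.py</span></code> may differ, so check the source):</p>
<div class="highlight-python notranslate"><div class="highlight"><pre><span></span>import torch.nn as nn

class ActMapNet(nn.Module):
    """Hypothetical CNN illustrating the eval-mode feature-map pattern."""

    def __init__(self, num_classes=751):
        super(ActMapNet, self).__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(3, 64, 3, padding=1),
            nn.ReLU(inplace=True),
        )
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.classifier = nn.Linear(64, num_classes)

    def forward(self, x):
        f = self.conv(x)  # last convolutional feature maps
        if not self.training:
            return f  # eval mode: expose feature maps for visualization
        v = self.pool(f).flatten(1)
        return self.classifier(v)
</pre></div>
</div>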
</div>
<div class="section" id="use-your-own-dataset">
<h2><a class="toc-backref" href="#id16">Use your own dataset</a><a class="headerlink" href="#use-your-own-dataset" title="Permalink to this headline"></a></h2>
<ol class="arabic simple">
<li>Write your own dataset class. Below is a template for an image dataset; it can also be applied to a video dataset class, for which you simply change <code class="docutils literal notranslate"><span class="pre">ImageDataset</span></code> to <code class="docutils literal notranslate"><span class="pre">VideoDataset</span></code>.</li>
</ol>
<div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="kn">from</span> <span class="nn">__future__</span> <span class="kn">import</span> <span class="n">absolute_import</span>
<span class="kn">from</span> <span class="nn">__future__</span> <span class="kn">import</span> <span class="n">print_function</span>
@ -458,14 +458,14 @@
</pre></div>
</div>
<ol class="arabic simple" start="2">
<li>Register your dataset.</li>
</ol>
<div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="kn">import</span> <span class="nn">torchreid</span>
<span class="n">torchreid</span><span class="o">.</span><span class="n">data</span><span class="o">.</span><span class="n">register_image_dataset</span><span class="p">(</span><span class="s1">&#39;new_dataset&#39;</span><span class="p">,</span> <span class="n">NewDataset</span><span class="p">)</span>
</pre></div>
</div>
<ol class="arabic simple" start="3">
<li>Initialize a data manager with your dataset.</li>
</ol>
<div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="c1"># use your own dataset only</span>
<span class="n">datamanager</span> <span class="o">=</span> <span class="n">torchreid</span><span class="o">.</span><span class="n">data</span><span class="o">.</span><span class="n">ImageDataManager</span><span class="p">(</span>
@ -488,7 +488,7 @@
</div>
<div class="section" id="design-your-own-engine">
<h2><a class="toc-backref" href="#id17">Design your own Engine</a><a class="headerlink" href="#design-your-own-engine" title="Permalink to this headline"></a></h2>
<p>A new Engine should be designed if you have your own loss function. The base Engine class <code class="docutils literal notranslate"><span class="pre">torchreid.engine.Engine</span></code> has implemented some generic methods which you can inherit to avoid re-writing; please refer to the source code for more details. We suggest you look at how <code class="docutils literal notranslate"><span class="pre">ImageSoftmaxEngine</span></code> and <code class="docutils literal notranslate"><span class="pre">ImageTripletEngine</span></code> are constructed (and likewise <code class="docutils literal notranslate"><span class="pre">VideoSoftmaxEngine</span></code> and <code class="docutils literal notranslate"><span class="pre">VideoTripletEngine</span></code>). Often, all you need to implement is a <code class="docutils literal notranslate"><span class="pre">train()</span></code> function.</p>
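<p>A minimal sketch of such an engine follows. The constructor arguments, the <code class="docutils literal notranslate"><span class="pre">train()</span></code> signature, and helpers such as <code class="docutils literal notranslate"><span class="pre">_parse_data_for_train()</span></code> mirror the pattern of the built-in engines, but they are assumptions here and should be checked against the source before copying:</p>
<div class="highlight-python notranslate"><div class="highlight"><pre><span></span>from torchreid.engine import Engine

class ImageMyLossEngine(Engine):
    # Hypothetical engine with a custom loss; attribute and method
    # names should be verified against the base class before use.
    def __init__(self, datamanager, model, optimizer, my_criterion,
                 scheduler=None, use_cpu=False):
        super(ImageMyLossEngine, self).__init__(
            datamanager, model, optimizer, scheduler, use_cpu
        )
        self.criterion = my_criterion

    def train(self, epoch, max_epoch, **kwargs):
        self.model.train()
        for batch_idx, data in enumerate(self.train_loader):
            imgs, pids = self._parse_data_for_train(data)
            if self.use_gpu:
                imgs, pids = imgs.cuda(), pids.cuda()
            outputs = self.model(imgs)
            loss = self.criterion(outputs, pids)  # your custom loss
            self.optimizer.zero_grad()
            loss.backward()
            self.optimizer.step()
</pre></div>
</div>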
</div>
</div>