# BYOL

> [Bootstrap your own latent: A new approach to self-supervised Learning](https://arxiv.org/abs/2006.07733)

## Abstract

**B**ootstrap **Y**our **O**wn **L**atent (BYOL) is a new approach to self-supervised image representation learning. BYOL relies on two neural networks, referred to as online and target networks, that interact and learn from each other. From an augmented view of an image, we train the online network to predict the target network representation of the same image under a different augmented view. At the same time, we update the target network with a slow-moving average of the online network.
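The two ingredients described above, a prediction loss between the two branches and a momentum update of the target network, can be sketched in a few lines. This is a minimal illustration of the BYOL idea, not the mmpretrain implementation; the names `online_net`, `target_net` and `predictor`, and the momentum value, are illustrative assumptions.

```python
# Minimal sketch of the BYOL objective and momentum update (illustration only,
# not the mmpretrain implementation). `online_net`, `target_net` and
# `predictor` are assumed to be torch.nn.Module instances with matching
# output dimensions.
import torch
import torch.nn.functional as F


def byol_loss(predicted, target_proj):
    """Negative cosine similarity between the online prediction and the
    stop-gradient target projection. In practice the loss is symmetrized
    over the two augmented views."""
    p = F.normalize(predicted, dim=-1)
    z = F.normalize(target_proj.detach(), dim=-1)  # no gradient to the target
    return 2 - 2 * (p * z).sum(dim=-1).mean()


@torch.no_grad()
def momentum_update(online_net, target_net, momentum=0.996):
    """Update the target network as a slow-moving (exponential moving)
    average of the online network's parameters."""
    for p_online, p_target in zip(online_net.parameters(), target_net.parameters()):
        p_target.data.mul_(momentum).add_(p_online.data, alpha=1 - momentum)
```

Only the online branch carries a predictor head and receives gradients; the target branch is updated solely through the moving average, which is what prevents the representations from collapsing without negative pairs.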
## How to use it?

**Predict image**

```python
from mmpretrain import inference_model

predict = inference_model('resnet50_byol-pre_8xb512-linear-coslr-90e_in1k', 'demo/bird.JPEG')
print(predict['pred_class'])
print(predict['pred_score'])
```

**Use the model**

```python
import torch
from mmpretrain import get_model

model = get_model('byol_resnet50_16xb256-coslr-200e_in1k', pretrained=True)

inputs = torch.rand(1, 3, 224, 224)
out = model(inputs)
print(type(out))

# To extract features.
feats = model.extract_feat(inputs)
print(type(feats))
```

**Train/Test Command**

Prepare your dataset according to the [docs](https://mmclassification.readthedocs.io/en/1.x/user_guides/dataset_prepare.html#prepare-dataset).

Train:

```shell
python tools/train.py configs/byol/byol_resnet50_16xb256-coslr-200e_in1k.py
```

Test:

```shell
python tools/test.py configs/byol/benchmarks/resnet50_8xb512-linear-coslr-90e_in1k.py https://download.openmmlab.com/mmselfsup/1.x/byol/byol_resnet50_16xb256-coslr-200e_in1k/resnet50_linear-8xb512-coslr-90e_in1k/resnet50_linear-8xb512-coslr-90e_in1k_20220825-7596c6f5.pth
```

## Models and results

### Pretrained models

| Model                                   | Params (M) | Flops (G) | Config                                             | Download |
| :-------------------------------------- | :--------: | :-------: | :------------------------------------------------: | :------: |
| `byol_resnet50_16xb256-coslr-200e_in1k` | 68.02      | 4.11      | [config](byol_resnet50_16xb256-coslr-200e_in1k.py) | [model](https://download.openmmlab.com/mmselfsup/1.x/byol/byol_resnet50_16xb256-coslr-200e_in1k/byol_resnet50_16xb256-coslr-200e_in1k_20220825-de817331.pth) \| [log](https://download.openmmlab.com/mmselfsup/1.x/byol/byol_resnet50_16xb256-coslr-200e_in1k/byol_resnet50_16xb256-coslr-200e_in1k_20220825-de817331.json) |

### Image Classification on ImageNet-1k

| Model                                            | Pretrain | Params (M) | Flops (G) | Top-1 (%) | Config                                                        | Download |
| :----------------------------------------------- | :------: | :--------: | :-------: | :-------: | :-----------------------------------------------------------: | :------: |
| `resnet50_byol-pre_8xb512-linear-coslr-90e_in1k` | [BYOL](https://download.openmmlab.com/mmselfsup/1.x/byol/byol_resnet50_16xb256-coslr-200e_in1k/byol_resnet50_16xb256-coslr-200e_in1k_20220825-de817331.pth) | 25.56 | 4.11 | 71.80 | [config](benchmarks/resnet50_8xb512-linear-coslr-90e_in1k.py) | [model](https://download.openmmlab.com/mmselfsup/1.x/byol/byol_resnet50_16xb256-coslr-200e_in1k/resnet50_linear-8xb512-coslr-90e_in1k/resnet50_linear-8xb512-coslr-90e_in1k_20220825-7596c6f5.pth) \| [log](https://download.openmmlab.com/mmselfsup/1.x/byol/byol_resnet50_16xb256-coslr-200e_in1k/resnet50_linear-8xb512-coslr-90e_in1k/resnet50_linear-8xb512-coslr-90e_in1k_20220825-7596c6f5.json) |

## Citation

```bibtex
@inproceedings{grill2020bootstrap,
  title={Bootstrap your own latent: A new approach to self-supervised learning},
  author={Grill, Jean-Bastien and Strub, Florian and Altch{\'e}, Florent and Tallec, Corentin and Richemond, Pierre H and Buchatskaya, Elena and Doersch, Carl and Pires, Bernardo Avila and Guo, Zhaohan Daniel and Azar, Mohammad Gheshlaghi and others},
  booktitle={NeurIPS},
  year={2020}
}
```