update README

max410011 2023-06-26 17:15:18 +00:00
parent a21296debb
commit e3de89e879


@@ -4,6 +4,75 @@ This repository contains the PyTorch training code for the original DeiT models. C
Here, I have built an interface and added some naive methods for introducing sparsity into the ViT.
## Sparsity NAS Training scripts
- Normal command
- training
```
python -m torch.distributed.launch --master_port 29510 --nproc_per_node=2 --use_env main.py \
--data-path /dataset/imagenet \
--epochs 150 \
--pretrained \
--lr 5e-5 \
--min-lr 1e-6 \
--nas-mode \
--nas-config configs/deit_small_nxm_uniform24.yaml \
--nas-test-config 2 4 \
--output_dir nas_uniform_24_150epoch \
--wandb
```
- eval
```
python -m torch.distributed.launch --master_port 29510 --nproc_per_node=2 --use_env main.py \
--data-path /dataset/imagenet \
--nas-mode \
--nas-config configs/deit_small_nxm_uniform24.yaml \
--nas-weights nas_uniform_24_150epoch/best_checkpoint.pth \
--nas-test-config 2 4 \
--eval
```
- KD command
- training
```
python -m torch.distributed.launch --master_port 29510 --nproc_per_node=2 --use_env main.py \
--data-path /dataset/imagenet \
--epochs 150 \
--pretrained \
--lr 5e-5 \
--min-lr 1e-6 \
--nas-mode \
--nas-config configs/deit_small_nxm_nas_1234.yaml \
--nas-test-config 2 4 \
--output_dir KD_nas_124+13_150epoch \
--teacher-model deit_small_patch16_224 \
--distillation-type soft \
--distillation-alpha 1.0 \
--wandb
```
- eval
```
python -m torch.distributed.launch --master_port 29510 --nproc_per_node=2 --use_env main.py \
--data-path /dataset/imagenet \
--nas-mode \
--nas-config configs/deit_small_nxm_uniform24.yaml \
--nas-weights KD_nas_124+13_150epoch/checkpoint.pth \
--nas-test-config 2 4 \
--eval
```
- Cifar-100 command
- training
```
python main.py \
--model deit_small_patch16_224 \
--batch-size 256 \
--finetune https://dl.fbaipublicfiles.com/deit/deit_small_patch16_224-cd65a155.pth \
--data-set CIFAR \
--data-path /dataset/cifar100 \
--opt sgd \
--weight-decay 1e-4 \
--lr 1e-2 \
--output_dir deit_s_224_cifar_100 \
--epochs 500
```
## Support Sparsity Searching Algorithm
Currently, we support the following sparsity strategies:
@@ -23,7 +92,7 @@ We can provide a custom config that defines the target sparsity of each layer.
Currently, we support two kinds of sparsity: `nxm` and `unstructuted`.
Users can create a `yaml` file that describes the details and pass it to the main function by adding the `--custom-config [path to config file]` argument when calling `main.py`.
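A custom config might look roughly like the sketch below. This is only an illustrative assumption of what a per-layer sparsity description could contain; the field names (`sparsity_type`, `layers`, `n`, `m`) are hypothetical and may not match this repository's actual schema, so please refer to the provided files under `configs/` (e.g. `configs/deit_small_nxm_uniform24.yaml`) for the real format.
```
# Hypothetical custom sparsity config -- field names are illustrative only,
# not necessarily the schema parsed by main.py. Check configs/ for real examples.
sparsity_type: nxm          # or `unstructuted`
layers:
  blocks.0.attn.qkv:
    n: 2                    # keep 2 weights ...
    m: 4                    # ... out of every group of 4
  blocks.0.mlp.fc1:
    n: 1
    m: 4
```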
## Example Usage
## Example Usage (Pruning method)
To run DeiT-S with a custom configuration and evaluate the accuracy before fine-tuning:
```
python main.py \