SD-MAE

[Figure: LTRP illustration]

Pretrained models

arch       params   download
ViT-S/16   21M      NCT-CRC's model, NCT-CRC's logs, PCam's model, PCam's logs

Training

Please install PyTorch and download the NCT-CRC-HE and PatchCamelyon datasets. This codebase is built on MAE; more information about the requirements can be found there.
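
PatchCamelyon is distributed as HDF5 files, while MAE-style dataloaders typically read an ImageFolder-style directory of images. Below is a minimal conversion sketch; the HDF5 file names, dataset keys, and target directory layout are assumptions and should be adjusted to whatever run_mae_pretraining.py and run_class_finetuning.py actually expect.

import os
import h5py
from PIL import Image

def convert_split(x_h5, y_h5, out_dir):
    # Dump every patch into a class subfolder (0 = normal, 1 = tumor).
    with h5py.File(x_h5, "r") as fx, h5py.File(y_h5, "r") as fy:
        images, labels = fx["x"], fy["y"]
        for i in range(len(images)):
            label = int(labels[i].squeeze())
            cls_dir = os.path.join(out_dir, str(label))
            os.makedirs(cls_dir, exist_ok=True)
            Image.fromarray(images[i]).save(os.path.join(cls_dir, f"{i}.png"))

# File names follow the official PCam release; adjust paths as needed.
convert_split("camelyonpatch_level_2_split_train_x.h5",
              "camelyonpatch_level_2_split_train_y.h5",
              "yourpath/pCam/train")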

Single-node training on PatchCamelyon dataset

Run SD-MAE pretraining with a ViT-Small model on a single node with 4 GPUs for 100 epochs using the following command. We provide training logs for this run to help with reproducibility.

python -m torch.distributed.launch --nproc_per_node=4  SD-MAE/run_mae_pretraining.py  \
    --data_path yourpath/pCam \
    --batch_size 256 \
    --model ltrp_base_and_vs \
    --mask_ratio 0.6 \
    --epochs 100 \
    --dino_head_dim 4096 \
    --dino_bottleneck_dim 256 \
    --dino_hidden_dim 2048 \
    --warmup_epochs 5 \
    --lr 0.0006
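
With --nproc_per_node=4 and --batch_size 256 per GPU, the effective batch size is 1024. After pretraining, a quick way to check what the saved checkpoint contains before passing it to --finetune is a sketch like the one below; the checkpoint file name follows the finetuning command, but the top-level key ('model') is an assumption and should be confirmed against the printed keys.

import torch

# Inspect the pretraining checkpoint before finetuning. The key that holds
# the encoder weights ('model') is an assumption; print the keys to confirm.
ckpt = torch.load("yourpath/checkpoint-100.pth", map_location="cpu")
print(list(ckpt.keys()))

state_dict = ckpt.get("model", ckpt)
n_params = sum(v.numel() for v in state_dict.values() if torch.is_tensor(v))
print(f"{n_params / 1e6:.1f}M parameters in checkpoint")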

Single-node finetuning on PatchCamelyon dataset

python -m torch.distributed.launch --nproc_per_node=4  SD-MAE/run_class_finetuning.py \
    --model vit_small_patch16_224 \
    --finetune yourpath/checkpoint-100.pth \
    --data_path  yourpath/pCam \
    --batch_size 256 \
    --opt adamw \
    --opt_betas 0.9 0.999 \
    --weight_decay 0.05 \
    --epochs 100 \
    --nb_classes 2 \
    --data_set 'pCam' \
    --lr 0.001
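
For a quick sanity check of the finetuned classifier, the sketch below loads the weights into timm's vit_small_patch16_224 and runs a dummy forward pass. The checkpoint path is a placeholder, the 'model' key is an assumption, and strict=False is used in case of head or positional-embedding key mismatches.

import torch
import timm

# Load the finetuned ViT-Small classifier (2 classes for PCam). The
# checkpoint path is a placeholder and the 'model' key is an assumption.
model = timm.create_model("vit_small_patch16_224", num_classes=2)
ckpt = torch.load("yourpath/finetuned_checkpoint.pth", map_location="cpu")
msg = model.load_state_dict(ckpt.get("model", ckpt), strict=False)
print(msg)

model.eval()
dummy = torch.randn(1, 3, 224, 224)  # one 224x224 RGB input
with torch.no_grad():
    probs = model(dummy).softmax(dim=-1)
print(probs)  # [P(class 0), P(class 1)]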
