## 🚀 Running SuPr
This section provides detailed instructions on running SuPr experiments across different scenarios: base-to-novel transfer, cross-dataset/domain generalization, and few-shot learning.
### 🖥️ GPU and Memory Requirements
- All experiments are trained with a batch size of 4 on a single NVIDIA 4090 GPU, with the exception of ImageNet.
- ImageNet experiments require approximately 30 GB of GPU memory. For ImageNet, we recommend using a single NVIDIA A800.
- We provide two implementations of the projection:
  - SVD-based projection
  - Least squares-based projection

**Tip:** The two implementations are mathematically equivalent, but the least-squares method is more GPU memory-efficient.
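To make the difference concrete, here is a minimal, self-contained PyTorch sketch of the two projection routes. It is purely illustrative and not taken from the SuPr code: the basis matrix `B`, the vector `x`, and the dimensions are hypothetical, and both functions compute the same orthogonal projection onto the subspace spanned by the columns of `B`.

```python
import torch

def project_svd(B: torch.Tensor, x: torch.Tensor) -> torch.Tensor:
    """Project x onto span(B) via an SVD-derived orthonormal basis."""
    # U holds an orthonormal basis of the column space of B.
    U, _, _ = torch.linalg.svd(B, full_matrices=False)
    return U @ (U.transpose(-2, -1) @ x)

def project_lstsq(B: torch.Tensor, x: torch.Tensor) -> torch.Tensor:
    """Project x onto span(B) by solving a least-squares problem."""
    # Solve min_w ||B w - x||; B @ w is then the projection of x onto span(B).
    w = torch.linalg.lstsq(B, x).solution
    return B @ w

# Hypothetical shapes: feature dimension d, subspace rank k.
d, k = 512, 8
B = torch.randn(d, k)   # subspace basis (e.g., a set of prompt-derived features)
x = torch.randn(d, 1)   # vector to be projected
assert torch.allclose(project_svd(B, x), project_lstsq(B, x), atol=1e-4)
```

Both routes agree up to numerical precision; the least-squares route skips the explicit orthonormalization step, which is one reason it tends to be lighter on GPU memory.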
### (1) 📊 Base-to-Novel Experiments
#### Step-by-Step Instructions

**Configuration**

Modify the configuration file located at:
`configs/trainers/SuPr/vit_b16_ep10_batch4_4+4ctx.yaml`

**Update Dataset Path**

Change the dataset path (line 4) to point to your local dataset directory in:
- `scripts/supr/base2new.sh` (for SuPr)
- `scripts/supr_ens/base2new.sh` (for SuPrEns)

**Training Commands**

Run the following command to train SuPr, repeating for seeds 1, 2, and 3. (A convenience loop for launching all datasets in sequence is sketched at the end of this subsection.)

```bash
# Set dataset (e.g., imagenet)
# Available datasets: [caltech101, food101, dtd, ucf101, oxford_flowers, oxford_pets, fgvc_aircraft, stanford_cars, sun397, eurosat]

# Train SuPr
sh scripts/supr/base2new.sh imagenet

# Train SuPr Ens
sh scripts/supr_ens/base2new.sh imagenet
```

**Output Directory**

Results will be saved automatically at:
- Base results: `output/base2new/${TRAINER}/${CFG}/train_base/${DATASET}/shots_${SHOTS}/seed${SEED}`
- Novel results: `output/base2new/${TRAINER}/${CFG}/test_new/${DATASET}/shots_${SHOTS}/seed${SEED}`

**Result Aggregation**

After finishing training for all seeds, run:

```bash
# Aggregate base-to-novel results
python parse_test_res.py -type base2new output/base2new/SuPr/vit_b16_ep10_batch4_4+4ctx/test_new/caltech101/shots_16
```
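If you want to launch base-to-novel training for every dataset in sequence rather than one `sh scripts/supr/base2new.sh <dataset>` call at a time, a small driver script can do it. The sketch below is only a convenience wrapper and assumes the dataset identifiers listed above are exactly the ones the script expects:

```python
import subprocess

# Hypothetical convenience driver: launch base-to-novel training for each dataset
# by shelling out to the documented command `sh scripts/supr/base2new.sh <dataset>`.
DATASETS = [
    "imagenet", "caltech101", "food101", "dtd", "ucf101", "oxford_flowers",
    "oxford_pets", "fgvc_aircraft", "stanford_cars", "sun397", "eurosat",
]

for dataset in DATASETS:
    print(f"=== base-to-novel: {dataset} ===")
    subprocess.run(["sh", "scripts/supr/base2new.sh", dataset], check=True)
```

Swap in `scripts/supr_ens/base2new.sh` to train SuPr Ens instead.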
### 🔥 SuPr + PromptSRC
To run SuPr combined with PromptSRC:
**Configuration**

Use the configuration file:
`configs/trainers/SuPr/vit_b16_ep20_batch4_4+4ctx_promptsrc.yaml`

**Training Command**

```bash
# Train SuPr+PromptSRC
sh scripts/supr_src/base2new.sh imagenet
```
### (2) 🌍 Cross-Dataset / Domain Generalization Experiments
#### Step-by-Step Instructions

**Configuration**

Edit the configuration file at:
`configs/trainers/SuPr/vit_b16_ep12_batch8_4+4ctx_cross_datasets.yaml`

**Update Dataset Path**

Change the dataset path in `scripts/supr/cross_dg.sh` (line 4).

**Training Command**

Run the following script:

```bash
# This script will:
# 1. Train SuPr on ImageNet (3 seeds)
# 2. Evaluate on 10 cross-datasets
# 3. Perform DG evaluation on ImageNetV2, ImageNet-Sketch, ImageNet-A, and ImageNet-R
sh scripts/supr/cross_dg.sh
```

**Output Directory**

Results will be saved at:
`output/cross_dg/${TRAINER}/${CFG}/${DATASET}/shots_${SHOTS}/seed${SEED}`

**Result Aggregation**

```bash
# Aggregate cross-dataset results
python parse_test_res.py -type cross output/cross_dg/SuPr/vit_b16_ep12_batch8_4+4ctx_cross_datasets/caltech101/shots_16

# Aggregate domain generalization results
python parse_test_res.py -type dg output/cross_dg/SuPr/vit_b16_ep12_batch8_4+4ctx_cross_datasets/imagenet/shots_16
```
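To aggregate the cross-dataset results for all ten target datasets in one pass, you can loop the same `parse_test_res.py -type cross` call over the per-dataset output folders. A minimal sketch, assuming the folders are named after the datasets listed earlier and that 16 shots were used (adjust `shots_16` otherwise):

```python
import subprocess

# Hypothetical aggregation loop over the ten cross-dataset targets.
# Assumes output folders of the form output/cross_dg/<TRAINER>/<CFG>/<dataset>/shots_16.
CFG_DIR = "output/cross_dg/SuPr/vit_b16_ep12_batch8_4+4ctx_cross_datasets"
CROSS_DATASETS = [
    "caltech101", "food101", "dtd", "ucf101", "oxford_flowers",
    "oxford_pets", "fgvc_aircraft", "stanford_cars", "sun397", "eurosat",
]

for dataset in CROSS_DATASETS:
    subprocess.run(
        ["python", "parse_test_res.py", "-type", "cross",
         f"{CFG_DIR}/{dataset}/shots_16"],
        check=True,
    )
```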
### (3) 🎯 Few-Shot Learning Experiments
#### Step-by-Step Instructions

**Configuration**

Edit the configuration file at:
`configs/trainers/SuPr/vit_b16_ep25_batch8_4+4ctx_few_shot.yaml`

**Update Dataset Path**

Change the dataset path in `scripts/supr/few_shot.sh` (line 4).

**Training Command**

```bash
# dataset=imagenet
# Other available datasets: [caltech101, food101, dtd, ucf101, oxford_flowers, oxford_pets, fgvc_aircraft, stanford_cars, sun397, eurosat]
sh scripts/supr/few_shot.sh imagenet
```

**Output Directory**

Results will be saved at:
`output/fewshot/${TRAINER}/${CFG}/${DATASET}/shots_${SHOTS}/seed${SEED}`

**Result Aggregation**

```bash
# Aggregate few-shot results
python parse_test_res.py -type fewshot output/fewshot/SuPr/vit_b16_ep25_batch8_4+4ctx_few_shot/imagenet/shots_4
```
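If you trained several shot settings, the aggregation command can simply be repeated per `shots_<K>` folder. The loop below is a sketch that assumes the usual 1/2/4/8/16-shot settings on ImageNet; keep only the values you actually trained:

```python
import subprocess

# Hypothetical aggregation loop over shot settings for one dataset.
# Assumes output folders of the form output/fewshot/<TRAINER>/<CFG>/<dataset>/shots_<K>.
CFG_DIR = "output/fewshot/SuPr/vit_b16_ep25_batch8_4+4ctx_few_shot"
DATASET = "imagenet"

for shots in (1, 2, 4, 8, 16):
    subprocess.run(
        ["python", "parse_test_res.py", "-type", "fewshot",
         f"{CFG_DIR}/{DATASET}/shots_{shots}"],
        check=True,
    )
```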
**Tip:** Run every experiment with all three random seeds (1, 2, and 3) to ensure reproducible and statistically stable results.

**Warning:** Make sure the dataset paths are updated before launching the scripts; skipping this step can lead to training failures or empty outputs.