# 🚀 Running SuPr

This section provides detailed instructions on running **SuPr** experiments across different scenarios: base-to-novel transfer, cross-dataset/domain generalization, and few-shot learning.

---

## 🖥️ GPU and Memory Requirements

- All experiments except ImageNet are trained with a **batch size of 4** on a **single NVIDIA 4090** GPU.
- **ImageNet** experiments require approximately **30 GB** of GPU memory, so we recommend a **single NVIDIA A800** for them.
- We provide two implementations for projection:
  - **SVD**-based projection
  - **Least squares**-based projection  
  > **Tip:** Although the two projections are mathematically equivalent, the least-squares method is more GPU memory-efficient; see the sketch below.
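
The snippet below is a minimal PyTorch sketch (illustrative only, not the repository's implementation; the tensor names, shapes, and tolerance are assumptions) of why the two are interchangeable: projecting a feature onto the span of a set of prompt vectors can be computed either from an SVD-derived orthonormal basis or by solving a least-squares problem and reconstructing.

```python
import torch

def project_svd(P: torch.Tensor, f: torch.Tensor) -> torch.Tensor:
    """Project feature f (d,) onto the column space of prompt matrix P (d, k) via SVD."""
    U, _, _ = torch.linalg.svd(P, full_matrices=False)  # U: (d, k), orthonormal basis of span(P)
    return U @ (U.T @ f)

def project_lstsq(P: torch.Tensor, f: torch.Tensor) -> torch.Tensor:
    """Same projection via least squares: c* = argmin_c ||P c - f||, projection = P c*."""
    c = torch.linalg.lstsq(P, f.unsqueeze(-1)).solution  # coefficients, shape (k, 1)
    return (P @ c).squeeze(-1)

# The two routes agree numerically on a random full-rank example.
d, k = 512, 8
P, f = torch.randn(d, k), torch.randn(d)
assert torch.allclose(project_svd(P, f), project_lstsq(P, f), atol=1e-4)
```

The least-squares route never materializes the singular vectors, which is roughly where the memory saving on GPU comes from.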

---

## (1) ๐Ÿ† Base-to-Novel Experiments

### Step-by-Step Instructions

1. **Configuration**  
   Modify the configuration file located at:  
   ```
   configs/trainers/SuPr/vit_b16_ep10_batch4_4+4ctx.yaml
   ```

2. **Update Dataset Path**  
   Change the dataset path in:
   - `scripts/supr/base2new.sh` (for SuPr)
   - `scripts/supr_ens/base2new.sh` (for SuPrEns)  
   
   (Modify **line 4** to point to your local dataset directory.)
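
   For reference, that line is simply a shell variable holding your dataset root; after editing it should look roughly like the following (the variable name and path are illustrative, so check your copy of the script):

   ```bash
   # line 4 of scripts/supr/base2new.sh (example value, adjust to your setup)
   DATA=/path/to/your/datasets
   ```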

3. **Training Commands**  
   Run the following commands to train SuPr or SuPr Ens (repeat each for seeds 1, 2, and 3):

   ```bash
   # Set dataset (e.g., imagenet)
   # Other available datasets: [caltech101, food101, dtd, ucf101, oxford_flowers, oxford_pets, fgvc_aircraft, stanford_cars, sun397, eurosat]

   # Train SuPr
   sh scripts/supr/base2new.sh imagenet

   # Train SuPr Ens
   sh scripts/supr_ens/base2new.sh imagenet
   ```
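
   To cover every benchmark in one go, you can wrap the same call in a loop (a convenience snippet, not a script shipped with the repo):

   ```bash
   # Run SuPr base-to-novel training on all benchmarks in turn
   for DATASET in imagenet caltech101 food101 dtd ucf101 oxford_flowers oxford_pets fgvc_aircraft stanford_cars sun397 eurosat; do
       sh scripts/supr/base2new.sh ${DATASET}
   done
   ```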

4. **Output Directory**  
   Results will be saved automatically at:
   ```
   Base results: output/base2new/${TRAINER}/${CFG}/train_base/${DATASET}/shots_${SHOTS}/seed${SEED}
   Novel results: output/base2new/${TRAINER}/${CFG}/test_new/${DATASET}/shots_${SHOTS}/seed${SEED}
   ```
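
   For example, with the default trainer and config, caltech101 at 16 shots, and seed 1, the base-class results land in:

   ```
   output/base2new/SuPr/vit_b16_ep10_batch4_4+4ctx/train_base/caltech101/shots_16/seed1
   ```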

5. **Result Aggregation**  
   After finishing training for all seeds, run:

   ```bash
   # Aggregate base-to-novel results
   python parse_test_res.py -type base2new output/base2new/SuPr/vit_b16_ep10_batch4_4+4ctx/test_new/caltech101/shots_16
   ```
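
   To aggregate every benchmark at once, a simple loop works as well (convenience snippet, not part of the repo):

   ```bash
   # Aggregate novel-class results for all benchmarks
   for DATASET in imagenet caltech101 food101 dtd ucf101 oxford_flowers oxford_pets fgvc_aircraft stanford_cars sun397 eurosat; do
       python parse_test_res.py -type base2new output/base2new/SuPr/vit_b16_ep10_batch4_4+4ctx/test_new/${DATASET}/shots_16
   done
   ```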

---

### 🔥 SuPr + PromptSRC

To run SuPr combined with PromptSRC:

1. **Configuration**  
   Use the configuration file:  
   ```
   configs/trainers/SuPr/vit_b16_ep20_batch4_4+4ctx_promptsrc.yaml
   ```

2. **Training Command**  
   ```bash
   # Train SuPr+PromptSRC
   sh scripts/supr_src/base2new.sh imagenet
   ```

---

## (2) 🌍 Cross-Dataset / Domain Generalization Experiments

### Step-by-Step Instructions

1. **Configuration**  
   Edit the configuration file at:  
   ```
   configs/trainers/SuPr/vit_b16_ep12_batch8_4+4ctx_cross_datasets.yaml
   ```

2. **Update Dataset Path**  
   Change the dataset path in:  
   ```
   scripts/supr/cross_dg.sh (line 4)
   ```

3. **Training Command**  
   Run the following script:

   ```bash
   # This script will:
   # 1. Train SuPr on ImageNet (3 seeds)
   # 2. Evaluate on 10 cross-datasets
   # 3. Perform DG evaluation on ImageNetV2, ImageNet-Sketch, ImageNet-A, and ImageNet-R

   sh scripts/supr/cross_dg.sh
   ```

4. **Output Directory**  
   Results will be saved at:
   ```
   output/cross_dg/${TRAINER}/${CFG}/${DATASET}/shots_${SHOTS}/seed${SEED}
   ```

5. **Result Aggregation**  

   ```bash
   # Aggregate cross-dataset results
   python parse_test_res.py -type cross output/cross_dg/SuPr/vit_b16_ep12_batch8_4+4ctx_cross_datasets/caltech101/shots_16

   # Aggregate domain generalization results
   python parse_test_res.py -type dg output/cross_dg/SuPr/vit_b16_ep12_batch8_4+4ctx_cross_datasets/imagenet/shots_16
   ```

---

## (3) 🎯 Few-Shot Learning Experiments

### Step-by-Step Instructions

1. **Configuration**  
   Edit the configuration file at:  
   ```
   configs/trainers/SuPr/vit_b16_ep25_batch8_4+4ctx_few_shot.yaml
   ```

2. **Update Dataset Path**  
   Change the dataset path in:  
   ```
   scripts/supr/few_shot.sh (line 4)
   ```

3. **Training Command**  
   ```bash
   # dataset=imagenet
   # Other available datasets: [caltech101, food101, dtd, ucf101, oxford_flowers, oxford_pets, fgvc_aircraft, stanford_cars, sun397, eurosat]

   sh scripts/supr/few_shot.sh imagenet
   ```

4. **Output Directory**  
   Results will be saved at:
   ```
   output/fewshot/${TRAINER}/${CFG}/${DATASET}/shots_${SHOTS}/seed${SEED}
   ```

5. **Result Aggregation**  

   ```bash
   # Aggregate few-shot results
   python parse_test_res.py -type fewshot output/fewshot/SuPr/vit_b16_ep25_batch8_4+4ctx_few_shot/imagenet/shots_4
   ```
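
   If you trained with several shot settings, you can aggregate each one in a loop (a convenience snippet; adjust the list to the shot counts you actually ran):

   ```bash
   # Aggregate few-shot results across shot settings
   for SHOTS in 1 2 4 8 16; do
       python parse_test_res.py -type fewshot output/fewshot/SuPr/vit_b16_ep25_batch8_4+4ctx_few_shot/imagenet/shots_${SHOTS}
   done
   ```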

---

> **Tip:** Always run experiments across **three random seeds** to ensure reproducibility and statistically stable results.
>  
> **Warning:** Be sure to update the dataset paths before launching the scripts; skipping this step can lead to training failures or empty outputs.

---