ZeroGPU-LLM-Inference / quantize_to_awq_colab.ipynb
Commit ecf6a69: Fix QuantizationConfig: use config_groups with BaseQuantizationConfig
Open in Colab
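The notebook body itself is not rendered on this page. As a rough illustration of the change described in the commit message, the sketch below builds a weight-only quantization config through `config_groups` using the `compressed-tensors` classes (`QuantizationConfig`, `QuantizationScheme`, `QuantizationArgs`). This is an assumption about what the notebook does, not its actual cells: the `BaseQuantizationConfig` name from the commit is treated as a project-local wrapper, and the 4-bit / group-size-128 / `lm_head`-ignore settings and the `"group_0"` key are illustrative choices.

```python
# Minimal sketch (assumed, not the notebook's actual code): express an
# AWQ-style 4-bit weight-only scheme as an entry in config_groups instead
# of passing flat quantization arguments.
from compressed_tensors.quantization import (
    QuantizationArgs,
    QuantizationConfig,
    QuantizationScheme,
)

# One scheme group: 4-bit, group-wise (group_size=128), asymmetric weights,
# applied to every Linear layer.
w4a16_scheme = QuantizationScheme(
    targets=["Linear"],
    weights=QuantizationArgs(
        num_bits=4,
        type="int",
        symmetric=False,   # zero-point (asymmetric) quantization, as in AWQ
        strategy="group",
        group_size=128,
    ),
)

quant_config = QuantizationConfig(
    # The fix named in the commit: schemes live under config_groups.
    config_groups={"group_0": w4a16_scheme},
    ignore=["lm_head"],  # keep the output head in full precision
)
print(quant_config)
```

The intent of grouping schemes under `config_groups` is that different module sets (matched via `targets`) can carry different quantization arguments within a single config object, rather than one flat set of parameters applying to the whole model.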