## Usage with Transformers
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

# CLIP is an image-text model, so the matching pipeline task is
# zero-shot *image* classification.
pipe = pipeline("zero-shot-image-classification", model="deepghs/clip_onnx")
```

```python
# Load the model directly
from transformers import AutoModel

model = AutoModel.from_pretrained("deepghs/clip_onnx", dtype="auto")
```
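If the repository loads through the pipeline API, the `pipe` object from the first snippet can be called directly on an image with free-form candidate labels. A minimal sketch; the image path and labels below are placeholders, not files shipped with this repository:

```python
# Placeholder image and labels -- substitute your own.
result = pipe("photo.jpg", candidate_labels=["a cat", "a dog", "a bird"])
print(result)  # [{"label": "a cat", "score": 0.91}, ...], sorted by score
```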
This repository contains ONNX-exported versions of OpenAI's CLIP models.
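Since the files here are ONNX graphs rather than PyTorch weights, they can also be run directly with `onnxruntime`. A minimal sketch, assuming each exported model lives in a per-model directory containing a file named `image_encode.onnx`; both the file path and the input layout are assumptions, so check the repository's file listing and the graph's reported input shapes first:

```python
import numpy as np
import onnxruntime
from huggingface_hub import hf_hub_download

# Download one exported graph. The per-model file path is an assumption --
# verify the actual layout in the repository's file listing.
model_path = hf_hub_download(
    repo_id="deepghs/clip_onnx",
    filename="openai/clip-vit-base-patch16/image_encode.onnx",
)
session = onnxruntime.InferenceSession(model_path, providers=["CPUExecutionProvider"])

# Inspect the graph to learn the expected input name and shape.
print([(i.name, i.shape) for i in session.get_inputs()])

# Dummy input: one 224x224 RGB image (per-model input sizes are in the
# table below), normalized the way CLIP preprocessing normalizes pixels.
pixels = np.random.rand(1, 3, 224, 224).astype(np.float32)
embedding = session.run(None, {session.get_inputs()[0].name: pixels})[0]
print(embedding.shape)  # expected (1, 512) for the ViT-B/16 export
```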
## Models

Four models are exported in total:
| Name | Image Encoder (Params / FLOPs) | Image Size | Image Width (Encoder / Embedding) | Text Encoder (Params / FLOPs) | Text Width (Encoder / Embedding) | Created At |
|---|---|---|---|---|---|---|
| openai/clip-vit-large-patch14-336 | 302.9M / 174.7G | 336 | 1024 / 768 | 85.1M / 1.2G | 768 / 768 | 2022-04-22 |
| openai/clip-vit-large-patch14 | 302.9M / 77.8G | 224 | 1024 / 768 | 85.1M / 1.2G | 768 / 768 | 2022-03-03 |
| openai/clip-vit-base-patch16 | 85.6M / 16.9G | 224 | 768 / 512 | 37.8M / 529.2M | 512 / 512 | 2022-03-03 |
| openai/clip-vit-base-patch32 | 87.4M / 4.4G | 224 | 768 / 512 | 37.8M / 529.2M | 512 / 512 | 2022-03-03 |
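The exports mirror the upstream `openai/clip-*` checkpoints listed above. For reference, the equivalent image-text similarity computation with the original PyTorch weights and the standard `transformers` CLIP classes looks like this; `photo.jpg` is a placeholder:

```python
from PIL import Image
import torch
from transformers import CLIPModel, CLIPProcessor

# Upstream checkpoint from the table above; "photo.jpg" is a placeholder.
model = CLIPModel.from_pretrained("openai/clip-vit-base-patch16")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch16")

image = Image.open("photo.jpg")
texts = ["a photo of a cat", "a photo of a dog"]
inputs = processor(text=texts, images=image, return_tensors="pt", padding=True)

with torch.no_grad():
    outputs = model(**inputs)

# logits_per_image holds image-text similarities scaled by the learned
# temperature; softmax turns them into per-label probabilities.
probs = outputs.logits_per_image.softmax(dim=-1)
print(dict(zip(texts, probs[0].tolist())))
```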
## Model tree for deepghs/clip_onnx

Base model: openai/clip-vit-base-patch16