Tags: Mask Generation · Transformers · Safetensors · falcon_perception · text-generation · falcon · segmentation · vision-language · open-vocabulary · custom_code · Eval Results
Instructions to use tiiuae/Falcon-Perception with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use tiiuae/Falcon-Perception with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("mask-generation", model="tiiuae/Falcon-Perception", trust_remote_code=True)
```

```python
# Load model directly
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained("tiiuae/Falcon-Perception", trust_remote_code=True, dtype="auto")
```

- Notebooks
- Google Colab
- Kaggle
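Once the pipeline returns its masks, a common next step is to rank them by area. The snippet below is a minimal sketch of that post-processing, assuming the pipeline yields a list of boolean mask arrays (the exact output schema of tiiuae/Falcon-Perception is not documented here, so the masks are simulated instead of produced by the model):

```python
import numpy as np

# Simulated stand-ins for masks from a mask-generation pipeline.
# Real output shape/schema for this model is an assumption; here each
# mask is a boolean array where True marks a segmented pixel.
masks = [
    np.zeros((4, 4), dtype=bool),
    np.ones((4, 4), dtype=bool),
]
masks[0][:2, :2] = True  # mark a 2x2 region in the first mask

# Area of each mask in pixels, and the index of the largest mask.
areas = [int(m.sum()) for m in masks]
largest = int(np.argmax(areas))
print(areas, largest)  # [4, 16] 1
```

Sorting or filtering on these areas is a cheap way to drop tiny spurious segments before downstream use.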
Anyone working on GGUF, ONNX support?
#2
by abe238 - opened
Is anyone working on GGUF or ONNX support for added surface coverage?
Yes, we have a PR open for the OCR model: https://github.com/ggml-org/llama.cpp/pull/21045
Once merged, we will build on top of it for perception.
@abe238 A working prototype with ONNX is here: https://huggingface.co/spaces/shreyask/falcon-perception, and the model is uploaded here: https://huggingface.co/onnx-community/falcon-perception-onnx-webgpu