Instructions for using universalner/uner_dan_ddt with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.
- Libraries
- Transformers
How to use universalner/uner_dan_ddt with Transformers:

```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("token-classification", model="universalner/uner_dan_ddt")

# Or load the model directly
from transformers import AutoTokenizer, AutoModelForTokenClassification

tokenizer = AutoTokenizer.from_pretrained("universalner/uner_dan_ddt")
model = AutoModelForTokenClassification.from_pretrained("universalner/uner_dan_ddt")
```

- Notebooks
- Google Colab
- Kaggle
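A token-classification pipeline without an `aggregation_strategy` returns one dict per tagged token (with keys such as `entity`, `word`, `score`, `start`, and `end`), so multi-token names arrive split across several predictions. As a minimal sketch, assuming the model uses the standard B-/I- (BIO) label scheme, such predictions can be merged into whole entity spans like this. The `sample` dicts below are illustrative stand-ins, not actual output from uner_dan_ddt, and the space-joining of words is a simplification (real subword tokens may carry `##` prefixes):

```python
# Merge BIO-tagged token predictions (the per-token dicts a
# transformers "token-classification" pipeline returns when no
# aggregation_strategy is set) into whole entity spans.
def merge_bio(tokens):
    entities = []
    current = None
    for tok in tokens:
        label = tok["entity"]
        if label.startswith("B-") or current is None or label[2:] != current["type"]:
            # A B- tag, or a type change, starts a new entity.
            if current:
                entities.append(current)
            current = {
                "type": label[2:],
                "text": tok["word"],
                "start": tok["start"],
                "end": tok["end"],
            }
        else:
            # I- tag continuing the current entity: extend its span.
            current["text"] += " " + tok["word"]
            current["end"] = tok["end"]
    if current:
        entities.append(current)
    return entities

# Illustrative (hypothetical) predictions for the Danish sentence
# "Hans Christian Andersen boede i København."
sample = [
    {"entity": "B-PER", "word": "Hans", "start": 0, "end": 4},
    {"entity": "I-PER", "word": "Christian", "start": 5, "end": 14},
    {"entity": "I-PER", "word": "Andersen", "start": 15, "end": 23},
    {"entity": "B-LOC", "word": "København", "start": 32, "end": 41},
]
print(merge_bio(sample))
# → two entities: PER "Hans Christian Andersen" and LOC "København"
```

In practice you can skip hand-rolled merging: `pipeline("token-classification", model=..., aggregation_strategy="simple")` performs this grouping inside Transformers itself.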