Instructions for using blaze999/Medical-NER with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.
- Libraries
- Transformers
How to use blaze999/Medical-NER with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("token-classification", model="blaze999/Medical-NER")

# Load model directly
from transformers import AutoTokenizer, AutoModelForTokenClassification

tokenizer = AutoTokenizer.from_pretrained("blaze999/Medical-NER")
model = AutoModelForTokenClassification.from_pretrained("blaze999/Medical-NER")
```
- Inference
- Notebooks
- Google Colab
- Kaggle
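The token-classification pipeline above returns one dict per predicted token, tagged in BIO style (`B-` opens an entity, `I-` continues it). A minimal sketch of grouping those predictions into entity spans, using a hypothetical sample output (the actual label set, e.g. `SIGN_SYMPTOM`, comes from the model's config):

```python
# Sketch: merge consecutive BIO-tagged tokens, as returned by a
# token-classification pipeline, into (label, text) entity spans.
def group_entities(tokens):
    """Merge consecutive B-/I- tagged tokens into (label, text) spans."""
    spans = []
    for tok in tokens:
        tag = tok["entity"]
        label = tag.split("-", 1)[-1]  # strip the B-/I- prefix
        # Start a new span on B- tags or when the label changes.
        if tag.startswith("B-") or not spans or spans[-1][0] != label:
            spans.append([label, tok["word"]])
        else:
            spans[-1][1] += " " + tok["word"]
    return [tuple(s) for s in spans]

# Hypothetical pipeline output; real dicts also carry scores and offsets.
sample = [
    {"entity": "B-SIGN_SYMPTOM", "word": "chest"},
    {"entity": "I-SIGN_SYMPTOM", "word": "pain"},
    {"entity": "B-MEDICATION", "word": "aspirin"},
]
print(group_entities(sample))
# → [('SIGN_SYMPTOM', 'chest pain'), ('MEDICATION', 'aspirin')]
```

Alternatively, passing `aggregation_strategy="simple"` to `pipeline(...)` makes Transformers perform this grouping for you.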