Compact Biomedical Models

This collection contains the models from the paper "On the Effectiveness of Compact Biomedical Transformers".
How to use nlpie/bio-tinybert with Transformers:
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("fill-mask", model="nlpie/bio-tinybert")

# Load the tokenizer and model directly
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("nlpie/bio-tinybert")
model = AutoModelForMaskedLM.from_pretrained("nlpie/bio-tinybert")

BioTinyBERT is the result of training the TinyBERT model in a continual-learning fashion for 200k training steps on the PubMed dataset, using a total batch size of 192.
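For example, the pipeline can be queried with a masked biomedical sentence; the sentence below is only illustrative, and BERT-style tokenizers expect the [MASK] token:

from transformers import pipeline

pipe = pipeline("fill-mask", model="nlpie/bio-tinybert")

# Print the top candidate tokens for the masked position with their scores.
for prediction in pipe("The patient was treated with [MASK] for the infection."):
    print(prediction["token_str"], round(prediction["score"], 3))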
We initialise our model with the pre-trained checkpoint of the TinyBERT model available on Hugging Face.
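A minimal sketch of what this continual masked-language-modelling setup could look like with the Trainer API is shown below. The starting checkpoint huawei-noah/TinyBERT_General_4L_312D, the learning rate, and the pubmed_dataset variable are illustrative assumptions, not details taken from the paper; only the step count and batch size follow the description above.

from transformers import (
    AutoModelForMaskedLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

# Assumed general-domain TinyBERT starting checkpoint.
checkpoint = "huawei-noah/TinyBERT_General_4L_312D"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForMaskedLM.from_pretrained(checkpoint)

# Standard BERT-style masked-language-modelling collator (15% masking).
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm_probability=0.15)

args = TrainingArguments(
    output_dir="bio-tinybert",
    max_steps=200_000,                # 200k steps, as described above
    per_device_train_batch_size=192,  # total batch size of 192 (single device here)
    learning_rate=5e-5,               # illustrative value, not from the paper
)

trainer = Trainer(
    model=model,
    args=args,
    data_collator=collator,
    train_dataset=pubmed_dataset,  # a tokenised PubMed corpus, prepared elsewhere
)
trainer.train()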
This model uses 4 hidden layers with a hidden dimension and embedding size of 312, resulting in a total of 15M parameters.
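These figures can be checked directly against the released checkpoint; a small sanity check:

from transformers import AutoConfig, AutoModelForMaskedLM

config = AutoConfig.from_pretrained("nlpie/bio-tinybert")
print(config.num_hidden_layers, config.hidden_size)    # layer count and hidden size

model = AutoModelForMaskedLM.from_pretrained("nlpie/bio-tinybert")
print(f"{model.num_parameters():,} total parameters")  # roughly 15M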
If you use this model, please consider citing the following paper:
@article{rohanian2023effectiveness,
title={On the effectiveness of compact biomedical transformers},
author={Rohanian, Omid and Nouriborji, Mohammadmahdi and Kouchaki, Samaneh and Clifton, David A},
journal={Bioinformatics},
volume={39},
number={3},
pages={btad103},
year={2023},
publisher={Oxford University Press}
}
If this model helps your work, you can keep the project running with a one-off or monthly contribution:
https://github.com/sponsors/nlpie-research