# MoLFormer-c3-1.1B
MoLFormer-c3-1.1B, as described in the ChemBERTa-3 paper [1], is pretrained on a combination of 100% of ZINC20 (1B molecules) and 100% of PubChem (100M molecules).
## Usage
```python
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("DeepChem/MoLFormer-c3-1.1B")
model = AutoModelForMaskedLM.from_pretrained("DeepChem/MoLFormer-c3-1.1B")
```
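As a masked-language model, the checkpoint can be queried for likely completions of a partially masked SMILES string. The sketch below is illustrative rather than part of the model card: the helper name `top_predictions`, the choice of aspirin's SMILES, and the assumption that the tokenizer exposes a standard `mask_token` are all ours.

```python
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM


def top_predictions(tokenizer, model, masked_smiles, k=5):
    """Return the k most likely tokens for the first mask in masked_smiles."""
    inputs = tokenizer(masked_smiles, return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits
    # Locate the mask position and rank the vocabulary logits there.
    mask_pos = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
    top_ids = logits[0, mask_pos[0]].topk(k).indices
    return [tokenizer.decode(i) for i in top_ids]


def demo():
    # Note: this downloads ~GB-scale weights on first run.
    tokenizer = AutoTokenizer.from_pretrained("DeepChem/MoLFormer-c3-1.1B")
    model = AutoModelForMaskedLM.from_pretrained("DeepChem/MoLFormer-c3-1.1B")
    # Mask the first oxygen in aspirin's SMILES and inspect the model's guesses.
    smiles = "CC(=O)Oc1ccccc1C(=O)O".replace("O", tokenizer.mask_token, 1)
    print(top_predictions(tokenizer, model, smiles))
```

Calling `demo()` prints the model's top-5 candidate tokens for the masked atom; chemically plausible completions (such as `O`) ranking highly is a quick sanity check that the checkpoint loaded correctly.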
## Reference
- [1] Singh R, Barsainyan AA, Irfan R, Amorin CJ, He S, Davis T, et al. ChemBERTa-3: An Open Source Training Framework for Chemical Foundation Models. ChemRxiv. 2025. doi:10.26434/chemrxiv-2025-4glrl-v2. (Preprint; not peer-reviewed.)