Missing chat_template
#27 · opened by jamesbut
I am confused as to why there is no chat_template in the tokenizer_config.json given that part of the advertised intended usage is:
"Chatbots and Conversational AI: Power conversational interfaces for customer service, virtual assistants, or interactive applications."
I am trying to correctly tokenize prompts in chat format, but I cannot use the tokenizer.apply_chat_template() function without a chat_template set.
Am I missing something?
This is a base model. I think you are looking for the instruction-tuned model: google/gemma-3-270m-it
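For reference, something along these lines should work once you point at the -it checkpoint (a minimal sketch; assumes `transformers` is installed and that you have accepted the Gemma license on the Hub):

```python
from transformers import AutoTokenizer

# The instruction-tuned checkpoint ships a chat_template in its tokenizer config,
# so apply_chat_template() works out of the box.
tokenizer = AutoTokenizer.from_pretrained("google/gemma-3-270m-it")

messages = [
    {"role": "user", "content": "What is the capital of France?"},
]

# Renders the messages with the model's chat template and appends the
# assistant-turn prefix so the model knows to start generating a reply.
prompt = tokenizer.apply_chat_template(
    messages,
    tokenize=False,
    add_generation_prompt=True,
)
print(prompt)
```

The base google/gemma-3-270m tokenizer intentionally has no chat_template, since the base model was not trained on a chat format.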