🧠 Space LLM - Transformer-based Language Model

Space is a custom transformer-based language model developed and fine-tuned by Aditya B Dhruva. It uses a custom architecture similar to GPT and is compatible with the Hugging Face transformers library for text-generation tasks. The model is open source, lightweight enough to run on a phone, and includes a built-in retrieval-augmented generation (RAG) system.
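This card does not document Space's RAG interface, so the sketch below shows only the generic retrieve-then-generate pattern that any RAG system implements: score stored passages against the query, then prepend the best match to the prompt before generation. The `retrieve` helper and the example passages are illustrative assumptions, not part of the released model.

```python
# Hypothetical sketch of retrieve-then-generate (not Space's actual RAG API).

def retrieve(query, passages):
    """Return the stored passage sharing the most words with the query."""
    query_words = set(query.lower().split())
    return max(passages, key=lambda p: len(query_words & set(p.lower().split())))

passages = [
    "Space is a transformer-based language model.",
    "RAG prepends retrieved context to the prompt before generation.",
]

question = "How does RAG build the prompt?"
context = retrieve(question, passages)

# The augmented prompt is then passed to the text-generation pipeline.
prompt = f"Context: {context}\nQuestion: {question}\nAnswer:"
```

A real system would replace the word-overlap scorer with embedding similarity, but the prompt-assembly step is the same.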

🚀 Usage (with Transformers pipeline)

To use this model via the Hugging Face transformers library (the pipeline may occasionally fail to load; if it does, download the accompanying Jupyter notebook and run the model from there):

from transformers import AutoTokenizer, AutoModelForCausalLM, pipeline

# Load model and tokenizer
model = AutoModelForCausalLM.from_pretrained("AdityaBDhruva/Space")
tokenizer = AutoTokenizer.from_pretrained("AdityaBDhruva/Space")

# Create a text-generation pipeline
generator = pipeline("text-generation", model=model, tokenizer=tokenizer)

# Generate text
prompt = "What is the future of AI?"
output = generator(prompt, max_new_tokens=150)
print(output[0]['generated_text'])