# build_your_circuit

This model is a fine-tuned version of [google-t5/t5-small](https://huggingface.co/google-t5/t5-small) on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.3753
- BLEU: 0.7523
## Model description
More information needed
## Intended uses & limitations
More information needed
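Since the training dataset and task are not documented, the prompt format below is an assumption; this is only a sketch of how the checkpoint could be loaded for text-to-text generation with the Transformers `Auto*` classes:

```python
# Hypothetical usage sketch. The input text and generation settings are
# assumptions -- the card does not document the expected prompt format.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_id = "Humphery7/build_your_circuit"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

inputs = tokenizer("your input text here", return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```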
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 20
- mixed_precision_training: Native AMP
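The original training script is not published; the following `Seq2SeqTrainingArguments` fragment is a sketch that mirrors the hyperparameters listed above (the `output_dir` is an assumption):

```python
# Config sketch only: reproduces the documented hyperparameters.
from transformers import Seq2SeqTrainingArguments

args = Seq2SeqTrainingArguments(
    output_dir="build_your_circuit",  # assumed, not stated in the card
    learning_rate=1e-4,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    seed=42,
    optim="adamw_torch",
    lr_scheduler_type="linear",
    num_train_epochs=20,
    fp16=True,                        # "Native AMP" mixed precision
    predict_with_generate=True,       # required to compute BLEU at eval time
)
```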
### Training results
| Training Loss | Epoch | Step | Validation Loss | BLEU |
|---|---|---|---|---|
| 2.5261 | 0.2633 | 500 | 1.1977 | 0.4359 |
| 1.1455 | 0.5266 | 1000 | 0.7789 | 0.4657 |
| 0.8398 | 0.7899 | 1500 | 0.6254 | 0.4128 |
| 0.7001 | 1.0532 | 2000 | 0.5315 | 0.7362 |
| 0.5917 | 1.3165 | 2500 | 0.4880 | 0.7590 |
| 0.5684 | 1.5798 | 3000 | 0.4533 | 0.7101 |
| 0.5297 | 1.8431 | 3500 | 0.4382 | 0.7719 |
| 0.4913 | 2.1064 | 4000 | 0.4129 | 0.7696 |
| 0.4706 | 2.3697 | 4500 | 0.4068 | 0.7719 |
| 0.4529 | 2.6330 | 5000 | 0.3933 | 0.7580 |
| 0.4318 | 2.8963 | 5500 | 0.3859 | 0.7709 |
| 0.4242 | 3.1596 | 6000 | 0.3753 | 0.7523 |
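The BLEU scores above come from the evaluation pipeline, whose exact implementation is not specified in this card. For intuition, here is a minimal stdlib sketch of sentence-level BLEU (uniform 4-gram weights with a brevity penalty) — production evaluation should use an established library such as sacrebleu instead:

```python
# Toy sentence-level BLEU, for illustration only.
import math
from collections import Counter

def ngrams(tokens, n):
    """All contiguous n-grams of a token list."""
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

def bleu(candidate, reference, max_n=4):
    """BLEU with uniform n-gram weights and a brevity penalty."""
    cand, ref = candidate.split(), reference.split()
    precisions = []
    for n in range(1, max_n + 1):
        cand_counts = Counter(ngrams(cand, n))
        ref_counts = Counter(ngrams(ref, n))
        # Clipped matches: each candidate n-gram counts at most as
        # often as it appears in the reference.
        overlap = sum(min(c, ref_counts[g]) for g, c in cand_counts.items())
        total = max(1, sum(cand_counts.values()))
        precisions.append(overlap / total)
    if min(precisions) == 0:
        return 0.0  # any zero precision collapses the geometric mean
    geo_mean = math.exp(sum(math.log(p) for p in precisions) / max_n)
    # Brevity penalty: penalize candidates shorter than the reference.
    bp = 1.0 if len(cand) >= len(ref) else math.exp(1 - len(ref) / len(cand))
    return bp * geo_mean
```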
### Framework versions
- Transformers 4.52.4
- PyTorch 2.6.0+cu124
- Datasets 3.6.0
- Tokenizers 0.21.2
### Base model

`Humphery7/build_your_circuit` was fine-tuned from [google-t5/t5-small](https://huggingface.co/google-t5/t5-small).