Dataset used to train this model: leondz/wnut_17
How to use `BaselMousi/distilbert_wnut_model` with Transformers:

```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("token-classification", model="BaselMousi/distilbert_wnut_model")
```
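The pipeline can then be called on raw text. A minimal example (the sentence is illustrative, and `aggregation_strategy="simple"` is an optional addition that merges sub-word tokens into whole entity spans):

```python
from transformers import pipeline

pipe = pipeline(
    "token-classification",
    model="BaselMousi/distilbert_wnut_model",
    aggregation_strategy="simple",  # merge sub-word pieces into whole entity spans
)

# WNUT 17 targets emerging/rare entities; this sentence is only illustrative
for entity in pipe("My name is Wolfgang and I live in Berlin"):
    print(entity["entity_group"], entity["word"], round(float(entity["score"]), 3))
```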
Alternatively, load the model and tokenizer directly:

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForTokenClassification

tokenizer = AutoTokenizer.from_pretrained("BaselMousi/distilbert_wnut_model")
model = AutoModelForTokenClassification.from_pretrained("BaselMousi/distilbert_wnut_model")
```

This model is a fine-tuned version of distilbert/distilbert-base-uncased on the wnut_17 dataset. It achieves the following results on the evaluation set (final epoch; see the training results table below):

- Loss: 0.3052
- Precision: 0.5218
- Recall: 0.3874
- F1: 0.4447
- Accuracy: 0.9463
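With `tokenizer` and `model` loaded as above, predictions can also be computed manually. A minimal sketch (the input sentence is illustrative):

```python
import torch

inputs = tokenizer("My name is Wolfgang and I live in Berlin", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Pick the highest-scoring class per token and map it to its label name
predicted_ids = logits.argmax(dim=-1)[0]
tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
for token, pred_id in zip(tokens, predicted_ids):
    print(token, model.config.id2label[pred_id.item()])
```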
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training results

The following per-epoch results were recorded on the validation set during training:
| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
|---|---|---|---|---|---|---|---|
| No log | 1.0 | 213 | 0.2801 | 0.5586 | 0.2428 | 0.3385 | 0.9384 |
| No log | 2.0 | 426 | 0.2573 | 0.5228 | 0.2975 | 0.3792 | 0.9425 |
| 0.1769 | 3.0 | 639 | 0.2859 | 0.5510 | 0.3253 | 0.4091 | 0.9450 |
| 0.1769 | 4.0 | 852 | 0.2965 | 0.5499 | 0.3522 | 0.4294 | 0.9462 |
| 0.0496 | 5.0 | 1065 | 0.2951 | 0.5123 | 0.3846 | 0.4394 | 0.9458 |
| 0.0496 | 6.0 | 1278 | 0.3052 | 0.5218 | 0.3874 | 0.4447 | 0.9463 |
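The precision, recall, and F1 columns are entity-level metrics of the kind computed by seqeval. A minimal sketch of how such scores are obtained (assuming the `evaluate` library; the tag sequences are illustrative, not taken from this run):

```python
import evaluate

seqeval = evaluate.load("seqeval")

# Illustrative IOB2 tag sequences; WNUT 17 uses labels such as B-location, B-person, ...
references = [["O", "B-location", "I-location", "O", "B-person"]]
predictions = [["O", "B-location", "O", "O", "B-person"]]

results = seqeval.compute(predictions=predictions, references=references)
print(results["overall_precision"], results["overall_recall"], results["overall_f1"])
```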
Base model: distilbert/distilbert-base-uncased