How to use theArif/bert-finetuned-ner-conll with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("token-classification", model="theArif/bert-finetuned-ner-conll")
```
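Once loaded, the pipeline can be called on raw text. A quick usage sketch (the sample sentence and the `aggregation_strategy` argument are illustrative additions, not part of the original card):

```python
from transformers import pipeline

# aggregation_strategy="simple" merges sub-word tokens into whole entity spans
ner = pipeline(
    "token-classification",
    model="theArif/bert-finetuned-ner-conll",
    aggregation_strategy="simple",
)

for entity in ner("Wolfgang lives in Berlin."):
    # Each result carries entity_group, score, word, start, end
    print(entity["entity_group"], entity["word"], f"{entity['score']:.3f}")
```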
```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForTokenClassification

tokenizer = AutoTokenizer.from_pretrained("theArif/bert-finetuned-ner-conll")
model = AutoModelForTokenClassification.from_pretrained("theArif/bert-finetuned-ner-conll")
```
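For inference without the pipeline wrapper, the tokenizer and model loaded above can be wired together by hand. A minimal sketch, assuming PyTorch is installed; the sample sentence is illustrative:

```python
import torch

# Tokenize a sample sentence and run a forward pass
inputs = tokenizer("Wolfgang lives in Berlin.", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Map each token's highest-scoring class id back to its label string
predicted_ids = logits.argmax(dim=-1)[0].tolist()
tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0].tolist())
for token, label_id in zip(tokens, predicted_ids):
    print(token, model.config.id2label[label_id])
```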
This model is a fine-tuned version of bert-base-cased on the conll2003 dataset. It achieves the following results on the evaluation set (the final-epoch validation metrics from the table below):

- Loss: 0.0593
- Precision: 0.9152
- Recall: 0.9403
- F1: 0.9275
- Accuracy: 0.9845

Model description: more information needed.

Intended uses & limitations: more information needed.

Training and evaluation data: more information needed.
Training hyperparameters: more information needed. The following results were recorded at each training epoch:
| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
|---|---|---|---|---|---|---|---|
| No log | 1.0 | 293 | 0.0773 | 0.8819 | 0.9138 | 0.8976 | 0.9775 |
| 0.1657 | 2.0 | 586 | 0.0598 | 0.9101 | 0.9374 | 0.9236 | 0.9835 |
| 0.1657 | 3.0 | 879 | 0.0593 | 0.9152 | 0.9403 | 0.9275 | 0.9845 |
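The precision, recall, F1, and accuracy columns are presumably entity-level scores computed with seqeval, the standard scorer for CoNLL-2003 NER (an assumption; the card does not name its metric implementation). A minimal sketch of how such scores are computed:

```python
# Entity-level precision/recall/F1/accuracy in the CoNLL-2003 style
# (assumes the `evaluate` and `seqeval` packages are installed)
import evaluate

seqeval = evaluate.load("seqeval")

# Toy IOB2 label sequences, for illustration only
predictions = [["B-PER", "I-PER", "O", "B-LOC"]]
references = [["B-PER", "I-PER", "O", "B-ORG"]]

results = seqeval.compute(predictions=predictions, references=references)
print(results["overall_precision"], results["overall_recall"],
      results["overall_f1"], results["overall_accuracy"])
```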