How to use TehranNLP-org/bert-large-sst2 with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-classification", model="TehranNLP-org/bert-large-sst2")
```
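For a quick check, the pipeline can be called on a raw sentence. The input below is an illustrative example rather than something from the model card, and the exact label strings returned depend on the model's id2label mapping.

```python
# Run the pipeline on a sample sentence (illustrative input).
result = pipe("a gripping, beautifully acted drama")
print(result)  # e.g. [{'label': ..., 'score': ...}]
```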
```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("TehranNLP-org/bert-large-sst2")
model = AutoModelForSequenceClassification.from_pretrained("TehranNLP-org/bert-large-sst2")
```
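With the model loaded directly, inference is a tokenize, forward pass, and argmax. This is a minimal sketch; the example sentence is illustrative, and the label names are read from the model's own config rather than stated anywhere in the card.

```python
import torch

text = "a gripping, beautifully acted drama"  # illustrative input
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

probs = torch.softmax(logits, dim=-1)
pred = probs.argmax(dim=-1).item()
print(model.config.id2label[pred], probs[0, pred].item())
```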
This model is a fine-tuned version of bert-large-uncased on the SST2 dataset; its results on the evaluation set are reported in the training results table below.

Model description, intended uses & limitations, and training and evaluation data: more information needed.
The following results were recorded at each training epoch:
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|---|---|---|---|---|
| No log | 1.0 | 2105 | 0.2167 | 0.9232 |
| 0.2049 | 2.0 | 4210 | 0.2375 | 0.9278 |
| 0.123 | 3.0 | 6315 | 0.2636 | 0.9243 |
| 0.0839 | 4.0 | 8420 | 0.2865 | 0.9243 |
| 0.058 | 5.0 | 10525 | 0.3109 | 0.9255 |
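The card does not list the training hyperparameters. For reference, a comparable fine-tuning run could be set up with the Trainer API roughly as sketched below; the learning rate and batch size are assumptions rather than values from the card, and only the five-epoch count matches the table above.

```python
# Sketch of a BERT-large fine-tuning run on SST-2 with the Trainer API.
# Hyperparameter values marked "assumed" are illustrative, not from the card.
from datasets import load_dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

dataset = load_dataset("stanfordnlp/sst2")
tokenizer = AutoTokenizer.from_pretrained("bert-large-uncased")

def tokenize(batch):
    return tokenizer(batch["sentence"], truncation=True)

tokenized = dataset.map(tokenize, batched=True)
model = AutoModelForSequenceClassification.from_pretrained("bert-large-uncased", num_labels=2)

args = TrainingArguments(
    output_dir="bert-large-sst2",
    learning_rate=2e-5,               # assumed
    per_device_train_batch_size=32,   # assumed
    num_train_epochs=5,               # matches the 5 epochs in the table above
    evaluation_strategy="epoch",
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=tokenized["train"],
    eval_dataset=tokenized["validation"],
    tokenizer=tokenizer,  # enables dynamic padding via the default data collator
)
trainer.train()
```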