Text Classification
Transformers
Safetensors
English
Vietnamese
esg_hierarchical
esg
classification
hierarchical
multi-task-learning
sustainability
custom_code
Instructions to use chungpt2123/esg-subfactor-classifier with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use chungpt2123/esg-subfactor-classifier with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-classification", model="chungpt2123/esg-subfactor-classifier", trust_remote_code=True)

# Load model directly
from transformers import AutoModelForSequenceClassification

model = AutoModelForSequenceClassification.from_pretrained("chungpt2123/esg-subfactor-classifier", trust_remote_code=True, dtype="auto")
```
- Notebooks
- Google Colab
- Kaggle
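The model card's tags describe a hierarchical multi-task classifier (pillar first, then subfactor). As a minimal sketch of what that decision logic can look like, the snippet below picks an ESG pillar from one set of logits and then a subfactor within that pillar from a second set. The pillar and subfactor label names here are hypothetical illustrations, not the model's actual label set.

```python
import math

# Hypothetical label hierarchy for illustration only; the real model's
# labels are defined in its config on the Hub.
PILLARS = ["environmental", "social", "governance"]
SUBFACTORS = {
    "environmental": ["emissions", "resource_use"],
    "social": ["labor", "community"],
    "governance": ["board", "ethics"],
}

def softmax(logits):
    """Numerically stable softmax over a list of floats."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def hierarchical_predict(pillar_logits, subfactor_logits):
    """Pick the ESG pillar first, then the subfactor within that pillar."""
    pillar_probs = softmax(pillar_logits)
    pillar = PILLARS[pillar_probs.index(max(pillar_probs))]
    sub_probs = softmax(subfactor_logits[pillar])
    subfactor = SUBFACTORS[pillar][sub_probs.index(max(sub_probs))]
    return pillar, subfactor

pillar, subfactor = hierarchical_predict(
    [2.0, 0.1, -1.0],
    {"environmental": [1.5, 0.2], "social": [0.0, 0.0], "governance": [0.0, 0.0]},
)
print(pillar, subfactor)  # → environmental emissions
```

Conditioning the subfactor choice on the predicted pillar is what distinguishes a hierarchical head from a single flat softmax over all subfactors.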
Update README.md
README.md
CHANGED

````diff
@@ -40,7 +40,7 @@ The model uses a hierarchical approach:
 from transformers import AutoTokenizer, AutoModelForSequenceClassification
 
 # Load model and tokenizer
-model = AutoModelForSequenceClassification.from_pretrained("chungpt2123/
+model = AutoModelForSequenceClassification.from_pretrained("chungpt2123/esg-subfactor-classifier", trust_remote_code=True)
 tokenizer = AutoTokenizer.from_pretrained("Alibaba-NLP/gte-multilingual-base")
 
 # Example usage
@@ -69,10 +69,6 @@ The model achieves strong performance on ESG classification tasks with hierarchi
 - Performance may vary on domain-specific or technical ESG content
 - Best performance on texts similar to training data distribution
 
-## Citation
-
-If you use this model, please cite:
-
 ```bibtex
 @misc{esg_hierarchical_model,
   title={ESG Hierarchical Multi-Task Learning Model},
````