Instructions for using microsoft/Magma-8B with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.
- Libraries
- Transformers
How to use microsoft/Magma-8B with Transformers:
```python
# Load model directly
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained(
    "microsoft/Magma-8B",
    trust_remote_code=True,
    dtype="auto",
)
```
- Notebooks
- Google Colab
- Kaggle
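Loading the model is only the first step; Magma-8B is multimodal, so inference also needs the model's processor to combine an image with a text prompt. The sketch below wraps the whole pipeline in one function. It is a sketch under assumptions: the function name `run_magma`, the image placeholder handling, and the `processor(images=..., texts=...)` keyword names are illustrative, since the actual processing logic is remote code shipped with the model (`trust_remote_code=True`) rather than a fixed Transformers API.

```python
def run_magma(image_path: str, prompt: str, model_id: str = "microsoft/Magma-8B") -> str:
    """Sketch of single-image inference with Magma-8B (names partly assumed)."""
    # Imports are local so the sketch can be defined without heavyweight deps.
    import torch
    from PIL import Image
    from transformers import AutoModelForCausalLM, AutoProcessor

    # trust_remote_code is required: Magma ships custom modeling/processing code.
    model = AutoModelForCausalLM.from_pretrained(
        model_id, trust_remote_code=True, dtype="auto"
    )
    processor = AutoProcessor.from_pretrained(model_id, trust_remote_code=True)
    model.to("cuda" if torch.cuda.is_available() else "cpu")

    image = Image.open(image_path).convert("RGB")
    # Assumption: the remote-code processor accepts images and text together
    # and returns a dict of tensors ready for generate().
    inputs = processor(images=[image], texts=prompt, return_tensors="pt")
    inputs = inputs.to(model.device)

    output_ids = model.generate(**inputs, max_new_tokens=128, do_sample=False)
    return processor.decode(output_ids[0], skip_special_tokens=True)
```

Usage would look like `run_magma("screenshot.png", "What is shown in this image?")`; the Colab and Kaggle notebooks linked above cover the same flow end to end.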
Community discussions:
- #16 "Use this model on kaggle not working", opened 11 months ago by Nirav-Madhani
- #15 "Magma Deployment", opened about 1 year ago by chrishoertnagl
- #14 "Edit 536 lines due to type mismatch error when evaluation", opened about 1 year ago by shoveling42
- #13 "Fine-Tuning on Custom UIs & Tasks" (2 replies), opened about 1 year ago by chrishoertnagl