Instructions for using grammarly/coedit-xl with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.
- Libraries
- Transformers
How to use grammarly/coedit-xl with Transformers:
```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("grammarly/coedit-xl")
model = AutoModelForSeq2SeqLM.from_pretrained("grammarly/coedit-xl")
```

- Notebooks
- Google Colab
- Kaggle
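CoEdIT is an instruction-tuned seq2seq model, so the edit instruction is passed as a plain-text prefix to the sentence being edited. Below is a minimal sketch of a full inference call; the `build_prompt` helper and the `run_example` wrapper are illustrative assumptions (not part of the Transformers API), and the "Fix the grammar:" instruction wording is an example of the prefix style the model card uses.

```python
# Sketch: sending an edit instruction through grammarly/coedit-xl.
# build_prompt and run_example are hypothetical helpers for illustration.

def build_prompt(instruction: str, sentence: str) -> str:
    """CoEdIT takes the instruction inline, before the text to edit."""
    return f"{instruction} {sentence}"

def run_example() -> str:
    # Imported lazily so the rest of the sketch runs without the
    # heavy dependency; the first call downloads ~11 GB of weights.
    from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

    tokenizer = AutoTokenizer.from_pretrained("grammarly/coedit-xl")
    model = AutoModelForSeq2SeqLM.from_pretrained("grammarly/coedit-xl")

    prompt = build_prompt(
        "Fix the grammar:",
        "When I grow up, I start to understand what he said is quite right.",
    )
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_length=256)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)

if __name__ == "__main__":
    print(build_prompt("Fix the grammar:", "She no went to the market."))
```

Because the model is an 11 GB checkpoint, loading it once and reusing the `tokenizer`/`model` pair across many prompts is usually preferable to reloading per call.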