transformers
https://github.com/huggingface/transformers
Python
🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX.
Issues
- LLaMa 3 8B - offloaded_static cache - layer_device_map TypeError
- Checkpoints are saved by processes other than the main process during training
- about code
- Fix deprecated torch.cuda.amp.autocast usage in Nemotron model
- Add support for Allegro
- GPTQ Quantization reduces number of parameters by a lot (Factor of more than 30) to the point that model is unusable
- ValueError: Architecture deepseek2 not supported
- Set `open` encoding to `utf-8` for Windows compatibility
- Fix: siglip image processor rgb_convert is not being applied correctly.
- Confusion about the words returned by `word_ids()` in `deberta-v3-base`
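One of the issues above concerns the `utf-8` encoding fix for Windows: Python's built-in `open` defaults to the platform's locale encoding (`locale.getpreferredencoding()`), which on many Windows setups is a legacy code page such as cp1252 that cannot represent characters like the 🤗 emoji. A minimal sketch of why passing `encoding="utf-8"` explicitly matters (the file name and text here are illustrative, not from the repository):

```python
import os
import tempfile

# Text containing non-ASCII characters that a cp1252-style locale default cannot encode.
text = "🤗 Transformers: état de l'art"

path = os.path.join(tempfile.mkdtemp(), "readme.txt")

# With an explicit encoding, the bytes written are identical on every platform.
# Omitting encoding= would fall back to the locale default, which raises
# UnicodeEncodeError for the emoji on a cp1252 Windows setup.
with open(path, "w", encoding="utf-8") as f:
    f.write(text)

# Reading back with the same explicit encoding round-trips the text exactly.
with open(path, "r", encoding="utf-8") as f:
    assert f.read() == text
```

This is why such fixes typically touch every bare `open(...)` call in a codebase: each one silently depends on the host locale until the encoding is pinned.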