transformers
https://github.com/huggingface/transformers
Python
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
- Issues
- Error when running Grounding DINO for batch inference.
- Mask Padded Tokens In Aspect Ratio Mask Preparation
- Improve EncoderDecoderModel docs
- MLU devices: Checks if mlu is available via a cndev-based check which won't trigger the drivers and leave mlu
- LLaMa 3 8B - offloaded_static cache - layer_device_map TypeError
- Not only main process will save checkpoints during training
- about code
- Fix deprecated torch.cuda.amp.autocast usage in Nemotron model
- Add support for Allegro
- GPTQ Quantization reduces number of parameters by a lot (Factor of more than 30) to the point that model is unusable