transformers
https://github.com/huggingface/transformers
Python
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
Triage Issues!
When you volunteer to triage issues, you'll receive an email each day with a link to an open issue that needs help in this project. You'll also receive instructions on how to triage issues.
Triage Docs!
Receive a documented method or class from your favorite GitHub repos in your inbox every day. If you're a real pro, receive undocumented methods or classes and supercharge your commit history.
Python not yet supported
36 Subscribers
Help out
- Issues
  - Passing nn.Parameter values within the model architecture as deep copies.
  - Aryan_33290
  - fix cpu bnb path
  - about gradient accumulation
  - Add functionality for deleting adapter layers in PEFT integration
  - Add EXAONE
  - The support of `Mllama` in AutoModel
  - Added explicit error handling to a few modules
  - Different LlamaRotaryEmbedding in old and new versions of transformers
  - Inconsistent Hyperparameter Application in Distributed DeepSpeed + Optuna Setup for Hyperparameter Search
- Docs
  - Python not yet supported