transformers
https://github.com/huggingface/transformers
Python
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
Triage Issues!
When you volunteer to triage issues, you'll receive an email each day with a link to an open issue that needs help in this project. You'll also receive instructions on how to triage issues.
Triage Docs!
Receive a documented method or class from your favorite GitHub repos in your inbox every day. If you're a real pro, receive undocumented methods or classes and supercharge your commit history.
Python not yet supported
Help out
- Issues
- callback to implement how the predictions should be stored.
- [i18n-<languageCode>] Translating docs to <languageName>
- Adding warnings or errors when provided sequence length is bigger than config.max_position_embeddings
- Added error when sequence length is bigger than max_position_embeddings
- DINOv2 register support
- Does apply_chat_template support function call usage?
- _prepare_4d_causal_attention_mask mask inversion should work with boolean masks
- fix: resolve bug with `use_mps_device` setting not taking effect
- Update benchmark.py: Enhance Benchmarking with Multi-Commit Support …
- Output from model.generate & model.forward is not the same when output attention/hidden_state is True
- Docs
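Several of the issues above concern `apply_chat_template`, which renders a list of role-tagged messages into a model-specific prompt string. As a rough illustration only (the real method lives on a `transformers` tokenizer and uses each model's own Jinja template; `render_plain` below is a hypothetical toy renderer, not the library API), the message format looks like this:

```python
# Messages in the role/content format consumed by tokenizer.apply_chat_template
messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "What's the weather in Paris?"},
]

def render_plain(messages):
    """Toy stand-in for a chat template: flatten role-tagged
    messages into a single prompt string (illustration only)."""
    return "\n".join(f"{m['role']}: {m['content']}" for m in messages)

print(render_plain(messages))
```

Whether tool/function definitions are rendered depends on the specific model's chat template, which is exactly what the issue above is asking about.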