transformers
https://github.com/huggingface/transformers
Python
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
Triage Issues!
When you volunteer to triage issues, you'll receive an email each day with a link to an open issue that needs help in this project. You'll also receive instructions on how to triage issues.
Triage Docs!
Receive a documented method or class from your favorite GitHub repos in your inbox every day. If you're a real pro, receive undocumented methods or classes instead and supercharge your commit history.
Python not yet supported
39 Subscribers
Help out
- Issues
- How can I disable legacy processing in llava-next?
- Installation Error for transformers Package (🔥 maturin failed)
- T5 static cache
- Tokenizer does not split text according to newly added input tokens
- Support Constant Learning Rate with Cooldown
- [Feature Request] Add beam search text streaming visualization feature
- Calling `to()` is not supported for `4-bit` quantized models with the installed version of bitsandbytes. The current device is `cuda:0`. If you intended to move the model, please install bitsandbytes >= 0.43.2. (see the sketch after this list)
- Compatibility Issue with Python 3.13
- Allow static cache to be larger than sequence length / batch size for encoder-decoder models
- `tokenizer` should be replaced with `processing_class` in `Seq2SeqTrainer`?
- Docs
- Python not yet supported
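
One of the issues above quotes the error bitsandbytes raises when `to()` is called on a 4-bit quantized model. Below is a minimal sketch of the usual workaround, assuming a CUDA device and with an illustrative checkpoint name not taken from the issue: place the weights at load time with `device_map` (or upgrade to bitsandbytes >= 0.43.2) rather than calling `model.to(...)` afterwards.

```python
# A minimal sketch, assuming bitsandbytes and accelerate are installed:
# load a model quantized to 4-bit and place it on a device at load time
# via device_map. Calling model.to("cuda:0") on the quantized model
# afterwards is what raises the error quoted above when
# bitsandbytes < 0.43.2 is installed.
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig

quant_config = BitsAndBytesConfig(
    load_in_4bit=True,                     # quantize weights to 4-bit on load
    bnb_4bit_compute_dtype=torch.float16,  # dtype used for computation
)

model = AutoModelForCausalLM.from_pretrained(
    "facebook/opt-350m",                   # illustrative checkpoint, not from the issue
    quantization_config=quant_config,
    device_map="cuda:0",                   # device placement at load time, no .to() needed
)
```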