transformers
https://github.com/huggingface/transformers
Python
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
Python not yet supported · 39 subscribers
Help out
- Issues
- Docs don't mention that the default padding side is "right", and some configuration default behaviours are missing from the library docs
- Update conftest.py
- Unknown quantization type, got fp8
- cover edge case where decoder_input_ids is provided but it is empty
- 8bits GPTQ quantization output
- "Is it possible for Hugging Face to implement a chat model for quick information retrieval similar to vLLM?"
- Support SDPA & Flash Attention 2 for LayoutLMv3
- LayerDrop broken in various Flax models (Whisper/BART/more...)
- Add SDPA support for LayoutLMv3 model
- ModernBert: reuse GemmaRotaryEmbedding via modular + Integration tests
- Docs
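One of the issues above concerns the tokenizer's default padding side. As a minimal, stdlib-only sketch (not the transformers API — `pad_batch` and `pad_id=0` are hypothetical choices for illustration), this shows what "right" versus "left" padding means when batching token sequences of unequal length:

```python
# Hypothetical helper, not part of transformers: illustrates padding side
# when batching token id sequences of unequal length. pad_id=0 is an
# assumed padding token id.

def pad_batch(seqs, pad_id=0, padding_side="right"):
    """Pad every sequence to the length of the longest one."""
    max_len = max(len(s) for s in seqs)
    padded = []
    for s in seqs:
        pad = [pad_id] * (max_len - len(s))
        # "right" appends padding after the tokens; "left" prepends it,
        # which decoder-only models typically require for generation.
        padded.append(s + pad if padding_side == "right" else pad + s)
    return padded

print(pad_batch([[1, 2, 3], [4]], padding_side="right"))  # [[1, 2, 3], [4, 0, 0]]
print(pad_batch([[1, 2, 3], [4]], padding_side="left"))   # [[1, 2, 3], [0, 0, 4]]
```

In transformers itself the equivalent choice is exposed as the tokenizer's `padding_side` attribute, which the issue asks to be documented more clearly.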