transformers
https://github.com/huggingface/transformers
Python
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
Triage Issues!
When you volunteer to triage issues, you'll receive an email each day with a link to an open issue that needs help in this project. You'll also receive instructions on how to triage issues.
Triage Docs!
Receive a documented method or class from your favorite GitHub repos in your inbox every day. If you're a real pro, receive undocumented methods or classes and supercharge your commit history.
Python not yet supported · 35 Subscribers
View all Subscribers · Add a CodeTriage badge to transformers
Help out
- Issues
- Vision Encoder-Decoder fails with LLaMA decoder due to missing cross-attention implementation
- CUDA Out Of Memory when training a DETR Object detection model with compute_metrics
- add empty cache before load_best_model to prevent cuda OOM
- GLM/ChatGLM badly broken in HF
- Bug when using StaticCache in Qwen2.5 Inference
- Confusing error message
- Neftune computation is probably wrong with packed training
- Fix for TypeError in train_new_from_iterator() in tokenization_utils_fast.py
- Clear unused allocated GPU memory when available GPU memory is low.
- FlaxWhisperForConditionalGeneration Out Of Memory Error
- Docs (Python not yet supported)