transformers
https://github.com/huggingface/transformers
Python
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
Triage Issues!
When you volunteer to triage issues, you'll receive an email each day with a link to an open issue that needs help in this project. You'll also receive instructions on how to triage issues.
Triage Docs!
Receive a documented method or class from your favorite GitHub repos in your inbox every day. If you're really pro, receive undocumented methods or classes and supercharge your commit history.
Docs triage is not yet supported for Python.
31 Subscribers
Help out
- Issues
- ValueError: LlavaForConditionalGeneration does not support an attention implementation through torch.nn.functional.scaled_dot_product_attention yet. Please open an issue on GitHub to request support for this architecture: https://github.com/huggingface/transformers/issues/new (a workaround sketch follows this list)
- [Flash Attention 2] Performance improvement
- [Trainer.train] learning rate logging inconsistency: learning rate for the future step is logged
- [Docs] Broken link in Kubernetes doc
- Add StyleTTS 2 to HF Transformers Pipeline
- OWL-VIT Vision Foundation Model deployment in the edge cases - Need SDPA support for OWL-ViT Model optimization for Edge Deployment
- Expose `gradient_as_bucket_view` as training argument for `DDP` (see the DDP sketch after this list)
- Can i convert open-clip trained models (.pt) using code “src/transformers/models/clip/convert_clip_original_pytorch_to_hf.py” ?
- Add time progress bar to track the group_by_length computation for bigger datasets on Trainer (see the TrainingArguments sketch after this list)
- Mixtral: Reduce and Increase Expert Models
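For the LlavaForConditionalGeneration SDPA error above, a minimal workaround sketch, assuming a transformers release that accepts the `attn_implementation` keyword in `from_pretrained`; the checkpoint name is illustrative:

```python
from transformers import LlavaForConditionalGeneration

# Fall back to the eager attention implementation, since SDPA
# (torch.nn.functional.scaled_dot_product_attention) is not yet
# supported for this architecture.
model = LlavaForConditionalGeneration.from_pretrained(
    "llava-hf/llava-1.5-7b-hf",  # illustrative checkpoint
    attn_implementation="eager",
)
```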
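For the `gradient_as_bucket_view` request above: the flag already exists on PyTorch's own DDP wrapper, and the issue asks for it to be exposed through the training arguments. A minimal single-process sketch of the underlying PyTorch option (the backend, addresses, and model are placeholders so the snippet runs standalone):

```python
import os

import torch.distributed as dist
import torch.nn as nn
from torch.nn.parallel import DistributedDataParallel as DDP

# Single-process process group so the sketch runs on its own.
os.environ.setdefault("MASTER_ADDR", "127.0.0.1")
os.environ.setdefault("MASTER_PORT", "29500")
dist.init_process_group("gloo", rank=0, world_size=1)

model = nn.Linear(128, 10)  # placeholder model

# gradient_as_bucket_view=True lets gradients share storage with
# DDP's communication buckets, avoiding an extra gradient-sized copy.
ddp_model = DDP(model, gradient_as_bucket_view=True)

dist.destroy_process_group()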
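For the group_by_length progress-bar request above, a minimal sketch of the Trainer setting whose length-scanning pass the issue wants tracked; `output_dir` is illustrative:

```python
from transformers import TrainingArguments

# group_by_length=True makes the Trainer bucket samples of similar
# length together. Computing those lengths is the pass the issue
# wants a progress bar for; it can be slow on large datasets.
args = TrainingArguments(
    output_dir="out",
    group_by_length=True,
    length_column_name="length",  # a precomputed length column skips the scan
)
```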