transformers
https://github.com/huggingface/transformers
Python
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
- Issues
- SigLIP tokenizer not enforcing use_fast=True
- Include other tokenizers/image processors in Llava
- LLaVA `torch.compile` implementation
- Implement SuperGlue model
- Int4 transformers training
- Added HelpingAI model type in it
- Bart evaluation throws the following error at generate(): UnboundLocalError: 'model_kwargs['decoder_attention_mask']' is used before assignment
- About new ClearML Integrations
- Move weight initialization for DeformableDetr
- Mixture of All Intelligence (MoAI)