transformers
https://github.com/huggingface/transformers
Python
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
Triage Issues!
When you volunteer to triage issues, you'll receive an email each day with a link to an open issue that needs help in this project. You'll also receive instructions on how to triage issues.
Triage Docs!
Receive a documented method or class from your favorite GitHub repos in your inbox every day. If you're a real pro, receive undocumented methods or classes and supercharge your commit history.
Help out
- Issues
  - Object detection evaluator (see the first sketch after this list)
  - Universal Speculative Decoding `CandidateGenerator`
  - Unexpected output of `_flash_attention_forward()` for cross attention
  - Make it possible to save and evaluate a checkpoint on CTRL+C / `KeyboardInterrupt` with the Hugging Face Trainer (see the second sketch after this list)
  - Only fine-tune the embeddings of the added special tokens (see the third sketch after this list)
  - Stop requiring `CacheConfig` in `GenerationConfig` with `StaticCache`
  - Adjust beam search early stopping to any criteria as opposed to all
  - Replace all `torch.FloatTensor` by `torch.Tensor`
  - Switch from `training_args.bin` to `training_args.json`
  - [doc] DeepSpeed universal checkpoint
- Docs
  - Python not yet supported
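For the object detection evaluator request, here is a minimal sketch of how COCO-style mAP can be computed today with torchmetrics' `MeanAveragePrecision` while no built-in evaluator exists. The choice of torchmetrics is an assumption, not something the issue prescribes, and the boxes, scores, and labels below are made-up placeholders.

```python
# Minimal sketch: compute COCO-style mAP for object detection predictions
# with torchmetrics, pending a built-in evaluator. Boxes are placeholders
# in (xmin, ymin, xmax, ymax) format.
import torch
from torchmetrics.detection import MeanAveragePrecision

preds = [{
    "boxes": torch.tensor([[10.0, 20.0, 110.0, 220.0]]),
    "scores": torch.tensor([0.9]),
    "labels": torch.tensor([3]),
}]
targets = [{
    "boxes": torch.tensor([[12.0, 18.0, 108.0, 215.0]]),
    "labels": torch.tensor([3]),
}]

metric = MeanAveragePrecision(box_format="xyxy")
metric.update(preds, targets)
print(metric.compute()["map"])
```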
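For the CTRL+C item, the issue asks for built-in support in `Trainer`; until then, one workaround is to catch `KeyboardInterrupt` around `trainer.train()` and save manually. A minimal sketch, assuming `model`, `train_dataset`, and `eval_dataset` are already defined:

```python
# Minimal sketch: save a usable checkpoint when training is interrupted
# with CTRL+C. Assumes `model`, `train_dataset`, and `eval_dataset` exist;
# the issue asks for this behaviour to be built into Trainer itself.
from transformers import Trainer, TrainingArguments

args = TrainingArguments(output_dir="out", num_train_epochs=3)
trainer = Trainer(model=model, args=args,
                  train_dataset=train_dataset, eval_dataset=eval_dataset)

try:
    trainer.train()
except KeyboardInterrupt:
    # Persist the weights and trainer state so the run can be resumed or evaluated.
    trainer.save_model("out/interrupted-checkpoint")
    trainer.save_state()
    print(trainer.evaluate())
```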
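For the added-special-tokens item, a minimal sketch of one common workaround: freeze the whole model, leave only the input embedding matrix trainable, and mask its gradient so that only the rows of the newly added tokens are updated. The model name and token strings are placeholders, and the gradient-masking hook is a workaround, not an official Transformers API.

```python
# Minimal sketch: update only the embedding rows of newly added special tokens.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")          # placeholder model
model = AutoModelForCausalLM.from_pretrained("gpt2")

new_tokens = ["<my_tok_1>", "<my_tok_2>"]                   # hypothetical special tokens
tokenizer.add_special_tokens({"additional_special_tokens": new_tokens})
model.resize_token_embeddings(len(tokenizer))

# Freeze everything, then re-enable gradients only for the input embeddings.
for param in model.parameters():
    param.requires_grad = False
embeddings = model.get_input_embeddings()
embeddings.weight.requires_grad = True

# Zero the gradient for every row except the newly added token ids.
new_ids = tokenizer.convert_tokens_to_ids(new_tokens)
mask = torch.zeros(embeddings.weight.shape[0], 1)
mask[new_ids] = 1.0
embeddings.weight.register_hook(
    lambda grad: grad * mask.to(device=grad.device, dtype=grad.dtype)
)
```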