transformers
https://github.com/huggingface/transformers
Python
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
30 Subscribers
Help out
- Issues
- Does GroundingDINO support batched inference?
- Error when running chatglm3: 'GenerationConfig' object has no attribute '_eos_token_tensor'
- auto_find_batch_size for OOM during evaluation
- Uniformize kwargs for Layoutlm (2, 3, X) processors
- Uniformize kwargs for chameleon processor
- Enable speculative decoding with batch size >1
- `dataloader_prefetch_factor` is left unused for datasets of type `IterableDataset`
- [WIP] - Enable speculative decoding with batch size >1
- Add Matching Anything by Segmenting Anything (MASA) MOT tracking model
- add scaling_factor to GemmaRotaryEmbedding for fix error in GemmaLine…
- Docs
- Python not yet supported