transformers
https://github.com/huggingface/transformers
Python
🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX.
Triage Issues!
When you volunteer to triage issues, you'll receive an email each day with a link to an open issue that needs help in this project. You'll also receive instructions on how to triage issues.
Triage Docs!
Receive a documented method or class from your favorite GitHub repos in your inbox every day. If you're a real pro, receive undocumented methods or classes and supercharge your commit history.
Python not yet supported
31 Subscribers
Help out
- Issues
- Could we add a linear projection layer in a pre-trained model?
- [2023-12-04 11:52:08,378] [INFO] [autotuner.py:1110:run_after_tuning] No optimal DeepSpeed configuration found by autotuning.
- Add LayoutLMProcessor
- DeepSpeed autotuning integration
- [WIP] Uniformize processors in text+image multimodal models.
- Add the CRATE (Coding RATE) backbone model
- TECO - Temporally Consistent Transformers for Video Generation
- Pipeline instantiation of model "facebook/nllb-200-distilled-600M" requires source and target languages as mandatory
- Add flag for easily finetuning heads / linear probing to AutoModelforSequenceClassification
- Add support for llama.cpp