transformers
https://github.com/huggingface/transformers
Python
🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX.
Python not yet supported · 30 subscribers
Help out
- Issues
- rework `test_multi_gpu_data_parallel_forward`
- Add ability to specify input device for ffmpeg_microphone()
- Whether OutEffHop can be supported in Transformers
- Checkpoint saving by different evaluation criteria
- Add basic eval table logging for WandbCallback
- feat: adding mplugdocowl
- Using accelerate launch with FSDP causes weights saved from the 2nd time onwards to be incomplete
- sdpa for bert causes NaN when using bfloat16 with padding
- Add option to only install AutoTokenizer for production environment
- Trainer should throw a warning if max_sequence_length < number of tokens in dataset sample record.
- Docs