transformers
https://github.com/huggingface/transformers
Python
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
Help out
- Issues
- Uniform kwargs for processors
- Initializes generators in trainer.py with the device specified in sel…
- Transformers.Trainer fails to create an optimizer for optim adamw_torch_fused when launched with DeepSpeed.
- Add cosine_with_min_lr_schedule_with_warmup_lr_rate scheduler in Trainer
- Load fsdp+lora checkpoint error
- Add support for XTR
- Generate: XGLM can generate with inputs_embeds
- Fix some FastSpeech2Conformer failing tests
- NotImplementedError: ggml_type 3 not implemented
- Add NSP Labels Handling to DataCollatorForWholeWordMask for Simultaneous WWM and NSP Pre-training