transformers
https://github.com/huggingface/transformers
Python
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
Triage Issues!
When you volunteer to triage issues, you'll receive an email each day with a link to an open issue that needs help in this project. You'll also receive instructions on how to triage issues.
Triage Docs!
Receive a documented method or class from your favorite GitHub repos in your inbox every day. If you're a real pro, receive undocumented methods or classes and supercharge your commit history.
Python not yet supported
35 Subscribers
View all Subscribers
Add a CodeTriage badge to transformers
Help out
- Issues
- Add Microsoft CLAP model
- xpu device is not used running pipeline(device_map="auto")
- A bug that may cause device inconsistency
- RuntimeError: Expected a 'mps:0' generator device but found 'cpu'
- Set position_embeddings in BertEmbeddings for absolute position type only, to avoid unused parameters
- Added french version of preprocessing data and corrected English version
- Uniform kwargs for processors
- Initializes generators in trainer.py with the device specified in sel…
- Transformer.Trainer fails in creating optimizer for optim adamw_torch_fused when launched with deepspeed.
- Add cosine_with_min_lr_schedule_with_warmup_lr_rate scheduler in Trainer
- Docs
- Python not yet supported