transformers
https://github.com/huggingface/transformers
Python
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
Issues
- ReduceLROnPlateau version of get_constant_schedule_with_warmup
- Mistral in Flax: generation is slow, JIT fails
- ProcessorMixin doesn't properly instantiate image processors
- The results of run_mae.py pre-training were abnormal
- Transformers Agents Colab Notebook - OpenAI Run Mode Issues
- Add Mixture of Tokens model
- MPNet doesn't have an implemented LMHead subclass
- add_code_shell
- hyperparameter_search() does not consider LoRA parameters like r to be fine-tuned
- Please correct the following DeepSpeed config values that mismatch TrainingArguments values: scheduler.params.total_num_steps=0 vs hf num_training_steps (calculated)= 260
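The first issue above asks for a ReduceLROnPlateau counterpart to `get_constant_schedule_with_warmup`. As a rough illustration of the requested behavior — not the transformers API, and with all names hypothetical — the combined schedule logic might look like this: linear warmup to the base learning rate, then multiplicative reduction whenever the monitored metric plateaus.

```python
class WarmupPlateauSchedule:
    """Hypothetical sketch: linear warmup followed by reduce-on-plateau.

    This is NOT part of transformers; it only illustrates the schedule
    behavior requested in the issue, without depending on torch.
    """

    def __init__(self, base_lr, num_warmup_steps, factor=0.1, patience=2):
        self.base_lr = base_lr
        self.num_warmup_steps = num_warmup_steps
        self.factor = factor        # multiplicative LR reduction on plateau
        self.patience = patience    # bad evaluations tolerated before reducing
        self.step_count = 0
        self.best = float("inf")
        self.bad_evals = 0
        self.plateau_scale = 1.0

    def step(self):
        """Call once per optimizer step; returns the current learning rate."""
        self.step_count += 1
        if self.step_count < self.num_warmup_steps:
            warmup = self.step_count / max(1, self.num_warmup_steps)
        else:
            warmup = 1.0
        return self.base_lr * warmup * self.plateau_scale

    def eval_metric(self, value):
        """Call after each evaluation with the monitored loss."""
        if value < self.best:
            self.best = value
            self.bad_evals = 0
        else:
            self.bad_evals += 1
            if self.bad_evals > self.patience:
                self.plateau_scale *= self.factor
                self.bad_evals = 0
```

In a real implementation this logic would wrap a torch optimizer, e.g. as a custom `torch.optim.lr_scheduler` subclass, so that `Trainer` could call it like the existing warmup schedules.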