transformers
https://github.com/huggingface/transformers
Python
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
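For context on the library this page tracks, here is a minimal sketch of the pipeline API; the checkpoint name is only an illustrative assumption, any compatible sentiment-analysis model on the Hugging Face Hub would do:

```python
# Minimal sketch: load a sentiment-analysis pipeline from the transformers library.
# The checkpoint below is an assumed example, not something mandated by this page.
from transformers import pipeline

classifier = pipeline(
    "sentiment-analysis",
    model="distilbert-base-uncased-finetuned-sst-2-english",  # assumed example checkpoint
)

# Run a quick prediction; the exact score will vary by model version.
print(classifier("Triaging issues is surprisingly satisfying."))
# e.g. [{'label': 'POSITIVE', 'score': 0.99...}]
```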
Triage Issues!
When you volunteer to triage issues, you'll receive an email each day with a link to an open issue that needs help in this project. You'll also receive instructions on how to triage issues.
Triage Docs!
Receive a documented method or class from your favorite GitHub repos in your inbox every day. If you're a real pro, receive undocumented methods or classes instead and supercharge your commit history.
Python not yet supported
39 Subscribers
Help out
- Issues
  - Trainer: TensorBoardCallback not working for "on_save" and "on_save_end" events
  - Help Understanding Beam Search Scores in Hugging Face (LLaMA + LoRA)
  - from_pretrained fails to save weights.py and layers.py into the cache, and therefore fails to find them in the cache
  - The "dim" argument is gone from the LlamaRotaryEmbedding initializer. Is this intentional?
  - Running utils.fx.symbolic_trace on gpt2 raises torch.fx.proxy.TraceError: Proxy object cannot be iterated, which did not occur in the previous version
  - Unsupported: hasattr SkipFunctionVariable when compiling the Mixtral model with multiple GPUs
  - The Phi model does not have an lm_head bias after upgrading to v4.48.0
  - Still more model refactors!
  - Uniformize LlavaNextVideoProcessor kwargs
- Docs
  - Python not yet supported