transformers
https://github.com/huggingface/transformers
Python
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
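A minimal usage sketch of the library's high-level pipeline API (the task name, example input, and printed output are illustrative; the default checkpoint downloaded on first use may change between releases):

```python
from transformers import pipeline

# Build a sentiment-analysis pipeline; the default checkpoint is
# downloaded on first use, so exact labels/scores depend on the version.
classifier = pipeline("sentiment-analysis")
print(classifier("Triaging open issues is a great way to start contributing."))
# e.g. [{'label': 'POSITIVE', 'score': 0.99}]
```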
Triage Issues!
When you volunteer to triage issues, you'll receive an email each day with a link to an open issue that needs help in this project. You'll also receive instructions on how to triage issues.
Triage Docs!
Receive a documented method or class from your favorite GitHub repos in your inbox every day. If you're a real pro, receive undocumented methods or classes instead and supercharge your commit history.
Help out
- Issues
- Add logit scaling sdpa using `FlexAttention` for Gemma2 (see the sketch after this list)
- How to use 【examples/pytorch/contrastive-image-text】 to run inference
- Added resource class configuration option for `check_circleci_user` job
- Update modeling_seamless_m4t.py: SeamlessM4TForSpeechToText needs text_decoder, so it should not be in _keys_to_ignore_on_load_missing
- Fix assertion and value errors in sam-vit-h conversion script
- Integrate Liger (LinkedIn GPU Efficient Runtime) Kernel into Hugging Face
- feat: DeepSeekMoE
- Uniformize model processors (models *with* special arg names)
- Uniformize model processors (models *without* special arg names)
- Split head_dim from hidden_size for Llama, as in Gemma or Mistral
- Docs
- Python not yet supported
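For the `FlexAttention` issue listed above, a rough sketch of how attention-logit soft-capping can be expressed as a `score_mod` with PyTorch's FlexAttention. The cap value, tensor shapes, and CUDA device are illustrative assumptions, not the repository's actual implementation, and the API requires a recent PyTorch (2.5+):

```python
import torch
from torch.nn.attention.flex_attention import flex_attention

# Illustrative shapes: (batch, heads, seq_len, head_dim); assumes a CUDA device.
q = torch.randn(1, 8, 128, 64, device="cuda", dtype=torch.float16)
k = torch.randn(1, 8, 128, 64, device="cuda", dtype=torch.float16)
v = torch.randn(1, 8, 128, 64, device="cuda", dtype=torch.float16)

SOFT_CAP = 50.0  # assumed cap value; a real model would read this from its config

def soft_cap(score, b, h, q_idx, kv_idx):
    # Squash each attention logit into (-SOFT_CAP, SOFT_CAP) before softmax.
    return SOFT_CAP * torch.tanh(score / SOFT_CAP)

# score_mod lets FlexAttention apply the capping inside the fused kernel;
# wrapping the call in torch.compile is recommended for speed.
out = flex_attention(q, k, v, score_mod=soft_cap)
print(out.shape)  # torch.Size([1, 8, 128, 64])
```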