transformers
https://github.com/huggingface/transformers
Python
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
Python not yet supported
35 Subscribers
Help out
- Issues
- MllamaForCausalLM not returning past_key_values even with use_cache=True
- Missing timestamp offset using Whisper with pipeline and sequential decoding
- Documentation improvement on tiktoken integration
- Inconsistent Output with and without Prompt Caching in Llama-3.1-8B-Instruct.
- Beam search incorrectly sampled 2*num_beams tokens instead of num_beams tokens.
- Problem with RAG
- Update language_modeling.py
- Update stale.py
- feat: add support for tensor parallel using PyTorch 2.0
- requests.exceptions.ReadTimeout on already cached/downloaded model using SentenceTransformers
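Several of the issues above concern key/value caching (`use_cache=True`, `past_key_values`, prompt caching). The idea behind that cache can be sketched with a toy model: during autoregressive decoding, each step appends the new token's key/value projections instead of recomputing them for the whole prefix. This is a minimal conceptual sketch, not the transformers API; the `ToyAttentionCache` class and its fixed "projections" are hypothetical, for illustration only.

```python
# Conceptual sketch of KV caching in autoregressive decoding.
# NOT the transformers API: ToyAttentionCache is a hypothetical stand-in.

class ToyAttentionCache:
    """Stores past keys/values so each decode step processes only one new token."""

    def __init__(self):
        self.keys = []    # one cached entry per generated token
        self.values = []

    def step(self, token_embedding):
        # Without a cache, every step would recompute K/V for the entire prefix.
        # With a cache, we append just the new token's projections.
        k = [x * 0.5 for x in token_embedding]  # stand-in for the K projection
        v = [x * 2.0 for x in token_embedding]  # stand-in for the V projection
        self.keys.append(k)
        self.values.append(v)
        # Attention would now attend over the full cached sequence:
        return len(self.keys)

cache = ToyAttentionCache()
for tok in ([1.0, 0.0], [0.0, 1.0], [1.0, 1.0]):
    seen = cache.step(tok)
print(seen)  # 3 cached positions after 3 decode steps
```

The correctness concern raised in the prompt-caching issue is visible even in this sketch: outputs with and without a cache should be identical, since caching only avoids redundant computation rather than changing it.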