transformers
https://github.com/huggingface/transformers
Python
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
Help out
- Issues
- Fix convert_tokens_to_string when decoder is None
- Should use eigvalsh instead of eigvals for fast, stable covariance matrix diagonalization (see the eigvalsh sketch after this list)
- Feature to configure `stop_strings` in `generation_config.json` or other config files (see the stop_strings sketch after this list)
- RuntimeError: linalg.vector_norm: Expected a floating point or complex tensor as input. Got Long (see the vector_norm sketch after this list)
- When calling model.generate with num_beams=2 and num_return_sequences=2, the output sequences differ from the input_ids passed to stopping_criteria (see the stopping_criteria sketch after this list)
- Unhandled 'num_items_in_batch' in Mistral model
- Handle num_items_in_batch in Mistral's forward
- Mismatched keyword argument names in Llama make the gradient accumulation (GA) fix invalid
- Uniformize kwargs for SAM
- Does per_device_train_batch_size have a loss error similar to that of GA?
- Docs
- Python not yet supported
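
The eigvalsh item rests on a general linear-algebra point rather than anything transformers-specific: for a symmetric matrix such as a covariance matrix, `torch.linalg.eigvalsh` uses a symmetric-only solver that is faster and more numerically stable than the general `torch.linalg.eigvals`, and it returns real eigenvalues instead of complex ones. A minimal sketch of the difference, with made-up data for illustration:

```python
import torch

# Build a small symmetric covariance matrix from random data.
x = torch.randn(1000, 8)
cov = torch.cov(x.T)  # torch.cov treats rows as variables, so cov is 8x8

eigs_general = torch.linalg.eigvals(cov)     # general solver: complex output
eigs_symmetric = torch.linalg.eigvalsh(cov)  # symmetric solver: real, sorted, faster

print(eigs_general.dtype)    # torch.complex64
print(eigs_symmetric.dtype)  # torch.float32
```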
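The stop_strings request builds on behavior that recent transformers releases already expose at call time: `generate()` accepts `stop_strings`, but the tokenizer must be passed alongside so the strings can be matched against generated tokens. The feature request is to allow the same setting in `generation_config.json`. A hedged sketch of the existing call-time form, using gpt2 purely as an example model:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "gpt2"  # example model, not taken from the issue
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

inputs = tokenizer("The quick brown", return_tensors="pt")
out = model.generate(
    **inputs,
    max_new_tokens=20,
    stop_strings=["fox"],  # generation halts once a stop string is produced
    tokenizer=tokenizer,   # required so stop strings can be matched during decoding
)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```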
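The vector_norm item is a plain dtype error: `torch.linalg.vector_norm` only accepts floating point or complex tensors, so an integer (Long) tensor must be cast before the call. A minimal reproduction and the usual fix:

```python
import torch

token_ids = torch.tensor([1, 2, 3])  # integer tensor, dtype torch.int64 (Long)

# torch.linalg.vector_norm(token_ids)  # raises the RuntimeError from the issue title
norm = torch.linalg.vector_norm(token_ids.float())  # cast to floating point first
print(norm)  # tensor(3.7417), i.e. sqrt(1 + 4 + 9)
```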
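The beam-search item is easiest to observe with a pass-through StoppingCriteria that logs what it receives: during beam search the criteria are called on the live beam hypotheses (batch_size * num_beams rows), which need not match the sequences `generate()` ultimately returns. A hedged sketch, again with gpt2 as a stand-in model:

```python
import torch
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          StoppingCriteria, StoppingCriteriaList)

class LoggingCriteria(StoppingCriteria):
    """Never stops generation; only records the input_ids the criteria layer sees."""

    def __call__(self, input_ids: torch.LongTensor, scores: torch.FloatTensor, **kwargs):
        print("stopping_criteria saw shape:", tuple(input_ids.shape))
        # Report "not done" for every row so generation runs to max_new_tokens.
        return torch.zeros(input_ids.shape[0], dtype=torch.bool, device=input_ids.device)

model_id = "gpt2"  # example model, not taken from the issue
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

inputs = tokenizer("Hello", return_tensors="pt")
out = model.generate(
    **inputs,
    max_new_tokens=5,
    num_beams=2,
    num_return_sequences=2,
    stopping_criteria=StoppingCriteriaList([LoggingCriteria()]),
)
print("returned shape:", tuple(out.shape))
```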