transformers
https://github.com/huggingface/transformers
Python
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
Triage Issues!
When you volunteer to triage issues, you'll receive an email each day with a link to an open issue that needs help in this project. You'll also receive instructions on how to triage issues.
Triage Docs!
Receive a documented method or class from your favorite GitHub repos in your inbox every day. If you're really pro, receive undocumented methods or classes and supercharge your commit history.
Python not yet supported · 35 Subscribers
View all Subscribers · Add a CodeTriage badge to transformers
Help out
- Issues
- The model is at https://huggingface.co/Xenova/nllb-200-distilled-600M/tree/main/onnx. I don't know how to load encoder.onnx and decoder.onnx and translate a sentence into another language. Can you help me write inference code that runs the translation through the encoder and decoder? Thank you. (See the sketch after this list.)
- CI: avoid human error, automatically infer generative models
- Support TF32 flag for MUSA backend
- Add warning and info message for beta and gamma parameters
- MultiTask Classification and label_names on Trainer
- Supporting Padding in llava processor
- Is it possible to infer the model separately through encoder.onnx and decoder.onnx
- Add LlavaImageProcessor
- [Zero-shot image classification pipeline] Remove tokenizer_kwargs
- Multi-GPU setup: indices should be either on cpu or on the same device as the indexed tensor (cuda:1)
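For the two ONNX questions above, here is a minimal, unofficial sketch of how one might drive a separate encoder and decoder export with onnxruntime and a hand-rolled greedy decoding loop. It assumes the files follow the usual optimum/transformers.js seq2seq export layout (an encoder model plus a decoder model without a key/value cache); the file names, the tensor names (`input_ids`, `encoder_hidden_states`, `encoder_attention_mask`), and the NLLB decoder-start convention are assumptions to verify against your export with `session.get_inputs()` / `get_outputs()`.

```python
import numpy as np
import onnxruntime as ort
from transformers import AutoTokenizer

# Tokenizer from the original checkpoint; src_lang selects the source language code.
tokenizer = AutoTokenizer.from_pretrained(
    "facebook/nllb-200-distilled-600M", src_lang="eng_Latn"
)

# File names are assumptions -- use whatever the repo's onnx/ folder actually contains.
encoder = ort.InferenceSession("onnx/encoder_model.onnx")
decoder = ort.InferenceSession("onnx/decoder_model.onnx")

# 1) Encode the source sentence once.
enc = tokenizer("Hello, how are you?", return_tensors="np")
input_ids = enc["input_ids"].astype(np.int64)
attention_mask = enc["attention_mask"].astype(np.int64)
encoder_hidden = encoder.run(
    None, {"input_ids": input_ids, "attention_mask": attention_mask}
)[0]

# 2) Greedy decoding. NLLB starts generation with </s> followed by the
#    target-language code (French here); adjust if your export differs.
tgt_lang_id = tokenizer.convert_tokens_to_ids("fra_Latn")
decoder_ids = np.array([[tokenizer.eos_token_id, tgt_lang_id]], dtype=np.int64)

for _ in range(64):  # no past-key-value cache, so each step re-runs the full prefix
    logits = decoder.run(
        None,
        {
            "input_ids": decoder_ids,
            "encoder_hidden_states": encoder_hidden,
            "encoder_attention_mask": attention_mask,
        },
    )[0]
    next_id = int(logits[0, -1].argmax())
    decoder_ids = np.concatenate(
        [decoder_ids, np.array([[next_id]], dtype=np.int64)], axis=1
    )
    if next_id == tokenizer.eos_token_id:
        break

print(tokenizer.decode(decoder_ids[0], skip_special_tokens=True))
```

If hand-rolling the loop is not a requirement, `optimum.onnxruntime.ORTModelForSeq2SeqLM` can load or export the same checkpoint and expose the familiar `generate()` API, handling the encoder/decoder split and caching internally.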
- Docs: Python not yet supported