transformers
https://github.com/huggingface/transformers
Python
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
Triage Issues!
When you volunteer to triage issues, you'll receive an email each day with a link to an open issue that needs help in this project. You'll also receive instructions on how to triage issues.
Triage Docs!
Receive a documented method or class from your favorite GitHub repos in your inbox every day. If you're a real pro, receive undocumented methods or classes and supercharge your commit history.
35 Subscribers
Help out
- Issues
- RuntimeError: Expected all tensors to be on the same device, but found at least two devices, cuda:0 and cuda:1! (when checking argument for argument mat1 in method wrapper_CUDA_addmm); see the device-placement sketch below
- Video-Llava model's generation error due to causal mask shape mismatch
- TypeError: Accelerator.__init__() got an unexpected keyword argument 'dispatch_batches'
- Fix Import Error for Trainer in run_ner.py
- WandB callback fails on training end when eval dataset is provided
- FSDP with SFTTrainer: expected dtype float for `end` but got dtype c10::BFloat16
- Allow handling files as args for a tool created with Tool.from_space
- Transformers 4.46.2 breaks model loading for Llama 3.2 90B Vision Instruct
- CPU processing is extremely slow for models loaded with `torch_dtype = torch.float16`; see the dtype sketch below
- top-p sampling gives different results even after fixing all random seeds; see the seeding sketch below
- Docs
- Python not yet supported
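The cuda:0/cuda:1 RuntimeError in the issues list typically appears when a model is sharded across GPUs (for example with `device_map="auto"`) while the inputs are hard-coded to one device. The device-placement sketch below shows the usual workaround, not a reproduction from the issue itself; the checkpoint is a placeholder and the `accelerate` package is assumed to be installed.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "gpt2"  # placeholder; the report involves a much larger, multi-GPU model

tokenizer = AutoTokenizer.from_pretrained(model_id)

# device_map="auto" (requires accelerate) may spread layers over cuda:0 and cuda:1.
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",
    torch_dtype=torch.float16,
)

# Send inputs to the device of the first shard instead of hard-coding "cuda:0";
# a mismatch here is what raises the addmm device error.
inputs = tokenizer("Hello world", return_tensors="pt").to(model.device)

with torch.no_grad():
    output_ids = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```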
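For the CPU slowdown with `torch_dtype = torch.float16`, most PyTorch CPU builds have no fast float16 kernels, so half-precision weights end up far slower than float32. The dtype sketch below is a minimal version of the common mitigation, choosing the dtype from the target device; the checkpoint is a placeholder.

```python
import torch
from transformers import AutoModelForCausalLM

model_id = "gpt2"  # placeholder checkpoint

device = "cuda" if torch.cuda.is_available() else "cpu"
# float16 pays off on GPU; on CPU it usually hits slow fallback kernels,
# so load in float32 (or bfloat16 where the CPU supports it) instead.
dtype = torch.float16 if device == "cuda" else torch.float32

model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=dtype).to(device)
```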
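One common cause of the top-p reproducibility symptom is seeding only once at start-up: sampling consumes the shared RNG state, so later `generate()` calls diverge even though a seed was set. The seeding sketch below shows a minimal repeatable setup, not the reporter's own reproduction; model and prompt are placeholders.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer, set_seed

model_id = "gpt2"  # placeholder checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

inputs = tokenizer("The quick brown fox", return_tensors="pt")

decoded = []
for _ in range(2):
    set_seed(0)  # reset Python, NumPy, and torch RNGs before every sampling call
    out = model.generate(**inputs, do_sample=True, top_p=0.9, max_new_tokens=20)
    decoded.append(tokenizer.decode(out[0], skip_special_tokens=True))

print(decoded[0] == decoded[1])  # True once the seed is reset per call
```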