optimum
https://github.com/huggingface/optimum
Python
🚀 Accelerate training and inference of 🤗 Transformers and 🤗 Diffusers with easy-to-use hardware optimization tools
- Issues
- Trying to export a Cohere model that is a custom or unsupported architecture, but no custom ONNX configuration was passed as `custom_onnx_configs`
- RuntimeError: Expected all tensors to be on the same device, but found at least two devices
- ONNX export for CUDA does not work
- Optimum `ORTOptimizer` inference runs slower than `setfit.export_onnx` + `runtime.InferenceSession` inference
- Add support for porting CLIPVisionModelWithProjection
- Phi-3 support for OpenVINO export not working
- Not able to export a transformer model to ONNX (SJ-Ray/Re-Punctuate)
- Unable to generate a question-answering model for Llama; there is also no list of supported models for question-answering
- LLaVA ONNX export has a problem
- Add support for nomic-ai/nomic-embed-text-v1.5 model