optimum
https://github.com/huggingface/optimum
Python
🚀 Accelerate training and inference of 🤗 Transformers and 🤗 Diffusers with easy-to-use hardware optimization tools
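
As a quick illustration of what the library provides, here is a minimal sketch that runs a Transformers checkpoint through Optimum's ONNX Runtime backend. It assumes the `optimum[onnxruntime]` extra is installed; the model ID is only an example.

```python
# Minimal sketch: ONNX Runtime inference through Optimum
# (assumes `pip install optimum[onnxruntime]`; the checkpoint is an example).
from transformers import AutoTokenizer, pipeline
from optimum.onnxruntime import ORTModelForSequenceClassification

model_id = "distilbert-base-uncased-finetuned-sst-2-english"

# export=True converts the PyTorch checkpoint to ONNX on the fly.
model = ORTModelForSequenceClassification.from_pretrained(model_id, export=True)
tokenizer = AutoTokenizer.from_pretrained(model_id)

classifier = pipeline("text-classification", model=model, tokenizer=tokenizer)
print(classifier("Optimum makes ONNX Runtime inference straightforward."))
```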
Help out

Issues
- Add support for training with Wav2Vec2EncoderLayerBetterTransformer
- Issue Report: Unable to Export Qwen Model to ONNX Format in Optimum
- Cannot export jinaai models to ONNX format because the model is > 2 GB
- CallbackManager error
- Add LLava ONNX export
- Downloading model from Huggingface Hub does not download ort_config.json
- NotImplementedError: The model type esm is not yet supported to be used with BetterTransformer.
- Support for SFR Embedding Mistral & Nomic models
- WIP: add support for Stable Diffusion safety checker
- How to convert a model (tf_model.h5) with its tokenizer folder to the ONNX format

Docs
- Python not yet supported
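
Several of the issues above touch on the BetterTransformer integration; the sketch below shows how a model is typically wrapped with it. The checkpoint name is only a placeholder, and support depends on the architecture (unsupported model types raise NotImplementedError, as the esm issue above reports).

```python
# Minimal sketch: wrapping a Transformers model with Optimum's BetterTransformer.
# The checkpoint name is only a placeholder.
from transformers import AutoModel
from optimum.bettertransformer import BetterTransformer

model = AutoModel.from_pretrained("bert-base-uncased")

# transform() swaps supported encoder layers for their BetterTransformer
# counterparts; unsupported architectures raise NotImplementedError.
bt_model = BetterTransformer.transform(model, keep_original_model=False)
```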