deepspeed
https://github.com/microsoft/deepspeed
Python
DeepSpeed is a deep learning optimization library that makes distributed training easy, efficient, and effective.
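Several of the open issues below concern ZeRO configuration, which DeepSpeed drives from a JSON config file. As a minimal sketch only, here is what such a configuration can look like, expressed as a Python dict for illustration; the key names follow DeepSpeed's documented config schema, but the specific values are illustrative assumptions, not a recommended setup.

```python
# Minimal DeepSpeed-style configuration (illustrative values).
# In a real script this dict, or its JSON-file equivalent, is passed
# to deepspeed.initialize(model=..., config=ds_config, ...).
ds_config = {
    "train_batch_size": 32,             # global batch size across all ranks
    "gradient_accumulation_steps": 1,
    "fp16": {"enabled": True},          # mixed-precision training
    "zero_optimization": {
        "stage": 2,                     # ZeRO-2: partition optimizer state and gradients
    },
}

print(ds_config["zero_optimization"]["stage"])
```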
- Issues
- [BUG] Problems with MiCS training
- [BUG] 'type:transformer' partitioning doesn't ensure non-zero parameters on each pipeline rank.
- Does deepspeed support qdq model used pytorch-quantization for training?
- [REQUEST] Should MoE parameter groups be partitionable by expert_group_name or expert number?
- [BUG] The NCCL timed out while using the zero3 model. How can I solve this problem?
- [BUG] qgZ doesn't work for odd number of nodes
- [BUG] Using ZeRO++, evaluation loss is high and evaluation accuracy is always 0
- How can we configure ZeRO-2 not to save the full model weights?
- [BUG] Multiple optimizer param groups
- [BUG] 1bit-Adam is not compatible with ZeRO