deepspeed
https://github.com/microsoft/deepspeed
Python
DeepSpeed is a deep learning optimization library that makes distributed training easy, efficient, and effective.
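The tagline above is broad, so here is a hedged sketch of what the core entry point looks like: a toy PyTorch model wrapped with deepspeed.initialize and driven through the engine's backward/step loop. The model, data, and config values are illustrative assumptions, and such scripts are normally launched with the `deepspeed` launcher rather than plain `python`.

```python
# Minimal sketch of the DeepSpeed training entry point (illustrative model and
# config values; normally launched with the `deepspeed` launcher, not plain python).
import torch
import deepspeed

model = torch.nn.Linear(512, 10)  # toy model standing in for a real network

ds_config = {
    "train_micro_batch_size_per_gpu": 4,
    "optimizer": {"type": "Adam", "params": {"lr": 1e-4}},
    "zero_optimization": {"stage": 2},
}

# deepspeed.initialize returns (engine, optimizer, dataloader, lr_scheduler);
# the engine wraps the model and owns the distributed/ZeRO machinery.
engine, optimizer, _, _ = deepspeed.initialize(
    model=model,
    model_parameters=model.parameters(),
    config=ds_config,
)

for _ in range(3):  # stand-in training loop over random data
    inputs = torch.randn(4, 512, device=engine.device)
    targets = torch.randint(0, 10, (4,), device=engine.device)
    loss = torch.nn.functional.cross_entropy(engine(inputs), targets)
    engine.backward(loss)   # engine handles gradient scaling/partitioning
    engine.step()           # engine handles the optimizer step and zeroing grads
```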
Triage Issues!
When you volunteer to triage issues, you'll receive an email each day with a link to an open issue that needs help in this project. You'll also receive instructions on how to triage issues.
Triage Docs!
Receive a documented method or class from your favorite GitHub repos in your inbox every day. If you're a real pro, receive undocumented methods or classes instead and supercharge your commit history.
Python not yet supported
11 Subscribers
Add a CodeTriage badge to deepspeed
Help out
- Issues
  - Flops Profiler
  - [REQUEST] Replace reduce in ZERO 1/2/3 with reduce_scatter
  - [BUG] zero3 hang during inference, need to detach part of computational graph, .detach()/torch.no_grad do not work.
  - configuration setting problems for parameters partitioning in training
  - Unpin tests that previously used a pinned version of transformers
  - [BUG] Logits are always ZERO(0) at first pass when using ZERO++
  - [BUG] Getting this error: NotImplementedError: Cannot copy out of meta tensor; no data!
  - [BUG] `reduce_bucket_size` influences training convergence of Zero2
  - [BUG] Circular import error with PyTorch nightly
  - [BUG] Trainer saves global_steps300 in LoRA training with deepspeed
- Docs
  - Python not yet supported
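Several of the open issues above concern specific DeepSpeed config knobs: `reduce_scatter` versus plain reduce for ZeRO 1/2/3, `reduce_bucket_size` and ZeRO-2 convergence, and the Flops Profiler. The sketch below shows where those keys live in a DeepSpeed config; the values are illustrative assumptions, not tuned recommendations.

```python
import json

# Minimal DeepSpeed config sketch (illustrative values, not tuned recommendations).
# Keys follow the documented "zero_optimization" and "flops_profiler" config sections.
ds_config = {
    "train_micro_batch_size_per_gpu": 8,   # assumed batch size, adjust for your model
    "zero_optimization": {
        "stage": 2,                         # ZeRO stage 1/2/3
        "reduce_scatter": True,             # use reduce_scatter instead of reduce for gradients
        "reduce_bucket_size": 5e8,          # bucket size in elements; see the convergence issue above
        "overlap_comm": True,               # overlap gradient communication with the backward pass
        "contiguous_gradients": True
    },
    "flops_profiler": {
        "enabled": True,                    # enable the Flops Profiler
        "profile_step": 1,                  # which training step to profile
        "module_depth": -1,                 # profile all module depths
        "detailed": True
    }
}

# Write the config to a JSON file for the launcher (e.g.
# `deepspeed train.py --deepspeed_config ds_config.json`); recent versions also
# accept the dict directly via deepspeed.initialize(config=ds_config).
with open("ds_config.json", "w") as f:
    json.dump(ds_config, f, indent=2)
```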