deepspeed
https://github.com/microsoft/deepspeed
Python
DeepSpeed is a deep learning optimization library that makes distributed training easy, efficient, and effective.
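Since the library description above is the only prose on this page, here is a minimal, hedged sketch of what a typical DeepSpeed training loop looks like; the config values, the placeholder model, and the `train_loader` iterable are illustrative assumptions, not taken from this page.

```python
import torch
import deepspeed

# Illustrative DeepSpeed config (values chosen for the sketch, not prescriptive).
ds_config = {
    "train_batch_size": 32,
    "fp16": {"enabled": True},
    "zero_optimization": {"stage": 2},
    "optimizer": {"type": "Adam", "params": {"lr": 1e-4}},
}

model = torch.nn.Linear(784, 10)  # placeholder model for illustration

# deepspeed.initialize wraps the model in an engine that handles ZeRO
# partitioning, mixed precision, and the optimizer step.
model_engine, optimizer, _, _ = deepspeed.initialize(
    model=model,
    model_parameters=model.parameters(),
    config=ds_config,
)

for inputs, labels in train_loader:  # hypothetical DataLoader, assumed to exist
    inputs = inputs.to(model_engine.device)
    labels = labels.to(model_engine.device)
    loss = torch.nn.functional.cross_entropy(model_engine(inputs), labels)
    model_engine.backward(loss)  # engine handles loss scaling/accumulation
    model_engine.step()          # optimizer step plus gradient zeroing
```

Scripts written this way are normally launched with the `deepspeed` launcher rather than plain `python`, so that distributed state is set up before `deepspeed.initialize` runs.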
Triage Issues!
When you volunteer to triage issues, you'll receive an email each day with a link to an open issue that needs help in this project. You'll also receive instructions on how to triage issues.
Triage Docs!
Receive a documented method or class from your favorite GitHub repos in your inbox every day. If you're really pro, receive undocumented methods or classes and supercharge your commit history.
Python not yet supported
9 Subscribers
Add a CodeTriage badge to deepspeed
Help out
- Issues
- AssertionError: no_sync context manager is incompatible with gradient partitioning logic of ZeRO stage 3
- [REQUEST] Let ZeRO-offload use CPU and GPU in parallel
- Stage3: Use new torch grad accumulation hooks API
- [BUG] [Fix-Suggested] KeyError in stage_1_and_2.py Due to Optimizer-Model Parameter Mismatch
- [BUG] [Fix-Suggested] Checkpoint Inconsistency When Freezing Model Parameters Before `deepspeed.initialize`
- [BUG] [Fix-Suggested] ZeRO Stage 3 Overwrites Module ID Attribute Causing Incorrect Expert Placement on GPUs
- BLOOM fixes for DS Legacy Inference
- [BUG] clip_grad_norm for zero_optimization mode is not working
- grad is None
- Check transformers version in BLOOM for inference v1
- Docs
- Python not yet supported