deepspeed
https://github.com/microsoft/deepspeed
Python
DeepSpeed is a deep learning optimization library that makes distributed training easy, efficient, and effective.
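To make the description concrete: a DeepSpeed training run is typically driven by a small JSON configuration file. The sketch below builds a minimal ZeRO stage 2 config as a plain Python dict; the field names follow the public DeepSpeed config schema, but the values are illustrative only, not tuned for any real workload.

```python
import json

# Minimal sketch of a DeepSpeed ZeRO stage 2 configuration.
# Field names follow the public DeepSpeed config schema; the
# values are placeholders, not recommendations.
ds_config = {
    "train_batch_size": 32,
    "gradient_accumulation_steps": 1,
    "fp16": {"enabled": True},
    "zero_optimization": {
        "stage": 2,          # shard optimizer state and gradients across ranks
        "overlap_comm": True # overlap gradient communication with computation
    },
    "optimizer": {
        "type": "Adam",
        "params": {"lr": 1e-4}
    },
}

# DeepSpeed consumes this as JSON (e.g. a ds_config.json file passed on the
# command line), so it must serialize cleanly.
print(json.dumps(ds_config, indent=2))
```

Such a config would normally be handed to `deepspeed.initialize()` together with the model and parameters; several of the issues listed below (ZeRO stage 2 optimizer setup, `deepspeed.initialize()` hangs) arise at exactly that step.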
Python not yet supported
9 Subscribers
Help out
- Issues
- Confusing memory usage estimations
- [BUG] HIP error while preparing the DeepSpeed ZeRO stage 2 optimizer on AMD MI200 GPUs
- A possible fix for deepspeed.initialize() hanging.
- [BUG] Error: Sizes of tensors must match except in dimension 1. Expected size 64 but got size 512 for tensor number 2 in the list.
- TEST: PR HIP-ifying and running the bias_activations kernel on AMD
- [BUG] Pipeline parallel: with only a few trainable parameters, only GPU0 holds parameters with requires_grad=True, causing "ValueError: optimizer got an empty parameter list" in deepspeed.initialize
- [BUG] Loss decreases cyclically, with a period of one epoch
- [BUG] Profiler records values that differ from transformers
- [BUG] Problems with MiCS training
- [BUG] 'type:transformer' partitioning doesn't ensure non-zero parameters on each pipeline rank.
- Docs
- Python not yet supported