deepspeed
https://github.com/microsoft/deepspeed
Python
DeepSpeed is a deep learning optimization library that makes distributed training easy, efficient, and effective.
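For context, here is a minimal sketch of how DeepSpeed is typically wired into a training loop. The model, config values, and data are illustrative assumptions, not taken from this page; see the DeepSpeed documentation for the full API.

```python
# Minimal DeepSpeed training sketch (illustrative assumptions throughout).
import torch
import deepspeed

model = torch.nn.Linear(32, 2)  # placeholder model for the sketch

ds_config = {
    "train_batch_size": 8,
    "zero_optimization": {"stage": 2},  # ZeRO stage 1/2/3 partitioning
    "optimizer": {"type": "Adam", "params": {"lr": 1e-4}},
}

# deepspeed.initialize wraps the model in a DeepSpeedEngine that handles
# data parallelism, ZeRO partitioning, and optimizer stepping.
engine, optimizer, _, _ = deepspeed.initialize(
    model=model,
    model_parameters=model.parameters(),
    config=ds_config,
)

for step in range(10):
    x = torch.randn(8, 32, device=engine.device)
    y = torch.randint(0, 2, (8,), device=engine.device)
    loss = torch.nn.functional.cross_entropy(engine(x), y)
    engine.backward(loss)  # engine-managed backward (loss scaling, gradient partitioning)
    engine.step()          # optimizer step and gradient zeroing handled by the engine
```

Scripts like this are usually launched with the `deepspeed` launcher (e.g. `deepspeed train.py`), which sets up the distributed environment across the available GPUs.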
Triage Issues!
When you volunteer to triage issues, you'll receive an email each day with a link to an open issue that needs help in this project. You'll also receive instructions on how to triage issues.
Triage Docs!
Receive a documented method or class from your favorite GitHub repos in your inbox every day. If you're a real pro, receive undocumented methods or classes and supercharge your commit history.
9 Subscribers
Help out
- Issues
- Different seeds are giving the exact same loss on Zero 1,2 and 3 during multi gpu training [BUG]
- Issue with LoRA Tuning on llama3-70b using PEFT and TRL's SFTTrainer
- [BUG] 1-bit LAMB not compatible with bf16
- [BUG] Regression: 0.14.3 causes grad_norm to be zero
- [ERROR] [launch.py:321:sigkill_handler] exits with return code = -11
- [BUG] inference ValueError
- [BUG] Using and Building DeepSpeedCPUAdam
- Getting parameters of embeddings (safe_get_local_fp32_param) and setting the weight of embeddings (safe_set_local_fp32_param) does not work (bug?).
- [BUG] 'Invalidate trace cache' with Seq2SeqTrainer+predict_with_generate+Zero3
- Fail to use zero_init to construct llama2 with deepspeed zero3 and bnb!
- Docs
- Python not yet supported