lorax
https://github.com/predibase/lorax
Python
Multi-LoRA inference server that scales to 1000s of fine-tuned LLMs
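The idea behind the project: one deployed server hosts a single base model and dynamically loads many LoRA adapters, so each request can target a different fine-tune while sharing the same GPU. A minimal sketch of a client call, assuming a LoRAX server already running on localhost:8080, the `lorax-client` Python package installed, and a placeholder adapter ID:

```python
# Minimal sketch: assumes a LoRAX server on localhost:8080 and
# `pip install lorax-client`; the adapter_id below is a placeholder.
from lorax import Client

client = Client("http://127.0.0.1:8080")
prompt = "[INST] What is the capital of France? [/INST]"

# No adapter_id: the shared base LLM handles the request.
base = client.generate(prompt, max_new_tokens=64)
print(base.generated_text)

# With adapter_id: the server loads the LoRA weights on demand and can
# batch this request alongside requests targeting other adapters.
tuned = client.generate(
    prompt,
    max_new_tokens=64,
    adapter_id="some-org/some-lora-adapter",  # placeholder Hugging Face adapter ID
)
print(tuned.generated_text)
```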
Help out

Issues
- Unexpected response with long-context model (Phi-3)
- Otel v2
- Phi 3.5 vision (4B model)
- Unable to run from source
- Performance issues with AWQ and LoRA
- Running several adapters on the same input
- Incorrect response when max_total_tokens is very large (e.g., 130000) and the request sets no max_new_tokens
- Server fails to run with the prefix-caching option
- The server is failing to run
- Updated the documentation about status codes