sentence-transformers
https://github.com/ukplab/sentence-transformers
Python
Sentence Embeddings with BERT & XLNet
Triage Issues!
When you volunteer to triage issues, you'll receive an email each day with a link to an open issue that needs help in this project. You'll also receive instructions on how to triage issues.
Triage Docs!
Receive a documented method or class from your favorite GitHub repos in your inbox every day. If you're really pro, receive undocumented methods or classes and supercharge your commit history.
Python not yet supported
13 Subscribers
Help out
- Issues
- Changing the `torch.dtype` in `model_kwargs` does not change the tokenizer output dtype
- Pass image color channels information to Transformers
- Multi-GPU Training with DP or DDP combined with reentrant gradient checkpointing dies at first backward pass
- `use_amp` or apex not effective
- `local_files_only=False` makes `SentenceTransformer.__init__` 6-7x slower
- `MatryoshkaLoss`: 'module' object is not callable
- Batched encoding gives drastically different results even with `atol=1`
- ValueError: You are attempting to perform batched generation with `padding_side='right'`; this may lead to unexpected behaviour for the Flash Attention version of the model. Make sure to call `tokenizer.padding_side = 'left'` before tokenizing the input.
- Unexpected results using `quantize_embeddings`
- Multi task Training global step mistake
- Docs
- Python not yet supported
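One of the open issues above concerns unexpected results from `quantize_embeddings`. As background for triagers, here is a minimal, dependency-free sketch of scalar int8 quantization, the general idea behind embedding quantization. This is not the sentence-transformers implementation (which operates on batches and supports several precisions); the function names `quantize_int8` and `dequantize_int8` are made up for illustration. It shows why quantized embeddings are inherently lossy: each float dimension is squeezed into 256 integer buckets.

```python
# Illustrative sketch only -- NOT the sentence-transformers API.
# Each float is scaled so the largest magnitude maps to 127, then rounded.

def quantize_int8(embedding):
    """Map a float vector onto the int8 range [-127, 127]; return (ints, scale)."""
    max_abs = max(abs(x) for x in embedding) or 1.0
    scale = 127.0 / max_abs
    return [round(x * scale) for x in embedding], scale

def dequantize_int8(quantized, scale):
    """Approximately reconstruct the original floats from the int8 values."""
    return [q / scale for q in quantized]

vec = [0.12, -0.5, 0.33, 0.0]
q, scale = quantize_int8(vec)
approx = dequantize_int8(q, scale)
# Reconstruction is close but not exact: the rounding step loses information,
# so small differences between quantized and full-precision similarity scores
# are expected rather than a bug.
```

Small per-dimension rounding errors accumulate across high-dimensional embeddings, which is why similarity rankings computed on quantized vectors can differ slightly from full-precision results.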