Capability
Batch Embedding Computation
20 artifacts provide this capability.
Top Matches
via “batch-embedding-computation-with-pooling-strategies”
sentence-similarity model. 34,253,353 downloads.
Unique: Implements dynamic padding with configurable pooling strategies (mean, max, CLS) optimized for sentence-level embeddings; the mean pooling strategy was specifically tuned on 215M+ sentence pairs to balance token importance without task-specific weighting.
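To make the pooling strategies concrete, here is a minimal NumPy sketch of the three options named above (mean, max, CLS) applied to a padded batch. The function name and shapes are illustrative assumptions, not this model's actual API; the key detail is that padding positions (attention mask = 0) are excluded, which is what mean pooling over dynamic padding requires.

```python
import numpy as np

def pool(token_embeddings, attention_mask, strategy="mean"):
    """Collapse per-token embeddings (batch, seq, dim) into sentence
    embeddings (batch, dim), ignoring padded positions (mask == 0)."""
    mask = attention_mask[..., None].astype(token_embeddings.dtype)  # (b, s, 1)
    if strategy == "mean":
        # Sum only real tokens, then divide by the number of real tokens.
        summed = (token_embeddings * mask).sum(axis=1)
        counts = np.clip(mask.sum(axis=1), 1e-9, None)
        return summed / counts
    if strategy == "max":
        # Mask out padding with -inf so it never wins the max.
        masked = np.where(mask > 0, token_embeddings, -np.inf)
        return masked.max(axis=1)
    if strategy == "cls":
        # Use the first token's embedding (the [CLS] position).
        return token_embeddings[:, 0, :]
    raise ValueError(f"unknown strategy: {strategy}")
```

For example, a batch where the first sequence has one padded position yields a mean over only its two real tokens, so pooled vectors are comparable across sequences of different lengths.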
vs others: Achieves 3-5x higher throughput than cross-encoder models on batch embedding tasks thanks to its symmetric (bi-encoder) architecture; outperforms naive pooling approaches by 2-3% on similarity tasks through contrastive training on diverse pooling objectives.
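The throughput advantage of the symmetric architecture comes from embedding each sentence once and scoring all pairs with a single matrix product, whereas a cross-encoder needs one forward pass per pair. A hedged sketch of that scoring step (assuming precomputed embeddings; the function name is illustrative):

```python
import numpy as np

def cosine_similarity_matrix(embeddings):
    """All pairwise cosine similarities for (n, dim) sentence embeddings.

    One matmul yields n*n scores from n encoder passes, versus the n*n
    forward passes a cross-encoder would need for the same pairs."""
    norms = np.linalg.norm(embeddings, axis=1, keepdims=True)
    unit = embeddings / np.clip(norms, 1e-12, None)  # L2-normalize rows
    return unit @ unit.T
```

Each entry [i, j] is the cosine similarity of sentences i and j, so reranking or deduplication over a batch costs one dense matrix multiply.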