nomic-embed-text-v2-moe — Model 48/100: "multilingual semantic understanding with language-agnostic representations"
Sentence-similarity model by Nomic AI. 2,272,861 downloads.
Unique: Uses language-family-aware expert routing, where different experts specialize in Romance, Germanic, East Asian, and Semitic languages, yielding a hierarchical multilingual representation: family-specific expertise layered on top of a shared embedding space. This differs from standard multilingual models, which treat all languages uniformly; the expert specialization enables stronger within-family semantic understanding while the shared embedding space maintains cross-family alignment.
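The routing idea above can be sketched with a toy sparse mixture-of-experts layer. This is a minimal illustration, not the model's actual implementation: the four experts, the dimensions, and the random weights are all stand-ins chosen to mirror the four language families named in the card, and top-k gating plus a shared unit-norm output space stand in for the real trained router.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: one expert per language family mentioned above.
FAMILIES = ["romance", "germanic", "east_asian", "semitic"]
D_IN, D_OUT, N_EXPERTS = 8, 4, len(FAMILIES)

# Random stand-ins for trained router and expert parameters.
W_router = rng.normal(size=(D_IN, N_EXPERTS))
W_experts = rng.normal(size=(N_EXPERTS, D_IN, D_OUT))

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def moe_embed(x, top_k=1):
    """Sparse top-k routing: only the selected expert(s) run, and
    every expert writes into the same shared embedding space."""
    gates = softmax(x @ W_router)        # routing probabilities per expert
    chosen = np.argsort(gates)[-top_k:]  # keep only the top-k experts (sparse)
    out = np.zeros(D_OUT)
    for i in chosen:                     # gate-weighted sum of chosen experts
        out += gates[i] * (x @ W_experts[i])
    return out / np.linalg.norm(out)     # unit-norm vector in the shared space

# A token representation is routed to its best-matching "family" expert,
# but the result is comparable to any other embedding in the same space.
emb = moe_embed(rng.normal(size=D_IN))
```

Because only `top_k` of the experts execute per input, compute stays close to a dense model of one expert's size, which is the efficiency argument made for sparse routing below.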
vs others: Achieves better cross-lingual retrieval than dense multilingual models (e.g., multilingual-e5-large) on low-resource language pairs, thanks to expert specialization, while sparse routing keeps inference cost low. Also outperforms language-specific embedding models on cross-lingual tasks, without the overhead of managing a separate model per language.