Capability
Neural Machine Translation
20 artifacts provide this capability.
Top Matches
via “neural machine translation with task-prefix conditioning”
Translation model. 1,415,793 downloads.
Unique: Uses task-prefix conditioning ('translate X to Y: ') rather than separate translation-specific model heads or language-pair-specific parameters, as sketched in the example below. Leverages shared multilingual encoder-decoder weights learned during denoising pretraining on C4, enabling zero-shot translation to unseen pairs through learned cross-lingual transfer.
vs others: Simpler and more parameter-efficient than separate language-pair-specific NMT models (e.g., MarianMT), while achieving comparable BLEU scores on WMT benchmarks for high-resource pairs; enables single-model deployment versus a model-per-pair architecture.
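To make the task-prefix idea concrete, here is a minimal sketch of how such a model is typically invoked. The listing does not name the checkpoint or its library, so the choice of the Hugging Face transformers API and the "t5-small" checkpoint below are assumptions for illustration, not confirmed details of this artifact.

# Minimal sketch: task-prefix conditioning for translation.
# Assumes a T5-style checkpoint ("t5-small") and the Hugging Face transformers API.
from transformers import T5ForConditionalGeneration, T5Tokenizer

tokenizer = T5Tokenizer.from_pretrained("t5-small")
model = T5ForConditionalGeneration.from_pretrained("t5-small")

def translate(text: str, source: str = "English", target: str = "German") -> str:
    # The task is selected purely by the text prefix; the same shared weights
    # handle every language pair instead of a dedicated head or model per pair.
    prompt = f"translate {source} to {target}: {text}"
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=64)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)

print(translate("The house is wonderful."))

Because the language pair lives in the prompt rather than in the architecture, switching pairs means changing the prefix string, not loading a different model, which is what makes single-model deployment possible.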